Opensignal’s 2020 U.S. Mobile Performance Report

Today, Opensignal released a new report on the performance of U.S. wireless networks. The report details results on seven different metrics.

Here are the networks that took the top spot for each metric at the national level:

  • Video Experience – Verizon
  • Voice App Experience – T-Mobile/AT&T (draw)
  • Download Speed Experience – AT&T
  • Upload Speed Experience – T-Mobile
  • Latency Experience – AT&T
  • 4G Availability – Verizon
  • 4G Coverage Experience – Verizon

It’s important to interpret these results cautiously due to limitations in Opensignal’s crowdsourcing approach. Since performance data is collected from actual subscribers’ devices, factors not directly related to underlying network quality may impact the organization’s results. For example, if subscribers on a network are unusually likely to use low-end devices or live in rural areas, that will affect the outcome of Opensignal’s analyses. Still, Opensignal’s results are interesting; they’re drawn from a huge data set involving primarily automated performance tests.

Download speed findings

The most notable result in the latest report might be AT&T’s first-place finish on the download speed metric. In the previous Opensignal report, T-Mobile won first place for download speeds, and AT&T took third place. I’ve recently been critical of the methodologies used in some other evaluations that suggested AT&T had the nation’s fastest network. While many of those methodological criticisms still stand, the fact that Opensignal’s arguably more reliable methodology also found AT&T to have the fastest network leads me to believe I was too harsh. I’ll be interested to see whether AT&T also takes the top spot for speeds in RootMetrics’ upcoming report.

New metrics

Two new metrics were introduced in this report: Voice App Experience and 4G Coverage Experience. The Voice App Experience metric assesses the quality of voice calls made via apps like Skype and Facebook Messenger. I’m not exactly sure how the metric works, but all four networks received similar scores, and Opensignal deemed all of those scores indicative of “acceptable” quality.

The 4G Coverage Experience metric adds nuance to the previously existing 4G Availability metric: it assesses 4G availability across all of the areas where Opensignal’s users find themselves, regardless of which network those users subscribe to.


Opensignal Released a New Report – I’m Skeptical

Opensignal just released a new report on the performance of U.S. wireless networks. The report ranks major U.S. networks in five categories based on crowdsourced data:

  • 4G availability
  • Video experience
  • Download speed experience
  • Upload speed experience
  • Latency experience

Verizon took the top spot for 4G availability and video experience. T-Mobile came out on top for both of the speed metrics. T-Mobile and AT&T shared the top placement for the latency experience metric.

Selection bias

I’ve previously raised concerns about selection bias in Opensignal’s data-collection methodology. Opensignal crowdsources data from typical users, and crowdsourcing introduces issues because there are systematic differences between the typical users of different networks. Imagine that Network A has far more extensive coverage in rural areas than Network B. It stands to reason that Network A likely has more subscribers in rural areas than Network B. Many subscriber attributes vary between networks in similar ways; for example, more expensive networks likely have wealthier subscribers.

Analyses of crowdsourced data can capture both (a) genuine differences in network performance and (b) differences in how subscribers on each network use their devices. Opensignal’s national results shouldn’t be taken too seriously unless Opensignal can make a compelling argument that either (a) its methodology doesn’t lead to serious selection bias or (b) it’s able to adequately adjust for the bias.

Speed metrics

Opensignal ranks carriers based on average download and upload speeds. In my opinion, average speeds are overrated. The portion of time during which speeds are good enough matters much more than the average speed a service delivers.

Opensignal’s average download speed results are awfully similar between carriers:

  • Verizon – 22.9 Mbps
  • T-Mobile – 23.6 Mbps
  • AT&T – 22.5 Mbps
  • Sprint – 19.2 Mbps

Service at any of those speeds would be sufficient for almost any activity people typically use their phones for. Without information about how often speeds were especially low on each network, it’s hard to draw conclusions about differences in the actual experience on each network.

Network Evaluation Should Be Transparent

Several third-party firms collect data on the performance of U.S. wireless networks. Over the last few months, I’ve tried to dig deeply into several of these firms’ methodologies. In every case, I’ve found the public-facing information to be inadequate. I’ve also been unsuccessful when reaching out to some of the firms for additional information.

It’s my impression that evaluation firms generally make most of their money by selling data access to network operators, analysts, and other entities that are not end consumers. If this were all these companies did with their data, I would understand the lack of transparency. However, most of these companies publish consumer-facing content. Often this takes the form of awards granted to network operators that do well in evaluations. It looks like network operators regularly pay third-party evaluators for permission to advertise the receipt of awards. I wish financial arrangements between evaluators and award winners were a matter of public record, but that’s a topic for another day. Today, I’m focusing on the lack of transparency around evaluation methodologies.

RootMetrics collects data on several different aspects of network performance and aggregates that data to form overall scores for each major network. How exactly does RootMetrics do that aggregation?

“The results are converted into scores using a proprietary algorithm.”[1]

I’ve previously written about how difficult it is to combine data on many aspects of a product or service into a single, overall score. Beyond that, there’s good evidence that different analysts working in good faith with the same raw data often make different analytical choices that lead to substantive differences in their results. I’m not going to take it on faith that RootMetrics’ proprietary algorithm aggregates data in a highly defensible manner. No one else should either.

Opensignal had a long history of giving most of its performance awards to T-Mobile.[2] Earlier this year, the trend was broken when Verizon took Opensignal’s awards in most categories.[3] It’s not clear why Verizon suddenly became a big winner. The abrupt change strikes me as more likely to have been driven by a change in methodology than a genuine change in the performance of networks relative to one another. Since little is published about Opensignal’s methodology, I can’t confirm or disconfirm my speculation. In Opensignal’s case, questions about methodology are not trivial. There’s good reason to be concerned about possible selection bias in Opensignal’s analyses. Opensignal’s Analytics Charter states:[4]

“Our analytics are designed to ensure that each user has an equal impact on the results, and that only real users are counted: ‘one user, one vote’.”

Carriers will differ in the proportion of their subscribers who live in rural areas versus densely populated areas. If the excerpt from the analytics charter is taken literally, it may suggest that Opensignal does not control for differences in subscribers’ geography or demographics. That could explain how T-Mobile has managed to win so many Opensignal awards when T-Mobile obviously does not have the best-performing network at the national level.

Carriers advertise awards from evaluators because third parties are perceived to be credible. The public deserves enough information to assess whether third-party evaluators merit that credibility.

Lies, Damned Lies, and AT&T’s 5GE

There are three kinds of lies: lies, damned lies, and statistics. – Benjamin Disraeli*

Fortunately, the sentiment behind this quote isn’t always accurate. Sometimes statistics can reveal lies. AT&T has recently taken a lot of heat for misleadingly branding advanced 4G networks as “5GE.” Ian Fogg at Opensignal published a post drawing on Opensignal’s data to assess how AT&T’s 5GE-enabled phones perform compared to similar phones on other carriers. The results show AT&T’s 5GE-capable phones delivering average speeds similar to those of comparable phones on rival networks.[1]

In response to AT&T’s misleading branding, Verizon launched a video advertisement showing a head-to-head speed comparison between Verizon’s network and AT&T’s 5GE network.

In that video, Verizon’s 4G LTE network comes out with a download speed near 120 Mbps while AT&T’s 5GE network comes in around 40 Mbps. That, of course, seems funny given the Opensignal data suggesting the networks deliver similar speeds on average.

A portion of the Verizon video (not long enough to show the final results) showed up in a Twitter ad. That ad led to a Twitter exchange between me; Light Reading’s editorial director, Mike Dano; and Verizon’s PR manager, Steven Van Dinter. Van Dinter explained that Verizon chose to film in a public spot where AT&T’s 5GE signal was very strong. I take Van Dinter’s word that there wasn’t foul play or blatant manipulation, but it is funny to see Verizon fighting misleading branding from AT&T with a misleading ad of its own.

Average Download Speed Is Overrated

I’ve started looking into the methodologies used by entities that collect cell phone network performance data. I keep seeing an emphasis on average (or median) download and upload speeds when data-service quality is discussed.

  • Opensignal bases its data-experience rankings exclusively on download and upload speeds.[1]
  • Tom’s Guide appears to account for data-quality using average download and possibly upload speeds.[2]
  • RootMetrics doesn’t explicitly disclose how it arrives at final data-performance scores, but emphasis is placed on median upload and download speeds.[3]

It’s easy to understand what average and median speeds represent. Unfortunately, these metrics fail to capture something essential—variance in speeds.

For example, Opensignal’s latest report for U.S. networks shows Verizon with the fastest average download speed in the Chicago area at 31 Mbps; AT&T’s average download speed in the same area is only 22 Mbps. Both of those speeds are easily fast enough for typical activities on a phone. At 22 Mbps, I could stream video, listen to music, or browse the internet seamlessly. On the rare occasion that I download a 100 MB file, Verizon’s network at its average speed would beat AT&T’s by about 10.6 seconds.[4] Not a big deal for something I do maybe once a month.
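The 10.6-second figure follows from simple arithmetic (a 100 MB file is 800 megabits), as this quick sketch shows:

```python
def download_seconds(file_megabytes: float, speed_mbps: float) -> float:
    """Time to download a file at a given speed (1 byte = 8 bits)."""
    return file_megabytes * 8 / speed_mbps

verizon = download_seconds(100, 31)  # ~25.8 s
att = download_seconds(100, 22)      # ~36.4 s
print(round(att - verizon, 1))       # prints 10.6
```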

On the other hand, variance in download speeds can matter quite a lot. If I have 31 Mbps speeds on average, but I occasionally have sub-1 Mbps speeds, it may sometimes be annoying or impossible to use my phone for browsing and streaming. Periodically having 100+ Mbps speeds would not make up for the inconvenience of sometimes having low speeds. I’d happily accept a modest decrease in average speeds in exchange for a modest decrease in variance.[5]
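To make the point concrete, here’s a toy illustration with entirely made-up speed samples: two networks with identical average speeds, one of which spends half its time below a (hypothetical) 1 Mbps usability floor:

```python
from statistics import mean

# Hypothetical speed samples in Mbps -- illustrative, not real measurements.
steady = [28, 30, 31, 32, 29, 31, 30, 33, 30, 36]
spiky = [110, 0.5, 45, 0.8, 60, 0.6, 52, 0.7, 40, 0.4]

def share_below(samples, threshold_mbps=1.0):
    """Fraction of samples below a usability threshold."""
    return sum(s < threshold_mbps for s in samples) / len(samples)

print(mean(steady), mean(spiky))  # identical averages: 31 Mbps each
print(share_below(steady), share_below(spiky))  # prints 0.0 0.5
```

An average-speed ranking would call these two networks equivalent, even though the second one is unusable half the time.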