Lies, Damned Lies, and AT&T’s 5GE

There are three kinds of lies: lies, damned lies, and statistics.
—Benjamin Disraeli*

Fortunately, the sentiment behind this quote isn’t always accurate. Sometimes statistics can reveal lies. AT&T has recently taken a lot of heat for misleadingly branding advanced 4G networks as “5GE.” Ian Fogg at Opensignal published a post drawing on Opensignal’s data to assess how AT&T’s 5GE-enabled phones perform compared to similar phones on other carriers. The result: AT&T’s 5GE phones deliver speeds roughly on par with similar phones on rival networks.1

In response to AT&T’s misleading branding, Verizon launched a video advertisement showing a head-to-head speed comparison between Verizon’s network and AT&T’s 5GE network.

In that video, Verizon’s 4G LTE network comes out with a download speed near 120 Mbps while AT&T’s 5GE network comes out around 40 Mbps. That, of course, seems funny given the Opensignal data suggesting the networks deliver similar speeds on average.

A portion of the Verizon video—not long enough to show the final results—showed up in a Twitter ad. That ad led to a Twitter exchange between me; Light Reading’s editorial director, Mike Dano; and Verizon’s PR manager, Steven Van Dinter. Van Dinter explained that Verizon chose to film in a public spot where AT&T’s 5GE signal was very strong. I take Van Dinter’s word that there wasn’t foul play or blatant manipulation, but it is funny to see Verizon fighting misleading branding from AT&T with a misleading ad of its own.

Average Download Speed Is Overrated

I’ve started looking into the methodologies used by entities that collect cell phone network performance data. I keep seeing an emphasis on average (or median) download and upload speeds when data-service quality is discussed.

  • Opensignal bases its data-experience rankings exclusively on download and upload speeds.1
  • Tom’s Guide appears to account for data quality using average download and possibly upload speeds.2
  • RootMetrics doesn’t explicitly disclose how it arrives at final data-performance scores, but it places heavy emphasis on median upload and download speeds.3

It’s easy to understand what average and median speeds represent. Unfortunately, these metrics fail to capture something essential—variance in speeds.

For example, Opensignal’s latest report for U.S. networks shows that Verizon has the fastest average download speed in the Chicago area at 31 Mbps. AT&T’s average download speed is only 22 Mbps in the same area. Both of those speeds are easily fast enough for typical activities on a phone. At 22 Mbps, I could stream video, listen to music, or browse the internet seamlessly. For the rare occasion where I download a 100 MB file, Verizon’s network at the average speed would beat AT&T’s by about 10.6 seconds.4 Not a big deal for something I do maybe once a month.
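For anyone who wants to check the arithmetic, here’s the calculation behind that 10.6-second figure (a 100 MB file is 800 megabits):

```python
# Time to download a 100 MB file at each carrier's average speed.
# Speeds are in megabits per second; 100 MB = 800 megabits.
file_size_megabits = 100 * 8

verizon_seconds = file_size_megabits / 31  # ~25.8 seconds
att_seconds = file_size_megabits / 22      # ~36.4 seconds

print(f"Verizon: {verizon_seconds:.1f} s, AT&T: {att_seconds:.1f} s")
print(f"Difference: {att_seconds - verizon_seconds:.1f} s")  # ~10.6 s
```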

On the other hand, variance in download speeds can matter quite a lot. If I have 31 Mbps speeds on average, but I occasionally have sub-1 Mbps speeds, it may sometimes be annoying or impossible to use my phone for browsing and streaming. Periodically having 100+ Mbps speeds would not make up for the inconvenience of sometimes having low speeds. I’d happily accept a modest decrease in average speeds in exchange for a modest decrease in variance.5
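To make the point concrete, here’s a small simulation. The numbers are invented for illustration, not measured data: two hypothetical networks with nearly identical average speeds, one of which drops below 1 Mbps a tenth of the time.

```python
# A minimal sketch (with invented numbers, not measured data) of why two
# networks with roughly the same average speed can feel very different.
import random

random.seed(0)

def sample_speeds(n, low, high, slow_fraction, slow_speed):
    """Draw n speeds uniformly from [low, high] Mbps, except that
    slow_fraction of the time the connection drops to slow_speed."""
    return [slow_speed if random.random() < slow_fraction
            else random.uniform(low, high) for _ in range(n)]

# Network A: steady speeds, never drops out.
a = sample_speeds(100_000, low=25, high=37, slow_fraction=0.0, slow_speed=0)
# Network B: sometimes blazing fast, but sub-1 Mbps 10% of the time.
b = sample_speeds(100_000, low=20, high=48, slow_fraction=0.1, slow_speed=0.5)

for name, speeds in [("A", a), ("B", b)]:
    mean = sum(speeds) / len(speeds)
    unusable = sum(s < 1 for s in speeds) / len(speeds)
    print(f"Network {name}: mean {mean:.1f} Mbps, "
          f"unusable {unusable:.0%} of the time")
```

With made-up numbers like these, both networks report roughly the same average, but only one is consistently usable, which is exactly what average-speed rankings hide.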

Issues with Consumer Reports’ 2017 Cell Phone Plan Rankings

Consumer Reports offers ratings of cellular service providers based on survey data collected from its subscribers. Through subscriber surveying in 2017, Consumer Reports collected data on seven metrics:1

  1. Value
  2. Data service quality
  3. Voice service quality
  4. Text service quality
  5. Web service quality
  6. Telemarketing call frequency
  7. Support service quality

The surveys collected data from over 100,000 subscribers.2 I believe Consumer Reports would frown upon a granular discussion of the exact survey results, so I’ll remain vague about exact ratings in this post. Consumer Reports subscribers can see the full results of the survey here.

Survey results

Results are reported for 20 service providers. Most of these providers are mobile virtual network operators (MVNOs). MVNOs don’t operate their own network hardware but make use of other companies’ networks. For the most part, MVNOs use networks provided by the Big Four (Verizon, Sprint, AT&T, and T-Mobile).

Interestingly, the Big Four do poorly in Consumer Reports’ evaluation. Verizon, AT&T, and Sprint receive the lowest overall ratings and take the last three spots. T-Mobile doesn’t do much better.

This is surprising. The Big Four do terribly, even though MVNOs are using the Big Four’s networks. Generally, I would expect the Big Four to offer network access to their direct subscribers that is as good or better than the access that MVNO subscribers receive.

It’s possible that the MVNOs’ good ratings can be explained by their offering prices and customer service far better than the Big Four’s—making them deserving of high ratings for reasons separate from network quality.

Testing the survey’s validity

To test the reliability of Consumer Reports’ methodology, we can compare MVNOs to the Big Four using only the metrics about network quality (ignoring measures of value, telemarketing call frequency, and support quality). In many cases, MVNOs use more than one of the Big Four’s networks. However, several MVNOs use only one network, allowing for easy apples-to-apples comparisons, sketched in code after the list below.3

  • Boost Mobile is owned by Sprint.
  • Virgin Mobile is owned by Sprint.
  • Cricket Wireless is owned by AT&T.
  • MetroPCS is owned by T-Mobile.
  • GreatCall runs exclusively on Verizon’s network.
  • Page Plus Cellular runs exclusively on Verizon’s network.
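Since I’m keeping the exact ratings vague, here is the shape of the check in code, using invented placeholder ratings on a 1–5 scale rather than Consumer Reports’ actual figures:

```python
# The apples-to-apples check, using INVENTED placeholder ratings
# (not Consumer Reports' actual figures) to show the logic.
NETWORK_CATEGORIES = ["data", "voice", "text", "web"]

# Hypothetical 1-5 ratings for one MVNO and its host network.
ratings = {
    "Sprint":       {"data": 2, "voice": 3, "text": 3, "web": 2},
    "Boost Mobile": {"data": 4, "voice": 4, "text": 4, "web": 3},
}
hosts = {"Boost Mobile": "Sprint"}  # MVNO -> network it runs on

for mvno, host in hosts.items():
    beats_or_ties = all(
        ratings[mvno][cat] >= ratings[host][cat] for cat in NETWORK_CATEGORIES
    )
    print(f"{mvno} beats or ties {host} in every "
          f"network category: {beats_or_ties}")
```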

When comparing network quality ratings between these MVNOs and the companies that run their networks:

  • Boost Mobile’s ratings beat Sprint’s ratings in every category.
  • Virgin Mobile’s ratings beat Sprint’s ratings in every category.
  • Cricket Wireless’s ratings beat or tie AT&T’s ratings in every category.
  • MetroPCS’s ratings beat or tie T-Mobile’s ratings in every category.
  • GreatCall doesn’t have a rating for web quality due to insufficient data. GreatCall’s ratings match or beat Verizon’s in the other categories.
  • Page Plus Cellular doesn’t have a rating for web quality due to insufficient data. Page Plus’s ratings match or beat Verizon’s in the other categories.

Taken at face value, these are odd results. There are complicated stories you could tell to salvage the results, but I think it’s much more plausible that Consumer Reports’ surveys just don’t work well for evaluating the relative quality of cell phone service providers.

Why aren’t the results reliable?

I’m not sure why the surveys don’t work, but I see three promising explanations:

  • Metrics may not be evaluated independently. For example, consumers might take a service’s price into account when providing a rating of its voice quality.
  • Lack of objective evaluations. Consumers may not evaluate services objectively. Perhaps consumers are aware of some general stigma around Sprint that unfairly affects how they rate Sprint’s quality (while the same stigma doesn’t attach to MVNOs that use Sprint’s network).
  • Selection bias. Individuals who subscribe to one carrier are probably, on average, different from individuals who subscribe to another carrier. Perhaps individuals who have used Carrier A tend to use small amounts of data and are lenient when rating data service quality. Individuals who have used Carrier B may get more upset about data quality issues. Consumer Cellular took the top spot in the 2017 rankings. I don’t think it’s coincidental that Consumer Cellular has pursued branding and marketing strategies to target senior citizens.4

Consumer Reports’ website gives the impression that its cell phone plan rankings are reliable for comparison purposes.5 They aren’t.

The ratings do capture whether survey respondents are happy with their services. However, the ratings have serious limitations for shoppers trying to assess whether they’ll be satisfied with a given service.

I suspect Consumer Reports’ ratings for other product categories that rely on similar surveys will also be unreliable. However, the concerns I’m raising only apply to a subset of Consumer Reports’ evaluations. A lot of Consumer Reports’ work is based on product testing rather than consumer surveys.