AT&T’s Claim To Being America’s Best Network

AT&T has been running an ad campaign with commercials where the company claims to offer the best network.

These commercials start with a funny skit that leads to the line, “just ok is not ok.” The commercials’ narrator then says something along the lines of: “AT&T is America’s best wireless network according to America’s biggest test.”

Here’s an example:



Alternate versions of the commercial involve ok babysitters, ok sushi, ok surgeons, and more.

AT&T bases its “best network” claim on the results of Global Wireless Solutions’ (GWS) 2018 tests. The claim is at odds with the results of many other companies’ evaluations and with my own view.

The meaning of the word “best” is ambiguous, but I’d guess that a survey of professionals in the wireless industry would find that most people consider RootMetrics to be the best evaluation firm in the wireless industry. Verizon fared far better than AT&T in RootMetrics’s most recent evaluation.

It’s unclear to me what AT&T is claiming when it calls GWS’s test “America’s biggest test.” Is it the biggest test in terms of miles driven, data points collected, area covered, or something else? GWS may have the biggest test according to one metric, but it’s not unambiguously the biggest test in the nation.

GWS’s OneScore Methodology & 2019 Results

Global Wireless Solutions (GWS) evaluates wireless networks according to the company’s OneScore methodology. At the moment, AT&T cites GWS’s results in commercials where AT&T claims to offer the best network.

In an article about performance tests of wireless networks, GWS’s founder, Dr. Paul Carter, writes:1

With so many conflicting research reports and with every network touting itself as number one, it’s critical that wireless carriers are transparent about how and what they actually test. If what was tested doesn’t match up with the average consumer experience, then was that test truly worthwhile?

Unfortunately, GWS itself is not especially transparent about its methodology. The public-facing information about the company’s methodology is sparse, and I did not receive a response to my email requesting additional information.

As I understand it, GWS’s methodology has two components:

  • Technical performance testing in about 500 markets
  • Consumer surveying that helps determine how much weight to give different metrics

Technical testing

In 2019, GWS conducted extensive drive testing; GWS employees drove close to 1,000,000 miles as phones in their vehicles performed automated tests of networks’ performance.2

The drive testing took place in about 500 markets, including all of the largest metropolitan areas. GWS says the testing represents about 94% of the U.S. population.3 I expect that GWS’s focus on these markets limits the weight placed on rural and remote areas. Accordingly, GWS’s results may be biased against Verizon (Verizon tends to have better coverage than other networks in sparsely populated areas).

Consumer surveying

In 2019, GWS surveyed about 5,000 consumers to figure out how much they value different aspects of wireless performance.4 GWS finds that consumers place a lot of importance on phone call voice quality, even though people are using their phones for more and more activities unrelated to phone calls.5 GWS also finds that, as I’ve suggested, consumers care a lot more about the reliability of their wireless service than its raw speed.6

Combining components

As I understand it, GWS draws on the results of its surveying to decide how much weight to place on different parts of the technical performance tests:

The consumer survey includes questions asking respondents to rank the importance of different tasks they perform on their mobile device, as well as the importance of different aspects of network performance. Our network test results are then weighted according to how consumers prioritize what’s important to them, and evaluated in eleven different network performance areas related to voice, data, network reliability and network coverage.

The methodology’s name, OneScore, and the graphic below suggest that the company combines all of its data to arrive at final, numerical scores for each network:7

GWS OneScore Visual

Oddly enough, I can’t find any final scores published by GWS. That may be a good thing. I’ve previously gone into great detail about why scoring systems that use weighted rubrics to give companies or products a single, overall score tend to work poorly.
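To make the weighted-rubric idea concrete, here’s a minimal sketch of how survey-derived weights could collapse per-category test results into a single score. Everything here is invented for illustration; GWS doesn’t publish its actual weights, categories, or per-category scores, and the network names are hypothetical.

```python
# Hypothetical weighted-rubric aggregation, in the spirit of a
# "OneScore"-style methodology. All weights, categories, and scores
# below are made up for illustration.

# Survey-derived category weights (sum to 1.0)
weights = {
    "voice_quality": 0.30,
    "data_reliability": 0.25,
    "download_speed": 0.15,
    "upload_speed": 0.10,
    "coverage": 0.20,
}

# Per-network performance in each category, normalized to 0-100
scores = {
    "Network A": {"voice_quality": 90, "data_reliability": 85,
                  "download_speed": 80, "upload_speed": 70, "coverage": 75},
    "Network B": {"voice_quality": 80, "data_reliability": 90,
                  "download_speed": 85, "upload_speed": 75, "coverage": 85},
}

def one_score(category_scores, weights):
    """Collapse category scores into a single weighted average."""
    return sum(weights[c] * category_scores[c] for c in weights)

for network, s in scores.items():
    print(f"{network}: {one_score(s, weights):.2f}")
```

Note how sensitive the final ranking is to the weights: Network A wins the most heavily weighted category (voice quality), yet Network B comes out ahead overall. Shift a few points of weight from coverage to voice quality and the ranking flips, which is one reason single-score systems can be misleading.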

2019 Results

In GWS’s 2019 report, the company lists which networks had the best performance in several different areas:

AT&T:

  • Download speed
  • Data reliability
  • Network capacity
  • Video streaming experience
  • Voice accessibility
  • Voice retainability

T-Mobile:

  • Voice quality

Verizon:

  • Upload speed

Open questions

I have a bunch of open questions about GWS’s methodology. If you represent GWS and can shed light on any of these topics, please reach out.

  • Does the focus on 501 markets (94% of the U.S.) tend to leave out rural areas where Verizon has a strong network relative to other operators?
  • Do operators pay GWS? Does AT&T pay to advertise GWS’s results?
  • What does the consumer survey entail?
  • How directly are the results of the consumer survey used to determine weights used later in GWS’s analysis?
  • What does GWS make of the discrepancies between its results and those of RootMetrics?
  • How close were different networks’ scores in each category?
  • GWS shares the best-performing network in several categories. Is information available about the second, third, and fourth-place networks in each category?
  • Does GWS coerce its raw data into a single overall score for each network?
    • Are those results publicly available?
    • How are the raw performance data coerced into scores that can be aggregated?

Dawson On The Government’s Role In 5G

I recently stumbled across a fantastic post by Doug Dawson about the government’s role in 5G. Here’s a bit of it (emphasis mine):

It’s been really interesting to watch how much the federal government talks about 5G technology. I’ve not seen anything else like this in my adult lifetime…I’ve been hearing about the 5G war for a few years now and I still don’t know what it means. 5G is ultimately a broadband technology. I can’t figure out how the US is harmed if China gets better broadband. If there is now a 5G war, then why hasn’t there been a fiber-to-the-home war? I saw recently where China passed us in the number of last-mile fiber connections, and there wasn’t a peep about it out of Congress…Cellular carriers worldwide are crowing about 5G deployment, yet those deployments contain none of the key technology that defines 5G performance. There is no frequency slicing. There is no bonding together of multiple frequencies to create larger data pipes. There is no massive expansion of the number of connections that can be made at a website. Cellphones can’t yet connect to multiple cell sites. What we have instead, for now, are new frequencies layered on top of 4G LTE…The carriers admit that the 600 MHz and the 850 MHz spectrum being deployed won’t result in faster speeds than 4G LTE…It’s starting to look like the real reason for the talk about a 5G war is to drum up sympathy for the big cellular carriers as a justification for big government giveaways.

I mostly agree with Dawson, and I strongly recommend the full post.

Boost’s “Super Reliable, Super Fast” Network

Boost Mobile is running a new commercial that features Pitbull and pitches the company’s low prices. Towards the end of the ad, a narrator says that Boost has a “super fast, super reliable network.” The narration is accompanied by this image:



In most commercials that involve carriers making claims about service quality, carriers will use fine print to cite research that backs up their claims. The Boost ad doesn’t include a citation; perhaps that’s because Boost’s claim doesn’t have much substance. “Super fast, super reliable” is super vague. Boost’s service probably is super fast compared to wireless service from 15 years ago. On the other hand, Boost’s service is not super fast compared to most services currently offered by other U.S. carriers.

Boost runs over Sprint’s network. There are a lot of different companies that evaluate network performance, and Sprint tends to do poorly relative to its competitors in the rigorous evaluations. Sprint had the lowest speeds among all four major networks in both RootMetrics’s and Opensignal’s most recent national assessments.

Boost’s claim looks even sillier in light of the fact that its subscribers tend to have low priority data access on Sprint’s network. When Sprint’s network is congested, Boost Mobile’s subscribers will tend to experience slower data speeds than those who subscribe directly to Sprint’s service.

A misleading image accompanies Boost’s misleading claims about quality. The image looks like a coverage map showing extensive coverage, but a disclaimer states: “Coverage not available everywhere. Not a depiction of actual coverage.”

Here’s the full commercial: