In the piece, AT&T writes:
Ookla’s reliance on consumer-initiated tests has serious downsides. For example, consumers on different networks use different kinds of phones. Premium phones tend to have hardware that supports faster connections than budget phones do. If subscribers on Network A tend to run tests using premium phones while subscribers on Network B tend to run tests from budget phones, Network A is going to have an advantage in consumer-initiated tests that’s unrelated to underlying network performance.
Ookla’s methodology is also prone to selection bias. Consumer-initiated tests don’t occur among a randomly selected sample of subscribers on each network. Consequently, test results likely aren’t representative of typical network performance. There was a clear case of this problem when AT&T began misleadingly labeling some of its 4G service “5GE.” Here’s an excerpt from a Speedtest.net blog post:
While no method for evaluating mobile network performance is perfect, I tend to think Opensignal and RootMetrics use methodologies that are more reliable than Ookla’s. AT&T didn’t take the top spot for speeds in RootMetrics’ most recent report or Opensignal’s most recent report.
AT&T’s news release includes ridiculous digs at competitors:
Year-over-year changes in average speeds don’t on their own indicate whether networks are fast. A visual in the news release is illuminating:
AT&T performed poorly relative to T-Mobile and Verizon in early 2018. In a sense, AT&T was able to post a 45.7% year-over-year increase in speeds partly because it started from such a low baseline in 2018.
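The base-rate effect is easy to see with arithmetic. Only the 45.7% growth figure below comes from the news release; the absolute speeds are invented for illustration.

```python
# Hypothetical average download speeds in Mbps. The 45.7% growth rate
# is from AT&T's news release; the baseline numbers are made up.
att_2018 = 22.5
att_2019 = att_2018 * 1.457  # a 45.7% year-over-year increase

# A competitor starting from a higher 2018 baseline grows more slowly...
competitor_2018 = 33.0
competitor_2019 = competitor_2018 * 1.15  # only 15% growth

print(f"Low-baseline network: {att_2018:.1f} -> {att_2019:.1f} Mbps")
print(f"Competitor:           {competitor_2018:.1f} -> {competitor_2019:.1f} Mbps")

# ...yet the network with the flashier percentage still finishes behind.
assert att_2019 < competitor_2019
```

A large percentage gain, taken alone, says nothing about whether a network is fast now.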
I’d argue that AT&T’s news release ignores the most important part of Speedtest’s 2019 report. In my opinion, average speed is overrated. For most consumers, it’s far more important to consistently have decent speeds than to have high average speeds.
Ookla reports a consistency metric based on the proportion of tests that exceed a 5 Mbps threshold. Verizon takes the top spot on this metric, followed by T-Mobile, with AT&T coming in third. Verizon also beats out AT&T for coverage availability, another metric that can act as a proxy for consistency.
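A threshold-based consistency metric of this kind is simple to compute. The sketch below is my own simplified version, not Ookla’s exact methodology, and the sample data is invented.

```python
# Simplified sketch of a consistency metric: the share of speed tests
# at or above a 5 Mbps threshold. (Ookla's actual methodology may
# differ; the sample data here is invented.)

def consistency(speeds_mbps, threshold=5.0):
    """Fraction of tests meeting or exceeding the threshold."""
    return sum(s >= threshold for s in speeds_mbps) / len(speeds_mbps)

# Two hypothetical networks: one fast on average but erratic,
# one slower on average but dependable.
erratic = [120, 90, 2, 3, 1, 110, 2, 95, 1, 100]    # mean 52.4 Mbps
steady  = [25, 30, 22, 28, 26, 24, 27, 29, 23, 26]  # mean 26.0 Mbps

print(consistency(erratic))  # 0.5 -> half the tests are below 5 Mbps
print(consistency(steady))   # 1.0 -> decent speed every single time
```

The erratic network wins handily on average speed, but for most consumers the steady network is the better experience, which is why I think consistency deserves more attention than averages.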
Bias against Verizon
In Ookla’s main analyses, data is included only from “competitive geographies”: areas where Ookla has a substantial number of test results from at least three major networks. There are defensible reasons for Ookla to use the competitive geographies filter. However, it should be acknowledged that Verizon has the nation’s most extensive network and likely outperforms AT&T and other networks in non-competitive geographies.
- The blog post is from April 10, 2019 and is archived here.
- I believe the coverage availability tests run in the background of Android phones and are not consumer-initiated.
- Here’s how Ookla describes competitive geographies:
“To meet the definition of ‘competitive’ in the U.S., a zip code must contain samples from at least three top national competitors (those who have at least 3% of market share at a national level), but no competitor can have more than 2/3 of the samples in that zip code. Operators are considered present in a zip code if they have at least 3% of the samples in the area and show samples on multiple devices. Limiting any operator from having more than 2/3 of samples ensures actual competition in a zip code rather than including areas where one competitor dominates the market.”
- A secondary analysis published by Ookla looks outside of competitive geographies and shows that AT&T does poorly in terms of 4G availability.
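The quoted competitive-geographies definition amounts to a filter over per-zip-code test samples. Here’s a rough sketch of that filter; it assumes a hard-coded list of top national carriers and ignores the 3% national market share and multiple-devices conditions, so treat it as an illustration rather than Ookla’s actual implementation.

```python
# Simplified sketch of Ookla's "competitive geography" filter as
# quoted above. Assumes we already know which carriers count as top
# national competitors; ignores the multiple-devices requirement.

from collections import Counter

TOP_NATIONAL = {"AT&T", "Verizon", "T-Mobile", "Sprint"}  # assumed list

def is_competitive(samples):
    """samples: list of carrier names, one per test in the zip code."""
    total = len(samples)
    if total == 0:
        return False
    counts = Counter(s for s in samples if s in TOP_NATIONAL)
    # Present = at least 3% of the zip code's samples.
    present = [c for c, n in counts.items() if n / total >= 0.03]
    # No single carrier may hold more than 2/3 of the samples.
    dominated = any(n / total > 2 / 3 for n in counts.values())
    return len(present) >= 3 and not dominated

# 60 Verizon, 25 AT&T, 15 T-Mobile tests: three carriers present,
# none above 2/3 -> competitive.
print(is_competitive(["Verizon"] * 60 + ["AT&T"] * 25 + ["T-Mobile"] * 15))  # True

# 80 of 100 samples from one carrier -> excluded (> 2/3 dominance).
print(is_competitive(["Verizon"] * 80 + ["AT&T"] * 12 + ["T-Mobile"] * 8))  # False
```

The second case shows why the filter can bias results against a carrier with the widest footprint: zip codes where one network dominates the samples, often because it’s the only one with strong coverage there, are dropped from the main analysis.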