There are four U.S. wireless networks with nationwide infrastructure: AT&T, T-Mobile, Verizon, and Sprint. These carriers are sometimes referred to collectively as the “Big Four.” Any other carrier offering nationwide service in the U.S. is, at least in some regions, piggybacking on one or more of the Big Four networks.
While average download speed is a commonly used network-performance metric, I think it’s overrated. For most consumers, it matters more that a network delivers good-enough speeds consistently than that it delivers high speeds on average.
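To illustrate the point with made-up numbers (these aren’t measurements from any real network), a network with the higher average speed can still have a much worse typical worst case:

```python
import statistics

# Hypothetical speed samples in Mbps, for illustration only.
network_a = [90, 85, 2, 95, 1, 88, 3, 92]     # fast on average, but erratic
network_b = [30, 28, 32, 29, 31, 30, 27, 33]  # slower on average, but consistent

for name, samples in [("A", network_a), ("B", network_b)]:
    mean = statistics.mean(samples)
    worst = min(samples)  # the speed a user sees on a bad day
    print(f"Network {name}: mean {mean:.1f} Mbps, worst sample {worst} Mbps")
```

Network A wins on average speed (57 vs. 30 Mbps) but bottoms out near 1 Mbps, while Network B never drops below 27 Mbps. Most consumers would likely be happier on B.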
I believe RootMetrics’ methodology is particularly well-suited for assessing nationwide reliability. RootMetrics drives high-end phones connected to the major networks all around the country. During the drives, the phones conduct tests of network performance. The company’s testing has substantial geographic coverage, and RootMetrics’ nationwide results may be less prone to selection bias than other companies’ results.
As of April 2019, RootMetrics’ most recent national report is based on data collected in the second half of 2018. In that report, Verizon takes the top spot overall with AT&T coming in a close second. T-Mobile comes in a more distant third. Sprint takes the last spot in the rankings. Here’s a snapshot showing the “Overall performance” scores:
When considering only network reliability as opposed to overall performance, the rankings are unchanged:
Verizon didn’t just do well in the most recent reporting period. Verizon has taken the top overall ranking and the top network reliability ranking in every biannual period since at least the second half of 2013.
Unfortunately, RootMetrics is not transparent about how exactly it reaches its final scores or what exactly the scores mean. While I expect RootMetrics does a better job of ranking major networks in terms of nationwide quality than other evaluators, it may still be worth considering results published by other evaluators.
Opensignal relies on crowdsourced data from consumers. While I’m concerned that Opensignal’s nationwide results may be affected by massive selection bias—especially geographic selection bias—the methodology Opensignal uses has a lot going for it. Opensignal tests real-world scenarios via crowdsourced data from actual consumers.
As of April 2019, Opensignal’s most recent report was published in January 2019. The report was based on a data collection period from mid-September to mid-December of 2018. The report doesn’t appear to name an overall winner but instead ranks carriers in five categories. Verizon alone wins in 3 of the 5 categories, and Verizon ties for the top spot in another. Here’s a snapshot of the results in the “4G availability” category (the category I think is most relevant to overall reliability):
The results seem to indicate that T-Mobile has substantially more 4G coverage than AT&T. I doubt that’s true. T-Mobile may have an atypical proportion of its subscriber base in densely populated areas, which could skew the results in T-Mobile’s favor. It’s also worth keeping in mind that the category only accounts for 4G reliability. AT&T likely has substantially better coverage than T-Mobile under pre-4G technologies.
I would like to see analyses of Opensignal’s data that include attempts to control for geographic differences and other potential sources of selection bias. Unless I see Opensignal transparently attempting these sorts of analyses, I expect I’ll remain skeptical of Opensignal’s national assessments.
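As a sketch of what one such correction might look like: with per-region population figures (hypothetical numbers below; Opensignal doesn’t publish a dataset like this as far as I know), crowdsourced samples could be reweighted so each region contributes in proportion to its population rather than its sample count:

```python
# Post-stratification sketch: reweight crowdsourced speed samples by region
# population so overrepresented regions don't dominate a network's average.
# All figures are hypothetical and for illustration only.

region_population = {"urban": 1_000_000, "rural": 1_000_000}

# (region, measured speed in Mbps); urban users are overrepresented 3-to-1.
samples = [("urban", 50), ("urban", 55), ("urban", 60), ("rural", 5)]

total_pop = sum(region_population.values())

def population_weighted_mean(samples):
    # Group samples by region, then weight each region's mean
    # by that region's share of the total population.
    by_region = {}
    for region, speed in samples:
        by_region.setdefault(region, []).append(speed)
    return sum(
        (region_population[r] / total_pop) * (sum(v) / len(v))
        for r, v in by_region.items()
    )

naive = sum(s for _, s in samples) / len(samples)
adjusted = population_weighted_mean(samples)
print(f"naive mean: {naive:.1f} Mbps, population-weighted mean: {adjusted:.1f} Mbps")
```

Here the naive mean (42.5 Mbps) overstates the population-weighted mean (30.0 Mbps) because urban samples dominate the raw data, which is exactly the sort of skew I worry about in nationwide crowdsourced rankings.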
I don’t take other evaluation firms’ nationwide results too seriously due to limited public information and/or concerns about methodologies. That said, if you want to view other firms’ results as weak signals, it looks like most firms come up with results similar to RootMetrics’ results. Based on my interpretation, Consumer Reports’ metrics related to network quality come out in Verizon’s favor, with AT&T and T-Mobile coming next, and Sprint coming in last. Tutela’s most recent U.S. report as of April 2019 isn’t exactly national in scope, but it covers 10 large cities. Verizon takes the top ranking among Big Four carriers in 9 of the 10 cities.
None of the third-party firms evaluating wireless network performance are as transparent as I would like. Despite the lack of transparency, I expect RootMetrics’ methodology is better suited for assessing nationwide network quality than other evaluators’ methodologies. RootMetrics’ results suggest that Verizon has the best nationwide network followed in order by AT&T, T-Mobile, and Sprint. This ordering fits with my personal experience and is, for the most part, consistent with recent results published by other somewhat-rigorous evaluation firms.
- Some operators with only regional infrastructure offer nationwide service via roaming agreements that allow subscribers to use other carriers’ networks when outside the covered region. Mobile virtual network operators don’t have their own network infrastructure and instead resell access to other networks.
- With crowdsourced data, I worry that users on some networks may be systematically different from users on other networks. Differences in network quality based on crowdsourced data could be due to either true differences in networks’ quality or differences in the behavior and location of users on each network.
RootMetrics’ tests are pretty well-controlled. High-end devices are used, and the phones from major networks all conduct tests in the same locations at the same times. I go into more detail in my article on RootMetrics’ methodology.
- I’m unsure whether RootMetrics provided the same kind of rankings prior to 2013. To see RootScore Awards in previous reporting periods, scroll to the bottom of the most recent report.
- “After the tests are evaluated for accuracy, the results are converted into scores using a proprietary algorithm.”
From RootMetrics’ methodology page as of 4/23/2019 (archived here).
- Testing with actual consumers may allow Opensignal to capture some real-world factors that drive-testing methodologies miss. For example, lots of people use their phones indoors; drive tests won’t directly measure the quality of indoor coverage.
While my impression is that RootMetrics relies primarily on drive testing, its drivers occasionally bring phones indoors, and the drive-test data is supplemented with some crowdsourced data. Unfortunately, RootMetrics isn’t transparent about the weight different kinds of testing receive in its analyses. I go into more detail in my page on RootMetrics’ methodology.
- The data collection period at the top of the report (archived here) is noted as “Sep 16 – Dec 14, 2018”.
- Verizon takes the top spot for 4G availability, upload speed experience, and video experience. Verizon and T-Mobile tie for the top spot in the download speed experience category. AT&T alone takes the top spot in the latency experience category.
- Past reports can be accessed here.
- It’s a shame. I expect a lot of valuable insights could be gleaned from Opensignal’s data.
- For example, Nielsen might collect useful data and perform useful analysis, but I’ve been unable to find much information about either its results or its methodology.
- For what it’s worth, I’m really negative on Consumer Reports’ methodology for assessing wireless services. It may even be better to ignore the results than to treat them as a weak signal. Based on my interpretation of the 2017 results (the most recent as of April 2019), Consumer Reports has four metrics related to network quality. Verizon gets four “good” ratings on these metrics. AT&T gets two “good” ratings and two “fair” ratings. T-Mobile gets three “fair” ratings and one “good” rating. Sprint gets two “fair” ratings and two “poor” ratings. Subscribers to Consumer Reports can see the results here.
- T-Mobile narrowly beats Verizon in Houston. In a few cities, a mobile virtual network operator beats all of the Big Four carriers for the top ranking.