
Ookla Acquires RootMetrics

Today, Ookla announced that it acquired RootMetrics. I’ve long argued that RootMetrics has the best methodology for assessing the quality of cellular networks at the national level. Further, I’ve argued that Ookla’s traditional methodology is lousy. Since Ookla primarily relies on tests initiated by users, the company’s data is subject to biases that RootMetrics’ drive tests and Opensignal’s automated tests avoid.

Effects of Consolidation

The network-evaluation industry has consolidated substantially over the last year. In September, a similar story surfaced when Comlinkdata announced that Tutela and Opensignal would join forces.

I’m curious how the consolidation will affect the messages consumers receive about the quality of networks. Here are some optimistic possibilities:

  • Now that Ookla owns RootMetrics, Ookla might be more upfront about the limitations of user-initiated tests.
  • With fewer independent companies publishing evaluations, we might see movement away from today’s situation where questionable financial incentives nearly guarantee that even lousy networks will win awards.
  • Ookla’s app is used by a lot of people. The app may end up integrating some of RootMetrics’ data that otherwise wouldn’t be available to consumers.

Verizon Sweeps In RootMetrics’ Late 2020 Assessment

RootMetrics recently released a teaser of its results from network testing in the second half of 2020. As is usual in RootMetrics’ assessments, Verizon came out as the big winner. In all seven of RootMetrics’ primary scoring categories, Verizon either took first place or tied for first place.

I didn’t find a lot of big surprises in the information that’s come out so far. While some other evaluators have recently given AT&T the top spot for speeds, Verizon had the highest median download speed in RootMetrics’ assessment.

RootMetrics’ full report on network performance in the second half of 2020 comes out on February 3. That report might include some interesting updates on the status of 5G deployments.

RootMetrics’ Early 2020 Report

Earlier this week, RootMetrics released its report on cellular networks’ performance in the U.S. in the first half of 2020. The most recent round of RootMetrics’ testing got screwed up by the pandemic, but the company did its best to publish something that followed the structure of its usual reports.

While I continue to think RootMetrics has the best methodology of any network evaluator, I’m not going to discuss the latest results in detail. For the most part, the results were as expected and similar to what RootMetrics found in its previous round of testing.

Highlights

As usual, Verizon was the big winner:

Verizon continued its run of excellence in our national testing, winning or sharing six out of seven awards.

AT&T showed solid performance and easily took the second spot for overall performance. AT&T managed to beat Verizon on RootMetrics’ speed score.

While Sprint is gradually disappearing, RootMetrics included the carrier in its testing. Oddly enough, Sprint’s overall score beat T-Mobile’s score.

Proof & Network Evaluations

RootMetrics recently tweeted about how its latest analyses prove which networks are the fastest and most reliable:

[RootMetrics’ tweet]

I’m a big fan of RootMetrics, but the tweet annoyed me. There’s a ton of flexibility in how network evaluators can approach their work. Will performance data be crowdsourced from consumers or collected via in-house testing? How will data be cleaned and aggregated? What regions will be included? Etc.

Phrases like “fastest network” and “most reliable network” are ambiguous. Do you determine the fastest network based on average download speeds, median download speeds, or something else?

RootMetrics’ tweet is especially odd in light of their latest report. Depending on which speed metric you choose to look at, you could argue that either Verizon or AT&T has the fastest network. AT&T narrowly beats Verizon in terms of median download speed. Verizon narrowly beats AT&T in RootMetrics’ overall speed scores.1
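
To make that concrete, here’s a toy example (with invented speed samples, not RootMetrics or carrier data) showing how the choice between mean and median alone can flip a ranking:

```python
import statistics

# Invented download-speed samples in Mbps -- purely illustrative.
network_a = [25, 26, 27, 28, 150]  # mostly moderate, one very fast outlier
network_b = [30, 31, 32, 33, 34]   # consistently moderate

for name, samples in [("Network A", network_a), ("Network B", network_b)]:
    print(name,
          "mean:", round(statistics.mean(samples), 1),
          "median:", statistics.median(samples))

# Network A wins on mean (51.2 vs. 32.0 Mbps) but loses on median
# (27 vs. 32 Mbps). "Fastest network" depends on the metric you pick.
```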

RootMetrics’ Report For Late 2019

Yesterday, RootMetrics released its latest report on the performance of U.S. wireless networks. I’d been looking forward to this report. RootMetrics’ drive testing methodology has some advantages over the approaches used by other companies that evaluate network performance.

Results

RootMetrics’ results were generally unsurprising. As with the last report, Verizon was the big winner, followed by AT&T in second place, T-Mobile in third, and Sprint in fourth.

Here are the overall, national scores out of 100 for each of the major networks:

  • Verizon – 94.6 points
  • AT&T – 93.2 points
  • T-Mobile – 86.5 points
  • Sprint – 83.2 points

RootMetrics also reports which carriers scored the best on each of its metrics within individual metro areas. Here’s how many metro area awards each carrier won (along with the change in the number of awards received since the last report):

  • Verizon – 660 awards (-12)
  • AT&T – 401 awards (+21)
  • T-Mobile – 217 awards (-20)
  • Sprint – 80 awards (-9)

AT&T’s improvements

RootMetrics’ results align with the results of other recent evaluations suggesting aspects of AT&T’s network are becoming more competitive. AT&T fared particularly well in RootMetrics’ latest speed metrics. While Verizon narrowly beat AT&T in the final speed score out of 100 (90.7/100 for Verizon vs. 90.2/100 for AT&T), AT&T narrowly beat Verizon in aggregate median download speed (33.1 Mbps for AT&T vs. 32.7 Mbps for Verizon).

It appears that RootMetrics’ final speed scores are based on something more than median download speed. That may be a good thing: having consistent speeds is arguably much more important than having high average or median speeds. Still, I’m frustrated that I can’t figure out exactly how the final speed scores are derived. RootMetrics continues to be non-transparent about the math underlying its analyses.
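
Since RootMetrics won’t say, any reconstruction is guesswork. Purely to illustrate the idea, here’s a minimal sketch of a score that blends median speed with consistency; the blend weights, the 5 Mbps usability floor, and the 50 Mbps reference point are all my own inventions, not anything RootMetrics has disclosed:

```python
import statistics

def toy_speed_score(samples_mbps, weight_median=0.5, floor_mbps=5.0):
    """Hypothetical speed score -- NOT RootMetrics' actual method.

    Blends median speed with the share of tests above a usability floor.
    """
    median = statistics.median(samples_mbps)
    consistency = sum(s >= floor_mbps for s in samples_mbps) / len(samples_mbps)
    median_part = min(median / 50.0, 1.0)  # normalize against 50 Mbps, cap at 1
    return 100 * (weight_median * median_part + (1 - weight_median) * consistency)

# A carrier with the higher median can still lose once consistency counts.
print(round(toy_speed_score([33.1, 33, 32, 3, 2]), 1))    # median 32, slow tests -> 62.0
print(round(toy_speed_score([32.7, 31, 29, 25, 20]), 1))  # median 29, consistent -> 79.0
```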

A section of the latest report suggests that Verizon may do a particularly good job of avoiding sluggish speeds:

Verizon’s ‘slowest’ median download speed of 17.9 Mbps, recorded in Fresno, CA, was still quite strong and would allow end users to complete the majority of data tasks with ease. In fact, Fresno was the only market in which Verizon registered a median download speed below 20 Mbps. No other carrier came close to matching Verizon’s consistency of delivering fast speeds in metros across the US.

5G performance

The new report includes details about RootMetrics’ recent tests on 5G networks. I found the 5G results unsurprising, and I’m not going to comment on them further at this time. I think 5G deployments are still in too early a stage for the results to be of much interest.


New RootMetrics Report – Verizon Wins Again

Yesterday, RootMetrics released its report on mobile network performance in the first half of 2019. Here are the overall, national scores for each network:1

  • Verizon – 94.8 points
  • AT&T – 93.2 points
  • T-Mobile – 86.9 points
  • Sprint – 86.7 points

While Verizon was the overall winner, AT&T wasn’t too far behind. T-Mobile came in a distant third with Sprint just behind it.

RootMetrics also reports which carriers scored the best on each of its metrics within individual metro areas. Here’s how many metro area awards each carrier won along with the change in the number of awards received since the last report:2

  • Verizon – 672 awards (+5)
  • AT&T – 380 awards (+31)
  • T-Mobile – 237 awards (-86)
  • Sprint – 89 awards (+9)

My thoughts

Overall, this report wasn’t too surprising, since the results were so similar to those from the previous report. The decline in the number of metro area awards T-Mobile won is large, but I’m not sure I should take the change too seriously. There may have been a big change in T-Mobile’s quality relative to other networks, but I think it’s also possible the change can be explained by noise or a change in methodology. In its report, RootMetrics notes the following:3

T-Mobile’s performance didn’t necessarily get worse. Rather, AT&T, Sprint, and Verizon each made award gains in the test period, which corresponded with T-Mobile’s decreased award count.

I continue to believe RootMetrics’ data collection methodology is far better than Opensignal’s methodology for assessing networks at the national level. I take this latest set of results more seriously than I take the Opensignal results I discussed yesterday. That said, I continue to be worried about a lack of transparency in how RootMetrics aggregates its underlying data to arrive at final results. Doing that aggregation well is hard.

A final note for RootMetrics:
PLEASE DISCLOSE FINANCIAL RELATIONSHIPS WITH COMPANIES YOU EVALUATE!

Network Evaluation Should Be Transparent

Several third-party firms collect data on the performance of U.S. wireless networks. Over the last few months, I’ve tried to dig deeply into several of these firms’ methodologies. In every case, I’ve found the public-facing information to be inadequate. I’ve also been unsuccessful when reaching out to some of the firms for additional information.

It’s my impression that evaluation firms generally make most of their money by selling data access to network operators, analysts, and other entities that are not end consumers. If this were all these companies did with their data, I would understand the lack of transparency. However, most of these companies publish consumer-facing content. Often this takes the form of awards granted to network operators that do well in evaluations. It looks like network operators regularly pay third-party evaluators for permission to advertise the receipt of awards. I wish financial arrangements between evaluators and award winners were a matter of public record, but that’s a topic for another day. Today, I’m focusing on the lack of transparency around evaluation methodologies.

RootMetrics collects data on several different aspects of network performance and aggregates that data to form overall scores for each major network. How exactly does RootMetrics do that aggregation?

The results are converted into scores using a proprietary algorithm.1

I’ve previously written about how difficult it is to combine data on many aspects of a product or service to arrive at a single, overall score. Beyond that, there’s good evidence that different analysts working in good faith with the same raw data often make different analytical choices that lead to substantive differences in the results of their analyses. I’m not going to take it on faith that RootMetrics’ proprietary algorithm aggregates data in a highly defensible manner. No one else should either.
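
As a toy illustration (with invented category scores, not RootMetrics data), two perfectly defensible weightings of the same inputs can crown different winners:

```python
# Invented per-category scores (0-100) for two hypothetical carriers.
scores = {
    "Carrier X": {"speed": 95, "reliability": 88, "coverage": 90},
    "Carrier Y": {"speed": 85, "reliability": 94, "coverage": 93},
}

def overall(carrier, weights):
    """Weighted sum of category scores."""
    return sum(scores[carrier][cat] * w for cat, w in weights.items())

speed_heavy = {"speed": 0.6, "reliability": 0.2, "coverage": 0.2}
balanced    = {"speed": 0.2, "reliability": 0.4, "coverage": 0.4}

for name, weights in [("speed-heavy", speed_heavy), ("balanced", balanced)]:
    winner = max(scores, key=lambda c: overall(c, weights))
    print(f"{name} weighting -> {winner} wins")

# The speed-heavy weighting picks Carrier X; the balanced one picks Carrier Y.
```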

Opensignal had a long history of giving most of its performance awards to T-Mobile.2 Earlier this year, the trend was broken when Verizon took Opensignal’s awards in most categories.3 It’s not clear why Verizon suddenly became a big winner. The abrupt change strikes me as more likely to have been driven by a change in methodology than a genuine change in the performance of networks relative to one another. Since little is published about Opensignal’s methodology, I can’t confirm or disconfirm my speculation. In Opensignal’s case, questions about methodology are not trivial. There’s good reason to be concerned about possible selection bias in Opensignal’s analyses. Opensignal’s Analytics Charter states:4

Our analytics are designed to ensure that each user has an equal impact on the results, and that only real users are counted: ‘one user, one vote’.

Carriers will differ in the proportion of their subscribers that live in rural areas versus densely populated areas. If the excerpt from the analytics charter is taken literally, it may suggest that Opensignal does not control for differences in subscribers’ geography or demographics. That could explain why T-Mobile has managed to win so many Opensignal awards when T-Mobile obviously does not have the best-performing network at the national level.
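
To make the concern concrete, here’s a sketch with entirely made-up numbers. Under “one user, one vote,” a carrier whose subscribers cluster in fast urban markets can come out ahead of a carrier that performs better for the population as a whole:

```python
# Made-up median speeds (Mbps) by area type and made-up subscriber mixes.
speeds = {
    "Carrier A": {"urban": 45, "rural": 5},   # strong in cities only
    "Carrier B": {"urban": 40, "rural": 25},  # solid everywhere
}
user_mix = {
    "Carrier A": {"urban": 0.95, "rural": 0.05},  # urban-heavy user base
    "Carrier B": {"urban": 0.60, "rural": 0.40},
}
population_mix = {"urban": 0.7, "rural": 0.3}  # invented national mix

for carrier in speeds:
    one_user_one_vote = sum(speeds[carrier][a] * user_mix[carrier][a]
                            for a in speeds[carrier])
    population_weighted = sum(speeds[carrier][a] * population_mix[a]
                              for a in speeds[carrier])
    print(carrier, round(one_user_one_vote, 1), round(population_weighted, 1))

# Carrier A wins "one user, one vote" (43.0 vs. 34.0), while Carrier B wins
# the population-weighted comparison (35.5 vs. 33.0).
```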

Carriers advertise awards from evaluators because third parties are perceived to be credible. The public deserves to have enough information to assess whether third-party evaluators merit that credibility.

Average Download Speed Is Overrated

I’ve started looking into the methodologies used by entities that collect cell phone network performance data. I keep seeing an emphasis on average (or median) download and upload speeds when data-service quality is discussed.

  • Opensignal bases its data-experience rankings exclusively on download and upload speeds.1
  • Tom’s Guide appears to account for data quality using average download and possibly upload speeds.2
  • RootMetrics doesn’t explicitly disclose how it arrives at final data-performance scores, but emphasis is placed on median upload and download speeds.3

It’s easy to understand what average and median speeds represent. Unfortunately, these metrics fail to capture something essential—variance in speeds.

For example, Opensignal’s latest report for U.S. networks shows that Verizon has the fastest average download speed in the Chicago area at 31 Mbps. AT&T’s average download speed is only 22 Mbps in the same area. Both of those speeds are easily fast enough for typical activities on a phone. At 22 Mbps, I could stream video, listen to music, or browse the internet seamlessly. For the rare occasion where I download a 100MB file, Verizon’s average speed would beat AT&T’s by about 10.6 seconds.4 Not a big deal for something I do maybe once a month.
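
For anyone who wants to check that figure, the arithmetic is simple (a 100MB file is 800 megabits):

```python
def download_seconds(file_megabytes, speed_mbps):
    """Time to download a file, converting megabytes to megabits."""
    return file_megabytes * 8 / speed_mbps

verizon = download_seconds(100, 31)  # about 25.8 seconds
att = download_seconds(100, 22)      # about 36.4 seconds
print(round(att - verizon, 1))       # about 10.6 seconds difference
```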

On the other hand, variance in download speeds can matter quite a lot. If I have 31 Mbps speeds on average, but I occasionally have sub-1 Mbps speeds, it may sometimes be annoying or impossible to use my phone for browsing and streaming. Periodically having 100+ Mbps speeds would not make up for the inconvenience of sometimes having low speeds. I’d happily accept a modest decrease in average speeds in exchange for a modest decrease in variance.5
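
A quick simulation (with made-up speed distributions) shows the difference variance makes. Both hypothetical networks below average roughly 30 Mbps, but one is near-unusable a fifth of the time:

```python
import random

random.seed(0)
N = 10_000

# "Spiky": 38 Mbps in 80% of tests, 0.5 Mbps in the other 20% (mean ~30.5).
spiky = [random.choice([0.5, 38, 38, 38, 38]) for _ in range(N)]
# "Steady": always between 25 and 36 Mbps (mean ~30.5), never unusable.
steady = [random.uniform(25, 36) for _ in range(N)]

for name, samples in [("spiky", spiky), ("steady", steady)]:
    mean = sum(samples) / N
    unusable = sum(s < 1 for s in samples) / N
    print(f"{name}: mean {mean:.1f} Mbps, tests under 1 Mbps: {unusable:.0%}")

# Nearly identical means, but the spiky network fails the usability
# threshold in roughly 20% of tests -- the variance is what you feel.
```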