Ookla Acquires RootMetrics

Today, Ookla announced that it acquired RootMetrics. I’ve long argued that RootMetrics has the best methodology for assessing the quality of cellular networks at the national level. Further, I’ve argued that Ookla’s traditional methodology is lousy. Since Ookla primarily relies on tests initiated by users, the company’s data is subject to biases that RootMetrics’ drive tests and Opensignal’s automated tests avoid.

Effects of Consolidation

The network-evaluation industry has consolidated substantially over the last year. In September, a similar story surfaced when Comlinkdata announced that Tutela and Opensignal would join forces.

I’m curious how the consolidation will affect the messages consumers receive about the quality of networks. Here are some optimistic possibilities:

  • Now that Ookla owns RootMetrics, Ookla might be more upfront about the limitations of user-initiated tests.
  • With fewer independent companies publishing evaluations, we might see movement away from today’s situation where questionable financial incentives nearly guarantee that even lousy networks will win awards.
  • Ookla’s app has an enormous user base. The app may end up integrating some of RootMetrics’ data that otherwise wouldn’t be available to consumers.

Don’t Take Ookla Too Seriously

Ookla recently published its Q2 report on the performance of U.S. wireless networks. As I’ve discussed before, I’m not a fan of Ookla’s methodology.1 Because of my qualms, I’m not going to bother summarizing Ookla’s latest results. However, I do want to draw attention to a part of the recent report.

Ookla’s competitive geographies filter

In the last year or two, Ookla has restricted its main analyses to data from “competitive geographies.” Here’s how Ookla explains competitive geographies:

To meet the definition of ‘competitive’ in the U.S., a zip code must contain samples from at least three top national competitors…but no competitor can have more than 2/3 of the samples in that zip code.

The competitive geographies filter mitigates some of the problems with Ookla’s methodology but also introduces a bunch of new issues.
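To make the rule concrete, here’s a minimal sketch of the filter as Ookla describes it. The function name, carrier names, and sample counts are all hypothetical illustrations, not Ookla’s actual implementation:

```python
def is_competitive(samples_by_carrier, min_carriers=3, max_share=2/3):
    """Check whether a zip code counts as 'competitive' under Ookla's
    stated rule: samples from at least three top national competitors,
    and no single competitor with more than 2/3 of the samples."""
    counts = {c: n for c, n in samples_by_carrier.items() if n > 0}
    if len(counts) < min_carriers:
        return False
    total = sum(counts.values())
    return max(counts.values()) <= max_share * total

# Hypothetical sample counts for two zip codes:
urban = {"Verizon": 40, "AT&T": 35, "T-Mobile": 25}
rural = {"Verizon": 90, "AT&T": 10}  # area served mostly by Verizon

print(is_competitive(urban))  # True: three carriers, none over 2/3
print(is_competitive(rural))  # False: only two carriers have samples
```

Note what the rural case implies: any zip code dominated by a single carrier is excluded, which is exactly the behavior criticized below.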

Availability

Ookla’s latest results for 4G availability illustrate the issues:

Ookla 4G availability scores

Sprint unambiguously has the smallest coverage profile of the four nationwide networks.2 The competitive geographies filter makes Ookla’s availability metric so meaningless that Sprint can nevertheless tie for the best availability.

Lots of regions have coverage only from Verizon. All those data points get thrown away because they come from non-competitive geographies. Other areas have coverage from only Verizon and AT&T; those data points get excluded for the same reason. What’s the point of measuring availability while ignoring the areas where differences in network availability are most substantial?

Giving Ookla credit

Despite my criticisms, I want to give Ookla some credit. Many evaluators develop complicated, poorly thought-out metrics and only share those metrics when the results seem reasonable. I appreciate that Ookla didn’t hide its latest availability information because the results looked silly.