
Consumer Reports’ Broken Cell Service Rankings

Several months ago, I published a blog post arguing that Consumer Reports’ cell phone rankings were broken. This month, Consumer Reports updated those rankings with data from another round of surveying its subscribers. The rankings are still broken.

Consumer Reports slightly changed its approach this round. While Consumer Reports used to share results on 7 metrics, it now uses 5 metrics:

  1. Value
  2. Customer support
  3. Data
  4. Reception
  5. Telemarketing call frequency

Of the 19 carriers Consumer Reports assesses, only 5 operate their own network hardware.1 The other 14 carriers resell access to other companies’ networks while maintaining their own customer support teams and retail presences.2

Several of the carriers that don’t run their own network offer service over only one host network:

  • Cricket Wireless – AT&T’s network
  • Page Plus Cellular – Verizon’s network
  • MetroPCS – T-Mobile’s network
  • CREDO Mobile – Verizon’s network
  • Boost Mobile – Sprint’s network
  • GreatCall – Verizon’s network
  • Virgin Mobile – Sprint’s network

To test the validity of Consumer Reports’ methodology, we can compare each of these carriers against its host network on the metrics that assess network quality. At first glance, both the reception and data metrics look like they should be exclusively about network quality. However, the data scores account for value as well as quality:3

    “Data service indicates overall experience (e.g., cost, speed, reliability) with the data service.”

I think it was a methodological mistake to account for value within the data metric and then to account for value again in the value metric. That leaves us with only the reception scores.4
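
To see why that blending matters, consider a toy model. The weighting and the numbers below are my own illustration, not Consumer Reports’ actual methodology: once a “data” score mixes cost satisfaction with network quality, a slower but cheaper network can outscore a faster one.

```python
def data_score(speed, cost_satisfaction, quality_weight=0.5):
    """Blend network quality and cost satisfaction into one 1-5 score
    (hypothetical weighting, for illustration only)."""
    return quality_weight * speed + (1 - quality_weight) * cost_satisfaction

print(data_score(speed=5, cost_satisfaction=2))  # 3.5 -- fast but pricey network
print(data_score(speed=3, cost_satisfaction=5))  # 4.0 -- slower, cheaper network wins
```

A score built this way tells us which trade-off respondents liked, not which network is better.

Here are the scores the four host operators get for reception: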

  • Verizon – Good
  • T-Mobile – Fair
  • AT&T – Poor
  • Sprint – Poor

How do those companies’ scores compare to scores earned by carriers that piggyback on their networks?

  • Cricket Wireless has good reception while AT&T has poor reception.
  • Page Plus and Verizon both have good reception.
  • MetroPCS has good reception while T-Mobile has fair reception.
  • CREDO and Verizon both have good reception.
  • Boost has very good reception while Sprint has poor reception.
  • GreatCall and Verizon both have good reception.
  • Virgin has good reception while Sprint has poor reception.
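
As a sanity check, here is a short Python sketch that tallies the comparison above. The ordinal scale (Poor < Fair < Good < Very Good < Excellent) and its numeric coding are my own assumption for illustration, not Consumer Reports’:

```python
# Numeric coding of the ordinal ratings (assumed scale, illustration only)
RATING = {"Poor": 1, "Fair": 2, "Good": 3, "Very Good": 4, "Excellent": 5}

# Reception ratings transcribed from the lists above
host_reception = {"Verizon": "Good", "T-Mobile": "Fair",
                  "AT&T": "Poor", "Sprint": "Poor"}
mvno_reception = {
    "Cricket Wireless": ("AT&T", "Good"),
    "Page Plus Cellular": ("Verizon", "Good"),
    "MetroPCS": ("T-Mobile", "Good"),
    "CREDO Mobile": ("Verizon", "Good"),
    "Boost Mobile": ("Sprint", "Very Good"),
    "GreatCall": ("Verizon", "Good"),
    "Virgin Mobile": ("Sprint", "Good"),
}

for mvno, (host, rating) in mvno_reception.items():
    gap = RATING[rating] - RATING[host_reception[host]]
    verdict = "beats" if gap > 0 else "ties" if gap == 0 else "trails"
    print(f"{mvno} {verdict} {host} ({rating} vs. {host_reception[host]})")
```

Running it yields four “beats” and three “ties”; not a single MVNO trails its host.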

In the majority of cases, carriers beat their host networks. The massive gaps between Cricket/AT&T and Boost/Sprint are especially concerning. In no case does a host operator beat a carrier that piggybacks on its network. I would have expected the opposite outcome, since host networks generally give their direct subscribers higher priority when networks are congested.

The rankings are broken.

What’s the problem?

I see two especially plausible explanations for why the survey results aren’t valid for comparison purposes:

  • Non-independent scoring – Respondents may take prices into account when assessing metrics other than value. If that happens, scores won’t be valid for comparisons across carriers.
  • Selection bias – Respondents were not randomly selected to try certain carriers. Accordingly, respondents who use a given carrier probably differ systematically from respondents who use another carrier. Differences in scores between two carriers could reflect either (a) genuine differences in service quality or (b) differences in the type of people who use each service. A toy simulation after this list illustrates the point.
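
To make the selection-bias concern concrete, here is that toy simulation; every number in it is invented. Both simulated carriers deliver identical underlying service quality, but the respondents one carrier attracts grade the same experience more generously:

```python
import random

random.seed(0)
TRUE_QUALITY = 3.0  # identical underlying quality for both carriers (1-5 scale)

def survey(n, leniency):
    """Average rating from n simulated respondents. 'leniency' models a
    systematic difference in who chooses, and therefore rates, a carrier."""
    scores = [min(5.0, max(1.0, TRUE_QUALITY + leniency + random.gauss(0, 1)))
              for _ in range(n)]
    return sum(scores) / n

print(round(survey(1000, leniency=0.0), 2))  # host network's respondents
print(round(survey(1000, leniency=0.5), 2))  # hypothetically more lenient MVNO respondents
```

The second average comes out roughly half a point higher even though the simulated quality is identical; the gap is driven entirely by who answers the survey.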

Consumer Reports, please do better!

My earlier blog post about Consumer Reports’ methodology is one of the most popular articles I’ve written. I’m nearly certain staff at Consumer Reports have read it. I’ve tried to reach out to Consumer Reports through two different channels. First, I was ignored. Later, I got a response indicating that an editor might reach out to me. So far, that hasn’t happened.

I see three reasonable ways for Consumer Reports to respond to the issues I’ve raised:

  • Adjust the survey methodology.
  • Cease ranking cell phone carriers.
  • Continue with the existing methodology, but mention its serious problems prominently when discussing results.

Continuing to publish rankings based on a broken methodology without disclosing its problems is irresponsible.

Footnotes

  1. AT&T, Sprint, T-Mobile, U.S. Cellular, and Verizon.
  2. These companies are commonly called “mobile virtual network operators” (MVNOs). I explain MVNOs in more detail here.
  3. The excerpted text appeared on Consumer Reports’ rankings page when hovering over the word “data.” The page is accessible only to Consumer Reports subscribers.
  4. The reception metric was described with the following text on 8/6/2019:
    “Rating of adequacy of cell phone reception both inside and outside the home.”
