Thoughts on TopTenReviews

I’m not a fan.

TopTenReviews ranks products and services in a huge number of industries. Stock trading platforms, home appliances, audio editing software, and hot tubs are all covered.

TopTenReviews’ parent company, Purch, describes TopTenReviews as a service that offers “Expert reviews and comparisons.”1

Many of TopTenReviews’ evaluations open with lines like this:

We spent over 60 hours researching dozens of cell phone service providers to find the best ones.2

I’ve seen numbers between 40 and 80 hours in a handful of articles. It takes a hell of a lot more time to understand an industry at an expert level.

I’m unimpressed by TopTenReviews’ rankings in industries I’m knowledgeable about. This is especially frustrating since TopTenReviews often ranks well in Google.

A particularly bad example: indoor bike trainers. These devices turn regular bikes into stationary bikes that can be ridden indoors.

I love biking and used to ride indoor trainers a fair amount. I suspect the editor who came up with the trainer rankings at TopTenReviews couldn’t say the same.

The following paragraph appears under the heading “How we tested” on the page for bike trainers:

We’ve researched and evaluated the best roller, magnetic, fluid, wind and direct-drivebike [sic] trainers for the past two years and found the features that make the best ride for your indoor training. Our reviewers dug into manufacturers’ websites and engineering documents, asked questions of expert riders on cycling forums, and evaluated the pros and cons of features on the various models we chose for our product lineup. From there, we compared and evaluated the top models of each style to reach our conclusions.3

There’s no mention of using physical products.

The top overall trainer is the Kinetic Road Machine. It’s expensive but probably a good recommendation. I know lots of people who own that model or a similar one and really like their trainers.

However, I don’t find TopTenReviews credible. TopTenReviews lists pros and cons for the Kinetic Road Machine. One con is: “Not designed to handle 700c wheels.” It is.

It’s a big error. 700c is an incredibly common wheel size for road bikes. I’d bet the majority of people using trainers have 700c wheels.4 If the trainer wasn’t compatible with 700c wheels, it wouldn’t deserve the “best overall” designation.

TopTenReviews even states, “The trainer’s frame fits 22-inch to 29-inch bike wheels.” 700c wheels fall within that range. A bike expert would know that.


TopTenReviews’ website has concerning statements about its approach and methodology. An excerpt from their about page (emphasis mine):

Our tests gather data on features, ease of use, durability and the level of customer support provided by the manufacturer. Using a proprietary weighted system (i.e., a complicated algorithm), the data is scored and the rankings laid out, and we award the three top-ranked products with our Gold, Silver and Bronze Awards.5

Maybe TopTenReviews came up with an awesome algorithm no one else has thought of. I find it much more plausible that—if a single algorithm exists—the algorithm is private because it’s silly and easy to find flaws in.
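
For illustration, here’s a minimal sketch of what a generic weighted scoring system could look like. All of the metrics, weights, and scores below are hypothetical; TopTenReviews hasn’t published its actual algorithm.

```python
# Hypothetical weighted scoring system, loosely matching the description
# above. Metrics, weights, and scores are all made up for illustration.
WEIGHTS = {"features": 0.35, "ease_of_use": 0.30, "durability": 0.20, "support": 0.15}

products = {
    "Trainer A": {"features": 9, "ease_of_use": 7, "durability": 8, "support": 6},
    "Trainer B": {"features": 6, "ease_of_use": 9, "durability": 7, "support": 9},
    "Trainer C": {"features": 8, "ease_of_use": 8, "durability": 6, "support": 7},
}

def overall_score(scores):
    """Weighted sum of per-metric scores."""
    return sum(WEIGHTS[metric] * value for metric, value in scores.items())

# Rank by weighted score and hand out the top three awards.
ranked = sorted(products, key=lambda name: overall_score(products[name]), reverse=True)
for award, name in zip(["Gold", "Silver", "Bronze"], ranked):
    print(f"{award}: {name} ({overall_score(products[name]):.2f})")
```

There’s nothing sophisticated about a weighted sum. The contestable part is choosing the weights, which is exactly the part a company might prefer to keep private.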

TopTenReviews receives compensation from many of the companies it recommends. While this is a serious conflict of interest, it doesn’t mean all of TopTenReviews’ work is bullshit. However, I see this line on the about page as a red flag:

Methods of monetization in no way affect the rankings of the products, services or companies we review. Period.6

Avoiding bias is difficult. Totally eliminating it is almost always unrealistic.

Employees doing evaluations will sometimes have a sense of how lucrative it will be for certain products to receive top recommendations. These employees would probably be correct to bet that they’ll sometimes be indirectly rewarded for creating content that’s good for the company’s bottom line.

Even if the company is being careful, bias can creep in insidiously. Someone has to decide what the company’s priorities will be. Even if reviewers don’t do anything dishonest, the company strategy will probably entail doing evaluations in industries where high-paying affiliate programs are common.

Reviews will need occasional updates. Won’t updates that could shift high-commission products to higher rankings take priority?

TopTenReviews has a page on foam mattresses that can be ordered online. I’ve bought two extremely cheap Zinus mattresses on Amazon.7 I’ve recommended these mattresses to a bunch of people. They’re super popular on Amazon.8 TopTenReviews doesn’t list Zinus.9

Perhaps it’s because other companies offer huge commissions.10 I recommend The War To Sell You A Mattress Is An Internet Nightmare for more about how commissions shadily distort mattress reviews. It’s a phenomenal article.

R-Tools Technology Inc. has a great article discussing their software’s position in TopTenReviews’ rankings, misleading information communicated by TopTenReviews, and conflicts of interest.

The article suggests that TopTenReviews may have declined in quality over the years:

In 2013, changes started to happen. The two principals that had made TopTenReviews a household name moved on to other endeavors at precisely the same time. Jerry Ropelato became CEO of WhiteClouds, a startup in the 3D printing industry. That same year, Stan Bassett moved on to Alliance Health Networks. Then, in 2014, the parent company of TopTenReviews rebranded itself from TechMediaNetwork to Purch.

Purch has quite a different business model than TopTenReviews did when it first started. Purch, which boasted revenues of $100 million in 2014, has been steadily acquiring numerous review sites over the years, including TopTenReviews, Tom’s Guide, Tom’s Hardware, Laptop magazine, HowtoGeek, MobileNations, Anandtech, WonderHowTo and many, many more.11

I don’t think I would have loved the pre-2013 website, but I think I’d have more respect for it than today’s version of TopTenReviews.

I’m not surprised TopTenReviews can’t cover hundreds of product types and consistently provide good information. I wish Google didn’t let it rank so well.

Issues with Consumer Reports’ 2017 Cell Phone Plan Rankings


Consumer Reports offers ratings of cellular service providers based on survey data collected from Consumer Reports subscribers. Through subscriber surveying in 2017, Consumer Reports collected data on seven metrics:1

  1. Value
  2. Data service quality
  3. Voice service quality
  4. Text service quality
  5. Web service quality
  6. Telemarketing call frequency
  7. Support service quality

The surveys collected data from over 100,000 subscribers.2 I believe Consumer Reports would frown upon a granular discussion of the exact survey results, so I’ll remain vague about exact ratings in this post. Consumer Reports subscribers who would like to see the full survey results can do so here.

Survey results

Results are reported for 20 service providers. Most of these providers are mobile virtual network operators (MVNOs). MVNOs don’t operate their own network hardware but make use of other companies’ networks. For the most part, MVNOs use networks provided by the Big Four (Verizon, Sprint, AT&T, and T-Mobile).

Interestingly, the Big Four do poorly in Consumer Reports’ evaluation. Verizon, AT&T, and Sprint receive the lowest overall ratings and take the last three spots. T-Mobile doesn’t do much better.

This is surprising. The Big Four do terribly, even though MVNOs use the Big Four’s networks. Generally, I would expect the Big Four to offer their direct subscribers network access that is as good as or better than what MVNO subscribers receive.

It’s possible that the good ratings can be explained by MVNOs offering prices and customer service far better than the Big Four’s—making them deserving of the high ratings for reasons separate from network quality.

Testing the survey’s validity

To test the reliability of Consumer Reports’ methodology, we can compare MVNOs to the Big Four using only the network-quality metrics (ignoring measures of value, telemarketing call frequency, and support quality). In many cases, MVNOs use more than one of the Big Four’s networks. However, several MVNOs use only one network, allowing for easy apples-to-apples comparisons.3

  • Boost Mobile is owned by Sprint.
  • Virgin Mobile is owned by Sprint.
  • Cricket Wireless is owned by AT&T.
  • MetroPCS is owned by T-Mobile.
  • GreatCall runs exclusively on Verizon’s network.
  • Page Plus Cellular runs exclusively on Verizon’s network.

When comparing network quality ratings between these MVNOs and the companies that run their networks:

  • Boost Mobile’s ratings beat Sprint’s ratings in every category.
  • Virgin Mobile’s ratings beat Sprint’s ratings in every category.
  • Cricket Wireless’s ratings beat or tie AT&T’s ratings in every category.
  • MetroPCS’s ratings beat or tie T-Mobile’s ratings in every category.
  • GreatCall doesn’t have a rating for web quality due to insufficient data. GreatCall’s ratings match or beat Verizon in the other categories.
  • Page Plus Cellular doesn’t have a rating for web quality due to insufficient data. Page Plus’ ratings match or beat Verizon in the other categories.
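
The comparison itself is mechanical. Here’s a minimal sketch of the logic using placeholder ratings (not Consumer Reports’ actual survey values, which I’m deliberately not reproducing):

```python
# Compare each single-network MVNO with its host carrier on the
# network-quality metrics. Ratings below are placeholders, not
# Consumer Reports' actual survey values.
NETWORK_METRICS = ["data", "voice", "text"]

ratings = {
    "Sprint":        {"data": 2, "voice": 3, "text": 3},
    "Boost Mobile":  {"data": 4, "voice": 4, "text": 5},
    "Virgin Mobile": {"data": 3, "voice": 4, "text": 4},
}

pairs = [("Boost Mobile", "Sprint"), ("Virgin Mobile", "Sprint")]  # (MVNO, host)

for mvno, host in pairs:
    for metric in NETWORK_METRICS:
        diff = ratings[mvno][metric] - ratings[host][metric]
        verdict = "beats" if diff > 0 else "ties" if diff == 0 else "trails"
        print(f"{mvno} {verdict} {host} on {metric} quality")
```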
Taken at face value, these are odd results. There are complicated stories you could tell to salvage the results, but I think it’s much more plausible that Consumer Reports’ surveys just don’t work well for evaluating the relative quality of cell phone service providers.

Why aren’t the results reliable?

I’m not sure why the surveys don’t work, but I see three promising explanations:

  • Metrics may not be evaluated independently. For example, consumers might take a service’s price into account when providing a rating of its voice quality.
  • Lack of objectivity. Consumers may not evaluate services objectively. Perhaps consumers are aware of some sort of general stigma about Sprint that unfairly affects how they evaluate Sprint’s quality (but that same stigma may not be applied to MVNOs that use Sprint’s network).
  • Selection bias. Individuals who subscribe to one carrier are probably, on average, different from individuals who subscribe to another carrier. Perhaps individuals who have used Carrier A tend to use small amounts of data and are lenient when rating data service quality. Individuals who have used Carrier B may get more upset about data quality issues. Consumer Cellular took the top spot in the 2017 rankings. I don’t think it’s coincidental that Consumer Cellular has pursued branding and marketing strategies to target senior citizens.4

Consumer Reports’ website gives the impression that their cell phone plan rankings will be reliable for comparison purposes.5 They won’t be.

The ratings do capture whether survey respondents are happy with their services. However, the ratings have serious limitations for shoppers trying to assess whether they’ll be satisfied with a given service.

I suspect Consumer Reports’ ratings for other product categories that rely on similar surveys will also be unreliable. However, the concerns I’m raising only apply to a subset of Consumer Reports’ evaluations. A lot of Consumer Reports’ work is based on product testing rather than consumer surveys.

Third-party Evaluation: Trophies for Everyone!

A lot of third-party evaluations are not particularly useful. Let’s look at HostGator, one of the larger players in the shared web hosting industry, for some examples. For a few years, HostGator had an awards webpage that proudly listed all of the awards it “won.”

Many of the entities issuing awards were obviously affiliate sites that didn’t provide anything even vaguely resembling rigorous evaluation.

Fortunately, HostGator’s current version of the page is less ridiculous.

Even evaluations carried out by serious, established entities often have problems. Rigorous evaluation tends to be difficult. Accordingly, third-party evaluators generally use semi-rigorous methodologies—i.e., methodologies that have merit but also serious flaws.

In many industries, there will be several semi-rigorous evaluators using different methodologies. When an evaluator enters an industry, it will have to make a lot of decisions about its methods:

  • Should products be tested directly or should consumers be surveyed?
  • What metrics should be measured? How should those metrics be measured?
  • If consumers are surveyed, how should the surveyed population be selected?
  • How should multiple metrics be aggregated into an overall rating?

These are tough questions that don’t have straightforward answers.

Objective evaluation is often impossible. Products and services may have different characteristics that matter to consumers—for example, download speed and call quality for cell phone services. There’s no defensible, objective formula you can use to assess how important one characteristic’s quality is versus another.

There’s a huge range of possible, defensible methods that evaluators can use. Different semi-rigorous methods will lead to different rankings of overall quality. This can lead to situations where every company in an industry can be considered the “best” according to at least one evaluation method.

In other words: Everyone gets a trophy!
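
To make that concrete, here’s a toy example. The same hypothetical metric scores produce two different “best” carriers under two equally defensible weighting schemes:

```python
# Two carriers, two metrics, two defensible weighting schemes.
# All numbers are hypothetical; the point is that the "winner"
# depends entirely on which weighting scheme an evaluator picks.
scores = {
    "Carrier A": {"download_speed": 9, "call_quality": 6},
    "Carrier B": {"download_speed": 6, "call_quality": 9},
}

schemes = {
    "speed-focused":   {"download_speed": 0.7, "call_quality": 0.3},
    "quality-focused": {"download_speed": 0.3, "call_quality": 0.7},
}

for scheme_name, weights in schemes.items():
    best = max(scores, key=lambda c: sum(weights[m] * scores[c][m] for m in weights))
    print(f"Under the {scheme_name} scheme, the 'best' carrier is {best}")
# Carrier A wins under one scheme; Carrier B wins under the other.
```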

This phenomenon occurs in the market for cell phone carriers. At the time of writing, Verizon, AT&T, T-Mobile, and Sprint all get at least one legitimate evaluator’s approval. (More details in The Mobile Phone Service Confusopoly.)

Evaluators are often compensated in exchange for permission to use their results and/or logos in advertisements. Unfortunately, details on the specific financial arrangements between evaluators and the companies they recommend are often private.

Here are a few publicly known examples:

  • Businesses must pay a fee before displaying Better Business Bureau (BBB) logos in their advertisements.1
  • J.D. Power is believed to charge automobile companies for permission to use its awards in advertisements.2
  • AARP-approved providers pay royalties to AARP.3

An organization that is advertising an endorsement from the most rigorous evaluator in its field probably won’t be willing to pay a lot to advertise an endorsement from a second evaluator. A company with no endorsements will probably be much more willing to pay for its first endorsement.

Since there are many possible, semi-rigorous evaluation methodologies, maybe we should expect at least one evaluator to look kindly upon each major company in an industry. This phenomenon could even occur without any evaluator deliberately acting dishonestly. For example, lots of evaluators might try their hand at evaluation in a given industry. Each evaluator would use its own method. If an evaluator came out in favor of a company that didn’t have an endorsement, the evaluator would be rewarded monetarily and continue to evaluate within the industry. If an evaluator came out in favor of a company that already had an endorsement, the evaluator could exit the industry.
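
Here’s a toy simulation of that dynamic, with each evaluator’s methodology reduced to a random preference over companies. No evaluator cheats, yet every company eventually ends up with an endorsement:

```python
import random

# Toy model of the dynamic described above: each evaluator honestly applies
# its own methodology, modeled here as a random preference over companies.
# Evaluators whose winner is already endorsed exit; the rest stay and get paid.
random.seed(0)
companies = ["Carrier A", "Carrier B", "Carrier C", "Carrier D"]
endorsed = set()
evaluators = 0

while len(endorsed) < len(companies):
    evaluators += 1
    winner = random.choice(companies)  # this evaluator's methodology favors someone
    if winner not in endorsed:
        endorsed.add(winner)           # first endorsement pays; evaluator stays
    # otherwise the evaluator quietly exits the industry

print(f"All {len(companies)} companies endorsed after {evaluators} evaluators tried")
```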

Bogus Evaluation Websites

Sturgeon’s law: Ninety percent of everything is crap.1

Rankings & reviews online

The internet is full of websites that ostensibly rank, rate, and/or review companies within a given industry. Most of these websites are crappy. Generally, these ranking websites cover industries where affiliate programs offering website owners large commissions are common.

Here are a few examples of industries and product categories where useless review websites are especially common:

  • Credit cards
  • Web hosting services
  • Online fax services
  • VoIP services
  • VPN services
  • Foam mattresses

If you Google a query along the lines of “Best [item from the list above],” you’ll likely receive a page of search results with a number of “top 10 list” type sites, and the ads at the top of the results will probably come from similar sites.

Lack of in-depth evaluation methodologies

Generally, these “review” sites don’t go into any kind of depth to assess companies. As far as I can tell, rankings tend to be driven primarily by a combination of randomness and the size of commissions offered.

Admittedly, it’s silly to think that the evaluation websites found via Google’s ads would be reliable. Unfortunately, the regular (non-ad) search results for the same query also include a lot of garbage “review” websites.

Most of these websites don’t offer evaluation methodologies that deserve to be taken seriously.

Even the somewhat reputable names on the list (i.e., CNET and PCMag) don’t offer a whole lot. Neither CNET nor PCMag clearly explains its methodology, and the written content doesn’t lead me to believe either entity evaluated the services in depth.2

Fooling consumers

If consumers easily recognized these bogus evaluation websites for what they are, the websites would just be annoyances. Unfortunately, it looks like a substantial portion of consumers don’t realize these websites lack legitimacy.

Google offers a tool that presents prices that “advertisers have historically paid for a keyword’s top of page bid.” According to this tool, advertisers frequently pay several dollars per click on the kinds of queries that return ads for bogus evaluation websites.3

We should expect that advertisers will only be willing to pay for ads when the expected revenue per click is greater than the cost per click. The significant costs paid per click suggest that a non-trivial portion of visitors to bogus ranking websites end up purchasing from one of the suggested companies.
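
The back-of-the-envelope arithmetic, using made-up but plausible numbers: an advertiser paying $8 per click who earns a $100 commission per sale breaks even as long as at least 8% of clicks convert.

```python
# Break-even arithmetic for a ranking-site advertiser.
# Both numbers are illustrative assumptions, not figures from Google's tool.
cost_per_click = 8.00         # dollars paid to Google per visitor
commission_per_sale = 100.00  # affiliate commission per referred purchase

# Ads only make sense when expected revenue per click exceeds cost per click:
#   conversion_rate * commission_per_sale > cost_per_click
break_even_conversion = cost_per_click / commission_per_sale
print(f"Break-even conversion rate: {break_even_conversion:.1%}")  # 8.0%
```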

How biased are evaluation websites found via Google?

Let’s turn to another industry. The VPN industry shares a lot of features with the web hosting industry. Both VPN and web hosting services tend to be sold online at reasonably low prices with recurring billing cycles. Affiliate programs are very common in both industries.

There’s an awesome third-party website, ThatOnePrivacySite.net, that assesses VPN services and refuses to accept commissions.4 ThatOnePrivacySite has reviewed over thirty VPN services. At the time of writing, only one, Mullvad, has received a “TOPG Choice” award,5 indicating an excellent review.6

Interestingly, Mullvad doesn’t have an affiliate program. That allowed me to perform a little experiment. I Googled the query “Best VPN service” and received 15 results pointing to websites that ranked VPN services.

Six of the results came from paid ads. None of those six websites listed Mullvad.

Of the nine websites in the organic results, only three listed Mullvad:7

  • Tom’s Guide
  • TheBestVPN.com
  • PCWorld