Consumer Reports’ Fundraising Gimmicks

You better cut the pizza in four pieces because I’m not hungry enough to eat six.
– Yogi Berra (allegedly)

The other day, I received a mailing from Consumer Reports. It was soliciting contributions for a raffle fundraiser. The mailing had nine raffle tickets in it. Consumer Reports was requesting that I send back the tickets with a suggested donation of $9 (one dollar for each ticket). The mailing had a lot of paper:

The raffle’s grand prize was the winner’s choice of an undisclosed, top-rated car or $35,000 in cash. A number of smaller prizes brought the total amount up for grabs to about $50,000.

The materials included a lot of gimmicky text:

  • “If you’ve been issued the top winning raffle number, then 1 of those tickets is definitely the winner of a top-rated car — or $35,000 in cash.”
  • “Why risk throwing away what could be a huge pay day?”
  • “There’s a very real chance you could be the winner of our grand prize car!”

Consumer Reports also indicates that they’ll send a free, surprise gift to anyone who donates $10 or more. It feels funny to donate money hoping that I might win more than I donate, but I get it. Fundraising gimmicks work. That said, I get frustrated when fundraising gimmicks are dishonest.

One of the papers in the mailing came folded with print on each side. Here’s the front:

On the other side, I found a letter from someone involved in Consumer Reports’ marketing. The letter argues that it would be silly for me not to find out if I received winning tickets:

It amazes me that among the many people who receive our Consumer Reports Raffle Tickets — containing multiple tickets, mind you, not just one — some choose not to mail them in. And they do this, despite the fact there is no donation required for someone to find out if he or she has won…So when people don’t respond it doesn’t make any sense to me at all.

The multiple tickets bit is silly. It’s like the Yogi Berra line at the opening of the post; cutting a pizza into more slices doesn’t create more food. It doesn’t matter how many tickets I have unless I hold a larger share of the total ticket pool than the typical recipient.
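To make the pizza point concrete, here’s a minimal sketch (with an invented recipient count; nothing here comes from the mailing) showing that handing everyone more tickets leaves each person’s odds untouched:

```python
# Invented number of raffle recipients, for illustration only.
recipients = 1_000_000

def win_probability(tickets_per_person):
    """One person's chance of holding the winning ticket when
    every recipient gets the same number of tickets."""
    total_tickets = recipients * tickets_per_person
    return tickets_per_person / total_tickets

print(win_probability(1))  # 1e-06
print(win_probability(9))  # 1e-06 -- nine tickets, identical odds
```

More slices, same pizza: nine tickets only help if you get more tickets than everyone else does.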

Come on. Consumer Reports doesn’t care if a non-donor decides not to turn in tickets. What’s the most plausible explanation for why Consumer Reports includes the orange letter? People who would otherwise ignore the mailing sometimes end up feeling guilty enough to make a donation. Checking the “I choose not to donate at this time, but please enter me in the Raffle” box on the envelope doesn’t feel great.

Writing my name on each ticket, reading the materials, and mailing the tickets all take time. My odds of winning are low. Stamps cost money.

Let’s give Consumer Reports the benefit of the doubt and pretend that the only reason not to participate is that stamps cost money. The appropriate stamp costs 55 cents at the moment.1 Is the expected reward for sending in the tickets greater than 55 cents?

Consumer Reports has about 6 million subscribers.2 Let’s continue to give Consumer Reports the benefit of the doubt and assume it can print everything, send mailings, handle the logistics of the raffle, and send gifts back to donors for only $0.50 per subscriber. That puts the promotion’s cost at about $3 million. The $50,000 of prizes is trivial in comparison. Let’s further assume that Consumer Reports runs the promotion expecting that additional donations the promotion brings in will cover the promotion’s cost.

The suggested donation is $9. Let’s say the average additional funding brought in by this campaign comes out to $10 per respondent.3 To break even, Consumer Reports needs about 300,000 respondents.

With 300,000 respondents, nine tickets each, and $50,000 in prizes, the expected return is about 1.85 cents per ticket.4 About 17 cents per person.5 Not even close to the cost of a stamp.
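Here’s the whole back-of-the-envelope calculation in one place, using the assumptions above (the $0.50 per-subscriber cost and $10 per-respondent donation are my guesses, not Consumer Reports’ numbers):

```python
# Assumptions carried over from the text above -- all rough guesses.
subscribers = 6_000_000
cost_per_subscriber = 0.50            # printing, mailing, logistics, gifts
avg_donation_per_respondent = 10.00
tickets_per_respondent = 9
total_prizes = 50_000

promotion_cost = subscribers * cost_per_subscriber                     # $3,000,000
breakeven_respondents = promotion_cost / avg_donation_per_respondent   # 300,000

total_tickets = breakeven_respondents * tickets_per_respondent         # 2,700,000
per_ticket = total_prizes / total_tickets                              # ~$0.0185
per_person = per_ticket * tickets_per_respondent                       # ~$0.167

print(f"Expected return per ticket: ${per_ticket:.4f}")
print(f"Expected return per person: ${per_person:.2f}")  # well under a 55-cent stamp
```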


4/12/2019 Update: I received a second, almost-identical mailing in early April.

10/3/2019 Update: I received a few more of these mailings.

Average Download Speed Is Overrated

I’ve started looking into the methodologies used by entities that collect cell phone network performance data. I keep seeing an emphasis on average (or median) download and upload speeds when data-service quality is discussed.

  • Opensignal bases its data-experience rankings exclusively on download and upload speeds.1
  • Tom’s Guide appears to account for data quality using average download speeds and possibly upload speeds.2
  • RootMetrics doesn’t explicitly disclose how it arrives at final data-performance scores, but it emphasizes median upload and download speeds.3

It’s easy to understand what average and median speeds represent. Unfortunately, these metrics fail to capture something essential—variance in speeds.

For example, Opensignal’s latest report for U.S. networks shows that Verizon has the fastest average download speed in the Chicago area at 31 Mbps. AT&T’s average download speed is only 22 Mbps in the same area. Both of those speeds are easily fast enough for typical activities on a phone. At 22 Mbps, I could stream video, listen to music, or browse the internet seamlessly. On the rare occasion when I download a 100MB file, Verizon’s network at the average speed would beat AT&T’s by about 10.6 seconds.4 Not a big deal for something I do maybe once a month.
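For concreteness, here’s the arithmetic behind that 10.6-second figure (a 100MB file is 800 megabits):

```python
# 100 MB file; 8 bits per byte -> 800 megabits.
file_size_megabits = 100 * 8

verizon_mbps = 31
att_mbps = 22

verizon_seconds = file_size_megabits / verizon_mbps  # ~25.8 s
att_seconds = file_size_megabits / att_mbps          # ~36.4 s

print(f"Verizon: {verizon_seconds:.1f} s, AT&T: {att_seconds:.1f} s")
print(f"Difference: {att_seconds - verizon_seconds:.1f} s")  # ~10.6
```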

On the other hand, variance in download speeds can matter quite a lot. If I have 31 Mbps speeds on average, but I occasionally have sub-1 Mbps speeds, it may sometimes be annoying or impossible to use my phone for browsing and streaming. Periodically having 100+ Mbps speeds would not make up for the inconvenience of sometimes having low speeds. I’d happily accept a modest decrease in average speeds in exchange for a modest decrease in variance.5
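To illustrate, here’s a toy comparison with invented speed samples: two networks with roughly the same average, where only one of them would ever leave me staring at a loading spinner:

```python
import statistics

# Invented download-speed samples in Mbps -- similar means, very different variance.
steady_network = [28, 30, 31, 32, 29, 33, 31, 30]
spiky_network = [0.5, 110, 0.8, 95, 0.6, 15, 24, 0.7]

for name, speeds in [("steady", steady_network), ("spiky", spiky_network)]:
    print(f"{name}: mean {statistics.mean(speeds):.1f} Mbps, "
          f"worst {min(speeds)} Mbps")
```

The averages are nearly identical, but the spiky network’s sub-1 Mbps stretches are exactly the failures that a mean or median hides.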

[Image: deceptive fish]

I’m Not Unbiased

Warning: This post is a rant and contains foul language. Enjoy!


Tons of research suggests that people engage in deception and self-deception all the damn time. People are biased.

Despite this, pretty much every website offering reviews makes claims of objectivity and independence. These websites don’t claim that they try to minimize bias. They claim to actually be unbiased.

Let’s take a look at an excerpt from TopTenReviews, a high-traffic review site:

Methods of monetization in no way affect the rankings of the products, services or companies we review. Period.

Bullshit. Total bullshit.

I’ve ranted enough in the past about run-of-the-mill websites offering bogus evaluations. What about websites that have reasonably good reputations?

NerdWallet

NerdWallet publishes reviews and recommendations related to financial services.

Looking through NerdWallet’s website, I find this (emphasis mine):1

The guidance we offer, info we provide, and tools we create are objective, independent, and straightforward. So how do we make money? In some cases, we receive compensation when someone clicks to apply, or gets approved for a financial product through our site. However, this in no way affects our recommendations or advice. We’re on your side, even if it means we don’t make a cent.

Again, bullshit.

NerdWallet meets Vanguard

Stock brokerages are one of the types of services that NerdWallet evaluates.

One of the most orthodox pieces of financial advice—with widespread support from financial advisors, economists, and the like—is that typical individuals who invest in stocks shouldn’t actively pick and trade individual stocks.2 This position is often expressed with advice like: “Buy and hold low-cost index funds from Vanguard.”

Vanguard has optimized for keeping fees low and giving its clients a rate of return very close to the market’s rate of return.3 Since Vanguard keeps costs low, it cannot pay NerdWallet the kind of referral commissions that high-fee investment platforms offer.

What happens when NerdWallet evaluates brokers? Vanguard gets 3 out of 5 stars.4 It’s the worst rating for a broker I’ve seen on the site.5

NerdWallet slams Vanguard for not offering the sort of stuff Vanguard’s target audience doesn’t want. Vanguard gets the worst possible ratings in the “Promotions” and “Trading platform” categories. Why? Vanguard doesn’t offer those things.6

Imagine someone going to a nice restaurant and complaining that the restaurant’s steak doesn’t come with cake frosting. NerdWallet is doing something similar.

The following excerpt comes from NerdWallet’s Vanguard review (emphasis mine):

Ask yourself this question: Are you part of Vanguard’s target audience of retirement investors with a relatively high account balance? If so, you’ll likely find no better home. You really can’t beat the company’s robust array of low-cost funds.

Investors who fall outside of that audience — those who can’t meet the fund minimums or want to regularly trade stocks — should look for a broker that better caters to those needs.

Vanguard’s minimum is $1,000.7 You shouldn’t buy stocks if you have less than $1,000 to put into stocks! If you invest in stocks, you shouldn’t regularly trade individual stocks!8

From my perspective, NerdWallet is saying that if you are (a) the typical kind of person that should be buying stocks and (b) you don’t use a stupid strategy, then “you really can’t beat [Vanguard].”

So there we have it. Despite the lousy review, NerdWallet correctly recognizes that Vanguard is awesome.

NerdWallet didn’t really lie, but NerdWallet is definitely biased.9

[Image: thumbs down]

WireCutter

Sometimes evaluators aim to create divisions between editorial content (e.g., review writing) and revenue generation. I think divisions of this sort are a good idea, but they are not magic bullets.

WireCutter is one of my favorite review sites, but it makes the mistake of overemphasizing how much divisions can do to reduce bias:10

We pride ourselves on following rigorous journalistic standards and ethics, and we maintain editorial independence from our business operations. Our recommendations are always made entirely by our editorial team without input from our revenue team, and our writers and editors are never made aware of any business relationships.

I believe WireCutter takes actions to encourage editorial independence. However, I’m skeptical of how the commitment to editorial integrity is described. Absent extreme precautions, people talk. Information flows between coworkers. Even if editors aren’t explicitly informed about financial arrangements, it’s easy for editors to make educated guesses.11

Bias is sneaky

Running Coverage Critic, I face all sorts of decisions unrelated to accuracy or honesty where bias still has the potential to creep in. For example, in what order should the cell phone plans I recommend be displayed? Alphabetically? Randomly? One of those options will be more profitable than the other.

I don’t have perfect introspective access to what happens in my head. A minute ago, I scratched my nose. I can’t precisely explain exactly how or why I chose to do that. It just happened. Similarly, I don’t always know when and how biases affect my decisions.

I’m biased

I have conflicts of interest. Companies I recommend sometimes pay me commissions. You can take a look at the arrangements here.

I’ve tried to align my incentives with consumers by building my brand around commitments to transparency and rigor. I didn’t make these commitments for purely altruistic reasons. If the branding strategy succeeds, I stand to benefit.

Even with my branding strategy, my alignment with consumers will never be perfect. I’ll still be biased. If you ever think I could be doing better, please let me know.