The Great Weight Loss Program Wars – Part 3: Consumer Reports Bamboozled

In its June 2011 issue, Consumer Reports compared seven commercial weight loss programs – Atkins, Jenny Craig, Nutrisystem, Ornish, Slim-Fast, Weight Watchers, and Zone.

And the winner was … Jenny Craig.  Why?  According to the article, “What gave it the edge over the other big names we assessed” was, believe it or not, that researchaganda puff piece published in the Journal of the American Medical Association (JAMA) that we discussed in the last post.

“92% of the participants stuck with the Jenny Craig program for two years – a remarkable level of adherence…,” they gushed in explaining why Jenny Craig was the winner.

Well, the you-know-what immediately hit the fan.  Not a few people saw what was going on and took Consumer Reports to task.  Unfortunately for Consumer Reports, National Public Radio got wind of the story and asked what was up. You can ignore a lot of critics, but you had best not ignore NPR.

Now one would expect that Consumer Reports, champion of the consumer and critic of corporate malfeasance and baloney, would immediately recognize its mistake, own up to it, and go back and rework its ratings.  Sadly, it instead dug in, circled the wagons, and went on the defensive – a response little different from that of the major corporations it so regularly criticizes.

Its response to NPR on May 10, intentionally or unintentionally, avoids the real issue and leads us on a wild goose chase.  It reads in part:  “All the major diet plans have funded their own research. Some of the studies that feed into our ratings, such as the BBC Diet Trials published in BMJ on June 17, 2006, Christopher Gardner’s “A to Z Weight Loss Study” published in JAMA on March 7, 2007, and Michael Dansinger’s comparison of Atkins, Ornish, Weight Watchers, and Zone published in the Jan. 5, 2005 JAMA, were independently conducted.”

By May 18, Consumer Reports seems to have figured out that it had a more serious problem on its hands and posted a more thoughtful and slightly less defensive response on its website.  It acknowledged that the New York Times and others (including an editorial in the same JAMA issue) had faulted it for ignoring a study published in 2007 in the International Journal of Obesity that followed 60,164 participants in the Jenny Craig Platinum program for one year.

That study found that, far from the “92% of the participants stuck with the Jenny Craig program for two years – a remarkable level of adherence…” reported in the JAMA researchaganda piece, the real-world retention rate for the average participant was “42% at 13 weeks, 22% at 26 weeks and 6.6% at 52 weeks.”  In other words, the dropout rate at the end of the year was a massive 93.4%.

And so did Consumer Reports now acknowledge its mistake?  Not at all.  It continued to follow the big-corporation “defend at all costs” playbook, explaining how its statistical sorcerers had somehow managed to massage the data, include the 2007 study, and still figure out a way to justify leaving the ratings intact.

All of which totally misses the point.  The problem is not really about the statistics or the data.  The problem is that Consumer Reports violated one of its most important core principles.

When Consumer Reports evaluates cars, or TVs, or even dishwashing liquid, it goes to great lengths to make sure that the product it buys is the same product that everyone can buy.  To that end, it sends out secret shoppers to buy everything it tests.  It knows that if it called up Joe’s Big City Ford and said “Hi, I’m from Consumer Reports and I want to buy a blue Taurus with the following options so that we can test it,” the car it got would bear no resemblance to the car the average consumer would actually be able to buy.  Instead, it would get the most perfect blue Taurus the Ford factory could ever produce.

By gushing that “92% of the participants stuck with the Jenny Craig program for two years – a remarkable level of adherence…,” Consumer Reports left average readers with the impression that they could expect the same results if they signed up for Jenny Craig.

It ignored the fact that in a study following 60,164 average participants, people who didn’t receive $7,000 worth of free food and program fees and who didn’t get the very best counselors (you can be sure Jenny Craig assigned its very best counselors to the study participants), the average person who signed up for the program had a better than 93% chance of dropping out before the end of the first year.

And so Consumer Reports violated its own basic principles and gave its readers the false impression that they could expect the same results as those experienced by the study participants, when in fact nothing could be further from the truth.
