Emails showed how a famous Ivy League food lab was cooking up shoddy data

Published on October 31, 2018

Your plate size influences how much you eat. Brightly lit restaurants inspire you to make healthier choices. You slurp more soup out of a self-refilling bowl than a normal one.

This was the kind of buzzy, accessible research that Cornell University’s Food and Brand Lab famously churned out for years. Its science-backed insights about eating behavior and weight loss made headlines everywhere, from The New York Times to Good Morning America.

So in late 2016, I was intrigued when outside scientists started scrutinizing many of the lab’s studies and pointing out tons of errors. Naturally, I, like everyone else, wanted to know what the researchers thought of all this. But they — and their oft-quoted leader Brian Wansink — were suddenly all but silent.

To find out what was being said behind the scenes, I filed public records requests to access the scientists’ emails. Those requests kicked off a series of stories, a dozen in all, that I wrote for BuzzFeed News over the course of a year. Most significantly, I reported in February that, for years, Wansink and his colleagues had discussed massaging low-quality data into studies in a brazen ploy for media coverage. Last month, Cornell announced that it had found Wansink guilty of scientific misconduct and that he would resign. As of this writing, he has retracted 15 papers and corrected 15.

I’ve come away with a fresh conviction that for any journalist, on any beat, records requests can be a powerful way of opening up opaque institutions.

My reporting started in the summer of 2017. At that point, outlets like New York, Retraction Watch and Ars Technica had thoughtfully covered the first wave of allegations against the Food and Brand Lab, brought forward by a group of outside researchers. In response, Cornell had examined some of Wansink’s research but cleared him of misconduct, and Wansink pledged to do better going forward.

But the critics — Nick Brown, Tim van der Zee, James Heathers, and Jordan Anaya — were just getting started. When journals, the researchers, and Cornell stopped responding to them, they did the painstaking (and totally unpaid) work of combing through Wansink’s research and compiling a “dossier” of more than 50 problematic studies.

As a private university, Cornell is not subject to open records laws, so I couldn’t request Wansink’s emails directly. But I went through the studies in the dossier and identified those with co-authors at public colleges: the University of Illinois at Urbana-Champaign, New Mexico State University, and Eastern Illinois University, to name a few. I asked each university for all correspondence between those scientists and Wansink over the previous year.

To be clear, I had no idea if this would lead to anything. But BuzzFeed News’ science editor, Virginia Hughes, encourages us to routinely request documents about things that pique our curiosity.

In the end, I got enough material for my first story, in September 2017, which revealed that Wansink had privately downplayed the criticism as “cyber-bullying.” I also reported that he and his colleagues were discussing problems in studies that had been cited as evidence for a nearly $22 million federally funded program to make school cafeterias “smarter.”

One paper, for example, found that putting Elmo stickers on apples swayed children to choose the fruit over cookies. But Nick Brown, one of Wansink’s critics, suspected that one of its fundamental premises was wrong: the children studied were not between the ages of 8 and 11, as reported, but were toddlers. Brown’s hunch was correct, as I reported in another story, and the paper got retracted (twice). When I asked Wansink for comment, he said that his team had discovered the age errors the week prior. Yet his emails showed that, months before, he and his collaborators had referred to the children as being in day care and as preschoolers.

Trying to get to the bottom of how the study had been done, I filed yet another request for more emails from one author, a scientist at New Mexico State. This time, I asked for eight years’ worth — and they revealed a long history of troubling practices behind multiple studies, as I went on to report.

Primary documents strengthen stories. Here are tips for reporters interested in obtaining them in the pursuit of similar investigations into misconduct or fraud:

1) First, get familiar with requesting documents in general. Check out the Investigative Reporters and Editors’ website for useful links and resources, like sample letters, to figure out where to send requests and how to spell out what you’re looking for.

2) Get creative with requests. Think broadly about all the public agencies and institutions — local, state, national — with which your subject might interact. For example, I also sent Freedom of Information Act requests to federal agencies that had given grants to the Food and Brand Lab, such as the Department of Agriculture and the National Institutes of Health. These weren’t successful, but it was worth a shot. Depending on the story, you can ask for a whole range of materials: emails, texts, handwritten notes, computer data, electronic files, audio and video recordings.

3) Be persistent. Follow up when you’re supposed to receive documents. Offer yourself up to answer the records officer’s questions. If institutions stall, seek outside help: a university that denied my request magically became responsive when our staff attorney stepped in.

4) Get a second, third, and fourth opinion. Once you have materials in hand, it’s important to show them to trusted experts who can explain what they mean, advise leaving something out, or tell you to go back for more. I will always be grateful to the statisticians and psychologists who gave me a crash course in statistics over the phone and helped me understand what exactly was being said in the emails.

One resource I recommend is Sense About Science, which connects journalists with statistics experts. Another way of finding experts is to read news articles about your topic to see who gets quoted (but make sure they don’t have a financial or personal connection to whatever you’re writing about). In addition, people close to the situation can help bring your documents to life. For example, for my February story, a former lab member bravely went on the record about what it was like when her famous boss told her to do things that she believed were unethical. Her comments underscored the consequences of the behaviors described in the emails.

5) Be judicious. Both journalists and advocacy groups have used open records laws to obtain scientists’ emails in recent years, efforts that have sometimes been controversial. Being on the receiving end of these requests is understandably uncomfortable. But I believe the process can be done responsibly. Sometimes the best strategy is to approach a subject for an interview first, then follow up with a records request if there’s information you still aren’t getting. Then carefully weigh everything that you quote, publish and leave out, and always seek comment from the people who produced it in the first place.

Getting something doesn’t automatically make it worth sharing. In my case, there were many emails in which Wansink and his collaborators looked less than stellar, but I didn’t publish or quote from them because they didn’t clearly rise to the level of misconduct. I also published just two emails in full, because my editor and I felt that they were the strongest examples of data-massaging, and they showed that we weren’t unfairly cherry-picking.

[Photo by USDA via Flickr.]