
GOOD PRACTICES

Questionable methods admitted publicly

Articles by nutritionist who advised the US government on healthy eating retracted due to errors and negligence

WALTER REGO

In November 2017, the scientific journal Frontiers in Psychology announced the retraction of an article originally published in 2016 by behaviorist and nutrition expert Brian Wansink, a researcher at Cornell University, USA. The study in question used data on the habits of 355 World War II veterans and concluded that traumatic experiences may affect the way people buy food: veterans who experienced extreme violence exhibited less brand loyalty, while victims of milder trauma were more influenced by advertising. After receiving a complaint about possible biases in the study, the journal reanalyzed the raw research data and concluded that “there is no empirical support for the conclusions of the article,” according to the retraction note.

This was the fourth article by Wansink retracted in 2017, and another seven had to be corrected; roughly 40 more are still under investigation. One article was even retracted twice. Published in JAMA Pediatrics in 2012, the paper was retracted and republished after statistical errors had been corrected, but it was later permanently retracted when its premises were found to be fundamentally flawed. The paper had concluded that children aged 8–11 prefer apples to cookies if the fruit carries a sticker of Elmo from Sesame Street, but the study data had actually been obtained from much younger children, aged 3–5.

Brian Wansink is a popular figure in the United States. He has written several popular books, including Mindless Eating, published in 2007, in which he claims that people consume fewer calories when they eat off smaller plates. He was the director of the USDA’s Center for Nutrition Policy and Promotion and coordinated the US Dietary Guidelines between 2007 and 2009, making science-based recommendations for a healthier diet. In 2010, he was one of the founders of a US$22 million federally funded program that implements healthy-eating strategies in more than 30,000 schools. The strategies are largely based on scientific studies conducted by Wansink and his research group.

Investigations into his articles were sparked by the publication of a controversial text on Wansink’s personal blog in November 2016, in which he discussed the use of questionable data processing methods. In the post, titled “The Grad Student Who Never Said No,” he praised Turkish PhD student Ozge Sigirci, an intern at his laboratory, for her willingness to reanalyze raw data that other researchers had already examined and found only null results. Wansink claimed to have offered her a chance to rework the results of a survey taken at an Italian restaurant offering an all-you-can-eat pizza buffet.

“I said, ‘This cost us a lot of time and our own money to collect. There’s got to be something here we can salvage because it’s a cool (rich & unique) data set,’” he wrote. Ozge Sigirci, who was a volunteer, reanalyzed the data and identified trends that, according to Wansink, led to a number of hypotheses and resulted in four articles submitted to journals. One of the papers reported that men eat more when they are in the company of women. Another examined the relationship between the price of the buffet and customer satisfaction. Happy with the student’s hard work, Wansink challenged her to analyze another data set that had been rejected by a postdoctoral intern working in the laboratory, who had found no basis for any articles in the data. The student reanalyzed the data and once again made discoveries that resulted in published papers.

Wansink’s blog post circulated in academic circles and was heavily criticized. Some people condemned the fact that a foreign student was working without pay, while others found evidence in the report of a practice known as p-hacking, the manipulation of p-values, which involves exhaustively processing and selecting statistical data to derive a trend from inconclusive results. The p-value measures the probability of observing results at least as extreme as those of a study if chance alone, and not the factors being studied, were at work. A value of less than or equal to 0.05 is conventionally used as the threshold for statistical significance; running many analyses on the same data and reporting only those that cross this threshold makes chance findings look significant. People were also critical of Wansink’s approach of using a single piece of research to yield multiple articles.
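The inflation that p-hacking exploits can be illustrated with a short simulation (this is a generic sketch, not Wansink’s data; the function name and parameters are illustrative). Under a true null hypothesis, p-values are uniformly distributed, so the chance that at least one of many independent tests dips below 0.05 grows rapidly with the number of tests:

```python
import random

# Under the null hypothesis, a p-value is uniformly distributed on [0, 1].
# Simulate an analyst who runs `n_tests` independent looks at pure noise
# and reports a "finding" whenever any single test yields p < alpha.
def false_positive_rate(n_tests, n_experiments=10_000, alpha=0.05, seed=42):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_experiments):
        p_values = [rng.random() for _ in range(n_tests)]
        if min(p_values) < alpha:
            hits += 1
    return hits / n_experiments

print(f"1 test  : {false_positive_rate(1):.2f}")   # ~0.05, the nominal rate
print(f"20 tests: {false_positive_rate(20):.2f}")  # ~0.64, i.e. 1 - 0.95**20
```

With 20 looks at noise, a spurious “significant” result turns up about two times in three, which is why reworking a null data set until trends emerge is considered questionable practice.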

Brian Wansink praised a PhD trainee who agreed to rework statistical data that had not previously produced any scientific articles (photo: JASON COSKI / CORNELL NEWS)

A sensitive moment
The Cornell researcher wrote about his controversial practices at a particularly sensitive time, with psychology research experiencing a crisis punctuated by an inability to reproduce and confirm conclusions of studies that have yielded interesting results and good newspaper headlines.
In 2014, a task force of 100 researchers from a range of countries began the challenge of reproducing a set of 27 social psychology articles, and in a third of cases they were not able to do so (see Pesquisa FAPESP, issue No. 200). This crisis was triggered by scandals such as the case of Diederik Stapel, a social psychology professor at Tilburg University in the Netherlands who had 30 articles retracted for data manipulation (see Pesquisa FAPESP, issue No. 190).

Criticism of Wansink’s stance attracted the attention of a group of researchers who had created tools to detect statistical anomalies in scientific papers. The team, comprising Jordan Anaya, a computational biologist, Nick Brown, a PhD student at the University of Groningen, and Tim van der Zee, of Leiden University, found 150 statistical inconsistencies in the four articles produced from the pizza buffet study. In many cases, the figures reproduced in the tables were inconsistently rounded. These findings led to an article posted on a preprint repository, titled “Statistical heartburn: An attempt to digest four pizza publications from the Cornell Food and Brand Lab.” The trio asked Wansink for access to the research data, but he claimed that releasing it would compromise the anonymity of the participants. When contacted, the Office of Research Integrity and Assurance at Cornell University confirmed that the data remains confidential. In its only official action so far, the university reported in May that it had found no evidence of misconduct regarding the errors in the four pizza buffet articles, but that it was still assessing further complaints.

In a statement posted on his laboratory’s website, Wansink took responsibility for the errors and claimed to have requested corrections. He also announced that he had adopted procedures to prevent a repetition of such “negligence and errors” in the future, as well as creating a system capable of anonymizing the research data relating to the participants so that it could be shared. Regarding the recent retractions, he told BuzzFeed that the articles were retracted at his request. The editors of JAMA Pediatrics denied this was true for the articles it published.
