In a recently published study co-authored by a UC Berkeley Haas School of Business professor, statistical research methods widely used by researchers were shown to sometimes yield false-positive results.
In the study, Leif Nelson and researchers Joseph Simmons and Uri Simonsohn of the Wharton School of Business ran a test case in which they used computer simulations to demonstrate how listening to different songs, such as "Hot Potato" and The Beatles' "When I'm Sixty-Four," can appear to make people feel older or younger.
The authors manipulated the variables in their experiment to show that experimental subjects who listened to "Hot Potato" felt older afterward, while those who listened to The Beatles felt younger. The experiment revealed how easily researchers can, by choosing to collect only some data or to exclude certain observations, draw false conclusions about cause and effect when the results were actually a matter of chance.
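One of the flexible practices the study highlights is "peeking": checking for significance repeatedly while collecting data and stopping as soon as a test comes back significant. The simulation below, a minimal sketch and not the authors' actual code, illustrates the effect. It assumes a simple two-sided z-test on data with no true difference between groups, and compares the false-positive rate of a fixed-sample analysis against one that peeks after every batch.

```python
import math
import random

def z_p_value(a, b):
    # Two-sided p-value for a difference in means, assuming unit variance
    # (a deliberate simplification for this illustration).
    z = (sum(a) / len(a) - sum(b) / len(b)) / math.sqrt(1 / len(a) + 1 / len(b))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def false_positive_rate(n_sims=2000, peek=False, start=20, step=10,
                        cap=50, alpha=0.05):
    """Fraction of null experiments (no real effect) declared significant."""
    hits = 0
    for _ in range(n_sims):
        a = [random.gauss(0, 1) for _ in range(start)]
        b = [random.gauss(0, 1) for _ in range(start)]
        if not peek:
            # Fixed-sample design: test once at the planned final size.
            a += [random.gauss(0, 1) for _ in range(cap - start)]
            b += [random.gauss(0, 1) for _ in range(cap - start)]
            hits += z_p_value(a, b) < alpha
        else:
            # Peeking: test after every batch, stop as soon as p < alpha.
            while True:
                if z_p_value(a, b) < alpha:
                    hits += 1
                    break
                if len(a) >= cap:
                    break
                a += [random.gauss(0, 1) for _ in range(step)]
                b += [random.gauss(0, 1) for _ in range(step)]
    return hits / n_sims

random.seed(0)
fixed = false_positive_rate(peek=False)
peeking = false_positive_rate(peek=True)
print(f"fixed-sample false-positive rate: {fixed:.3f}")
print(f"peeking false-positive rate:      {peeking:.3f}")
```

Even though every simulated experiment has no real effect and the nominal threshold is 5 percent, the peeking analysis flags "significant" results noticeably more often than the fixed-sample one, which is the kind of inflation the study documents.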
“The problem (the study’s authors) document is not that researchers are being dishonest or intentionally biased, but that researchers are using a statistical methodology that is not sensible,” said Professor Robert MacCoun of the Goldman School of Public Policy, who teaches a graduate course on research design and data collection.
In a separate joint study conducted by researchers from Carnegie Mellon University, Harvard University and the Massachusetts Institute of Technology, about one in 10 of the 2,000 research psychologists surveyed reported having introduced similar types of false data into the scientific record.
In their study, Nelson and his fellow authors offer several solutions to the problem of ambiguity, asking authors to disclose any observations or data they chose to leave out of their final results. They also ask journal reviewers to ensure that researchers follow the guidelines described in the study.
While the study’s suggestions have not been implemented in any formal manner, Nelson said he has been contacted by fellow researchers from a variety of disciplines who have begun following the guidelines laid out in the paper.
Jonathan Schooler, a psychology professor at UC Santa Barbara, said he likes the report’s recommendations but is not convinced they are ideal solutions.
“I am not persuaded it is the ideal. I think what is required is an open repository that logs the experiments, initial hypothesis and associated measurements that come before conducting the experiments,” Schooler said.
Although their views on how to solve the problem vary, all three researchers agree the findings do not necessarily indicate that the discipline of psychology is flawed.
“Despite the potential of false positives, it’s important that the field is finding its flaws and trying to get better,” Nelson said.