Study Calls Out Lack of Reproducibility in Cancer Research
A foundational tenet of scientific research is that experiments can be reproduced. Eight years ago, a group of scientists began carefully repeating influential early laboratory experiments in cancer research. In their study, they recreated 50 experiments and found that 54% of them could not be reproduced. The results were published in eLife, a nonprofit journal that receives funding from the Howard Hughes Medical Institute.
Known as the Reproducibility Project: Cancer Biology, the effort spent eight years trying to “replicate experiments from high-impact cancer biology papers published between 2010 and 2012.” It was a collaboration between the Center for Open Science and Science Exchange.
In working on repeating 193 experiments from 53 papers, the team found that 0% of the protocols were completely described, only 2% of the papers had open data, and 70% of the experiments required asking the original authors for key reagents. Of those authors, 41% were very helpful, 69% were willing to share a key reagent, and 32% were not helpful or were unresponsive.
Still, 67% of the studies required modifications to complete. The authors specifically cite three key findings: first, the replication effect sizes were 85% smaller on average than the original findings; second, 46% of the effects replicated successfully on more criteria than they failed; and third, original positive results were half as likely to replicate successfully as original null results.
All of the experiments were the type of preliminary research performed in mice and laboratory assays. Marcia McNutt, president of the National Academy of Sciences, says that researchers have little incentive to share the methods and data that would let others verify their work: if their data doesn’t hold up, they lose prestige.
Vinay Prasad, a cancer physician and researcher at the University of California, San Francisco, who was not involved in the study, told AP, “The truth is we fool ourselves. Most of what we claim is novel or significant is no such thing.” He also cautions that these early studies can create false hope for cancer patients by promising a cure “just around the corner.” “Progress in cancer is always slower than we hope,” he said.
The study emphasizes that the shortcomings lie early in the scientific process, not with established, approved treatments: it concerns early research studies, not the extensive clinical trials of drugs conducted in patients.
Examples of studies that couldn’t be replicated include one that identified a specific gut bacterium associated with colon cancer in humans, another in which a drug shrank breast tumors in mice, and a mouse study of an experimental prostate cancer drug.
Erkki Ruoslahti, a researcher at the Sanford Burnham Prebys research institute and co-author of the prostate cancer study, notes that the research has held up to scrutiny from other sources. “There’s plenty of reproduction in the (scientific) literature of our results,” he told AP.
Michael Lauer, deputy director of extramural research at the National Institutes of Health (NIH), said, “I wasn’t surprised, but it is concerning that about a third of scientists were not helpful, and, in some cases, were beyond not helpful.”
Lauer added that the NIH will attempt to improve data sharing by requiring it of institutions that receive NIH grants beginning in 2023. “Science, when it’s done right, can yield amazing things,” he said.