True or False: Publishing Negative Results Ruins Your Science Career

By Mark Terry, Breaking News Staff

Work as a researcher long enough—sometimes not long at all—and you will have a failed experiment.

You will produce “negative results.” You will fail to generate data that is clinically significant. You may even find that your method of collecting data doesn’t disprove your theory but merely shows that the method you’re using is inappropriate for what you’re trying to achieve.

So do you publish your negative results?

8 Arguments For and Against Publishing Negative Results

In a recent blog post on SmartScienceCareer, a researcher who identifies himself only as Sven wrote a lengthy piece on the subject, breaking down a number of reasons why publishing negative results is a problem. They include:


8 Reasons Why Publishing Negative Results Is a Problem
Frustrating and demotivating.
Pluses and minuses to halting a long-term project.
Waste of resources.
Low impact factor publications.
Interesting but costly.
“Publishing negative results ruins your career.”
Unethical.
Publication bias and the reproducibility crisis.

1. Frustrating and demotivating.

Sven, who is a professor, notes that during PhD training, “most early stage researchers experience an emotional dip anyway—approximately after two years.” What’s going to make you feel worse is publishing what you may perceive as your failures. On the other hand, since life is both frustrating and demotivating, perhaps you should just get used to it and make overcoming it a part of your skill set.

2. Pluses and minuses to halting a long-term project.

Aside from the emotional hit of abandoning a project, Sven argues that it’s the job of the supervising researcher to not only help develop a new project, but to “support the PhD student emotionally in this difficult situation.”

3. Waste of resources.

This argument runs along the lines of why throw good time (and money) after bad: publishing negative results “is a waste of resources.”

4. Low impact factor publications.

That is to say, the “best” scientific journals tend to want exciting articles, which Sven describes as “new mechanisms, unexpected findings and dramatic effects, which increase citations, clicks, shares and press coverage. Unfortunately, negative results are often very boring.”

5. Interesting but costly.

Sven points out that, “High impact journals may be interested in negative studies when they destroy a long-held paradigm or when a new method is used to show that most previous studies are flawed.” This dovetails with #3, which is to say, with limited time, energy and funds, why spend time publishing negative results when you could otherwise be pursuing more promising data?

6. “Publishing negative results ruins your career.”

This seems to be the crux of the issue, at least for PhD students. Sven writes, “Many supervisors are convinced that publishing negative results will ruin the career of their PhD students as well as their own. They will spend a lot of resources on the wrong project, publish with a low impact factor, and consequently get less future funding.”

7. Unethical.

Many researchers have no issues with abandoning projects that don’t have positive results. “Fail faster,” Sven writes, “is a common credo which means to screen for dramatic effects (for example, of a treatment, a drug, a genetic intervention etc.) and leave the less dramatic results untouched. The big problem is that this knowledge is lost because all these experiments disappear and many other scientists may repeat the same or similar experiments because these results are not documented and are not publicly available.”

8. Publication bias and the reproducibility crisis.

In a Nature survey published in 2016, more than 70 percent of researchers (out of 1,576 surveyed) said they had tried and failed to reproduce someone else’s experiments. Much of this seems to stem from a “positive-results bias,” which Sven said is “just a fancy term for the tendency described above: when authors are more likely to submit, or editors to accept, positive results than negative or inconclusive results.”

Leave It to the Old Guys?

Sven comes to a conclusion that is practical, although unlikely to be universally embraced. He argues that, “After many years of struggling with this question, I came to the conclusion that it is not the task of young scientists to publish negative results in the current system. They still have to pursue their career and—as explained above—publishing negative results may have quite negative effects on their careers.”

And, he goes on to argue, “Leave it to the senior scientists who already have a successful career and can afford to publish negative findings for the sake of good science!”

He emphasizes that he is not suggesting “selective reporting,” which is “the incomplete publication of analyses performed in a study that leads to the over- or underestimation of treatment effects or harms. Selective reporting is scientific misconduct. In contrast, I advise young scientists not to waste their time, grant money and energy on studies with negative results.”


In recent years, a number of publications have cropped up that focus specifically on negative results. In March 2015, Emma Granqvist helped launch “New Negatives in Plant Science,” which focuses on “negative, unexpected or controversial results in the field.”

Another is the “Journal of Negative Results in Biomedicine.” And PLOS ONE launched a collection highlighting studies with inconclusive or null findings, or that failed to replicate other researchers’ work.

There’s also an argument to be made that being published in any journal that is indexed in PubMed is better than not being published at all. With limited publication space, certainly not all research is published in Nature, Science, JAMA, or The New England Journal of Medicine.

Granqvist wrote, “Ignoring the vast information source that is negative results is troublesome in several ways. Firstly, it skews the scientific literature by only including chosen pieces of information. Secondly, it causes a huge waste of time and resources, as other scientists considering the same questions may perform the same experiments.”

The Science-Career Balance

Maybe there is no balance between career and good science. But it seems like researchers, young or old, would want to focus on good science over bad science. Writing in Science in 2013, Elisabeth Pain noted, “The imperative for thorough, transparent, and accurate reporting is often in conflict with the need young scientists have to add items to their CVs. Fortunately there are ways—some straightforward and safe; others risky or requiring more effort—to manage this conflict, staying close to data-disclosure ideals while also getting on with your career.”

Pain suggests seven ways for younger scientists to balance their future career needs with good science.



7 Ways to Balance Future Career Needs With Good Science
Splitting your data.
Describing the methodology.
Explaining your statistical analysis.
Inclusion and exclusion of data.
Telling the story.
Negative studies: into the file-drawer?
Asking for help.

1. Splitting your data.

That is to say, there’s a tendency to publish early and often. Pain suggests the focus shouldn’t be on the number of publications but on their quality, and warns against breaking data into smaller pieces just to generate more papers.

2. Describing the methodology.

This is an area that researchers often skimp on, but journals now offer space for supplementary materials. Pain writes, “Indeed, online supplementary materials allow you to go further by presenting a more comprehensive context for your core results, including ancillary data.”

3. Explaining your statistical analysis.

Don’t just show fully what you did; explain why you did it that way. “The goal,” Elizabeth Wager, a publications consultant in the UK, told Pain, “is to convince readers that your choice of methods is responsible and appropriate—and not opportunistic.”

4. Inclusion and exclusion of data.

Pain writes, “Real-life science is messy. There are almost always outliers, for a wide range of reasons, and they often give scientists headaches. Don’t even consider hiding them.”

5. Telling the story.

Pain writes, “What makes science communication so difficult is the need to balance all this disclosure and complexity with a clear story that people can follow.” Daniele Fanelli, an evolutionary biologist at the University of Montreal, told Pain, “A good compromise would be … presenting the pretty result in the main text, but having an appendix where everything that might make the result look less pretty is reported.”

6. Negative studies: into the file-drawer?

That used to be the case, but with journals now focusing on null or negative results, it’s no longer necessary. And it was probably never desirable in terms of transparency and good science.

7. Asking for help.

Pain notes that research publishing can be daunting, and that not all reporting guidelines and checklists are terribly useful. Pay attention to them, but do so critically.

Fanelli told Pain, “At this moment in time, in this form of publication system and scientific system, if you want to make a career, you need to play the game a bit, so you need to sell your result and so on. It would be naïve to say, ‘don’t do that.’” A compromise, he says, is “You do present a pretty, neat, short, to-the-point paper, but you give space somewhere for the not-so-pretty bits to be available.”

And above all, keep in mind Thomas Edison, who on his way to inventing the modern light bulb, said, “I have not failed. I’ve just found 10,000 ways that won’t work.”




