Friday, October 10, 2008

Current Biomedical Publication System: A Distorted View of the Reality of Scientific Data?

Why Current Publication Practices May Distort Science

Young NS, Ioannidis JPA, Al-Ubaydli O

PLoS Medicine Vol. 5, No. 10, e201 / October 7, 2008



The current system of publication in biomedical research provides a distorted view of the reality of scientific data that are generated in the laboratory and clinic. This system can be studied by applying principles from the field of economics. The “winner's curse,” a more general statement of publication bias, suggests that the small proportion of results chosen for publication are unrepresentative of scientists' repeated samplings of the real world.

The self-correcting mechanism in science is retarded by the extreme imbalance between the abundance of supply (the output of basic science laboratories and clinical investigations) and the increasingly limited venues for publication (journals with sufficiently high impact). This system would be expected intrinsically to lead to the misallocation of resources. The scarcity of available outlets is artificial, based on the costs of printing in an electronic age and a belief that selectivity is equivalent to quality.

Science is subject to great uncertainty: we cannot be confident now which efforts will ultimately yield worthwhile achievements. However, the current system abdicates to a small number of intermediates an authoritative prescience to anticipate a highly unpredictable future. In considering society's expectations and our own goals as scientists, we believe that there is a moral imperative to reconsider how scientific data are judged and disseminated.
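The "winner's curse" described above can be made concrete with a small simulation. This is an illustrative sketch, not code from the essay: the true effect size, standard error, and significance filter are all assumed values chosen to show the mechanism. When only statistically significant results are published, the published literature systematically overstates the true effect.

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.2   # assumed real underlying effect size
SE = 0.15           # assumed standard error of each study's estimate
Z_CRIT = 1.96       # two-sided 5% significance threshold

# Each lab runs one noisy study of the same effect; only studies whose
# estimate reaches significance get published.
estimates = [random.gauss(TRUE_EFFECT, SE) for _ in range(10_000)]
published = [e for e in estimates if abs(e) / SE > Z_CRIT]

print(f"all studies, mean estimate:      {statistics.mean(estimates):.3f}")
print(f"published studies, mean estimate: {statistics.mean(published):.3f}")
print(f"publication rate: {len(published) / len(estimates):.1%}")
```

The full set of studies averages close to the true effect, but the published subset, selected for crossing the significance threshold, averages well above it: the "winners" are the studies that overshot.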



Box 1. Potential Competing or Complementary Options and Solutions for Scientific Publication

  • Accept the current system as having evolved to be the optimal solution to complex and competing problems.
  • Promote rapid, digital publication of all articles that contain no flaws, irrespective of perceived “importance”.
  • Adopt preferred publication of negative over positive results; require very demanding reproducibility criteria before publishing positive results.
  • Select articles for publication in highly visible venues based on the quality of study methods, their rigorous implementation, and astute interpretation, irrespective of results.
  • Adopt formal post-publication downward adjustment of claims of papers published in prestigious journals.
  • Modify current practice to elevate and incorporate more expansive data to accompany print articles or to be accessible in attractive formats associated with high-quality journals: combine the “magazine” and “archive” roles of journals.
  • Promote critical reviews, digests, and summaries of the large amounts of biomedical data now generated.
  • Offer disincentives to herding and incentives for truly independent, novel, or heuristic scientific work.
  • Recognise explicitly and respond to the branding role of journal publication in career development and funding decisions.
  • Modulate publication practices based on empirical research, which might address correlates of long-term successful outcomes (such as reproducibility, applicability, opening new avenues) of published papers.

Extended Version: The Market for Exchange of Scientific Information: The Winner's Curse, Artificial Scarcity, and Uncertainty in Biomedical Publication


Guest Blog

More Evidence on Why We Need Radical Reform of Science Publishing / Richard Smith

PLoS Medicine invited Richard Smith, former editor of the BMJ and current board member of PLoS, to discuss an essay published this week by Neal Young, John Ioannidis and Omar Al-Ubaydli that argues that the current system of publication in biomedical research provides a distorted view of the reality of scientific data.

"For me this paper simply adds to the growing evidence and argument that we need radical reform of how we publish science. I foresee rapid publication of studies that include full datasets and the software used to manipulate them without prepublication peer review onto a large open access database that can be searched and mined. Instead of a few studies receiving disproportionate attention we will depend more on the systematic reviews that will be updated rapidly (and perhaps automatically) as new results appear."


News Coverage

The Economist [10-09-08] : Scientific Journals: Publish and Be Wrong

"Dr Ioannidis made a splash three years ago by arguing, quite convincingly, that most published scientific research is wrong. Now, along with Neal Young of the National Institutes of Health in Maryland and Omar Al-Ubaydli, an economist at George Mason University in Fairfax, Virginia, he suggests why."



Newsweek [10-06-08] : Don't Believe What You Read, Redux / Sharon Begley

"Bottom line: when it comes to 'the latest studies,' take what you read with a grain of salt."



Why Most Published Research Findings Are False / John P. A. Ioannidis / PLoS Med. 2005 August; 2(8): e124 / August 30, 2005



There is increasing concern that most current published research findings are false. The probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and, importantly, the ratio of true to no relationships among the relationships probed in each scientific field. In this framework, a research finding is less likely to be true when the studies conducted in a field are smaller; when effect sizes are smaller; when there is a greater number and lesser preselection of tested relationships; when there is greater flexibility in designs, definitions, outcomes, and analytical modes; when there is greater financial and other interest and prejudice; and when more teams are involved in a scientific field in chase of statistical significance.

Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias. In this essay, I discuss the implications of these problems for the conduct and interpretation of research.
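The framework in this abstract reduces to a positive predictive value: the probability that a claimed finding is true given the pre-study odds R, the error rates, and a bias term u (the fraction of would-be non-findings reported as findings). The sketch below transcribes that formula; the parameter values in the examples are illustrative assumptions, not figures from the paper.

```python
def ppv(R, alpha=0.05, beta=0.2, u=0.0):
    """Positive predictive value of a claimed research finding.

    R     -- pre-study odds that a probed relationship is true
    alpha -- type I error rate (false-positive rate)
    beta  -- type II error rate (1 - statistical power)
    u     -- bias: fraction of non-findings reported as findings
    """
    true_positives = (1 - beta) * R + u * beta * R
    all_claimed = R + alpha - beta * R + u - u * alpha + u * beta * R
    return true_positives / all_claimed

# Well-powered study of a likely hypothesis (even odds, 80% power):
print(f"{ppv(R=1.0, beta=0.2):.2f}")            # → 0.94

# Exploratory field probing many candidates (odds 1:10), 50% power,
# modest bias: a claimed finding is more likely false than true.
print(f"{ppv(R=0.1, beta=0.5, u=0.1):.2f}")     # → 0.28
```

The second case shows the abstract's headline claim in miniature: with low pre-study odds, underpowered designs, and even modest bias, the PPV drops below one half.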

