Many Papers Lack Detailed Methods

Nearly half of scientific publications lack sufficient information about the research resources needed to reproduce experimental findings, a study finds.

Reproducibility is a hallmark of good science. Yet although most scientific journals require authors to list the resources used in their experiments, almost half of the papers examined in a new study failed to specify all of the items needed to replicate their findings. The study was published Thursday (September 5) in the journal PeerJ.

Researchers at Oregon Health & Science University (OHSU) examined the methods sections of nearly 240 scientific articles from more than 80 journals spanning five disciplines: neuroscience, immunology, cell biology, developmental biology, and general science. They scanned each article, including its supplementary information and references, for exact product numbers identifying five types of biomedical resources: antibodies, model organisms, cell lines, DNA constructs, and knockdown reagents.

Just under 50 percent of the articles failed to fully identify all of the materials used in their experiments, and this held true even for journals with supposedly more stringent reporting requirements. The study did not, however, report how often a lack of specific information about experimental resources actually hindered reproducibility.

Part of the problem, the authors noted, is that reporting guidelines for methods sections vary widely from journal to journal; even top-tier journals maintain loose requirements because of strict space limitations. “As researchers, we don’t entirely know what to put into our methods section,” Shreejoy Tripathy, a neurobiology graduate student at Carnegie Mellon University, whose laboratory served as a case study for the research team, told The Chronicle of Higher Education.

“The stories we tell in scientific publications are not necessarily instructions for replication,” lead study author Melissa Haendel, an ontologist at OHSU, said in a statement. “This study illuminates how if we aim to use the literature as the scientific basis for reproducibility, then we have to get a lot more specific.”

Using the criteria laid out in the new study, the researchers, many of whom are members of Force11—a collective with the goal of modernizing scientific publishing—have developed guidelines for reporting research resources.

Source: The Scientist