The Reproducibility Initiative: A Good Idea in Theory that Won’t Work in Practice

The failure of scientists to independently confirm much of the data contained in “hot” academic publications is casting a long shadow over the biopharmaceutical industry. Research groups at Amgen and Bayer reported that the data in a significant percentage of published “breakthrough” papers from academic scientists could not be confirmed in their labs. Given that Big Pharma has increasingly turned to academic investigators as a source of molecular targets for new drugs, this represents a Big Problem. I have argued that this lack of experimental reproducibility is an especially acute problem for virtual biotech companies, which lean on contractors to do most of their R&D and have no internal lab facilities in which to try to replicate the data. A new proposal, the Reproducibility Initiative, has recently been established to create a pathway for verifying experimental data outside of the lab that generated it. Science Exchange, a for-profit online marketplace of laboratory services, is coordinating this new initiative in partnership with the open access journal PLoS ONE.

The basic concept behind the Science Exchange marketplace is that it enables scientists to hire service providers to perform experiments that are beyond the capabilities of their own labs. The Reproducibility Initiative is layered on top of this marketplace, with the goal of addressing the irreproducibility problem. Researchers sign up with the Reproducibility Initiative to have their work replicated. An advisory panel finds an outside lab group to perform the studies (which are done anonymously), and the results are then shared with the investigators. Researchers who wish to avail themselves of the Initiative must pay for the confirmation work to be done, plus a 5 percent transaction fee to Science Exchange for tapping into its network.

While the goal of this new initiative is admirable (and likely profitable for Science Exchange), I don’t think the approach will work in the real world. Here’s why it will not pan out, despite its good intentions:

Money

Running an academic research lab is expensive, and grants that support these efforts are hard to get. As a result, budgets are very tight, focused on the “must have” items and not the “would be nice” ones. The Principal Investigator in a lab has to find a way to pay himself or herself as well as the lab’s post-docs, grad students, technicians, and other personnel. The lab may need to buy very expensive equipment (or contribute toward the expense of a core facility, like an electron microscope or nuclear magnetic resonance machine), and to invest in a costly stream of consumable supplies (e.g. cell culture dishes, chemical reagents, growth media for cells).

This doesn’t leave a lot of leftover money to pay someone else to replicate your studies. Elizabeth Iorns, the CEO of Science Exchange, has said she is hopeful that granting agencies will eventually fund these replication efforts. That is extremely difficult to picture in the current economic climate, where there are serious concerns that the NIH budget will get significantly whacked in the upcoming sequestration process. Even without sequestration, it is hard to fathom that funding agencies would decide to significantly cut the number of grants they award in order to spend that money essentially replicating already-obtained results. Iorns estimates that the replication studies might cost only 10 percent as much as the original studies. I have no idea how this number (which sounds way too low to me) was generated, or how these studies could be done so inexpensively by another lab.

Beyond the obvious problems with the proposed academic funding situation, another concern is the notion that the original authors have the option to “republish” their results in the journal PLoS ONE, with a link to the original publication. Who will pay for the additional publication costs? Should people wishing to cite the science refer to the original publication, the rehashed confirmatory paper, or both? Since this confirming publication is optional, labs that learn their work could not be replicated are unlikely to advertise that fact by publishing the finding. Other ethical and practical concerns about these “replicating” papers have also been raised by those who focus on scholarly publishing. Finally, would the failure to ask for “reproducibility” funds as part of a grant application be considered tantamount to admitting that the investigator doing the work viewed it as second-rate?

Ego

Most scientists have a strong regard for their ability to do science. The good ones are trained to test and retest their hypotheses, to look for alternate explanations of their data. They run

Author: Stewart Lyman

Stewart Lyman is Owner and Manager of Lyman BioPharma Consulting LLC in Seattle. He provides advice to biotechnology and pharmaceutical companies as well as academic researchers and venture capital firms. Previously, he spent 14 years as a scientist at Immunex prior to its acquisition by Amgen.