Five Biotechnologies That Will Fade Away This Decade

investing in other areas. It is likely that all of these signatures will be replaced continually. Each time a larger sample is gathered, it will allow a refinement that warrants replacing the last signature. Moreover, each time scientists can subdivide patients into more coherent subgroups, new markers will become more predictive. Prepare to live in a world where the platform owners and database organizers hold a greater proportion of the value proposition. VCs would be smart to invest in platforms and in those who can offer access to the evolving models of disease from which the dynamic biomarkers will spring.
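To make the idea of a continually refit signature concrete, here is a minimal, purely illustrative sketch (synthetic data, scikit-learn, not any method used by Sage Bionetworks or a specific study): an L1-penalized logistic model is refit each time the cohort grows, and the set of markers making up the "signature" is allowed to change with each refit.

```python
# Illustrative only: a toy "dynamic biomarker" refit on synthetic data.
# As the cohort grows, an L1-penalized logistic model is refit and the
# set of non-zero markers (the "signature") is allowed to change.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_markers = 200
# Hypothetical ground-truth markers driving the outcome in this toy simulation.
true_markers = rng.choice(n_markers, size=10, replace=False)

def simulate_cohort(n_patients):
    """Simulate marker measurements and binary outcomes for n_patients."""
    X = rng.normal(size=(n_patients, n_markers))
    logits = X[:, true_markers].sum(axis=1)
    y = (logits + rng.normal(scale=2.0, size=n_patients) > 0).astype(int)
    return X, y

signature = set()
for cohort_size in (100, 400, 1600):  # each release of a larger sample
    X, y = simulate_cohort(cohort_size)
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(X, y)
    new_signature = set(np.flatnonzero(model.coef_[0]))
    print(f"n={cohort_size}: {len(new_signature)} markers in signature, "
          f"{len(new_signature - signature)} not in the previous one")
    signature = new_signature
```

Run on this toy data, each larger cohort yields a somewhat different marker set, which is the point: a signature frozen at approval time is quickly outgrown by the data.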

4. Indications for drugs will be determined by clinical trials performed by the biotech/pharma company developing the therapy. Most drugs today get approved by the FDA even if they work in only a small fraction of patients. This practice is going to end: once a drug has been approved with regard to its safety profile, the definition of who should get it may well be modified continuously by large trials organized by payers and patients. In real time, these groups will evolve the criteria for who should get which drug by participating in ongoing trials even after the drug has “been approved.” This is something to look forward to, as it will take much of the trial-and-error nature out of prescription medicine. It will be a world of truly evidence-based therapies.

5. Hunter-gatherer approaches where large groups collect massive clinical and genomic information and expect that they, as the data generators, will also be the data analyzers. Large cohort studies like the famous “Framingham Heart Study,” which has been following the health of patients in Framingham, MA for more than 60 years, have been extremely valuable. The old methods used for these studies assumed that the analysis would be done by the small group that originally collected the samples and data. This model, too, will be fading as distributed groups of scientists evolve the knowledge faster and more efficiently than those who generated the data. Remember, this is already how physicists work today. Also remember Jim Gray of Microsoft Research and his ideas on “The Fourth Paradigm,” in which scientific theory, experimentation, and large-scale computational simulation interact and reinforce one another in ways that speed up scientific progress.

We will need to do biology research in fundamentally different social contexts as we move into this next decade. This means biologists will need to start pooling their knowledge through social networking channels, much as computer scientists have long done for open-source software development.

Author: Stephen Friend

Stephen H. Friend is president, CEO, and a co-founder of Sage Bionetworks, an international genomic research collaborative. He was previously a Senior Vice President at Merck, where he led Merck’s Basic Cancer Research efforts. In 2005, he led the Advanced Technologies and Oncology groups to firmly establish molecular profiling activities throughout Merck’s laboratories around the world, as well as to coordinate oncology programs from Basic Research through phase IIA clinical trials. Prior to joining Merck, Dr. Friend was recruited by Dr. Leland Hartwell to join the Fred Hutchinson Cancer Research Center’s Seattle Project, an advanced institute for drug discovery. While there, Drs. Friend and Hartwell developed a method for examining large patterns of genes that led them to co-found Rosetta Inpharmatics in 2001. Dr. Friend has also held faculty positions at Harvard Medical School, from 1987 to 1995, and at Massachusetts General Hospital, from 1990 to 1995. He received his B.A. in philosophy, his Ph.D. in biochemistry, and his M.D. from Indiana University.