Can Molecular Medicine Survive Its Teenage Years and Reach Its Potential This Decade?

As the 21st century approaches its teenage years, so too does molecular medicine. The discovery of the structure of DNA more than 50 years ago launched the field of molecular biology. During the last decade, we have seen the first translation of some fundamental discoveries in this field into medical tools. Clearly, however, these tools are still early in their development for routine clinical use.

The biggest question for the next decade is whether molecular medicine can survive its teenage years, moving from first discoveries to mature approaches enabling inexpensive, practical, and reliable clinical tools.

Healthcare spending now represents more than 16 percent of the United States' gross domestic product and is growing rapidly. The traditional approach to treating disease is reactive: we typically wait until someone is sick before beginning treatment. As many healthcare professionals realize, a radical departure from this approach is required to contain healthcare costs. The key is to shift from diagnosing patients when they already have symptoms to detecting disease much earlier, before symptoms appear. This is the potential that molecular medicine brings to personalized healthcare delivery. Personalized healthcare will be predictive and preventive, probing an individual's unique biology to assess disease probability and then designing appropriate treatments, even before symptoms emerge. Many healthcare professionals believe this transformation will shift how the nation's healthcare dollar is used over the next decade, dramatically reducing the amount spent on today's reactive treatments while increasing the amount spent on prediction and diagnosis to almost a third of all expenditures.

Most work to date in molecular diagnostics has focused on identifying disease biomarkers in the blood or other easily obtainable bodily fluids. Developing comprehensive yet personalized assays for diverse populations, however, is highly complex and expensive.

Another approach is to focus on molecular screening, in which tests based on blood-borne biomarkers are tuned for high sensitivity (meaning a patient who has the disease is never missed) but relatively low specificity (meaning some healthy people may be flagged as having the disease, producing significant false positives). Typical diagnostic tests today must balance sensitivity against specificity because of the high cost usually associated with false positives.

[Figure: diagram of the proposed screening-imaging-therapy model]
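To see why specificity matters at population scale, consider a back-of-the-envelope sketch in Python. The prevalence, sensitivity, and specificity figures below are illustrative assumptions, not numbers from this article:

```python
# Back-of-the-envelope sketch: how a high-sensitivity, low-specificity
# screen behaves at population scale. All numbers are illustrative assumptions.

def screen_outcomes(population, prevalence, sensitivity, specificity):
    """Return (true positives, false positives, PPV) for one screening test."""
    diseased = population * prevalence
    healthy = population - diseased
    true_pos = diseased * sensitivity           # sick patients correctly flagged
    false_pos = healthy * (1.0 - specificity)   # healthy people wrongly flagged
    ppv = true_pos / (true_pos + false_pos)     # chance a positive is truly sick
    return true_pos, false_pos, ppv

# Assume 1% prevalence and a screen tuned to 99% sensitivity, 80% specificity.
tp, fp, ppv = screen_outcomes(100_000, 0.01, 0.99, 0.80)
print(f"true positives:  {tp:,.0f}")    # 990
print(f"false positives: {fp:,.0f}")    # 19,800 -- false positives dominate
print(f"PPV: {ppv:.1%}")                # ~4.8% -- most positives are healthy
```

Under these assumed numbers, a positive screen alone says little: roughly 95 percent of the people it flags are healthy, which is exactly why a standalone test must trade sensitivity for specificity.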

Under the new model, depicted in the figure, molecular screening first identifies a high-risk individual. Screening results are then augmented by highly specific molecular imaging tests to confirm disease onset, characterize the disease, and determine its location.

Finally, molecular therapies can be delivered noninvasively, and molecular imaging can be used to both guide procedures and assess treatment efficacy. This integrated approach, including several feedback loops specific to the patient, will be more affordable and reliable, as highly specific imaging can greatly reduce false positives that create unnecessary expenses and anxiety today. Overall, it will help translate molecular medicine into a robust personalized tool.
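Continuing the illustrative numbers from the sketch above, a second, highly specific stage shows how this model collapses the false-positive pool. The imaging sensitivity and specificity here are again assumptions for illustration:

```python
# Illustrative second stage: route everyone the screen flagged (990 true,
# 19,800 false positives from the sketch above) into a highly specific
# confirmatory imaging test. Imaging performance here is assumed.

def confirm(true_pos, false_pos, sensitivity, specificity):
    """Apply an independent confirmatory test to the screen-positive pool."""
    confirmed_tp = true_pos * sensitivity           # diseased cases confirmed
    remaining_fp = false_pos * (1.0 - specificity)  # healthy cases still misflagged
    return confirmed_tp, remaining_fp

# Assume the imaging stage is 95% sensitive and 99% specific.
tp2, fp2 = confirm(990.0, 19_800.0, sensitivity=0.95, specificity=0.99)
print(f"confirmed true positives:  {tp2:,.0f}")        # ~941
print(f"remaining false positives: {fp2:,.0f}")        # 198
print(f"PPV after imaging: {tp2 / (tp2 + fp2):.1%}")   # ~82.6%
```

With these assumed numbers, the two-stage cascade cuts false positives by two orders of magnitude while losing only a small fraction of true cases, which is the affordability argument the integrated model rests on.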

This model of molecularly enabled, personalized medicine will become a reality in the next decade only if the following five questions can be successfully answered:

Will molecular screening technologies based on genomics and proteomics be able to detect the early onset of complex diseases such as diabetes and cancer?

Molecular screening has made tremendous progress in the last decade. This progress, however, has produced new challenges. For example, the volume of data generated by modern genomic and proteomic technologies is enormous, and translating it into clinically actionable information remains a major challenge.

Author: Matthew O'Donnell

Matthew O'Donnell is the Dean of the University of Washington's College of Engineering. He came to the UW in September 2006 from the University of Michigan, where he was chair of the Department of Biomedical Engineering. O'Donnell is a physicist by training, with undergraduate through doctoral degrees from Notre Dame. He joined the University of Michigan faculty in 1990 as a professor of electrical engineering and computer science. In 1998, he was named the Jerry W. and Carol L. Levin Professor of Engineering, and in 1999 he was appointed chair of the Biomedical Engineering Department. He won several engineering teaching awards at Michigan. O'Donnell, who was elected to the National Academy of Engineering in February 2009, is an expert in ultrasound imaging and other new diagnostic imaging technologies, including ultrafast optics, in vivo microscopy, catheter imaging of coronary arteries, optoacoustic arrays, and elasticity and molecular imaging. He is principal or co-principal investigator on numerous research projects funded by the National Institutes of Health and other federal agencies. O'Donnell holds 50 patents and has authored or co-authored more than 200 publications. He is associate editor of the journal Ultrasonic Imaging, a permanent member of the National Institutes of Health Imaging Study Section, a fellow of both IEEE and AIMBE, and a member of Sigma Xi and the American Physical Society. Earlier steps on his career path included postdoctoral fellowship and senior research associate positions at Washington University in St. Louis, a research fellowship at Yale University, and a decade of private-sector experience as a research and development physicist at General Electric in Schenectady, New York.