FDA Commish Califf to Big Data Crowd: Flood Of New Treatments Coming

New FDA commissioner Robert Califf said Wednesday that the recent spike in drug approvals is no temporary phenomenon, thanks to a better understanding of biology and continued adoption of sophisticated technology throughout healthcare.

“I predict we’ll see a flood of effective treatments”—not just drugs but combinations of drugs and data-enabled devices—“and we’ll have to figure out what to do with them,” Califf said. He was speaking before hundreds of attendees at a Stanford University conference dedicated to the use of big data in medicine and health care.

Califf is a cardiologist and spent 33 years at Duke University, often in charge of big clinical trials, before joining the FDA as a deputy commissioner in 2015. The Senate confirmed his appointment earlier this year by a vote of 89-4, but not before a delay as critics raised pointed questions about his ties to the pharma industry during his Duke years. They questioned whether a Califf-run FDA would be tough enough on the industry, which has enjoyed an extended run of political and regulatory goodwill, including record levels of drug approvals in recent years.

Under Califf’s predecessor, Margaret “Peggy” Hamburg, the FDA’s drugs division opened new channels for experimental therapies to gain a speedier review if they show potential to address hard-to-treat diseases. With those measures and more, the agency has regained favor with the industry after the safety crises last decade over the painkiller rofecoxib (Vioxx) and the diabetes drug rosiglitazone (Avandia) brought stricter regulation.

The FDA’s top priority, as Califf reminded the audience, is to keep America as safe as possible. But there is inherent tension between that mandate and the pressure to approve more drugs, driven by patient demands, industry economics, and political pressure. (That tension has been at the center of the drama surrounding FDA’s rejections of two drugs for Duchenne muscular dystrophy, a progressive, fatal disease with no effective treatments. The agency was scheduled this week to either approve or reject a third drug, eteplirsen—the only other prospective Duchenne treatment currently close to market—but it announced a delay.)

That tension has also surfaced with the 21st Century Cures Act, a package of biomedical and public safety reforms, which passed through the House of Representatives last year. It has hit roadblocks in the Senate over concerns that FDA safety standards for new antibiotics and other products would be weakened as a quid pro quo for increased National Institutes of Health funding. In a Wall Street Journal article earlier this month, former FDA commissioner David Kessler called the legislation a “deal with the devil.”

Califf tailored his comments Wednesday for his data-centric Silicon Valley audience, outlining how the agency is trying to corral massive amounts of data to improve drug and medical device regulation and food safety. “I’m astounded at the impact of big data,” he said, using the common catch-all term for systems that gather and analyze vast pools of data for hidden patterns or insights.

One example is a genomics data-sharing system called precisionFDA that the agency is opening to outsiders who want to work on developing better molecular diagnostic tests.
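
For a flavor of the kind of work a platform like precisionFDA hosts, the sketch below scores a diagnostic pipeline’s variant calls against a benchmark truth set, a basic comparison used to evaluate such tests. The file names and the stripped-down VCF parsing are illustrative assumptions, not precisionFDA’s actual tooling.

```python
# Minimal sketch: score a diagnostic pipeline's variant calls against a
# benchmark truth set. File names and the stripped-down VCF parsing are
# illustrative assumptions, not precisionFDA's actual tooling.

def load_variants(path):
    """Collect (chrom, pos, ref, alt) tuples from a VCF-style text file."""
    variants = set()
    with open(path) as handle:
        for line in handle:
            if line.startswith("#"):  # skip VCF header lines
                continue
            fields = line.rstrip("\n").split("\t")
            # VCF columns: CHROM, POS, ID, REF, ALT, ...
            variants.add((fields[0], fields[1], fields[3], fields[4]))
    return variants

truth = load_variants("truth_set.vcf")        # hypothetical benchmark calls
calls = load_variants("pipeline_calls.vcf")   # hypothetical pipeline output

true_pos = len(truth & calls)
precision = true_pos / len(calls)   # how many reported variants are real
recall = true_pos / len(truth)      # how many real variants were found
print(f"precision={precision:.3f} recall={recall:.3f}")
```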

Another example is the FDA’s Sentinel system, which aims to alert regulators and health officials to side effects of drugs on the market that in previous eras might have gone unnoticed or seemed like isolated incidents. “We no longer live in a segmented society with everyone off in their corners wondering what everyone else is doing,” Califf said.
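
Califf didn’t describe Sentinel’s internals, but one standard technique surveillance systems use to separate a real signal from isolated incidents is disproportionality analysis. Below is a minimal sketch of a proportional reporting ratio (PRR) calculation; the drug, the counts, and the threshold are made-up assumptions, not Sentinel’s actual method.

```python
# Illustrative sketch of disproportionality analysis, a standard way
# surveillance systems flag drug side-effect signals in aggregate data.
# The counts and threshold are made-up; this is not Sentinel's pipeline.

def proportional_reporting_ratio(a, b, c, d):
    """PRR: how much more often an event is reported with the drug of
    interest than with all other drugs.
    a: reports of the event with the drug
    b: reports of other events with the drug
    c: reports of the event with other drugs
    d: reports of other events with other drugs
    """
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: 40 liver-injury reports among 2,000 for drug X,
# versus 300 among 100,000 reports for all other drugs.
prr = proportional_reporting_ratio(40, 1960, 300, 99700)
print(f"PRR = {prr:.1f}")

if prr > 2.0:  # a PRR above 2 is a conventional screening threshold
    print("Possible signal; merits epidemiological follow-up.")
```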

One ambition is to combine the Sentinel side-effect data with insurance claims data and patient registries—databases about patients who share a condition, like this one for inherited heart disease—to start building a “pretty comprehensive picture of health outcomes,” he said.
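
As a rough sketch of what stitching those layers together could look like, the toy example below merges mock side-effect reports, claims, and a registry record on a shared patient key. Every record, field name, and ID here is hypothetical, and real linkage would involve de-identification and record matching that this glosses over.

```python
# Toy sketch of linking three data layers on a shared patient key.
# Every record, field name, and ID here is hypothetical; real linkage
# relies on de-identified tokens and careful record matching.

side_effects = {"p1": ["rash"], "p2": ["arrhythmia"]}
claims = {"p1": ["drug_X_fill"], "p2": ["drug_X_fill", "ER_visit"]}
registry = {"p2": {"condition": "inherited cardiomyopathy"}}

def patient_picture(pid):
    """Merge what each layer knows about one patient."""
    return {
        "adverse_events": side_effects.get(pid, []),
        "claims_history": claims.get(pid, []),
        "registry_record": registry.get(pid),
    }

for pid in sorted(set(side_effects) | set(claims) | set(registry)):
    print(pid, patient_picture(pid))
```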

Eventually those layers of “real world” data could be used to bolster clinical trial experiments, Califf said. Right now, clinical trials of a drug or medical device are isolated, controlled experiments; the only data they use are the data they generate. But in many cases, it’s difficult or impossible to recruit enough volunteers with a disease to conduct experiments with reliable results.

Califf is a proponent of using real-world results from patients outside a trial—“pre-existing data” is how he phrased it—in later-stage trials as long as the quality can be assured and the trial methods, such as randomly assigning patients to different arms, keep their integrity. “For later phases we’ve got to switch to a system where you bring in pre-existing data but stick to randomization,” he said. “We’re not there, but we’re getting there.”
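
Califf didn’t spell out a design, but one way to picture “pre-existing data plus randomization” is a trial that still assigns its enrollees to arms at random while previously collected real-world records supplement the control-arm analysis. The sketch below, with hypothetical patient IDs and naive pooling, shows only those two mechanics; the statistics of borrowing external controls responsibly is a harder problem it leaves out.

```python
# Sketch of keeping randomization intact while pre-existing real-world
# records supplement the control analysis. Patient IDs, arm names, and
# the naive pooling are illustrative assumptions.
import random

def randomize(patients, seed=7):
    """Assign each enrolled patient to a trial arm at random."""
    rng = random.Random(seed)
    arms = {"treatment": [], "control": []}
    for patient in patients:
        arms[rng.choice(["treatment", "control"])].append(patient)
    return arms

arms = randomize(["pt01", "pt02", "pt03", "pt04", "pt05", "pt06"])

# Pre-existing records (e.g., from electronic health records) bolster
# the control comparison; the randomized assignment above is untouched.
external_controls = ["ehr_101", "ehr_102", "ehr_103"]
control_analysis_set = arms["control"] + external_controls

print("treatment arm:", arms["treatment"])
print("control analysis set:", control_analysis_set)
```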

During his time at Duke, Califf was one of the country’s top clinical investigators and spent years working with multi-institution groups on clinical trial research and reform. (He was credited as a coauthor, for example, on this 2012 study, which showed cancer trials were more likely than other trials to veer from the so-called “gold standard” of randomization.)

His organization at Duke also worked with drug companies on trials to test their products, a common form of income for academic groups. Those financial ties spurred much of the criticism of Califf, led by Vermont senator and presidential candidate Bernie Sanders.

Ultimately Califf would like to see interconnected health systems that learn from each other: deeper real-world health data informing smarter product R&D, better products leading to better health outcomes, and so on. But he acknowledged we’ve got a long way to go. He cited studies showing that less than 15 percent of the guidelines that steer doctors’ medical decisions are based on what he called “high quality evidence.” If the goal, he said, is “evidence-based practice,” the problem is “we don’t yet have enough evidence to practice on.”

An audience member asked if better methods of collecting real world data would eventually allow FDA to approve drugs with less clinical evidence, on condition that the drugs continue to perform well after approval. Califf replied that drug approvals will always be driven by the quality of the science behind the R&D. But he also acknowledged that, “as the data environment changes, the regulatory environment will change.”

Author: Alex Lash

I've spent nearly all my working life as a journalist. I covered the rise and fall of the dot-com era in the second half of the 1990s, then switched to life sciences in the new millennium. I've written about the strategy, financing and scientific breakthroughs of biotech for The Deal, Elsevier's Start-Up, In Vivo and The Pink Sheet, and Xconomy.