The fusion of technology is blurring the lines between physical, biological and digital sciences, creating new ways of working, and new ways of designing, testing and analyzing drug candidates. The FDA, long a proponent of machine learning, is on board with these changes, and is integrating them into the agency’s normal workflow.
After reviewing its mid-pandemic performance, the FDA found that its greatest challenge is the handling of data, according to Amy P. Abernethy, M.D., Ph.D., principal deputy commissioner, FDA, speaking at the recent Demy-Colton Virtual Salon, the “Fusion of Technologies.”
She called out data that is siloed, stored in incompatible formats, transmitted by fax machine, or missing needed data points.
“Unless we do something differently, we’ll be mired in paperwork forever,” Abernethy said. “Automation is key across the agency.”
Abernethy, who took on the role of chief information officer this year, got specific.
“We need a coordinated data architecture and technology infrastructure that allows us to understand and use real world evidence (RWE), work smarter and more efficiently, and scale the work we do,” and it needs to be developed in a way that doesn’t create a choke point, Abernethy said.
The difference a digitized approach can make is dramatic. The FDA recently began bringing in 7- and 15-day safety reports for investigational new drugs (INDs) as structured data with a machine interface so they can be rapidly analyzed. “That frees medical reviewers (who previously dealt manually with the forms). It’s more efficient.
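To make that concrete, here is a minimal sketch of how machine-readable safety reports could be screened automatically once they arrive as structured data. The JSON field names, schema, and flagging rule below are hypothetical illustrations, not the FDA’s actual format or review logic.

```python
# Minimal sketch: triaging structured IND safety reports.
# Field names and the flagging rule are invented for illustration only.
import json
from dataclasses import dataclass


@dataclass
class SafetyReport:
    ind_number: str      # identifier of the investigational new drug application
    event_term: str      # reported adverse event term
    serious: bool        # sponsor-flagged seriousness
    days_to_report: int  # elapsed days between the event and submission


def load_reports(path: str) -> list[SafetyReport]:
    """Read a batch of structured safety reports from a JSON file."""
    with open(path) as f:
        records = json.load(f)
    return [SafetyReport(**r) for r in records]


def flag_for_review(reports: list[SafetyReport]) -> list[SafetyReport]:
    """Surface reports that warrant a closer look by a medical reviewer:
    serious events, or submissions outside a 15-day window (assumed cutoff)."""
    return [r for r in reports if r.serious or r.days_to_report > 15]


if __name__ == "__main__":
    reports = load_reports("safety_reports.json")  # hypothetical input file
    for r in flag_for_review(reports):
        print(f"IND {r.ind_number}: {r.event_term} (serious={r.serious})")
```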
“Our goal is to rethink how we manage, analyze, and use data,” she said.
One element involves moving data from silos into the cloud, ultimately enabling data lakes for pooled data sets. The agency also is developing machine learning algorithms for certain purposes. One involves determining which food containers should be inspected at U.S. ports.
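As a rough illustration of that inspection-targeting idea, the sketch below trains a simple risk classifier on synthetic shipment data and ranks entries by predicted risk. The features, labels, and model choice are assumptions made for demonstration; the agency’s actual screening algorithm and inputs are not described in the talk.

```python
# Sketch of a risk-ranking model for import inspections, trained on
# synthetic data. Features and labels are placeholders, not FDA data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical features per shipment: prior violations by the shipper,
# product risk category (0-3), and days in transit.
X = rng.integers(0, 10, size=(1000, 3)).astype(float)
# Synthetic label: 1 = a past inspection found a problem.
y = (X[:, 0] + X[:, 1] + rng.normal(0, 2, 1000) > 8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Rank held-out shipments by predicted risk so inspectors can focus
# on the highest-scoring entries first.
risk = model.predict_proba(X_test)[:, 1]
top = np.argsort(risk)[::-1][:5]
print("Highest-risk shipments (test indices):", top)
```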
Abernethy pointed out that the FDA has its own data sets and also accesses data sets from industry and government.
“Our analysis of data, whether in service of a specific application or review, or in research, is a unique muscle of the agency,” Abernethy said. “It allows us to be very thoughtful in our work and is a beacon of what high quality analysis and credible results look like.”
Such analyses, she said, result in better clinical designs and model-informed drug development. The insights can help the FDA “…think differently, manage drug shortages, better understand the PPE shortage and the supply chain, as well as to identify safety signals or improve personalized drug labels. First though, we have to get the infrastructure right.”
While it develops the infrastructure, the agency also is developing a strong data governance program to resolve such issues as who owns specific data (including patient data) and how it can be used.
“It’s an important, ongoing conversation that is not part of FDA’s specific purview, but part of the larger social contract,” she said.
The initial focus is domestic, but Abernethy anticipates international harmonization eventually.
The FDA is modernizing more than its data infrastructure.
“It’s important to recognize our core mission, to follow the path we know regarding the review of science, attention to data, and the process…but also to ask where the agency can be flexible,” Abernethy said.
At the FDA, that means not only “regulating medical products at the leading edge of healthcare delivery, but also defining what ‘good’ looks like.” Part of that may involve developing or identifying and using new statistical models that identify value.
For example, it is considering new strategies to increase efficiency in clinical trials and within the FDA. Projects include:
- Analyzing the totality of evidence. This is rarely possible, but combining traditional trials and real world data moves us down that path, she said.
- Investigating model-informed drug development. One CDER project, for example, uses clinical data together with exposure-based and statistical models to refine trial design and optimize dose selection for the therapeutic, improving success rates (a simplified exposure-response sketch follows this list).
- Developing pilot programs. For example, Abernethy said, “One Center is simulating heart models to evaluate a catheterization device.”
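To show what exposure-response modeling of this kind can look like in practice, the sketch below fits a standard Emax dose-response curve to invented data and backs out a dose for a target effect. It is a simplified stand-in under stated assumptions, not the CDER project Abernethy described.

```python
# Sketch of exposure-response modeling in the spirit of model-informed
# drug development: fit an Emax curve and pick a dose for a target effect.
# Doses, responses, and the target value are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit


def emax(dose, e0, emax_, ed50):
    """Classic Emax dose-response model."""
    return e0 + emax_ * dose / (ed50 + dose)


# Hypothetical observed mean responses at studied doses.
doses = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
responses = np.array([2.0, 9.0, 14.0, 22.0, 27.0, 30.0])

params, _ = curve_fit(emax, doses, responses, p0=[2.0, 30.0, 20.0])
e0, emax_hat, ed50 = params

# Invert the fitted curve to find the dose expected to reach a target response.
target = 20.0
dose_for_target = ed50 * (target - e0) / (emax_hat - (target - e0))
print(f"Fitted ED50 = {ed50:.1f}; dose for a {target:.0f}-unit response = {dose_for_target:.1f}")
```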
Although reimbursement isn’t an FDA concern, “We have a body of work emerging now in which we’re looking to align post-approval FDA requirements with the Centers for Medicare & Medicaid Services (CMS) concerns around coverage and evidence, to create multipurpose data that can be used by both simultaneously.
“Lastly, manufacturing will continue to be a major area of focus and innovation,” Abernethy added. “In cell and gene therapies, it is a central component to move that space forward.”
To support modernization efforts, “The FDA does a lot of its own science,” she noted. “The 800-person Office of the Chief Scientist (OCS) is developing a lot of simulation models. One of the areas is the reduction of the use of non-human primates. We’re evaluating organs on a chip and other approaches.”
The OCS helps fill gaps and builds a cadre of experts across the agency. Familiarity is important, whether the topic is simulations, RWE, or preclinical models.
“The FDA has made incredible strides in terms of collaboration, urgency, and its use of real world data and clinical trial information. Moving forward, it’s important to ask what it takes to ensure continued responsible progress,” she said.
The goal, she stressed, is to bring safe, effective vaccines, therapeutics, and diagnostics to people without regard for commercial concerns.
Advancing the capability “…requires leadership across the FDA to describe this cohesive vision. As we go into the next decade, this is an important narrative to push forward,” Abernethy concluded.