ARTICLE
6 September 2024

FDA Continues To Intentionally Incorporate AI Into Medical Product Development With Its Establishment Of A New Internal Advisory Body

Mintz

Contributor

Mintz is a general practice, full-service Am Law 100 law firm with more than 600 attorneys. We are headquartered in Boston and have additional US offices in Los Angeles, Miami, New York City, San Diego, San Francisco, and Washington, DC, as well as an office in Toronto, Canada.

Agencies across the federal government continue to grapple with how to respond to the directives in President Biden's October 2023 executive order on artificial intelligence, including the Food and Drug Administration (FDA) and its parent agency, the Department of Health and Human Services (DHHS). As summarized in this handy Mintz/ML Strategies timeline of the actions set forth in President Biden's AI executive order, DHHS has a mandate to "develop a strategy for regulating the use of AI in the drug development process" and a deadline of October 29, 2024, to meet that obligation. Notably, earlier this year DHHS announced a functional reorganization within the department that included creating the position of Chief AI Officer in response to the presidential order.

The upcoming October 2024 milestones may explain why FDA organized a workshop in early August to collect feedback and hear from stakeholders on diverse applications of AI in the drug and biological product development processes (see our report from the workshop here), and why it more recently created an Artificial Intelligence Council within the Center for Drug Evaluation and Research (CDER).

According to an email communication sent internally by the CDER director to her staff, the new Artificial Intelligence Council is intended to serve as a centralized resource for both information-gathering and information-dissemination within and from the Center. It is also expected to manage inquiries from external stakeholders such as DHHS leadership, the White House, Congress, and members of the pharmaceutical industry. The internal communication also explained that the new Artificial Intelligence Council will consolidate efforts previously initiated by smaller groups, such as the CDER AI Steering Committee and the AI Policy Working Group, and will continue the Center's collaboration within the wider federal government's AI Community of Practice. Among the primary goals of the new CDER AI Council are promoting consistency and facilitating trustworthy use of the technology in CDER's regulatory and administrative missions, as well as in its own research and internal management activities.

One important area the AI Council could help shape is how the agency approaches and manages the use of AI in the manufacture and testing of pharmaceutical products, as well as in the development of complex drug-device combination products. Drug-device combination products in particular may incorporate multiple applications of AI with different levels of confidence, and they can present challenging issues during premarket review, especially if each Center applies a slightly different approach to AI in products and product development. It is unclear whether characterizing such a combination product as "drug-led" or "device-led" will result in different regulatory or development outcomes, depending on whether a private developer receives the bulk of its feedback from CDER or from its sister entity within FDA, the Center for Devices and Radiological Health (CDRH). Although the agency released a white paper in March 2024 entitled "Artificial Intelligence and Medical Products: How CBER, CDER, CDRH, and OCP are Working Together," the paper makes high-level commitments to collaborate and share information but does not address the operational aspects of coordinating on specific product development initiatives.

Mintz and ML Strategies will continue monitoring and reporting on key developments in the oversight and regulation of AI applications.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
