Healthcare organizations of every shape and size are rapidly expanding their use of artificial intelligence solutions, from high-risk applications like clinical decision-support interventions, ambient listening, and charting to lower-risk administrative activities like automated patient communications and scheduling. While adoption is widespread and deepening across the industry, not every healthcare organization has established AI governance or a monitoring process for the exploration and adoption of new tools, including organizations contemplating a sale of assets or equity. For buyers in healthcare mergers and acquisitions today, AI diligence needs to be a focus, given the potential compliance and class action risks related to high-risk AI solutions, particularly those that interact in any way with protected health information ("PHI") regulated under the Health Insurance Portability and Accountability Act, as amended, and its implementing regulations (collectively, "HIPAA").
Understanding AI Risks in Healthcare Transactions
As mentioned, not every seller in a healthcare transaction is fully aware of the scope of its AI use and deployment, and many lack a comprehensive AI governance and monitoring strategy. For a buyer, understanding how the seller uses AI and assessing the risk level posed by its existing applications is the best way to identify and mitigate potential problems and plan for success in the post-closing integration process. Once buyers identify what AI applications are in use at their target, they and their advisors can examine potential HIPAA and intellectual property risks in addition to assessing the seller's related vendor arrangements, particularly with respect to data ownership and use, security/data privacy, indemnification, and reporting obligations. Having a good sense of where AI arrangements may involve a high degree of compliance or contractual risk will allow the buyer to negotiate effectively to avoid assuming potential liabilities above its risk tolerance and to be clear about areas for mitigation and improvement after closing.
State Laws and Evolving AI Regulations in Healthcare
In addition to purely operational and contractual risks, buyers need to understand whether applicable state laws affect the seller's use of AI applications (since AI is not currently subject to comprehensive federal regulation). The highest-risk areas include required disclosures of AI use in decision-making for activities such as prior authorization; patient consent and authorization (in compliance with HIPAA), including for ambient listening; and consumer privacy protections. Buyers should monitor regulatory developments related to AI in states where they operate or are targeting potential acquisitions.
For example, in California, the governor signed Assembly Bill 489 on October 11, 2025. This bill prohibits AI systems and chatbots that communicate directly with patients from suggesting that the advice they give comes from a licensed health professional.1 The prohibition applies not only to direct statements by the AI, but also to any implication that the medical advice has come from a licensed person. Similarly, in Illinois, the Wellness and Oversight for Psychological Resources Act prohibits anyone, even licensed providers, from using AI in the decision-making process for mental health and therapy services, including AI-generated recommendations to diagnose, treat, or improve someone's mental or behavioral health (with carve-outs for administrative support).2 We expect that states will continue to expand regulation in this space and that enforcement activity will increase in industries where AI use may pose outsized risk to the public, particularly healthcare.
What Buyers Can Examine During AI Due Diligence
Staying ahead of the many challenges that accompany AI use in healthcare means conducting due diligence of a target's AI use with an eye toward identifying the areas of highest risk and planning potential mitigation strategies. Buyers may structure their AI diligence to cover the following risk-management areas:
- Understanding AI oversight in the target (e.g., AI governance committee, Chief Information Officer, or Chief AI Officer);
- Assessing the degree to which the target has developed and implemented AI oversight activities (e.g., through a formal AI governance survey and strategy or other informal assessments);
- If the target has adopted an AI governance program, assessing its implementation and any AI-specific policies and procedures (e.g., pilot programs, use of approved technologies, bias controls, data validation, and audits);
- Confirming the target's approved uses of AI technologies and the level of potential risk (e.g., clinical decision support interventions, patient monitoring, diagnostic assistance, ambient listening technologies, etc.) and vendor relationships;
- Examining a list and descriptions of all AI tools and AI models used by, developed by, or trained by the target company, including detailed information related to the use cases for each AI tool/model, scope of use, and methods of access;
- If the target company relies on third-party AI developers or vendors to support its AI implementation, reviewing all third-party AI vendor/developer model cards and contracts (including contracts involving AI use for clinical research purposes); and
- Reviewing the target company's standard terms for its AI vendors (e.g., with respect to data ownership, auditing, reporting, service level agreements, and indemnity terms) and any material open claims.
Collaborating Across Legal, IT, and Clinical Teams
The buyer's legal counsel should have specific expertise not only in healthcare but also in healthcare data privacy, security, and AI, so that it can assess risk related to the target's operations, structure, and potential high-risk areas and make practical recommendations for go-forward operations and integration. Further, the buyer should keep its counsel in the loop to coordinate the diligence review process.
By virtue of their responsibilities, the buyer's IT, operations, and clinical employees will bring valuable insights into how the seller's AI use may impact the buyer's go-forward operations, including with respect to integrating with the buyer's AI strategy. Advisors might focus on potential quality of care and privacy concerns and work together to provide a comprehensive evaluation of potential concerns and high-quality recommendations for the buyer's executive team.
Building a Post-Closing AI Governance and Compliance Strategy
In conjunction with the due diligence review, the buyer may consider developing a strategy for how it and its target will manage AI risks post-closing (e.g., determining whether and to what extent the target's existing vendor agreements may be assigned or amended in connection with closing, planning the integration of AI tools and IT capabilities, and planning for go-forward AI governance, oversight and monitoring, and patient care and safety). To the extent a buyer does not have its own existing governance plan, it may consider undertaking an AI use survey and adopting a formal AI governance strategy, which allows for data protection and access controls, long-term compliance protections, streamlined assessment and adoption of potential AI tools, smoother vendor negotiations, and oversight of ongoing AI activities.3
Key Takeaways for Healthcare Buyers and Investors
AI is an evolving legal and operational risk area in healthcare transactions. Conducting an effective due diligence review of AI in a proposed transaction calls for a detailed understanding, by the buyer and its counsel, of the technology itself, as well as the potential risks and liabilities surrounding its use. This rapidly developing area of law will continue to shape the regulatory landscape of the healthcare field, but with the right preparation, the diligence process can minimize a buyer's exposure and best position it for post-closing success.
Footnotes
1. A.B. 489, State Leg. 2025–26 Sess. (Cal. 2025), https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202520260AB489
2. H.B. 1806, 104th Gen. Assemb. (Ill. 2025), https://www.ilga.gov/legislation/PublicActs/View/104-0054
3. Sheppard Mullin Healthcare Law Blog, Key Considerations Before Negotiating Healthcare AI Vendor Contracts (Mar. 2025), https://www.sheppardhealthlaw.com/2025/03/articles/artificial-intelligence/key-considerations-before-negotiating-healthcare-ai-vendor-contracts/
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.