On 6 August 2025, the European Insurance and Occupational Pensions Authority (EIOPA) published its Opinion on AI Governance and Risk Management (Opinion).
The Opinion is addressed to national competent authorities such as the Central Bank of Ireland. Its stated objective is to provide guidance on existing insurance sectoral legislation as both (re)insurance undertakings and (re)insurance intermediaries (Undertakings) integrate AI systems into their respective areas of competence in the insurance value chain.
The Opinion provides guidance on the main principles and requirements in relation to insurance AI systems that are not considered prohibited AI practices or high-risk under the AI Act. It neither imposes additional requirements nor alters the scope of the AI Act by extending the AI Act's requirements for high-risk AI systems to all AI systems used in insurance. It considers the application of insurance sectoral legislation to AI systems that either did not exist or were not widely used when that legislation was implemented.
The Opinion focuses on AI Governance and Risk Management Frameworks under the following headings (with the themes reflecting the AI Act's high-risk AI principles):
- Risk-based approach and proportionality;
- Risk management system;
- Fairness and ethics;
- Data governance;
- Documentation and record-keeping;
- Transparency and explainability;
- Human oversight; and
- Accuracy, robustness and cybersecurity.
Under the Solvency II Directive (Solvency II), the Insurance Distribution Directive (IDD) and the Digital Operational Resilience Act (DORA), Undertakings are required to establish and maintain effective systems of governance and risk management frameworks which are proportionate, appropriate and tailored to the nature, scale, and complexity of the Undertaking's operations or the specific insurance products offered. To ensure alignment with this requirement, the Opinion stipulates that Undertakings conduct a two-step impact assessment of the risk posed by the different AI systems in use.
Impact Assessment
Step 1: Assessment Considerations
To ensure they have effective AI governance and risk management frameworks in place, Undertakings must, as the first step of the assessment, evaluate their AI systems against a non-exhaustive list of criteria provided by EIOPA, including:
- the processing of data on a large scale;
- the sensitivity of the data;
- the number of customers (including vulnerable customers);
- the extent to which the AI system can act autonomously; and
- whether the AI system is used only internally or in consumer-facing applications.
Undertakings will also need to consider insurance-specific criteria, including:
- where certain categories of personal data (e.g. the age of the customer) need to be used to underwrite the risk; and
- the extent to which an AI system is used in a line of business that is important for the financial inclusion of customers or which is compulsory by law.
Undertakings must also account for prudential considerations including the impact of AI systems on the business continuity, financial position, legal obligations and reputation of an Undertaking.
Step 2: Post-Assessment Actions
As a second step, taking into account the impact assessment, Undertakings should develop a set of proportionate measures that aim to ensure the responsible use of the AI system. The measures must be proportionate to the level of risk posed by the AI system. For AI systems that have a low or very limited impact on customers or Undertakings themselves, the supervisory expectations would be very limited, and conversely, for AI systems that pose higher risks, there would be more rigorous expectations.
Other Considerations
Additionally, the Opinion provides that Solvency II, the IDD, DORA, the Solvency II Delegated Regulation and the Product Oversight and Governance Delegated Regulation require Undertakings to consider the following areas as they develop their governance and risk management frameworks:
- Fairness and ethics – Undertakings must treat customers fairly by using AI in a responsible, customer-focused manner. This includes developing a corporate culture documented in policies that include:
    - ethics and fairness;
    - guidance and training for relevant staff; and
    - sound data governance that aims to remove biases in the data.
- Data governance – Data governance policies must be implemented to ensure data used to train the AI system is complete, accurate and appropriate.
- Documentation and record keeping – Clear records of AI-related decisions and processes should be maintained and reviewed regularly.
- Transparency and explainability – Outcomes of AI systems must be explainable and tailored to the needs of different stakeholders.
- Human oversight – Undertakings must ensure appropriate human oversight and accountability for AI systems.
- Accuracy, robustness and cybersecurity – AI systems should be secure, resilient, and regularly tested for performance and reliability.
The Opinion provides that the responsible use of AI systems is not achieved by a standalone measure, but by a combination of different risk management measures. Undertakings must adopt a holistic approach to AI governance and risk management, acknowledging that cross-references and interdependencies between measures are to be expected. The Opinion emphasises that Undertakings are responsible for the AI systems they use, regardless of how the AI systems were developed.
Conclusion
The Opinion emphasises the importance of a proportionate, risk-based approach to AI governance and risk management for Undertakings. It provides guidance on interpreting key legislative frameworks including Solvency II, IDD, and DORA in this context, and clarifies that Undertakings are expected to remain compliant with evolving regulatory expectations as AI systems become more embedded in insurance operations.
Should you have any queries on the EIOPA Opinion on AI Governance and Risk Management or any point raised in this article, please contact a member of the Insurance Team or your usual William Fry contact.
Contributed by Molly Ryan and Caitlin Lenihan
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.