In this first part of our series on the procurement and deployment of AI solutions, we outline key considerations for organisations conducting a risk assessment of a proposed new AI solution and undertaking due diligence on its proposed provider.

Initial determination and risk assessment

Why is an AI solution needed?

As with the start of any procurement journey, one of the initial considerations must be whether a third-party AI solution is actually needed to meet the relevant business needs and whether such an AI-driven solution would deliver value for the organisation.

Cost/benefit analysis

Part of this assessment will be a cost/benefit analysis of the proposed AI solution. While the benefits of the AI solution may be increased productivity and automation, it is important to consider that the solution may also require some potentially hidden upfront investments, for example, in collecting, collating and sanitising data sets to train or otherwise feed the AI solution. Other costs may include the resources involved in the initial and ongoing training required to operate the AI solution. AI, despite the buzz, is not a "one size fits all" solution. The usability and effectiveness of AI tools often depend on the quality of training provided and information ingested.

Alignment with business and commercial strategies

Organisations should think clearly about how the particular use of an AI solution will align with their business and commercial strategies. For example, the commercial strategy may be "innovation first", or the goal may be to be first to market with a particular solution to gain a competitive edge. On the other hand, the strategy may be cost efficiency, supporting under-resourced teams or addressing compliance obligations. How the use of AI solutions supports or advances these objectives will be an important consideration.

AI preparedness

Does your organisation have expertise in the deployment, operation and governance of the AI solution being sought? For example, does your organisation understand the data required to effectively utilise the AI solution, and is such data sufficiently accessible and usable within your organisation? This assessment might clarify whether pre-procurement workstreams are needed before going to tender or whether a longer implementation time is required. Under the AI Act, both providers and deployers of AI systems must take measures to ensure a sufficient level of AI literacy among their personnel dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context in which the AI systems are to be used. For high-risk AI solutions, human oversight (i.e. human-in-the-loop) is an important component of compliance with the AI Act. Therefore, it will be important for your organisation to understand what expertise it already has, or what it needs to develop, as part of the procurement and deployment process.

Risk assessment

Many organisations already have tiered rating systems to determine the criticality of different solutions. These systems will need to be appropriately adapted to account for the criticality of AI solutions from both an operational and a risk perspective. As part of this, organisations will need to consider whether an AI solution would be deemed "high-risk" under the AI Act. For example, if the solution will be used to determine the creditworthiness of borrowers or the eligibility of applicants for a job role, then it will likely be "high-risk" and the risk assessment and due diligence process will need to be tailored to address specific requirements of the AI Act (e.g. determining whether the AI provider can provide sufficient information on the operation of its AI solution for the organisation to comply with the transparency requirements of the AI Act).

Due Diligence

Assuming the initial determination and risk assessment conclude that your organisation wishes to proceed with the procurement and deployment of the proposed AI solution, it will be necessary for the organisation to conduct due diligence specific to the proposed AI provider.

While due diligence of an AI provider is similar to due diligence undertaken for any other important technology provider, there are certain issues that should be of particular focus:

Cybersecurity and resilience

It goes without saying that the security of AI solutions, particularly those processing large volumes of personal data, is paramount, with high-risk systems subject to prescriptive cybersecurity requirements under the AI Act. While eliminating security incidents may be the goal, they do occur, and organisations with detailed incident preparation plans will have identified any novel vulnerabilities that a new solution may introduce. From a remediation and breach notification perspective, organisations should be able to identify any affected data sets and explain how an incident occurred. To this end, under the AI Act, deployers of high-risk systems must keep logs from their AI systems for a prescribed period of at least six months, and it will be important during the due diligence phase to obtain assurances from the proposed AI provider as to the maintenance and efficacy of those logs.

Training data and data protection

AI providers should provide detailed descriptions of the provenance of any training data and should be able to provide assurances regarding their compliance with data protection laws in training and building their AI solution. They should also be able to support the organisation in relation to its data protection compliance obligations (e.g. data subject requests).

Transparency and explainability

Understanding how an AI solution works is important to ensure that the organisation is transparent with its customers. Therefore, as part of its due diligence, the organisation will need to ensure that it obtains information on how the AI solution is trained, the provenance of data sets within the AI system and how data will be used and managed by the AI provider. In the context of high-risk AI solutions (e.g. solutions operating in the area of employment/recruitment), the AI Act imposes additional transparency requirements around the "explainability" of the AI solution, such as explaining the logic behind the solution's outputs. The organisation should gather information during the due diligence phase to help it understand whether the AI provider will be able to help it comply with these requirements.

Intellectual property

For generative AI solutions, an important aspect of due diligence will be understanding what rights are being sought by the AI provider in relation to inputs (including data sets and prompts) and outputs. For example, an AI provider may seek rights of use and ownership in inputs and outputs connected to the organisation's use of the generative AI solution (e.g. for product improvement and model learning purposes), which may prejudice the organisation's ability to use such data on an ongoing basis. It will also be important to clarify during the due diligence phase that the AI provider is able to grant the organisation all licences required to use the AI solution.

Accuracy, non-discrimination and bias

During the due diligence phase, AI providers should explain how they ensure that their solution is accurate and does not create a risk of bias or discrimination. Depending on the use of the AI solution and whether it is high risk, the organisation may also wish to conduct a data protection impact assessment (and/or a legitimate interest assessment, if applicable) at an early stage to further assess the privacy risks associated with the AI solution and how they may be mitigated. Information obtained from the AI provider during the due diligence phase around how it ensures accuracy and mitigates the risk of potential bias/discrimination in its solution will help with preparing these assessments.

Alignment with sustainability and ethical values

Any AI solution should align with the organisation's sustainability and ethical values and principles. Therefore, it will be important to gather information from the AI provider during the due diligence phase to affirm this.

This article contains a general summary of developments and is not a complete or definitive statement of the law. Specific legal advice should be obtained where appropriate.