In this video, Rob Corbet, Partner in our Technology and Innovation Group, shares some of the key insights from the recent Governing AI, Powering Innovation survey, the results of which were shared at a launch event at our offices last month. Rob discusses the opportunities and cost reductions that clients have identified through AI. He also highlights areas where clients have concerns, particularly regarding confidentiality, data privacy, and IT security. Additionally, Rob emphasises the importance of an AI strategy and outlines its different stages.
Video Transcription
I'm Rob Corbet, and I'm a partner in the Technology and Innovation Group in Arthur Cox. What was interesting, and not really surprising, was the fact that most of our clients have identified opportunities around efficiencies and cost reductions in AI, and most clients are well on their way on that journey. Only around one third of clients would say they were at an early stage of identifying potential use cases within their organisations. Another one third of the respondents to our client survey are a little bit further down the line, in that they have identified and are developing proof-of-concept projects and rolling those out through their organisations as well. So while there is trepidation, I think, about being too quick out of the blocks in terms of deploying AI into businesses, we are seeing it evolve rapidly because it is very early in the cycle.
I suppose the areas of trepidation that have come through in our survey are around our clients making sure that they get their governance right and that they get their risk management right. In particular, issues are emerging from the survey around confidentiality, data privacy and IT security. 40% of our clients are partnering with third-party vendors, and obviously that takes a little bit of time in terms of building trust with those vendors, so that the AI they are building and bringing into their businesses will fulfil its purposes and will be adequately risk-managed. A mature AI strategy starts with AI literacy across the organisation. One quarter of our client respondents are struggling with some of the concepts in AI; for example, one quarter of them don't understand the difference between a deployer and a provider, which is a very important distinction under the EU AI Act. 38% of the respondents to our survey do have a single person who is charged with being the AI business owner or AI champion. About a third of our clients have developed an AI strategy that's now in place. Another third are working on that strategy; it's in train and is likely to be developed in the short term. The strategy comes from the top down, and then an AI governance framework sits beneath the strategy, which enables the business to really fulfil the potential of AI in a way that is thoughtful and manages the legal and technical challenges associated with bringing any new technology into the business.
I suppose the final part of a mature strategy is embedding it into the organisation, and the areas where we see most attention being focused are procurement, and partnering with third-party vendors who build AI solutions for the business; HR, where particular use cases can present particular risks around discrimination, equality, confidentiality, privacy and so forth; and IT security, which is obviously also integral to any AI strategy that's going to fulfil its purpose. And of course, the legal team should be involved to make sure that all the legal obligations and risk management frameworks are in place, so that the business is both legally compliant and in line with the company's own ethical framework around AI.
Originally published 8 Jul 2025
This article contains a general summary of developments and is not a complete or definitive statement of the law. Specific legal advice should be obtained where appropriate.