Ian Duffy, Partner in the Technology and Innovation Group, shares key takeaways from the Governing AI, Powering Innovation survey event.
Ian explores how effective governance is becoming central to AI strategy, especially as organisations prepare for compliance with the EU AI Act. He outlines a practical roadmap for boards and senior leadership to ensure oversight and accountability in AI deployment.
Video Transcription
My name is Ian Duffy and I'm a partner in the Technology and Innovation Group here at Arthur Cox. Effective governance and oversight is really a key part of lots of the regulatory developments and new rules we've seen in the data, cyber and digital space in recent years, and the AI Act is similar in that regard. The board and senior management will remain ultimately responsible for how the organisation develops and deploys AI in practice, so it will be really important to ensure that there are appropriate reporting lines and appropriate oversight from the board's perspective. That way the organisation can benefit from the board's bird's eye view of how AI is being used within the organisation, how it's being developed and how it's being rolled out, and the board can challenge that development and use as they deem appropriate.
I think the key for many organisations, if they haven't already done so, will be really starting to get their arms around how AI is currently being used within their organisation and how they propose to use it over the short to medium term. This will really involve compiling an AI inventory: how is AI currently being used within the organisation, and how do you propose to use it going forward? From there you can start to figure out what sort of obligations attach to that AI and those AI systems under the AI Act.
The next step on that journey will be to categorise the various AI systems you've identified by reference to the different categories of AI systems under the AI Act, and then to figure out your role in respect of each system. Are you developing the AI system, making you a provider under the AI Act, or are you simply using it, making you a deployer? From there you'll be able to clearly identify what your obligations are under the Act and build out an implementation plan to help you achieve alignment with those requirements. It's also important to bear in mind that lots of organisations will already have very sophisticated risk management frameworks in place, developed for the purpose of alignment with other relevant regulatory regimes such as DORA, NIS2 and the GDPR.
So look for opportunities to leverage those existing regimes when working to comply with the AI Act. Be mindful of the staggered implementation deadlines under the AI Act when figuring out the timelines for complying with its different requirements, and of the fact that the requirements around AI literacy and prohibited AI practices are already in force. For more information, please visit arthurcox.com/technologyandinnovation
Originally published 18 Jul 2025
This article contains a general summary of developments and is not a complete or definitive statement of the law. Specific legal advice should be obtained where appropriate.