EU AI Act Adopted

Mayer Brown



Today, May 21, 2024, the European Council gave the final approval needed for the EU AI Act. The text of the law is now final and will be published in the coming days. Below we summarize key points and note potential impacts on businesses.

Timeline
  • Provisions on banned AI systems will start applying 6 months after the Act enters into force (20 days after publication) – very likely still this year;
  • Provisions on general-purpose AI (GPAI) models will start applying 1 year after entry into force;
  • The bulk of the obligations will start applying 2 years after entry into force, with some obligations applying only after 3 years (particularly those relating to high-risk AI systems that are safety components of products regulated under EU product safety legislation).

Extraterritorial Reach
  • US and global companies that use AI anywhere could be subject to the EU AI Act if the output of the system is used in the EU.
  • For example, if a US company uses an AI tool to filter CVs for a job vacancy in the EU, the output of the system is used in the EU, so the EU AI Act applies.
  • Another example is a high-risk AI system developed in the US and integrated into a product (e.g., in a connected vehicle) that is then sold in the EU.

Obligations Depend on Role and Risk
  • For each AI system being developed or used, companies will need to assess several aspects in order to determine their obligations, if any, arising from the EU AI Act:
    • What their role is with regard to the AI system (as a provider, deployer, importer or distributor)
    • The type of AI, in particular, whether the system is general purpose AI or not
    • For general purpose AI, whether there is systemic risk
    • For other AI systems, what the level of risk is (unacceptable, high risk, limited risk, low risk, or no risk)
  • For example, providers of high-risk AI systems will have to conduct conformity assessments and comply with extensive compliance obligations (e.g., relating to cybersecurity, privacy, data governance, risk and quality management, and technical documentation).
  • In contrast, deployers of limited-risk systems like chatbots only have to comply with transparency obligations under the EU AI Act.

Impact on Purchasing AI
  • The EU AI Act will likely help businesses purchasing AI because it will clearly stipulate which obligations fall on providers of those systems (which will be the most extensive obligations).
  • Providers' terms and conditions will need to be amended to reflect this allocation, and the provider will need to supply the system's technical documentation, attached to the contract, so that the deployer can follow any applicable instructions.
  • In addition, the provider will want the contract to clarify the obligations that fall on the deployer (e.g., data governance obligations where the deployer controls the data input into the system). The purchasing of AI systems is therefore likely to undergo contractual changes with the adoption of the EU AI Act.
  • Additional interesting questions relate to the allocation of obligations between providers (e.g., if an AI system is sold by a company other than the developer, under its trademark) or between deployers (e.g., in outsourcing, if a vendor of a company uses AI, and both are likely to be considered deployers). Another question is whether liability for a violation of the EU AI Act can fall on the contractual party in these scenarios. We will address these and other questions in upcoming presentations and publications.

How We Can Help
For help complying with the EU AI Act, contact one of the authors or your regular Mayer Brown contact. We have been advising clients on assessing their highest AI risks, using a method that takes account of their specific business models, and helping them establish robust AI governance.


Mayer Brown is a global services provider comprising associated legal practices that are separate entities, including Mayer Brown LLP (Illinois, USA), Mayer Brown International LLP (England & Wales), Mayer Brown (a Hong Kong partnership) and Tauil & Chequer Advogados (a Brazilian law partnership) and non-legal service providers, which provide consultancy services (collectively, the "Mayer Brown Practices"). The Mayer Brown Practices are established in various jurisdictions and may be a legal person or a partnership. PK Wong & Nair LLC ("PKWN") is the constituent Singapore law practice of our licensed joint law venture in Singapore, Mayer Brown PK Wong & Nair Pte. Ltd. Details of the individual Mayer Brown Practices and PKWN can be found in the Legal Notices section of our website. "Mayer Brown" and the Mayer Brown logo are the trademarks of Mayer Brown.

© Copyright 2024. The Mayer Brown Practices. All rights reserved.

This Mayer Brown article provides information and comments on legal issues and developments of interest. The foregoing is not a comprehensive treatment of the subject matter covered and is not intended to provide legal advice. Readers should seek specific legal advice before taking any action with respect to the matters discussed herein.
