On 21 April 2021, the European Commission published its legislative proposal for the EU's first rules on Artificial Intelligence (AI). This AI Act was recently adopted by the European Parliament and will enter into force on 1 August 2024, 20 days after its publication in the Official Journal of the European Union on 12 July 2024. The AI Act focuses mainly on high-risk AI systems, including AI systems used in the workplace. This Alert outlines what the new harmonised rules will mean for employers.
The AI Act
The AI Act is the first comprehensive legislation regulating the use of AI, designed to protect fundamental rights, democracy, the rule of law and environmental sustainability against the risks posed by AI systems.
Article 3 of the Act defines an AI system as: "A machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments."
This is a very broad definition, essentially covering any system that operates with a certain degree of autonomy. In this context, autonomy means that the system produces outputs that are influenced by factors beyond the user's control; examples include chatbots, systems that generate personalised suggestions and automated recruitment tools.
The AI Act takes what it terms a "risk-based approach", distinguishing between AI applications that create (i) an unacceptable risk, (ii) a high risk, and (iii) a low or minimal risk:
- Unacceptable risk: AI systems that pose an unacceptable risk, for instance because they violate fundamental human rights, will be banned. Chapter II of the Act lists prohibited AI practices, such as systems that manipulate behaviour or that score or classify people on the basis of their social behaviour or personal characteristics.
- High risk: under Chapter III of the Act, AI systems that pose a significant threat to health, safety, fundamental rights, the environment, democracy or the rule of law are classified as high-risk. A more detailed list of the categories of high-risk AI systems is set out in Annex III. High-risk AI systems will be subject to strict requirements.
- Low and minimal risk: low-risk AI systems will be subject to transparency requirements to ensure that people are properly informed where needed; for instance, users must be told when they are interacting with an AI system and AI-generated content must be identified as such. Minimal-risk AI systems – the vast majority – can be used freely, without additional obligations under the AI Act.
Workplace AI systems classified as high-risk
The AI Act revolves mainly around high-risk AI systems. Annex III classifies AI systems used in the field of employment, workers' management and access to self-employment as high-risk. This includes, in particular, systems used for the recruitment and selection of natural persons, for decisions on promotion and dismissal, and for the allocation of tasks and the monitoring or evaluation of persons in work-related contractual relationships. After all, these systems can have a major impact on people's future careers and livelihoods.
What are the rules for companies using a high-risk system?
Companies using high-risk AI systems have to meet certain obligations. The AI Act distinguishes between the provider or distributor of an AI system and its user (referred to as the "deployer" in the final text of the Act). High-risk AI systems in the workplace will be subject to the rules for users, unless the company in question is itself the provider or distributor. Users of high-risk AI systems are required to:
- Monitor the use of AI systems and keep track of what data are used to produce output; notify the provider or distributor and stop using the system if there are signs that it may pose a risk to health, safety or the protection of fundamental rights.
- Check, where possible, whether the input data are relevant to the intended purpose of the AI system.
- Follow the provider's instructions and use the system only for its intended purpose. Providers are responsible for having high-risk systems approved, registering the system in the designated EU database and reporting risks to the national competent authorities. A user that markets a high-risk AI system under its own name or trademark, substantially modifies it or changes its intended purpose will itself be deemed a provider and will be subject to the rules for providers.
- Inform natural persons who are exposed to a high-risk AI system that such a system is being used.
Entry into force
The AI Act will enter into force on 1 August 2024, 20 days after its publication in the Official Journal of the European Union on 12 July 2024. Its provisions will then become applicable in stages: after six months, the prohibitions on unacceptable-risk AI systems will apply; after two years, most of the remaining provisions will apply, including the rules on the high-risk AI systems listed in Annex III.
Penalties
The penalty regime for infringing the AI Act is modelled on that of the GDPR. Article 99 of the AI Act lays down the following maximum fines, in each case the higher of the two amounts:
- up to €35 million or 7% of total worldwide annual turnover for violations of the rules on prohibited AI systems;
- up to €15 million or 3% of total worldwide annual turnover for violations of most other rules and obligations under the AI Act; and
- up to €7.5 million or 1% of total worldwide annual turnover for supplying incorrect, incomplete or misleading information to the authorities.
All relevant circumstances will be taken into account in imposing a fine, including the size of the company committing the infringement.
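To illustrate how these caps interact with a company's turnover, the short sketch below computes the maximum possible fine for a hypothetical undertaking. The €2 billion turnover figure is an assumption chosen purely for the example; actual fines depend on all circumstances of the case.

```python
# Hypothetical illustration of the maximum fines under Article 99 of the AI Act.
# The turnover figure below is an assumption for this example only.

def fine_cap(fixed_cap_eur: float, turnover_share: float, annual_turnover_eur: float) -> float:
    """Return the upper limit of the fine: the higher of the fixed amount
    and the given share of total worldwide annual turnover."""
    return max(fixed_cap_eur, turnover_share * annual_turnover_eur)

turnover = 2_000_000_000  # assumed total worldwide annual turnover in euros

print(fine_cap(35_000_000, 0.07, turnover))  # prohibited AI practices: 140000000.0
print(fine_cap(15_000_000, 0.03, turnover))  # most other violations:   60000000.0
print(fine_cap(7_500_000, 0.01, turnover))   # incorrect information:   20000000.0
```

For this hypothetical company, the turnover-based percentage exceeds the fixed amount in every tier, so the percentage determines the cap.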
To do
For companies using AI systems, the AI Act will introduce several new administrative obligations. Our advice is to get a head start on the new rules by:
- Identifying which AI systems are already in use and how they are classified; examples include AI used in onboarding processes, systems that automate HR tasks and systems that allocate bonus payments. A simple way of recording this inventory is sketched after this list.
- Checking whether the AI systems being used meet the relevant rules.
- Checking whether high-risk AI systems are used only for their intended purposes.
- Drawing up a plan for how systems will be monitored and how logs will be stored. Consider making contractual arrangements with AI system providers to ensure that the provider is responsible for keeping logs in accordance with Article 19 of the AI Act.
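As a starting point for the first and last steps above, the sketch below shows one possible way to record an internal inventory of AI systems, their risk classification and the arrangements for log retention. The field names, risk labels and the example entry are assumptions made for illustration; the AI Act does not prescribe any particular format.

```python
# Illustrative sketch of an internal AI-system inventory for AI Act readiness.
# Field names and the example entry are assumptions made for illustration only.

from dataclasses import dataclass

@dataclass
class AISystemRecord:
    name: str                  # internal name of the tool
    purpose: str               # what the system is used for in the workplace
    provider: str              # supplier of the system
    risk_class: str            # e.g. "high" (Annex III), "limited", "minimal"
    intended_purpose_ok: bool  # used only for the provider's intended purpose?
    log_retention: str         # who keeps the automatically generated logs, and for how long
    human_oversight: str       # who monitors the system's output

# Hypothetical example entry for a CV-screening tool (high-risk under Annex III).
inventory = [
    AISystemRecord(
        name="CV screening tool",
        purpose="Filtering and ranking job applications",
        provider="Example vendor",
        risk_class="high",
        intended_purpose_ok=True,
        log_retention="Provider keeps logs for at least six months (contractual arrangement)",
        human_oversight="HR recruitment team reviews all rejections",
    ),
]

# Flag the systems that will be subject to the strict high-risk regime.
high_risk_systems = [r.name for r in inventory if r.risk_class == "high"]
print(high_risk_systems)  # ['CV screening tool']
```

Keeping such an inventory up to date makes it easier to demonstrate which systems fall under the high-risk regime and how the monitoring and logging obligations are being met.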
Originally published 24 July 2024
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.