ARTICLE
31 July 2025

Surveillance Or Safeguard? The CNIL's Verdict On Augmented Cameras For Age Verification

Lewis Silkin

Contributor


The French Data Protection Authority (CNIL) has determined that the use of augmented cameras to estimate a customer's age for the sale of restricted items – such as tobacco or alcohol – is both disproportionate and unnecessary. This decision follows from the introduction of augmented cameras in retail environments, which were adopted as a means to detect and prevent the sale of tobacco to minors. However, significant concerns have been raised regarding their operational use, particularly in the context of data protection compliance and the safeguarding of individual rights.

A closer look into France's AI-driven age verification processes

French law requires retailers to verify a customer's age before selling age-restricted goods such as tobacco. In an effort to meet this obligation, some tobacconists introduced augmented cameras in their establishments. Unlike traditional video surveillance systems, these cameras employ AI-driven algorithms to scan facial features and estimate whether an individual is a minor or an adult. The technology is intended to streamline retailers' compliance checks: the system operates in real time, processing images and signalling eligibility through a green or red light to indicate whether the individual appears to meet the legal age threshold – typically 18 or 21 years old. The design is purported to align with the General Data Protection Regulation (GDPR) and data protection legislation by ensuring that images are neither stored, recorded, nor transmitted to external servers.
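The workflow described above – estimate locally, signal green or red, discard the image – can be sketched in outline. This is a purely illustrative Python sketch, not any vendor's actual implementation: `estimate_age` is a stand-in for a trained facial-analysis model, and the 18-year threshold and function names are assumptions.

```python
LEGAL_AGE_THRESHOLD = 18  # assumption: typical French threshold for tobacco/alcohol

def estimate_age(image_frame) -> float:
    """Stand-in for an on-device facial-analysis model.
    Real estimators return an approximate age with an error margin,
    which is precisely the uncertainty the CNIL highlighted."""
    return 22.0  # placeholder estimate for illustration

def eligibility_signal(image_frame, threshold: int = LEGAL_AGE_THRESHOLD) -> str:
    """Process a single camera frame locally and emit a green/red signal.
    The frame is discarded immediately: nothing is stored, recorded,
    or transmitted to external servers, mirroring the claimed design."""
    estimated_age = estimate_age(image_frame)
    signal = "green" if estimated_age >= threshold else "red"
    del image_frame  # local processing only: no retention, no upload
    return signal
```

Even in this simplified form, the design issue the CNIL identified is visible: every customer's image is processed by `eligibility_signal` regardless of whether age verification is actually needed for that person.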

However, this promise of privacy falls short. The CNIL has raised concerns that these "devices can only estimate the age of people, without certainty". This inherent risk of error – common to many AI age-estimation tools – raises questions about the reliability of the technology. From a GDPR perspective, this lack of accuracy undermines the necessity of its use, particularly since traditional age verification methods must still be employed. Moreover, as with any AI-driven system, there is a risk of bias, which could lead to discriminatory outcomes if an individual is incorrectly assessed. As a result, the initial facial analysis may be viewed as superfluous, adding an unnecessary and potentially problematic layer to the verification process.

The risks of AI in age verification

In its official statement, the CNIL emphasised that "these algorithmic age estimation devices inherently present risks to the fundamental rights and freedoms of individuals, despite certain guarantees such as local data processing and rapid deletion of images". A key concern is the infringement of an individual's right to object, as provided under Article 21 of the GDPR – a right that must be upheld under data protection law. The technology, by design, captures and processes the biometric data of every customer entering a store, regardless of their age. This blanket approach means that surveillance is applied to all individuals (rather than limiting it to those for whom age verification is genuinely required), raising concerns around proportionality and necessity. As such, augmented cameras impose a level of surveillance that is difficult to justify under data protection principles.

The CNIL also warned of the broader implications of deploying such surveillance tools at scale. Their normalisation risks creating an environment of constant monitoring that far exceeds what is necessary for regulatory compliance. Individuals are effectively forced to submit to facial analysis simply to purchase age-restricted goods, further eroding their ability to exercise the right to object. In this context, the CNIL concluded that the use of augmented cameras fails to meet the GDPR's core principles of necessity and proportionality.

This position aligns with the CNIL's longstanding approach to camera surveillance in public spaces, as outlined in its July 2022 guidance. It also signals a broader regulatory trend: AI-powered technologies, particularly those involving biometric data, will continue to face heightened scrutiny where their deployment risks undermining fundamental rights.

Why does this matter?

The decision underscores the importance of ensuring that any technological solution deployed for compliance purposes must not only be effective but also proportionate, necessary, and fully compliant with data protection laws. The CNIL's findings serve as a clear warning that the adoption of innovative technologies, particularly those involving biometric data and surveillance, will be subject to rigorous scrutiny by regulators. Businesses must therefore take a cautious and well-informed approach when considering the implementation of such tools, ensuring that they do not inadvertently infringe upon individuals' rights or fall foul of legal requirements.

Takeaways

There are several key takeaways from the CNIL's decision:

  1. Respect Individual Rights: Businesses must ensure that individuals can meaningfully exercise their rights and are not forced to undergo biometric analysis simply to access goods or services.
  2. Targeted Surveillance: Surveillance measures should be narrowly tailored to their intended purpose, i.e. what is strictly necessary. Capturing and processing the biometric data of all customers, regardless of age or purchase intent, constitutes indiscriminate monitoring and is difficult to justify under data protection law.
  3. Necessity and Proportionality Are Non-Negotiable: Any technology used for compliance, particularly those involving personal data or surveillance, must be demonstrably necessary and proportionate to the aim pursued.
  4. Regulatory Scrutiny of AI and Surveillance Technology: The CNIL's stance reflects a broader trend of heightened regulatory scrutiny of AI-powered and surveillance technologies. Organisations should anticipate ongoing and evolving requirements in this area and remain vigilant of changes in regulatory expectations and guidance.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
