On June 24, 2024, the Brazilian Data Protection Authority (ANPD) released the second edition of Radar Tecnológico (Technology Radar). This edition (the "Paper") focuses on "Biometrics and Facial Recognition." The Paper analyzes the impacts of this technology, particularly the risks and challenges for the protection of personal data, especially in view of the Brazilian General Data Protection Law (LGPD).
DEFINITIONS
The Paper first defines biometric recognition: the automated identification of an individual based on physiological characteristics (fingerprint, face, iris, etc.) or behavioral characteristics (voice, facial expression, gait, etc.). The Paper then defines a biometric template as the reference record used to verify someone's identity through facial recognition; the template is stored as a hash code in the recognition system's database.
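To illustrate the concept of storing a template as a hash rather than as raw biometric data, the following is a minimal sketch (illustrative only: the feature values are invented, and real biometric systems use specialized template-protection schemes and similarity thresholds, since two captures of the same face are never byte-identical):

```python
import hashlib

def template_hash(features: list[float]) -> str:
    """Serialize a biometric feature vector and return its SHA-256 digest.

    Illustrative only: real systems quantize and protect templates so that
    slightly different captures of the same person can still be matched.
    """
    serialized = ",".join(f"{v:.4f}" for v in features)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Enrollment: only the hash is stored in the database, not the raw template.
enrolled = template_hash([0.1234, 0.5678, 0.9012])

# Verification: an identical template produces the same hash ...
assert template_hash([0.1234, 0.5678, 0.9012]) == enrolled
# ... while a different template does not.
assert template_hash([0.1111, 0.2222, 0.3333]) != enrolled
```

The design point the Paper alludes to is that a breach of such a database exposes derived codes rather than reusable raw biometric images, although, as discussed below, breached templates can still facilitate identity theft.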
USES
The Paper focuses on facial recognition, which, in biometric recognition, has three main purposes: (i) to simply detect the presence of people, (ii) to unequivocally identify an individual, and (iii) to classify an individual according to their emotional expressions. The Paper also notes facial recognition's use in neurotechnology, especially brain-computer interface (BCI) technologies, which are already widely used in healthcare and have made significant progress in recent years.
AI-RELATED AND OTHER RISKS
The ANPD repeatedly notes the use of artificial intelligence (AI), which, through training with large amounts of data, has contributed to the accuracy of biometric identification systems. However, it is essential to note that the European Union's Artificial Intelligence Regulation ("EU AI Act") prohibits, for example, the use of biometric AI systems aimed at inferring an individual's emotions within the workplace or educational institutions. The EU AI Act also bans the use of biometric AI systems for other purposes, such as categorizing natural persons based on their race.
Also on this point, biometric identification and authentication systems for recognizing emotions are considered high-risk—but are not banned—in Article 14, XI of the June 18, 2024, version of the Brazilian Artificial Intelligence Act (Bill No. 2.338/2023).
The Paper also stresses that biometric data is considered sensitive data under the LGPD, which heightens the risk of any processing activity. The identification and possible classification of individuals using neurotechnology, for example, therefore raises notable data protection and privacy risks, among other concerns.
The ANPD highlights the possible existence of biases in the training databases of facial recognition systems, which could lead to discrimination against individuals. The ANPD also points out how data breaches of biometric templates could assist in identity theft and financial fraud, for example.
PUBLIC SPACE CONTROVERSY
While the Paper outlines the many uses of facial recognition technology to identify individuals—for border control, public security, fraud prevention, among others—the main focus of the Paper is facial recognition's use in public security, i.e., mass surveillance. The ANPD analyzes several cases, identifying the possible risks and impacts on individuals of this particular use case.
The EU AI Act has banned the use of real-time biometric identification AI systems in public spaces, with some exceptions, such as to identify a person suspected of having committed a criminal offence, or to conduct a criminal investigation. Similarly, Article 13, Section VII of the June 2024 version of the Brazilian Artificial Intelligence Act bans using real-time biometric identification systems in spaces accessible to the public, with some exceptions, such as searching for victims of crimes or missing persons or in other circumstances involving a serious and imminent threat to the life or physical integrity of natural persons.
The US Federal Trade Commission (FTC)1 has been investigating facial recognition technology. The FTC ordered a five-year ban on a pharmacy chain's use of facial recognition technology to monitor its stores because the technology produced false positives flagging people as allegedly shoplifting or engaging in other wrongdoing at the stores. The technology disproportionately misidentified people of color, and this discriminatory impact underpinned the FTC's decision.
The use of surveillance technology based on facial biometrics is also controversial at a global level; Amnesty International, Human Rights Watch and 180 other organizations and experts are calling for these systems to be banned.2
GUIDANCE LIKELY TO COME
The Paper does not provide specific guidance on the measures that companies using facial recognition technology could implement. However, the ANPD has described the Paper as only a brief overview of the subject, so more specific guidance can be expected soon.
Footnotes
2. https://edri.org/wp-content/uploads/2023/09/Global-statement-Stop-facial-recognition-now.pdf
Visit us at mayerbrown.com
Mayer Brown is a global services provider comprising associated legal practices that are separate entities, including Mayer Brown LLP (Illinois, USA), Mayer Brown International LLP (England & Wales), Mayer Brown (a Hong Kong partnership) and Tauil & Chequer Advogados (a Brazilian law partnership) and non-legal service providers, which provide consultancy services (collectively, the "Mayer Brown Practices"). The Mayer Brown Practices are established in various jurisdictions and may be a legal person or a partnership. PK Wong & Nair LLC ("PKWN") is the constituent Singapore law practice of our licensed joint law venture in Singapore, Mayer Brown PK Wong & Nair Pte. Ltd. Details of the individual Mayer Brown Practices and PKWN can be found in the Legal Notices section of our website. "Mayer Brown" and the Mayer Brown logo are the trademarks of Mayer Brown.
© Copyright 2024. The Mayer Brown Practices. All rights reserved.
This Mayer Brown article provides information and comments on legal issues and developments of interest. The foregoing is not a comprehensive treatment of the subject matter covered and is not intended to provide legal advice. Readers should seek specific legal advice before taking any action with respect to the matters discussed herein.