Cosmetics brands are increasingly turning to AI-driven facial skin analysis tools to deliver personalized skincare recommendations. These tools invite consumers to upload facial images or engage in real-time video capture, enabling algorithms to assess skin conditions such as dryness, hyperpigmentation, and blemishes, and to recommend products matched to users' skin concerns, preferences, and profiles. While this technology promises tailored experiences and higher brand engagement, it also raises significant privacy concerns. When facial images are processed for identification or analysis in virtual try-on services, they may be treated as biometric data under the UK, EU, and U.S. privacy regimes, triggering more onerous legal obligations.
The Regulatory Landscape
UK and EU
Biometric data is personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow the unique identification of that natural person, and this includes facial images. As biometric data can be used for the purpose of uniquely identifying an individual, it is treated as more sensitive data which requires extra protection.
In Europe and the United Kingdom, the General Data Protection Regulation (GDPR) and UK GDPR classify biometric data used for unique identification as "special category data". The regulations require that brands establish a lawful basis for processing, which in the consumer-facing context of cosmetics means obtaining explicit consent. Such consent must be freely given, specific, informed, and unambiguous, and brands should avoid bundling it with other terms or making it a condition of service unless strictly necessary. Because facial imaging is a high-risk processing activity, brands are also required to conduct Data Protection Impact Assessments (DPIAs) before implementing such tools, and to provide clear, accessible privacy notices explaining how data will be used, stored, and shared. Given the sensitive nature of facial images, brands must also implement technical and organizational measures to protect the data, including encryption, strict access controls, and regular security audits. Regulatory authorities, such as the UK's Information Commissioner's Office, have repeatedly stressed that biometric technologies should only be deployed when they are necessary and proportionate to achieve a legitimate aim. Failure to meet these standards can result not only in financial penalties but also in reputational damage, which is particularly acute in the luxury cosmetics sector, where consumer trust is paramount.
Of further note is that the EU AI Act introduces a new layer of regulation targeting biometrics. This includes biometric categorization systems which assign individuals to specific categories based on their biometric data. However, the definition excludes biometric categorization purely ancillary to commercial services, such as virtual try-on features.
USA
In the United States, biometric data regulation is left to a patchwork of state-level statutes.
The most stringent of these is the Illinois Biometric Information Privacy Act (BIPA), which requires organizations to obtain written informed consent before collecting or using biometric identifiers, such as scans of facial geometry. It also mandates clear retention schedules and prohibits selling or otherwise profiting from biometric data. The consequences of non-compliance can be severe, as BIPA provides a private right of action, enabling individuals to sue for statutory damages.
A further example is the California Consumer Privacy Act (CCPA), as amended by the California Privacy Rights Act (CPRA), which takes a broader approach. While not exclusively focused on biometrics, it classifies biometric data as a category of personal information, including imagery of the face from which an identifier template, such as a faceprint, can be extracted. Biometric information is considered sensitive personal information when processed for the purpose of uniquely identifying a consumer. The CCPA, as amended, grants consumers the rights to access, delete, and opt out of the sale or sharing of personal information, as well as to limit the use of sensitive personal information.
An increasing number of states are considering enactment of privacy laws, signaling a trend toward greater regulation. Brands should therefore stay up to date with their expanding obligations across emerging jurisdictions, as failure to comply can result in substantial financial liability. For luxury cosmetics brands, the current fragmented landscape poses significant challenges. A single online facial analysis tool may attract users from multiple jurisdictions, each with different consent standards, retention rules, and enforcement risks. Companies must therefore adopt a multi-layered compliance strategy, harmonizing practices across states while anticipating future legislative developments. This often means implementing the highest common denominator, such as explicit consent and robust deletion protocols, to minimize risk and maintain consumer trust.
Practical Tips for Cosmetics Brands
Implementing facial skin analysis tools requires a strategic approach to privacy compliance. In-house counsel should ensure that privacy considerations are embedded from the outset, rather than treated as an afterthought. The following measures can help cosmetics brands balance such innovations with regulatory obligations and consumer trust:
- Conduct DPIAs early to identify and mitigate risks before launching facial analysis tools.
- Secure explicit consent through clear, granular mechanisms for image capture and processing.
- Draft robust privacy notices that explain data types, purposes, retention periods, and user rights in plain language.
- Perform vendor due diligence to ensure third-party providers comply with GDPR, UK GDPR, and relevant US state laws.
- Apply data minimization principles, collecting only what is necessary and avoiding storage of raw images where possible.
- Monitor regulatory developments, including the EU AI Act and evolving US state privacy and biometric-specific laws.
- Train marketing and product teams to prevent secondary use of biometric data without renewed consent.
While this article has focused on the privacy risks and compliance obligations associated with biometric data, which is subject to the strictest regulatory requirements, cosmetics brands should be aware that facial analysis tools routinely collect a broader spectrum of personal data. This includes device and location data, user-provided details (such as email, age, and preferences), behavioral and interaction logs, and AI-inferred insights about individuals. Addressing privacy risks and compliance obligations for all these data categories is essential for maintaining consumer trust and meeting legal standards.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.