In her Opinion, Ms Denham explains that facial recognition technology (FRT) relies on the use of people's personal data and biometric data. Data protection law therefore applies to any organisation using FRT. Live facial recognition (LFR) is a type of FRT that often involves the automatic collection of biometric data, which means it has greater potential to be used in a privacy-intrusive way.
In her Opinion, Ms Denham summarises the ICO's assessments of, and investigations into, 14 examples of LFR deployments and proposals. The ICO also conducted wider research and engagement in the UK and internationally.
Ms Denham says that the ICO has seen an increasing appetite to use LFR for marketing, targeted advertising and other commercial purposes, which can involve using an individual's biometric data to place them in a particular category. The technology has the potential to be used for more advanced practices, including integration with big-data ecosystems which combine large datasets from multiple sources, such as social media. The ICO is investigating some examples of FRT systems where images captured from online sources are being used to identify individuals in other contexts.
Based on the 14 examples, the Opinion focuses on the use of LFR for the purposes of identification and categorisation. It does not address verification or other "one-to-one" uses. It defines public places as any physical space outside a domestic setting, whether publicly or privately owned. However, it acknowledges that the nature and context of such places may be very different, as will the public's expectations of privacy in different settings. The Opinion does not address the online environment.
The Commissioner identifies several key data protection issues which can arise where LFR is used for the automatic collection of biometric data in public places. These issues include:
- the governance of LFR systems, including why and how they are used;
- the automatic collection of biometric data at speed and scale without clear justification, including an assessment of the necessity and proportionality of the processing;
- a lack of choice and control for individuals;
- transparency and data subjects' rights;
- the effectiveness and the statistical accuracy of LFR systems;
- the potential for bias and discrimination;
- the governance of watchlists and escalation processes;
- the processing of children's and vulnerable adults' data; and
- the potential for wider, unanticipated impacts for individuals and their communities.
In a blog post published alongside the Opinion, Ms Denham says that she is "deeply concerned" about the potential for LFR technology to be used inappropriately, excessively or recklessly.
In her Opinion, she notes that other parties, including international organisations and civil society groups, have raised further issues about LFR, including ethical, equalities and human rights concerns. The Opinion sets out where such issues may be relevant to data protection analysis, for example, where bias in facial recognition algorithms could lead to unfair treatment of individuals.
In terms of the law, Ms Denham explains that LFR involves the processing of personal data, biometric data and, in the vast majority of cases seen by the ICO, special category personal data. While the use of LFR for law enforcement is covered by Part 3 of the Data Protection Act 2018 (DPA 2018), outside of that context the relevant legislation is the UK General Data Protection Regulation (UK GDPR) and the DPA 2018. Controllers seeking to deploy LFR must comply with all relevant parts of the legislation. They must also enable individuals to exercise their rights.
While all relevant elements of the legislation apply, based on the ICO's experience, the central legal principles to consider before deploying LFR are lawfulness, fairness and transparency, including a robust evaluation of necessity and proportionality. This evaluation is particularly important, Ms Denham says, because LFR involves the automatic collection of biometric data, potentially on a mass scale and without individuals' choice or control.
The legal requirements mean that where LFR is used for the automatic, indiscriminate collection of biometric data in public places, there is a high bar for its use to be lawful. While this is the Commissioner's general assessment of what the legislation requires in this context, she emphasises that any investigation or regulatory assessment would be based on the facts of the case, considering the specific circumstances and relevant laws.
As for next steps, the Commissioner says she will continue her investigative and advisory work. This includes completing investigations already under way, assessing DPIAs which identify high-risk processing, conducting a proactive audit of LFR systems in deployment, and, where appropriate, supporting data protection Codes of Conduct or certification schemes. Further next steps for the ICO and for controllers are detailed in the conclusion to the Opinion, alongside recommendations for technology vendors and the wider industry.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.