BCL partner Julian Hayes discusses UK and EU law relevant to facial recognition technology (FRT) and how such law would apply if FRT were to be used within quasi-public spaces.

What is FRT and which laws presently govern it in the UK? Are there any plans for regulation?

FRT is a form of biometric recognition technology which uses facial features, usually comparing them to images held in a database, to verify that someone is who they claim to be (eg at ePassport gates at airports) or to identify individuals (eg 'persons of interest' in a busy street or recipients of football banning orders at a football match).

There are many variations in the way FRT works but, in general terms, it works by detecting and capturing a facial image, often via CCTV footage. Using a recognition algorithm, it then standardises the captured facial image (eg by size, rotation, etc) so that it is in the same format as images held on a database ('watchlist') of known individuals. The standardised captured image is then statistically compared to images on the watchlist. If a similarity threshold, set by the FRT operator, is reached, a 'match' takes place between the captured image and an image on the watchlist, and the person is identified. The match is usually verified by a human agent before any action is taken.
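By way of illustration only, the short Python sketch below reduces the matching step described above to its simplest form. It assumes that faces have already been detected and converted into numerical feature vectors ('embeddings') by some upstream model which is not shown; the function names, the sample data and the 0.8 threshold are hypothetical, chosen purely to show how an operator-set similarity threshold determines whether a 'match' is declared.

# Minimal, illustrative sketch of the watchlist-matching step described above.
# All names, data and the threshold value are hypothetical examples.
import numpy as np

def cosine_similarity(a, b):
    # Similarity between two embedding vectors, in the range [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(captured, watchlist, threshold=0.8):
    # Compare the captured embedding against every watchlist entry and return
    # the best-scoring identity only if it reaches the operator-set threshold;
    # otherwise return None (no 'match' is declared).
    best_name, best_score = None, -1.0
    for name, reference in watchlist.items():
        score = cosine_similarity(captured, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# Hypothetical usage; in practice a human operator would usually verify any
# match before action is taken, as noted above.
rng = np.random.default_rng(0)
watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
captured = watchlist["person_a"] + rng.normal(scale=0.05, size=128)
print(match_against_watchlist(captured, watchlist))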

At present, there is no dedicated FRT legislation or Code of Practice in the UK. Instead, depending on who is using it and why, its use is affected by the Human Rights Act 1998 (HRA 1998) and brings into play considerations under:

  • the European Convention on Human Rights (ECHR)
  • the General Data Protection Regulation (EU) 2016/679 (GDPR)
  • the Data Protection Act 2018 (DPA 2018)
  • guidance produced by the Surveillance Camera Commissioner (SCC), applicable in England & Wales, including the Surveillance Camera Code of Practice (SC Code)
  • potentially by the ordinary law of confidence
  • potentially by the Regulation of Investigatory Powers Act 2000 (RIPA) where the surveillance is covert

As a result of this non-specific miscellany of primary and subordinate legislation, the use of FRT falls within the regulatory remit of the Information Commissioner (ICO), the SCC, the Biometrics Commissioner and the Investigatory Powers Commissioner's Office (IPCO). Despite the proliferation of oversight bodies, concerns have been expressed by some about the speed at which FRT is being deployed and the risk that regulatory lacunae may lead to misuse. For example, although private operators of FRT (eg retail parks) are encouraged to adopt the guiding principles for system operators set out in the SC Code, there is no obligation on them to do so. Even where 'relevant authorities' (eg the police) are obliged to have regard to the SC Code, failure to act in accordance with it does not of itself make that person liable to criminal or civil proceedings.

As a result of these concerns, legislators, regulators and NGOs have called for more comprehensive FRT regulation.

In the first legal challenge to the use of FRT, brought by Ed Bridges against South Wales Police, the High Court concluded that the current legal regime is adequate to ensure the appropriate and non-arbitrary use of FRT and that its use in this instance had complied with the applicable laws. However, it also noted that: 'the future development of [FRT] is likely to require periodic re-evaluation of the sufficiency of the legal regime' (R (Ed Bridges) v CC South Wales Police [2019] EWHC 2341 (Admin), [2019] All ER (D) 05 (Sep), para [97]). For more on the case, see News Analysis: High Court rules on lawfulness of police use of Automated Facial Recognition technology (R (on the application of Bridges) v Chief Constable of South Wales Police (Information Commissioner and another intervening)).

In the wake of the High Court decision, the Biometrics Commissioner suggested that parliament should consider whether to enact a specific framework for the use of biometrics such as FRT by the police and others, although the government has no official plans to do so. The ICO has indicated that it is finalising recommendations and guidance to police forces about planning, authorising and deploying future FRT. It is anticipated that the ICO will publish a revised code of practice for surveillance cameras and personal information, applicable to public and private operators, to ensure their use is compliant with GDPR and DPA 2018 obligations.

Meanwhile, it is understood that the European Commission is planning regulation to limit the indiscriminate use of facial recognition by companies and public authorities. The European Commission proposals are expected to be published by spring 2020 and it is likely that, despite the UK's intended departure from the EU, they will be mirrored in the UK.

Does GDPR apply to FRT? How will Brexit impact the UK in this area?

Under the GDPR, 'personal data' includes 'biometric data', of which facial images are one example, and the GDPR applies to the processing of personal data wholly or partly by automated means. As a result, the provisions of the GDPR apply to processing carried out by FRT systems.

However, depending on the circumstances of the FRT application, the GDPR does not apply to processing by individuals for domestic purposes, nor does it apply to processing by competent authorities for the purposes of preventing, investigating, detecting or prosecuting criminal offences (which is instead covered by DPA 2018, Pt 3, see Practice Note: Processing personal data by law enforcement and intelligence agencies—an introduction to the Data Protection and Law Enforcement Directive and Part 3 of the Data Protection Act 2018).

Under GDPR, biometric data is 'special category personal data'. The processing of special category personal data is prohibited unless there is a lawful basis for processing and also one of a limited number of exceptions applies.

These exceptions include where:

  • the data subject has given explicit consent
  • the processing is necessary to protect the vital interests of the data subject or another natural person
  • the processing is necessary for reasons of substantial public interest, provided it has a clear legal basis, is proportionate and respects the essence of the right to data protection

DPA 2018 stipulates various mandatory conditions for relying on the 'substantial public interest' exception to process data such as the biometric images in an FRT system.

On departing the EU, the UK will become a 'third country', restricting the cross-border flow of personal data from continental Europe to the UK until the EU decides that the UK's personal data protection regime provides substantially the same level of protection as that of the EU. To facilitate such a decision, the government plans to incorporate the GDPR into domestic legislation. As a result, the GDPR should effectively continue to apply after Brexit.

In recent years, the UK has seen the rise of so-called 'quasi-public spaces'—namely, open spaces that look like public spaces, but are instead private spaces that are conditionally opened to the public. What would be the legal repercussions of introducing FRT in quasi-public spaces and could this be open to legal challenge?

Private investment in civic spaces, for example museums and galleries, has existed for many years. However, large-scale quasi- or hybrid public spaces containing places of work, retail and leisure establishments, such as those at King's Cross in London, are a relatively new phenomenon in which, at the owner's discretion, members of the public are granted licence to enter what is in law private property. A private owner wishing to install FRT must be able to satisfy the requirements of the GDPR for processing the biometric data, including documenting a lawful basis for its processing and the exception which applies to the general prohibition on processing such special category personal data.

Increasingly, office buildings are introducing FRT systems to avoid sign-in queues. Such employers rely on consent as an exception to the prohibition on processing biometric data. However, such consent must be voluntary, informed and unambiguous. One can see that, where an alternative sign-in system is not available, relying on consent as a basis for FRT may be open to legal challenge. Some owners of quasi-public spaces argue that FRT is necessary for the substantial public interest of preventing or detecting crime or anti-social behaviour on their property. To rely on this exception, the owners must be able to demonstrate that the processing:

  • is necessary for the prevention or detection of an unlawful act
  • must be carried out without the consent of the data subject because obtaining their permission would frustrate the purpose of the processing
  • is necessary for reasons of substantial public interest

Investigating recent instances of FRT at King's Cross, the ICO has emphasised that processing on the basis of substantial public interest must be 'strictly' necessary and proportionate. In other words, if an alternative, less intrusive means of achieving the same end exists, the use of FRT on this basis may give rise to legal challenge.

Whereas the ECHR does not normally apply to private organisations, it may arguably apply to them where they are carrying out a public function (see HRA 1998, s 6(3)(b)). For example, if the police, when exercising their law enforcement duties, routinely use the FRT system of a private owner, it is conceivable that an aggrieved litigant might seek to argue that ECHR principles apply to the private owner as well as to the police, and bring a challenge for breach of their human rights under HRA 1998. As a result, where law enforcement authorities seek to co-operate with private entities over FRT, private organisations might first wish to be confident that they are not exposing themselves to unnecessary legal risk.

If owners of quasi-public spaces were to introduce FRT, how practically could and should this be regulated? What are some of the key challenges to this?

Owners of quasi-public spaces that introduce FRT must comply with the GDPR and applicable provisions of DPA 2018. As such, they are regulated by the ICO whose powers are set out in DPA 2018, Pt 6 and supplemented by the ICO's Regulatory Action Policy. They may also voluntarily abide by the SC Code and although there is no obligation on them to do so, such compliance may go some way to demonstrating adherence to the data protection legislation enforced by the ICO.

That said, it is fair to say that the deployment of technology has outstripped the ability of the applicable legislation and regulation to keep pace. The legal challenge brought by Bridges against South Wales Police's use of FRT (see above) was a first step towards developing an accepted framework within which law enforcement deploys this technology. Although the High Court accepted that the police trials under scrutiny were lawful in both human rights and data protection terms, the judgment is subject to appeal, which will take some time. Any appeal is unlikely to address directly the burgeoning private use of FRT.

The reality is that this technology cannot be 'uninvented', and the key challenge is ensuring that clear, accessible and generally accepted rules are in place so that all concerned—operators and data subjects (including both those whose images are captured and those on the watchlist)—are in no doubt as to their rights and obligations as the technology develops and improves.

The House of Commons Science and Technology Committee recently stated that authorities should cease all trials of FRT until a legal framework is established, citing concerns about the legal basis for the technology's use and the potential threat to privacy it poses. What—if any—differentiating or distinguishing factors might be applied to justify the use of the technology in 'quasi-public' as opposed to public spaces, or are the issues likely to be the same?

Given the tighter legal and regulatory framework applicable to the deployment of FRT by law enforcement agencies in public spaces, and the greater public scrutiny they ultimately face, the public might arguably feel there is more justification for the use of FRT by the police in public spaces than by private entities in quasi-public areas.

That said, the circumstances in which biometric data in FRT systems may be processed by private entities under the GDPR (as amplified by DPA 2018) are narrowly defined to ensure processing is lawful. Provided private entities are able to bring themselves within the existing legislative criteria for using FRT, it is not difficult to see how they might justify its use in quasi-public areas. FRT would free up law enforcement resources, allowing their deployment only where necessary, help reduce crime and disorder, and create a safer environment for those using quasi-public spaces.

Admittedly, these are all factors which law enforcement agencies also cite for the use of FRT. Important in avoiding a future backlash will be ensuring that private FRT operators are aware of how to use this next-generation technology responsibly and within the constraints of the current legal framework and any future framework which is introduced.

Interviewed by Tom Inchley.

This article was originally published by LexisNexis on 24th September 2019. You can read the full version here.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.