The UK Information Commissioner's Office (ICO) has announced an investigation into the use of facial recognition technology following a string of high-profile uses. Pending the results of this investigation, companies using facial recognition technology should:

  • undertake a balancing test to ensure proportionality in the use of such technology, acknowledging its intrusiveness;
  • ensure that appropriate documentation, including data protection impact assessments and policy documents, is developed and maintained; and
  • monitor use of the technology to eliminate any potential bias in the algorithms.

The use of Live Facial Recognition Technology (LFR) in public places by the police, other law enforcement agencies and the private sector has increased considerably over the last few years. This increase is causing growing concern amongst regulators, government and ethics committees: the processing involved is sensitive in nature, the volume of people potentially affected is large, and the level of intrusion into privacy the technology is capable of creating is serious. Moves are now being made to address the use of this technology and to put a legal framework in place in a bid to mitigate the risks it poses.

ICO Launches Facial Recognition Investigation

The Information Commissioner, Elizabeth Denham, published a blog post on 9 July 2019 entitled "Live Facial Recognition Technology – data protection law applies" (available at: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/07/blog-live-facial-recognition-technology-data-protection-law-applies/) announcing that the ICO is investigating the use of LFR in the King's Cross area of London.

The ICO investigation follows a spate of LFR trials at various sites across the country, including Meadowfield Shopping Centre in Liverpool, Liverpool's World Museum, Manchester's Trafford Centre and King's Cross, where the technology has been used primarily by police forces, but also in conjunction with site owners, to identify individuals at risk or linked to criminal activity.

The ICO was also recently called upon to advise the court on data protection law in the case of R (Bridges) v Chief Constable of South Wales Police (SWP).

The ICO's principal concern is that organisations deploying facial recognition technology, including the police, must be able to provide demonstrable evidence that its use is 'necessary, proportionate and effective considering its invasiveness'.

In addition, the Commissioner emphasises that police forces must comply with data protection law, which currently includes the GDPR and the Data Protection Act 2018, paying particular attention to the compilation of watch lists, the selection of the images used and the need to remove inherent bias in the technology so as to prevent disproportionate false-positive matches for certain ethnic groups.

ICO Guidance

The ICO has issued guidance for police forces considering the deployment of LFR, which consists of four basic instructions:

  1. Conduct a Data Protection Impact Assessment (DPIA) before any deployment of LFR and submit it to the ICO for consideration, to ensure timely discussion of how risks will be mitigated.
  2. Create a separate policy document to cover the use of LFR which establishes for what type of circumstances, in what types of places, at what times and in what way the technology will be used.
  3. Monitor the algorithms within the software to guard against race or sex bias (an illustrative monitoring sketch follows this list).
  4. Read the ICO Guide to Law Enforcement Processing (available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-law-enforcement-processing/), which deals with Part 3 of the Data Protection Act 2018 and highlights individual rights (including the right to be informed, the right of access, the right to rectification, the right to erasure, the right to restriction and the right not to be subject to automated decision-making) as well as the importance of accountability and governance.
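By way of illustration of the third instruction, the short Python sketch below shows one way an operator might audit match logs for demographic disparities in false-positive rates. It is a minimal sketch only: the log format, the field names ('group', 'matched', 'on_list') and the disparity tolerance are all assumptions made for illustration, not part of the ICO guidance or any mandated audit format.

```python
# Illustrative sketch only: monitoring false-positive match rates by
# demographic group. The log format, field names and tolerance below are
# hypothetical assumptions, not an ICO-mandated format.
from collections import defaultdict

def false_positive_rates(match_log):
    """Compute the false-positive match rate per demographic group.

    match_log: iterable of dicts with (assumed) keys:
      'group'   - demographic label recorded for audit purposes
      'matched' - True if the system flagged a watch-list match
      'on_list' - True if the person really was on the watch list
    """
    flagged = defaultdict(int)  # non-watch-list faces wrongly flagged
    total = defaultdict(int)    # all non-watch-list faces scanned
    for event in match_log:
        if not event['on_list']:  # only true negatives can yield false positives
            total[event['group']] += 1
            if event['matched']:
                flagged[event['group']] += 1
    return {g: flagged[g] / total[g] for g in total}

def needs_review(rates, tolerance=2.0):
    """Flag for human review if any group's rate exceeds the lowest by tolerance x."""
    if len(rates) < 2:
        return False
    lo, hi = min(rates.values()), max(rates.values())
    return (lo == 0 and hi > 0) or (lo > 0 and hi / lo > tolerance)

# Usage with fabricated audit records, purely for illustration:
log = [
    {'group': 'A', 'matched': True,  'on_list': False},
    {'group': 'A', 'matched': False, 'on_list': False},
    {'group': 'B', 'matched': False, 'on_list': False},
    {'group': 'B', 'matched': False, 'on_list': False},
]
rates = false_positive_rates(log)
print(rates, '-> needs review:', needs_review(rates))
```

In practice, the appropriate disparity measure and tolerance would need to be settled with technical and legal advice; the point of the sketch is simply that bias monitoring can be made a concrete, auditable process rather than an aspiration.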

Any business contemplating the use of this type of technology should give this guidance due consideration.

This is a critical moment for regulators to begin scrutinising LFR and providing guidance, given the inherent risk the technology poses to data protection and privacy rights, and the results of the ICO's investigation are awaited with great interest. Greater vigilance is likely to be called for in the future, especially given the expected rise in the use of this technology and the emergence of new applications.

For example, LFR technology has already been developed that combines facial recognition with people's mobile phones and may be used to speed up the immigration process. LFR is evidently a potentially very useful tool for enhancing public safety, but the accuracy of images and the elimination of bias in algorithms will undoubtedly be critical if the technology is to be adopted in the mainstream and in compliance with applicable privacy legislation.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.