In July 2019, the UK privacy regulator, the Information Commissioner's Office (ICO), issued a warning about the privacy implications of automated facial recognition technology (AFR). The
ICO was concerned that AFR "represent[s] the widespread
processing of biometric data of thousands of people as they go
about their daily lives."
The UK High Court recently handed down an important and timely
decision in Bridges, R (on application of) v. Chief Constable
of South Wales Police [2019] EWHC 2341 (Admin). The Court ruled
that the South Wales Police's (SWP) use of AFR was proportionate, lawful, and consistent with human rights and data protection laws, even though that use interfered with the privacy and data protection rights of affected individuals.
This is the world's first legal challenge to the use of AFR.
Background
SWP has trialled a particular type of facial recognition
technology (AFR Locate) since April 2017 with a view to it being
rolled out nationally. AFR Locate works by capturing digital images
from CCTV feeds of the faces of members of the public. The images
are then processed in real time to extract facial biometric
information. This information is compared with facial biometric
information of people on police watch lists. If there is no match, the captured biometric data is deleted immediately after processing.
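To make that data flow concrete, the sketch below mirrors the pipeline as described above: extract a biometric template from each captured face, compare it against watch-list templates, and retain nothing for faces that do not match. It is a minimal illustration only; the similarity threshold, embedding size, and the extract_embedding stand-in are assumptions for the sketch, not details of SWP's actual system.

import numpy as np

# Assumed values for illustration only; a real system would tune these.
MATCH_THRESHOLD = 0.6
EMBEDDING_DIM = 128

def extract_embedding(face_image: np.ndarray) -> np.ndarray:
    """Stand-in for a facial-biometric feature extractor.

    A real deployment would use a trained face-recognition model;
    here we simply flatten and normalise pixel values to show the
    shape of the data flow.
    """
    vec = np.resize(face_image.astype(float).ravel(), EMBEDDING_DIM)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def process_frame(face_images, watchlist_embeddings):
    """Compare each captured face against the watch list.

    Returns the indices of faces whose best similarity score meets
    the threshold. Embeddings for non-matching faces are transient
    and are not stored beyond this loop, mirroring the immediate
    deletion described above.
    """
    matched = []
    for i, image in enumerate(face_images):
        embedding = extract_embedding(image)
        scores = [float(embedding @ w) for w in watchlist_embeddings]
        if scores and max(scores) >= MATCH_THRESHOLD:
            matched.append(i)  # match: flagged for human review
    return matched

The point to note from the sketch is the one the Court focused on: biometric templates are generated for everyone whose face passes the camera, which is why the processing engages data protection law even though non-matching data is discarded at once.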
The legal challenge was brought by the human rights campaign
group Liberty on behalf of a Cardiff resident, Ed Bridges. Mr.
Bridges argued that SWP's use of AFR contravened his human
rights as well as data protection laws.
The question for the Court was whether the current legal regime
is adequate to ensure appropriate and non-arbitrary use of AFR.
Key findings
The High Court dismissed Mr. Bridges' claim on all grounds.
Leaving aside claims relating to the breaches of human rights laws,
the key data protection findings were as follows.
Justifiable processing: Biometric data captured by AFR constitutes personal data even of people who are not on a police watch list. The Court considered that members of the
public whose images are captured by AFR are sufficiently
individuated from all others. Although this processing of the
biometric data was also "sensitive processing," the Court
ruled that it was justified. For SWP to achieve its purpose of
identifying persons on watch lists, biometric information must be
processed in the first place. The processing of this data is necessary for SWP's legitimate interest in detecting and preventing crime.
Law enforcement processing: Biometric
data processing does not contravene the data protection principle
that any personal data processing for law enforcement purposes must
be lawful and fair.
Data protection impact assessment
(DPIA): SWP's DPIA for AFR complied with UK law. The Court
ruled that SWP's DPIA set out a clear narrative that took
account of the potential breaches of UK human rights laws. It also
identified safeguards that are in place to determine what personal
data will be retained and why.
Appropriate use: The current legal
regime is adequate to ensure appropriate and non-arbitrary use of
AFR. SWP's use of AFR Locate is consistent with human rights
and data protection laws.
Comment
It has always been difficult to achieve a regulatory balance
between harnessing new technologies and safeguarding the privacy
and data protection rights of affected individuals. This ruling is
fact-specific and should not be interpreted as a UK-wide green
light to use AFR. However, the ruling nonetheless provides much
needed judicial clarity on AFR. The ICO, which has been critical of
police and private use of facial recognition technology, has said
it will review the judgment carefully. Accordingly, we expect the ICO to issue further recommendations and guidance to law enforcement about how to deploy AFR technology in the future. Keep an eye on this
blog as we'll be sure to keep you fully up-to-date.
The content of this article is intended to provide a general
guide to the subject matter. Specialist advice should be sought
about your specific circumstances.