Police use of facial recognition is legal, Cardiff high court rules

Ruling comes as London mayor acknowledges Met police role in its deployment in King’s Cross development

Police use of automatic facial recognition technology to search for people in crowds is lawful, the high court in Cardiff has ruled.

Although the mass surveillance system interferes with the privacy rights of those scanned by security cameras, two judges have concluded that it is not illegal.

The legal decision came on the same day the mayor of London, Sadiq Khan, acknowledged that the Metropolitan police had participated in the deployment of facial recognition software at the King’s Cross development in central London between 2016 and 2018, sharing some images with the property company running the scheme.

That contradicted previous assurances about the relationship with King’s Cross given by the mayor, who asked the Met “as a matter of urgency” to explain what images of people had been shared with the developer and other companies.

Last month, the developer of the King’s Cross site became one of the first property companies to say it had used facial recognition software in two street cameras until 2018 for reasons of “public safety”. Following an outcry, it said it had abandoned plans to deploy the controversial technology more widely on the site.

The legal challenge in Cardiff was brought by Ed Bridges, a former Liberal Democrat councillor from the city, who noticed the cameras when he went out to buy a lunchtime sandwich. He was supported by the human rights organisation Liberty. He plans to appeal against the judgment.

Bridges said he was distressed by police use of the technology, which he believes captured his image while out shopping and later at a peaceful protest against the arms trade. During the three-day hearing in May, his lawyers alleged the surveillance operation breached data protection and equality laws.

The judges found that although automated facial recognition (AFR) amounted to interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate.

Dismissing the challenge, Lord Justice Haddon-Cave, sitting with Mr Justice Swift, said: “We are satisfied both that the current legal regime is adequate to ensure appropriate and non-arbitrary use of AFR Locate, and that South Wales police’s use to date of AFR Locate has been consistent with the requirements of the Human Rights Act and the data protection legislation.”

Responding to the judgment, Megan Goulding, a Liberty lawyer, said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms … It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use.”

Bridges said: “South Wales police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent. This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”

Facial recognition technology maps faces in a crowd and compares them to a watch list of images, which can include suspects, missing people and persons of interest to the police.

The cameras scan faces in large crowds in public places such as streets, shopping centres, football matches and music events such as the Notting Hill carnival.

Three UK forces have used facial recognition in public spaces since June 2015: the Met, Leicestershire and South Wales police.

Lawyers for South Wales police told the hearing facial recognition cameras prevented crime, protected the public and did not breach the privacy of innocent people whose images were captured.

The technology was likened to police use of DNA. Those not on a watch list would not have their data stored after being scanned by AFR cameras, the court was told.

The chief constable of South Wales police, Matt Jukes, said: “I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern. So, I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme.

“There is, and should be, a political and public debate about wider questions of privacy and security. It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”

A spokeswoman for the Information Commissioner’s Office, which intervened in the case, said: “We welcome the court’s finding that the police use of live facial recognition systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018.

“This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police.”

A survey of more than 4,000 adults released on Wednesday by the Ada Lovelace Institute found that a majority (55%) want the government to impose restrictions on police use of facial recognition technology, but that nearly half (49%) support its use in day-to-day policing, assuming appropriate safeguards are in place.

The Met said the ruling’s implications would be carefully considered before a decision was taken on any future use of live facial recognition technology.

Leicestershire police said it used facial recognition technology in criminal investigations to identify possible suspects, within locally agreed guidelines and legislation. “[It] was last used at a public event in 2015, as a pilot scheme and it has not been used in that way since,” the force said.

Contributor

Owen Bowcott Legal affairs correspondent

