Facial recognition tech: watchdog calls for code to regulate police use

Barrister for information commissioner tells court formal legal framework is required

The information commissioner has expressed concern over the lack of a formal legal framework for the use of facial recognition cameras by the police.

A barrister for the commissioner, Elizabeth Denham, told a court the current guidelines around automated facial recognition (AFR) technology were “ad hoc” and a clear code was needed.

In a landmark case, Ed Bridges, an office worker from Cardiff, claims South Wales police violated his privacy and data protection rights by using AFR on him when he went to buy a sandwich during his lunch break and when he attended a peaceful anti-arms demonstration.

The technology maps faces in a crowd and then compares them with a watchlist of images, which can include suspects, missing people or persons of interest to the police. The cameras have been used to scan faces in public places such as streets, shopping centres, football matches and music events such as the Notting Hill carnival.
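For readers unfamiliar with how such watchlist matching works, the sketch below illustrates the general approach the article describes, not the South Wales police system itself: each detected face is reduced to a numeric embedding and compared against pre-computed embeddings of watchlist images, with any score above a threshold flagged to an operator. The names, embedding size and threshold here are illustrative assumptions, and random vectors stand in for the output of a real face-recognition model.

```python
# Illustrative sketch only: watchlist matching via embedding similarity.
# A real deployment would compute embeddings with a trained face model;
# here stand-in random vectors are used so the example is self-contained.
import numpy as np

rng = np.random.default_rng(0)
EMBEDDING_DIM = 128      # typical face-embedding size (assumption)
MATCH_THRESHOLD = 0.6    # similarity above which an alert is raised (assumption)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Watchlist: names mapped to pre-computed embeddings (stand-ins here).
watchlist = {name: rng.normal(size=EMBEDDING_DIM) for name in ["suspect_A", "missing_B"]}

# Faces detected in one crowd frame (again, stand-in vectors).
crowd_faces = [rng.normal(size=EMBEDDING_DIM) for _ in range(5)]
# One face is made deliberately close to a watchlist entry to trigger a match.
crowd_faces.append(watchlist["suspect_A"] + rng.normal(scale=0.05, size=EMBEDDING_DIM))

for i, face in enumerate(crowd_faces):
    for name, reference in watchlist.items():
        score = cosine_similarity(face, reference)
        if score >= MATCH_THRESHOLD:
            print(f"Possible match: face {i} vs {name} (similarity {score:.2f})")
```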

On the final day of the hearing, Gerry Facenna QC, for the information commissioner, said there was a lack of “clarity and certainty” over how police watchlists were drawn up. Facenna said: “There ought to be a proper and clear code that has been consulted on.”

He added: “If you live in a police state where everyone is monitored all the time, no doubt crime will fall.” He said the state needed to grapple with a “balancing exercise”.

Facenna said a legal framework should address the nature of a watchlist and in what circumstances the technology was deployed. “Can you roll it out at every sports match? Does it need to be intelligence led? What can you do with the footage?”

Facenna said there were also questions about what training AFR operators should receive, how to ensure the technology was not hacked, and whether people could refuse to be scanned.

In his closing speech, Dan Squires QC, for Bridges, said AFR gave police “extraordinary power”. Squires said: “If you have someone’s biometric data and you have a series of CCTV cameras, you are able to log someone’s movements around the city or potentially around the country if AFR is rolled out.”

South Wales police argued during the hearing at the Cardiff civil justice and family centre that the cameras prevented crime, protected the public and did not breach the privacy of innocent people whose images were captured.

The case has been adjourned and the two judges will deliver their ruling at a date yet to be fixed.

Speaking after the hearing, the deputy chief constable Richard Lewis said: “This process has allowed the court to scrutinise decisions made by South Wales police in relation to facial recognition technology. We welcomed the judicial review and now await the court’s ruling on the lawfulness and proportionality of our decision making and approach during the trial of the technology.

“The force has always been very cognisant of concerns surrounding privacy and understands that we, as the police, must be accountable and subject to the highest levels of scrutiny to ensure that we work within the law.”

Steven Morris and agency
