Facial recognition tech prevents crime, police tell UK privacy case

South Wales force defends use of technology after office worker claims rights breach

Facial recognition cameras prevent crime, protect the public and do not breach the privacy of innocent people whose images are captured, a police force has argued.

Ed Bridges, an office worker from Cardiff, claims South Wales police violated his privacy and data protection rights by using facial recognition technology on him.

But Jeremy Johnson QC compared automated facial recognition (AFR) to the use of DNA to solve crimes and said it would have had little impact on Bridges.

Johnson, representing the police, said: “AFR is a further technology that potentially has great utility for the prevention of crime, the apprehension of offenders and the protection of the public.”

The technology maps faces in a crowd and then compares them with a watch list of images, which can include suspects, missing people and persons of interest to the police. The cameras scan faces in large crowds in public places such as streets, shopping centres, football matches and music events such as the Notting Hill carnival.

Johnson said the process also included human interaction. He said: “It is up to the operator to decide whether the person is a match or not. You then have the intervention.

“It’s not that the operator makes their own assessment, the officer on the ground looking at the individual will make their own assessment and will decide whether or not to intervene and speak to the individual.”

Johnson told the hearing at Cardiff civil and family justice centre that under common law police had the power to use visual imagery for the “prevention and detection of crime”.

It has been argued that the use of AFR is unregulated, but Johnson said police must adhere to data protection rules and have a code of practice for the management of information.

The court heard South Wales police did not believe the use of CCTV or AFR cameras breached article 8 of the Human Rights Act – which enshrines the right to respect for private life – or the Data Protection Act.

Johnson argued a police officer monitoring CCTV manually had the same “practical impact” on an individual as an AFR camera.

He said: “So far as the individual is concerned, we submit there is no difference in principle to knowing you are on CCTV and somebody looking at it.”

Johnson added that those not on a watch list would not have their data stored after being scanned by AFR cameras.

The court heard a trial period for the use of AFR started in south Wales in May 2017 and is still under way.

Bridges believes his face was scanned while he was shopping in 2017 and at a peaceful anti-arms protest in 2018, and that this had caused him distress. He has used crowdfunding to pay for the legal action with the support of the human rights organisation Liberty. It argues AFR has profound consequences for privacy and data protection rights.

But Johnson said: “It’s difficult to say that an automated immediate computerised comparison is more intrusive than police officers sitting down looking at albums of photographs.”

The court heard Bridges was not on a watch list. Johnson said: “He was not spoken to by a police officer, far less arrested. We say the practical impact on him was very limited.”

Johnson said AFR was used at the anti-arms trade protest in Cardiff in 2018, which was attended by Bridges. A woman had made a bomb threat at the same event last year and was therefore on a watch list, he added.

The barrister said: “It’s of obvious value for these police officers to know that person is there so that if another bomb threat is made they can deal with it accordingly. We say a fair balance has been struck.”

The case continues.


Steven Morris and agency

The Guardian
