Police face legal action over use of facial recognition cameras

Campaigners say technology risks turning UK citizens into ‘walking ID cards’

Two legal challenges have been launched against police forces in south Wales and London over their use of automated facial recognition (AFR) technology, on the grounds that the surveillance is unregulated and violates privacy.

The claims are backed by the human rights organisations Liberty and Big Brother Watch following complaints about biometric checks at the Notting Hill carnival, on Remembrance Sunday, at demonstrations and in high streets.

Liberty is supporting Ed Bridges, a Cardiff resident, who has written to the chief constable of South Wales police alleging he was tracked at a peaceful anti-arms protest and while out shopping.

Big Brother Watch is working with the Green party peer Jenny Jones who has written to the home secretary, Sajid Javid, and the Metropolitan police commissioner, Cressida Dick, urging them to halt deployment of the “dangerously authoritarian” technology.

If the forces do not stop using AFR systems, legal action will follow in the high court, the letters said. Money for the challenges is being raised through a crowdfunding site.

According to Liberty, South Wales police have used facial recognition technology in public spaces at least 20 times since May 2017. On one occasion – at the 2017 Champions League final in Cardiff – the technology was later found to have wrongly identified more than 2,200 people as possible criminals.

At the time a force spokesman said: “Technical issues are common to all face recognition systems, which means false positives will be an issue as the technology develops ... The accuracy of the system used by South Wales police has continued to improve.”

Bridges claimed he was monitored on a shopping street in Cardiff last December, and again while protesting outside the Cardiff Arms Fair in March. He said: “Indiscriminately scanning everyone going about their daily business makes our privacy rights meaningless. The inevitable conclusion is that people will change their behaviour or feel scared to protest or express themselves freely – in short, we’ll be less free.

“The police have used this intrusive technology throughout Cardiff with no warning, no explanation of how it works and no opportunity for us to consent. They’ve used it on protesters and on shoppers.”

Corey Stoughton, Liberty’s advocacy director, said: “The police’s creeping rollout of facial recognition into our streets and public spaces is a poisonous cocktail – it shows a disregard for democratic scrutiny, an indifference to discrimination and a rejection of the public’s fundamental rights to privacy and free expression.”

Liberty has argued that AFR systems capture people’s biometric data without their consent, disproportionately misidentify the faces of women and minority ethnic people, and breach data protection laws.

Lady Jones expressed fears that she could end up on a facial recognition watch list when conducting her parliamentary and political duties. Details about her were held by the Met’s domestic extremism intelligence unit.

Big Brother Watch’s director, Silkie Carlo, said: “Facial recognition cameras are dangerously authoritarian, hopelessly inaccurate and risk turning members of the public into walking ID cards. The prospect of facial recognition turning those CCTV cameras into identity checkpoints like China is utterly terrifying.”

Jones said: “I’m extremely concerned about the impact that the Met police’s use of automated facial recognition will have on my ability to carry out my democratic functions. Police use of this technology has no legal basis, and infringes people’s rights and civil liberties. That’s why I’m challenging the Met to end its use, now.”

Anna Dews, a solicitor at the law firm Leigh Day, who is representing Big Brother Watch and Jones, said: “The lack of a statutory regime or code of practice regulating this technology, the uncertainty as to when and where automated facial recognition can be used, the absence of public information and rights of review, and the use of custody images unlawfully held, all indicate that the use of automated facial recognition, and the retention of data as a result, is unlawful and must be stopped as a matter of priority.”

The Home Office defends the use of biometric technologies as a crime-tackling measure but says their use should respect individuals’ privacy. It is considering improvements to the governance of police use of facial recognition technology, including creating an oversight board consisting of the biometrics commissioner, the surveillance camera commissioner, the information commissioner and police representatives.

The government is due to publish a biometrics strategy later this month.


Owen Bowcott Legal affairs correspondent

The Guardian

Related Content

Met police to begin using live facial recognition cameras in London
Civil liberties groups condemn move as ‘a breathtaking assault on our rights’
Vikram Dodd Police and crime correspondent
24 Jan 2020

Office worker launches UK's first police facial recognition legal action
Ed Bridges, from Cardiff, says ‘intrusive’ technology is used on thousands of people
Steven Morris and agency
21 May 2019

Facial recognition tech prevents crime, police tell UK privacy case
South Wales force defends use of technology after office worker claims rights breach
Steven Morris and agency
22 May 2019

Police trials of facial recognition backed by home secretary
Sajid Javid supports use of technology despite concern from human rights groups
Jamie Grierson Home affairs correspondent
12 Jul 2019

South Wales police to use facial recognition apps on phones
Force testing app that lets officers run snapshot through ‘watchlist’ to identify suspects
Ian Sample Science editor
7 Aug 2019

Police face calls to end use of facial recognition software
Analysts find system often wrongly identifies people and could breach human rights law
Robert Booth Social affairs correspondent
3 Jul 2019

South Wales police lose landmark facial recognition case
Call for forces to drop tech use after court ruled it breached privacy and broke equalities law
Dan Sabbagh
11 Aug 2020

Met police to use facial recognition software at Notting Hill carnival
Civil liberties groups say plan to scan faces of thousands of revellers at London event has no basis in law and is discriminatory
Vikram Dodd Police and crime correspondent
5 Aug 2017

Facial recognition must not introduce gender or racial bias, police told
Benefits should be great enough to outweigh any public distrust, says ethics report
Frances Perraudin
29 May 2019

Watchdog warns over police database of millions of facial images
Biometrics commissioner says 20m photos held despite retention of images of innocent people being unlawful
Alan Travis Home affairs editor
13 Sep 2017