Facial recognition must not introduce gender or racial bias, police told

Benefits should be great enough to outweigh any public distrust, says ethics report

Facial recognition software should only be used by police if they can prove it will not introduce gender or racial bias to operations, an ethics panel has said.

A report by the London policing ethics panel, which was set up to advise City Hall, concluded that while there were “important ethical issues to be addressed” in the use of the controversial technology, these were not sufficient to justify abandoning it altogether.

Live facial recognition (LFR) technology is designed to check people passing a camera in a public place against images on police databases, which can include suspects, missing people or persons of interest to the police.

The technology has been used to scan faces in large crowds in public places such as streets and shopping centres, and in football crowds and at events such as the Notting Hill carnival.

The Metropolitan police have carried out 10 trials using the technology across London, the most recent being in Romford town centre in mid-February.

In these trials the watchlist only contained images of individuals wanted by the Met and the courts for “violent-related offences”. Police said the trials led to a number of arrests based on positive identifications.

In a report following a review of the Met’s use of the software, the panel said it should only be used if the overall benefit to public safety was “great enough to outweigh any potential public distrust in the technology”.

Concerns have been raised by scientific and civic groups that there are possible intrinsic biases in facial recognition technology, which may mean it is less effective at identifying BAME and female faces. The panel said the Met’s trials with the software were “a source of insight into any intrinsic bias, and should help to indicate how such bias would or would not feed forward into policing operations”.

“We argue it is in the public interest to publish the trial data and evaluations, to address these concerns,” the panel concluded. “Additionally, because the actions of human operators affect the technology’s functioning in the field and therefore the public’s experience of automated recognition, appropriate LFR operating procedures and practices need to be developed.”

As part of its research the panel surveyed a weighted sample of 1,092 Londoners about police use of LFR. More than 57% felt its use by police was acceptable. This figure increased to 83% when respondents were asked whether the technology should be used to search for serious offenders.

Half of respondents thought the use of the software would make them feel safer, but more than a third said they were concerned about its impact on their privacy and that police would collect data on people who had not committed crimes. Only 56% of those surveyed thought that police would use their personal data in accordance with the law.

Almost half of respondents thought the technology would lead to personal information being collected about some groups more than others. Younger people were less accepting of police use of facial recognition technology than older people, and Asian and black people were less accepting of it than white respondents.

The report comes after the information commissioner expressed concern last week over the lack of a formal legal framework for the use of facial recognition cameras by police.

The comments were made during a court hearing in the landmark case of Ed Bridges, an office worker from Cardiff who claims South Wales police violated his privacy and data protection rights by using the technology on him when he went to buy a sandwich during his lunch break and when he attended a peaceful anti-arms demonstration.

The Metropolitan police welcomed the report. DCS Ivan Balhatchet, who has led the force’s trials, said: “We want the public to have trust and confidence in the way we operate as a police service and we take the report’s findings seriously. The MPS will carefully consider the contents of the report before coming to any decision on the future use of this technology.”

Contributor

Frances Perraudin

The Guardian
