Usage of Facial Recognition by UK Police Forces

Background and Goals

The research aimed to create the first overview of UK police forces’ use of facial recognition technology within CCTV systems. The databank against which passers-by are matched contains unlawfully retained custody images. Furthermore, several studies have shown that misidentification rates for Black and dark-skinned people, as well as for women, are disproportionately higher than for white men. These groups therefore face a higher risk of being wrongfully identified as suspects and detained.
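
For illustration only, the short sketch below shows how the matching step described above typically works: a passer-by’s face is reduced to an embedding vector and compared against a watchlist built from custody images. The similarity measure, threshold, embedding size and all names are assumptions made for this sketch, not details of the systems investigated.

    import numpy as np

    MATCH_THRESHOLD = 0.8  # assumed operating threshold, not a documented value

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_against_watchlist(probe, watchlist):
        """Return (identity, score) of the best watchlist hit above threshold, else None."""
        best = None
        for identity, reference in watchlist.items():
            score = cosine_similarity(probe, reference)
            if score >= MATCH_THRESHOLD and (best is None or score > best[1]):
                best = (identity, score)
        return best

    # Usage with random stand-in embeddings; a real system would produce
    # these vectors with a face-embedding model.
    rng = np.random.default_rng(0)
    watchlist = {f"custody_{i}": rng.normal(size=128) for i in range(3)}
    print(match_against_watchlist(rng.normal(size=128), watchlist))  # likely None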

The research aimed to:

  • Provide new statistics on the use of facial recognition by UK police

  • Investigate current policies and legislation on facial recognition

  • Contextualise the findings with legal and human rights implications

Methods

  • Sending Freedom of Information (FOI) requests to all xx UK police forces

  • Analysing FOI responses and creating statistics on the data

  • Interviewing policy experts in this area

  • Conducting secondary research on the existing literature

Results

  • Two police forces were actively trialling facial recognition technology in public places such as shopping centres, the Notting Hill Carnival, concerts and football stadiums.

  • The research showed that the facial recognition systems used by the police were highly inaccurate, producing a very high proportion of false positives (see the worked example after this list):

    South Wales Police: 91%

    Metropolitan Police: 98%

  • The use of facial recognition did not have any oversight and was not covered by existing policies protecting people’s privacy.
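
As a worked illustration of the false positive figures above: the rate reported is the share of the system’s flagged “matches” that turned out to be wrong. The counts below are hypothetical, chosen only to reproduce the 91% figure; they are not taken from the FOI data.

    # Hypothetical alert counts, chosen only to mirror the reported 91%
    # figure for South Wales Police; the actual counts are in the FOI data.
    false_alerts = 91  # people wrongly flagged as a watchlist match
    true_alerts = 9    # people correctly flagged

    false_positive_rate = false_alerts / (false_alerts + true_alerts)
    print(f"False positive rate: {false_positive_rate:.0%}")  # -> 91%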

Impact

  • The report resulting from this research was launched in the Houses of Parliament in May 2018.

  • It was widely covered in the British press and featured in the Netflix documentary “Coded Bias”.