Essex Police Halt Facial Recognition Use After Study Reveals Racial Bias

Essex Police have paused their deployment of live facial recognition (LFR) technology after a study uncovered a significant racial disparity in how the system performs. The research, conducted by University of Cambridge academics, found that black people were "statistically significantly more likely" than other ethnic groups to be correctly identified by the cameras, prompting concerns over fairness and potential discrimination.

Study Details and Findings

The study involved 188 actors walking past LFR cameras mounted on marked police vans in Chelmsford. The system correctly identified about half of the individuals on a police watchlist and rarely made incorrect identifications, but its accuracy was uneven: it was more likely to correctly identify men than women, and more likely to correctly identify black participants than those of other ethnicities. Dr. Matt Bland, a criminologist and co-author of the report, said the disparity warrants further investigation: "If you're an offender passing facial recognition cameras... the chances of being identified are greater if you're black."

Regulatory Response and Broader Implications

The Information Commissioner's Office (ICO) disclosed the suspension and warned other police forces to put mitigations in place against accuracy and bias risks. LFR systems, mounted either at fixed sites or on vans, have been used by at least 13 police forces across England and Wales, including in London, Greater Manchester, and West Yorkshire. The pause comes despite Home Secretary Shabana Mahmood's announcement in January that the number of LFR vans would be increased five-fold to 50, available to police forces across England and Wales.

Contrasting Concerns and Incidents

The bias identified here differs from the more familiar public fear about facial recognition: misidentifying innocent people. Last month, for instance, a man was wrongly arrested for a burglary committed 100 miles away after retrospective face-scanning software confused him with another man of South Asian heritage. Experts suggest the bias in LFR may stem from algorithms being over-trained on black faces, something that could be corrected by adjusting system settings. A separate government study by the National Physical Laboratory also found black men were the most likely to be matched correctly, although in that case the effect was not statistically significant.

Support and Opposition to the Technology

The Home Office has defended LFR, noting that deployments in London from January 2024 to September 2025 led to over 1,300 arrests for serious crimes like rape and burglary. However, critics argue the latest research validates long-standing warnings about bias. Jake Hurfurt of Big Brother Watch stated, "AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets."

Future Steps and Monitoring

Essex Police have worked with software providers and academics to address the bias and have revised their policies and procedures as a result. The force said it is confident it can resume deployments to trace and arrest wanted criminals, and that it will continue monitoring the system to ensure it does not discriminate against any community. The report itself concluded that the findings "raise questions about fairness that require continued monitoring," underlining the need for vigilant oversight of AI technologies in law enforcement.
