Facial Recognition Technology Error Prompts Police to Arrest Innocent Asian Man
An Asian software engineer was wrongfully arrested for a burglary that occurred 100 miles away from his home, after automated facial recognition software deployed by UK police forces incorrectly identified him as the suspect. Alvi Choudhury, 26, was taken into custody by Thames Valley police in January while working at his parents' home in Southampton, handcuffed and held for nearly 10 hours before being released at 2am without charge.
Technology Confuses Man with Younger Suspect
According to documents shared with the Guardian by Liberty Investigates, Thames Valley police used facial recognition software that matched Choudhury with CCTV footage of a suspect involved in a £3,000 burglary in Milton Keynes. However, Choudhury pointed out significant discrepancies between himself and the individual captured on camera.
"I was very angry, because the kid looked about 10 years younger than me," said Choudhury, who wears a beard. "Everything was different. Skin was lighter. Suspect looked 18 years old. His nose was bigger. He had no facial hair. His eyes were different. His lips were smaller than mine."
Choudhury initially assumed the investigating officer had simply seen that he was a brown man with curly hair and arrested him on that superficial resemblance.
Systematic Bias in Facial Recognition Algorithms
UK police forces use an algorithm the Home Office procured from the German company Cognitec, running approximately 25,000 searches a month against around 19 million police mugshots stored on the national database. While facial matches are supposed to be treated as intelligence rather than fact according to the National Police Chiefs' Council, Thames Valley police stated that the decision to arrest Choudhury followed a human visual assessment in addition to the algorithmic match.
Nevertheless, Home Office-commissioned research published in December revealed that, at certain settings, the technology produces far higher false-positive rates for black (5.5%) and Asian (4.0%) faces than for white faces (0.04%). Police and crime commissioners have warned of "concerning in-built bias" within these systems.
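The disparity in those error rates compounds with search volume. A rough, purely illustrative sketch (assuming, hypothetically, that the reported rates applied uniformly to a month's searches, which the research does not claim) shows the scale of the gap:

```python
# Illustrative arithmetic only: how the reported false-positive rates
# would scale across the reported monthly search volume. The real
# distribution of searches by demographic group is not public.
searches_per_month = 25_000  # reported monthly search volume

# False-positive rates reported "at certain settings"
rates = {"black": 0.055, "asian": 0.040, "white": 0.0004}

for group, rate in rates.items():
    expected = searches_per_month * rate
    print(f"{group}: {expected:,.0f} potential false matches per month")
```

On these assumptions, black and Asian faces would generate on the order of a thousand false matches a month each, against roughly ten for white faces, which is the hundredfold disparity the commissioners' warning points at.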
Consequences and Legal Action
Despite offering evidence of work meetings in Southampton on the day of the crime, Choudhury was taken into custody. Neighbors witnessed him being led away in handcuffs; the ordeal caused his father significant distress and left Choudhury unable to work the following day. He is now claiming damages against both Thames Valley police and Hampshire constabulary, which executed the arrest.
This incident marks the second time Choudhury has been wrongfully arrested. His mugshot remained in the police system from a previous incident in 2021 when he was attacked on a night out while at university in Portsmouth and released without further action. Now with a second mugshot on record, he fears the automated system could trigger additional wrongful arrests.
"In my head, if a brown person in Scotland robs a bank are they going to come and arrest me?" he questioned. The situation has also created professional complications, as Choudhury sometimes requires security clearance for government clients and must disclose arrest records.
Police Response and Systemic Issues
Thames Valley police admitted to Choudhury that the arrest "may have been the result of bias within facial recognition technology", yet an officer told him that, because the use of facial recognition was already under strategic review, there was no need to raise the issue for wider organizational learning. A police spokesperson denied the arrest was unlawful, stating it was based on investigating officers' visual assessment following a retrospective facial recognition match and was not influenced by racial profiling.
However, Choudhury reported that officers at the Hampshire police station laughed when he asked if he resembled the suspect, and Thames Valley police officers who later interviewed him acknowledged they knew he wasn't the suspect after comparing footage.
Broader Concerns and Regulatory Warnings
Warnings about automated facial recognition technology have been repeatedly raised by oversight bodies. In December 2024, the UK's biometrics and surveillance camera commissioner, William Webster, expressed concern about police retaining and using images of people who were arrested but never charged. Last month, South Wales police paid damages to a black man wrongfully arrested and held for 13 hours due to facial recognition errors.
Choudhury's lawyer, Iain Gould of DPP Law, emphasized that police "must ensure that artificial intelligence is not substituted for human intelligence and due diligence, but instead is used in careful partnership with it."
The Home Office confirmed that guidance and training to minimize errors and maintain public confidence in retrospective facial recognition is under review by the Police Inspectorate. A new national facial matching system with an improved, independently tested algorithm is currently in development.
Expanding Use of Controversial Technology
Since December, Thames Valley police has deployed live facial recognition technology to scan the public in locations including Oxford, Slough, Reading, Wycombe and Milton Keynes. The system has captured approximately 100,000 faces, resulting in six arrests. This expansion occurs despite mounting evidence of racial bias and wrongful arrests linked to the technology.