MPs have urged the Government to halt the deployment of live facial recognition cameras amid concerns about racial bias, after Essex Police suspended their use of the technology. Labour's Bell Ribeiro-Addy warned that bias and inaccuracies remain prevalent in the systems, highlighting the risks involved and calling for stronger safeguards before any further rollout.
The Liberal Democrats also called for a national pause on deployment until new legislation is in place. Home Secretary Shabana Mahmood's proposal to increase the number of live facial recognition vans five-fold has stirred controversy, coming as Essex Police paused their use of the technology after a study found it falsely identified Black individuals at a higher rate, indicating potential bias.
A study by Cambridge University found that under "extreme conditions," false positive alerts disproportionately involved Black individuals, although the overall rate of inaccurate identifications was low. Essex Police adjusted the software in response to the findings. Ribeiro-Addy underscored the importance of correcting such errors and putting robust regulations in place before facial recognition technology is expanded.
Echoing those concerns, Max Wilkinson MP said biases must be eliminated, comprehensive impact assessments published, and statutory safeguards enacted before any nationwide deployment. Live facial recognition cameras use AI to scan faces in public areas and cross-reference them against police watchlists, with unmatched faces promptly deleted. The technology has helped apprehend more than 1,700 suspects, including individuals accused of serious crimes such as rape and child abuse.
Mahmood stressed the potential of the technology to aid law enforcement, while emphasizing the importance of ethical use to avoid wrongly implicating innocent people. An independent study by the National Physical Laboratory found low false positive rates, which varied with watchlist size. Essex Police's software differs from that used by other forces, and the force has emphasized its commitment to addressing bias concerns through independent studies and software adjustments.
The Information Commissioner's Office noted that Essex Police had suspended live facial recognition deployments over identified risks of inaccuracy and bias, and urged further evaluation before operations resume. Jake Hurfurt of Big Brother Watch criticized the widespread use of experimental and potentially biased AI surveillance, calling for accountability and ethical standards in policing technology.

