Data watchdog demands answers after tests reveal algorithms used to hunt criminals are far more likely to misidentify Black and Asian suspects.

The integrity of modern policing is under fire in the United Kingdom following top-level admissions that facial recognition technology used to hunt criminals is systematically failing Black and Asian suspects.
For the United Kingdom, it is a crisis of confidence in law enforcement; for Kenyans watching the global march toward digital surveillance, it serves as a chilling warning about the fallibility of artificial intelligence when applied to human rights.
The UK’s Information Commissioner’s Office (ICO) has demanded “urgent clarity” from the Home Office regarding these failures. The move comes after the National Physical Laboratory (NPL) conducted rigorous testing on the algorithms used within the police national database, uncovering a disturbing trend: the software is significantly more likely to incorrectly flag Black and Asian individuals than their White counterparts.
The technology in question is not merely experimental; it is designed to catch serious offenders. Yet, the NPL report revealed that the digital net cast by these algorithms is flawed by racial bias. The Home Office has since admitted that the system was “more likely to incorrectly include some demographic groups in its search results.”
This admission strikes at the core of fair policing. If an algorithm cannot distinguish between innocent citizens and wanted suspects because of their skin colour, the risk of wrongful arrest and harassment escalates dramatically. The ICO's intervention signals that the watchdog is prepared to take tough action to protect civil liberties.
Emily Keaney, the Deputy Commissioner for the ICO, expressed deep frustration over the revelation. In a sharp rebuke, she noted that the watchdog had been kept in the dark regarding these specific flaws.
The ICO is not ruling out severe consequences. Keaney emphasized that the office is currently assessing the situation to determine its next move. The spectrum of potential enforcement is broad and could include legally binding orders to cease using the technology entirely or the imposition of significant fines.
“It’s disappointing that we had not previously been told about this,” Keaney stated, underscoring the gap between government assurances and operational reality. She added that the ICO’s mandate is to hold the government accountable for how data is used—a mission now complicated by this lack of transparency.
As nations like Kenya continue to digitize identity systems and adopt smart city surveillance, the UK’s struggle offers a critical lesson: technology is only as neutral as its creators, and without rigorous oversight, innovation can quickly turn into discrimination.