LONDON (Realist English). The UK Ministry of Justice is developing a predictive risk assessment tool designed to identify individuals deemed likely to commit serious violent crimes, according to a report by Statewatch. Although officially presented as a "data sharing for better risk assessment" initiative, the project was originally named the "predicting murder project", internal documents reveal.
Launched under Prime Minister Rishi Sunak, the program uses a vast array of personal data, including criminal records, interactions with police, incidents of self-harm, mental health markers, disability status, and histories of substance abuse. The system reportedly processes not only data on convicted offenders but also information about individuals with no criminal history.
The algorithm draws on police records from Greater Manchester dating back to 2015, alongside databases from the Probation Service. Data points include name, date of birth, gender, ethnicity, and police system ID numbers.
Officials say the model is being developed “for research purposes” to enhance existing risk evaluation tools used in prisons and probation. The Ministry of Justice maintains that the system will not replace current procedures but may improve the accuracy of risk assessments.
However, civil liberties groups have raised serious concerns. Sophia Lyall, a researcher with Statewatch, warned that the project builds its predictions on data from institutions known for systemic racial bias, such as the police and Home Office, thereby reinforcing discrimination against minorities and low-income groups.
The use of sensitive medical and social data to train a crime-prediction algorithm risks undermining both privacy and the core tenets of justice. If predictions come to stand in for acts actually committed, justice turns into institutionalised suspicion. In a society marked by structural inequality, predictive systems risk automating injustice with algorithmic precision.