Researchers Are Developing A Wild Minority Report-Like Murder Prediction Tool

Oftentimes when there's a reference to the movie "Minority Report" in the tech news sector, it has to do with some clever user interface (UI) design, especially with the advent of hand-tracking tricks in virtual reality setups. That's not always the case, though. Sometimes it's a lot creepier, or more of a privacy concern, as is the case with a "murder prediction" tool being developed in the United Kingdom.

Just like it sounds, the technology at play is being designed to identify individuals who are most likely to commit murder. According to a report in The Guardian, the tool was originally dubbed "homicide prediction report," but those in charge shifted to friendlier, less ominous nomenclature and are now calling it "sharing data to improve risk assessment."

Call it whatever you want; it's still going to be controversial. The project, uncovered by the nonprofit activist group Statewatch, boils down to an algorithm that analyzes crime data from several official sources, such as the Probation Service and Greater Manchester Police. The data includes names, dates of birth, gender, and ethnicity.

The idea, according to the Ministry of Justice, is to analyze known criminals to determine how much risk they pose of committing a violent crime while out on probation. Part of that entails analyzing various health markers, including data related to mental health, suicide, addiction, and other categories. For now, it's supposedly just in the research stage.

"This project is being conducted for research purposes only. It has been designed using existing data held by HM Prison and Probation Service and police forces on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence. A report will be published in due course," a spokesperson for the MoJ said.

One of Statewatch's concerns, however, is that the so-called murder prediction tool would be inherently biased against minorities and poor people. Part of that claim stems from the allegation that some of the data comes from "institutionally racist police" departments. According to Statewatch, past research has shown time and again that "algorithmic systems for 'predicting' crime are inherently flawed."

"Like other systems of its kind, it will code in bias towards racialized and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction and disability is highly intrusive and alarming," Statewatch says.

The activist group is also concerned that crime victims and innocent people who have sought help from the police will be swept into the dataset, which includes details relating to self-harm and domestic abuse. There's an obvious privacy concern there, though according to the report, officials "strongly deny this," saying the data is limited to individuals with at least one criminal conviction on record.