UK develops a widely criticised crime prediction tool
The UK Ministry of Justice is creating what critics call a “chilling dystopian” algorithm using arguably sensitive personal data of innocent people: a “murder prediction” tool intended to identify individuals deemed likely to commit murder.
“The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems,” said Sofia Lyall, a Statewatch researcher.
Statewatch learned of the project, and some of its workings, through documents obtained via Freedom of Information requests. Lyall criticised the project because, she said, “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed.” She said that these AI systems label people as criminals before they have committed any crime.
Government researchers are said to be using algorithms to analyse information from thousands of people, including victims of crime, in an attempt to prevent crime by identifying those at higher risk of committing violent offences.
‘Detecting those about to commit murder’
According to an exclusive report in The Guardian, a government spokesperson said that the Ministry of Justice would “review offenders’ characteristics that increase risk of committing murder” and “explore new and innovative data-science techniques for risk assessments of homicide.”
The spokesperson said, “The project will contribute to better public protection via better analysis and provide evidence for improving risk assessment of serious crimes.”
Statewatch, along with other activists, slammed the project as ill-conceived because it draws on data from the UK’s “institutionally racist police and Home Office”, which Lyall claimed “will reinforce the structural discrimination underlying the criminal justice system.”
Statewatch called on the MoJ “to immediately halt any further development of this tool to predict murders.” Rather than spending money on developing such algorithms, the group argued, the government should invest in genuine welfare services.
The controversial surveillance programme was commissioned by the UK prime minister’s office while Rishi Sunak was in power. The project uses data from various government agencies, including the Probation Service and Greater Manchester Police, dating back to before 2015.
The project uses people’s personal information, including names, dates of birth, gender and ethnicity, as well as the number that identifies each person on the Police National Computer.