Human rights group claims the algorithms threaten a ‘tech veneer to biased practices’
The rapid growth in the use of computer programs to predict crime hotspots and people who are likely to reoffend risks locking discrimination into the criminal justice system, a report has warned.
Amid mounting financial pressure, at least a dozen police forces are using or considering predictive analytics. Leading police officers have said they want to make sure any data they use has “ethics at its heart”.
But a report by the human rights group Liberty raises concern that the programs encourage racial profiling and discrimination, and threaten privacy and freedom of expression.
Hannah Couchman, a policy and campaigns officer at Liberty, said that when decisions were made on the basis of arrest data, that data was “already imbued with discrimination and bias from the way people policed in the past” and that this was “entrenched by algorithms”.
She added: “One of the key risks with that is that it adds a technological veneer to biased policing practices. People think computer programs are neutral but they are just entrenching the pre-existing biases that the police have always shown.”
Using freedom of information data, the report finds that at least 14 forces in the UK are using algorithmic programs for policing, have previously done so, or have conducted research and trials into them.