Ethics committee raises alarm over ‘predictive policing’ tool
A computer tool used by police to predict which people are likely to reoffend has come under scrutiny from one force’s ethics committee, which said the project raised many “unanswered questions” and concerns about potential bias.
Amid mounting financial pressure, at least a dozen police forces are
using or considering predictive analytics, despite warnings from
campaigners that use of algorithms and “predictive policing” models
risks locking discrimination into the criminal justice system.
West Midlands police are at the forefront, leading a £4.5m
Home Office-funded project called the National Data Analytics
Solution (NDAS).
The long-term aim of the project is to analyse data from force
databases, social services, the NHS and schools to calculate where
officers can be most effectively used. An initial trial combined data on
crimes, custody, gangs and criminal records to identify 200 offenders
“who were getting others into a life on the wrong side of the law”.
A report by West Midlands police’s ethics committee, however, raised
concerns about the project, saying it left many “unanswered questions
giving rise to the potential for ethical concerns”.
The committee noted that no privacy impact assessments had been made available, and that there was almost no analysis of how the tool would affect individual rights. The tool will draw on data such as stop and search records, and the committee noted this would include information on people who were stopped with nothing found, which could entail “elements of police bias”.
Hannah Couchman, the advocacy and policy officer at the human rights
organisation Liberty, said: “The proposed program would rely on data
loaded with bias and demonstrates exactly why we are deeply concerned
about predictive policing entrenching historic discrimination into
ongoing policing strategies.
“It is welcome that the ethics committee has raised concerns about
these issues, but not all forces have similar oversight and the key
question here should be whether these biased programs have any place in
policing at all. It is hard to see how these proposals could be reformed
to address these fundamental issues.”
Tom McNeil, the strategic adviser to the West Midlands police and
crime commissioner, said: “The robust advice and feedback of the ethics
committee shows it is doing what it was designed to do. The committee is
there to independently scrutinise and challenge West Midlands police
and make recommendations to the police and crime commissioner and chief
constable.”
He added: “This is an important area of work, which is why it is right
that it is properly scrutinised and those details are made public.”
The ethics committee recommended that more information be provided
about the benefits of the model. “The language used in the report has
the potential to cause unconscious bias. The committee recommends the
lab looks at the language used in the report, including the reference
to a propensity for certain ethnic minorities to be more likely to
commit high-harm offences, given the statistical analysis showed
ethnicity was not a reliable predictor,” it said.
In February, a report by Liberty raised concern that predictive
programs encouraged racial profiling and discrimination, and threatened
privacy and freedom of expression.
Couchman said that when decisions were made on the basis of arrest
data, this was “already imbued with discrimination and bias from the
way people were policed in the past” and that was “entrenched by
algorithms”.
She added: “One of the key risks with that is that it adds a
technological veneer to biased policing practices. People think computer
programs are neutral but they are just entrenching the pre-existing
biases that the police have always shown.”
Using data obtained under freedom of information laws, Liberty found
that at least 14 forces in the UK are using algorithmic programs for
policing, have previously done so, or have conducted research and
trials into them.