So the police have a tool that helps them predict who is most likely to be involved in violent crime.
And it is racist, of course. Data and analytic tools are inherently racist, don’t ‘cha know.
The algorithm, known as the Strategic Subject List (SSL), added people to the list based not just on arrest records but on whether someone was socially connected to a shooter or a shooting victim. To prevent crime, these persons of interest would get a visit before anything happened. And it wasn’t supposed to be just police knocking on doors: social workers and community leaders would approach the people the data flagged as at-risk and attempt to intervene, offering ways out of, say, the gang life that endangered them.
That was the idea, anyway. The RAND Corporation was given extensive access to the Chicago police department as it built and used the first version of the SSL, but found that police weren’t using the list to provide social services; instead, they were using it to target people for arrest. Part of the problem, The Verge explains, is that the SSL got lost in the shuffle of no fewer than 11 different violence-reduction initiatives. “The list just got lost,” said one of the RAND report’s authors.
For all the concerns it raises, the RAND report offers no suggestion for how the predictions should be applied to policing. Being placed on the list may itself draw police attention, as certain officers appear to have used it as a source of leads to close shooting cases.
The Chicago police department released a statement defending the current SSL system, noting that the RAND Corporation had evaluated only version one of the list; the department is now on version five, with a sixth on the way, and has invited RAND to review the newest SSL. But the report’s criticisms of the SSL’s actual impact remain valid, says Andrew G. Ferguson, a law professor at the University of the District of Columbia.
“Just creating a data-driven ‘most-wanted’ list misses the value of big data prediction,” Ferguson told The Verge in an email. “The ability to identify and proactively intervene in the lives of at risk youth is a positive, but you have to commit to the intervention piece.”