Governments cautioned against crime prediction algorithms
By GovInsider
Software could be biased against racial and ethnic groups, warns physics professor at Emory University.
Researchers have warned governments that algorithms used for crime prediction can take on a racial bias. Skewed data may distort these systems’ predictions, argues Sidney Perkowitz, Professor Emeritus of Physics at Emory University in the US.
He cites PredPol and HunchLab, both currently used by police departments in the US, as examples. Because predictive algorithms learn from arrest records in which black people are over-represented, they can echo racial stereotypes, so governments must ensure that software is explicitly designed to prevent this.
HunchLab gathers crime records from multiple agencies and correlates them with location, time and weather to estimate the likelihood of different crime types in each patrol area. PredPol works similarly, folding new data into existing crime patterns every six months.
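Neither company discloses its model, but the kind of data-driven hot-spot ranking described above can be illustrated with a toy sketch. The Python snippet below is a minimal illustration only, not PredPol’s or HunchLab’s actual algorithm; the patrol-cell labels, sample records and decay half-life are all made up for the example.

```python
# Toy hot-spot ranking in the spirit of the tools described above.
# NOT PredPol's or HunchLab's actual algorithm; all names and numbers
# here (records, cell labels, HALF_LIFE_DAYS) are hypothetical.
from collections import defaultdict

# Hypothetical historical records: (patrol cell, days since the incident).
records = [("cell_3", 1), ("cell_3", 2), ("cell_3", 40),
           ("cell_7", 5), ("cell_7", 6), ("cell_1", 90)]

HALF_LIFE_DAYS = 30.0  # assumption: recent incidents should weigh more

def decay_weight(days_ago: float) -> float:
    """Exponential decay so newer incidents dominate the score."""
    return 0.5 ** (days_ago / HALF_LIFE_DAYS)

score = defaultdict(float)
for cell, days_ago in records:
    score[cell] += decay_weight(days_ago)

# Rank patrol areas by decayed incident score: the "hot spots".
for cell, s in sorted(score.items(), key=lambda kv: -kv[1]):
    print(f"{cell}: {s:.2f}")
```

The decay weighting mirrors the periodic retraining the vendors describe: newer records gradually displace older patterns, so whatever is in the incoming data, biased or not, dominates the next forecast.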
“PredPol and HunchLab state that they use no racial or ethnic information, but the data they do use might already embed racial bias that would carry forward in new crime predictions to further entrench the bias”, Perkowitz writes in Aeon, a non-profit online publication.
Activists and legal experts believe that predictive policing reinforces racial bias by pre-profiling the poor and minorities, groups widely perceived by police to commit crimes. The technology floods “officers into the very same neighbourhoods they’ve always over-policed”, states Perkowitz, citing legal experts from the American Civil Liberties Union.
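The feedback loop Perkowitz and the ACLU describe can be made concrete with a toy simulation. The sketch below uses entirely made-up numbers and hypothetical area labels: two areas have the same true crime rate, but an initial skew in deployment means one area generates more records, which in turn justifies sending more patrols there.

```python
# Minimal simulation (made-up numbers) of the feedback loop critics
# describe: recorded crime reflects where police look, and forecasts
# trained on those records send police back to the same places.
import random

random.seed(0)
TRUE_RATE = {"A": 0.10, "B": 0.10}   # assumption: equal underlying crime
patrols = {"A": 60, "B": 40}         # initial deployment skewed toward A
recorded = {"A": 0, "B": 0}

for _ in range(20):
    # Incidents are only recorded where officers are present to see them.
    for area, n_officers in patrols.items():
        recorded[area] += sum(random.random() < TRUE_RATE[area]
                              for _ in range(n_officers))
    # Next round's deployment follows the recorded-crime shares.
    total = recorded["A"] + recorded["B"]
    patrols = {a: round(100 * recorded[a] / total) for a in recorded}

print("Recorded incidents:", recorded)  # the initial skew persists...
print("Final deployment:", patrols)     # ...despite identical true rates.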
As a result of predictive algorithms, police officers “might reasonably view otherwise innocent behaviour as suspicious” in a high-crime area, says Alexander Kipperman, a Philadelphia lawyer, citing a 2000 court decision from which the phrase is drawn.
Minorities’ presence in such areas may therefore be scrutinised more heavily. Perkowitz suggests three ways to counter the biases of predictive policing.
First, such algorithms should be used to “augment human abilities rather than replace them”, he advises.
“For officers on the street, we might find that combining personal experience with guidance from predictive software enables them to deal better with what they encounter daily.” Second, officers need to be trained to combine the software’s predictions with their street knowledge.
Third, he puts forth Kipperman’s stance that independent reviews should audit crime data, correct biases, and ensure accountability to both police units and the community. Kipperman further suggests that predictive policing be separated from police departments altogether and run instead by independent agencies to reduce external pressure.
Image by Elvert Barnes, licensed under CC BY 2.0