Researchers develop algorithm to predict crime in cities before it happens

Updated 2022.07.06 05:57 GMT+8
CGTN

Social and data scientists have developed an algorithm they say can predict crime a week in advance with 90 percent accuracy.

The team, led by University of Chicago professor Ishanu Chattopadhyay, published its findings in the journal Nature Human Behaviour.

The algorithm was trained and tested on historical data on violent crimes and property crimes in the U.S. city of Chicago, detecting patterns over time to predict future events. It performed equally well when tested on data from other major U.S. cities, including Detroit, Atlanta, Philadelphia and San Francisco.

The team hopes the technology will lead to changes in how police resources are allocated.

The research attempted to address criticism of previous predictive crime models developed for major cities.

“While predictive models may enhance state power through criminal surveillance, they also enable surveillance of the state by tracing systemic biases in crime enforcement,” the report said.

The team’s tool differs from previous prediction models, which focused more on crime “hotspots” that spread to surrounding areas.

That practice, said the researchers, tends to miss the complex social environments of cities, as well as the nuanced relationship between crime and the effects of police enforcement, leaving room for bias.

Chattopadhyay added that these previous tools were often based on flawed assumptions about crime and its causes, citing algorithms that gave undue weight to variables like the presence of graffiti.

Further research found that police resources were rarely allocated evenly across a city.

In a separate model built on arrest data, Chattopadhyay’s team found that spikes in crime in wealthier parts of town led to more arrests in those areas, while arrests in disadvantaged neighborhoods declined. Crime spikes in poor neighborhoods, however, did not produce a comparable rise in arrests, suggesting “biases in enforcement.”

Other crime prediction models used by law enforcement have been criticized for targeting people erroneously.

Chicago police, for example, were criticized for using a predictive model that was supposed to focus on potential victims and perpetrators of shooting incidents.

An investigation by the Chicago Sun-Times, however, revealed that nearly half of the people identified in the model as potential perpetrators had never been charged with illegal gun possession, while 13% had never been charged with a serious offense.

Most machine learning models used by law enforcement are built on proprietary systems, making it difficult for the public to understand how their algorithms work or how accurate they are.

This lack of transparency led the researchers behind this latest model to make the algorithm available to the public for audit.
