Crime in Reading, Pennsylvania, remained high even as budget constraints squeezed the city, so in 2013 police chief William Heim invested significantly in crime-prediction software. The program, dubbed PredPol (short for "predictive policing"), came from a Big Data start-up of the same name. As promised, it analyzed past crime data to identify where and when crimes were most likely to occur, hour by hour. When police began paying closer attention to these hotspots, the number of burglaries in high-crime neighborhoods dropped by nearly 20%. Predictive-policing software resembles the statistical modeling used in baseball, and it is presumably free of the racism and prejudice built into the recidivism models the court system uses. For the program to work, however, it needs a great deal of data about a region to justify stationing police there.
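PredPol's actual model is proprietary, so the following is only a minimal sketch of the underlying idea the summary describes: binning historical incidents by location and hour, then ranking the bins to pick patrol "hotspots." All records and cell names here are invented for illustration.

```python
from collections import Counter

# Hypothetical incident records: (grid_cell, hour_of_day).
# PredPol's real model is proprietary; this only illustrates the
# basic idea of ranking areas by historical crime counts.
incidents = [
    ("cell_12", 22), ("cell_12", 23), ("cell_07", 2),
    ("cell_12", 22), ("cell_31", 14), ("cell_07", 3),
]

def top_hotspots(records, k=2):
    """Rank (cell, hour) bins by how often crimes occurred there."""
    counts = Counter(records)
    return counts.most_common(k)

for (cell, hour), n in top_hotspots(incidents):
    print(f"{cell} at {hour:02d}:00 -- {n} past incidents")
```

Even this toy version shows why the approach demands so much local data: with only a handful of records, the "hotspot" ranking is dominated by wherever incidents happen to have been recorded.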
Yet policing initiatives like these have been shown to generate tension and danger in vulnerable populations. Mathematical models now dominate law enforcement. Because of theories linking nonviolent crime to a rise in violent crime, police chiefs believe that even "nuisance data" can be folded in to produce "better data" that lets them concentrate more intensely on violent crime. Relying on the output of predictive-policing methods, police departments across the nation are bolstering zero-tolerance policies for violent and nonviolent offenses alike. One of the most hazardous aspects of this system is that its inner workings are kept secret from the general public.
At a "hackathon" in the spring of 2011, O'Neil and the New York Civil Liberties Union worked together to uncover crucial information about the NYPD's controversial and destructive stop-and-frisk program. The data showed that many Black and Latino teenagers were being frisked even though only 0.1% of those stopped were connected in any way to a serious crime. Stop and frisk, O'Neil argues in her book, is not itself a WMD, but it uses mathematics to justify thousands of intrusive stops in vulnerable communities. Even though humans administered it, the program produced horrendous feedback loops that disproportionately penalized Black and Latino males for minor infractions seldom enforced against whites.
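The feedback-loop argument can be made concrete with a toy simulation (the numbers are entirely hypothetical, not drawn from the book): two neighborhoods with an identical underlying rate of minor infractions diverge simply because one starts out more heavily patrolled.

```python
import random

random.seed(1)

# Toy model of O'Neil's feedback loop: neighborhoods "A" and "B" have
# the SAME true rate of minor infractions. "A" merely starts out with
# a few more patrols. (All numbers here are made up.)
TRUE_RATE = 0.3            # chance any single patrol records an infraction
patrols = {"A": 12, "B": 8}

for step in range(5):
    recorded = {hood: sum(random.random() < TRUE_RATE for _ in range(n))
                for hood, n in patrols.items()}
    # Reallocate a fixed pool of 20 patrols in proportion to recorded
    # crime: more recorded incidents draw more patrols, which record
    # still more incidents in the same place.
    total = sum(recorded.values()) or 1
    patrols = {hood: max(1, round(20 * n / total))
               for hood, n in recorded.items()}
    print(step, recorded, patrols)
```

Because patrols generate the very records that justify the next round of patrols, an arbitrary initial skew compounds over time, which is the mechanism O'Neil describes.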
Police are likely to target non-white residents of low-income areas that lack access to good schools and jobs. As a result, sentencing recommendations based on WMDs, such as predictive policing and recidivism models, are both racially discriminatory and logically faulty. These models claim to know how likely an ex-convict from a particular area is to re-offend after release, yet they take no account of human nature. According to O'Neil, data scientists working in the justice system should understand prison life and how it shapes inmates' conduct: many inmates suffer malnourishment, solitary confinement, and sexual assault. For-profit prison companies, a $5 billion industry, depend on keeping these institutions at full capacity, and they go out of their way to keep their prisons' inner workings opaque.
In the not-too-distant future, facial recognition algorithms could power even more dangerous WMDs. In 2013, the Chicago Police Department knocked on the door of a 22-year-old man who lived in a high-crime, low-income area and warned him that they were keeping an eye on him because he was linked to people who had been arrested. By focusing on people not involved in criminal activity, police were stoking racial tensions in the very communities where crime was a problem. According to O'Neil, building models that presume everyone is the same is easier than crafting policies that would make the justice system more equitable (though perhaps less efficient).