The High Point, N.C., Police Department is nationally known as a leader in innovative policing, particularly on issues of domestic violence. Over the past decade, High Point's dedicated strategy of "focused deterrence" has led to a three-fold drop in domestic abuser homicides. Predictive policing, or as High Point would say, “intelligence-led policing,” has been part of the strategy to reduce domestic crimes.
Now, the department has been expanding that approach, applying it to efforts to reduce crimes committed by street gangs. But as policing has gone high tech, questions continue to arise about the boundaries between good policing and violations of people's constitutional right to privacy.
In 2017, High Point experienced an uptick in gang violence. One street gang would attack, often with firearms, prompting the rival gang to retaliate.
But using a new database system, the police department identified connections between victims, their gang allies and potential targets of violence. Officers were then able to intervene in gang disputes before the violence escalated.
“We know it’s likely that the group on the receiving end is [then] going to be on the giving end," says High Point Police Chief Kenneth Shultz. "So our strategy is to go out there and deliver a message: ‘We know you guys are going to retaliate. If you do, here is what is going to happen.’”
Once a group or individual has been identified as likely to commit a future crime, High Point police make contact with them, either in a formal meeting or an informal discussion, apprising them of the consequences of future actions. The idea is to convince them that the risk is not worth taking.
“We need to drive up their risk assessment so they are deterred,” Shultz says.
High Point officials believe the strategy has worked.
Officers in the field have told Shultz the data has allowed them to intervene in conflicts before retaliatory violence occurs. But there is no data yet measuring the impact of intelligence-led policing on gang violence in High Point.
The idea of predictive policing is not new. Veteran law enforcement officers have always been able to predict some crimes to a certain extent.
What's changed in recent years is the technology that enables police departments to take those observations from the field -- those patrol officers' hunches -- and turn them into data and, eventually, into actionable strategy.
In an era when police departments have been deploying fewer officers, this kind of smart policing has become critical, says Simon Angove, the CEO of Superion, the technology company that produces the ONESolution database software used by High Point police.
“One of the challenges police forces around the country face are staff shortages. With a much smaller police force, we need to be smart about how we deploy those resources,” he says.
Superion doesn’t tout ONESolution as predictive policing software, instead calling it "intelligence-led policing." Predictive policing focuses on the where and when of crime. It weighs historic crime patterns, weather and personal information on suspects, and attempts to predict general times and locations where crime will occur.
Intelligence-led policing, meanwhile, focuses on people -- who is likely to be the victim and who is likely to be the suspect.
Similar predictive technologies have been implemented by police departments in places across the country, from New Orleans to Chicago to Los Angeles. But as the technology gains popularity, civil liberties groups and researchers have raised concerns about whether the technology breaches a person's right to privacy, and whether predictive policing is even effective.
The Los Angeles Police Department’s PredPol technology was twice as effective at predicting where crime would occur as the department’s crime analysts, but still only reduced crime by 7 percent, according to a 2016 study published in the Journal of the American Statistical Association. (No additional research has been done to support those findings.)
A RAND Corporation study of location-based predictive policing found the software employed in Shreveport, La., had no significant impact on property crime.
There are also widespread privacy concerns.
“The technical capabilities of big data have reached a level of sophistication and pervasiveness that demands consideration of how best to balance the opportunities afforded by big data against the social and ethical questions these technologies raise,” the White House wrote in a 2014 report.
For example, a Chicago man was placed on the "heat list," a tool used by Chicago police to identify the 400 people most likely to commit crime in violent sections of the city, despite the 22-year-old never having committed a violent crime. Civil liberties groups contend the technology in Chicago has become a tool to monitor whomever police want to surveil, and that many of those tagged by the technology are only guilty of living in a high-crime neighborhood.
In 2016, the American Civil Liberties Union issued a harsh critique of predictive policing technologies, condemning the technology for relying on previous police contacts and reports to predict outcomes, which the ACLU said reinforced biases historically present in policing.
“Decades of criminology research have shown that crime reports and other statistics gathered by the police primarily document law enforcement’s response to the reports they receive and situations they encounter, rather than providing a consistent or complete record of all the crimes that occur. Vendors who sell and departments who embrace these new tools are failing to account for these realities, or to evaluate whether the data is so flawed that it cannot be relied upon at all. As a result, current systems reinforce bias and sanitize injustice,” the organization said in a statement in August 2016.
The ACLU went on to say that the use of the technology infringes on the rights of citizens by prompting police to make unlawful stops based not on reasonable suspicion, but on “computer-driven hunches.”
“Predictive policing must not be allowed to erode rights of due process and equal protection. Systems that manufacture unexplained ‘threat’ assessments have no valid place in constitutional policing,” the ACLU said.