Even the head of a Santa Cruz tech company that sells software to Bay Area police departments admits that using an algorithm to tell cops where and when to patrol raises a host of complicated issues.
With the promise of trying to predict crime before it happens, police departments across the United States are experimenting with artificial intelligence programs like the one from PredPol in Santa Cruz. It’s an evolution of the “hot-spot” crime maps police have been using for decades to guide their patrolling — with 21st century twists that opponents say can reinforce bias and make people less safe.
At a time when tension is high over police misconduct and shootings of unarmed suspects, predictive policing is under increasing scrutiny from privacy advocates, watchdogs and even law enforcement itself.
Predictive-policing software relies on data ranging from crime-victim reports to arrests to individuals’ histories of police interaction. That data is fed into machine-learning algorithms, which generate predictions about where and when particular types of crime are likely to occur and, in some cases, who might be expected to break the law.
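None of these vendors has published its algorithm, and approaches vary. Purely as an illustration of the kind of pipeline described above, the Python sketch below bins a handful of invented incident records by map cell and hour of the week, then ranks the combinations with the most past crime; the record format, field names and scoring rule are all assumptions, not any company’s actual method.

```python
# Illustrative sketch only -- not PredPol's or any vendor's actual model.
# It bins reported incidents by map cell and hour of the week, then ranks
# the (cell, hour) combinations with the most past crime.
from collections import Counter
from datetime import datetime

# Hypothetical incident records: (map cell, timestamp, crime type).
incidents = [
    ("cell_12", datetime(2019, 3, 1, 22, 15), "burglary"),
    ("cell_12", datetime(2019, 3, 8, 23, 5), "burglary"),
    ("cell_07", datetime(2019, 3, 2, 14, 40), "auto theft"),
    ("cell_12", datetime(2019, 3, 15, 21, 50), "robbery"),
]

def hour_of_week(ts: datetime) -> int:
    """Return 0-167, where Monday midnight is 0 and Sunday 11 p.m. is 167."""
    return ts.weekday() * 24 + ts.hour

# Score each (cell, hour-of-week) bucket by its historical incident count.
scores = Counter((cell, hour_of_week(ts)) for cell, ts, _ in incidents)

# The "predictions": the buckets where and when crime is most likely.
for (cell, hour), count in scores.most_common(3):
    print(f"{cell}, hour-of-week {hour}: {count} past incidents")
```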
But if that data derives from bad policing — over-enforcement in African-American or Hispanic communities, for example, or lack of enforcement in wealthier areas — critics say those biases infect the algorithm.
“You’re working with the fruits of the poison tree,” said Suresh Venkatasubramanian, a computer science professor at the University of Utah.
That’s precisely why PredPol avoids using arrest data, said CEO Brian MacDonald, instead feeding its algorithm only basic information derived from crimes reported to police by the public: time and day of the reported crime, location, and the type of crime. Keeping arrest information and data specific to individuals out of the algorithm’s feed helps take biases within a police force out of the equation, he said.
The predictions are displayed on police maps as 500-foot-square boxes marking areas where crime is likely to occur during a particular time period. “When you’re not answering calls for service, you proactively patrol these areas,” MacDonald said. Spending six minutes per hour in a box is the “sweet spot” for deterring crime, he said.
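MacDonald’s description supplies two concrete parameters: 500-foot boxes and roughly six minutes of patrol presence per box per hour. Here is a hedged sketch of how such boxes might be derived and prioritized, with the coordinate system, risk scores and top-three cutoff invented for illustration:

```python
# Illustrative sketch of PredPol-style output: 500-foot grid boxes plus a
# six-minutes-per-hour patrol "dose" per flagged box. The coordinates,
# risk scores and top-three cutoff are invented for demonstration.
BOX_FEET = 500          # box edge length described in the article
DOSE_MIN_PER_HOUR = 6   # MacDonald's "sweet spot" for deterrence

def box_for(x_feet: float, y_feet: float) -> tuple[int, int]:
    """Map a point (in feet from an arbitrary local origin) to its grid box."""
    return (int(x_feet // BOX_FEET), int(y_feet // BOX_FEET))

# Example: an incident at (1,720 ft, 3,580 ft) lands in box (3, 7).
print("incident box:", box_for(1720, 3580))

# Hypothetical risk scores per box for the upcoming shift.
risk = {(3, 7): 0.91, (3, 8): 0.84, (10, 2): 0.40, (5, 5): 0.77}

# Flag the highest-risk boxes; officers patrol them between calls for service.
for box in sorted(risk, key=risk.get, reverse=True)[:3]:
    print(f"Box {box}: aim for ~{DOSE_MIN_PER_HOUR} minutes of presence per hour")
```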
“A lot of people are concerned about AI, and rightfully so, just because it’s a new technology that can be wielded in different ways,” MacDonald said. “We feel like we’re helping. We’ve seen dramatic crime decreases in some of our cities, in most cases by double digits.”
PredPol’s software has been used by about 50 departments in the United States and about 10 in California, including UC Berkeley, Mountain View, Santa Cruz, Los Gatos and Campbell. It seeks to predict robbery, burglary, car break-ins, car theft, aggravated assault and homicide, but not rape and sexual assault, domestic violence or drug crimes.
Critics argue that it’s next to impossible to remove bias from such systems. “There is reporting bias in what crimes are being reported,” said Venkatasubramanian. In communities of color, for example, many people believe calling police will just bring more trouble, he said.
“If you are not cognizant of the fact that victim-report data itself is subject to the same kind of reporting and collection biases that any other kind of data is, and if you don’t account for that when building your model, your model will be subject to the same kind of biases,” Venkatasubramanian said. “These systems are being rolled out without any kind of proper examination of how they work. If we have to wait for someone to be harmed before we do anything on this, that’s a mistake.”
Even limiting the algorithm’s data diet as PredPol does can produce “feedback loops” in which the software tells officers to continue focusing on the same areas, potentially leading to over-policing, he said.
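A toy simulation makes that feedback loop concrete. In the sketch below, two areas have identical underlying crime, but incidents are more likely to enter the data where officers patrol, and patrols go wherever the data points. Every number in it is invented; it is a stylized version of the critics’ argument, not a model of any real system.

```python
# Toy simulation (every number invented) of the feedback loop critics
# describe: two areas with identical true crime, but incidents are more
# likely to be recorded where officers patrol, and patrols follow records.
import random

random.seed(1)
DAILY_CRIMES = 10               # identical underlying crime in A and B
recorded = {"A": 6, "B": 5}     # area A starts with one extra record

for day in range(30):
    # The algorithm sends the patrol wherever recorded crime is highest.
    patrolled = max(recorded, key=recorded.get)
    for area in recorded:
        # Crimes only enter the data if noticed or reported: assume most
        # are caught where officers are present, far fewer elsewhere.
        record_rate = 0.9 if area == patrolled else 0.3
        recorded[area] += sum(
            random.random() < record_rate for _ in range(DAILY_CRIMES)
        )

print(recorded)
# Area A's one-record head start compounds: each patrol there generates
# more records, which in turn attract the next day's patrol.
```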
“Maybe certain neighborhoods get over-policed and that results in more people being incarcerated,” said Dave Maass, senior investigative researcher at the Electronic Frontier Foundation. “The other area you might worry about is where communities aren’t getting served enough because they just didn’t produce the data that attracts attention from the algorithm.”
PredPol’s research-and-development chief Jeffrey Brantingham acknowledged that reporting bias can’t be avoided, but said reports from the public provide the most reliable data.
PredPol, founded in 2012 and now employing about a dozen people, is not the only Bay Area company producing predictive policing software. Last fall, ShotSpotter, the Newark-based gunshot-detection firm, bought HunchLab and its predictive software. And Palo Alto’s Palantir has a product that has been used to target individuals likely to commit violence or get shot. PredPol declined to disclose the price of its product, which it has offered to some departments on a trial basis.
In Santa Cruz, police have been trying out PredPol’s system, and Chief Andrew Mills said it’s helped identify the times when particular crimes in specific areas are likely to occur. Police were already aware of those hot spots, but the software’s guidance about when to patrol has been helpful, Mills said.
Around the Bay Area, police experiences with PredPol have varied. Tech website Motherboard last month reported that a number of departments in the region appeared to have shown interest in PredPol’s product.
Mountain View used the software for almost five years until last June, said department spokeswoman Katie Nelson. “We serve a community that is incredibly adept at either testing and/or adopting new technology, and this was something we wanted to see whether it could potentially serve a positive purpose in Mountain View,” Nelson said in an email. “We tested the software and eventually subscribed to the service for a few years, but ultimately the results were mixed and we discontinued the service.” Mountain View police have had their own in-house crime analyst since January 2018, Nelson added.
UC Berkeley police have been using PredPol’s product for a couple of years, said Sgt. Nicolas Hernandez. “Anecdotally, I’ve used it before, and it seems to be effective,” Hernandez said.
However, since police are a visual deterrent for criminals, “if you’re using predictive policing to say, maybe predict where a crime’s going to happen, and you’re patrolling that area, you don’t know if you’ve prevented a crime,” Hernandez said.
Palo Alto police used PredPol’s software on a trial basis between 2013 and 2015. “We weren’t getting any value out of PredPol,” said department spokeswoman Janine de la Vega. “We weren’t solving a higher rate of crime by using it.”
The Los Gatos/Monte Sereno Police Department used PredPol’s software from 2012 until canceling its subscription last year, said Capt. Michael D’Antonio. The software’s guidance was helpful for briefing patrol officers, D’Antonio said, but ultimately the $5,000 to $7,000 annual cost was not justifiable. “I think there was value to it, but our folks knew where things happened previously and we would spend time in those areas anyway.” The department saw no large reduction in crime rates while it was using the software, D’Antonio said.
Next door in Campbell, police tried PredPol’s product for about a year, roughly six years ago, but Capt. Gary Berg said, “The software was providing us information that we already had.”
Despite the cancellations, PredPol contends that its product can help fight crime in communities of any size. “In absolutely the smallest departments, it can still be advantageous for helping you to organize patrol planning,” Brantingham said. “Most police officers can say, ‘I know where my top one or two hot spots are.’ But what is the fourth? We haven’t seen any community where there’s zero value returned for predictive policing.”