
Ethical Framework Aims to Reduce Bias in Data-Driven Policing

While the public may not always be aware, police departments are using machine learning technologies to forecast where crime might occur. Since these systems often use historical data to make predictions, they could potentially exacerbate biased and discriminatory policing practices.

The Northwestern Center for Advancing Safety of Machine Intelligence (CASMI) has supported newly published research that is aimed at mitigating these risks. Duncan Purves, associate professor of philosophy at the University of Florida, and Ryan Jenkins, associate professor of philosophy at California Polytechnic State University, released their findings Aug. 23 in a paper entitled, “A Machine Learning Evaluation Framework for Place-based Algorithmic Patrol Management.”

Place-based algorithmic patrol management is a method in which past crime data is collected and analyzed to identify and predict areas with an elevated likelihood of criminal activity.
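
To make the idea concrete, here is a minimal, hypothetical sketch of the kind of analysis the term describes: binning historical incident locations into coarse grid cells and ranking the cells with the most past activity. The data, cell size, and function names are illustrative assumptions, not part of any actual product.

```python
from collections import Counter

# Illustrative sketch only: a toy "place-based" ranking that bins historical
# incident coordinates into grid cells and flags the cells with the most past
# incidents. Real systems are far more sophisticated; the data and cell size
# here are hypothetical.
CELL_SIZE = 0.01  # roughly a few city blocks, in degrees of latitude/longitude

def cell_for(lat, lon):
    """Map a coordinate to a coarse grid cell."""
    return (round(lat / CELL_SIZE), round(lon / CELL_SIZE))

def rank_hot_spots(incidents, top_k=3):
    """Count past incidents per cell and return the most active cells."""
    counts = Counter(cell_for(lat, lon) for lat, lon in incidents)
    return counts.most_common(top_k)

# Hypothetical historical incident locations (lat, lon).
history = [(41.881, -87.623), (41.882, -87.624), (41.900, -87.650),
           (41.881, -87.622), (41.950, -87.700)]
print(rank_hot_spots(history))
```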

Researchers developed the framework as part of a four-year project, and it was fine-tuned following a two-day workshop that CASMI hosted in June 2022. Participants in CASMI’s “Best Practices in Data-Driven Policing” workshop included software developers, computer scientists, social scientists, law enforcement, lawyers, and community advocacy groups.

“We had all of the expertise in the room that you would need to get this conversation off the ground,” Jenkins said. “Not only did we work on identifying what the possible risks are, but we also asked, ‘What can we do about them?’

“The technology is not going away,” Jenkins continued. “Police departments are not just going to abandon all of the data that they've collected. The question is, ‘How can we shepherd the responsible development and deployment of these technologies that benefits public safety without running afoul of any of the legitimate concerns of the affected communities?’”

The voluntary framework has 63 recommendations for developers of place-based algorithmic patrol management systems, police departments, and community advocates. There is a strong emphasis on seeking input from community advocates and listening to their concerns.

“We are not properly listening to stakeholders. What we’re doing now is just getting them to the table,” said Kristian Hammond, Bill and Cathy Osborn professor of computer science and director of CASMI. “You have to make sure you empower them correctly. I am optimistic that if we identify the problems, people will want to fix them.”

The framework’s recommendations are focused on areas such as legitimacy, data, user interaction, organizational ethics, and how to avoid problematic “feedback loops” that can repeat biased patrols in minority neighborhoods.

Purves said there are two main themes in the framework. The first one details what developers can do to address concerns about bias and discrimination in policing.

“This starts with the selection of data sources,” Purves said. “Focusing exclusively on crime data, especially arrest data, can still lead to feedback loops that generate disparate impacts for minority communities — even if that data is not collected in a racially biased way.”
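
The feedback loop Purves describes can be illustrated with a toy simulation. The sketch below is an assumption-laden illustration, not the researchers' model: two neighborhoods have the same underlying offense rate, but patrols are dispatched wherever recorded arrests are highest, and offenses are only recorded as arrests where officers are present. The recorded counts diverge even though the neighborhoods are identical.

```python
import random

# Toy simulation of a patrol-allocation feedback loop (illustrative only).
# Both neighborhoods share the same true offense rate, but arrests are only
# recorded in the neighborhood being patrolled, and patrols go wherever
# recorded arrests are highest.
random.seed(0)
TRUE_OFFENSE_RATE = 0.3              # identical in both neighborhoods
recorded_arrests = {"A": 1, "B": 0}  # "A" starts with a single extra record

for day in range(200):
    # Dispatch patrol to the neighborhood with more recorded arrests.
    patrolled = max(recorded_arrests, key=recorded_arrests.get)
    for hood in recorded_arrests:
        # Offenses can occur anywhere, but only become arrest records
        # in the patrolled neighborhood.
        if random.random() < TRUE_OFFENSE_RATE and hood == patrolled:
            recorded_arrests[hood] += 1

print(recorded_arrests)  # e.g. {'A': ~60, 'B': 0} — the gap is an artifact of patrol placement
```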

The second theme in the framework explains what police can do to incorporate place-based algorithmic patrol management in a way that builds community trust.

“The spirit of these recommendations and their goal is to help police become better at their job,” Jenkins said. “They are being given a new tool to use, and they need to understand how to incorporate the tool into their work.”

Why Usage of These Technologies Isn’t Widely Known

Law enforcement agencies don’t typically share whether they use algorithms to predict crime, although the public can find out by filing a Freedom of Information Act request. The Electronic Frontier Foundation collates law enforcement’s use of technologies such as drones and facial recognition through an online database called the Atlas of Surveillance.

“I sympathize with police departments that there's a lot of risk in publicizing their use of the technology,” Jenkins said. “The technology is often widely misunderstood. That's another place where I think our work could be helpful to guide those conversations.”

According to the Atlas of Surveillance, 196 law enforcement agencies across the country use or have used predictive policing, which identifies locations or individuals that officers should focus on or investigate.

One company called Geolitica claims its software is being used to “help protect one out of every 30 people in the United States.”

“It's extremely widespread, but it's really difficult to get hard numbers,” Purves said.

To gain public trust, the framework makes two recommendations about transparency: developers and police departments should 1) prioritize transparency throughout the development process, and 2) include ethical requirements in product specifications and continually revisit metrics for bias, transparency, and explainability.

How to Implement the Framework

Software companies can implement the framework with their chief ethics officer or with a company liaison whose job is to ensure police are using the technologies responsibly. Jenkins and Purves have found there is already a movement in the tech industry to operationalize ethical expectations.

“Some companies have drawn lines in the sand about certain applications of their systems,” Purves said. “They have said no to police departments who want them to do certain things, like predict nuisance crimes such as loitering or drug possession.”

Some of the recommendations in the framework are easier to implement than others. For example, many police departments already observe the “Koper Curve,” a theory which holds that officers should spend no more than about 15 minutes patrolling the same area. Purves said it’s fairly simple for software developers to stop giving officers credit if they don’t follow this technique.
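
One reading of that recommendation is simply capping the patrol "credit" a dosage tracker awards for any single visit. The sketch below assumes hypothetical function names and a hypothetical credit scheme; it only illustrates the cap, not any vendor's actual implementation.

```python
# Minimal sketch (hypothetical names): award no additional patrol credit
# beyond the Koper Curve window, so lingering in one area past ~15 minutes
# earns nothing extra.
KOPER_CURVE_MINUTES = 15

def patrol_credit(minutes_at_location: float) -> float:
    """Credit a visit with at most the Koper Curve window of minutes."""
    return min(minutes_at_location, KOPER_CURVE_MINUTES)

print(patrol_credit(10))  # 10 -> full credit
print(patrol_credit(40))  # 15 -> no credit for the extra 25 minutes
```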

Other recommendations, like hiring a chief ethics officer, are more costly for companies. However, Jenkins said the payoff is immense.

“Doing the right thing is often good business,” he said. “We see companies who care about ethics as our audience for the framework.”

Jenkins said that researchers are in conversations with a major developer of data-driven police technologies to test some of the recommendations in the framework.

“Our aspiration is to move the needle towards positive outcomes for public safety and police-community relations,” Jenkins said. “There are things in this report that should help ameliorate the issues we’ve seen in policing and lead to a more harmonious future. That’s our hope.”
