Racial Profiling, Biased Policing, and AI


Racial bias in policing can take different forms, for example:

  1. Consciously deciding to investigate a person, or to enforce a criminal law against a person, solely based on the person’s race or ancestry.
  2. Choosing to investigate a person or enforce a criminal law against a person based at least partially on the person’s race or ancestry, while ignoring similar suspicious activity or criminal law violations committed by those of another race or ancestry.
  3. Being influenced by implicit bias while policing.
  4. Profiling by proxy, where members of the public who hold racial biases misconstrue innocent activity by people of color and call 9-1-1 to request a police response.

Missouri statutes and regulations, and accrediting organizations such as CALEA, require certain public safety officers to receive training to help prevent racial profiling and improper bias.

Many law enforcement agencies across the United States have recently introduced AI tools to assist in police work. One justification for using AI is that it can help reduce bias in policing: if a computer decides which areas need heavier policing or which suspect is the most likely perpetrator of an unsolved crime, the reasoning goes, the element of human bias will be removed. These AI tools include algorithms to deploy police resources efficiently throughout communities, facial recognition technology (FRT) to identify suspects, and transcription technology to create written copies of recorded interviews with witnesses and suspects.

Unfortunately, AI can sometimes perpetuate racial bias, instead of reducing it.

First, AI programs must be trained on data. If the data used to train the AI is biased, then the results the AI program gives may reflect that bias. For example, if, years earlier, officers had been ordered to make as many arrests as possible of individuals of a certain race who were concentrated in a certain community, the resulting data would likely show that to be a high-crime area. An AI program trained on that data might instruct police officers to continue to heavily police that area, even though this would perpetuate the racial bias, not eliminate it.
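The feedback loop described above can be sketched as a toy simulation. Everything here is hypothetical (the area names, numbers, and allocation rule are invented for illustration, not taken from any real policing system):

```python
# Toy simulation (not any real system): an allocation model trained on
# historically biased arrest counts reinforces the original bias.

def allocate_patrols(arrest_history, total_patrols):
    """Assign patrols to areas in proportion to past recorded arrests."""
    total = sum(arrest_history.values())
    return {area: round(total_patrols * n / total)
            for area, n in arrest_history.items()}

def simulate_round(arrest_history, total_patrols, base_crime_rate=5):
    """More patrols in an area -> more recorded arrests there,
    even though the underlying crime rate is identical everywhere."""
    patrols = allocate_patrols(arrest_history, total_patrols)
    return {area: arrest_history[area] + base_crime_rate * patrols[area]
            for area in arrest_history}

# Area A was historically over-policed; true crime rates are equal.
history = {"A": 80, "B": 20}
for _ in range(3):
    history = simulate_round(history, total_patrols=10)

share_a = history["A"] / (history["A"] + history["B"])
print(f"Area A share of recorded arrests: {share_a:.0%}")  # stays at 80%
```

Even though both areas have the same true crime rate, the model keeps sending most patrols to Area A, so Area A keeps generating most of the recorded arrests: the historical bias is perpetuated, not corrected.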

Second, FRT is usually trained predominantly on white male faces, so it is most accurate with that demographic and much less accurate at identifying women and people of color. The tendency to misidentify people of color can be exacerbated when officers use low-quality images, such as those obtained from security cameras. Rather than accurately identifying the actual perpetrator, the AI may simply search all the driver’s license photos or other photos in its database and pick the one that most closely resembles the fuzzy image, causing an innocent person (especially a woman of color) to be identified as the perpetrator. By subjecting women of color, but not white males, to false arrests, FRT can cause racial and gender bias in policing.
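The "pick the closest photo" behavior can be illustrated with a minimal nearest-neighbor sketch. This is not how any real FRT product works internally; the two-number "embeddings," names, and threshold value are all invented for illustration:

```python
# Hypothetical sketch (not a real FRT system): a nearest-neighbor
# matcher always returns the closest enrolled photo, even when the
# probe image is too degraded for a trustworthy identification.
import math

def distance(a, b):
    """Euclidean distance between two toy face 'embeddings'."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, database, threshold=None):
    """Return the closest enrolled face. With no threshold, an answer
    is produced no matter how poor the match quality is."""
    name, dist = min(((n, distance(probe, emb)) for n, emb in database.items()),
                     key=lambda t: t[1])
    if threshold is not None and dist > threshold:
        return None, dist  # decline to identify on weak evidence
    return name, dist

database = {"person_a": [0.9, 0.1], "person_b": [0.2, 0.8]}
blurry_probe = [0.5, 0.5]  # degraded security-camera image

# Without a threshold, the system "identifies" someone regardless.
print(best_match(blurry_probe, database))
# With a confidence threshold, it declines to name anyone.
print(best_match(blurry_probe, database, threshold=0.3))
```

The design point: a system configured to always return its best guess will name an innocent person whenever the true perpetrator is not in the database, or the image is too blurry to distinguish among candidates.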

Third, AI is prone to “hallucinations,” also called “confabulations.” The way AI models are built leads them to always provide an answer of some kind, even when they do not know what answer is right. If, for example, an AI cannot understand an audio recording, instead of simply noting “unintelligible,” as a court reporter would, it will supply words and phrases of its own. As a result, AI transcription programs are prone to insert plausible nonsense into what appears to be a reliable document. When law enforcement officers use AI to transcribe statements from suspects, witnesses, and victims, those transcriptions may contain false information, and if the AI was trained on racially biased data, its hallucinations may reflect that bias.
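The hallucination mechanism can be shown with a toy decoder. The word probabilities below are invented for illustration (no real transcription model works this simply), but they capture the pattern: a system forced to always emit a word falls back on whatever its training data made most likely:

```python
# Toy illustration (not a real transcription model): a decoder that
# must always emit its most probable word will "fill in" speech it
# cannot actually recognize, instead of flagging it.

# Hypothetical word frequencies absorbed from biased training data.
PRIOR = {"gun": 0.4, "phone": 0.35, "wallet": 0.25}

def transcribe(audio_confidences):
    """Each list item maps candidate words to acoustic confidence.
    An empty (garbled) segment still yields the prior's top word."""
    words = []
    for segment in audio_confidences:
        if not segment:  # unintelligible audio
            # A cautious system would emit "[unintelligible]" here,
            # but an always-answer decoder falls back on its prior:
            words.append(max(PRIOR, key=PRIOR.get))
        else:
            words.append(max(segment, key=segment.get))
    return " ".join(words)

clear = [{"he": 0.9}, {"took": 0.8}, {"my": 0.9}, {"wallet": 0.7}]
garbled = [{"he": 0.9}, {"took": 0.8}, {"my": 0.9}, {}]  # last word lost

print(transcribe(clear))    # "he took my wallet"
print(transcribe(garbled))  # "he took my gun" -- plausible nonsense
```

On clear audio the output is faithful; on garbled audio the decoder silently substitutes its statistically favored word, producing a fluent transcript that never happened.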

Ultimately, while AI tools can be of great assistance to police work, peace officers should retain a healthy skepticism about their results, including being alert to possible racial bias in those results. No AI program can substitute for an officer’s own best judgment, based on the totality of the facts and circumstances known to the officer, and the officer’s training and experience.

Interested in learning more?

PLS offers online self-study legal training for police on a wide variety of practical issues, helping officers make good decisions in challenging situations.

Link: https://www.policelegalsciences.com/