AI is suddenly everywhere — in our doctors’ offices, our workplaces, and even our bathrooms when we brush our teeth. There are endless ways AI technology is streamlining and improving our everyday lives. That also includes one of the most important parts of society: law enforcement.
“AI is everywhere — including policing,” said a representative from ForceMetrics, an AI-focused software company specializing in law enforcement. “But as with any new technology, there are two sides to this coin.”
AI is increasingly being tested to improve public safety, enhance accuracy, and facilitate better relationships between police and local communities.
However, as with all new AI technologies, these innovations aren’t perfect. Due to the sensitive nature of law enforcement work, some uses of AI could inadvertently infringe on personal freedoms.
As AI continues to develop, law enforcement and government agencies must explore and outline clear standards around data-sharing, invasion of personal privacy, and police protocol.
We’ll explore four tools that demonstrate the good and the bad: where AI has been used responsibly to make the justice system more effective and productive, and where the opposite has played out. These examples pose key questions for community leaders and law enforcement officials to consider when examining whether they are using AI responsibly and to its full potential.
When used responsibly, AI can help shape a justice system that upholds fairness and dignity for victims, offenders, officers, and parolees alike. While innovation can be daunting and prone to missteps, thoughtful implementation can harness its potential to promote justice, safety, and well-being for all.
The good: ForceMetrics provides officers with critical data when they need it most
One of the most significant challenges police officers face is accessing sufficient information to make accurate, split-second decisions.
ForceMetrics aims to provide officers with reliable, up-to-date data when they need it.
“There are very few situations where anyone goes from zero to shooting people,” said ForceMetrics founder Andre McGregor, who previously worked for the FBI and as an emergency first responder. “All of these situations escalate. When ForceMetrics provides data on the community side, police officers feel safer with certain interactions and more prepared for others.”
ForceMetrics uses AI to scan multiple public safety sources and quickly provide police officers and 911 dispatchers with critical insights for emergency calls, such as whether the caller has a history of mental illness, drug usage, or domestic violence. For example, a police officer might interpret an individual’s avoidance of physical contact as resistance to arrest when the person could actually be autistic and sensitive to touch.
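To make the idea concrete, here is a minimal sketch of that kind of record aggregation. It is illustrative only: ForceMetrics’ actual data model, sources, and code are not public, and every address and record below is hypothetical.

```python
# Illustrative sketch only -- ForceMetrics' real pipeline is not public.
# Each dictionary stands in for a public-safety database (crisis-team logs,
# domestic violence records, prior-incident reports) keyed by address.
MENTAL_HEALTH_CALLS = {"114 Elm St": ["2023-04-02 welfare check; autism noted"]}
DOMESTIC_VIOLENCE_REPORTS = {"9 Oak Ave": ["2022-11-19 domestic violence arrest"]}
PRIOR_INCIDENTS = {"114 Elm St": ["2024-01-07 noise complaint; no action taken"]}

SOURCES = {
    "mental_health": MENTAL_HEALTH_CALLS,
    "domestic_violence": DOMESTIC_VIOLENCE_REPORTS,
    "prior_incidents": PRIOR_INCIDENTS,
}

def context_for_call(address: str) -> dict[str, list[str]]:
    """Gather every flag on file for the address of an incoming 911 call."""
    return {name: db[address] for name, db in SOURCES.items() if address in db}

# A dispatcher could attach this context to the call before officers arrive.
print(context_for_call("114 Elm St"))
```

The value is in the join: each record may already exist somewhere in a city’s systems, but only an automated lookup can surface all of them in the seconds between a 911 call and a dispatch.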
The technology can provide the context officers need to make better, more informed decisions during emergencies, reducing the risk of escalation.
That can have huge benefits for police-community relations. If officers respond with empathy, they can increase local trust in law enforcement and make communities safer for everyone.
“We want to provide enough information so that officers can rethink how they interact,” McGregor said. “If they see ‘autism,’ now they’re thinking, ‘Is this person processing what I’m saying?’ It creates a momentary pause of safety.”
The bad: Predicting crime before it happens can result in harassment and curb second chances
A program that analyzes historical crime data to predict potential offenders might seem like a way to stop crime before it happens — but in reality, the results can harm communities more than help them.
In Pasco County, Florida, an AI-based policing program used an algorithm to predict who in the community would be “most likely” to commit future crimes, drawing on criminal histories, police records, and other data. In practice, however, officers repeatedly targeted the people the algorithm flagged, subjecting them to scrutiny without cause even though they had committed no new crimes.
These tools often single out former juvenile offenders who have since been released. In Pasco County, for instance, at least 1 in 10 individuals identified as targets were under 18, many with only one or two minor offenses in their histories.
Defining justice-involved youth by their past mistakes rather than their present actions undermines their ability to rebuild and grow.
Any AI tool should provide real insights, not speculation. It should focus on collaboration rather than conflict.
The bad: Face recognition technology can put the wrong people behind bars
Another challenge law enforcement faces is swiftly and accurately identifying suspects. To accelerate the process, many police departments rely on face recognition technology. However, when these tools get it wrong, the fallout is significant.
Face recognition compares security camera footage with hundreds of publicly available photos — including driver’s licenses, mug shots, and social media posts — to analyze key facial features and link an image to an individual.
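Under the hood, most modern systems reduce each face to a numeric “encoding” and measure the distance between encodings. Here is a minimal sketch of that matching step using the open-source face_recognition Python library; the file names are hypothetical, and real police systems run proprietary software, but the matching principle is the same.

```python
import face_recognition

# Minimal sketch of face matching with the open-source face_recognition
# library. File paths are hypothetical; the principle -- compare numeric
# face encodings and call anything close enough a "match" -- is general.
known_image = face_recognition.load_image_file("drivers_license_photo.jpg")
unknown_image = face_recognition.load_image_file("security_camera_frame.jpg")

known_encoding = face_recognition.face_encodings(known_image)[0]  # assumes a face was found

for encoding in face_recognition.face_encodings(unknown_image):
    # `tolerance` sets how close two encodings must be to count as a match.
    # A loose threshold is one way systems generate false identifications.
    is_match = face_recognition.compare_faces([known_encoding], encoding, tolerance=0.6)[0]
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    print(f"match={is_match}, distance={distance:.2f}")
```

Note that a “match” is never exact; it is a distance falling under a threshold, which is precisely where misidentifications creep in.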
Although the idea sounds good in theory, the technology is far from universally accurate: one program used by the Detroit police misidentified suspects roughly 96% of the time. That failure rate stems in part from the fact that many law enforcement agencies don’t properly train officers in how to use face recognition.
In Texas, a man was falsely imprisoned for robbery after face recognition technology misidentified him, and he later endured violence in prison. In Michigan, police arrested a woman who was eight months pregnant for carjacking after face recognition falsely identified her.
Face recognition can also lead to troubling invasions of privacy, such as tracking what health care services a person has used or what religious meetings a person has attended.
To avoid these pitfalls, organizations should prioritize thorough training in best practices. When using face recognition technology, it’s crucial not to rely solely on the tool’s output and to remain vigilant for potential biases.
The good: JusticeText is helping ensure balanced, fair public defense
Public defenders in the United States are overwhelmed with cases, to the point that they have only seven minutes on average to prepare for even their most critical hearings.
Often the first task cut from that preparation is the time-consuming review of video evidence, which figures in about 80% of criminal cases.
That’s where JusticeText comes in: The AI-powered software analyzes hours of audio and video footage, including police body camera footage, depositions, and interrogations. It can transcribe up to 50 files in only 30 minutes and highlight important moments.
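JusticeText’s own pipeline is proprietary, but the core workflow, batch transcription with keyword-based highlighting, can be sketched with OpenAI’s open-source Whisper model. The file names and keywords below are hypothetical.

```python
import whisper  # open-source speech-to-text model from OpenAI

# Illustrative sketch only -- JusticeText's actual system is proprietary.
model = whisper.load_model("base")

FILES = ["bodycam_clip_01.mp4", "interrogation_room_b.wav"]  # hypothetical paths
KEYWORDS = {"miranda", "consent", "search", "weapon"}         # hypothetical terms

for path in FILES:
    result = model.transcribe(path)
    # Whisper returns timestamped segments, so a flagged moment can be
    # jumped to directly instead of scrubbing through hours of footage.
    for seg in result["segments"]:
        if any(word in seg["text"].lower() for word in KEYWORDS):
            print(f"{path} [{seg['start']:.0f}s-{seg['end']:.0f}s]: {seg['text'].strip()}")
```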
By making evidence more accessible to public defenders, AI technology enhances their ability to give clients a more transparent and thorough defense. This leads to a fairer justice system for all, reducing the likelihood of wrongful incarceration.
“We want to build technology that improves outcomes for the individuals in the communities that public defenders serve,” said Devshi Mehrotra, cofounder of JusticeText.
The possibilities of an AI-powered justice system
Developing these technologies is just the first step. The real challenge comes with making sure they are implemented, used, and monitored effectively.
Policymakers and law enforcement should work together to ensure that all new AI software is thoroughly vetted by knowledgeable third parties, safeguarding citizens’ privacy and protecting their freedoms.
If these standards are met, the potential benefits for law enforcement and the communities they serve could be life-changing.
“AI has the potential to improve our living standards and well-being,” the ForceMetrics representative said. “But with every new technology and innovation, there is the chance for misuse and abuse. … Our leaders who deploy these technologies must ensure that the benefits are real and that our rights are protected.”
ForceMetrics is supported by Stand Together Ventures Lab, which invests in and supports founders and their early-stage start-ups that are challenging the status quo.
Learn more about the Stand Together community’s constitutionally limited government efforts, and explore ways you can partner with us.
