Culture & Society

How AI Can Change the Way Police Talk to People

Jennifer Eberhardt’s innovative research on the language of policing shows the potential for data to make a difference.

Video of everyday police encounters can provide insights into how officers treat people of different races. | Saiman Chow

May 11, 2026

By Katia Savchuk

The day before she received her PhD in psychology from Harvard, Jennifer Eberhardt got pulled over. It was an afternoon in June 1993, and Eberhardt and a classmate had just picked up a bound copy of her dissertation. Twenty minutes from campus, the lights of a Boston police cruiser flashed behind them.

Eberhardt’s friend, who was driving, stopped the car. “I need your license, registration, and proof of insurance,” the officer barked. When the two young African American women asked why they’d been stopped, he didn’t reply. “Does this vehicle belong to you?” he asked. Eberhardt explained that the car was registered to her mother and that the tags were six weeks out of date. The officer returned to his car, offering no explanation.

“From the way he was treating us, I knew he had no regard for us, no respect,” Eberhardt recalls. The sharp commands and lack of clarification were more than isolated indignities. They were signs of an encounter that could quickly take a turn for the worse — a linguistic signature Eberhardt would later identify in her research on thousands of police stops.

A tow truck arrived. “Exit the vehicle!” the officer ordered. When Eberhardt refused, he called for backup. Four more cruisers encircled the scene. The officer dragged Eberhardt out of the car and slammed her onto its roof, knocking the wind out of her. She slumped to the ground. The officer handcuffed her and took her to a police station.

She was released after a Harvard dean intervened, but was charged with assault and battery on a police officer — for placing her finger on his hand as he unbuckled her seat belt. A judge tossed out the charges.

For many years, Eberhardt blocked the encounter from her memory, even as she built a body of groundbreaking research that examined the workings of unconscious racial bias and its impact on everything from criminal justice and school discipline to real estate values and investments. Eberhardt, a professor of organizational behavior at Stanford Graduate School of Business and a professor of public policy, law, and psychology at Stanford, joined the university faculty in 1998. She received a MacArthur “genius grant” in 2014, and in 2019 she published Biased, a book synthesizing more than two decades of her work trying to understand the “distorting lens that’s a product of both the architecture of our brain and the disparities in our society.”

In her early research, Eberhardt used neuroscience and cognitive psychology to explore the implicit associations Americans make between race and crime. More than a decade ago, she went beyond the lab to investigate police stops, an area that had mostly been studied using police administrative records and narratives from those stopped. In a series of innovative studies, she and her colleagues have used computational linguistics to analyze how police officers speak to members of the public at a broad scale. This research has shed new light on the “respect deficit” many people of color have experienced in their interactions with police officers, detailing how a lack of courtesy can presage an encounter that’s about to intensify.

“At the time of my arrest, I felt that I had no power in that situation at all,” Eberhardt says. “But doing research is a powerful thing — trying to understand how the simple use of words can lead to escalation and use of force.” Working closely with police departments and communities, Eberhardt has translated these insights into interventions that have been shown to make policing fairer and change the culture of law enforcement.

“Jennifer goes to places where many others would fear to tread and works on really tough problems,” says Hazel Rose Markus, a professor of psychology at Stanford who cofounded SPARQ, a behavioral science “do tank,” with Eberhardt. “Jennifer’s principle is, ‘What could I do to make the world better?’” says Dan Jurafsky, a professor of linguistics and computer science at Stanford who has collaborated with her on several studies. “It’s not just about writing another paper.”

In the spring of 2014, the Oakland Police Department (OPD) was still grappling with the fallout from a decade-old scandal involving “the Riders,” a group of officers who had been accused of abusing and planting evidence on innocent Black residents. After victims filed a class-action lawsuit, the city agreed to a $10.9 million settlement, federal oversight of its police department, and reforms that included gathering information on routine police stops. Now that the data was available, the plaintiffs’ attorneys and the federal monitor asked Eberhardt to help analyze it.

Eberhardt had already conducted research with law enforcement in Los Angeles and California’s San Mateo County. She’d also organized conferences that brought academics and police officials together from around the country. But these collaborations left her wanting to go beyond simply measuring disparities. “I realized that we knew more about how to track a problem than how to solve a problem,” she says.

Eberhardt and her colleagues began their collaboration with OPD by listening to the concerns of both rank-and-file police officers and community members. They also analyzed basic data on common interactions between officers and the public. Led by Benoît Monin, a professor of organizational behavior at Stanford GSB, the team examined reports from over 28,000 pedestrian and traffic stops made over a single year. Black people accounted for less than 30% of the city’s population but around 60% of police stops. They were also much more likely to be searched, handcuffed, and arrested when stopped.

When Eberhardt and her team shared these findings, officers bristled. They claimed that the vast majority of stops were based on sound intelligence and that if they made fewer stops, crime rates would go up. “They felt we were claiming that they were racially profiling,” Eberhardt says. To determine the path forward, she, Markus, and Monin joined a task force that included officers of different ranks. “It was clear that she was listening, that she was there to try to understand their situation,” Markus recalls.

Encouraged by Eberhardt’s team, OPD began requiring officers to note whether each stop they made was “intelligence-led.” Initially, only around 20% of stops fit that description; within three years, that share had risen to as high as 50%. In the year after the question was added, the total number of stops dropped by around 12,000 and stops of Black drivers fell by 43%. Yet crime rates continued to fall.

A small step to gather more data proved to be a powerful nudge for change. Not only did it encourage officers to be more aware of their reasons for stopping people, it also signaled a shift in OPD’s priorities. And it demonstrated the ability of data analysis to not just diagnose a problem but offer solutions. Officers, Eberhardt says, “understood that we were there to use data to understand what was really going on and make recommendations about what could be improved.”

OPD had been one of the first law enforcement agencies in the country to roll out body-worn cameras to record its officers’ encounters with the public. By the time Eberhardt was working with the department, it was collecting thousands of hours of footage, but this trove was largely seen as evidence that could be used to investigate individual cases.

Eberhardt recognized an opportunity to analyze raw video of everyday police encounters at scale. “I thought, ‘Wow, we could actually see moment by moment how these interactions are unfolding in a way that administrative data can’t document,’” she says. “We realized we could use the footage as data. By systematically analyzing thousands of stops at a time, we could see patterns that we couldn’t see before.” Those patterns could provide new insights into how officers treated people of different races — and might suggest solutions.

Teaming up with Jurafsky’s computational linguistics lab, Eberhardt and her collaborators started by examining how respectful officers were during day-to-day interactions, an issue that community members had emphasized during focus groups. The researchers used natural language processing techniques to analyze 183 hours of footage taken during nearly 1,000 routine traffic stops. They developed an algorithm that could score how respectful a slice of dialogue was, based on features such as an officer expressing gratitude or concern for a driver’s safety (more respectful) or issuing sharp commands or using informal titles like “my man” (less respectful).
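A cue-and-weight scorer of this kind can be sketched in a few lines. The version below is a toy illustration, not the researchers’ published model: the specific patterns and weights are invented for demonstration, loosely mirroring the features described above (gratitude and safety concern score as more respectful; bare commands and informal titles as less).

```python
import re

# Illustrative cue patterns and weights (assumptions, not the study's).
# Positive weights = more respectful; negative = less respectful.
FEATURE_WEIGHTS = {
    r"\bthank(s| you)\b": 1.0,                # expressing gratitude
    r"\b(drive safe|be safe)\b": 1.0,         # concern for the driver's safety
    r"\bplease\b": 0.5,                       # softened request
    r"\bmy man\b": -1.0,                      # informal title
    r"^(hands|keep|put|give|exit)\b": -0.8,   # opens with a bare command
}

def respect_score(utterance: str) -> float:
    """Sum the weights of every cue that fires on the utterance."""
    text = utterance.lower().strip()
    return sum(w for pat, w in FEATURE_WEIGHTS.items() if re.search(pat, text))

print(respect_score("Thanks for your patience, drive safe now."))  # positive
print(respect_score("Keep your hands on the wheel, my man."))      # negative
```

The actual study scored annotated linguistic features over transcribed body-camera audio at far greater scale and rigor; the sketch only conveys the basic idea of turning dialogue into a numeric respect measure.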

Their study, published in Proceedings of the National Academy of Sciences in 2017, found that OPD officers were consistently less respectful when addressing Black drivers compared to white ones, even after controlling for factors like the officer’s race and the severity of the violation. Based only on an officer’s words, a basic machine learning model could correctly predict the driver’s race for two-thirds of the stops.

In a follow-up study, Eberhardt, Jurafsky, and their colleagues examined the first words officers said during nearly 600 routine traffic stops of Black drivers in a medium-sized U.S. city. Their research, which appeared in PNAS in 2023, found that officers’ language was notably different in stops that led to searches, handcuffing, or arrests. Officers in encounters that escalated were 2.5 times more likely not to give the reason for the stop and almost three times more likely to start the interaction with an order. Based on the first 27 seconds of an officer’s speech, a natural language processing model could predict with more than 70% accuracy whether a stop would escalate.
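The two cues the study highlights — opening with an order, and failing to state a reason for the stop — can be illustrated with a toy risk flag. This is an invented sketch, not the published natural language processing model; the word lists and phrasings are assumptions for demonstration only.

```python
# Toy sketch: count escalation-risk cues in an officer's opening words.
# These keyword lists are illustrative assumptions, not the study's features.
COMMAND_OPENERS = ("put", "keep", "get", "exit", "step", "hands", "turn")
REASON_MARKERS = ("reason i stopped you", "pulled you over because",
                  "stopped you for", "the reason for the stop")

def escalation_risk(opening: str) -> int:
    """Return 0-2: one point for opening with an order, one for no stated reason."""
    text = opening.lower().strip()
    risk = 0
    if text.split()[0] in COMMAND_OPENERS:          # starts with a bare command
        risk += 1
    if not any(m in text for m in REASON_MARKERS):  # no reason for the stop given
        risk += 1
    return risk

print(escalation_risk("Hi there, the reason I stopped you is your brake light."))
print(escalation_risk("Keep your hands where I can see them."))
```

The real model was trained on transcripts of the first 27 seconds of officer speech and reached over 70% accuracy; the sketch simply shows how the two headline features could be operationalized.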

In another study, Eberhardt’s team altered snippets of dialogue from hundreds of traffic stops so that the officers’ tone was audible, but the specific content of their speech was obscured. The study, which appeared in the Journal of Personality and Social Psychology, found that people who heard the audio thought officers used a more positive tone with white drivers than with Black ones, even though they did not know to whom the officers were speaking. The researchers also found that participants who heard clips with a less respectful tone had less trust in the police department.

This research documented a dynamic that was depressingly familiar to many people of color who had been stopped by the police. It also echoed Eberhardt’s own experience years earlier.

Though he had pulled her over for expired tags, a minor violation, the officer in Boston never gave a reason for the stop. He opened with an order, which Eberhardt says sparked an instinctive defiance in her, adding to the likelihood that the officer would use force and arrest her.

“It speaks to the power of language,” Eberhardt says. “Words matter to the point where you can risk your life for them.”

The officer who had pulled Eberhardt over was Black, illustrating another pattern documented in her studies. “Our research showed that black police officers were just as likely as white officers to exhibit less respect to black drivers,” she explains in Biased. “The drivers’ race trumped the officers’ race.”

After the initial study on the respect deficit in Oakland was published, Eberhardt frequently heard from residents that the findings had validated their experiences. Once, she was in the elevator at police headquarters when a Black employee recognized her. “Thank you so much,” the woman told Eberhardt. “We had been talking to the department for years and years about how we were treated during these stops, and they just wouldn’t hear us.”

Soon after she began working with the Oakland Police Department, Eberhardt helped develop a training program that taught officers how hidden racial biases could influence their interactions on the street. In 2018, department leaders asked her and her team to help them add a module guiding police to be more respectful during traffic stops.

The researchers realized they could use the same natural language processing tools they’d used to analyze body-worn camera footage to gauge whether their training worked. That research, published in PNAS in 2024, found that officers who had received training were more likely to convey respect by providing a clear reason for stopping drivers, offering reassurance, and expressing concern for the drivers’ safety.

“A lot of trainings aren’t evaluated at all, or if they are, it’s based on people’s experience of the training or knowledge of the material,” Eberhardt says. “This was one of the first times we were able to look at whether a training could actually make a difference for real interactions with the public.”

Over more than a decade of working in Oakland, Eberhardt has demonstrated one of the key findings of her work: bias is pervasive, yet it is not intractable. LeRonne Armstrong, who served as Oakland’s police chief from 2021 to 2023, says the collaboration with Eberhardt’s team made a profound impact. “We changed the style of policing,” he says. “We actually used the data to get better. The culture is still built around the things we put in place.”

“The Oakland Police Department has moved from being one of the worst… in the San Francisco Bay Area to being one of the best police departments in comparable cities in the country,” attorneys representing plaintiffs in the Riders class-action suit wrote in a brief in 2021. The lawyers told Eberhardt’s team that they no longer received reports of illegal arrests and brutal beatings. And the nature of routine policing changed as well: “The number of stops of innocent Black residents has dropped dramatically,” Eberhardt and Markus noted in a 2024 article, “and when these residents are stopped, they are treated with more dignity and respect.”

Eberhardt and her colleagues continue to conduct research using body-worn camera footage from Oakland and other cities. (A recently released study used machine learning models to analyze street stops by New York City police officers to evaluate compliance with federal court orders.)

They hope their methodology will be adopted by more police departments, as well as in other fields rich in linguistic data, such as healthcare and the court system. This process will become cheaper and easier as speech recognition technology and artificial intelligence evolve. “To the extent that researchers in general can use footage as data,” Eberhardt says, “we can actually leverage the technology to improve society.”
