How AI-powered police forces watch your every move
Change in the criminal justice system is rarely linear. It comes in fits and starts, slowed by bureaucracy, politics, and just plain inertia. Reform laws routinely get passed, then rolled back or tied up in court.
However, there is one corner of the system where change is occurring rapidly and almost entirely in one direction: police surveillance technology. From facial recognition to predictive analytics to the rise of increasingly convincing deepfakes and other synthetic video, new tools are emerging faster than agencies, lawmakers, or watchdog groups can keep up, The Marshall Project reports.
Take New Orleans, where, for the past two years, police officers have been quietly receiving real-time alerts from a private facial recognition camera network flagging the whereabouts of people on wanted lists, according to recent reporting by The Washington Post. Since 2023, the technology has been used in dozens of arrests, and it was deployed in two high-profile incidents this year that thrust the city into the national spotlight: the New Year’s Day attack on Bourbon Street that killed 14 people and injured nearly 60, and the manhunt that followed a mass escape from the city jail last month.
In 2022, City Council members attempted to put guardrails on the use of facial recognition, limiting searches to investigations of specific violent crimes and mandating oversight by trained examiners at a state facility.
But those guidelines assume it’s the police doing the searching. New Orleans police operate hundreds of cameras of their own, but the alerts in question came from a separate system: a network of 200 cameras equipped with facial recognition, installed by residents and businesses on private property, and feeding video to a nonprofit called Project NOLA. Police officers who downloaded the group’s app then received notifications, along with a location, whenever someone on a wanted list was detected on the camera network.
That has privacy advocates and defense attorneys in Louisiana frustrated. “When you make this a private entity, all those guardrails that are supposed to be in place for law enforcement and prosecution are no longer there, and we don’t have the tools to do what we do, which is hold people accountable,” Danny Engelberg, New Orleans’ chief public defender, told the Post. Supporters of the camera network, meanwhile, say it has contributed to a drop in crime in the city.
The police department said it would pause the program shortly before the Post’s investigation was published.
New Orleans isn’t the only place where law enforcement has found a way around city-imposed limits on facial recognition. Police in San Francisco and Austin, Texas, have both asked outside agencies to run facial recognition searches on their behalf, according to reporting by the Post last year.
Meanwhile, at least one city is considering a new way to gain the use of facial recognition technology: trading data in exchange for free access. Last week, the Milwaukee Journal Sentinel reported that the Milwaukee Police Department was considering such a swap, offering 2.5 million jail booking photos to a software vendor in return for $24,000 in search licenses. City officials say they would use the technology only in ongoing investigations, not to establish probable cause.
Another way departments can skirt facial recognition rules is to use AI analysis that doesn’t technically rely on faces. Last month, MIT Technology Review reported on the rise of a tool called “Track,” offered by the company Veritone, which follows people across camera footage using attributes like body size, clothing, hair, and accessories. Notably, the algorithm can’t be used to track people by skin color. Because the system is not based on biometric data, it evades most laws intended to restrain police use of identifying technology. It also allows law enforcement to track people whose faces are obscured by a mask or a bad camera angle.
In New York City, police are also exploring ways to use AI to flag suspicious behavior in the subway system. “If someone is acting out, irrational, it could potentially trigger an alert that would trigger a response from either security and/or the police department,” the Metropolitan Transportation Authority’s Chief Security Officer Michael Kemper said in April, according to The Verge.
Beyond people’s physical locations and movements, police are also using AI to change how they engage with suspects. In April, Wired and 404 Media reported on a new AI platform called Massive Blue, which police departments are paying to deploy lifelike online personas. Some applications of the technology include gathering intelligence on protesters and activists, and undercover operations intended to ensnare people seeking sex with minors.
Like most things AI is being employed to do, this kind of operation is not novel. Years ago, I covered efforts by the Memphis Police Department to surveil local activists through a fake Facebook profile named “Bob Smith.” But like many facets of emerging AI, it’s not the intent that’s new; it’s that the digital tools for these kinds of efforts are more convincing, cheaper, and easier to scale.
But that sword cuts both ways. Police and the legal system more broadly are also contending with increasingly sophisticated AI-generated material in investigations and evidence at trial. Lawyers are growing worried about the potential for deepfake AI-generated videos, which could be used to create fake alibis or to falsely incriminate people. In turn, the same technology introduces doubt into even the clearest video evidence. Those concerns became even more urgent with the release of Google Gemini’s new video generation tool last month.
There are also questions about less duplicitous uses of AI in the courts. Last month, an Arizona court watched a victim impact statement delivered by an AI-generated likeness of a murder victim, created by the man’s family. The defense attorney for the man convicted in the case has appealed, according to local news reports, questioning whether the emotional weight of the synthetic video influenced the judge’s sentencing decision.
This story was produced by The Marshall Project, a nonpartisan, nonprofit news organization that seeks to create and sustain a sense of national urgency about the U.S. criminal justice system, and reviewed and distributed by Stacker.