Following records requests from The Post, officials paused the first known, widespread live facial recognition program used by police in the United States.
A Project NOLA security camera keeps watch over the corner of Conti and Burgundy streets in New Orleans on May 8. (Edmund D. Fountain/For The Washington Post)
NEW ORLEANS — For two years, New Orleans police secretly relied on facial recognition technology to scan city streets in search of suspects, a surveillance method without a known precedent in any major American city that may violate municipal guardrails around use of the technology, an investigation by The Washington Post has found.
Police increasingly use facial recognition software to identify unknown culprits from still images, usually taken by surveillance cameras at or near the scene of a crime. New Orleans police took this technology a step further, using a private network of more than 200 facial recognition cameras to watch over the streets, constantly monitoring for wanted suspects and automatically pinging officers’ mobile phones through an app to convey the names and current locations of possible matches.
This appears out of step with a 2022 city council ordinance, which limited police to using facial recognition only for searches of specific suspects in their investigations of violent crimes and never as a more generalized “surveillance tool” for tracking people in public places. Each time police want to scan a face, the ordinance requires them to send a still image to trained examiners at a state facility and later provide details about these scans in reports to the city council — guardrails meant to protect the public’s privacy and prevent software errors from leading to wrongful arrests.
Since early 2023, the network of facial recognition cameras has played a role in dozens of arrests, including at least four people who were charged only with nonviolent crimes, according to police reports, court records and social media posts by Project NOLA, a crime prevention nonprofit that buys and manages many of the cameras. Officers did not disclose their reliance on facial recognition matches in police reports for most of the arrests for which the police provided detailed records, and none of the cases were included in the department’s mandatory reports to the city council on its use of the technology. Project NOLA has no formal contract with the city but has been working directly with police officers.
“This is the facial recognition technology nightmare scenario that we have been worried about,” said Nathan Freed Wessler, a deputy director with the ACLU’s Speech, Privacy, and Technology Project, who has closely tracked the use of AI technologies by police. “This is the government giving itself the power to track anyone — for that matter, everyone — as we go about our lives walking around in public.”
New Orleans Police Superintendent Anne Kirkpatrick during an interview this month. (Edmund D. Fountain/For The Washington Post)
Anne Kirkpatrick, who heads the New Orleans Police Department, paused the program in early April, she said in an interview, after a captain identified the alerts as a potential problem during a review. In an April 8 email reviewed by The Post, Kirkpatrick told Project NOLA that the automated alerts must be turned off until she is “sure that the use of the app meets all the requirements of the law and policies.” The Post began requesting public records about the alerts in February.
The police department “does not own, rely on, manage, or condone the use by members of the department of any artificial intelligence systems associated with the vast network of Project Nola crime cameras,” Reese Harper, a spokesman for the agency, said in an emailed statement.
Police across the country rely on facial recognition software, which uses artificial intelligence to quickly map the physical features of a face in one image and compare it to the faces in huge databases of images — usually drawn from mug shots, driver’s licenses or photos on social media — looking for possible matches. New Orleans’s use of automated facial recognition has not been previously reported and is the first known widespread effort by police in a major U.S. city to use AI to identify people in live camera feeds for the purpose of making immediate arrests, Wessler said.
The Post has reported that some police agencies use AI-powered facial recognition software in violation of local laws, discarding traditional investigative standards and putting innocent people at risk. Police at times arrested suspects based on AI matches without independent evidence connecting them to the crime, raising the chances of a false arrest. Often, police failed to inform defendants about their use of facial recognition software, denying them the opportunity to contest the results of a technology that has been shown to be less reliable for people of color, women and older people.
A facial recognition system deployed by police in London’s Oxford Circus on May 13. London is one of the few places where live facial recognition is known to be in wide use. (Leon Neal/Getty Images)
One of the few places where live facial recognition is known to be in wide use is London, where police park vans outside high-traffic areas, use facial recognition-equipped cameras to scan the faces of passersby and confront people deemed a match to those on a watch list. While the city says the program has never led to a false arrest since launching in 2016, Big Brother Watch, a London-based civil liberties group, argues that the practice treats everyone as a potential suspect, putting the onus on the people who were falsely matched to prove their innocence.
Real-time alerts
The surveillance program in New Orleans relied on Project NOLA, a private group run by a former police officer who assembled a network of cameras outside of businesses in crime-heavy areas, including the city’s French Quarter.
Project NOLA configured the cameras to search for people on a list of wanted suspects. When the software determined it had found a match, it sent real-time alerts via an app some officers installed on their mobile phones. The officers would then quickly research the subject, go to the location and attempt to make arrests.
Police did not set up the program, nor can they directly search for specific people or add or remove people from the camera system’s watch list, according to Bryan Lagarde, Project NOLA’s founder.
Little about this arrangement resembles the process described in the city council ordinance from three years ago, which imagined detectives using facial recognition software only as part of methodical investigations with careful oversight. Each time police want to scan a face, the ordinance requires them to send a still image to a state-run “fusion center” in Baton Rouge, where various law enforcement agencies collaborate on investigations. There, examiners trained in identifying faces use AI software to compare the image with a database of photos and only return a “match” if at least two examiners agree.
Investigators have complained that this process takes too long and often doesn’t result in any matches, according to a federally mandated audit of the department in 2023. It has proved useful in only a single case that led to an arrest since October 2022, according to records police provided to the city council.
A surveillance camera mounted to the underside of a balcony on Bourbon Street in New Orleans. (Edmund D. Fountain/For The Washington Post)
Freddie King, a New Orleans council member who represents the district that includes the French Quarter, voted in support of a 2022 ordinance that authorized police to use facial recognition as long as they adhered to certain guardrails. (Edmund D. Fountain/For The Washington Post)
By contrast, Project NOLA claims its facial recognition cameras have played a role in at least 34 arrests since they were activated in early 2023, according to the group’s Facebook posts — a number that cannot be verified because the city does not track such data and the nonprofit does not publish a full accounting of its cases. Without a list of the cases, it’s impossible to know whether any of the people were misidentified or what additional steps the officers took to confirm their involvement in the crimes.
Kirkpatrick said her agency has launched a formal review into how many officers used the real-time alerts, how many people were arrested as a result, how often the matches appear to have been wrong and whether these uses violated the city ordinance.
“We’re going to do what the ordinance says and the policies say, and if we find that we’re outside of those things, we’re going to stop it, correct it and get within the boundaries of the ordinance,” she said.
There are no federal regulations around the use of AI by local law enforcement. Four states — Maryland, Montana, Vermont and Virginia — as well as at least 19 cities in nine other states explicitly bar their own police from using facial recognition for live, automated or real-time identification or tracking, according to the Security Industry Association, a trade group.
Lawmakers in these places cited concerns in public meetings that the technology could infringe on people’s constitutional rights or lead police to make mistakes when they rush to arrest a potential suspect before taking steps to confirm their connection to the crime, as many people look alike. At least eight Americans have been wrongfully arrested due to facial recognition, The Post and others have reported.
The unsanctioned surveillance program in New Orleans highlights the challenge of regulating a technology that is widely available, at a time when some police see AI as an invaluable crime-fighting tool. Even in some places where officials have banned facial recognition, including Austin and San Francisco, officers skirted the bans by covertly asking officers from neighboring towns to run AI searches on their behalf, The Post reported last year.
Violent crime rates in New Orleans, like much of the country, are at historic lows, according to Jeff Asher, a consultant who tracks crime statistics in the region. But city officials have seized on recent instances of violent crime to argue that police need the most powerful tools at their disposal.
Last month, an independent report commissioned after the New Year’s Day attack that left 14 people dead on Bourbon Street found the New Orleans police to be understaffed and underprepared. The report, overseen by former New York City police commissioner William Bratton, advised New Orleans to explore adopting several new tools, including drones, threat prediction systems and upgrades to the city’s real-time crime center — but did not recommend adding any form of facial recognition.
Kirkpatrick, the city’s top police official, and Jason Williams, its top prosecutor, both said they are in discussions with the city council to revise the facial recognition ordinance. Kirkpatrick says she supports the idea of the city legally operating its own live facial recognition program, without the involvement of Project NOLA and with certain boundaries, such as prohibiting use of the technology to identify people at a protest.
“Can you have the technology without violating and surveilling?” she asked. “Yes, you can. And that’s what we’re advocating for.”
5,000 cameras
Few people have as much visibility into the everyday lives of New Orleans residents as Lagarde, a former patrol officer and investigator who started his own video surveillance business in the late 1990s before launching Project NOLA in 2009.
Funded by donations and reliant on businesses that agree to host the cameras on their buildings or connect existing surveillance cameras to its centralized network, Project NOLA has access to 5,000 crime cameras across New Orleans, most of which are not equipped with facial recognition, Lagarde said. The cameras all feed into a single control room in a leased office space on the University of New Orleans campus, Lagarde said in an interview at the facility. Some camera feeds are also monitored by federal, state and local law enforcement agencies, he said.
Bryan Lagarde, who founded Project NOLA in 2009, in a real-time video monitoring room at the University of New Orleans in 2017. (Max Becherer/The Advocate)
Project NOLA made $806,724 in revenue in 2023, tax filings show. Much of it came from “cloud fees” the group charges local governments outside of New Orleans — from Monticello, Florida, to Frederick, Colorado — which install Project NOLA cameras across their own towns and rely on Lagarde’s assistance in monitoring crime. He has experimented with facial recognition in Mississippi, he said, but his “first instance of doing citywide facial recognition is New Orleans.” New Orleans does not pay Project NOLA.
For more than a decade, Lagarde used standard cameras outside businesses to monitor crime and offer surveillance clips for officers to use in their investigations. Lagarde’s cameras became so widespread that police began calling him when they spotted a Project NOLA camera near a crime scene they were investigating, according to police incident reports, interviews with police and emails obtained through a public records request.
Lagarde began adding facial recognition cameras to his network in early 2023, after an $87,000 bequest from a local woman. Lagarde used the money to buy a batch of cameras capable of detecting people from about 700 feet away and automatically matching them to the facial features, physical characteristics and even the clothing of people in a database of names and faces he has compiled.
Lagarde says he built his database partly from mug shots from local law enforcement agencies. It includes more than 30,000 “local suspected and known criminals,” Project NOLA wrote on Facebook in 2023. Lagarde can quickly identify anyone in the database the moment they step in front of a Project NOLA camera, he said. He can also enter a name or image to pull up all the video clips of that person Project NOLA captured within the last 30 days, after which, Lagarde says, videos are automatically deleted “for privacy reasons.”
Project NOLA found enthusiastic partners in local business owners, some of whom were fed up with what they saw as the city’s inability to curb crime in the French Quarter — the engine of the city’s tourism economy that is also a hub for drug dealers and thieves who prey on tourists, said Tim Blake, the owner of Three Legged Dog, a bar that was one of the first places to host one of Project NOLA’s facial recognition cameras.
“Project NOLA would not exist if the government had done its job,” Blake said.
Tim Blake’s bar, the Three Legged Dog, was one of the first places to host a Project NOLA camera. (Edmund D. Fountain/For The Washington Post)
While Lagarde sometimes appears alongside city officials at news conferences announcing prominent arrests, he is not a New Orleans government employee or contractor. Therefore, Lagarde and the organization are not required to share information about facial recognition matches that could be critical evidence in the courtroom, said Danny Engelberg, the chief public defender for New Orleans.
“When you make this a private entity, all those guardrails that are supposed to be in place for law enforcement and prosecution are no longer there, and we don’t have the tools to do what we do, which is hold people accountable,” he said.
Lagarde says he tries to be transparent by posting about some of his successful matches on Facebook, though he acknowledges that he posts only a small fraction of them and says it would be “irresponsible” to post information about open investigations. Project NOLA, he added, is accountable to the businesses and private individuals who host the cameras and voluntarily opt to share their feeds with the network.
“It’s a system that can be turned off as easily as it’s been turned on,” he said. “Were we to ever violate public trust, people can individually turn these cameras off.”
Banned devices
Lagarde declined to say who makes the equipment he uses, saying he doesn’t want to endorse any company.
Several Project NOLA cameras in the French Quarter look nearly identical to ones on the website of Dahua, a Chinese camera maker, and product codes stamped on the backs of these devices correspond to an identical camera sold by Plainview, New York-based equipment retailer ENS Security, which has acknowledged reselling Dahua cameras in the past. Project NOLA’s website also contains a link to download an app where police officers can view and manage footage. The app, called DSS, is made by Dahua.
Congress banned federal agencies from using products or services made by Dahua and a list of other Chinese companies in 2018, citing concerns that the equipment could be used by President Xi Jinping’s government to spy on Americans. Since 2020, the law has barred any agency or contractor that receives federal funds from using those funds on the banned products.
“This technology requires accountability,” said Stella Cziment, a lawyer who heads a watchdog agency overseeing the practices of the New Orleans Police Department. “I am never going to be satisfied with the accountability it receives if it’s in a private entity’s hands.” (Edmund D. Fountain/For The Washington Post)
A Project NOLA security camera mounted to the Hotel Monteleone. (Edmund D. Fountain/For The Washington Post)
A Dahua spokesperson declined to comment on the New Orleans cameras and said the company stopped selling equipment in the U.S. last year.
The New Orleans Police Department has received tens of millions of dollars from the federal government in recent years and confirmed that some officers have installed the DSS app on mobile phones and police workstations. Kirkpatrick said she was not aware of who made the app or cameras but would look into it.
Lagarde said Project NOLA uses “American-made, brand-name servers to operate our camera program.”
Some city officials argue that police are not violating the city’s facial recognition ordinance because they do not own the cameras or contract with Lagarde; they are merely receiving tips from an outside group that is performing facial recognition scans on its own.
“If Bryan Lagarde calls an officer and says ‘I think a crime is occurring on the 1800 Block of Bienville,’ that’s no different than Miss Johnson looking out of her window and saying ‘I think a crime is occurring on 1850 Bienville,’” Williams, the Orleans Parish district attorney, said in an interview.
But in many cases, police have gone to Lagarde to request footage or help identifying and locating suspects, according to police reports, Project NOLA social media posts and internal police emails.
Tracking a suspect
In one case last year, a police detective investigating a snatched cellphone relied on Project NOLA to identify the perpetrator and track him down using facial recognition alerts, according to accounts of the investigation drawn partly from the police incident report and partly from Project NOLA’s Facebook post.
The detective contacted Lagarde “to assist locating the perpetrator on Project NOLA cameras,” according to the police report, providing still shots taken from the city’s surveillance camera footage. Lagarde used Project NOLA’s clothing recognition tool to find previous video footage of a suspect. With the new, better images of his face, Project NOLA used facial recognition to learn his possible identity and share that with the detective.
The detective took that name and found photos of a man on social media whose appearance and tattoos matched the phone-snatcher. Police got a warrant for his arrest. Lagarde added that name and face to Project NOLA’s watch list, and a few days later, cameras automatically identified him in the French Quarter and alerted police, who found and arrested him. The man was charged with robbery but pleaded guilty to the lesser offense of theft, court records show.
The police report mentioned that Lagarde helped identify the suspect, but did not mention that he used facial recognition to do so, or that live facial recognition and automated alerts were used to monitor for and locate him.
New Orleans Police Sgt. David Barnes. (Edmund D. Fountain/For The Washington Post)
David Barnes, a New Orleans police sergeant overseeing legal research and planning, said officers are trained to always find probable cause before making an arrest. He said Lagarde sometimes overstates in Facebook posts the role his technology played in some of the cases. He said the detective investigating the phone-snatching case was asking Lagarde only to find videos of the suspect, not the location of the suspect.
On a rainy May morning outside the Three Legged Dog, a Project NOLA camera swiveled about, blinking red and blue lights, and twitching side to side as it followed cars and people based on an automated program. The camera is no longer pinging the police on an app — at Kirkpatrick’s request.
“Like you and everybody else, I do not want to lose any cases of violent criminals based on policy violations or violations of our ordinances,” Kirkpatrick said in her email last month to Lagarde.
But the alerts still go to Project NOLA staff, who Lagarde said convey the location of wanted suspects to the police via phone calls, texts and emails.
Schaffer reported from Washington. Nate Jones and Jeremy Merrill contributed to this report.