Maryland General Assembly Warms to Facial Recognition Limits

A plan to limit police use of facial recognition technology is likely to pass in this year’s session of the General Assembly. The bill would allow police to use the tools to investigate violent crimes and serious offenses.

(TNS) — A plan to limit police use of facial recognition technology is on track to pass in this year’s session of the Maryland General Assembly, capping a multiyear effort with a compromise that’s drawn support from both law enforcement and public defenders.

“The world’s going to end,” Baltimore County State’s Attorney Scott Shellenberger joked during a hearing Tuesday in which he supported the legislation.

The bill would allow police to use facial recognition while they investigate violent crimes and other serious offenses. It’s set to put Maryland among roughly a dozen other states that have attempted to regulate the technology in law enforcement.

At a time when officials and the public grapple with the technology’s expanding ubiquity, though, some experts say Maryland’s bill doesn’t go far enough. They want protections against the technology’s tendency to identify Black and brown people less accurately — people who have historically faced disproportionate targeting by law enforcement.

“Once you allow the technologies to spread for law enforcement use, you’re inevitably going to head down the path where it’s abused by law enforcement,” said Jeramie Scott, senior counsel and director of the Project on Surveillance Oversight at the Electronic Privacy Information Center, a nonprofit research and advocacy group. “Barring a ban, a moratorium, law enforcement’s use of facial recognition should be highly, highly regulated and highly restricted.”

Some Maryland law enforcement agencies have used facial recognition technology for more than a decade — sometimes in ways that have drawn criticism. In the aftermath of Freddie Gray’s death in 2015, Baltimore Police used it to monitor protests. In 2022, a man who was misidentified using the technology spent days in the Baltimore County jail.

Del. David Moon, a Montgomery County Democrat who sponsored the bill, said that when he began working on the issue a few years ago, he discovered “the technology was being widely deployed in Maryland, basically unchecked in a sort of wild, Wild West manner.”

Even in the year since Moon and Baltimore County Democratic Sen. Charles Sydnor last proposed the bill, the issues have continued, some say. Debbie Levi, the director of special litigation in the Maryland Office of the Public Defender, told lawmakers that within the last year in Baltimore, at least one rape case, one firearm offense and one armed robbery were dismissed because police use of facial recognition in the cases was not properly disclosed.

Without legal requirements to disclose that information during discovery, Levi said it takes close scrutiny or sometimes “just happenstance” to figure out when the technology’s been used.

“It’s sort of like coming out of the gate too quickly,” Levi said.

Under the bill moving through the legislature, disclosure of facial recognition use would be required during discovery, when prosecutors provide evidence to defense attorneys before a trial. The name of the facial recognition system, the databases searched, and results that led to further investigative action would need to be turned over to a defendant’s lawyer.

Results generated by the technology could generally not be used as evidence in court, except to establish probable cause or to identify someone in connection with the issuance of a warrant or at a preliminary hearing. The results could not serve as the sole basis to establish probable cause or as a positive identification of a person.

Other provisions in the bill would require law enforcement agencies to publicly disclose the systems they use and their policies for using them. The agencies would also need to publish an annual report with specific data on how the technology was used.

Baltimore Police generated 622 facial recognition match reports in 2023 and 811 in 2022 using the state’s Cogent Mugshot System provided by the multinational company Thales, according to the department.

The system is the only facial recognition software the department uses, and it was “unknown” in how many instances the system identified a suspect or person of interest, according to a records request for the 2022 data received by the public defender’s office and provided to The Baltimore Sun.

Police spokeswoman Lindsey Eldridge said in a statement that “facial recognition is just one tool that Baltimore Police Department uses as part of ongoing investigations and to generate investigative leads that contribute to the case closures.”

The technology does not provide probable cause by itself; it must be “coupled with other investigative leads and proven investigative systems and techniques,” she added. The department has established safeguards that include “mandatory training, quality control checks, reporting to and oversight by the City Council.”

Democratic Councilman Kristerfer Burnett has been working on additional guardrails for Baltimore City. A pair of bills he proposed last year would limit police use of the technology to investigating only the most serious crimes, increase transparency, and require businesses to disclose to patrons if they are using such software.

Under the state legislation, police could use facial recognition technology only to investigate certain crimes, including: crimes of violence, human trafficking, child abuse, child pornography, hate crimes, weapon crimes, animal cruelty, drug offenses, and stalking or other criminal acts that present an ongoing threat to public safety or national security.

Maryland State Police would be charged with creating a statewide policy for law enforcement use of facial recognition. The state Department of Public Safety and Correctional Services would develop and administer a training program that officers would need to complete annually.

Those provisions and more would make the law both more comprehensive and, in some ways, less restrictive than the laws of the roughly dozen other states that have passed similar restrictions, said Jake Laperruque, deputy director of the Security and Surveillance Project at the Center for Democracy & Technology, a nonprofit organization that advocates for civil liberties in technology policy.

Laperruque said the most important parts of Maryland’s potential law include limiting the list of uses to only serious crimes, requiring disclosure to defendants when police used it in their case, and prohibiting its use for “live or real-time identification,” such as scanning crowds of people.

Where it comes up short, he said, is in not following the lead of states like Montana and Massachusetts, which require a warrant or other court order for facial recognition scanning.

Scott, of the Electronic Privacy Information Center, said Maryland’s plan is lacking in a number of other areas. One focus, he said, should be on addressing facial recognition algorithm bias issues that, when combined with law enforcement’s history of disproportionately targeting minority communities, could exacerbate inequalities in the criminal justice system.

“The technology hasn’t been proven to be reliable as an investigative tool,” Scott said. “Knowing that facial recognition is a very powerful and pervasive surveillance tool, moving toward widespread law enforcement’s use inevitably puts us in a position where our democratic values and constitutional rights will be undermined by this technology.”

The bill could have stronger protections, for instance, by attempting to ensure that facial recognition algorithms do not have biases, making sure “bias from human review” is addressed with ongoing training, and requiring the specific disclosure of communities that are being targeted with the technology most often.

The list of crimes for which facial recognition is allowed may also be too broad, creating a “potential loophole for uses that could be problematic,” Scott said.

Moon, the bill sponsor, said in an interview that the list of crimes was the primary hang-up before the bill gained momentum last year. When the bill was introduced in the 2023 session, just three bullet points identified the allowed crimes: crimes of violence, human trafficking and ongoing threats to public safety or national security.

After amendments, that list grew to 12, to include everything from weapon crimes to stalking.

That version of the bill, which was reintroduced this year, also notes that the law should “not be construed to restrict” facial recognition in a list of additional cases, like identifying a missing person or, more vaguely, “conducting otherwise legitimate activity unrelated to a criminal investigation.”

Jake Parker, a lobbyist for a trade organization that represents technology companies, said his organization decided to support the bill after the sponsors brought together different stakeholders to develop the compromise, including changing the list of crimes.

“It was a little too restrictive,” he said.

The legislation, House Bill 338 and Senate Bill 182, is expected to be voted on in committees in the coming weeks.

©2024 Baltimore Sun, Distributed by Tribune Content Agency, LLC.