Mass. Outlines Strict Guidelines for Facial Recognition

The Massachusetts Facial Recognition Commission released its recommendations to the Legislature for using the controversial technology, including strong limits on when local police may use facial recognition.

(TNS) — A commission exploring the use of facial recognition technology in Massachusetts issued recommendations to the state Legislature Tuesday in hope of ultimately balancing police use of the tool to identify suspects with public privacy concerns.

Facial recognition technology can identify a person based on facial features captured on video. While it has aided police investigations, use of the software has also raised significant concerns about its accuracy, its implications for public privacy and its impact on due process rights.

Among its most significant recommendations, the state’s Facial Recognition Commission said the commonwealth should ban the technology from being used for live surveillance and place strict limits on local police’s use of the tools.

The commission suggested that local police be prevented from employing facial recognition programs unless they were explicitly allowed to do so by law. Instead, it said the Massachusetts State Police should be in charge of managing facial recognition operations. Any use of algorithms that could recognize emotions also should be made illegal, the commission said.

“Facial recognition and other biometric technologies are new tools with serious privacy, accuracy, and due process concerns that we must address,” State Sen. Jamie Eldridge, a co-chair of the commission, wrote in a statement. “As a legislator, I find the recommendations critical and needed for our current criminal justice system, and as guidelines for the commonwealth’s law enforcement agencies.”

The recommendations delivered to the Legislature by Eldridge and fellow co-chair State Rep. Michael Day were the culmination of an effort by lawmakers to better define the technology’s role within law enforcement and its impact on personal privacy.

Born from a 2020 piece of legislation designed to reform policing in the state, the commission met regularly throughout the last year and reviewed testimony, reports, articles and individual community laws on facial recognition technology. The group also surveyed police across Massachusetts on their use of the software.

“This commission clearly felt that the commonwealth must clarify the role this technology should play in our criminal justice system and better address the due process and civil rights concerns of our residents,” Day said. “The report lays out a series of measures that will provide our law enforcement professionals with the tools they need to keep the public safe while implementing the oversight necessary to ensure that this technology is not misused to the detriment of the general public.”

The 2020 law banned most government agencies from using facial recognition, while also creating the commission to evaluate the use of the technology in the commonwealth.

Other members of the commission included representatives of Gov. Charlie Baker, the American Civil Liberties Union and the state police.

The report Tuesday recommended that police be required to show probable cause that a person in a surveillance video has committed a felony before a judge could sign off on identifying them through facial recognition software.

It also suggested that defendants identified using facial recognition be notified of the technology’s use.

“The ACLU of Massachusetts is proud to support these recommendations, which balance law enforcement interests and civil rights,” Kade Crockford, the ACLU’s Technology for Liberty program director in Massachusetts, said in a statement. “It is critical to get this balance right because face surveillance technology poses profound and unprecedented threats to our privacy and other basic freedoms—and is particularly dangerous for communities of color and other marginalized groups.”

Facial recognition technology has come under fire for its questionable accuracy when identifying racial minorities. Research has shown that programs that nearly always correctly identify a white man may get a Black woman’s identity wrong more than a third of the time.

Not every member of the 21-person commission agreed fully with the report’s findings.

Barnstable District Attorney Michael O’Keefe, a commission member, said that while he approved of much of the report, he took issue with the recommended limits on police identifying people in the area of crimes. A facial recognition match by video would act akin to a “tip” in an investigation, he said, but police would still need to confirm whether the person identified was actually in the area of the crime.

“Much of the report I agree with. The use of this technology, like all new technology, should be regulated and appropriate guardrails should be established,” O’Keefe said. “But police must be allowed to do their job.”

In a statement, Day said that the report made “clear and deliberate recommendations that account for the complexities of emerging facial recognition technology and its implications for individual privacy rights on one hand, and the proper role it can play in our criminal justice system on the other.”

“If the Legislature adopts these recommendations,” he continued, “I believe it will strike the correct balance between those competing interests and will set appropriate guidelines for law enforcement’s use of this technology.”

©2022 Advance Local Media LLC. Distributed by Tribune Content Agency, LLC.