Facial Recognition Experts to Testify to House Committee

The House Homeland Security Committee will hear testimony from the National Institute of Standards and Technology on how the Department of Homeland Security uses the technology and on the technology's limitations.

Photo: Department of Homeland Security (Shutterstock/Keith Homan)
(TNS) -- A congressional committee will hold a hearing Thursday on the Department of Homeland Security's use of facial recognition, following a government report that found the technology's accuracy often varies with race, gender and age -- a disparity critics say could easily result in bias.

The House Homeland Security Committee will hear testimony from the National Institute of Standards and Technology, whose recent report found that, depending on the algorithm used, facial recognition software can produce "false positives" -- wrongly treating photos of two different people as the same person -- at higher rates for Native Americans, Asians and African Americans, and particularly for women of color.
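
The "false positive" the report describes is a false match: the software scores a pair of photos of two different people above its match threshold. As a rough illustration only -- not drawn from the NIST report or any DHS system -- the Python sketch below tallies a false match rate per demographic group from hypothetical comparison scores; the group labels, scores and 0.8 threshold are all invented for this example.

```python
# Minimal sketch: tally a "false match" rate per demographic group for
# one-to-one face comparisons. All data below are hypothetical.
from collections import defaultdict

# Each record: (demographic_group, similarity_score) for a pair of photos
# known to show two DIFFERENT people (an "impostor" pair).
impostor_pairs = [
    ("group_a", 0.62), ("group_a", 0.85), ("group_a", 0.41),
    ("group_b", 0.90), ("group_b", 0.88), ("group_b", 0.55),
]

THRESHOLD = 0.8  # pairs scoring at or above this are declared a "match"

totals = defaultdict(int)
false_matches = defaultdict(int)
for group, score in impostor_pairs:
    totals[group] += 1
    if score >= THRESHOLD:
        false_matches[group] += 1  # two different people wrongly matched

for group in sorted(totals):
    rate = false_matches[group] / totals[group]
    print(f"{group}: false match rate = {rate:.2f}")
```

If such rates differ sharply between groups at the same threshold, the software misidentifies people in some groups more often than in others, which is the kind of demographic disparity the NIST testing measured.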

"The Department of Homeland Security's increasing use of facial recognition technology is quickly turning U.S. airports into surveillance hubs," said Sen. Edward Markey, D-Mass. "I'm working on legislation to force DHS to put a moratorium on its use of this technology until it enacts enforceable rules governing biometric data collection because the risks to our civil rights are just too high."

DHS referred requests for comment to U.S. Customs and Border Protection, which did not respond. But Massachusetts State Police spokesman David Procopio said that facial recognition can be an important tool for law enforcement.

"It is applied to images of a suspect obtained through prior investigative steps," Procopio said, "and any potential suspect identifications made with the technology are subsequently confirmed or rejected by other investigative steps."

But technology consultant Kate O'Neill, founder of KO Insights, said that society has "neither the regulations nor the infrastructure to require that people first opt in to facial recognition, let alone the public education necessary to allow people to make informed choices about the trade-offs of security versus privacy that would go into a wide-scale deployment."

"It's easy for law enforcement and government agencies to claim that tools like facial recognition make people safer," O'Neill said. "But that's not true if the technology misidentifies people...What's more, it is a clear expansion of a police surveillance state, which inherently limits civil liberties."

©2020 the Boston Herald. Distributed by Tribune Content Agency, LLC.