(TNS) — U.S. Sen. Edward J. Markey is calling for federal oversight of facial recognition technology after a watchdog’s test of Amazon’s commercially available software mistakenly identified Markey and other members of Congress as suspected criminals in an arrest database.
The failure of the futuristic technology, Amazon Rekognition, could lead to innocent people being accused of crimes or winding up in a hostile encounter with law enforcement, according to the American Civil Liberties Union, which conducted the test.
The ACLU said it ran photos of every single member of Congress through a database of mugshots and got more than two dozen false positives — on Markey and 27 other lawmakers.
“Congress must take these threats seriously, hit the brakes, and enact a moratorium on law enforcement use of face recognition,” the ACLU said. “This technology shouldn’t be used until the harms are fully considered and all necessary steps are taken to prevent them from harming vulnerable communities.”
Markey said in a statement, “Congress needs to enact comprehensive privacy legislation that enshrines consumers’ right to own their personal data, and this law should also apply to facial recognition technology. We need rules that ensure consumers have knowledge about how their personal information may be used and the ability to say ‘no’ to its collection and retention.”
In a letter to Amazon CEO Jeff Bezos, Markey and U.S. Reps. Luis Gutierrez (D-Ill.) and Mark DeSaulnier (D-Calif.), who also were misidentified, posed a number of questions about the technology, including how the company ensures accuracy and privacy. They also requested information about which law enforcement agencies Amazon has spoken to about using its technology.
“While facial recognition services might provide a valuable law enforcement tool, the efficacy and impact of the technology are not yet fully understood,” the letter states. “In particular, serious concerns have been raised about the dangers facial recognition can pose to privacy and civil rights, especially when it is used as a tool of government surveillance, as well as the accuracy of the technology and its disproportionate impact on communities of color.”
The ACLU said the false positives were disproportionately people of color, and warned that a false match could lead a police officer to approach a person with inaccurate assumptions that could have significant ramifications.
“An identification — whether accurate or not — could cost people their freedom or even their lives,” the ACLU said.
Amazon has taken heat in recent months for its facial recognition products from employees, investors and the Congressional Black Caucus.
In a statement, Amazon disputed the results, claiming the ACLU had not used proper settings in its searches, and that facial recognition has been used successfully in the past.
“We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families and building educational apps for children),” a spokesperson for Amazon Web Services said in a statement. “With regard to this recent test of Amazon Rekognition by the ACLU, we think that the results could probably be improved by following best practices around setting the confidence thresholds.”
The ACLU said it used Amazon’s default settings.
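The “confidence threshold” dispute turns on the similarity score Rekognition attaches to each candidate match: the service returns every face scoring at or above a cutoff the caller chooses, and the ACLU reportedly ran its test at the 80 percent default, while Amazon has said law enforcement uses should require 99 percent. A minimal sketch of that filtering step, with invented names and scores standing in for real results:

```python
# Hypothetical face-match results of the kind a face-search API returns;
# the person IDs and similarity scores below are invented for illustration.
matches = [
    {"person": "record_0412", "similarity": 81.3},
    {"person": "record_1077", "similarity": 84.9},
    {"person": "record_2230", "similarity": 99.2},
]

def filter_matches(matches, threshold):
    """Keep only candidate matches scoring at or above the threshold."""
    return [m for m in matches if m["similarity"] >= threshold]

# At an 80 percent cutoff, all three hypothetical candidates are reported.
print(len(filter_matches(matches, 80.0)))  # 3
# At a 99 percent cutoff, only the strongest candidate survives.
print(len(filter_matches(matches, 99.0)))  # 1
```

The example shows why the threshold matters in practice: a lower cutoff surfaces more weak matches, and each weak match is a potential false positive of the kind the ACLU’s test produced.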
©2018 the Boston Herald
Visit the Boston Herald at www.bostonherald.com
Distributed by Tribune Content Agency, LLC.