This week, the ACLU released the results of a test conducted with Rekognition, Amazon’s facial recognition technology, in which it asked the software to identify members of Congress. According to Digital Trends, images of members of Congress were compared to a publicly accessible database of 25,000 mug shots, and the software incorrectly matched 28 of them to police suspects.
The ACLU also pointed out that 39 percent of the false matches were people of color. When coupled with the fact that only 20 percent of all members of Congress are people of color, this draws attention to the concern that facial recognition software contains racial bias.
A number of police departments have been considering, and in some cases testing, the use of Rekognition software, which has raised concern among many civil liberties groups. While an Amazon spokesperson pointed out that the ACLU did not follow Amazon’s recommendations for using the technology (the similarity threshold between faces was set at 80 percent rather than the recommended 95), the ACLU’s test succeeded in shedding light on the technology’s potential for problems in the public sector.
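The threshold dispute can be illustrated with a minimal sketch. The similarity scores below are invented for illustration, but the logic shows why the setting matters: the same pool of candidate faces yields far more "matches" at an 80 percent cutoff than at Amazon's recommended 95 percent.

```python
# Hypothetical illustration of how a similarity threshold changes results.
# The candidate names and scores here are invented; Amazon Rekognition
# reports a similarity score from 0 to 100 for each candidate face match.

hypothetical_matches = [
    {"name": "Candidate A", "similarity": 99.1},
    {"name": "Candidate B", "similarity": 91.4},
    {"name": "Candidate C", "similarity": 83.7},
    {"name": "Candidate D", "similarity": 80.2},
]

def filter_matches(matches, threshold):
    """Keep only candidates at or above the similarity threshold."""
    return [m for m in matches if m["similarity"] >= threshold]

# At the ACLU's 80 percent setting, all four invented candidates count as matches.
print(len(filter_matches(hypothetical_matches, 80)))  # 4
# At Amazon's recommended 95 percent, only one does.
print(len(filter_matches(hypothetical_matches, 95)))  # 1
```

Lowering the threshold trades precision for recall: more true matches are caught, but the rate of false positives, like the 28 misidentified members of Congress, rises with it.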