The ACLU also pointed out that 39 percent of the false matches were people of color. Given that only 20 percent of all members of Congress are people of color, this disparity underscores the concern that facial recognition software contains racial bias.
A number of police departments have considered, and even tested, Rekognition software in their work, which has raised concern among many civil liberties groups. An Amazon spokesperson pointed out that the ACLU did not follow Amazon’s recommendations for using the technology: the face-similarity threshold was set at 80 percent rather than the recommended 95 percent. Even so, the ACLU’s test succeeded in shedding light on the technology’s potential for problems in the public sector.
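The threshold dispute comes down to a simple mechanism: a face comparison only counts as a "match" if its similarity score clears the configured cutoff, so a lower threshold admits more borderline (and more often false) matches. The sketch below is a hypothetical illustration of that filtering logic, not Amazon's actual API; the function name and the similarity scores are made up for the example.

```python
# Hypothetical illustration (not Amazon's API): how a similarity
# threshold changes which face comparisons count as "matches".
def filter_matches(scores, threshold):
    """Keep only comparisons whose similarity score meets the threshold."""
    return [s for s in scores if s >= threshold]

# Made-up similarity scores for a batch of face comparisons.
scores = [99.1, 96.4, 88.0, 83.5, 81.2, 72.9]

# At an 80 percent threshold, borderline scores still count as matches...
print(len(filter_matches(scores, 80)))  # → 5

# ...while a 95 percent threshold rejects them.
print(len(filter_matches(scores, 95)))  # → 2
```

Under this toy setup, dropping the threshold from 95 to 80 more than doubles the number of candidate matches, which is why the choice of cutoff matters so much when false positives carry real consequences.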