Privacy and civil rights concerns have escalated as the technology becomes easier to deploy and gains traction as a law enforcement tool.
Earlier this month, Microsoft’s President and Chief Legal Officer Brad Smith added fuel to the facial recognition controversy, taking a surprising stand on the issue in a blog post.
According to a 2015 National Conference of State Legislatures report, the U.S. Government Accountability Office found that 41 states and the District of Columbia used facial recognition among their biometric techniques to detect driver’s license fraud, comparing license photos against other photos in each state’s Department of Motor Vehicles system.
When it comes to law enforcement, 21 states allow the FBI, their own police agencies, or both to run facial recognition searches against their driver’s license and mug shot photo databases, according to a 2016 Georgetown Law Center on Privacy and Technology report.
Privacy and civil rights abuses are among the concerns most frequently cited when facial recognition technology is used by law enforcement.
“The technology has vastly improved in its power and the number of cameras out there capturing images of people’s faces and intruding on their privacy,” said Adam Schwartz, senior staff attorney for the Electronic Frontier Foundation. “This improved technology has the capability to track people as they go from place to place, who they are talking to, and what they are doing as they go.”
Protesters, for example, may be leery of participating in demonstrations where cameras and facial recognition are used, potentially chilling their First Amendment right to speak freely, according to Schwartz. He also cited concerns that law enforcement may rely heavily on facial recognition matches as the main tool in a case, deeming an individual a suspect before due process.
Facial recognition accuracy flaws have also been flagged, with experts noting the technology performs well on white males but is less accurate at identifying African-Americans, other minority groups, and women.
Currently, law enforcement takes images of suspects and compares them against photos in driver’s license databases, mug shot databases and other image collections. Agencies also capture real-time images of crowds at airports, parks, streets and other locations and run them against the same databases.
But civil rights activists say a new twist on facial recognition technology is emerging.
“We have particular concerns with Amazon, given the documents obtained by the ACLU of Northern California that showed that the company was aggressively marketing this technology to law enforcement and proactively suggesting uses that would adversely impact individuals’ constitutional rights,” said Neema Singh Guliani, a senior legislative counsel with the American Civil Liberties Union (ACLU). “For example, one of the documents obtained showed Amazon suggesting that the technology could be integrated into body-worn cameras, which are supposed to be tools for police accountability, not surveillance. For this reason, we are actively urging Amazon to stop selling this technology to government.”
The ACLU, along with Amazon shareholders and employees, has also asked the e-commerce titan to halt sales of its facial recognition technology, known as Rekognition, to law enforcement agencies. The shareholders’ letter stated investors were concerned the technology would unfairly and disproportionately target minorities, immigrants and civil rights activists, and could also end up in the hands of foreign governments.
In a blog post last month, Amazon’s Matt Wood, general manager of artificial intelligence at AWS, stated, “Amazon Web Services (AWS) is not the only provider of services like these, and we remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement.”
According to Wood, Rekognition, which launched in 2016, has been used by Amazon’s customers to analyze video to prevent human trafficking, inhibit child exploitation, and reunite missing children with their families; to build educational apps for children; and to help organizations enhance security through multi-factor authentication, find images more easily, and prevent theft of packages from front porches.
The Major Cities Chiefs Association (MCCA), an organization representing police chiefs and sheriffs from the largest cities, also favors the use of the technology.
“While I can envision some cost savings for agencies [that use facial recognition technology], it is much more likely to represent service improvement than cost savings,” said Rick Myers, MCCA executive director. “Imagine an officer being able to scan a crowd at a large gathering … or identifying someone who is a habitual domestic violator through a photo sent by a gun dealer in response to an application for a gun purchase, which could ultimately save a life. While these things may or may not result in cost savings, it is more about increasing effectiveness.”
Myers says his organization is keeping a close eye on any potential legislation that would limit the use of facial recognition technology by law enforcement agencies. While he is not aware of any such legislation pending, he questioned efforts to limit the use of facial recognition for safety in public spaces.
“It remains remarkable to me that limitations on public safety’s ability to surveil and analyze video and photos taken in public locations where there is no constitutionally defined expectation of privacy results in much more hoopla than the private sector’s widespread use of similar technologies unfettered by legislation or even public concern, even when in use in private areas under the control of the private organization,” said Myers.
Amazon’s Wood echoed those thoughts. He noted Amazon opposes a ban on such technology and that citizens and politicians should not limit its use for fear it may be applied for nefarious purposes. “The same can be said of thousands of technologies upon which we all rely each day. Through responsible use, the benefits have far outweighed the risks,” he said.
Microsoft’s Smith explained it makes more sense to have government pass legislation to regulate facial recognition technology than to ask the companies to regulate the ways that government agencies, such as law enforcement, use the technology that they buy from these companies.
Organizations like the Center for Democracy and Technology (CDT) want the government to use facial recognition technologies only when a crime has been committed, rather than as a surveillance tool on the general public.
“There should be no dragnet of crowds. And even if you want to use it for criminal investigation purposes, you should get a warrant for that,” said Chris Calabrese, CDT’s vice president of policy.
On the commercial front, Calabrese says legislation is needed to ensure companies adopt an opt-in policy, applying facial recognition only to people who consent to it, rather than an opt-out policy or no policy at all.
The Electronic Frontier Foundation takes a more strident view. “This technology is so menacing that police should not use it at all,” said Schwartz. “Often, police will use it without regulations. We want police to get a warrant before they can use it. Right now, it’s the wild west.”
Illinois, Texas and Washington are currently the only states with laws specifically addressing facial recognition technology and limiting its use, according to Schwartz. Illinois passed its legislation, the Illinois Biometric Information Privacy Act, a decade ago, becoming one of the first states to prohibit companies from collecting biometric data, including facial scans, without obtaining prior consent. Since then, Texas and Washington have passed similar laws.
The ACLU, meanwhile, has called for a temporary halt to any use of facial recognition technology. “Congress should issue a federal moratorium on the use of this technology, until it can be fully debated,” said Guliani. “Communities can address whether they want the technology used at all by the government, given the concerns, and protections can be adopted to ensure that its use does not have a disproportionate impact on communities of color, activists, and immigrants.”