Congress Asks Experts for Guidance on Facial Recognition

Law enforcement uses facial recognition systems with little oversight and, at times, disastrous impact. During a congressional hearing this week, members and experts talked through how new laws could head off greater harm.

When a police detective called his cellphone in January last year to advise him to turn himself in, Robert Williams thought it was a prank. His wife’s call to the local police department confirmed they had no warrant for him. But when the 43-year-old Farmington Hills, Mich., resident pulled into his driveway, Detroit officers were waiting, Williams recalled during a July 13 congressional hearing.

They handcuffed him in front of his family and held him for 30 hours in a detention center, confined with other suspects and without food or water, Williams said. Eventually, he found out the detectives wanted him on a charge of felony larceny from a store he hadn’t visited in years.

How had the police come to pinpoint him? The officers’ facial recognition algorithm had mistakenly matched a six-year-old driver’s license photo of Williams with a surveillance camera image of the suspect.

Williams recalled his conversation with the detective: “I held up the paper [with the suspect’s image] to my face and said, ‘I hope you don’t think all Black men look alike.’”

His young children remain shaken by the seemingly random arrest, Williams said. He is aware his situation could have been even worse.

“What if the crime was capital murder?” Williams said. “The courts system is so backed up that they probably wouldn’t even get to me yet and I’d still be locked up.”

SHINING LIGHT ON FACIAL RECOGNITION


Law enforcement use of facial recognition technology (FRT) is poorly charted territory in the U.S.

Government reports find little oversight, and sometimes little awareness, among agencies about which systems they are using and how, said Gretta Goodwin, director of the Government Accountability Office’s Homeland Security and Justice team, testifying at the hearing.

Goodwin’s team found that 13 of the 14 federal agencies that used FRT for criminal investigations did not have up-to-date knowledge of which non-federal systems they used, such as those owned by third parties or other levels of government. Some agencies’ leadership was initially unaware that employees used the technology at all.

Such oversight gaps leave little opportunity to prevent mistaken arrests or ensure that sensitive facial data about private individuals is kept secure from hacks. After all, agencies cannot monitor and safeguard what they cannot see.

FRT is believed to be actively used by federal, state and local law enforcement. Members of Congress are now asking whether and how FRT can be safely used by these bodies and what rules must be created to stave off harmful impacts.

“Facial recognition systems are quietly being incorporated into American policing,” said Rep. Sheila Jackson Lee, D-Texas, chair of the House Subcommittee on Crime, Terrorism and Homeland Security, who convened the hearing. “Is the technology sufficiently accurate to justify its use by police?”

ACCURACY LIMITS


Police departments may adopt advanced tools with the goal of better protecting residents, and facial recognition has been used to help identify insurrectionists from the Jan. 6 Capitol attack.

But officers unaware of its limitations may overestimate the tool’s usefulness and reliability.

For example, the technology has been shown to be less able to accurately identify children, the elderly, women and people of color, Jackson Lee noted. Officers running image searches to find suspects who are not young adult or middle-aged white men may therefore be especially likely to receive false positives and negatives, in which the system inaccurately flags unrelated individuals or overlooks images that genuinely contain the suspect.
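To make those failure modes concrete, here is a minimal, hypothetical sketch in Python of how a threshold-based matching system can produce both errors at once. The scores, threshold and function names are illustrative assumptions, not any vendor’s actual algorithm.

```python
# Minimal, hypothetical sketch of threshold-based face matching.
# Real systems compare high-dimensional face "embeddings"; the scores
# and threshold below are invented for illustration only.

def is_match(similarity: float, threshold: float = 0.75) -> bool:
    """Declare a match when the similarity score meets the threshold."""
    return similarity >= threshold

# Invented similarity scores between one probe image and two gallery photos:
scores = {
    "actual suspect":   0.70,  # true match scored low (blurry CCTV frame)
    "unrelated person": 0.78,  # coincidental lookalike scored high
}

for person, score in scores.items():
    verdict = "MATCH" if is_match(score) else "no match"
    print(f"{person}: {verdict}")

# Output: the unrelated person is flagged (a false positive) while the
# actual suspect is missed (a false negative).
```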

Rep. Karen Bass, D-Calif., confirmed during questioning that the technology is less likely to misidentify white men and appeared to probe the implications, asking, “What about that in terms of the accuracy and also its use as an investigative tool?” No speaker explicitly recommended demographic-restricted use of the technology, and Bass said the George Floyd Justice in Policing Act she sponsored would, if enacted, block federal resources from funding facial recognition technology.

IMPLEMENTATION ISSUES


Reliability concerns are exacerbated by the tendency of officers to use the tool in untested ways.

The National Institute of Standards and Technology (NIST) assesses the reliability of various facial recognition algorithms, but in vastly different contexts than that of law enforcement, said witness Barry Friedman, New York University School of Law professor and faculty director of its Policing Project.

NIST tests involve higher-quality images than surveillance camera clips and smaller image databases than those law enforcement searches today. It is also unknown whether police and NIST are testing the same specific algorithms.
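Database size matters because every additional gallery image is another chance for a coincidental match. As a back-of-the-envelope illustration (the per-comparison false-match rate here is an assumed figure, not a NIST benchmark), the probability of at least one false match in a one-to-many search grows quickly with gallery size:

```python
# Back-of-the-envelope: P(at least one false match) = 1 - (1 - FMR)^N,
# where FMR is the per-comparison false-match rate and N is gallery size.
# The FMR value below is an assumed illustration, not a measured figure.

fmr = 1e-5  # assumed chance that one comparison falsely matches

for gallery_size in (1_000, 100_000, 10_000_000):
    p_false_match = 1 - (1 - fmr) ** gallery_size
    print(f"{gallery_size:>10,} images -> {p_false_match:6.1%}")

# Roughly 1% at 1,000 images, 63% at 100,000 and near-certainty at
# 10 million -- illustrating why results from small test galleries may
# not carry over to the much larger databases police search.
```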

SLIPPERY SLOPE: PRIVACY AND SURVEILLANCE


Even if facial recognition technology were completely accurate, witnesses said, its use by law enforcement can lead to privacy and rights issues.

Individuals do not give informed consent before their images are pulled into facial recognition photo databases, said Brett Tolman, executive director of Right on Crime, a conservative criminal justice reform advocacy organization. Governments are likely to tap department of motor vehicles and passport photos, while private companies scrape images from social media platforms to create expansive collections.

Kara Frederick, research fellow at the Center for Technology Policy at the Heritage Foundation, said use of third-party systems introduces greater risks of public harm. That’s because private firms are not bound by the Constitution, may be motivated by profit to produce products quickly rather than securely and, in the case of some major firms, may have poor track records on privacy.

Privacy concerns also arise over the potential for facial recognition data to be combined with other sources of personal information about individuals to create comprehensive resident profiles.

Facial recognition technology run on footage from the various store, traffic and other cameras present throughout cities could lead to close mapping of residents’ personal connections and movements, Tolman warned.

Bertram Lee Jr., media and tech policy counsel at the advocacy group the Leadership Conference on Civil and Human Rights, said that low barriers to applying the technology might also see officers identify all individuals who attend political demonstrations or places of worship, potentially scaring off many from exercising rights to free speech and association.

VALID USE CASES?


Tolman said that facial recognition technology could potentially bring value in very limited circumstances, such as helping police find a mass-murder suspect who presents an immediate public threat. He said officers should be required to first get permission from elected officials before using the tool.

Cedric Alexander, a law enforcement specialist who previously served as deputy commissioner of the New York State Division of Criminal Justice Services and as a member of President Obama’s Task Force on 21st Century Policing, meanwhile suggested limiting the tool to helping officers generate investigative leads in certain rare situations.

The technology could potentially aid investigations in which there are no other leads, he said, so long as its findings are used only to inspire new avenues of research and prompt deeper exploration of the identified individuals’ backgrounds, rather than to trigger quick arrests.

Officers would also need to be careful to prevent their implicit biases from causing them to too readily accept the generated findings.

REGULATORY WISHLISTS


Bertram Lee Jr. called for banning — or at least pausing use of — facial recognition technology until measures can be taken to prevent the technology from causing undue harm. Some states and cities have indeed passed prohibitions.

Preventing abuses of FRT requires new federal laws and regulations as well, speakers said, with Alexander recommending that state and federal governments establish training and certification programs to ensure that employees using the technology do so appropriately.

Better mechanisms are also needed to handle cases that reach trial, said Jennifer Laurin, professor at the University of Texas at Austin School of Law. FRT results cannot be admitted as evidence in court but are still used to influence bail hearings and sentencing decisions, and to persuade defendants into guilty pleas before cases go to court. Innocent people who fear harsh sentences should they lose at trial may opt to plead guilty to lesser charges, reducing the punishment they face for acts they did not commit.

Laurin recommended requiring the prosecution to inform defendants early on if FRT was used in the investigation against them, reveal the system’s level of confidence in the match results and, ideally, disclose the code of the algorithms used to reach those conclusions. That last goal, however, is likely to be blocked by private companies that claim their algorithms are proprietary, she said.

Friedman said that NIST also needs to test systems under the conditions in which police actually use them, and proposed Congress use its power to regulate interstate commerce to compel vendors to apply safeguards. Those could include restrictions on probe image eligibility, database size limits and a minimum level of confidence the system must have that two images match before returning results.
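A hypothetical sketch of how such safeguards might gate a search before any result is returned follows. The field names and numeric limits are assumptions for illustration, not statutory values or any vendor’s API.

```python
# Hypothetical sketch of the safeguards Friedman described: probe image
# eligibility, a database size cap and a minimum match confidence.
# All names and numeric limits are invented for illustration.

from dataclasses import dataclass

@dataclass
class SearchRequest:
    probe_quality: float     # assumed 0-1 image-quality score
    gallery_size: int        # number of photos to be searched
    match_confidence: float  # system's score for its best candidate

MIN_PROBE_QUALITY = 0.6       # assumed eligibility floor for probe images
MAX_GALLERY_SIZE = 1_000_000  # assumed database size limit
MIN_CONFIDENCE = 0.95         # assumed floor before results may be returned

def may_return_results(req: SearchRequest) -> bool:
    """Return True only when the search clears every safeguard."""
    return (
        req.probe_quality >= MIN_PROBE_QUALITY
        and req.gallery_size <= MAX_GALLERY_SIZE
        and req.match_confidence >= MIN_CONFIDENCE
    )

# A blurry frame searched against an oversized gallery is rejected:
request = SearchRequest(probe_quality=0.3,
                        gallery_size=25_000_000,
                        match_confidence=0.81)
print(may_return_results(request))  # False
```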

Unless these or other measures are enacted, however, facial recognition technology will remain largely unconstrained.

Rep. J. Luis Correa, D-Calif., emphasized the problem during hearing questioning:

“I’m left here with a very unsettling feeling that I essentially have no remedy for somebody using my facial recognition information for whatever they want to do with it,” Correa said.

“You could create one,” Friedman answered.

Jule Pattison-Gordon is a senior staff writer for Government Technology. She previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.