A Florida Drug Case Could Set Precedent for Facial Recognition in Policing

The sale of $50 of crack is at the center of a court case that could outline new evidentiary rules and standards for the use of the technology in law enforcement.

(TNS) — A humble drug bust in Jacksonville, Fla., could result in a landmark ruling that challenges the rising popularity of the face recognition technology used to help put a man in jail for eight years. The appeal now underway in Florida's First District Court of Appeal could also yield further evidence of what many have asserted: that facial recognition tech is inherently biased against people of color.

The potentially historic case revolves around a typical crime: the sale of $50 of crack to an undercover officer. Police say Willie Allen Lynch sold the drugs on Sept. 12, 2015, to cops who weren’t able to capture him on the spot due to concerns the arrest would blow their cover.

The cops did manage to snap photos of their suspect, however, and three weeks later ran the images through a statewide police database using an algorithm designed by the company MorphoTrust that searches 22 million Florida driver's licenses and more than 11 million law enforcement photos. An analyst looked at a list of candidates ranked by stars and picked Lynch, leading to his arrest and eventual conviction and eight-year sentence.

Lynch, a repeat offender who acted as his own lawyer for part of his trial, handwrote legal papers from jail arguing why the jury needed to see more of the results from the police database. The judge denied that effort, a decision that is central to Lynch's appeal.

The appeal could be the first time a court directly addresses due process and evidentiary standards for face recognition, standards that critics say have not kept up with the technology's advancement or its spread across the country. "One of the things he hit upon was the inherent unfairness of the identification procedure that took place," said Clare Garvie, an expert on police surveillance tech at Georgetown University's Center on Privacy and Technology.

“We’re talking about biometric analysis here. Something roughly equivalent to fingerprinting or DNA analysis,” she says.

Such evidence is often highly scrutinized during trials, but in this case, the results were accepted at (literally) face value. Lynch “was denied access to evidence in his case,” Garvie said. “This is the case that’s presented the issue to a court most clearly thus far.”

Facial recognition begins with a database made up of mugshots and other government photos, like ones held by motor vehicle agencies or the State Department. (The Center on Privacy and Technology estimates that 1 out of 2 adults in America is in a police database.) Each face is analyzed and turned into a numerical representation of its measurements. A user inputs a photo to be searched, and the system converts it into the same kind of numerical representation and compares it against the millions already on file. The system then creates a "candidate list" of possible matches.
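In rough outline, that matching step works like a ranked similarity search over those numerical representations. The short Python sketch below is meant only to illustrate the idea, not the MorphoTrust system itself: the 128-number "templates" are random placeholders standing in for real face measurements, and the cosine-similarity scoring and five-candidate cutoff are assumptions chosen for clarity.

```python
import numpy as np

# Minimal sketch of a face-recognition candidate search, assuming each enrolled
# photo has already been reduced to a fixed-length numerical template. Here the
# templates are random placeholders; a real system derives them from the image.
rng = np.random.default_rng(0)
NUM_ENROLLED, DIM = 100_000, 128                 # kept small; Florida's database is in the millions
gallery = rng.normal(size=(NUM_ENROLLED, DIM))   # one template per enrolled photo
photo_ids = np.arange(NUM_ENROLLED)              # stand-ins for license/mugshot record numbers

def candidate_list(probe, gallery, ids, top_k=5):
    """Return the top_k enrolled photos most similar to the probe template."""
    # Cosine similarity: 1.0 means the templates point the same way, ~0.0 means unrelated.
    gallery_norm = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    probe_norm = probe / np.linalg.norm(probe)
    scores = gallery_norm @ probe_norm
    best = np.argsort(scores)[::-1][:top_k]      # highest-scoring candidates first
    return [(int(ids[i]), float(scores[i])) for i in best]

# The "probe" is the photo the undercover officers took, again as a placeholder template.
probe = rng.normal(size=DIM)
for photo_id, score in candidate_list(probe, gallery, photo_ids):
    print(f"candidate {photo_id}: similarity {score:.3f}")
```

The star ratings described in the Jacksonville court records would amount to coarse buckets of scores like these. The key point is that the system returns a ranked list of possibilities, not a yes-or-no identification; a human analyst still decides which candidate, if any, is the suspect.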

Human intervention is critical in any final analysis because the software itself is far from perfect. Requests for proposals from law enforcement agencies around the country typically require false-positive rates of 2 percent or less for fingerprint systems but will accept 20 percent false-positive rates for facial recognition, according to court papers filed in New York by the Center on Privacy and Technology. The problem is that humans are hardly perfect, either. A 2016 study by the University of York found that people are not nearly as good at identifying people by their faces as they think they are. Training would likely improve analysts' ability to choose the right person from the search results, but there is no accepted standard for what makes someone an expert at it. Says Garvie, "There are no standards for how face recognition systems work."

The FBI, which has access to 411 million photos through various state and federal databases, says facial recognition is critical to its mission. Its database, which also uses a MorphoTrust algorithm, was accurate 85 percent of the time, Deputy Assistant Director Kimberly Del Greco told Congress last year. The Government Accountability Office issued a report in March 2017 reprimanding the FBI for failing to conduct audits of the technology or to regularly submit required reports disclosing the program's growth.

Similar secrecy surrounds the NYPD's use of face recognition. The country's largest police department won't say how many people are in its face database, but disclosures through an ongoing lawsuit brought by the Center on Privacy and Technology have hinted that anyone arrested since 2011 is in the database. The center has submitted requests for information on law enforcement's use of facial recognition across the country.

“We are only now discovering how much this technology is being used by agencies around the state,” says Andy Thomas, a public defender in Florida whose office is representing Lynch in his appeal.

Those error-prone results get even dicier when the suspect is black, as Willie Lynch is. An often-cited 2012 study that used mugshots from the Pinellas County database found one face-recognition algorithm failed nearly twice as often on photos of black people as on photos of white people. Black people also make up a disproportionate share of the mugshots in police databases, driving up the possibility of error and wrongful imprisonment.

In Lynch’s case, the database returned four other matches, in addition to his photo. Court records show that a Yelp-like star system ranked each one. “It’s like we’re rating the likelihood this person is a crack seller,” says Thomas.

Jacksonville Sheriff’s Office analyst Celbrica Tenah picked Lynch as the suspect. She testified that she didn’t know the maximum number of stars a possible match could have, according to court papers. “I can’t speak to the algorithms about how it puts one, two, three, four, five but it does from my understanding arrange the photos based on what’s most (alike) to the photo that you uploaded,” she said.

That still leaves plenty of room for fudging results. One striking example of the technology’s flexible standards came in an interview with Roger Rodriguez, a retired NYPD detective. He told Forensic Magazine last year that, under the current system, an investigator can take an image of a suspect with his eyes closed and then paste the eyes of someone else in the hopes of obtaining a match. “When you (edit an image), an analyst is giving that photo a second opportunity to return a match,” Rodriguez said. “That is where the NYPD paved the way in their thinking. We focused on enhancing tools and giving photos a second opportunity, so that’s where the art comes in.”

Rep. Ted Lieu, D-Calif., said that needs to change. “There are questions about whether current facial recognition technology can accurately identify people of color, which opens up a whole host of concerns around bias and profiling,” Lieu said in a statement. “It’s a legal and ethical issue with major privacy implications that we’re contending with here in Congress.”

©2018 New York Daily News. Distributed by Tribune Content Agency, LLC.