The use of facial recognition technology is being debated in communities across the country, including in Massachusetts, where some local officials have taken regulating the technology upon themselves.
(TNS) — When Northampton Police Chief Jody Kasper joined the force as a patrol officer in 1998, the department was still using ink fingerprinting for criminals. It was a messy, “smudgy” and archaic process.
Since then, technology has evolved. Police now use digital fingerprinting, a quicker, less burdensome process. A similar evolution happened with police photography, as officers went from Polaroid to digital cameras.
Another technology, considered far more controversial, is causing many communities to consider its ramifications: facial recognition, unregulated software that can scan faces captured by cameras and match them to identities. Some in law enforcement, as well as public officials, civil liberties advocates and even those working in the business of artificial intelligence, fear the use of the software can lead to misidentifications. Facial recognition software continues to improve, but risks remain.
“It’s a whole new world,” said Kasper, who expressed concerns with the accuracy of the software. “We’ve seen a lot of technology in our field develop over time. I’m sure that’s what will happen with this too.”
San Francisco banned the technology in May 2019, becoming the first city in the country to do so. Somerville followed suit in June, and similar ordinances have been passed in Brookline, Northampton and, most recently, Cambridge. The Springfield City Council is set to vote on a facial recognition moratorium in February.
Critics of the software have argued it violates individuals’ civil liberties. And, they point out, there is a high risk of misidentification, especially in identifying people of color, elderly individuals and women, according to research from the Massachusetts Institute of Technology and federal studies.
Municipal officials passing facial recognition bans also argue local legislation is needed as federal and state legislation is lacking and government agencies are able to use the software with no oversight.
“There’s literally nothing stopping law enforcement from tracking someone who goes to a protest, who goes to a mosque or a church,” said Somerville City Councilor Ben Ewen-Campen, who introduced his city’s ban. “It’s just a black box.”
A bill being considered by the Massachusetts Senate and backed by the American Civil Liberties Union of Massachusetts would place a moratorium on the technology and other forms of biometric surveillance, including the analysis of a person’s gait and eyes, until officials are able to appropriately regulate the software.
One of the main reasons advocates support the moratorium is that no federal or state legislation restricting the use of biometric surveillance exists, according to Democratic state Sen. Cynthia Creem, who represents Massachusetts’ First Middlesex and Norfolk district and introduced the bill in the state Senate.
“Right now, there’s no legislation,” Creem told MassLive. “There’s no regulating what pictures are being used, how they’re used.”
Dozens of people, including officials from the office of Suffolk District Attorney Rachael Rollins, the Boston Teachers Union and Massachusetts Institute of Technology, testified in support of a statewide moratorium on the government’s use of the technology at a legislative hearing in October.
Among those who spoke at the hearing was Brennan White, CEO of Cortex, a Boston-based company that creates artificial intelligence platforms for social media marketing and that uses facial recognition technology regularly.
While the software has the potential to be used positively, according to White, it is also incredibly powerful and can be easily misused. Unbridled use of the technology can potentially set “up an infrastructure that could be abused by future dictators,” he said.
“In the wrong hands, this could lead to a dystopian-type society,” the CEO said.
Debate on biometric surveillance is happening at the federal level as well.
U.S. House Oversight and Reform Committee members, Democrats and Republicans alike, criticized the software in May of last year, The Washington Post reported. Lawmakers brought up similar concerns about how biometric surveillance remains unregulated, how the technology can be faulty and how it potentially violates Americans’ personal privacy.
“The era of this use and capability is moving quicker than people have caught onto those issues,” Creem said. “There’s a lot more talk about it.”
To make the public more aware of the alleged dangers of facial recognition, the ACLU of Massachusetts kicked off a campaign in June called “Press Pause on Face Surveillance.”
Officials at the organization say state government agencies, including the Registry of Motor Vehicles and Massachusetts State Police, have used the technology in secret for more than 10 years “without any oversight, accountability, legislative authorization from any independent actor,” according to Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts.
The group sued the Massachusetts Department of Transportation after the agency did not respond to two public records requests in early 2019 regarding the department’s use of facial surveillance technology, according to the ACLU. MassDOT has allegedly used its database of state-issued ID photographs for facial recognition since 2006, according to the lawsuit.
In response to a request for comment about the department’s use of facial recognition and the ACLU’s lawsuit, a MassDOT spokesperson said in a statement to MassLive that the RMV captures images as part of the process of issuing driving licenses. The images are then run against other images in the agency’s system to prevent identity fraud.
The accessing of RMV images by law enforcement is permitted under the Driver's Privacy Protection Act of 1994, the spokesperson added.
A Washington Post report also revealed that investigators with the Federal Bureau of Investigation and Immigration and Customs Enforcement used states’ Department of Motor Vehicle databases for facial recognition and scanned through hundreds of millions of people’s photographs without their knowledge or consent. The majority of the images were of state residents, most of whom had never been charged with a crime.
Crockford said the unregulated use of biometric surveillance is part of a trend, where the technology is moving “much, much faster than the law.”
“We believe that the appropriate response to protect the public and to protect civil rights in Massachusetts is to press pause,” she said.
Government use of the technology, and the subsequent bans of the software, are playing out not only in Massachusetts but nationwide, Crockford said.
Communities in California, beyond San Francisco, are considering their own bans. State officials in Michigan are also looking to restrict law enforcement’s use of the technology, as the ACLU of Michigan investigates the statewide scope of facial recognition activities, according to a press release. The New York ACLU branch is trying to prevent face surveillance in public schools as well, Crockford said.
“We here in Massachusetts are fighting this bi-coastal battle that the technology doesn’t get ahead of our rights,” Crockford said.
Northampton’s police chief and others in law enforcement are also wary of the software and want to “press pause” on its use.
Kasper recently visited China, where she noticed several surveillance cameras set up in public locations. According to the ACLU, China’s government uses the technology “to control and oppress religious minorities and political dissidents.”
The New York Times reported Beijing has invested billions of dollars in the software and in other methods of surveillance. With its thriving technology industry and millions of cameras monitoring the public, the country aims to create a strong national surveillance system, according to the Times’s report.
The police chief called China’s use of facial recognition “creepy.”
“That may be the concern there: an authoritarian government,” she said.
Kasper, like several advocates for facial recognition bans, still sees utility in the technology once it matures. There are a number of applications for the software that people have not considered, she said, whether that be putting a name to an unidentified body or catching a robber whose face was captured on camera.
However, the odds of Kasper’s department using the technology in the next several years are low, the chief added.
“We’ll kind of see where we are at in three years,” she said.
Springfield City Councilor Orlando Ramos compared restricting the use of facial recognition software to trying to legislate flying cars: Officials can’t regulate the technology until they know how it works.
The councilor, who introduced his city’s temporary ban on facial recognition, said he can see the software becoming an effective tool for law enforcement investigations, but only if it evolves to become more accurate. Software of that caliber doesn’t exist yet, he said.
“It’s not worth potentially putting people at risk with technology that’s not effective,” he said.
For the towns and cities in Massachusetts that have passed or introduced their own moratoriums and bans on facial recognition technology, the concerns have remained largely the same.
Local officials point to the software’s biases and inaccuracies; they argue its use by government agencies without the public’s permission violates personal privacy and civil liberties; and they believe deploying it opens their communities up to lawsuits from people targeted by the technology.
Ramos said introducing Springfield’s ban boiled down to three reasons, the first being the potential for litigation.
“We want to protect the taxpayer,” the councilor said. The other two motivations included protecting the public from unnecessary government surveillance and preventing the mistreatment of marginalized communities.
“It’s more than an inconvenience,” Ramos said. “It’s a danger for people of color.”
The city councilor said widespread public support is another reason banning the technology is worthwhile. In a June poll of more than 500 Massachusetts adults, the ACLU of Massachusetts found nearly eight in 10 support a moratorium on government use of facial recognition technology.
Where public officials differ is on how long they want their bans to last.
While Northampton’s moratorium on the municipal use of facial recognition will last three years and Springfield’s proposed ban would remain in effect for five years, Brookline, Cambridge and Somerville have all banned the technology indefinitely.
One of the purposes of Ramos’s ordinance is to allow police to come back to the table in the future with more accurate facial recognition technology that they want to use as an investigative tool under limited circumstances. The city councilor said he wants to be supportive of police but smart about the technology and its applications.
“It’s not our intention to hinder police in any way,” he said.
Ben Ewen-Campen, who spearheaded Somerville’s legislation, said the decision to do so was straightforward and that his ordinance, a two-page document, passed unanimously.
“There was a general sense that there was really, really widespread support of this,” he said. “I think everyone understood in the absence of regulation, in the absence of transparency, this goes way too far.”
Ewen-Campen does not think there is any situation where facial recognition technology’s use at the municipal level would be acceptable, even if the software were to work perfectly without misidentifications, he said.
“I personally can’t imagine a situation where a local police department should have that sort of power,” the city councilor said.
Lawmakers would be in a different position if government agencies came to the public years ago to ask for permission to use facial recognition under specified circumstances, instead of allegedly using it behind closed doors, according to Ewen-Campen.
“It shouldn’t be a surveil-first, ask-questions-later policy,” the city councilor said. “I think the burden of proof should be on the people who want to use the really powerful surveillance technology.”
From catching bad guys to identifying victims: How facial recognition can help
Brookline Police Sgt. Casey Hatchett, a member of Town Meeting, spoke against the town’s ban of facial recognition technology in late 2019. She argued enacting the legislation was too hasty and wanted limited exemptions for law enforcement, noting, though, that Brookline police had not purchased or deployed the technology and were not looking to do so.
“There was not enough time before Town Meeting to fully consider the effects of this ban -- not enough time to fully grasp the potential uses of this technology today -- and in the future, not enough time to inform ourselves and the public about the differences between facial surveillance and facial recognition, and not enough time to fully consider some of the acceptable uses of this technology,” she said at the meeting, according to prepared comments.
Those acceptable uses, Hatchett told MassLive, could potentially include identifying a person with Alzheimer’s who may be lost and not able to remember their own name, locating missing children, preventing terrorist threats and even exonerating an individual wrongfully convicted of a crime.
She pointed to a case in November, shortly before she spoke against the ban, in which a man was out running in Brookline, suffered a heart attack, went unconscious and was taken to the hospital, where he remained unidentified for multiple days.
He eventually recovered, but there were days where his family and friends were concerned about his whereabouts, according to Hatchett.
“We had no way to contact their family,” Brookline Police Lt. Paul Campbell said. “This ban prevents us from running the person’s face in a database, contacting their family and identifying the man.”
Brookline’s Town Meeting voted two years ago to form a committee to look at the challenges and benefits of surveillance technology and military-type equipment as they related to “the particular realities of Brookline,” according to Hatchett’s prepared comments.
“A lot of our work was to review this technology,” Hatchett said, adding that a member of the committee circumvented the process of studying the software by introducing the outright ban of the technology. “It was premature.”
The sergeant said she is aware of the facial recognition software’s inaccuracies, especially with women and minorities, and that is why she was in favor of a temporary moratorium that would enable officials to implement checks and balances on the technology so government agencies could use it in appropriate circumstances.
“We’re police, but we’re also members of the community. I don’t want a false positive. I don’t want a wrongful arrest,” Campbell added. “I don’t want something that’s going to be inaccurate, and I think most police officers would say the same thing.”
Hatchett argued before Brookline’s Town Meeting in favor of a “limited exception” that would allow law enforcement to use facial recognition with oversight from the town’s Select Board and require reporting of any use of the software. Law enforcement has applied the software solely to assist in cases, she said, and Brookline police would never use it to conduct widespread facial surveillance, as China does.
“There is an enormous difference between facial surveillance and the use of facial recognition technology – one is China, and one is using appropriate tools to conduct case specific investigations and community caretaking activities,” the sergeant said, according to her prepared remarks. “We are not China.”
Brookline police have, in limited cases, sent images of potential criminals to other agencies to attempt to match them against those agencies’ databases, including the state police’s Commonwealth Fusion Center, a criminal intelligence center that shares information among federal agencies and state and local governments.
In one case several years ago, video captured an image of a suspect in a series of residential burglaries, and Brookline police sent that image to the fusion center. Hatchett said she believes the suspect was ultimately caught in the process of another break-in, though.
Hatchett said one misconception about the software is that it is used by law enforcement in a “vacuum,” where a match is made after an image of a face is run in a database and someone is then directly arrested.
“Investigations aren’t like that. They’re multi-layered,” Hatchett said. “It wouldn’t be enough to suffice in court.”
She and Northampton’s police chief both said facial recognition is used solely as a tool that helps in investigations, not as a piece of evidence to provide probable cause for an arrest.
Campbell added that facial recognition gives law enforcement a list of possible suspects once an image of a person is run through a database. It also shows how closely each of those suspects matches the image, expressed as a percentage.
“It gives you information. It gives you a direction to go on,” the lieutenant said. “I think a lot of people who are opposed to facial recognition, they think that it just gives you one picture and that’s your guy.”
Others, though, see problems with facial recognition being used in the shadows.
Crockford at the ACLU of Massachusetts said that because facial recognition technology has been used in secret, the only reason the public knows anything about its use is because the ACLU has filed hundreds of public records requests across the state to learn how, if at all, the software is applied at the municipal and state level.
The group discovered local police departments were largely not using the technology but that state agencies, like the RMV and state police, had been using it without the public’s knowledge for 13 years.
“Nobody’s really minding the store,” she said.
MassDOT initially used the software to determine whether people were applying for state IDs and driver’s licenses under fake names, according to Crockford. But the ACLU also found that the RMV sent a memo to law enforcement in October 2006 alerting state police that the agency had the technology and could run images of criminal suspects to try to find matches.
The department confirmed it uses facial recognition through software that analyzes existing photographs provided to state police. Personnel are trained on how to incorporate facial recognition information into their investigations in accordance with state police policy.
“The RMV, with the help of Massachusetts State Police, has implemented an amazing Facial Recognition System that allows a digital image -- such as a license photo -- to be compared against the 9.5 million images in the RMV database to identify potential ‘matches,’ ” the memo said, according to the ACLU’s lawsuit. “State Police and RMV staff have been using this tool since May 2006 and have successfully identified many individuals who have fraudulently applied for multiple licenses or IDs.”
State police also use facial recognition technology in criminal investigations to help identify unknown suspects, according to David Procopio, spokesman for the department. Any potential identifications made with the software are then confirmed or rejected through other investigative methods.
“As with any investigative tool, we only use facial recognition software for legitimate, clearly-defined law enforcement purposes,” he said in a statement.
The ACLU also found through its public record requests to dozens of police departments in Massachusetts that a Cambridge-based surveillance company had been trying to get the Plymouth Police Department to deploy the company’s technologies in the town.
Hundreds of pages of emails showed that CEO Jacob Sniff of Suspect Technologies, a start-up backed by billionaire Mark Cuban, had worked to get Plymouth Police Chief Michael Botieri to install the company’s software on nearly 20 surveillance cameras in the town and then send recorded videos of people in the community to the company so it could better its technology’s algorithms, Crockford said.
Sniff also proposed the department give the company images of people wanted by Plymouth police so that any time one of those individuals were to walk by the camera, the department would get an alert, according to Crockford.
“More or less, the facial application will integrate and use existing cameras to scan everyone’s face who enters a lobby, and compare to a local list of people that have open warrants (usually 5-10% of any town),” Sniff said in one of his emails, according to the ACLU. “It will have different categories of warrants and the officers will take different actions, based on the different categories. We will send any visual alerts back to the dispatchers/officers in the lobby to review and decide whether to take action on.”
Suspect Technologies also acknowledged in its emails that the company’s software only works around 30% of the time, according to the ACLU.
The police department did not go through with the proposals, the ACLU said.
Sniff did not respond to multiple requests for an interview.
Crockford said cases like Plymouth’s are “disturbing” and underscore how vulnerable municipalities, which often know little about biometric surveillance and have legitimate fears about serious crime, are to marketing efforts by companies that offer facial recognition technologies.
“Absent a statewide moratorium, we see a real potential for local government officials being exploited, frankly,” Crockford said.
Facial recognition technology is a type of image recognition that falls under the larger category of machine learning, a branch of artificial intelligence in which computers analyze and try to classify large data sets.
A computer will be presented with millions of images and, ideally, be able to identify what is in each picture without the help of a human. Facial recognition is one step more specific: The software aims to look at human faces and assign each face an identity, according to White, whose company works with the technology every day.
Facial recognition is based on a “confidence system”: The software runs an image of a person against a database containing a large number of photographs and reports how closely the image matches each of them, expressed as a percentage.
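The confidence system described above can be sketched in a few lines of code. This is a toy illustration only, not any vendor’s actual algorithm: the embedding vectors, names and the 0.90 threshold below are all made up for the example. Real systems convert each face photo into a numeric vector with hundreds of dimensions and compare vectors with a similarity measure, such as the cosine similarity used here.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face-embedding vectors, roughly in [-1, 1];
    # 1.0 means the vectors point in exactly the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def rank_matches(probe, database, threshold=0.90):
    # Compare a probe embedding against every enrolled identity and
    # return (name, confidence) pairs above the threshold, best match first.
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in database.items()]
    return sorted((s for s in scores if s[1] >= threshold), key=lambda s: -s[1])

# Hypothetical 3-dimensional "embeddings" standing in for real face vectors.
database = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.5],
    "person_c": [0.4, 0.4, 0.7],
}
probe = [0.85, 0.15, 0.25]  # embedding of the face being searched for

print(rank_matches(probe, database))  # only person_a clears the threshold
```

The threshold is where policy debates like Amazon’s recommended 99% setting enter: lowering it returns more candidates, which raises the odds of a false positive; raising it can cause the system to miss genuine matches.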
“Up until recently, computers could replicate photos, could replicate videos, but they couldn’t understand them,” White said. “The nature of machine learning is not that some human has to teach a computer what a dog is.”
White, who testified in favor of the statewide moratorium on the software at the October legislative hearing, considers himself bullish about the technology in the long term but wants to press pause on its state and municipal use until the public becomes more educated on the topic, he said.
“It’s great that we want to rush into the next best thing, but we want to make sure we’re not harming anything,” White said. “The average person needs to level up their understanding of this before we knee jerk one way or the other.”
A recent study by the National Institute of Standards and Technology found that facial recognition technology produces higher rates of false positives for Asian and African American faces than for Caucasian faces, by factors ranging from 10 to 100. NIST reported particularly high false-positive rates for African American women.
MIT researcher Joy Buolamwini, the founder of the Algorithmic Justice League at the university, found similar results in her work. Buolamwini, who also testified in favor of the moratorium, ran more than 1,200 faces through recognition programs offered by Face++, IBM and Microsoft and found the technologies frequently misidentified women of color.
“In the worst case, the failure rate on darker female faces is over one in three, for a task with a 50 percent chance of being correct,” she said in her study, the Gender Shades project.
The ACLU also tested “Amazon Rekognition,” a facial recognition technology the company unveiled in 2016, against photos of 188 New England athletes. The test falsely matched 28 of them to mugshots in an arrest photo database.
Amazon claimed the ACLU misused the technology to make headlines and said the software can have positive applications.
“As we’ve said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking,” an Amazon Web Services spokesperson said in a statement. “We continue to advocate for federal legislation of facial recognition technology to ensure responsible use, and we’ve shared our specific suggestions for this both privately with policymakers and on our blog.”
Problems with misidentifications can sometimes be attributed to a small data set or poor image quality, according to White. Worries over inaccuracies in the facial recognition technology are well-founded, though, he added.
The CEO compared the software to DNA analysis. When DNA technology first started to develop, officials were initially wowed by it, but they later came to find the analysis was not always accurate.
“Turns out people were wrongfully convicted by DNA before,” White said. “The worry is the same for facial recognition.”
Cortex has worked with companies as high-profile as Toyota, using facial recognition technology to discover which company-made images and videos audiences prefer most. Compared to law enforcement’s use of the software, White said, there are no major drawbacks; the only consequence of the technology not working correctly is financial. “The stakes are dramatically lower,” according to him.
“If we fail and we’re wrong,” he said, “nobody dies, nobody goes to jail, there’s no high-intensity gunfight.”
White said he could still see benefits to government agencies using the technology once its accuracy improves, though.
“It sounds scary,” he said. “I think this is one of the things that could easily be polarized. There’s the obvious side, ‘We need to make ourselves more secure here,’ and then there’s the side, ‘We need to protect our privacy.’ ”
©2020 MassLive.com, Springfield, Mass. Distributed by Tribune Content Agency, LLC.