Facial Recognition Furor Hits University of Colorado

The Colorado Springs campus is in the midst of a media firestorm over a professor’s $3.3 million government research project that is helping the American military identify and thwart terrorists.

(TNS) — A recent media firestorm, which a University of Colorado at Colorado Springs professor says was overblown and inaccurate, drew attention to his $3.3 million government research project, which he said is helping the American military better identify and thwart terrorists.

Professor Terrance Boult, El Pomar Chair of Innovation and Security at UCCS, said he was surprised that people thought his military-funded study of long-range facial recognition was in the public domain, as reported by some media.

“The data is only available under a restricted-research license,” he said, meaning only signed license holders may obtain the information. “Part of the response is, ‘But it’s on the internet.’ My response is, ‘But it’s not.’”

In what’s known as the UnConstrained College Students Dataset, Boult secretly photographed thousands of campus students, employees and visitors on public sidewalks at the university in 2012 and 2013, using a hidden camera set up in a building nearly 500 feet away.

The research sought to improve software so that facial recognition technology performs more accurately at greater distances.

“We want to make sure the men and women in the fighting forces have the best ability to protect themselves,” Boult said. “We want to be able to recognize car bombers or vest bombers as they approach from far enough away to stop them if they’re on a watch list.”

Critics contend that the technology can threaten the First Amendment by invading privacy. Boult counters that the research and the data accumulated are not in the public realm.

“Some of the people making those comments are posting next to their Facebook photos,” he said, “so it’s sort of like they’re worried about it, but they have all these photos on Facebook.”

While he said he understands concerns about data privacy, his research did not include collecting anyone’s name or identification.

“The data was never given to the government, nor do we think they ever accessed it,” Boult said. “They paid us to do the research, and much like cancer research, they did not get all of the material.”

Boult runs the Vision and Security Technology Lab at UCCS, where he and students work on security projects that include machine learning, computer vision, surveillance and biometrics.

The facial recognition project, initially funded by the Office of Naval Research’s Multidisciplinary University Research Initiatives Program and later by other government entities, began in 2007. The work was part of a consortium of campuses that included the University of Maryland, the State University of New York at Buffalo, Columbia University, Rutgers University, Carnegie Mellon University and the University of Texas at Dallas.

To test whether the algorithms the UCCS team developed were working, Boult set up surveillance cameras inside a university building; the resulting images became one of 10 data sets used to evaluate the software.

Over several months in 2012 and 2013, the cameras captured about 16,000 images of 2,400 people on the sidewalk.

Of those, more than 1,700 were considered “matched identities” — meaning the same person was snapped more than once — and of good enough quality to be used in the study. The project also included a competition for researchers to see if they could correctly match the faces using the developing software.

Boult said researchers did not know the subjects’ names or identities or try to determine them. And the photos were released to the research consortium five years after they were taken, he said, when most students would have graduated.

The UCCS Institutional Review Board, which ensures “protection of the rights and welfare of human subjects in research,” analyzed the protocol, campus spokesman Jared Verner said in a statement.

“The photographs were collected in public areas,” he said. “No personal information was collected or distributed in this specific study.”

The university is “committed to the principle of academic freedom and the ability for faculty to study and research a variety of topics,” the UCCS statement continued. “The university takes the privacy of its students seriously as outlined in the Family Educational Rights and Privacy Act.”

The purpose of the surveillance was to capture people in a more natural state, rather than posing because they knew they were being photographed, Boult said. He said he is aware of the need to balance the research goal against the risk of mistakes in which “somebody gets interrogated by the police because we’re not doing it right.”

Nothing like that happened with the project, he said.

“If someone thought we were collecting this data with students’ names to put in an FBI database, for example, that would be concerning,” Boult said. “That’s not what we’re doing.”

The Financial Times first wrote about the research in April.

The First Amendment protects photography in public places when it is communicative, meaning there is a message to be communicated and an audience to receive it, said Lata Nott, executive director of the First Amendment Center at the nonpartisan Freedom Forum Institute in Washington, D.C.

Boult’s research is interesting, Nott said, because while photography is protected if it’s an expressive act, “Is developing an algorithm an expressive act?”

The project also has potential Fourth Amendment implications, she said. That amendment bars the government from unlawfully searching places where people have a reasonable expectation of privacy, an expectation that is not guaranteed on a public sidewalk. She noted that debates over new technology are appearing in court decisions, including rulings that authorities need search warrants to retrieve personal cellphone photo data.

Facial recognition technology, in general, can threaten First Amendment rights, which protect the freedom to associate with other people anonymously, by “undermining privacy,” she said.

Facial recognition enables individuals to be tracked and identified for commercial and consumer applications, law enforcement and foreign policy, said Klon Kitchen, senior fellow for technology and national security for The Heritage Foundation, a conservative think tank in Washington, D.C.

Cameras are everywhere in public places, he said, leading to the key question: What type of privacy expectations do we have in public?

It’s an issue that has not been litigated clearly, Kitchen said.

The city of San Francisco last month became the first American city to ban facial recognition by police and other agencies, citing the potential for misuse.

Boult said his photographic method of data collection would not have been allowed in Europe, where strict laws govern photographing others in public.

The American military uses facial recognition to hunt targets and provide “greater reliability than human verification,” Kitchen said. “It minimizes the likelihood of false positives.”

Image recognition software on drones can warn convoys of roadside bombs and, in noncombat settings, can identify people who need assistance during disaster relief, Kitchen said.

©2019 The Gazette (Colorado Springs, Colo.). Distributed by Tribune Content Agency, LLC.