Facial Recognition Software on the Rise in U.S. Schools

As school officials across the country worry about how to stop the next mass shooting, biometric technologies and expanded surveillance systems have become attractive alternatives to traditional security procedures.

The specter of mass shootings has pushed school administrators across the country to consider investment in an array of new and emergent security technologies that have been sold as potential solutions to head off these tragic incidents. 

Chief among these new tools is facial recognition, a technology that has recently exploded to prominence in many other sectors of society. 

Large cities like Chicago and Detroit, frequently courted by companies, have seen a recent push toward widespread adoption, while school districts in states as diverse as Florida, Texas, Missouri, and Colorado, among others, are also seeing investment. 

As the technology becomes more ubiquitous, some schools have embraced it wholeheartedly in the hopes of improved security, while others are taking a more cautious approach, slowed by concerns for privacy and accuracy. 

Improving Security  

One place where the technology has been welcomed with open arms is Putnam City School District in Oklahoma. 

Covering a significant swath of Oklahoma City, as well as several smaller neighboring cities, Putnam already has an extensive security system: more than 800 cameras are installed across 30 school buildings spread over some 43 square miles, said Mark Stout, the district's chief of police.

Still, improvements are always being sought, he added.  

The district began looking into the facial recognition market in early 2018. After selecting Israeli vendor AnyVision, equipment was installed during the late months of that year; officials then ran the cameras through a period of testing that lasted four to five months — with a heavy emphasis on rooting out any potential for gendered or racial bias, Stout explained.  

While the system is still relatively new, district administrators feel it adds a layer of sophistication to the security processes already in place. Coupled, for instance, with strategically placed metal detectors and Genetec-powered access control devices, which allow officials to remotely lock down certain parts of a school, the new cameras should help security staff quickly identify and isolate threats, officials hope. 

Also important is the product's "watchlist" feature, which helps security officials archive and identify certain students who have been suspended, do not belong on school grounds, or may pose some sort of threat. While some schools have seen backlash over this feature, Stout said the public has been receptive to it as a key security function. 

At the same time, the software is also moving closer to accurate object recognition, which would help security personnel identify "someone with a rifle, or a long gun, or a handgun," Stout said. This future capability would greatly advance the ability to minimize threats, he added.

On the whole, then, the system has been viewed as a success by administrators and the community, Stout said. 

"We feel very confident with the system," he said. "We haven't really run into any kinks with the software, or with the system. And it's been fairly well received by the public." 

Putting the Brakes On

The technology's deployment has not gone so smoothly everywhere. 

This year the Lockport City School District in upstate New York had planned to deploy a facial recognition system at its schools, with the explicit hope of keeping certain unwanted adults, particularly sex offenders, off of school grounds. The district's $1.4 million purchase of an AEGIS system was funded through a large grant from the Smart Schools Bond Act, which allocated $2 billion in state funds to improve security at New York schools. 

However, the district's additional plan to use the software to archive suspended students sparked public outcry. Negative press coverage, lobbying by the state's ACLU chapter, and mounting public concern eventually called the district's plans into question. A New York assemblymember, Monica Wallace, also introduced legislation that would have temporarily banned the technology from being used at the state's schools.    

In May, the state's Department of Education (NYSED) asked Lockport to delay its testing of the AEGIS facial recognition function; the object recognition function, which identifies items like weapons, was allowed to continue.  

Wallace, speaking with Government Technology, said that while she doesn't believe in banning the technology outright, a large-scale adoption of biometric surveillance without public scrutiny and an adequate regulatory framework is problematic.   

"All these school districts keep wanting to purchase more and more of this software. They're being visited by salespeople who are suggesting that this is such great technology and they should be spending millions of dollars on it, all without any kind of discussion as to what they should be looking for in terms of accuracy, cost-benefits and what risks are involved," Wallace said. 

In addition to concerns about potential racial or gender bias built into the technology, data security and privacy were also primary concerns, Wallace noted. 

Wallace's bill failed to pass during this year's legislative session, a fact she attributes more to a crowded legislative calendar than to a lack of political will. Still, she said, introducing the bill helped push a broader conversation on the subject. 

Wallace said she is "hopeful" that the bill will pass next session, though she noted that other changes ordered by the state, such as NYSED's temporary ban in Lockport, could supersede the need for such legislation.  

Costs and Benefits  

The security benefits of facial recognition systems seem to come in a variety of shapes and sizes, but so do the related ethical concerns. 

In the case of Lockport, activists condemned the system's capacity for overreach: it would be able to archive the identities of the people it records, then go back to track and analyze their movements throughout the school over a 60-day period. Other schools have seen controversial uses of the tech as well, such as UC San Diego's use of the technology to predict student engagement, or the case of the University of Colorado professor who covertly used it on his students in a $3.3 million military-funded experiment. 

In addition to the issue of invasiveness, however, the question of whether this kind of surveillance carries unintended social or psychological side effects also remains unanswered. 

Jon Penney, a research affiliate at Harvard University, studied online search habits before and after the mass surveillance revelations by former NSA contractor Edward Snowden, finding a consistent downturn in certain types of online behavior once the covert programs were revealed. 

Those results, Penney posits, are consistent with a "chilling effect" on a range of online behaviors, including willingness to speak, engage, and share. Increased carefulness, a tendency to take fewer risks and be less adventurous, is another frequent result of surveillance, he said.

How this might translate to a school setting, Penney said, he couldn't be sure, though he believes his findings could apply to a host of different surveillance situations.  

One thing is for sure, Penney said: surveillance tends to become normalized after a while. "People are upset, but then they get used to the idea of being watched. I think the challenge is that often surveillance effects are a lot more subtle and a lot more unconscious than people realize."  

Lucas Ropek is a former staff writer for Government Technology.