
FETC26: Approach School Surveillance Tech With Skepticism, ACLU Says

A policy advocate from the American Civil Liberties Union warned FETC attendees last week that fear-based marketing and limited empirical evidence are driving district adoption of student surveillance tools.

ORLANDO — From social media monitoring to artificial intelligence-powered weapons detection systems, many K-12 districts have invested in technology promising to promote student safety. But some critics say these types of surveillance tools are being marketed through fear-driven claims rather than proof of efficacy, putting student privacy, learning and overall well-being at risk.

Leading a panel at the Future of Education Technology Conference (FETC) in Orlando last week, Chad Marlow, senior advocacy and policy counsel at the American Civil Liberties Union (ACLU), urged education leaders to critically assess how ed-tech surveillance tools are marketed, implemented and evaluated, outlining how this specific type of tech may introduce unintended consequences for students and schools.


WHAT IS STUDENT SURVEILLANCE TECH?


Student surveillance tech, according to Marlow, is presented to districts as a way to detect risk early and prevent harm. However, he said these tools vary widely in how they operate and what they monitor.

Most systems operate by identifying early signs of abuse or risky behavior, he said. For example, tech used to monitor students’ online activity “scans [and] scrapes targeted social media accounts ... looking for certain words and phrases that have been entered as potentially being of concern.”

When a monitoring system flags suspicious terms or phrases, it sends alerts to school officials or the vendor, prompting further review or intervention, he said.

Weapons detection software, on the other hand, is designed to analyze surveillance camera feeds and alert officials if a camera picks up something that the software recognizes as a potential weapon. Marlow noted that these tools often rely on AI to assess images in real time.

He expressed skepticism and concern about the reliability of these systems, though, citing real-world misidentifications and failures. For example, he said, in October 2025, an AI-powered gun detection system used by a Maryland high school wrongly identified a student’s bag of chips as a firearm, prompting an armed police response at the school.

Marlow also described a broader ecosystem of surveillance tools used in schools, including video monitoring and proctoring that use webcams to track students’ actions and behaviors; web filtering tools that scan what students explore online; and behavioral detection technology that analyzes a student’s facial expressions, body language and other behaviors to ascertain their emotional state and predict future actions.

SURVEILLANCE ED-TECH MARKETING TACTICS


While Marlow said efforts by districts to improve student safety are well-intentioned, he critiqued the efficacy of these tools, arguing that they often lack verifiable research that demonstrates they work as vendors claim they do.

In a 2023 report, the ACLU illustrated how tech companies often sell student surveillance tech by listing major safety concerns for school districts, then making claims about the technological precision and effectiveness of their respective tools. Marlow said vendors promote two interrelated narratives: first, that schools face urgent and escalating safety risks, and second, that surveillance technologies are the most effective — or only — way to address those risks.

According to Marlow, vendors frequently emphasize worst-case scenarios, like mass school shootings, to reinforce a sense of urgency. He said the industry’s emotional messaging affects him, too.

“I will tell you, as a parent, even though I am steeped in information about this, these narratives hit me,” he said. “They make me afraid. I know they make school officials afraid.”

Once fear has been established, Marlow said, companies shift to promoting their products as proven interventions. The problem, according to the ACLU, is that perceived threats to student safety are extremely rare, or at times nonexistent.

“The ed-tech surveillance industry is trying to get us to focus purely on what we feel and not what we know,” Marlow said.

Part of that strategy, he continued, involves exaggerating the accuracy of the tools themselves, as a dearth of hard evidence has led companies to market their products with success metrics that lack substantiation. He cited an example in which a vendor claimed its school security software helped save “1,562 students” from suicide during a single academic year.

“That number is a scam,” Marlow said.

He also pointed to the use of isolated success stories, or anecdotes, as evidence of impact.

“If these companies are monitoring tens of millions of students a day, then of course, when you’re doing that, you’re going to be able to generate a handful of success stories,” Marlow said. But without context — including failures and harms — those stories can be misleading.

Finally, Marlow said industry trade groups have worked to reduce financial barriers to adoption by lobbying for public funding tied to school safety. As a result, some products are offered at low cost or for free.

“If these things are such low cost, why not take the risk?” Marlow said, summarizing the argument districts are often presented with. But, he added, “these pools of money only lower the financial cost of their products. The other costs of their products remain significant.”

RECOMMENDATIONS FOR DISTRICTS


Beyond questions of efficacy, Marlow stated that surveillance can alter school culture and affect student behavior. For example, he noted that tools sold as responses to serious problems, such as violence and self-harm, are often used to enforce minor policy violations like vaping, raising concerns about scope and proportionality.

“Student surveillance not only undermines these rights, it teaches students to fear that others will attempt to unveil their most private thoughts and actions and punish those they object to,” he said.

With this in mind, Marlow encouraged districts to approach purchasing decisions with caution and skepticism, relying on diverse information sources and resisting fear-driven narratives. He urged leaders to consider the broader opportunity costs of spending on surveillance when other interventions or supports may be more effective, and drew a distinction between perceived and actual safety.

“Remember that just because something is free doesn’t mean it has no costs,” he said. “Feeling safer is very different than actually being safer.”

Julia Gilban-Cohen is a staff writer for the Center for Digital Education. Prior to joining the e.Republic team, she spent six years teaching special education in New York City public schools. Julia also continues to freelance as a reporter and social video producer. She is currently based in Los Angeles, California.