
Privacy Groups Uneasy With Monitoring Students for Self-Harm

A new report says schools are making more use of programs that monitor student devices for clues of suicidal ideation and self-harm, despite concerns about student privacy and the efficacy of such programs.

The state of student mental health in schools has reached crisis levels during the pandemic, according to a statement this week from the American Academy of Pediatrics, American Academy of Child and Adolescent Psychiatry and Children’s Hospital Association declaring a "national state of emergency in children’s mental health." In response, some K-12 schools have turned to a growing array of software programs designed to monitor school-issued devices for clues of suicidal ideation among students.

Despite a growing need for K-12 mental health resources, a report from Student Privacy Compass, a website from the nonprofit think tank Future of Privacy Forum, suggests that the increased use of monitoring programs that track browsers for keywords relating to self-harm could do more harm than good.

According to the report, more than 15,000 schools use the monitoring program Securly and 10,000 schools use GoGuardian, the latter of which saw a 60 percent increase in users during the pandemic. In another June 2021 survey from the Center for Democracy and Technology, 81 percent of K-12 teachers reported using monitoring tools in their schools.

Anisha Reddy, policy counsel with the Future of Privacy Forum and one of the report’s authors, said she was unable to find “any independent research” into the efficacy of K-12 monitoring programs, which raises the question of whether schools themselves are the testing grounds for these programs.

The report noted that these programs sometimes flag keywords related to sexual orientation or gender identity, such as “gay” and “lesbian,” as possible signs of bullying, a practice that could expose LGBT students in ways the report calls "harmful, discriminatory or disparate," or otherwise invasive. Programs have also flagged discussions among students about media or music deemed suspicious, including the novel To Kill a Mockingbird, as well as keywords used in the context of research assignments.

Reddy said investments in these programs may be funneling money away from resources proven to help students, such as additional counselors trained in children’s mental health, while cultivating an atmosphere of distrust that could prove counterproductive.

“Introducing this kind of technology without that kind of support can only serve to potentially harm students,” she said. “Technology might not be the answer if you don’t have support for the students you identify.”

According to the report, most monitoring services rely on algorithms that detect keywords using simple natural language processing, with some employing additional forms of artificial intelligence to examine context.
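To illustrate the kind of keyword matching the report describes, consider the minimal sketch below. It is purely hypothetical and not based on any vendor's actual implementation; the term list, function names and examples are assumptions for illustration, and real products reportedly layer larger watchlists, additional AI and human review on top of this basic approach.

```python
import re

# Hypothetical watchlist of terms a simple monitor might flag.
# Real products use far larger lists plus context-aware AI and human review.
FLAGGED_TERMS = {"self-harm", "suicide", "hurt myself"}

def normalize(text: str) -> str:
    """Lowercase the text and collapse whitespace for naive matching."""
    return re.sub(r"\s+", " ", text.lower())

def flag_message(text: str) -> list[str]:
    """Return any watchlist terms found in the text.

    Plain substring matching has the weakness the report highlights:
    a research assignment or song lyric containing a flagged term is
    indistinguishable from a genuine cry for help without a human
    reviewing the surrounding context.
    """
    cleaned = normalize(text)
    return [term for term in FLAGGED_TERMS if term in cleaned]

# Both of these would be flagged, though only one may be concerning.
print(flag_message("I want to hurt myself"))             # ['hurt myself']
print(flag_message("Essay on suicide prevention laws"))  # ['suicide']
```

The false-positive behavior in the second example is why companies such as Gaggle describe routing flagged content to human reviewers before any outreach, as noted below.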

Jeff Patterson, CEO and founder of the student monitoring program Gaggle, said the company’s monitoring technology has saved "more than 1,400" student lives. He said the software only flags “very specific indicators that a child is at risk of suicide, self-harm or violence.”

“If our technology flags something concerning, we have a team of human reviewers who then analyze the content to determine context and decide whether or not the content merits outreach to the school district’s emergency contact,” he said in an email to Government Technology. “At the end of the day, we believe the value of saving a child’s life should outweigh any concerns about a child’s schoolwork being monitored for threats of violence or self-harm.”

The report said companies like Gaggle tend to market themselves as ways to prevent school violence, and Reddy argued that talk of safety and violence in the context of mental health issues can lead to unwanted police involvement — a key concern among mental health and disability rights communities.

Erica Darragh, a campaigner with the digital privacy organization Fight for the Future, shared these concerns and said schools should ban the use of monitoring programs, whether for self-harm or academic proctoring.

Drawing on a 2016 report by the American Civil Liberties Union that found 14 million students attended schools with police but no counselor, nurse, psychologist or social worker, Darragh said such programs could serve to criminalize students struggling with mental health issues and feed them into the “school-to-prison pipeline.”

“The potential for exacerbating harm, not only on the individual level but also the institutional level, is much greater than perceived positive outcomes,” she noted.

Linnette Attai, director of privacy initiatives at the Consortium for School Networking (CoSN), suggested that schools and policymakers keep these concerns in mind and clarify both their intervention policies and how these tools may be used.

"Before engaging with any technology company, establish the goal. What problem are you trying to solve? How will you solve for it while protecting your students from harm or other unintended consequences?” she said. “There's a line between threat monitoring and profiling or targeting. Document the policies and procedures that will keep you on the right side of that before moving forward.”
Brandon Paykamian is a staff writer for Government Technology. He has a bachelor's degree in journalism from East Tennessee State University and years of experience as a multimedia reporter, mainly focusing on public education and higher ed.