
CoSN 2026: When Legislators Talk ‘Safety,’ Schools Hear ‘Restriction’

As federal and state lawmakers push nearly 20 bills to protect children online, data privacy expert Linnette Attai warns of unintended consequences for student access and school operations.

CHICAGO — With nearly 20 bills circulating at the federal level aimed at children’s online safety, panelists at the Consortium for School Networking (CoSN) conference last week frequently discussed how to protect students in an increasingly digital world.

As lawmakers move to address the issue, some school districts find themselves at a critical crossroads between well-intended policy and complex classroom reality. Data consultant Linnette Attai, who also serves as project director for CoSN’s Student Data Privacy and Trusted Learning Environment initiatives, warned that while the recent surge in legislative activity stems from a genuine concern for children, the resulting laws often suffer from a fundamental misunderstanding of the K-12 environment.

“The simplest way to understand what’s changing is to understand that legislators are concerned primarily with individual safety online as well as privacy. And sometimes those things are getting conflated into legislation,” she said.


CONFLATED: SAFETY AND PRIVACY


According to Attai, when safety and privacy are treated as interchangeable, the resulting regulations can inadvertently restrict the tools schools use to function, creating downstream impacts on everything from lesson plans to vendor competition.

One of the most significant risks of the current legislative trend is the potential for “safety-first” policies to act as barriers to learning, Attai added. She pointed to the Kids Online Safety Act (KOSA) — legislation intended to protect minors online by requiring platforms to implement digital safeguards — as a prime example of how broad legal language can lead to over-restriction.

“The way the bill [KOSA] is written could actually result in students, young people in general, being restricted from accessing certain content that is otherwise lawful for them to access,” she said. “So in the guise of safety, we are in a situation where potentially we could be inadvertently unduly restricting the content that kids can access.”

Attai said proposed updates to the Children’s Online Privacy Protection Act (COPPA) would strengthen parental controls and tighten limits on how companies collect, use and share children’s data. Historically, COPPA has focused on parental oversight, but in school settings, districts typically serve as intermediaries, providing consent on behalf of parents. However, as new laws seek to empower teenagers with individual consent rights, Attai noted they could significantly complicate questions around permissions and access to ed tech.

“We really want the school district to be the gatekeeper. We really want the school district to be deciding what technology is used, provided that it’s designed in accordance with applicable laws,” she said. “So, if the teenager has rights to step in and consent, how might that upset the balance in the class?”


UNINTENDED CONSEQUENCES


Beyond the classroom, Attai said these regulations have the potential to carry heavy logistical and economic costs. She specifically noted that to satisfy safety requirements, many laws call for age verification — a process that requires schools and vendors to collect even more sensitive data on minors.

While tech giants have the legal teams to navigate a patchwork of state and federal laws, Attai said smaller ed-tech innovators may be squeezed out.

“The small and midsized companies are gonna be much more challenged, and that will have economic impact on their viability,” she said.

To navigate this landscape, Attai suggested a holistic framework built on three equal supports: privacy, security and safety, which she said “go hand in hand.”

“If they’re not all working full speed, then none of them are working,” she said. “And if all three are not possible, then it’s probably the wrong tech and the wrong implementation.”

Attai noted that while lawmakers and tech companies are vocal, administrators who actually run the schools are often missing from this conversation.

“School district leaders really need to make sure their voice is heard here, because everyone is talking about student data privacy except school districts,” she said. “What we’re seeing is a move to leverage conversations about screen time and overuse of screen time into proposed legislation that would limit screen time in schools to maybe an hour or two, which is a fundamental shift in how schools operate these days.”

Julia Gilban-Cohen is a staff writer for the Center for Digital Education. Prior to joining the e.Republic team, she spent six years teaching special education in New York City public schools. Julia also continues to freelance as a reporter and social video producer. She is currently based in Los Angeles, California.