
Opinion: A School Psychologist’s Guide to Digital Resilience

To support students facing mental health stressors in the digital age, school leaders must explain features like “data mining” and “engagement algorithms,” and give kids chances to develop social skills offline.

I remember the first wave. For me, it began nearly 15 years ago with the introduction of social media into young people’s daily lives. At the time, little attention was given to safeguards or developmentally appropriate protections. As educators, parents and mental health providers scrambled to respond to “fires,” our focus was on triage. In doing so, we missed critical opportunities to advocate for proactive, prioritized care.

At the high school where I worked, we saw the emergence of “FOMO,” the early days of cyberbullying, and the shift toward online identity curation. Today, as a school psychologist and mental health specialist, I see us entering a second wave that is swifter and more complex. This time, however, we have the benefit of hindsight. We know the risks that come when rapid changes outpace the rights and protections for our students.

Young learners today are navigating a landscape that is no longer about “turning off the TV.” It’s an environment of unregulated algorithms and human-simulated bots. There is no forgive-and-forget in an always-on digital world. Moments that time once softened are now permanently captured, searchable and retrievable. AI doesn’t create this reality; it simply hyperscales it. And while there’s certainly room for pessimism, my school hallways reveal a story of resilience, and an urgent need for a united front among educators, families and state leaders to protect our students.

THE NEW DIGITAL TREND: FROM GAMES TO MALADAPTIVE FEATURES


While “being online” once meant educational games and carefree curiosity, the landscape has shifted with the rise of maladaptive features. We’re no longer talking about “mean comments,” but algorithms meticulously designed to keep children scrolling by feeding them increasingly extreme content. I have seen students search for “study skills” only to find their feeds populated, within two weeks, with videos of self-harm linked to academic pressure.

Scarier still is the rise of human-simulated connections. Nearly two-thirds of teens (64 percent) report using chatbots, and about 30 percent do so daily, according to a December 2025 Pew Research Center survey, raising urgent questions about how often students are practicing conversation, reflection and decision-making with machines rather than people. This creates a connection of “nothingness”: a student investing emotional energy in something that doesn’t exist. We are serving a uniquely vulnerable generation, shaped by the reckless experiment of unregulated social media and the lasting effects of COVID-19. If we fail to prioritize human connection, AI will easily fill that void.

THE FIRST CONVERSATION: PROCESSING TRAUMA IN A DIGITAL AGE


When a student comes to me distressed by something they saw online — whether real-world violence or AI-generated trauma — the approach remains rooted in trauma-informed care. The first conversation isn’t about the technology; it’s about the person. My priority is creating a safe, private space where the student can describe what happened in their own words. I use a framework that prioritizes what I call “The Big Three”:

  • Feelings: What did you feel the moment you saw it? (Normalize fear, confusion or anger.)
  • Thoughts: What went through your mind?
  • Behaviors: Have you noticed changes in your sleep or daily life? Are you avoiding certain things or acting out of character?

If a student experiences trauma as a result of an AI-generated video, that trauma is no less real than if it were caused by an actual event. As providers, our skill set hasn’t changed, but our vocabulary must evolve. We need to name features like “data mining” and “engagement algorithms” to help students see the man behind the curtain and regain their sense of agency. Students need language to identify digital dependency, misplaced trust in AI outputs and the influence of automation bias. Naming these dynamics is a critical step toward healing.

BUILDING A SYSTEM OF STABILITY


Schools cannot constantly be in crisis mode. To support students, we must build a foundation of universal supports that promote hope and belonging. When a school operates with predictable routines and strong relationships, it becomes a safe place that offsets the volatility of the Internet.

Schools should prioritize tiered supports by using universal screeners to identify students experiencing significant stress or anxiety. We can also bridge the gap by hosting joint sessions, or “crossover events,” where legal authorities, mental health providers and families use the same language regarding cyber safety. Anonymous reporting is another vital tool for creating a culture where students feel comfortable flagging concerns about their peers’ well-being. When schools have a clear process for investigating these reports, they can address issues that occur outside the classroom but still affect the community.

RECLAIMING CONFLICT AS A SKILL


One unspoken trend that worries me is how digital spaces can phase out the healthy conflict necessary for growth. Algorithms are designed to show us what we like, and AI bots are programmed to be infinitely agreeable. But in the real world, relationship-building requires navigating disagreements.

When students spend too much time in these frictionless environments, they lose the muscle memory for face-to-face resolution. We must remind our learners that peer disagreement isn’t a “system error” but a vital part of becoming a resilient human being. Digital wellness means having the social stamina to handle the messy, non-algorithmic parts of being human.

THE SCHOOL AS A HUB


There are many unsung heroes in this movement. While school psychologists handle the emotional fallout, educators and library media specialists are on the front lines of academic integrity, helping students distinguish between using AI as a brainstorming partner versus a replacement for thought.

When schools act as a hub, bringing together media experts, mental health providers and families, this second wave becomes less daunting. For example, AI can be a powerful tool for students who struggle with writing or social nuances. By promoting positive AI use for improving employment opportunities, like crafting cover letters, we remove barriers while highlighting the skills students already possess.

The goal isn’t to banish the digital world, but to cultivate wellness, safety and balance. I’ve sat in difficult meetings with parents about search histories; they are uncomfortable conversations. But when educators and families face the conflict head-on, we find solutions. We can’t filter the whole world for our students, but we can help them develop the internal filters they need to navigate it with confidence.

Kay Kelly is a mental health clinical services specialist at the teletherapy company eLuma and an experienced school psychologist whose expertise spans diagnostic assessment, counseling, intervention and advocacy.