
Policies Target Kids’ Access to Social Media, but Will They Work?

A rising number of state and federal lawmakers are crafting legislation that would restrict young kids' access to social media. But some policy experts worry that the bills will be difficult to enforce and may have unintended consequences.

(TNS) — A rising number of state and federal lawmakers are crafting legislation that would restrict young kids' access to social media and institute other protections for young social media users—all in the name of improving mental health.

But some policy experts worry that the bills—which are generating bipartisan support—will be difficult to enforce and may have unintended consequences.

"This is all new territory for Congress: how do you protect the First Amendment? How do you keep kids' autonomy online?" said Allison Ivie, the government relations representative for the Eating Disorders Coalition for Research, Policy and Action, which has been tracking this issue closely. She was referring to a bill recently filed in the U.S. Senate. "There is a level of frustration in this country when we see these levels of mental health problems skyrocketing, and people want a quick fix."

Many lawmakers who are parents and grandparents are seeing this problem play out in their own homes, said Ivie. And she suspects many adults expected kids' mental health issues to dissipate once they were back to learning full time in person.

"That didn't happen for a lot of kids," she said. "The damage had been done. So, now it's like, 'they're not the same kid—they're still glued to their phone, and I don't know what to about this.'"

Taken together, bills filed in at least nine states and at the federal level generally have three primary goals: compel social media companies to verify users' ages; bar social media companies from using algorithms to recommend content to young users; and restrict minors from using social media either through age requirements, parental permission requirements, or curfews and time limits.

Bills recently signed into law in Utah and Arkansas require social media companies to verify all users' ages and get parental consent for minors before they can set up accounts. The Utah law also requires social media companies to block minors from accessing their platforms from 10:30 p.m. to 6:30 a.m., grant parental access to minors' accounts, and limit the data that can be collected on minors.

Now, the U.S. Congress is wading in. A bipartisan group of senators has introduced a bill that would bar all children younger than 13 from having social media accounts, a restriction some social media companies already impose. It would also require teens 14 to 17 years of age to get their parents' permission before opening an account. And it would prohibit social media companies from using algorithms to recommend content to minors.

Social media is defined very broadly in that bill, introduced by Sens. Brian Schatz, D-Hawaii, and Tom Cotton, R-Ark. It's not clear which social media platforms would be most affected.

More federal legislation is likely ahead. Ivie said she expects a bill originally introduced last year by Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., called the Kids Online Safety Act, to be reintroduced.

And in a parallel effort, a growing number of school districts are suing social media companies over the harm they say these platforms are doing to kids' mental well-being.

SOCIAL MEDIA COMPANIES UNDER PRESSURE TO CREATE MORE PROTECTIONS


Social media companies have been under growing scrutiny recently for the way their products impact young users and collect their data. Meta, the parent company of Facebook and Instagram, in particular has faced strong backlash since a whistleblower in 2021 alleged that the company was sitting on extensive research into how its platforms—namely Instagram—hurt kids' mental health and did not take action to address those problems.

Meta, TikTok, and Snap, the company that owns Snapchat, all told Education Week in statements that their companies take youth safety and well-being seriously and are continuously developing tools to promote safe and healthy use of their products among their youngest users. Those protections include more parental control options, screen time management tools, and age-verification features.

"TikTok is committed to providing a safe and secure platform that supports the well-being of teens, and empowers parents with the tools and controls to safely navigate the digital experience," a TikTok spokesperson wrote to Education Week. "We strive to accomplish this through robust safety policies, parental controls, and age-appropriate account settings, which include automatically setting a 60-minute daily screen time limit for users under 18 years old, and disabling direct messaging for those under 16."

A Snap spokesperson said in a statement that, to reduce the spread and discovery of harmful content, the company relies on content from known creators and publishers and uses human moderation to review it.

"We also work closely with leading mental health organizations to provide in-app tools for Snapchatters and resources to help support themselves and their friends," said a Snap spokesperson.

"We refer to research, feedback from parents, teens, experts, and academics to inform our approach, and we'll continue evaluating proposed legislation and working with policymakers on these important issues," Meta told Education Week in a statement.

Many social media companies now provide tools for parents and teens, said Taylor Barkley, the director of technology and innovation at The Center for Growth and Opportunity at Utah State University, but he's not sure it's enough.

"I will say that companies were much too slow on the draw, that's my critique of them, he said. "I talk with parents and they have no idea that [social media companies have] resources."

Regulating social media companies in the name of youth mental health is one of the rare issues that both Democrats and Republicans can agree on, he said. But having built-in bipartisan support doesn't mean that it will be smooth sailing for these legislative initiatives.

Requiring all users to verify their ages, which will soon be the case in Utah and Arkansas, will likely be very unpopular and difficult to do, said Barkley. And he's skeptical the new measures in those states will do much to improve the mental health of teens and children.

"I just really haven't seen it from proponents of these bills, what effect they anticipate they will have on teen mental health," said Barkley. "Let's take Utah for example and Arkansas, what effects do we expect these bills to have on suicidal ideation, on depression, or anxiety rates amongst teens in the states. There is very little follow through on how will we know that these bills will be successful?"

The impact on schools may be limited, added Jeffrey Carpenter, a professor of education at Elon University who studies social media in education. He pointed out that social media is only one factor contributing to the mental health problems of today's youth, so there is a limit to what these restrictions can achieve.

ANOTHER PROBLEM IS THAT KIDS ARE GOOD AT FINDING LOOPHOLES IN NEW TECHNOLOGIES


"Resourceful young people who want to use social media typically have little trouble getting past already existing age limits and parental consent requirements," said Carpenter. "Many parents may not fully understand how social media platforms work, and what exactly it is they are consenting to."

The final wrinkle is that social media isn't always bad; it can provide a safe space for some kids, said Kelly Vaillancourt Strobach, the director of policy and advocacy for the National Association of School Psychologists. LGBTQ+ youth and kids who do not have an established religious or ethnic community where they live often find support on social media.

"That has been one of the benefits of social media, you can find communities of like-minded people on social media to supplement what you don't have in your actual community," she said. "I'm speaking as Kelly the parent, I want to make sure that I have some control and knowledge of what my kids are doing online, but at the same time are we inadvertently fueling more harm for certain populations that we know are already at greater risk for mental health issues?"

SHORING UP DATA PRIVACY PROTECTIONS IS A TOP PRIORITY IN SOME STATES


Additional legislation in a couple of states takes different approaches to social media restrictions for kids. North Carolina lawmakers are considering a bill that would shore up data privacy protections for children who use social media, and lawmakers in Massachusetts have proposed taxing social media companies to help pay for mental health services and programs for children. In Florida, a bill that directs the state's education department to develop an online safety curriculum is awaiting the governor's signature.

Carpenter said he would like to see future legislation focus on providing schools with more funding and resources to teach digital literacy skills and hire more mental health support staff.

But schools don't have to sit idly by, waiting to see whether the new legislation and lawsuits succeed. They can teach students digital literacy skills that help them use social media responsibly and protect themselves online. And they should begin teaching those skills in elementary school, some experts argue.

"We need to be in kindergarten starting to have conversations about how technology can be great and social media can be great, but here's how to use it responsibly," said Vaillancourt Strobach.

©2023 Education Week, Distributed by Tribune Content Agency, LLC.