
Can Tech Companies Be Liable for Teen Social Media Issues?

California legislators will renew discussion over a bill to penalize Facebook, Snapchat and other large companies for the algorithms and other features they use to keep minors on their platforms for as long as possible.

(TNS) — Jaimie Nguyen’s use of Instagram started harmlessly enough in the seventh grade. There were group chats to schedule meetings with her volleyball team. She had fun searching for silly, sports-related memes to share with friends.

But pretty quickly, Nguyen, now 16, began spending a large portion of her weekday evenings scrolling through Instagram, TikTok or YouTube. She sought validation from people liking her posts and became caught up in viewing the endless loop of photos and videos that popped into her feeds, based on her search history. Disturbingly, some posts made her think she could look better if she followed their advice on how to “get thinner” or develop rock-hard abs in two weeks.

“I was eventually on Instagram and TikTok so many hours of the day that it got super addicting,” said the junior at San Jose’s Evergreen Valley High. Over time, she found it hard to focus on homework and became increasingly irritable around her parents.

Experiences like this — a teenager spending increasing blocks of time online with potentially harmful consequences — are at the center of a national debate over whether the government should require social media companies to protect the mental health of children and teens.

As soon as Aug. 1, California legislators will renew discussion over AB2408, a closely watched bill that would penalize Facebook, Snapchat and other large companies for the algorithms and other features they use to keep minors like Jaimie on their platforms for as long as possible. The bill passed the Assembly in May, and an amended version unanimously passed through the Senate Judiciary Committee on June 28.

Experts and industry whistleblowers say these companies knowingly design their platforms to be addictive, especially to young users, contributing to a growing crisis of youth depression, anxiety, eating disorders, sleep deprivation, self-harm and suicidal thinking. The bill would allow the state attorney general and county district attorneys to sue major social media companies for up to $250,000 if their products cause addiction.

The tech industry opposes AB2408 for a number of reasons. The bill offers an “oversimplified solution” to a very complex public health issue, said Dylan Hoffman, executive director for California and the Southwest at TechNet, a trade group of technology CEOs and senior executives. Many other factors, he said, affect teen mental health.

But Leslie Kornblum, formerly of Saratoga, doesn’t buy the idea that there was no connection between her 23-year-old daughter’s teenage bouts of anorexia and her immersion in “thinfluencer” culture on Instagram and Pinterest. Her daughter, who is now in recovery, was inundated with extreme dieting tips, such as how to fill up on water or subsist on egg whites, Kornblum said.

Meta, the parent company of Facebook and Instagram, faces a growing number of lawsuits from parents who blame the social media sites for their children’s mental health struggles. In a lawsuit filed in U.S. District Court in Northern California against Meta and Snapchat, the parents of a Connecticut girl, Selena Rodriguez, said her obsessive use of Instagram and Snapchat led to multiple inpatient psychiatric admissions before she died by suicide in July 2021. Her parents said the platforms didn’t provide adequate controls for them to monitor her social media use, and their daughter ran away when they confiscated her phone.

The debate over AB2408, known as the Social Media Platform Duty to Children Act, reflects longstanding tensions between tech companies’ ability to grow and profit and the safety of individual users.

A U.S. Surgeon General advisory issued in December called on social media companies to take more responsibility for creating safe digital environments, noting that 81 percent of 14- to 22-year-olds in 2020 said they used social media either “daily” or “almost constantly.” Between 2009 and 2019 — a period that coincides with the public’s widespread adoption of social media — the proportion of high school students reporting sadness or hopelessness increased by 40 percent, and the proportion contemplating suicide increased by 36 percent, the advisory noted.

AB2408 is similar to bills recently proposed in Congress as well as in other states. Assembly member Jordan Cunningham (R-San Luis Obispo) said he co-sponsored the bill with Buffy Wicks (D-Oakland) because he was “horrified” by growing evidence, notably from Facebook whistleblower Frances Haugen, that social media platforms push products they know are harmful.

“We’ve learned that (social media companies) are employing some of the smartest software engineers in the world — people that two generations ago would have been putting people on the moon, but who are now designing better and better widgets to embed within their platforms to get kids hooked and drive user engagement,” said Cunningham, a father of three teenagers and a 7-year-old.

But TechNet’s Hoffman said AB2408’s threat of civil penalties could force some companies to ban minors from their platforms altogether. If that happened, young people, especially those from marginalized communities, could lose access to online networks they rely on for social connection and support.

Moreover, Hoffman argued that AB2408 is unconstitutional because it violates the First Amendment rights of publishers to select the kinds of content they share and promote to their audience.

Cunningham’s rebuttal: AB2408 has nothing to do with regulating content; the bill targets “the widgets and gizmos manipulating kids’ brains,” he said.

Jaimie Nguyen was able to pull back from social media, thanks in part to her parents expressing concern. But she could only do so by removing Instagram and TikTok from her phone. Now, it’s up to legislators to decide whether the government should step in.

Says Cunningham, “There’s nothing in the 50 states or federal code that says you can’t design a product feature that knowingly addicts kids. I think we need to change that.”

© 2022 Silicon Valley, San Jose, Calif. Distributed by Tribune Content Agency, LLC.
