California’s New Child Privacy Law Could Become National Standard

The first-in-the-nation legislation imposes sweeping restrictions on Internet companies that serve minors, requiring that they design their platforms with “well-being” in mind and barring eight common data-collection practices.

A child looks at the TikTok app on an iPad. (Shutterstock)
A new California privacy law might fundamentally change how kids and teens use the Internet — not only in California but also across the country.

The first-in-the-nation legislation, which goes into effect in 2024, imposes sweeping restrictions on Internet companies that serve minors, requiring that they design their platforms with children’s “well-being” in mind and barring eight common data-collection practices.

Supporters of the bipartisan measure — including a range of privacy, consumer and children’s advocates — have compared it to longstanding consumer safety protections, such as seatbelts and nutrition labels. New York, Pennsylvania, Washington and West Virginia also have weighed child privacy bills, and Congress considered four such bills last year. While the Washington and West Virginia bills died in committee, the New York, Pennsylvania and federal bills remain under consideration.

Whatever happens in Congress, the California law may yield new protections nationwide. Because it can be difficult for technology companies to apply different rules to users in different places, some might adopt new privacy protections across their entire footprint, some privacy law experts told Stateline — and not just for children, but for adults as well. The state has long been a leader in online privacy protections, pioneering laws that other states adopt later.

Advocates say a similar law in the United Kingdom, used as a model for California’s legislation, yielded global changes on major platforms including YouTube and Facebook. Shortly before that law went into effect, for instance, YouTube made teenagers’ uploads private by default and disabled an autoplay feature that nudged them to keep watching longer.

“Overall, it’s imposing a duty on platforms to keep kids’ safety in mind,” said Irene Ly, the policy counsel at Common Sense Media, a nonprofit that provides media recommendations to families and that backed the California law.

But tech industry groups strongly opposed the California legislation and could still sue to block it. Even advocates have acknowledged that portions of the act are overly vague, leaving major questions about how companies will comply when it goes into effect. A group of experts mandated by the law is developing guidelines on implementation.

One requirement — that sites and apps estimate the age of child users with a “reasonable level of certainty” — has alarmed some Internet freedom and consumer groups, which say it might force companies to collect more personal data. Eric Goldman, a professor of law at Santa Clara University School of Law and a prominent commentator on tech policy, has called the act a “Trojan horse” and a “poison pill.”

“We understand that many parents and policymakers are seeking to protect young people,” said Jennifer Huddleston, the policy counsel at NetChoice, which represents technology companies including Amazon, Google and Meta, which owns Facebook and Instagram. “[But] we find these kinds of top-down regulations often create barriers, not only for the future of innovation, but for parents … trying to figure out the best solution to address their own family’s values.”

‘ESCALATED’ DANGERS


Federal law already protects children’s privacy, but with some significant limitations.

Passed in 1998, with only a handful of updates since, the Children’s Online Privacy Protection Act, known as COPPA, requires that sites or services aimed at children aged 12 and younger request parental consent before collecting a child’s personal information.

The California law goes much further, expanding the definition of “child” to include all minors up to 18 years old and encompassing any service that is “likely to be accessed” by children, whether the service targets them or not. News sites, navigation apps and online retailers all could have to comply with the law, in addition to mobile games, social media networks and education technology platforms, such as those often used in schools.

“It takes a substantially different approach than the leading federal framework,” said Chloe Altieri, a policy counsel with the think tank Future of Privacy Forum. “Under COPPA, it’s more obvious what services and features are used by children. But a 16- to 18-year-old uses anything on the Internet, at this point.”

The California law also prohibits online services from selling children’s personal data or tracking their location, with some exceptions. Privacy settings must be strict by default, and privacy policies explained in child-friendly language.


Digital companies also must audit their products and features on a regular basis and record any potential harms to children, such as notification settings designed to hook them on a product or algorithmic systems that surface dangerous content. Companies must then work to “mitigate or eliminate” those risks, the law says.

Policymakers intensified their scrutiny of these types of features last year, said Common Sense’s Ly, after former Facebook product manager Frances Haugen leaked internal documents that revealed new details about the platform’s negative impact on kids. The company’s own research showed, for instance, that its Instagram app exacerbated suicidal ideation, eating disorders and body image issues for teen girls.

“Platforms rolled out some changes to things like parental controls after that,” Ly said. “We’ve always felt these are useful tools for parents, but the companies themselves can take much bigger action.”

Violations of the California act will be punishable by fines of up to $7,500 per affected child.

But the law’s effects also could extend outside California, privacy experts said. In 2021, shortly before a similar British law went into effect, major companies including Google, Meta and TikTok introduced new global protections for young people.

YouTube, TikTok and Instagram, for instance, all changed privacy settings for some teenage users to make their accounts or uploads private by default. Google also turned on SafeSearch for all users under 18, while TikTok, Instagram and Snapchat disabled direct messages between children and unknown adults. Children can still change many of these defaults, if they choose, and adults posing as children can still communicate with them.

Many Internet services also strengthened their privacy policies for all users after the European Union adopted new data regulations in 2016, said Washington state Sen. Joe Nguyen. Nguyen, a Democrat who co-sponsored a child privacy bill in Washington this year, formerly worked as a program manager at Microsoft overseeing European privacy compliance.

“Some companies will totally, 100%, apply [the act] outside of California,” he said. “That’s just how technology works — it’s a pain to do different things for different jurisdictions.”

The Washington state bill failed when lawmakers couldn’t agree on whether to include a private right of action, which would allow individuals to sue web services, Nguyen said.

Lawmakers in other states aren’t waiting to see how tech companies respond to the requirements in the California law. New York state Sen. Andrew Gounardes, a Democrat, introduced a bill in September similar to the California law, with stiffer penalties of up to $20,000 per violation.

“Some of the dangers of unregulated social media have obviously been around for a while,” Gounardes said. “But I think the dangers only escalated in the light of the pandemic and the fact that young people are spending more time on their phones and other devices.”

In September, Pennsylvania lawmakers proposed a bill that would impose new privacy protections on third-party tech vendors working with K-12 schools, such as banning them from targeting ads to students, selling student data or compiling personal data profiles for non-education purposes. West Virginia lawmakers also considered a privacy bill last session.

President Joe Biden called on Congress in his 2022 State of the Union address to pass a federal Internet privacy law, saying “it’s time to … demand tech companies stop collecting personal data on our children.”

But while the California legislation enjoyed strong bipartisan support — a Republican and a Democrat co-sponsored the bill, and it passed both chambers on unanimous votes — federal child privacy bills are stalled, said Müge Fazlioglu, the principal researcher at the International Association of Privacy Professionals. Federal lawmakers have instead focused on comprehensive privacy laws, which tend to be more controversial.

“There’s always more sensitivity about children’s data,” Fazlioglu said. “But it is fair to say that Congress has been paying attention.”

THE PARADOX OF KIDS’ PRIVACY LAWS


Even the law’s staunchest supporters acknowledge, however, that it has flaws.

Among other issues, lawmakers declined to define key terms in the legislation, leaving tech companies and their legal teams to puzzle out concepts such as “material” harm to well-being or what it means to be “likely to be accessed by children.” In comments to the California Assembly, a coalition of industry trade groups said they would need far more guidance from the state attorney general to interpret the law’s more subjective requirements.


Some consumer and civil rights advocates also have objected to a provision that requires websites and services to differentiate between children and adults “with a reasonable level of certainty,” or else provide children’s protections to everyone. The act’s backers argue that many companies could fulfill that requirement with data they already collect, such as the types of accounts or content a user engages with.

But there are concerns that companies could resort to more invasive methods of age verification — such as asking every user’s birthdate, linking individual profiles to data broker records or requiring users to submit photos to screening software that estimates their age.

Instagram adopted one such approach in June: Now, if a user changes her age from under 18 to over 18, she must first submit a government-issued ID, a “video selfie” for facial analysis, or corroboration from three adult followers.

“The tricky part is that [age verification] puts a perverse incentive on companies to collect a lot more data on people — that’s a paradox we run into a lot on kids’ privacy legislation,” said Justin Brookman, the director of technology policy at the nonprofit Consumer Reports, which neither supported nor opposed the law. “It’s well-intentioned but counterproductive.”

Instead, Brookman advocates for privacy laws that grant stronger protections to both kids and adults — one way around the age estimation conundrum.

Tech companies will get some clarity on these and other issues by January 2024, when a working group of experts mandated by the law releases further guidelines on implementation. That is, if the legislation is not challenged first: NetChoice, which has sued to block social media regulations in other states, declined to comment on possible litigation.

Technology companies would prefer a policy approach that emphasizes consumer education, Huddleston said. Families have different ideas of what “well-being” entails — and tech companies have introduced several new privacy features in the past few years.

“Industry has already stepped up to provide parental control tools and privacy options,” she said. “We want to empower consumers to make their own choices … as opposed to a regulatory approach, which can be very static.”

NetChoice plans to fight New York’s child privacy bill, as well.

This article was originally published by Stateline, an initiative of The Pew Charitable Trusts.