Can Section 230 Reform Advocates Learn from Past Mistakes?

Congress is considering a flurry of proposed revisions to Section 230 of the Communications Decency Act, but some experts say reforms must be nuanced and carefully researched to avoid unintended consequences.

A Brookings Institution panel warned yesterday that Section 230 reform efforts being considered by Congress could have unintended and harmful side effects — unless lawmakers learn from past mistakes.

Revising Section 230 of the Communications Decency Act is an idea that often surfaces in conversations about making the Internet safer from hate speech, disinformation, criminal activity and other harms. The policy essentially protects online platforms from most civil liability over their decisions to leave up or take down user-posted content, meaning complainants would instead need to sue the user who posted it.

Federal lawmakers introduced more than 20 Section 230 revision proposals between January 2021 and March 2022. But Brookings fellow and Lawfare Senior Editor Quinta Jurecic said some of these proposals appear to repeat strategic missteps seen in earlier efforts to alter Section 230.

Section 230 reforms enacted in 2018 to combat online sex trafficking ultimately produced few, if any, tangible wins. The policies — the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA) — inadvertently made investigating such crimes more difficult, because activity shifted to overseas sites that were harder for U.S. law enforcement to pursue, and exposed consenting adult sex workers to more violence.

These are lessons for today’s legislators to consider as they look to use Section 230 reform to tackle issues like COVID-19 misinformation and child sexual abuse.

“FOSTA is a cautionary tale for what could happen if you fiddle with 230 without really thinking too carefully about what the effects might be,” Jurecic said.

She and other panelists recommended that lawmakers avoid vague regulations and instead carefully define and frame the problems they want to solve, consult the communities that would be impacted, and then craft clear, nuanced policy.

UNEXPECTED SIDE EFFECTS


FOSTA created exceptions to Section 230, making platforms liable if they help sex traffickers advertise or if they “unlawfully promote or facilitate prostitution.” A similar policy, SESTA, lifted liability protections for “knowingly” enabling or supporting material promoting involuntary sex trafficking or sex trafficking of children.

The well-meaning effort had the side effect of prompting many websites and forums that were related to willing sex work — but not to forced sexual exploitation — to shut down, including platforms that sex workers used to screen out dangerous customers and share safety tips with each other.

Panelists said FOSTA was too vague, prompting platforms that the legislation didn’t intend to target to nonetheless shutter to avoid risk of liability.

Part of the issue is that policymakers appear not to have consulted all of the communities likely to be impacted by the policy — in this case, sex workers.

Jurecic wrote in a recent report that several current proposals run a similar risk. The Health Misinformation Act of 2021, for example, would create an exception to Section 230 by holding platforms liable when their algorithms amplify misinformation about public health emergencies. The Department of Health and Human Services would define what counts as misinformation.

The COVID-19 pandemic has demonstrated, however, that officially released public health information and guidelines can change quickly. Platforms concerned that they won’t be able to keep up with sorting the latest facts from misinformation may decide to simply ban most health-related content rather than risk penalties, Jurecic wrote.

Unless the definition of misinformation is clear and precisely targeted, cautious platforms might shut down a wide array of conversations, including social media groups where people with disabilities turn to discuss medical advice with each other, she added.

RESEARCH FIRST


Panelists also said better research is important, and two federal bills could help establish the base of knowledge needed to inform thoughtful Section 230 revisions by providing insight into how online platforms work.

The Platform Accountability and Consumer Transparency Act (PACT) would require platforms to explain their content moderation approaches and regularly report on content they’ve removed or “deprioritized.” The Platform Accountability and Transparency Act (PATA) would obligate social media companies to let independent researchers view more data, including about user targeting.

GETTING SPECIFIC: WHAT DO WE WANT?


Lawmakers seem quick to regard Section 230 reform as a panacea for all kinds of Internet ills. But effective changes require first spelling out what online spaces would ideally look like, who that vision benefits, what stands in its way and how a policy change would overcome those obstacles, Jurecic said.

“There's just a sort of a general sense that, ‘the Internet is bad and I don't like it, and I want to fix it,’” she said. “Very rarely do we define carefully what ‘bad’ means or what ‘fixing’ means or what a fixed-up place would look like.”

Social media platforms are used as community spaces, and so their governance ideally should be informed by research and philosophy around community building and restorative justice, not just by legal concerns, said Kate D’Adamo, a partner at Reframe Health and Justice and a sex workers’ rights advocate.

The diversity of online spaces and communities means no single set of rules will be appropriate and effective across the board, so there also need to be mechanisms for keeping platforms responsive to their particular user bases.

Danielle Citron, a law professor and director of the University of Virginia’s LawTech Center, similarly said legislation should require firms to take “reasonable steps” to stop dangerous illegal activity. What counts as “reasonable” would need to be defined and should depend on the size and focus of the platform, with small startups and global platforms held to different expectations.

She called for imposing a “duty of care” on social media platforms that would prevent them from solely pursuing profit and instead obligate them to take into account potential harms to users and the public — or else be held negligent.

Jule Pattison-Gordon is a senior staff writer for Government Technology. She previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.