
Could Regulating Social Media Companies Fix the Internet?

A congressional hearing last month took up the sticky issue of when and how to hold companies like Facebook, Google and Twitter accountable for misinformation. Lawmakers are now faced with a regulatory maze.

The U.S. Capitol Building. (Shutterstock/Colin Dewar)
On March 25, 2021, the House Subcommittee on Consumer Protection and Commerce and the Subcommittee on Communications and Technology held a joint hearing titled “Misinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation.” It addressed a wide range of issues relating to social media and misinformation and looked at how regulation could potentially help.

The major focus of the hearing was whether to revise Section 230 of the Communications Decency Act to create new oversight of social media companies. Representatives on both sides of the aisle stated support for a revision, though for different reasons: Democrats argue Section 230 shields platforms from accountability for the harmful content they fail to moderate, while Republicans argue it gives the platforms too much power to suppress speech.

Other experts in the digital space worry that such changes could have unintended negative consequences.

India McKinney, director of federal affairs for the Electronic Frontier Foundation (EFF), raised concerns in a conversation with Government Technology following the hearing. McKinney said EFF does not believe Section 230 needs to be changed, arguing that a revision could ultimately benefit Facebook, given its size and ability to litigate, while having an adverse effect on smaller competitors.

She also fears that too many restrictions could lead to corporate censorship, especially if the laws are vague. McKinney stated her belief that the law is balanced as it currently exists.

“It actually does a pretty good job of allowing platforms to moderate their own content without fear of nuisance lawsuits, and also still gives the individual users the right to hold other users accountable for the speech that they say online,” McKinney said.

McKinney proposed an alternative solution that would not have First Amendment implications: privacy legislation.

Much of what was discussed in the hearing centered on privacy and the management of personal data. McKinney argues that a private right of action could hold companies accountable by giving power back to users; she cites Illinois' Biometric Information Privacy Act as one strong example.

Still, misinformation is a major concern with big tech platforms. New Jersey Rep. Frank Pallone, chairman of the House Committee on Energy and Commerce, addressed the danger of this misinformation and the need to hold social media companies accountable in his opening statement. He cited a Pew survey finding that 30 percent of Americans are still hesitant to take the COVID-19 vaccine, and quoted Homeland Security Secretary Alejandro Mayorkas, who identified domestic violent extremism as the “greatest threat” to the United States.

“The time for self-regulation is over,” Pallone said.

McKinney, however, emphasized that any legislation regulating misinformation must account for free speech. She also noted the ambiguity of a term like misinformation, stressing the need to define it in a legal context that keeps it separate from hyperbole, satire, parody, political commentary and opinion.

Illinois Rep. Janice Schakowsky, chairwoman of the Consumer Protection and Commerce Subcommittee, also addressed the misinformation occurring on these platforms, emphasizing its impact on democracy. She acknowledged that platforms like Twitter, Facebook and Google have “fundamentally and permanently transformed our very culture.” Despite her belief that much of this transformation has been positive, she said that American democracy has also been harmed by the divisiveness of misinformation and extremism.

“The regulation we seek should not attempt to limit constitutionally protected free speech, but it must hold platforms accountable when they are used to incite violence and hatred — or as in the case of the COVID pandemic — spread misinformation that costs thousands of lives,” said Schakowsky.

Part of the issue, as addressed in the hearing, is that these platforms were built as private businesses, making increased engagement and profit their driving factors.

“Rather than limit the spread of disinformation, Facebook, Google and Twitter have created business models that exploit the human brain’s preference for divisive content to get Americans hooked on their platform, at the expense of public interest,” stated Pallone.

McKinney echoed this sentiment in her case for the private right of action, stating that it is something large platforms do not want because it could hurt their bottom line. While she understands lawmakers' concerns, EFF's position is that legislation is needed in other areas.

McKinney believes there is much to be done on antitrust and privacy to foster honest competition, and said that stricter privacy regulations and more transparency requirements would, among other things, give outside groups a greater ability to audit the platforms.

“That really changes the landscape of what [platforms] can get away with doing and that would really shift the dynamic of a lot of the decisions that they make, and I think that could be really interesting,” McKinney said.

Julia Edinger is a staff writer for Government Technology. She has a bachelor's degree in English from the University of Toledo and has since worked in publishing and media. She's currently located in Southern California.