Facebook Oversight Board Upholds Decision to “Deplatform” Trump

In an online webinar hosted by the Aspen Institute, Facebook Oversight Board members offered insight into their decision to uphold Facebook's ban on former President Donald Trump. Facebook must now decide within six months whether to make the ban permanent or restore the account.

In a landmark decision yesterday, the Facebook Oversight Board upheld Facebook's decision to ban former President Donald Trump from its platform over concerns that several of his posts fanned the flames of the Jan. 6 insurrection at the U.S. Capitol. However, the board's decision comes with a catch: Facebook must decide whether to permanently ban Trump or restore his account within the next six months.

To further examine the case, the Aspen Institute hosted an online webinar earlier today with four oversight board members and two policy and law experts to discuss the decision and what comes next.

The webinar began with moderator Vivian Schiller, executive director of Aspen Digital, clarifying the board’s role in relation to Facebook and what led to the ban.

The board, Schiller said, is funded by an independent trust and supported by an independent company that is separate from Facebook. With this separation, board members can impartially review the company’s decisions, including removing posts or individuals from its platform, Jamal Greene, one of the board's co-chairs, said.

As for the post that led to the ban: on Jan. 6, during the Capitol riot, the former president posted a video to Facebook in which he told rioters to go home while continuing to perpetuate the false narrative that the 2020 election had been "stolen." Facebook deleted the video shortly after it was published and banned Trump the following day.

Now, per the board’s request, Facebook will take the next six months to create rules for influential users such as politicians and other highly followed individuals and decide whether to ban or restore Trump’s account permanently.

“Facebook has a responsibility to protect public safety,” oversight board member Julie Owono said.

But several concerns were raised during the webinar, including Facebook’s decision not to answer some of the board’s questions about how its technology and algorithms are used to disseminate information to its users.

“This is one of my personal concerns,” board member Ronaldo Lemos said. “If you look into a certain piece of content and realize that it’s part of a massive disinformation campaign that is exploiting specific responses, that is not your typical citizen exercising free speech.”

As such, he said, declining to show how a piece of content interacts with the platform's algorithms can obscure whether deception is at work.

So why not require Facebook to answer these questions?

The board, Lemos said, can't definitively make decisions on Facebook’s behalf.

“A lot of people that look to the Oversight Board have criticisms regarding its structure and demand that the board has more power to enable rules and basically interfere with Facebook’s activities,” Lemos said. “That is a typical type of criticism we hear all the time, but the limited powers that the Oversight Board does have over Facebook is quite a lot.”

Those powers, he said, include reviewing whether posts should be left up or taken down, as well as ruling on whether Facebook's decisions to remove posts or individuals should stand.

As for what’s next for Facebook, Henry Olsen, a Washington Post columnist and a senior fellow at the Ethics and Public Policy Center, said the company needs to decide whether it will create policies that allow free speech or take the company down a different path.

“If you really believe in the American First Amendment, you need to have policies that reflect that,” Olsen said. “Political speech carries risks. I don’t think limiting people you don’t like who have influence is the appropriate way to govern a platform that seeks to be devoted to free speech.”

“If you start saying this person is too hateful or this person is too influential, that ultimately becomes something in the eye of the beholder, and then we no longer have a platform that is dedicated to free speech,” he added. “You will have a platform that will either be bandied about by political purposes depending on the pressures of the day to suppress the opponents of the people with political power, or you will have a partisan entity that seeks to suppress speech that it disagrees with.”

The risk with the latter option, he said, is that it could result in Facebook becoming a publisher rather than a platform that supports free speech.

“If Mark Zuckerberg wants Facebook to become a publisher, then he should act as any private publisher and understand Facebook in those terms,” Olsen said. “Otherwise, free speech needs to be at the forefront of their decision.”
Katya Maruri is a staff writer for Government Technology. She has a bachelor’s degree in journalism and a master’s degree in global strategic communications from Florida International University, and more than five years of experience in the print and digital news industry.