How is Twitter cracking down on verified users who post negative content?

Answer: By revoking their verified badges and imposing new guidelines for users seeking verification.

On Wednesday, Nov. 15, Twitter announced via tweet that it had updated its qualification guidelines for verification on its platform, and that it would review currently verified users and revoke badges from any who do not meet the new criteria. The company had previously announced that it was pausing all general verifications pending the new rules.

"Verification was meant to authenticate identity & voice but it is interpreted as an endorsement or an indicator of importance. We recognize that we have created this confusion and need to resolve it. We have paused all general verifications while we work and will report back soon." — Twitter Support (@TwitterSupport) November 9, 2017
The social media giant’s blue verification badges were never designed as endorsements; they simply indicate that an account owner has proved they are in fact who they say they are. The badges have nonetheless come to carry that unintended meaning, which became a problem for Twitter last week when it faced significant backlash for verifying the account of an organizer of the Unite the Right rally in Charlottesville, Va., in August, which left one person dead and many injured.

Under the new rules, users will lose their verified status if their accounts display or promote hate, violence or violent imagery, harassment, or threats against others or themselves, or otherwise violate the existing Twitter Rules. The announcement amounts to an official acknowledgment by the company that many people view its verification badges as an endorsement.

Kate is a senior copy editor in Northern California. She holds a bachelor's degree in English with a minor in professional writing from the University of California, Davis.