As Midterms Loom, Congress Fears Domestic Disinformation

Witnesses testifying in a recent congressional hearing said domestic sources are playing a strong role in driving online falsehoods that undermine faith in elections and inspire real-world violent attacks.

Federal lawmakers are looking to learn more about combating mis- and disinformation as midterm elections approach. Domestic sources have emerged as the greatest perpetrators of falsehoods, said several witnesses during a July 27 House hearing.

“ISD research suggests domestic disinformation targets Americans at a higher volume and frequency than foreign campaigns,” testified Jiore Craig, head of digital integrity at the Institute for Strategic Dialogue (ISD), a think tank that analyzes extremism.

Domestic actors can be particularly convincing. For example, some social media ads tout election falsehoods while featuring trustworthy-sounding organization names and, without permission, displaying images of trusted public figures, Craig said.

“Much domestic disinformation is well-resourced, references real-world people and events, and deliberately uses social media product features like targeted advertising, recommendation systems and ‘explore’ feeds that are opt-in by default to seed disinformation,” Craig said in written testimony.

U.S.-based perpetrators can also lay the groundwork for foreign actors by creating a climate ripe for manipulation.

“Research also suggests that the domestic sources of disinformation enable the success of many foreign efforts as both rely on exploiting cultural tensions and partisan rhetoric,” Craig said.

Most mis- and disinformation around the 2020 election results was generated or spread by U.S. residents, rather than foreign nations, said Renée DiResta, research manager at the Stanford Internet Observatory, where she studies the “spread of malign narratives across social networks.”

Inaccurate narratives advanced in several ways during 2020. In some cases, high-profile individuals pushed falsehoods online. In others, regular people posted on social media looking for answers and then saw their posts amplified and woven into unsupported narratives about fraud, DiResta said.

“One of the challenges to taking action is that most of what we saw in the 2020 election was not what is commonly known as ‘coordinated inauthentic behavior.’ It was not foreign interference, and it was not large networks of fake accounts,” DiResta said.
Jiore Craig, head of Digital Integrity at the Institute for Strategic Dialogue, testifies about disinformation.
The rising popularity of Instagram, TikTok and other social media platforms characterized by influencer culture also raises concerns about micro influencers who may seed falsehoods, Craig said. Micro influencers are foreign- or domestically controlled social media accounts with small but dedicated followings. The accounts may primarily post on unrelated topics, such as food or music, and become trusted parts of a community, while occasionally slipping false claims about elections into posts for unsuspecting audiences.


As midterms approach, election officials and law enforcement are preparing for possible threats, said John Cohen, adjunct professor in Georgetown University’s Security Studies Program, in his written testimony. These include potential violence at polling places and other public election-related locations, which may be spurred on by false narratives; threats against election officials and their families; voter intimidation; cyber attacks; disinformation and efforts to destroy or tamper with ballot drop boxes and other voting equipment.

The elections are occurring in a national climate marked by mass shootings that online conspiracy theories help propel, Cohen testified. Perpetrators tend to be angry, disconnected individuals who are often drawn to extremist online communities, where they can find social contact, a cause toward which to channel their anger and justify their violence, and help in planning attacks.

“Fueling these acts of violence is an online and media ecosystem that is saturated with conspiracy theories and other content purposely placed there by foreign and domestic threat actors including foreign intelligence services, international terrorist groups, domestic violent extremists, and criminal organizations,” Cohen said. He noted that such attacks have been targeting highly trafficked public areas, law enforcement and other groups based on their politics, faith, gender, race or ethnicity.

“Today, the United States is confronting a threat environment that is the most volatile, complex and dynamic of any that I have experienced in my career,” which included “close to 40 years of experience working on law enforcement, homeland security, intelligence and counter-intelligence issues,” Cohen said.

Making a difference will take a mix of efforts, Cohen said: improving the public’s media literacy skills so residents can assess the reliability of information they encounter, supporting community-based violence prevention programs, and reconsidering how law enforcement can learn about and track nascent violent threats while still respecting free speech.


Craig also urged steps to curb disinformation in the long term. It’s easy to get caught up in disputes over whether a particular piece of content is false or not, she said, but stakeholders should prioritize efforts to change the ecosystem that has been enabling disinformation to spread and thrive. That takes getting insight into social media platforms’ workings, including how they amplify content and the funding behind the advertisements they show.

“The way social media companies make business decisions about what to show Americans around elections has a significant impact on how successful disinformation campaigns are in reaching Americans,” Craig said.

But outsiders can rarely view details about “who is paying for the content, who is making money from a person engaging with content, or whether the accounts sharing content are neighbors, foreign citizens, or fake accounts,” Craig said.

DiResta similarly urged Congress to mandate that social media and other platforms increase transparency and give civil society and researchers more access. She also proposed the White House take steps like creating “clear standards for consistent disclosures of mis- and disinformation from foreign and domestic sources as a core function of facilitating free and fair elections.”

Other efforts to help reduce the damage of dis- and misinformation, DiResta said, can include more public awareness campaigns to raise understanding of the threat, government and civic society efforts to push out reliable information and collaborations between technology and research sectors on identifying emerging falsehoods and uncovering “disinformation networks.”
Jule Pattison-Gordon is a senior staff writer for Government Technology. She previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.