During the COVID-19 pandemic, state-supported groups and cybercriminals may target networks, websites and social media streams to disrupt information flow, deceive the public and interfere with legitimate government functions.
For many small nonprofit organizations and government entities, being the target of state-sponsored disinformation may seem like the plot of an action movie. Still, in today's dynamic global political and cyber environment, it is a genuine possibility. Public administrators must be aware that state-sponsored and state-supported groups and cybercriminals may target their networks, websites and social media streams to disrupt information flow, deceive the public and interfere with legitimate government functions.
National Public Radio’s Geoff Nunberg named “disinformation” the 2019 Word of the Year because of the concept’s rising impact on governance and society. Disinformation can be defined as “all forms of false, inaccurate, or misleading information designed, presented and promoted to intentionally cause public harm or for profit.” The concept is especially timely as the COVID-19 pandemic spreads around the world.
In just the first two weeks of March, the European Union's Commission on Strategic Communication and Information Analysis reported 110 discrete cases of coronavirus-related disinformation campaigns. These campaigns were aimed at disrupting national and local governments and promoting pro-Russian, pro-Chinese, pro-Iranian and pro-Islamic State agendas. In the month since, policy analysts from around the globe have identified daily examples of disinformation, and nearly a third of people surveyed in Argentina, Germany, South Korea, Spain, the U.K. and the U.S. reported having personally seen misleading information or disinformation on social media.
Disinformation is not only a national security problem; it also poses challenges to public-sector organizations, from small nonprofits to schools to local, state and federal governments. In the digital age, the primary means of delivering and spreading disinformation is through data systems and the Internet.
In and of itself, disinformation is not a new concept. From the Trojan Horse in ancient Greece to the Allied information campaign to conceal the actual location of the D-Day invasion from the Nazis during World War II, disinformation has been a tool employed by governments to gain political, diplomatic and military advantages for centuries. In today’s disinformation environment, campaigns by state actors and criminals are enabled by artificial intelligence and cyberattacks, with demographic data being particularly valuable to both parties. Controlling and manipulating data is at the heart of disinformation operations, as it directly translates into political and criminal power.
There are three types of disinformation that can impact governments and public-sector entities: deception, disruption, and interference. Each of these concepts has specific objectives and each method lends itself to particular propagation techniques.
Deception refers to the intent to influence public opinion by deliberately introducing false narratives. It can be achieved by doctoring online content, mimicking legitimate media or government communications, and hiding the true identities of the perpetrators. Disruption involves directly interfering with the inherent functions of government or society. Interference consists of attempts to undermine the sovereignty of a nation, most often by manipulating websites and social media content and replacing legitimate information with false information. The most dangerous form of interference is mixing legitimate speech and media reports with deliberately misleading statements and images taken out of context.
The Department of Homeland Security recommends several actions that can be taken at the local level to combat disinformation. First, public-sector organizations should create a prioritized list of themes that may be vulnerable to disinformation. Under the COVID-19 crisis, that will probably include topics like the virus’s impact on the demographics of the local community or the tension between maintaining a shelter-in-place mentality versus reopening businesses to stimulate the economy. Administrators need to be vigilant for social media and messaging narratives in their community that run counter to official government statements, especially now as state and local governments are planning to emerge from stay-at-home orders.
The response is a three-step process: hit the actor, hit the technology, and build resiliency. This methodology is based on the premise of promoting both information transparency and consumer literacy. For municipalities and local public entities, hitting the actor involves identifying social media posts that are directly impacting their agencies and refuting them with official statements and facts.
Once identified, administrators also need to report those posts to social media platforms for action. This allows the industry to hit the technology being used to propagate disinformation. Social media companies are using a two-pronged approach to fight disinformation: blocking content and providing vetted, alternative information alongside the fake information so users can see how data is being manipulated.
Building resiliency requires partnering with educational institutions and advocacy groups that can reinforce fact-based messaging. On the national level, the Department of Homeland Security is appealing to the patriotic duty of American citizens to discern disinformation in their media feeds. On the local level, partnerships with trusted civic organizations like rotary clubs and the Red Cross can help reach constituents and disseminate correct information.
Mayors, city managers, nonprofit managers, and other public administrators must be aware that disinformation is not only a national issue, it is a local problem as well. Protecting citizens and constituents from deception, disruption, and interference requires a deliberate campaign to monitor networks, websites and social media outlets, counter false narratives with transparency, and build societal resiliency. The relentless onslaught of disinformation will not end with the coronavirus. Local officials must start now to understand and counter false narratives in their jurisdictions.
Colonel Danielle Willis is an Air Force officer and a doctoral student in Public Administration at Valdosta State University. She holds a bachelor’s degree from the University of Michigan and master’s degrees from Oklahoma State University and Air University’s Air War College. The views expressed in this article are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. government.