RSA Conference Ends on Call to Action on Information Disorder

Former CISA Director Chris Krebs, Color of Change President Rashad Robinson and journalist Katie Couric discussed the societal threat of dis-, mis- and malinformation as the RSA Conference concluded last week.

Chris Krebs speaks at the RSA Conference 2022.
Jule Pattison-Gordon
The RSA Conference packed a wide range of cybersecurity lessons and insights into its four days of panels, but one issue took center stage in the final keynote capping off the event: information disorder.

Former director of the Cybersecurity and Infrastructure Security Agency (CISA) Chris Krebs, Color of Change President Rashad Robinson and journalist Katie Couric came together for a panel that dug into their work as co-chairs of the Aspen Institute Commission on Information Disorder and what’s come since. The team released a report in November 2021 that highlighted how disagreement over facts prevents the nation from coming together to tackle common, pressing problems and leaves the U.S. open to manipulation by foreign and domestic malicious actors. The speakers sounded an alarm and issued a call to action.

“What we’re dealing with right now with information disorder, which impacts all other issues, is … to make sure that the truth actually has a fighting chance,” Robinson said.

The topic was front and center that day, with the Jan. 6 congressional committee slated to hold its first public hearing just hours after the panel, to discuss the disinformation-fueled assault on the U.S. Capitol.

Information disorder refers to disinformation, misinformation and malinformation and their pernicious effects on societies. Disinformation refers to falsehoods spread deliberately, while misinformation refers to falsehoods spread inadvertently by someone who believed the content to be true. CISA defines malinformation as factual information “used out of context to mislead, harm, or manipulate,” something Krebs said can include hack-and-leak campaigns.


STEPS FORWARD IN TRANSPARENCY


The Commission on Information Disorder published 15 recommendations for government, civil society and private companies, and there has been progress on some of them in the seven months since the report’s launch.

For one, proposed legislation would require social media firms to share data with researchers, creating more transparency into the platforms’ workings. Krebs also praised the federal government’s quick moves to declassify intelligence that would help counter potential Russian disinformation efforts during the ongoing invasion of Ukraine. Declassification is taking just a few hours — a significant leap over the monthslong process when Krebs joined CISA in 2018, or the 27-hour timeline later reached under the Trump administration.


ROLE OF SCHOOLS


State governments and schools have a role to play, too, and their policies and practices can help chip away at the problem or exacerbate it.

Robinson said malicious actors trying to disrupt society through disinformation campaigns often work to identify and then amplify tensions among communities. These efforts may aim to “sow division” as well as undermine important processes and programs like elections and public health efforts, the report said.

The report notes that some disinformation campaigns aim to instill division by targeting one racial or religious group with false claims about another. Other campaigns take aim at misinforming and manipulating a particular population.

For example, Lawyers’ Committee for Civil Rights Under Law President and Executive Director Damon Hewitt noted in a May hearing that disinformation efforts intended to discourage Black individuals from voting by mail played on fears stemming from police misconduct and memories of the Tuskegee experiment.

Failing to acknowledge and address long-running issues like systemic racism and the history of racism in the U.S. leaves these tensions among communities unaddressed and makes the country ill-prepared to combat disinformation efforts that would exploit them, Robinson said.

Robinson’s comments come at a time when many states have sought to limit teachers’ abilities to discuss race, gender and systemic inequality. Between January 2021 and May 2022, 42 states introduced bills or took “other steps” to this effect, per Education Week. The Washington Post notes that the laws that have passed often create confusion and concern around what is permissible, producing a chilling effect on fully teaching history and current events.

“As we think about even some of the big debates that we are facing in this country — whether or not we even teach Black history, American history, in our schools — we have to think about these questions as security questions,” Robinson said.

Along with teaching a more complete accounting of history, schools can also help counter information disorder by preparing students to detect unreliable content. Couric suggested introducing media literacy as early as elementary school to teach students how to assess the trustworthiness of a piece of content.

(Left to right): Moderator Hugh Thompson speaks with panelists Katie Couric, Rashad Robinson and Chris Krebs at the RSA Conference 2022.

SOCIAL MEDIA REGULATION


Government will also need to take a firmer stand against social media firms, which have often sought to avoid responsibility for the false and misleading information spread on and amplified by their platforms, Robinson said. Instead, platforms often suggest users are to blame for creating demand by engaging with the untrustworthy posts they encounter.

Platforms’ profit models incentivize promoting attention-generating content regardless of its accuracy, leaving little motivation to change behaviors unless new government regulations force it, Robinson said. And “conflict content” tends to rack up plenty of clicks.

The Commission on Information Disorder report took aim at Section 230 of the Communications Decency Act, which shields platforms from being sued in civil court over the user-created content they host. The report called for removing this protection in cases where the content is a paid advertisement and for ensuring platforms are not shielded from liability for actions they take to promote and amplify user content, such as through their recommendation algorithms.

Panelists emphasized that these proposals would not run afoul of the First Amendment.

“We’re certainly not advocating for any regulation of the content itself necessarily,” Krebs said. Focus is instead on “the mechanism. It’s about optimizations, the incentive structures.”

And making such changes happen means government needs to create consequences for bad behavior that are significant enough to get social media firms’ attention, Robinson said.

DISINFORMATION BOARD AND LOCAL ENGAGEMENT


Some government efforts to counter information disorder have faced headwinds, and one federal effort to fight disinformation recently stalled after itself becoming a target of disinformation and misinformation.

The Department of Homeland Security (DHS) launched a Disinformation Governance Board working group this spring. The Board was intended to help coordinate DHS agencies’ efforts to counter any disinformation that impacts homeland security, while also ensuring these initiatives protect civil liberties and rights, freedom of speech and privacy, according to DHS and former Board leader Nina Jankowicz.

But the Board’s launch was met with some public confusion over its purpose and purview. That soon gave way to fierce pushback from right-wing voices that mischaracterized the group, incorrectly claiming that it would conduct censorship. Jankowicz faced violent threats and the Board’s future became uncertain. Within three weeks of the Board’s launch, DHS put it on pause and Jankowicz resigned.

Krebs pointed to this event as an indicator that government has a role to play but cannot combat information disorder on its own.

"My takeaway is that the government is not going to save us. It is not an issue that — like you saw a couple of months or so ago — with the DHS Disinformation Governance Board. These are the sorts of things you can deal with that can get weaponized, and it ultimately comes down to us in our communities to engage,” he said.

Efforts to support local news media and build community are also essential, he said. The pandemic has increased social isolation, leaving more people immersed in online echo chambers where fringe theories can flourish, and disconnected from the local communities that traditionally act as reality checks.

“If you go to a kids’ soccer game, and if you were to say something like — oh, I don’t know, ‘JFK Jr. is still alive,’ the other parents would look at you like you’re crazy. But the echo chambers and the filter bubbles allow for that sort of information to propagate,” Krebs said.

Jule Pattison-Gordon is a senior staff writer for Government Technology. She previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.