Misinformation Plagues Government. Can Technology Help?

Tech companies are now creating tools to help government find and fight misinformation online. One startup, Logically, explains how its new platform Logically Intelligence can root out dangerous content.

A screenshot of Logically Intelligence, a new platform for finding and responding to misinformation and disinformation.
It’s hard to overstate the profound impact misinformation has had on the world in the past several years. Foreign countries have used it in attempts to influence U.S. elections. It’s led people to reject masks and vaccines, two of the most powerful tools the world has to fight the COVID-19 pandemic. And it played a big part in the Jan. 6 storming of the U.S. Capitol.

And it’s not going away. In fact, it’s gotten to the point that tech companies are seeing an opportunity to sell the government software it can use to identify and fight misinformation and disinformation online.

Logically, a United Kingdom-based startup, just launched a new platform called Logically Intelligence in an attempt to fight misinformation. It’s a combination of situational awareness for elected decision-makers and tools for lower-level public officials to quickly respond to harmful rumors and lies swirling around the Internet.

An electoral battleground state in the 2020 election — Logically won’t say which one — used the platform to identify about 40,000 pieces of information it considered to be harmful to election integrity or public health.

Logically Intelligence ingests millions of pieces of content from a wide array of websites, including Facebook, Twitter, blogs and fringe social networks such as Parler. Algorithms — and sometimes a team of human analysts — sift through that content to flag possible misinformation or disinformation, which is done by comparing it against a baseline truth as determined by the company.
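The article doesn’t detail how that comparison works internally; as a rough illustration only, the idea of screening a post against a baseline of vetted claims can be sketched like this (all names, claims and thresholds here are hypothetical, and a real system would use far more sophisticated matching than simple string similarity):

```python
from difflib import SequenceMatcher

# Hypothetical baseline: vetted claims mapped to whether they are true.
BASELINE = {
    "masks reduce the spread of covid-19": True,
    "the 2020 election results were falsified": False,
}

def screen_post(text: str, threshold: float = 0.6) -> str:
    """Compare a post to baseline claims and triage it.

    Returns "flag" if it closely matches a claim known to be false,
    "clear" if it matches a claim known to be true, and
    "needs_review" otherwise -- the cases a human analyst would handle.
    """
    text = text.lower()
    best_score, best_verdict = 0.0, None
    for claim, verdict in BASELINE.items():
        score = SequenceMatcher(None, text, claim).ratio()
        if score > best_score:
            best_score, best_verdict = score, verdict
    if best_score < threshold:
        return "needs_review"
    return "clear" if best_verdict else "flag"
```

The triage split mirrors the workflow Jain describes below: most posts resolve automatically against the baseline, and the remainder fall through to human review.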

“In a geopolitical context and a COVID context, around 75 percent of those claims can be fully automatically fact-checked, and 25 percent of them require human intervention,” said Lyric Jain, CEO of Logically.

The platform also seeks to speed up the response to such content. It has tools to request that social media platforms take down certain posts, as well as other actions such as placing a warning or context label on posts.

That response needs to be as fast as possible, Jain said. And while traditional channels can take hours or days, the company hopes it can use its status as a trusted intermediary to achieve that speed.

“If the response isn’t immediate, if it seems like, at least optics-wise, there’s any uncertainty in response, or if there’s a vacuum where a narrative is allowed to go unchallenged, that’s the space where the most harm occurs,” he said. “And it’s really hard to convince people once they’ve been convinced of that narrative.”

The company is also experimenting with the idea of using networks of credible, influential people as another means of combating misinformation.

“We thought this kind of influence through reliable, authentic influencers with reliable, factual communications could be a good conduit where people who trust each other already are able to get those messages across, as opposed to an institution, be it a government or a fact-checking organization which might not be perceived in the best light … it might be that that individual has anti-government tendencies or they might feel that every fact-checking organization is a liberal conspiracy,” he said.

In the meantime, the company also gives communications tools to government. That includes a library of pre-existing fact-check resources and investigations, so that public officials have factual information at hand if they want to put out a statement.

There are also tools for visualizing who is responsible for spreading information and the channels through which it’s amplified.

Logically is not the only one looking to put these tools into the hands of government. A U.S. company, AlphaVu, recently started offering tools specifically targeting misinformation about COVID-19, which have been used by the Virginia Department of Public Health.

Ben Miller is the associate editor of data and business for Government Technology. His reporting experience includes breaking news, business, community features and technical subjects. He holds a Bachelor’s degree in journalism from the Reynolds School of Journalism at the University of Nevada, Reno, and lives in Sacramento, Calif.