An exciting project by the National Conference on Citizenship is making the internet a safer place for truth.
April 26, 2020
Anyone who’s spent time on social media lately has probably seen old high school classmates hawking miracle cures for COVID-19 or distant relatives boosting conspiracy theories about the origins of the coronavirus. With a public health and economic crisis that’s changing by the minute, sifting good information from bad is like navigating a monsoon. And because there’s so much experts don’t know about the virus, false and unfounded information is filling the void.
The World Health Organization (WHO) recently described this situation as an “infodemic,” and even a reasonably media-literate person may find the line between fact and fiction to be murky. Some online platforms are making an effort to squash misinformation, like Facebook, which is notifying users when they “like” misinformation and directing them to a WHO page that debunks common coronavirus myths.
But according to long-time journalist Cameron Hickey, we need a broader approach. Hickey is project director of the Algorithmic Transparency Institute at the National Conference on Citizenship (NCoC) and creator of Junkipedia, a giant repository of online misinformation that’s available to journalists, researchers, and all kinds of organizations seeking to improve the quality of information on the internet. Anyone can submit misinformation to Junkipedia when they see it by using an online form, email, or even a text message. Hickey spoke with us about NCoC’s work and what everyone should know about navigating information online.
How can Junkipedia help?
Hickey says that Junkipedia can tilt the internet toward accuracy in a variety of ways. His team is currently partnering with civil rights organizations to monitor disinformation related to the U.S. census, for example, so they can better understand what kind of disinformation is spreading, which groups it’s targeting, and who’s creating and sharing it. Those organizations can then reach out to their constituencies with accurate information. Journalists can use it to find myths and conspiracy theories that need debunking. Academic researchers can dig into the repository to study how misinformation takes hold and spreads on the internet. State, local, and national governments can adjust their public health and safety messaging to account for what citizens may be reading online.
Can it improve the quality of journalism?
Journalists have an immense responsibility to set the record straight, especially at a time when Americans are overwhelmed by information. To do this, Hickey says it helps for them to understand how misinformation spreads online and to write in a way that actively pushes back.
For example, some news outlets reported that the U.S. government was investigating the possibility that the novel coronavirus had escaped from a laboratory in Wuhan. Hickey says that this may be true in a literal sense (yes, federal officials confirmed that they are investigating), but framing the story this way inadvertently amplifies and legitimizes a conspiracy theory (just because the U.S. government is investigating doesn’t mean there is any evidence to suggest it’s true).
In this article, for example, a reader would need to get past the headline and first paragraph to see that it doesn’t support online conspiracy theories. The story is simply that intelligence officials considered it possible that a lab employee in China became infected with the virus and accidentally spread it, though the officials do not consider it “the most likely possibility.”
“The people behind coordinated misinformation campaigns know how to manipulate news organizations into covering allegations, however spurious, and giving them print space or airtime,” Hickey warns. “News outlets must be constantly vigilant that they aren’t unwittingly falling into that trap.” Using Junkipedia, a journalist can figure out how to responsibly frame a story, knowing that some readers will have already encountered bad information.
Shouldn’t tech companies be responsible for addressing this problem?
On more open platforms, like Facebook or Twitter, information is visible to the public to at least some degree. But plenty of online discussion happens in closed or encrypted networks like Discord, WhatsApp, WeChat, or private Facebook groups. And while tech companies may employ a network of fact-checkers to remove problematic content, Hickey notes that they do so “according to their own editorial whims.”
Furthermore, these platforms are in competition for users and do not necessarily share information with one another. Facebook may flag a problematic post, but that same post can then pop up on Reddit and circulate from there unchecked.
Who is at risk of sharing misinformation online?
Conservative, liberal, young, old, high school students and graduate students – everyone is at risk of sharing misinformation, Hickey cautions. “What’s important to note is that the prevalence of misinformation does seem to be skewed along partisan lines to an extent, but that does not mean that people on the left are less likely to share problematic content than people on the right. It’s important for everyone to be vigilant. Complacency is a real risk that I think afflicts people in particular who consider themselves well educated.”
Simply put, people on both sides of the aisle are inclined to believe information that aligns with their views. They don’t have to be wearing tin foil hats to share something inaccurate or harmful.
How can I report misinformation?
If you encounter misinformation or conspiracy theories online, send links or screenshots to Junkipedia:
Web: submit a tip
Don’t assume someone else has already flagged it. Your social media feed is unique and you may be seeing different posts than other users.
Myths Busted: The World Health Organization (WHO) corrects some of the most common fake coronavirus news circulating online. If you’re itching to post on social media, consider downloading these clear informational graphics that set the record straight – via World Health Organization
Infodemic: How can social media companies combat misinformation on their platforms? One expert says that the government’s emergency alert system, which pushes critical information out over mobile phones, television, and radio, should expand to include social media platforms, where many Americans get their news – via MIT Technology Review
Global Action: Americans aren’t the only ones grappling with an avalanche of misinformation. Here’s a look at how governments around the world are tackling the problem, from lawsuits to regulation to arrests – via Poynter
Bipartisan Risk: Both conservatives and liberals get hoodwinked by fake news. Experts say that while people from both sides of the aisle are more likely to believe news that aligns with their ideologies and to reject news that doesn’t, their psychology differs – via Scientific American
The Future of Misinformation: Pew Research Center and Elon University partnered to survey experts on whether online misinformation will become a bigger or smaller problem over the next decade. They were split. Some say technology can evolve to make things better, while others say human nature will open the spigot of fake news even wider – via Pew Research Center
# # #
This article originally appeared in the April 26, 2020 issue of Wide Angle, our regular newsletter designed, we hope, to inform rather than inflame. Each edition brings you original articles by Common Ground Solutions, a quiz, and published articles — from across the political spectrum — that we think are worth reading. We make a special effort to cover good work being done to bridge political divides, and to offer constructive information on ways our readers can engage in the political process and make a difference on issues that matter to them.
Sign up below to receive future issues.