
The more we learn about the impact of fake news and disinformation on democratic politics, the scarier it seems. In political contexts ranging from Russia’s seizure of the Crimea to Britain’s referendum on exiting the E.U. (and including, of course, the 2016 U.S. election), fake news has swayed public opinion and affected voter turnout. Or has it? One of the scariest things about fake news is that because it spreads via closed, proprietary networks like Facebook and Twitter, neither governments, media, nor citizens can get access to the kind of data that would actually allow us to quantify its effects.


Nonetheless, we know enough to worry—and to plan. Smart researchers, journalists, technologists, and policymakers have been hard at work trying to figure out ways of fighting back against the deliberate dissemination of false and misleading information.

Strengthening Digital Literacy

For the most part, pessimism about the likely effectiveness of regulation or counteroffensives—not to mention their feasibility—leads to one consistent, repeated recommendation: Let’s make voters more resistant to disinformation by strengthening digital literacy. In the report “Russian Hybrid Warfare: A Study of Disinformation,” Flemming Splidsboel Hansen calls for a “cognitive firewall,” which

should be installed at both the collective and at the individual level. At the society-wide level the ability to recognise and subsequently reject disinformation and not give it the attention which it demands should be improved…And at the individual level a new media literacy should be developed…which gives members of the target audience the tools with which to distinguish facts from fiction and the information from the disinformation.

Sounds great! I’m all for investing in education and media literacy programs that teach college, high school, and even elementary students how to differentiate between credible and questionable data sources. Yes, let’s support libraries and community programs that foster critical thinking and strong media skills among adults. And by all means, if we can pressure or persuade the big social platforms into creating their own feel-good digital literacy programs and tools, let’s do it.

But building strong critical thinking skills and smarter citizens is a long-term project—one that is hardly likely to enjoy continued support from governments who are elected on the basis of public ignorance. Democracy is on fire now. Fake news is a problem we can’t wait a generation to fix.

Fighting Fake News Where It Spreads

That’s why we need to stop thinking about improving the disinfo-resistance of individual citizens, and instead focus on improving the disinfo-resistance of entire communities. We actually have something going for us: fake news largely spreads through communities. What’s more, it spreads through the communities we have built for ourselves, in the form of social networks.

Research into fake news suggests that how we build our online communities has a big impact on our individual and collective vulnerability to fake news. It turns out that people who belong to relatively homogenous communities, made up of people who share their values and world view, are a lot more vulnerable to manipulation via fake news. The best thing you can do to combat fake news is to broaden your own online social circle, and build online communities that bring different kinds of people into contact and conversation.

Broadening your online social circles helps combat fake news in three ways. First, it serves as a kind of personal inoculation: The narrower and more like-minded your online circle, the easier it is to be taken in by fake news. In “Social Media and Fake News in the 2016 Election,” Hunt Allcott and Matthew Gentzkow found that “those with segregated social networks are significantly more likely to believe ideologically aligned articles, perhaps because they are less likely to receive disconfirmatory information from their friends.” Based on a 1,200-person online survey they conducted after the 2016 election, Allcott and Gentzkow show that the more someone’s Facebook friends shared their presidential preference, the more likely that person was to believe headlines that matched their ideology over headlines that challenged their ideology. In other words, spend too much time in an echo chamber, and you learn to prefer the sound of an echo.

Second, expanding your social circle shifts your relationship to social media in a way that undercuts the core logic of modern information warfare. As Michael Jensen writes in “Russian Trolls and Fake News: Information or Identity Logics?,” Russia uses identity as the psychological access point for building trust in fake news sources.

In a world where trust in institutions and the media is declining, people tend to trust those who they identify as being like themselves more than professional and governmental sources. In a social-media space, Russia conducts influence operations using traditional spy craft methods adapted for a digital space: assets—in this case citizens—are targeted online at first through innocent conversation to build a rapport before trying to move their beliefs about political entities and events.

It’s worth digging into what Jensen is uncovering here, because it speaks to the relevance and importance of our social networks in facilitating Russian-style disinformation. While the term fake news “suggests informational or epistemic deficits,” Jensen writes, “to the extent that these communications are not just factually deficient but also seek to manipulate political identities, the consequences for democratic politics may be far more damaging.”

Clumping online with like-minded souls sets the stage for just this sort of manipulation. Social psychology tells us that much of our individual identity derives from the herd: the kinds of Facebook groups you join, the kinds of Twitter users you follow, and the kinds of Instagram hashtags you use all become ways not just to connect with other people, but to claim a certain kind of valued identity.

While it’s natural to be drawn to the groups and conversations that reinforce your identity, the more we rely on our social networks as a source of identity reinforcement, the more vulnerable we become to the kind of manipulation Jensen describes. Russia can tap into identity as a way of connecting with targets precisely because so many of us have grown accustomed to experiencing social media as a never-ending series of identity affirmations. Wean yourself off that experience of identity affirmation (and offer less of it to others), and you’re removing Russia’s favorite psychological access point.

Third, expanding your social circle helps to undo one of the most pernicious injuries caused by fake news and information warfare—what Gregory Asmolov (in an article of the same title) calls “The Disconnective Power of Disinformation Campaigns.” Digging into social networking behaviors in a variety of polarized contexts, Asmolov notes that:

A research project on unfriending practices in the context of the Israeli-Palestinian conflict concluded that “unfriending was more prevalent among more ideologically extreme and more politically active Facebook users.” In addition, “weak ties were most likely to be broken.” In the case of the Russia-Ukraine conflict…evidence suggests that the conflict has had a robust impact on previously strong apolitical ties among people, including ties between relatives, classmates, and friends.

Asmolov makes a compelling case that this undoing of loose social ties represents one of the biggest dangers of fake news: “The constant flow of state-sponsored disinformation triggers and sustains the phenomenon of ‘unfriending’ as an outcome of conflict-dominated social categorization.” He suggests a variety of measures to combat this, of which my favorite is the idea of a “yellow card” system “issued by one user to another to alert them when an online discussion is entering a phase where it may threaten the social connection between participants and to raise the question of whether continuing the debate is worth paying the price of ‘disconnection.’”

Why to Friend Your Conservative Aunties

By all means, bring on the yellow cards—if it allows us to sustain a wider range of social connections rather than unfollowing and unfriending in the face of disinformation-fuelled polarization. But even without yellow cards, we can make the same kind of choice by broadening our social circles, and deliberately staying engaged with a wide range of people. When you stay connected to even those Facebook friends whose political posts or social commentary tend to appall you, you’re fighting back against disinformation’s disconnective impact.

I don’t mean to suggest that we can conquer trolls, bots, and information warfare by friending random people on Facebook. Fake news, disinformation, and weaponized information are here to stay. In his vivid and compelling contribution to a report on “The MADCOM Future” (as in MAchine Driven COMmunication), Matt Chessen maps out three different scenarios for how we’ll come to terms with this reality, and none of them are good:

Scenario 1: A World Gone MADCOM: Global Information Warfare. Nations weaponize narratives, using MADCOMs to exacerbate social discord, undermine faith in government, and eliminate the reliability of traditional journalism.

Scenario 2: Muddling Through: Measures and Countermeasures. Western democracies manage to bring some tools of national power to bear on adversaries…Nations struggle to agree on peacetime norms for information security, but adoption is uneven and impossible to enforce, due to weak attribution tools and methodologies. MADCOMs undermine democracy, but they are a bonanza for capitalism. Corporations continue to harvest customer data, and utilize that data for subtly manipulative marketing.

Scenario 3: Lockdown: The Cognitive Security State. In response to threats posed by MADCOMs, computational propaganda, weaponized narratives, and other rampant disinformation, over the next decade many nations impose stringent regulations on online communications and information…Political change slows as incumbents use their cognitive security powers to solidify their political positions…Reality becomes more resilient, but is also determined by unelected bureaucrats in capitals who subtly use these tools to maintain domestic and global order.

It’s precisely because we face such bleak possible futures that we must make whatever effort we can, now, to inoculate both ourselves and our communities. In the face of disinformation and polarization, it’s very tempting to pull up the drawbridge and retreat into cozy conversations with the friends and colleagues who share our values. Indeed, that’s a practice I advocate as a form of occasional respite in a world of ever-growing conflict.

But once you’ve retreated and replenished, it’s time to get out there again. It’s time to un-hide your vitriolic uncle on Facebook, re-follow your annoying colleague on Twitter, and maybe even tune in to hashtags that extol the merits of your least-favorite politicians, commentators, and sports teams. If you’re serious about fighting fake news, it’s time to brave the kind of chaotic and eclectic community that you experience only when you actually engage with people unlike yourself; the kind of community that is truly our best hope for transcending the disinformation threat.


JSTOR is a digital library for scholars, researchers, and students. JSTOR Daily readers can access the original research behind our articles for free on JSTOR.

“Russian Hybrid Warfare: A Study of Disinformation,” pp. 33-37, Danish Institute for International Studies

“Social Media and Fake News in the 2016 Election,” The Journal of Economic Perspectives, Vol. 31, No. 2 (Spring 2017), pp. 211-235, American Economic Association

“Russian Trolls and Fake News: Information or Identity Logics?,” Journal of International Affairs, Vol. 71, No. 1.5, Special Issue: Contentious Narratives: Digital Technology and the Attack on Liberal Democratic Norms (2018), pp. 115-124, Journal of International Affairs Editorial Board

“The Disconnective Power of Disinformation Campaigns,” Journal of International Affairs, Vol. 71, No. 1.5, Special Issue: Contentious Narratives: Digital Technology and the Attack on Liberal Democratic Norms (2018), pp. 69-76, Journal of International Affairs Editorial Board

“The MADCOM Future,” Atlantic Council