Domestic v. foreign disinformation: A question of timing (US2020 Disinformation news, ed. 4)
There are two possible reasons why we are not talking as much about foreign interference. Both could be true. Only one is good news.
Welcome to edition 4 of my newsletter on disinformation in the US2020 elections, assembled from notes made as I Hubbed another 20 resources over the past fortnight as part of my enhanced personal content strategy.
A prominent theme from previous editions was domestic disinformation, particularly Republican efforts to delegitimise the election results. This edition’s resources offer two possible reasons why we are not talking as much about foreign interference. Yet.
Keeping their powder dry
The most worrying idea is Just Security’s contention that anyone trying to mess with America probably realises they can’t get Trump re-elected, and is therefore better off focusing on the post-election period, which offers a “better chance to push more Americans to extremes than ever before”.
Foreign players, in other words, will probably get involved post-election: if they do have material on Biden, it makes more sense to damage him once he’s in office, during what promises to be “the most profitable moment for chaos”.
Looking at the US in the mirror
The reason for that post-election chaos, of course, remains domestic. The introspection bandwagon, documented in edition 2, continued in the NYTimes, where a member of the US National Security Council from 2017–2019 argues that while the impact of Russian disinformation in 2016 was probably exaggerated, its greatest impact was to focus Americans on an outside enemy when “we should have focused on fixing our own faults”.
I’m wondering if Europeans will fall into the same trap, or a related one: looking in such horrified fascination at the US that they fail to notice their own vulnerabilities.
The rest of this edition is structured around the threats identified by Just Security.
Media: not adapting, not convincing
One of the 6 threats identified by Just Security is the media’s inability to handle Trump’s delegitimisation campaign.
Harvard’s Berkman Klein Center for Internet and Society analysed “55,000 news stories, five million tweets, and 75,000 public Facebook posts”, showing that it’s “Trump, the Republican National Committee, and Fox News — not Facebook spammers and Russian trolls — who are the primary drivers of misinformation around mail-in voting fraud”.
It’s a commonplace that democracy relies on journalism, while journalism relies on politicians respecting democratic norms. The point made here is that when politicians don’t, some traditional journalistic practices actually propagate harm. Or, as Just Security put it, “political journalists seem to have trouble realizing how bad things are”.
Detailed recommendations for not falling into that trap are provided by a network of scholars in politics and media.
It makes uncomfortable reading, but then these are uncomfortable times.
Finally, Harvard’s analysis also looked at how public opinion is shaped by mainstream and social media, finding that only ~30% of Americans — “people who rely on local TV for news, who aren’t die-hard Fox News viewers or New York Times subscribers” — are persuadable one way or another.
Vote fraud and voter suppression
Two threats identified by Just Security concern US militias.
Apart from their general potential to run amok, militias have been recruited since March to suppress voting through polling-booth intimidation. While this has been a Republican strategy for many years, what’s new is the training they’ve received to collect voter fraud evidence, destined to fuel delegitimisation conspiracies for years.
Which is ironic, given that the main source of outright voter fraud actually appears to be the Republican party itself.
Key phrase: “church officials don’t have a key to the box … GOP officials picks up the ballots”. Meanwhile, none of the ballot collection locations listed on the local Republican Party’s website are official — instead, the list comprises their HQ, gun shops and other local businesses. The GOP wins either way: coverage of these tactics undermines faith in the election process, while any ballots deposited in these boxes can be filtered for Republican votes.
Meanwhile, Channel 4 broke a story offering fascinating new insights into the Trump campaign’s use of Cambridge Analytica data to suppress Black American voters in 2016.
Social media & censorship
Just Security’s 2nd threat is that social platforms will fail to tackle post-election disinformation.
In early September, when Facebook set out its election disinformation policies, they stressed stability:
to ensure clear and consistent rules, we are not planning further changes to our election-related policies between now and the official declaration of the result
- Mark Zuckerberg, 3 September
My last edition was published just late enough to cover Facebook changing its mind, and those policies, three weeks later. Since then, changes have come thick and fast, including:
- Twitter Will Turn Off Some Features to Fight Election Misinformation (NYTimes) - includes enforcing my 2014 suggestion to Read before you share!
- Facebook Has Finally Banned Holocaust Denial (Time)
- YouTube Toughens its Rules Around Dangerous Conspiracy Theories, with a Focus on QAnon Content (Social Media Today).
- Facebook completely bans QAnon and labels it a ‘militarized social movement’
Of course, having a policy is one thing and enforcing it is another, but this time it appears the platforms actually did. Maybe they feel a change coming.
The predictable backlash was not far behind, with Donald Trump tweeting that Twitter had shut itself down to stop the spread of bad news about Joe Biden, and linking to a satirical news story as evidence.
Honestly, I can’t make this stuff up. Who’d want to?
But just as predictable was evidence of the potential downside of policing political debate: Garry Kasparov’s Facebook page was shut down because he mentioned the conflict in Nagorno-Karabakh, and @Politico’s Jake Sherman was suspended from Twitter for tweeting the NYPost’s email story to Joe Biden’s campaign to see if they had a response.
And while I’m generally no fan of Spiked, they have a point when they observe that Ofcom’s definition of [COVID] misinformation is so broad it includes advice that was once given out by health chiefs.
So expect more resources tagged #censorship in the coming months.
At the very best, getting this right will take time — at worst, we’ll still be arguing about it in 2024. But in the final analysis, fixing social media won’t in itself fix our societies. Critiques of social media platforms for allowing disinformation and hate speech to flourish are accurate but, as Wired points out, sometimes also just wishful thinking. After all, the first Trump-Biden debate “was probably seen by more than 80 million people”, who saw their President egg on right-wing thugs. In such a context, “there’s a limit to what platform moderation can accomplish.”
Conspiracy watch
Just Security’s 6th threat was that QAnon would rebuild after its Facebook ban.
Of the resources recently tagged #qanon, this short video most succinctly summarises why: banning a conspiracy theory will not get rid of the people who believe in it.
According to Vox, it may be too late to stop QAnon, potentially “one of the largest networks of extremism in the United States” (note that this article was published on the day Facebook banned it).
But what is it about QAnon that makes it so pernicious? I spend a lot of time thinking about how knowledge management, knowledge brokerage, online communities and participation processes can support evidence-informed policymaking (here), so I found CJR’s deep dive into how the QAnon knowledge community works absolutely fascinating.
“QAnon doesn’t simply offer readers insider insight into current events, but also provides them the ability to take part in shaping those events through an intricate research process.”
The result is a body of “validated” knowledge providing a coherent, “proven” framework for followers to understand “what is really happening”.
It is not science but it feels like it, which for many people is enough. It is in fact a particular variety of community-driven fan fiction: what CJR calls a “dark participatory culture” for what its members believe to be very serious political participation.
That’s it for this edition. As always, all resources are tagged #us2020 and #disinformation on my Hub, where you can also get the RSS of individual resources, browse and subscribe to my newsletter, and explore everything else I Like, Think and Do. This newsletter is underpinned by a Zettelkasten Overview as part of my enhanced Personal Content Strategy. I am @mathewlowry on Twitter.