I read 100-and-counting articles on Fake News so you don’t have to

A library, organised by topic, with a title I wish I could go back in time and change. Constantly under development.

Introductions

(2018) Back in December 2016, I read 50 articles on Fake News so you don’t have to for my newsletter, hosted on my TumblrHub. I organised those 50 resources, curated over the preceding years, by topic. When I later discovered that each heading in a Medium post has its own hyperlink, I realised I could create an online library here on Medium: I imported the 2016 newsletter and updated it with a selection of the 130+ resources I’ve curated on fake news since then, as another extension of my personal content strategy. This library doesn’t include all of them — just the essentials.

In other words, you’re probably here because I shared one of these links:

Apart from adding links, I’ve barely touched the original post, so it still has its 2016 Introduction (below) and these “First reactions”:

(2016) I probably don’t need to tell you that fake news is the topic of the moment — it’s as if this is a just-discovered Brand New Problem.

[Image: Not a new problem]

Just try telling that to a Brit (left). So, not a new problem, then. And if you think it’s limited to the Anglo-Saxon world, I have some very bad news for you.

I first came across fake news in 2007 as I explored the weird and frightening world of Eurosceptic echo chambers. Back then they weren’t algorithmically driven, so I figured that although they were a problem, they wouldn’t be a mainstream problem.

Which is probably why Eli Pariser’s 2011 book on filter bubbles — echo chambers reinforced by social media algorithms like Facebook’s Newsfeed — made such an impact on me. Because whenever someone discovers how to make a fortune out of human psychology, problems generally result. They did.

whenever someone discovers how to make a fortune out of human psychology, problems generally result

(Eli Pariser created a Slack Channel for discussing solutions — see you there, but start here)

Until the 2016 US election, however, this remained a niche topic: while I’ve been reading and tagging resources ‘filter bubble’ and ‘fake’ for several years, the vast majority of the latter were published only recently.

So, what did I learn?

Sorry, wrong question — try “What am I reading?” instead. This enewsletter edition is me rereading and reflecting on those resources, as part of my recently tweaked personal content strategy: I write these newsletters every few weeks to help me absorb and digest the stuff I read every day, leading to an occasional (hopefully) original post of my own.

All I know as I start is there will be no easy solutions — this is me trying to understand the key questions. Although it’s more of a shared, annotated reading list than anything else, I do hope you get as much out of reading it as I did writing it — every article I quote below is well worth your time, and I do include some first thoughts.

Definitions: You know it when you see it, right?

not much drives traffic as effectively as stories that vindicate and/or inflame the biases of their readers…
- What was fake on the Internet this week: Why this is the final column (December 2015)

So fake news is stuff that’s made to be shared on the internet? But isn’t everyone trying to get their stuff shared on the internet?

analysis of six hyperpartisan Facebook pages found that posts with mostly false content or no facts fared better than their truthful counterparts
- Facebook’s fake news problem won’t fix itself

We presented approximately 800 participants across four studies with statements ranging from the mundane to the meaningful. We included some bullshit too…
- Most of the information we spread online is quantifiably “bullshit”

mainstream media and polling systems underestimated the power of alt-right news sources and smaller conservative sites that largely rely on Facebook … Before social media, the filter was provided by media companies…
- Facebook’s failure: did fake news and polarized politics get Trump elected?

So it’s stuff that’s doing well on Facebook / the internet that’s not true? But who determines truth here?

… government propaganda designed to look like independent journalism… any old made-up bullshit … a hoax meant to make a larger point… [or only] when it shows up on a platform like Facebook as legitimate news? What about conspiracy theorists … satire intended to entertain… a real news organization that gets it wrong?
- The Cynical Gambit to Make ‘Fake News’ Meaningless

There’s bad information out there that’s not necessarily fake. It’s never as clear-cut as you think… Facebook’s algorithm may not understand the various shades of falsehood.
- Facebook’s Fake News Crackdown: It’s Complicated

OK, so one person’s fake news is another person’s Truth and an entire Macedonian village’s main revenue stream?

And what about Bye-Bye Belgium (hat-tip: Giorgio Clarotti)? Is this 2006 ‘docu-fiction’ journalism? Fake news? Fiction? Satire? Or all of the above?

A good historical perspective plus some reframing from CJR:

an independent, powerful, widely respected news media establishment is an historical anomaly… Fake news is but one symptom of that shift back to historical norms… provides a … scapegoat for journalists grappling with their diminished institutional power
- The real history of fake news

And whoah! Is fake news even human?

a large portion of online chatter about the 2016 elections was generated by bots
- Misinformation on social media: Can technology save us?

Can technology save us? Nope. Technology = Algorithms, which can — and will — always be gamed.

The sentiment behind the Atlantic’s December 2016 “Cynical Gambit” piece, above, slowly gathered pace through 2017 — for example:

it is more important than ever to define what “fake news” actually is, and what it is not… BBC Media Editor Amol Rajan has identified three sorts … Huffington Post blogger … five sorts… First Draft and … Shorenstein Center… seven sorts…
- Fake News: Defining and Defeating

the conversations … were actually pretty shallow … because everybody meant different things… we can only really start talking about interventions if we understand what we’re talking about
- Stop Calling It Fake News, Harvard Kennedy School (podcast)

The latter quote is from Claire Wardle (Director, First Draft), who co-authored (with Hossein Derakhshan, one of my favourite bloggers) a report for the Council of Europe (pdf) which is actually worth reading, and persuasively sets out three ‘Information Disorders’ (disinformation, malinformation and misinformation) to replace the crap term ‘Fake News’.

Obviously the EU Commission didn’t read it, because a month later they created … the High Level Expert Group on Fake News. Thankfully, from Page One, these experts dismiss the term ‘Fake News’ as unhelpful, redefining their mission to focus on disinformation. So we’ll get there in the end.

Politics

is where it starts getting genuinely frightening. In the radical right corner:

“If you want to make big improvements in communications… hire physicists”. Cambridge Analytica… claims to have built psychological profiles … on 220 million American voters… We are inside a machine and we simply have no way of seeing the controls … the most powerful mind-control machine ever invented…
- Google, democracy and the truth about internet search (Guardian, Dec 2016)

And over there in the radical left(?) corner:

The Five Star Movement controls … sprawling network of websites and social media accounts that are spreading fake news, conspiracy theories, and pro-Kremlin stories to millions… The leaders of the party are making money with a fake news aggregator.
- Italy’s Most Popular Political Party Is Leading Europe In Fake News And Kremlin Propaganda

So does that make Russia the referee? Uh-oh:

From a nondescript office building in St. Petersburg, Russia, an army of well-paid “trolls” has tried to wreak havoc all around the Internet — and in real-life American communities.
- The Agency (June 2015)

The net result is an American information environment where citizens and even subject-matter experts are hard-pressed to distinguish fact from fiction
- TROLLING FOR TRUMP: HOW RUSSIA IS TRYING TO DESTROY OUR DEMOCRACY (Nov 2016)

And for balance (remember that?):

In their evidence-lite handwringing over Russian mastery of Western electorates, Remainers and Hillary backers are rehabilitating the panic and conspiracy theorising of the McCarthyites of old.
- ‘Putinites on the web’ are the new ‘Reds under the bed’

that Russia took advantage of the social web’s desire to just share things without reading them… may be true, but so does every other media outlet
- No, Russian Agents Are Not Behind Every Piece of Fake News You See

The revelations from the Guardian (above) and others didn’t explode across the web as soon as they came out. I curated a few resources in 2017 (they’re all here), but it wasn’t until March 2018 that the Guardian blew the lid off with The Cambridge Analytica Files — an explosive exposé of how billionaire-funded alt-right media combined forces with “MI6 for hire” digital mercenaries to weaponise Facebook, which didn’t seem to care (until now).

“The company has created psychological profiles of 230 million Americans. And now they want to work with the Pentagon? It’s like Nixon on steroids.”
- ‘I made Steve Bannon’s psychological warfare tool’: meet the data war whistleblower

Other things I liked since my 2016 post included:

Fake news has been around as long as real news… a much more important problem … is the delegitimization of real news by American conservatives… This is not “fake news.” It is a blatantly ideological distortion of real news.
- Fake News Is Not the Real Media Threat We’re Facing

seek to produce a divided electorate and a president with no clear mandate … an American information environment where citizens and even subject-matter experts are hard-pressed to distinguish fact from fiction… unsure who to trust… more willing to believe anything that supports their personal biases
- TROLLING FOR TRUMP: HOW RUSSIA IS TRYING TO DESTROY OUR DEMOCRACY

The vote marked the culmination of a targeted, 11-day information operation that was amplified by computational propaganda techniques and aimed to change both public perceptions and the behavior of American lawmakers…#releasethememo [evolved] … from discussion on Nunes’ memo through … an expanding conspiracy theory about missing FBI text messages and imaginary secret societies … internal coups … an organizational framework for this comprehensive conspiracy theory… to minimize and muddle concerns about Russian interference in American politics…
- How Twitter Bots and Trump Fans Made #ReleaseTheMemo Go Viral

Finally, while not adding much that’s particularly new, Wired is excellent with:

Humans are a social species… particularly susceptible to glimmers of novelty, messages of affirmation and belonging… of outrage toward perceived enemies. These kinds of messages are to human community what salt, sugar, and fat are to the human appetite. And Facebook gorges us on them… Today’s phantom public sphere has been fragmented and submerged into billions of individual capillaries… all of this invalidates much of what we think about free speech
- It’s the (Democracy-Poisoning) Golden Age of Free Speech (Wired)

… and forms a nice bridge between Politics and Psychology, below.

Psychology

Somewhere between your ears (and millennia ago in our evolutionary past)

Unless we understand the psychology of online news consumption, we won’t be able to find a cure …
- Why do we fall for fake news?

simply repeating false information makes it seem more true
- Unbelievable news? Read it again and you might think it’s true

As Facebook attempted to capture the fast-moving energy of the news cycle from Twitter… it built a petri dish for confirmation bias … Fake news and sensationalist news would be relatively ineffective without the existing worldview they confirm. But with … distrust of “the system” held by millions of Americans, Facebook provided the accelerant
- How The 2016 Election Blew Up In Facebook’s Face

polarization does not happen … because one side is thinking more analytically, while the other wallows in unreasoned ignorance … subjects who tested highest on measures like “cognitive reflection” and scientific literacy were also most likely to display … “ideologically motivated cognition.”
- How Your Brain Decides Without You

More: every single resource I’ve tagged psychology is fascinating.

Update (2016): since finishing the first version of this post, I think the psychological aspects of fake news, and what they mean for the future of journalism, will be the focus of any eventual post.

Blogplug: I followed through on the above idea with two enewsletters exploring psychology (What happens when AI, Psychology & Big Data drive politics? and Psychology: One topic, three angles), which led to:

Also worth a read:

confirmation bias… must have some adaptive function… related to our “hypersociability.” … Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing… no sharp boundary between one person’s ideas and knowledge and those of other members of the group… If your position… is baseless and I rely on it, then my opinion is also baseless. Tom … agrees with me, his opinion is also baseless, but now the three of us concur we feel that much more smug about our views…
- Why Facts Don’t Change Our Minds

What sets apart the content I’ve curated and tagged both ‘fake’ and ‘psychology’ is that it includes potential solutions. For example:

people are more willing to accept politically polarizing information if it’s discussed in a way that doesn’t challenge how they view the world… ignoring the effects of information avoidance and discussing only ignorance and stubbornness does us all a disservice by framing the problem in partisan terms…
- Why each side of the partisan divide thinks the other is living in an alternate reality

Clinicians should aim to understand parents’ values and engage in genuine, respectful conversations… can lead one-third to one-half of initially hesitant parents to vaccinate… by identifying a shared goal… anecdotes that help parents understand the importance of vaccination may also be remembered more than data or statistics…
- The best shot at overcoming vaccination standoffs? Having doctors listen to — not shun — reluctant parents

Empirical data carry little weight against an argument from authority. And the reverse is true too…“fact checking” has little impact on those whose facts are determined by authority… to undermine the argument from authority we cannot do it through science — we have to do it by undermining the authority itself.
- Seeking truth among ‘alternative facts’

Education

82% of middle-schoolers couldn’t distinguish between an ad labeled “sponsored content” and a real news story…
- Most Students Don’t Know When News Is Fake, Stanford Study Finds

Aside: that one was also tagged native advertising, which reminds me to ask: when a corporation is paying for journalistic coverage, is the result journalism, advertising or fake news?

What we need today is metaliteracy — an ability to make sense of the vast amounts of information in the connected world of social media… educators and policymakers must “demonstrate the link between digital literacy and citizenship.”
- How can we learn to reject fake news in the digital world?

See also: In the war on fake news, school librarians have a huge role to play

Education was touted as one of the ‘magic bullets’ everyone called for through 2017 and since. My 2017, however, began with danah boyd:

We’ve been telling young people that they are the smartest snowflakes… and that they should trust their gut to make wise decisions… All they have to do is “do the research” for themselves

- Did Media Literacy Backfire?

She followed this up with “What Hath We Wrought?” at SXSW a year later.

So Who is doing What about it?

over 30 news organizations … and tech companies such as Facebook, Twitter, and YouTube to share best practices on how to verify true news stories and stop the spread of fake ones
- Facebook, Twitter, and 30 other orgs join First Draft’s partner network to help stop the spread of fake news

So glad that’s working!

Following the controversial firing of the editorial team who managed the Trending Topics… technology that will help prevent fake news stories from showing up in the Trending section
- Facebook to roll out tech for combating fake stories in its Trending topics

Facebook’s application for Patent … describes a sophisticated system for … improve the detection of pornography, hate speech, and bullying… much easier to identify than false news stories.
- Facebook is patenting a tool that could help automate removal of fake news

Free advice for Facebook: when you’re in a hole, stop digging. So firing your editors because they’re biased is only a good idea if you’re going to replace them with something better. And if someone does develop a fake news detector, don’t block it, even temporarily.

a red flag will appear if the site or news is deemed fake, yellow if the source is unreliable or green if it’s ok
- How Le Monde is taking on fake news

OK as defined by…?

If you can detect trolls, you can protect the people they’re trolling by muting or putting a warning over the trolls’ posts
- Machine learning can fix Twitter, Facebook, and maybe even America

See also: Hoaxy: A Platform for Tracking Online Misinformation

What about good, old-fashioned, high-quality journalism? Oh.

“…all the fact-checking of Trump’s lies, all the investigative journalism about his failures, even the tapes — none of it meant anything”… we ended up with a filter bubble election.
- The Dissolution of News: Selective Exposure, Filter Bubbles, and the Boundaries of Journalism

More: do I really have to send you to my collection of resources tagged filterbubble again? How about factchecking, then?

And let’s not forget the legislators:

Tech companies may face new legislation after struggling to comply with voluntary code of conduct…
- Facebook, Twitter, and Google are still failing to curb hate speech, EU says

(re: Facebook’s Dec 16, 2016 announcement) You don’t need me to enumerate the tactics Facebook is introducing — analyses are everywhere, or you could go straight to the announcement or Zuckerberg’s followup. The most interesting analyses I saw included (my emphases):

people who want to coordinate to mess with the system will be able to do so fairly easily
- Clamping down on viral fake news, Facebook partners with sites like Snopes and adds new user reporting

If Facebook is going to pay for video, it might want to consider paying for truth, which is also good for business… Lies, propaganda, fake news, hate, and incivility won’t be “fixed” with any product or algorithm or staffing tweaks
- Facebook Steps Up

Relying on third-party fact checkers could lessen scrutiny on Facebook’s verification process and give the company another scapegoat… “labeling stories that have been flagged as false”… [will] fly in the face of the company’s business interests… explicitly ban fake news sites from placing ads … is a nice-sounding gesture that won’t actually do much… Any solution must center on the way News Feed ranks stories… “Fake news” isn’t a glitch in the system, but rather the Like economy working at peak efficiency.
- A Closer Look at Facebook’s Fake-News Fixes

(And because there’s always more Firehose …) Here’s some more stuff tagged ‘fake’ on my ReadMe queue that I haven’t gotten to yet:

But for every post describing a way forward, I saw another saying that it wouldn’t work. For example:

a major new Yale University study finds that fact-checking and then tagging inaccurate news stories on social media doesn’t work… “disputed” tags made participants just 3.7% more likely to correctly judge headlines as false…
- Tagging fake news on Facebook doesn’t work, study says

crowdsourced trustworthiness ratings are actually much less effective if they exclude the ratings from people who are unfamiliar with a given site. Which is what Facebook plans to do…
- Crowdsourcing trusted news sources can work — but not the way Facebook says it’ll do it

I also really liked, in the wake of (yet) another US school shooting:

the response from the platforms is the equivalent of politicians’ “hopes and prayers” … irrefutable proof that these actors are leveraging social media …
Just like guns, social media platforms can be automated — they all have APIs that allow accounts to post content in an automated fashion. And just like guns, when accounts are automated, they can do significant damage…
- The Automatic Weapons of Social Media

By late 2017 the Institutions had started weighing in, with an excellent report for the Council of Europe from Wardle and Derakhshan, and another from an expert group convened by the European Commission. At last, people started talking realistically about regulation:

government should require social media platforms… to make it possible for third parties to build software to monitor and report on the effects of social media algorithms…
- How to Monitor Fake News

Inevitably, a blogplug is required at this point, after I was pulled into some interesting conversations surrounding these reports:

First reactions

(2016) I think I’m now more confused than I was before I started. What stays with me, however, is best summarised by Jeff Jarvis:

Do we really want to set up Facebook or Google as censors … to decide what is real and fake, true and false?
- Fake News: Be Careful What You Wish For

No, I really, really don’t.

I understand the desire to get technologists to “fix” this.

Asked who should tackle the problem, respondents gave about equal weight to government, tech companies such as Facebook and Google, and the public…
- Fake news is sickening. But don’t make the cure worse than the disease

But despite all the effort poured into natural language processing, semantic analysis and AI, there will be no simple technical fixes.

Why? Any technical fix will be vulnerable to the same problem all algorithms face: they can be gamed. Today’s Macedonian teenagers are simply gaming the Facebook algorithm to make cash. Previous generations (anyone here remember content farmers?) gamed Google. They still are.

if we end up trusting algorithms… we are allowing the truth to be gamed

So if we end up trusting algorithms to distinguish fake news from real, we are allowing the truth to be gamed. Which is, of course, exactly what’s happening. Why should more technology make this any better?

We’re the problem. Whatever the algorithm, someone will game it for personal gain. And they won’t be exploiting technology — they’ll be exploiting human psychological weaknesses like the backfire effect, confirmation bias, homophily, narcissism and assimilation bias.

(2016) Technology only ever augments our own human tendencies, for good or ill. Tech fails if it doesn’t somehow work the way our brains work.

Everyone was so naive about the Internet in the 1990s — we all thought it would bring the world together, as if it could somehow surgically remove from human nature our desire to flock together with the likeminded. Of course, the early internet geeks were likeminded.

Which makes this a wicked problem: not only is human nature hard to change, there are also billions of us, and there’s money being made.

So a better understanding of human psychology looks crucial. But where to apply that knowledge? Education from the youngest age is clearly critical, but it will take decades to create a media-literate and ‘metaliterate’ population via our schools. And by that time, other problems will certainly be upon us.

(2016) The best way of increasing understanding of an issue amongst an adult population used to be via high-quality journalism, but we wouldn’t have this problem if the news industry had not been devastated.

As the first posts I curated above show, one person’s fake news is someone else’s authoritative journalism and a third person’s business model. Technology can’t tell the difference — but more people already trust algorithms with this than they trust editors and journalists.

It’s difficult to see this being reversed in an industry in freefall, increasingly reliant on native advertising.

Looking ahead

I had hoped that by the time I’d gotten here — at the end of the post, after ploughing back through all those resources and trying to organise my thoughts — an idea, a solution, something would have presented itself. Anything I could develop into a ‘proper’ blog post, rather than these half-cooked Notes To Self.

That hasn’t happened. Sorry. At the back of my mind is a tickle tagged open web, but it’s not very strong, probably because the Open Web is in no better shape than professional journalism.

What I did find are indications that this is not going to get any easier:

If you think this election is insane, wait until 2020… technologies like AI, machine learning, sensors and networks will accelerate. Political campaigns get … so personalized that they are scary in their accuracy and timeliness.
- 5 Big Tech Trends That Will Make This Election Look Tame

More: resources tagged AI.

But I also see good things, and a lot more stuff to read. Maybe something will come.

(Update, 2016: perhaps due to Facebook’s announcement, perhaps simply because I’ve had a couple of days to mull things over since first publishing these notes, I think my next post will dive deeper into how psychological flaws have rendered journalism — particularly factchecking, data- and explanatory-journalism — useless.)

“We are so screwed it’s beyond what most of us can imagine… a slew of slick, easy-to-use, and eventually seamless technological tools for manipulating perception and falsifying reality… You don’t need to create the fake video… just point to the fact that the tech exists and you can impugn the integrity of the stuff that’s real…
- He Predicted The 2016 Fake News Crisis. Now He’s Worried About An Information Apocalypse.

Annexes

In the meantime, I’m grabbing a copy of the 2011 film Detachment for Christmas. Here’s why (hat-tip: @marcoRecorder via Energia viva)

“How are you to imagine anything, if the images are always provided for you?… We must learn to read, to stimulate our own imaginations, to cultivate our own consciousness, our own belief systems. We all need these skills to defend, to preserve, our own minds”

Written by

Piloting innovative online communications since 1995. Editor: medium.com/Knowledge4Policy. Founder: MyHub.ai. Personal Hub: https://myhub.ai/@mathewlowry/
