
How misinformation threatens democracy


By Mike Flood


Mike is Chair of Milton Keynes Humanists and the UK-based network Humanists for the Common Good. In this article he writes that false information and half-truths can promote narratives that undermine democratic norms and that humanist organisations are beginning to wake up to the threat.


In March 2024, deepfake images of Donald Trump fraternising with a group of smiling Black people were circulating on social media: they had been generated by his fans using artificial intelligence. The intention was clear: to solicit support for their hero from within an important demographic in the run-up to the presidential election in November. Mark Kaye, a popular Florida talk show host, was one of the influencers involved. When questioned about this he conceded that an image he had shared was doctored, but he said he didn’t believe he was doing anything wrong. “If anybody’s voting one way or another because of one photo they see on a Facebook page”, he protested, “that’s a problem with that person, not with the post”. He was, he said, just “a storyteller”.


Such deception may present a problem for “that person” (albeit a trivial one), but it presents a much bigger problem for society, especially when such trickery is targeted at voters in vulnerable swing states. We can’t tell what impact manipulated images will have on Trump’s election chances, but in early March this particular image was reported to have racked up over 1.3 million views.


That’s America. But you don’t need to look far to find examples of consequential fakery closer to home:

  • The infamous pledge plastered on the side of a bus in the run-up to the 2016 UK referendum on membership of the EU, which misled voters about the costs and benefits of EU membership — “let's fund our NHS instead”, it implored, rather than sending “£350 million a week to the EU”; or

  • The fraudulent claims about the safety and effectiveness of vaccines, which undermined public health messaging about Covid, prolonged the lockdown, and endangered public health; or

  • The attempts to demonise political leaders — the deepfake audio clip of Keir Starmer appearing to swear at his staff, which circulated on social media just before the 2023 Labour Party Conference; and the fake clip of London Mayor Sadiq Khan, posted just before Remembrance Day commemorations, calling for pro-Palestinian marches to take precedence.


Some perpetrators spread “fake news” and half-truths without bad intent: they believe what they are sharing is true, and their claims often contain an element of truth (vaccines can kill, although the risk is very small). With others, however, the intention is to deceive – “disinformation” rather than “misinformation” – perhaps because they think that a little manipulation of the “facts” is justified if it serves a higher purpose or “truth”. And hostile foreign powers have become specialists in disinformation: these days they deploy armies of bots programmed to manipulate narratives and sway public opinion, and Putin’s Russia is a master at it.


Repeating statements often enough doesn’t make them true, but it does make them more likely to be believed (the so-called “illusory truth effect”), whilst “confirmation bias” (our natural tendency to ignore contrary information) can leave people effectively isolated in their own ideological “filter bubble”, harbouring a customised and distorted view of the world. And one wonders what will happen to people’s internet searches as Large Language Models (like those behind ChatGPT) start flooding the web with “hallucinations” (made-up “facts”).


We also have to cope with conspiracy theories, which proliferate in environments where objective truth is questioned or dismissed. Followers may passionately believe that some incident or other did not happen in the way reported, or that some covert, powerful and malevolent (“deep state”?) organisation is responsible. Simple or simplistic explanations can be compelling.


And then there are the troublemakers who steal private information and make it public: Hillary Clinton’s presidential campaign was seriously derailed by this kind of “mal-information” when thousands of compromising emails from her campaign chairman were leaked. As William Blake once wryly observed: “A truth that’s told with bad intent beats all the lies you can invent.”


The impact on democracy and on people

False information and half-truths can promote narratives that undermine democratic norms and values (or justify authoritarian measures). When the messaging is convincing and amplified by trusted sources, it becomes difficult for individuals to discern fact from fiction. The social cost of this – the infringement of rights, polarised opinion, loss of trust in authority, and bad decision-making – cannot be quantified, but it is clearly many orders of magnitude greater than what it cost the perpetrators to generate and circulate their mischief; and let’s not forget the reputations blighted and the lives lost (note 1). What’s more, as Jonathan Swift observed, “Falsehood flies, and the Truth comes limping after it” (note 2). One wonders what Swift would have said today, in the era of 24-hour news cycles, smartphones, social media and AI-powered bots, when malicious gossip, fabricated images and deliberate falsehoods can be circulated to a global audience within seconds, and advertising can be tailored and micro-targeted at specific ethnic, religious or other groups – even taking account of individuals’ personal circumstances and vulnerabilities.


The World Economic Forum’s latest Global Risks Report ranks “false information” as “a new leader of the top 10 rankings”, and it notes that the “disruptive capabilities of manipulated information are rapidly accelerating, as open access to increasingly sophisticated technologies proliferates and trust in information and institutions deteriorates”. In the next two years, it concludes, “a wide set of actors will capitalise on the boom in synthetic content, amplifying societal divisions, ideological violence and political repression – ramifications that will persist far beyond the short term”.



Tyrants, sociopaths, extremists and criminals have always played upon the vulnerabilities inherent in liberal democracies, but it wasn’t until the advent of social media and smartphones that bad actors could routinely exert significant influence on public opinion – and do so at very little cost or political risk. Indeed, judging from the number of reports now appearing, “foreign information manipulation and interference” (FIMI) is a major worry in official circles. It is likely to have been a contributory factor in the recent decline of democratic government around the world, as charted in the Economist Intelligence Unit’s latest Democracy Index. And this trend may well have been hastened by the growing perception in the Global South that the West has double standards. Western governments are fighting back (including through the Five Eyes Alliance), but this is asymmetric warfare, with some leading perpetrators operating from behind robust “firewalls” (and controlling the news and what their people can see or say).


What’s to be done?

In addition to cybersecurity work, addressing the threat requires:


  • Supporting independent media organisations that adhere to rigorous journalistic standards and independent fact-checking initiatives;

  • Regulating content creators and social media platforms, and holding them to account (e.g. by imposing fines that really hurt) if they produce or inadvertently disseminate false information;

  • Pressing for robust content moderation practices and greater transparency, especially around elections and associated advertising/political content; and

  • Promoting critical thinking, media literacy and civic engagement to foster a culture of informed citizenship and democratic participation – as Finland has been doing for years (the country scores highly on media literacy and a range of other desirable social characteristics).


Society also needs to address underlying socioeconomic factors that contribute to people’s anxiety and confusion, such as systemic inequality, polarisation and marginalisation, and it needs to accept that tackling false information requires sustained international cooperation and collaboration.

Some in the West see any kind of regulation as a threat to free speech, but no one has a right to spread untruths – unless it’s in the form of political satire, which, as one lobbyist has pointed out, is “protected speech”, even if you don’t get the joke! Nor are people free from responsibility if they circulate misleading information that causes harm: the law will have its say, just as it will if material is deemed to be stolen, libellous or illegal. Moreover, achieving “reach” over social media is not a right but a privilege, and there is increasing pressure to hold big tech bosses to account if they tolerate noxious, harmful or misleading material on their platforms (note 3).


What you can do

And what can you and I do? Clearly, we can make it our business to seek out sources of news and information that we trust. None of those I regularly consult for my work on Fighting Fake are perfect, but these agencies do at least publish apologies or corrections when they get things wrong. You can also check media credentials for “factual reporting” and “political bias” on websites like Media Bias/Fact Check. If you have doubts about a story or fact, seek validation from other sources, and if still in doubt, use a reputable fact-checker. For suspect pictures, you can carry out a “reverse image search” (using a tool like TinEye), or check for “watermarking” (in AI-generated imagery, such as the deepfakes of Trump fraternising with Black voters).


Humanist organisations are beginning to wake up to the threat. During the Covid lockdown, Humanists International ran a campaign to counter misinformation, and Humanists UK has raised concerns about people spreading false information about Canada’s experience with assisted dying, as well as misinformation about climate change. But given the scale of the problem, and the importance humanists place on reason and evidence, isn’t there a case for everyone being more proactive, as the General Assembly of Humanists International called for last summer in Copenhagen? The truth, surely, is worth it.


Notes

  1. In the States, “SorryAntivaxxer.com” has been keeping a compendium of medical horror stories and the avoidable deaths of people who rejected vaccines and protective masks and went on to succumb to Covid.

  2. And yes, according to Quote Investigator, it was Swift and not Mark Twain, Thomas Jefferson or Winston Churchill!

  3. Freedom from liability (in the US) dates back to the passage of Section 230 of the 1996 Communications Decency Act, which provides immunity for providers and users of an “interactive computer service” who publish information provided by third-party users. Some argue that “We should keep Section 230... but condition it on reasonable content moderation practices that address illegality causing harm”.
