News/Analysis/7 positive steps


Welcome to The Spectrum of Misinformation newsletter, bringing you the latest misinformation news, in-depth analysis, and practical advice for communicators.

In News digest, catch up on US threats to misinformation research, responses to the dropped Australian disinformation bill, fact-checking for influencers, the CDC nomination, and misinformation links in UK politics. There's an analysis of the threat to universities in Research: under threat, while you'll find advice on developing counter-misinformation interventions in 7 positive steps, including the in-depth explainer How weak? The art of weakening attacks.

News digest

FT covers concerns the incoming Trump administration could cut funding for US misinformation research and support legal action against universities undertaking it (see below: Research: under threat).

ABC News quotes its chair as saying that Australia is being "flooded" with misinformation and disinformation and that the broadcaster is best placed to help stem the tide. This follows the Australian government withdrawing its bill that would have required social media companies to combat misinformation. The BBC reports that a bill banning under-16s from social media has passed, but critics question how it will work.

CNN covers a survey of 500 social media influencers, 62% of whom said they do not verify the accuracy of information before sharing it. UNESCO is calling for influencers to learn fact-checking skills. Read the report.

POLITICO highlights that Dave Weldon, nominated to lead the CDC, has questioned the safety of vaccines such as MMR and suggested a mercury-based preservative once commonly used in vaccines is linked to a rise in autism.

The Guardian reports Nigel Farage's campaign group against WHO has links to consultants working for vaping and novel nicotine firms. Meanwhile, Tortoise alleges a peer who has joined a cross-party group to fight disinformation has previously used Kremlin talking points.

Research: under threat

The latest news from the US: concerns are being raised that misinformation research and researchers will be targeted by the incoming Trump administration (see FT).

The move is being framed as a pro-free-speech crackdown on a leftwing "censorship cartel" and on alleged "election interference", linked to investigations that found a Russian troll farm tried to interfere in the 2016 US election.

Trump has said he would seek to curb funds to any US universities found to have engaged in censorship activities “such as flagging social media content for removal or blacklisting” for at least five years. Trump supporters have also suggested criminal charges could be brought against universities.

US researchers are understandably worried about potential cuts to funding and personal attacks. Berin Szóka, president of TechFreedom, told the FT: “All they have to do is threaten criminal litigation to get people to stop doing research.”

It follows a pattern of the incoming administration threatening universities, for example over accusations of antisemitism, as The Guardian reports.

I think it shows how uniquely vulnerable misinformation research is to allegations of political bias. While leftwing misinformation attacks do occur, the vast majority of attacks in the US and UK can be described as, at the very least, aligned with a rightwing agenda.

Topics such as public health and climate, where there used to be some kind of political consensus, have fractured along party political lines following the COVID-19 infodemic and fossil fuel industry lobbying against net zero.

So what's the outlook for UK misinformation research?

While I expect elements of the Conservatives/Reform to make similar allegations against UK universities, the Labour government has taken a different tack.

Following the role misinformation played in the July 2024 riots, education secretary Bridget Phillipson said she would launch a review of the schools curriculum in England to embed critical thinking across multiple subjects and arm children against "putrid conspiracy theories" (see Sky News). So far there are few details of the review, but it'll be worth watching to see whether these strong words are matched by strong, evidence-based action.

7 positive steps

Planning to fight misinformation can seem daunting, so where should you start? Below are my top 7 tips for a more effective counter-misinformation approach:

  1. Set your threshold

You can't fight every misinformation attack so decide in advance which key topics or preexisting myths should trigger a response.

Do these topics/myths align with your organisation's strategic priorities? Are they linked to a risk of serious real-world harms? Prioritise those that are potentially most damaging to your organisation/the public. Consider engaging if the false claims are both spreading widely and being shared by influential figures.

  2. Target your audience

I've previously discussed thinking of everyone involved in/exposed to an attack as sitting somewhere along 'The Spectrum of Misinformation'. On this spectrum they can be described as sitting within particular groups with different motivations and vulnerabilities (see Decoding the spectrum).

Deciding which group on this spectrum you are looking to influence and what you hope to achieve can help to shape any counter-misinformation intervention.

For example, if you are tackling a preexisting myth, such as false claims linking vaccines to autism, you might try to contain its onward spread. That might mean targeting those sharing misinformation out of genuine concern (a sub-set of the group I call Believers), providing reassuring evidence of vaccine safety as well as evidence of the harms vaccines prevent, to motivate them not to share.

Alternatively, if dealing with a new misinformation attack, such as the false claim the WHO will enforce national lockdowns, you might want to target those who are already sceptical (described as Sceptics and Rejectors), giving them convincing reasons why it is worth engaging and strong counter-arguments that can help them to rebut the misinformation.

  3. Focus on issues not personalities

It's tempting to tackle high-profile figures, such as politicians or celebrities, who are creating or sharing misinformation head-on, and to call them out. But remember: polarisation is often exactly what misinformers want.

Personal attacks are only likely to fuel extreme views on both sides, creating uncertainty that leaves the usually large proportion of "don't knows" (Neutrals) on an issue unsure who to trust and more vulnerable to misinformation.

Moving swiftly from the inevitable references to what a powerful and influential misinformer said to addressing the genuine concerns of people reading or listening ("this is really about", "the evidence on X shows us") is much more likely to reach beyond those who are already sceptical.

  4. Engage with empathy

Misinformers are experts at incorporating emotive language, framing, and examples into their attacks. Just think how many attacks focus on risks to children or a perceived threat from 'outsiders'.

The attacks are designed to generate the anxiety, fear, and anger that, research tells us, make people more ready to believe fake news.

So it's worth thinking about how, rather than attacking someone's core beliefs, any intervention could be reassuring and neutral, dialling down the emotive rhetoric. How could it demonstrate understanding and empathy for people's fears while guiding them to compelling reasons not to be angry or afraid?

  5. Weaken the attack

One of the key challenges of any intervention is how to warn of a misinformation attack without risking unintentionally spreading the fake news itself. I take an in-depth look at this problem in How weak? The art of weakening attacks.

  6. Show your working

Earlier this year I attended a panel discussion on misinformation with journalists, who identified maintaining public trust as their biggest challenge in this new age of fake news.

How did they try to build trust? One answer was to show their working at every stage when rebutting misinformation or conspiracies, so that any reader could follow the evidence, their reasoning, and their calculations without just taking their word for it.

I think this is an important lesson for anyone tackling misinformation: being honest and transparent about the evidence (and where it's lacking), and about how you concluded that claims are false, is one of the best ways to persuade the ambivalent "don't knows" on an issue to listen.

  7. Highlight consensus

While claims of expert knowledge aren't likely to win people over on their own, there's plenty of research suggesting that providing evidence of scientific consensus on an issue can help people resist attacks.

Perceived scientific agreement or consensus on an issue is thought to act as a 'gateway' or 'bridge' to other beliefs: for example, being persuaded by the consensus that climate change is happening can lead people to accept evidence that it is man-made.

Highlighting consensus is a way of counteracting the common misinformer tactic of seeding uncertainty about an issue (often called "keeping the controversy alive"), which keeps people vulnerable to further attacks.

Missed newsletter #1?

Start from the beginning as Adventures in misinformation reveals what inspired this newsletter, swot up on technical terms with Learning the language, discover what the spectrum is all about in Decoding the spectrum, and explore what Trump's victory could mean for misinformation in US: get ready for the surge.