Event highlights + News/Analysis

A pink mail icon with the number 13 in a box against a spotlit blue background.

The Spectrum of Misinformation returns with misinformation news, analysis & practical advice for communicators.

In News digest, experts reject Trump's paracetamol claims, Charlie Kirk shooting conspiracies, a call to tackle UK health misinformation, YouTube to reinstate misinformers, the fake news mogul funding the far-right, and 'Big Meat' plans to attack climate and health research. Plus: can D&D play a role in fighting misinformation? And did Trump just post a deepfake of <checks notes> himself? See how UK ministers are entering the fray in Politicians, assemble! Then catch up on an eventful September in Event highlights and relive past glories in Rewind: all about August.

News digest

The Express, The Guardian & BBC were among many outlets to feature experts and health authorities rejecting Trump's unsubstantiated claim that taking Tylenol (paracetamol) in pregnancy causes autism in children. Health Secretary Wes Streeting said: "I trust doctors over President Trump, frankly, on this." (see Politicians, assemble!)

On LBC, Nigel Farage refused to dispute Trump's claim, saying he had "no idea" but would not side with experts: “I don’t side with anybody because science is never settled, and we should remember that.”

The Guardian reports the GMC is investigating a cardiologist who linked COVID-19 vaccines to cancer in a Reform conference speech. Medical bodies say his 'pseudo-science' remarks undermine trust in doctors.

AP, CNN, Reuters and other outlets detail false claims and conspiracy theories following the shooting of US influencer Charlie Kirk, including fake photos, misidentification of the attacker, and bogus AI answers.

The Independent covers LSHTM experts raising concerns that false and misleading health advice from social media and AI tools is putting lives at risk and will lead to a sicker UK. LSHTM has launched a call for a network to fight dangerous health misinformation in the UK.

Forbes reports YouTube is planning to reinstate users previously banned for spreading misinformation on the platform.

The Observer reveals a US billionaire who backs fake news sites spreading anti-Muslim conspiracy theories funded Tommy Robinson.

According to The Guardian, Meta whistleblower Sarah Wynn-Williams faces heavy fines due to a gagging order relating to Careless People, her book about her time at Facebook (read my review).

Science reports a Burkina Faso project testing if GM mosquitoes can combat malaria has been shut down by the government. It follows false claims the mosquitoes are used to spread disease and sterilise people.

BBC highlights how Russian disinformation and a fake news network may have disrupted elections in Moldova.

The Grocer uncovers a plan by the global meat industry to launch misinformation attacks on the Eat-Lancet 2 study which examines the impact of meat and dairy on health and climate.

Dungeons & Dragons and escape rooms could be used to teach people how to resist misinformation, Professor Jin Ha Lee tells KUOW.

Gavi explores how, in Burundi, Muslim leaders are playing a vital role in educating communities about the benefits of vaccination.

Daily Beast covers Trump posting then deleting an apparently AI-generated video of himself promoting the QAnon 'medbeds' conspiracy theory of futuristic pods that can cure disease and regrow limbs.

Politicians, assemble!

Have UK politicians finally woken up to the threat of misinformation?

The last few weeks suggest that perhaps they have, with the Secretaries of State for Culture, Health, and Education all speaking out.

Lisa Nandy (Culture) was first out of the blocks, telling the Royal Television Society Cambridge Convention:

We support OFCOM’s recommendation that Public Service Media content should be prominent on major video sharing platforms and on fair commercial terms. If we need to regulate we will... we will act to empower audiences... [so] they can distinguish between news and polemic, and misleading or false content.

Dawn Alford of the Society of Editors welcomed the commitment: “Amid a plethora of fake news, conspiracy theories and disinformation available online, the vital necessity of accurate and factual news and information across all media cannot be under-estimated.”

Next up was Wes Streeting (Health), who said people should ignore Trump's unsubstantiated claims linking paracetamol use in pregnancy with autism in children. He told ITV's Lorraine:

there is no evidence to link the use of paracetamol by pregnant women to autism in their children... a major study was done back in 2024 in Sweden, involving 2.4 million children, and it did not uphold those claims... don’t even take my word for it, as a politician – listen to British doctors, British scientists, the NHS.

Bridget Phillipson (Education) criticised Farage for failing to repudiate Trump's claims, saying she'd taken paracetamol during pregnancy and "to scaremonger in that way, and to scare women and put lives at risk, I think is really dangerous".

She also took aim at Reform platforming an antivax cardiologist at their recent party conference:

Vaccinations have saved the lives of millions of people. They’re [Reform] prepared to go to those fringes and we’ve got to be firmer as a party in taking them on.

Is the UK Government a convert to fighting misinformation?

The comments are certainly encouraging but it remains to be seen whether they will be backed up with action:

Will the government fund media and digital literacy initiatives to teach people how to resist misinformation?

Are they prepared to take on tech billionaires whose algorithms amplify harmful misinformation on social media and whose AI chatbots hallucinate clinical details and give dangerous advice?

It's worth remembering that a year ago, in the wake of extremist violence around the Southport stabbings, Phillipson promised to embed critical thinking in the English school curriculum and help arm children against “putrid conspiracy theories”. We're still waiting...

Event highlights

On 15 September 2025 over 1,000 people attended LSHTM's Health Misinformation UNPACKED event in-person or online.

In a series of short talks and panels, academic experts, journalists, and communications professionals discussed the threat of harmful health misinformation and potential solutions.

If you missed it, you can watch a recording of the full event on YouTube, while below I reflect on some of the key themes:

Communicating uncertainty

The problem of communicating uncertainty crops up regularly in discussions of how to talk about scientific evidence.

It's particularly relevant to misinformation as uncertainty is something misinformers play on, sowing doubt with tactics like 'keeping the controversy alive', undermining the idea of scientific consensus on an issue, or producing fake or fringe experts to give an alternative view.

Experts providing false certainty on a topic can backfire, as we've seen in the fallout from the COVID-19 pandemic. As Chris Whitty and other experts touched on, we should always leave space for genuine questions and doubts, and engaging with these honestly and empathetically is key.

I'd add to this that recognising and communicating not just uncertainty but unknowns is vital to building trust.

One of the problems is that scientists are much more comfortable with uncertainty and unknowns than the rest of us. In science, realising you don't know something, or that the evidence has changed and so your model should too, can lead to new ideas and discoveries. But when you are talking about trusting that the injection in your arm or the pill you swallow is safe, uncertainty seems like a bad thing.

So what to do? Ditching false certainty is a must. But communicating uncertainty - like discussing a misinformation attack - comes with risks.

I think embedding information on uncertainty and unknowns within what we do know based on the evidence, qualifying and corralling that uncertainty with informative, reassuring context, is the best way to manage these risks.

For example, in any research story I would always look to mention the most important limitations of the research, what it didn't study or couldn't show or shouldn't be taken as evidence of.

Similarly, alongside describing the benefits of any new treatment, talking about downsides or side effects in a measured and honest way early on can protect against misinformation later.

Think global, national, regional

Something I was glad to see Rabiu Alhassan highlight is how, even though misinformation is global, interventions must always be adapted to the national (or regional) context.

Rabiu discussed his work at GhanaFact and FactSpace West Africa. He explained how, because of political, religious and cultural differences across Africa, any 'one size fits all' approach to misinformation won't work and interventions must be adapted to each country.

I think about this problem a lot, particularly how it relates to the UK, with health being devolved across England, Wales, Scotland and Northern Ireland, each with a very different political and cultural context. There are also important regional differences within England - for instance, very different demographics and party political leanings between the South West and the Midlands.

Trust & common sense

Trusted sources of information, and how people come to trust them, was a theme touched on throughout the talks and panels.

You can have brilliant public health messages, but if these are delivered by individuals or organisations that the audiences you are trying to reach don't trust, they're unlikely to do anything to counteract misinformation.

Chris Whitty shared his view that while the misinformers spreading false claims for profit or political gain should be directly called out, the UK 'general public are very sensible'.*

While I think this is (generally) true, I worry it risks complacency or plays into a trope of British exceptionalism.

Were the US public similarly 'sensible' before the US media and information environment was poisoned by misinformers?

Can we rely on Britons who (generally) reacted in a sensible way to COVID-19 measures to react in the same way to the next pandemic?

It made me think back to the controversy around animal research in the early 2000s, where false narratives became established in UK public discourse and, once ingrained, took an enormous, civil society-wide effort to dislodge. And British audiences who may be sensible about public health may not be about other issues such as migration, where the UK political narrative has become so mired in misinformation it may be all but impossible to move on.

Closed algorithms, closed spaces

Another theme that came up in Rabiu's talk and comments by Heidi Larson and Simon Piatek is the invisible side of misinformation.

Rabiu talked about using AI tools to respond to misinformation on both WhatsApp and via text messages - examples of closed spaces where misinformation can spread undetected.

Later, Heidi and Simon commented on how the move away from internet searches or conversations on public platforms like X, towards using AI chatbots sees these conversations disappear into a closed private space. This is particularly concerning as young people turn to chatbots for advice, sometimes with tragic consequences.

It compounds many of the long-standing problems with invisible algorithms - for example on platforms such as TikTok and YouTube - bombarding people with ever more extreme (often AI-driven) content.

Whatever we may think of AI, understanding how it can share or generate misinformation is critical. As the panellists commented, it's essential that we think about what guardrails and regulation are needed now, before it's too late.

Get creative & fun!

On social media, where misinformers spread their claims in memes and entertaining videos, worthy but dull content is unlikely to cut it.

But as Diya Banerjee and Sander van der Linden explained, it's possible to use humour and satire to fight back.

In her talk, Diya highlighted how dialogue, empathy, and engagement on socials, as well as working with influencers and fandoms, can help organisations reach audiences unlikely to interact with conventional public health content.

Sander mentioned his recent collaboration with the WHO for their Plot Twist series. In this short video highlighting misinformers' manipulation tactics, he sends himself up, joking about brainwashing and trolling himself in a way that makes it much harder for anyone to troll back.

Another approach is to turn resisting misinformation into a fun game.

This is exactly what a team including Sander has done with BadVaxx, where you get to play as both an antivax misinformer and someone attempting to counter a range of tactics used to spread misinformation.

As an organiser of the event I'm undoubtedly biased, but there's so much more to unpack. I found lots to agree with but also points I'd disagree on across the fascinating talks and panel sessions featuring a range of experts: watch on YouTube and see for yourself!

*I noted Chris Whitty's slide on vaccination rates in England stated MMR coverage at 5 years was 91.9% [2025 data]. While this sounds good, it's important to note this stat is for one dose, when two doses of MMR are needed to protect children from measles; the same data show only 83.7% of under-5s in England have had their two doses of MMR, well below the WHO 95% target. See Measles in the UK.

Rewind: all about August

Last month I discussed why misinformers need scapegoats plus recommended reading and all the latest news + analysis.