The trust gap
Welcome to The Spectrum of Misinformation newsletter: bringing you misinformation news, analysis & practical advice for communicators.
In News digest, read up on misinformation and grooming gangs, a medic's TikTok worries, AI fake victims, failures on conspiracy theories and extremism, WhatsApp fake-busting tools, and Finns' circle of trust. There's an analysis of Meta's move into a Fact-check free zone, a deep dive into The media and misinformation, and advice on how to become a trusted source in Bridging the trust gap.
News digest
BBC reports ACT on IICSA warned against politicising sexual violence and "the troubling trend of misinformation that undermines the true scale of the crisis and the pressing need for reform", as Conservatives and Reform call for a new UK inquiry into grooming gangs.
AP quotes Keir Starmer condemning "lies and misinformation" after Elon Musk attacked his handling of child grooming scandals.
BBC and many other outlets report Meta will get rid of independent fact-checkers on Instagram and Facebook in the US, leading to fears of a flood of disinformation and harmful content.
The Guardian covers a UK doctor's concerns that TikTok influencers and podcasters are not just advocating alternative medicines but discouraging patients from taking conventional ones.
Newsweek highlights how an AI image was used to create a false identity for a woman set on fire on the New York subway with the hoax shared on X, Reddit, and other websites.
The Observer carries a warning from a former UK counter-extremism czar that not enough is being done to fight disinformation and conspiracies that fuelled 2024's summer riots.
Mashable reports WhatsApp has teamed up with Google on a reverse image search to help people identify if an image has been edited, manipulated or used out of context.
RTE investigates how Finland's national media literacy policy involves teaching children and adults to spot disinformation, with its success attributed to Finns' trust in authorities and the media.
Fact-check free zone
The news that Meta is abandoning independent fact-checkers on Instagram and Facebook (see The Guardian) is hardly surprising.
It follows Elon Musk making savage cuts to moderation and safety teams as soon as he bought X, and Trump criticising moderation and fact-checking as biased and an attack on free speech.
But what the move distracts from is that free speech, in the sense of having a democratic right not just to speak but to be heard, has always been largely an illusion on social media platforms.
As this BBC piece from last year explores, those running these platforms are constantly changing the algorithms that invisibly control who we see and who sees us.
Far from setting information free, Meta's move demonstrates the power platform owners can have over user experience, flooding our feeds with misinformation and harmful content or radically altering who engages with our posts overnight.
But is this what users (and advertisers) want?
The ongoing X-odus suggests not: X is estimated to have lost nearly one fifth of its daily active user base, with its revenue dropping by 40%.
Is letting the misinformers run riot really good for business? Especially when rivals like Bluesky offer a less fake/hate-filled experience?
It's notable that these changes are only in the US, with Meta saying it has "no immediate plans" to get rid of its third-party fact-checkers in the UK or the EU. The UK has its own protections under the new Online Safety Act (although how much protection this offers remains to be seen), as does the EU, with Zuckerberg railing against Europe as a place with "an ever-increasing number of laws institutionalising censorship and making it difficult to build anything innovative".
In fact, many countries have different preferences for social media platforms than the US, and different rules to regulate them. What's clear is that the US is set on running an experiment to find out what happens when you let the misinformation virus spread unchecked.
The media & misinformation
Friend or foe? A new feature-length article surveys the challenges and opportunities of working with the media to fight fake news.
Find out why you need to pick your battles, do your homework, and how the real truth being out there is a seductive story for any journalist in When to engage: the media and misinformation.
Bridging the trust gap
Digging into misinformation problems unearths a lot of hard questions.
I think one of the most important is: who trusts you and why?
As a comms professional, I've always wanted to believe my work has enhanced the trustworthy image of the institutions I've worked for. But I realise now I never thought enough about the audiences I was reaching and whether, especially in areas such as science, health and the environment, I was just "preaching to the converted": our own echo chamber of supporters and sceptics (see Decoding the spectrum).
Surveys in the UK and US suggest public trust in experts such as scientists and doctors has remained consistently high for decades. But I worry this overall picture papers over deep cracks in the monolithic notion of trust, opened up in no small part by the increasing politicisation of science and medicine.
In a 2024 YouGov UK poll, 64% of Reform voters said they trusted scientists to tell the truth, compared to 92% of Labour voters. For family doctors the trust gap was smaller, but still significant (Reform 77%, Labour 91%). Conservative voters fell between these poles.
A 2024 Pew Research Center survey found a similar pattern in the US, with 66% of Republicans trusting scientists to act in the public interest as opposed to 88% of Democrats (perhaps no surprise given so many Republican politicians are misinformers).
Remember, polarisation (not persuasion) is often the aim of misinformation attacks. Fostering distrust is a key element of this, with research suggesting that those who don't know what to believe (neutrals) are more likely to fall for misinformation (or at least not resist it), giving misinformers the power they crave.
While the media is not always an ally in the fight against fake news, I've heard from journalists that they see maintaining public trust in this new age of misinformation as their biggest challenge.
I'd argue this is also the big challenge for institutions: to be seen as a trusted source across boundaries of politics, class, and ethnicity. Without this, large sections of society will ignore factual interventions and continue to believe damaging misinformation.
So how can you become a trusted source to those who don't naturally trust you? Who see you as politically biased or "not on their side"?
I think, if it's to be effective, building trust needs to be an underpinning goal of any institutional counter-misinformation strategy.
This goal should inform the framing and messaging of all interventions, and a good place to start is with the 7 positive steps I've already suggested.
Instead of seeing the response to each specific misinformation attack as a win/lose moment, it's about playing a longer game. Rather than being a knee-jerk response, each intervention should be created as part of a framework that demonstrates the consistency, honesty, and independence of messages from both experts and institutions.
This approach requires emotional intelligence, not just scientific rigour.
It may involve reaching out to influencers within political groups or communities that aren't naturally inclined to trust you (or authorities in general) to see if they are willing to endorse or amplify your messages.
This doesn't mean pandering to extreme or dangerous views; it's about being mindful of the genuine concerns or traumas that may lie behind misinformation belief, for example by using more neutral framing for interventions ("discover the facts about", "find out what most scientists think") that can draw people in.
Misinformers don't care about people's concerns and will switch claims or topics if they think it can get them the clout/money/influence they want. Over time, by demonstrating to those vulnerable to misinformation (believers, neutrals) that your experts and your institution care, I think it's possible to build the trust that's essential for effective counter-misinformation work.
What did you miss in 2024?
Find out in my review of the year with a 12-month trawl of the misinformation news headlines and reflections on how last year's stories can help us figure out what to expect in 2025.