1/5: Generators

Who generates fake news? Why do they do it? And do their motives matter?
In the first of five articles, I'll be investigating Generators: the people who unleash a fake news storm, then sit back and watch the chaos unfold...
Recap: decoding the spectrum
When I started researching misinformation, I was intrigued to find that the people affected were a diverse group, and that their attitudes could shift both from issue to issue and over time.
By applying the communications approach of segmenting your audience and choosing a target audience for potential interventions, I came up with 'The Spectrum of Misinformation': the idea that you could place anyone somewhere on the spectrum below:

Generators create fake news for fun, profit or politics; Believers share misinformation either out of genuine belief or because it fits their agenda; Neutrals are the 'don't knows' who may be persuaded to share misinformation on a given issue; Sceptics check claims against the facts and may be encouraged to challenge them; and Rejectors can't understand why anyone believes misinformation at all.
State-sponsored misinformers
Of all the groups on the spectrum, Generators might seem the most straightforward. If misinformation is 'bad' then so are they; there are no 'good' Generators, right?
In February 2022, during the Russian invasion of Ukraine, stories circulated about a mysterious Ukrainian fighter pilot said to have shot down up to 40 Russian planes. Dubbed 'The Ghost of Kyiv', the pilot later turned out to be a morale-boosting fiction created to inspire Ukrainians.
Wartime fake news like this is nothing new: think of the Blitz-era myth that carrots helped RAF pilots see in the dark (it was really radar). If it benefits your side, this kind of propaganda (misinformation for political ends) can be seen as justifiable and 'for the greater good'.
But propaganda is rarely so benign.
For example, in 2024 polls published by journalist Lin Hsien-yuan showing a Beijing-friendly candidate leading Taiwan's presidential election caused uproar before being exposed as entirely fake.
Whether he was motivated by politics, money, or both, the intent seems clear: not to unite but to destabilise and disrupt. The polls coincided with a Facebook campaign featuring false claims of poisoned pork, an egg shortage, and US-backed bioweapons, leading Taiwan's Premier to state: "China has been actively waging cognitive warfare against Taiwan through disinformation."
The war in Ukraine has seen Russian disinformation campaigns, previously deployed against Finland and the Baltics, reach a new level of intensity.
In one case in 2023 the BBC identified over 800 fake accounts on TikTok spreading false claims about Ukrainian Defence Minister Oleksiy Reznikov. These and other accounts, which appeared to have links to Russia, spread corruption allegations about leading Ukrainian officials in an apparent attempt to undermine the Ukrainian public's trust in its leaders and discourage support from the West.
In a piece for RUSI, Dominic Presl argued that Russia is winning the global disinformation war, seeding its misinformation through well-established networks everywhere from Europe to the francophone Sahel and Central Africa, Latin America and the Middle East.
Unlike Lin Hsien-yuan, the creators of most state-sponsored misinformation remain elusive figures. Generators with the resources and backing of states behind them usually have little to fear and no motivation to stop.
Will misinform for money
Monetising their misinformation is a key goal for most Generators. Whether they appear to care about health, politics, conflict, or individual freedoms, the profit motive is always there.
In 2023 an Oxford Internet Institute analysis found 85% of anti-vax websites had a monetisation strategy. The most popular were donations (58%), sales of information products (41%), merchandise (31%), and advertising (22%).
The Oxford study suggests anti-vaxers use three models of monetisation: online celebrity, radical social movement, and junk news. Each enables them to capitalise on one or more of the strategies above, while an ecosystem of linked misinformation websites lets them explore every avenue of exploitation.
Despite advertising coming low down the Oxford study's list, there's plenty of evidence that misinformation websites benefit from it. For example, a 2020 study by Judit Szakács estimated that ad revenue from Czech and Slovak misinformation sites totalled up to €1.27m.
Meanwhile, investigations by the Global Disinformation Index found that Google and other ad technologies meant big brands were unwittingly funnelling money into misinformation sites across Europe; in 2021, for example, an estimated $12m went to 56 Spanish-language COVID-19 disinformation sites.
Musk's changes to X mean it now also rewards creators of attention-grabbing misinformation, with some influencers able to earn up to $3,000 a month from paid subscribers for spreading fake news about conflict in the Middle East, topped up with extra money from X's ad revenue sharing scheme.
Even if misinforming starts as pure mischief, once it succeeds the opportunities for exploitation become too tempting to resist.
Money remains a powerful incentive for Generators. It's a major reason why interventions trying to persuade them to give up their fake news ways are likely to fail: they have too much to gain from continuing to peddle misinformation.
Misinformation = power
One of the curious things about misinformation is that, while it is everywhere, there are very few Generators; most people sit elsewhere on the spectrum.
For example, the Center for Countering Digital Hate (CCDH) identified the 'Disinformation Dozen' as responsible for 65% of anti-vaccine content on Facebook and Twitter, according to an analysis of 812,000 posts (1 February - 16 March 2021).
It helps if you have your own social media platform: another CCDH report found at least 87 of Elon Musk's X posts in 2024 carried false or misleading claims about the US election and achieved over 2 billion views.
Politicians and celebrities (including some journalists) seem particularly vulnerable to the allure of becoming Generators, perhaps because they have a ready-made audience, or are afraid of losing the one they have.
For instance, CNN analysed 20 false claims Trump made in his inauguration address alone, while The Guardian documented how Nigel Farage has consistently used 'cover-up' conspiracy theory rhetoric in relation to the Southport stabbings.
The Guardian highlights RFK Jr as a politician who has harnessed the ideas of diagonalism, uniting groups from across the political spectrum around the suspicion that all power is conspiratorial, to manoeuvre his way to the centre of power.
The piece quotes Naomi Klein: "Despite claims of post-partisanship, it is right-wing, often far-right, political parties around the world that have managed to absorb the unruly passions and energy of diagonalism... folding its Covid-era grievances into preexisting projects opposing ‘wokeness’ and drumming up fears of migrant ‘invasions’."
It highlights one of the key tactics of political misinformers: 'flooding the zone' with false and misleading claims to see what gains traction with their audience. While claiming to be driven by ideology, the approach is in fact deeply cynical and opportunistic, with real issues and victims abandoned if they don't sufficiently mobilise the Generator's power base or antagonise opponents.
Polarisation, rather than persuasion, is often the aim.
But if you think only rightwing politicians are Generators, you'd be wrong: Labour's Rachel Reeves has been widely criticised for repeating misinformation about a £100m bat tunnel as an example of nature rules blocking housebuilding, leaving out key context about a decade of HS2 planning failures.
Remember, misinformation = misleading as well as false information.
It's also worth noting that, with the rise of clickbait and outrage columns, mainstream media outlets now regularly promote misinformation.
There have always been celebrity misinformers, of course, but social media and the decline of the gatekeeping role of traditional media has made it even easier for the powerful and famous to use misinformation to attract a cult following.
Ultimately, Generators want to generate Believers to share their claims and attract a large enough audience to give them the clout and money they crave.
What can we do about Generators?
Unfortunately, as mentioned, the incentives of money and power are so strong that communications interventions that might work for other groups are very unlikely to persuade Generators to stop.
But analysing their motivations is still useful: by knowing what they want, it's possible to devise strategies that limit the damage they can cause.
For instance, structural interventions, such as technological and legal measures adopted by governments, can curb their ability to reach and grow their audience. Crucially, they can also clamp down on their ability to monetise their misinformation through donations, advertising, and sales of snake-oil cures, depriving them of the revenue streams that fund their networks.
Another approach is to expose their manipulation tactics through improved media literacy and education, and so motivate people to resist.
It is also worth looking at who Generators routinely attack: fact-checkers, the media, universities; in short, all those who subject their claims to independent external scrutiny. Beyond direct interventions, a commitment to strengthening civil society is one of our best defences against misinformation.
Overall, a policy of containment is likely to be the only effective approach to Generators intent on distracting, dividing, and dominating us.