Learning the language
What do we mean when we call something misinformation? Is that the same as calling it disinformation? And what's prebunking?
Misinformation research comes with its own language, a barrage of technical terms that sound familiar but turn out to be subtle and slippery.
Take misinformation itself. This is defined as incorrect or misleading information. It can be 'fake news' shared for fun or profit (disinformation), or for political ends (propaganda), or it can be shared out of genuine belief.
That 'or misleading' is important. Misinformation doesn't have to be an outright lie; it can instead be something that distorts the facts, misinterprets or misrepresents them.
Disinformation and propaganda are different: they make a judgement about the motives of the person who creates them (for fun or profit, for political ends). But are we always sure of someone's motives?
This is why, as tempting as it can be to call something out as disinformation, I prefer the umbrella term misinformation, a catch-all that makes fewer judgements. But it still raises the question: who decides what is and is not misinformation? Isn't one person's misinformation another's information?
Filter bubble and echo chamber are another pair of technical terms that take a bit of unpacking. A filter bubble is an online environment in which people are only exposed to opinions and information that conform to their existing beliefs, the result of things like personalised search algorithms.
An echo chamber is an online or offline environment that promotes similar views and where diffusion is biased towards the likeminded. But it doesn't have to be the result of a filter bubble. So, an echo chamber could be a 5-a-side football team where likemindedness is created, not by an algorithm, but by a shared interest and dressing room bantz.
Algorithmic hijacking sounds like it belongs in The Matrix but in fact describes people creating entertaining videos on the same topics as conspiracy theorists so that their videos get recommended to those watching extremist content.
A rabbit hole is a particular type of filter bubble in which recommendation algorithms serve up increasingly extreme content over time.
Misinformation virus is an analogy in which misinformation infects individuals and is spread by them, potentially causing an infodemic. Interestingly, recent research suggests that the way misinformation spreads can be described using mathematical models originally designed to simulate the spread of viruses.
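To make that concrete, here's a minimal sketch (my own illustration, not taken from any particular paper) of a standard SIR-style epidemic model repurposed for misinformation: 'susceptible' people haven't seen the claim yet, 'infected' people are actively sharing it, and 'recovered' people have stopped sharing. The parameter values are made up purely for illustration.

# A minimal SIR-style sketch of misinformation spread (illustrative only).
# Assumptions: susceptible = hasn't seen the claim, infected = sharing it,
# recovered = stopped sharing (e.g. after losing interest or seeing a debunk).
def simulate_misinformation(population=10_000, initial_sharers=10,
                            share_rate=0.3, stop_rate=0.1, days=60):
    s = population - initial_sharers  # susceptible: haven't seen the claim
    i = initial_sharers               # infected: currently sharing it
    r = 0                             # recovered: no longer sharing
    history = []
    for day in range(days):
        new_sharers = share_rate * s * i / population  # contact-driven spread
        newly_stopped = stop_rate * i                  # sharers dropping off
        s -= new_sharers
        i += new_sharers - newly_stopped
        r += newly_stopped
        history.append((day, round(s), round(i), round(r)))
    return history

for day, s, i, r in simulate_misinformation()[::10]:
    print(f"day {day:2d}: susceptible={s:5d} sharing={i:5d} stopped={r:5d}")

Run it and you see the familiar epidemic curve: sharing peaks, then dies away as the pool of people who haven't seen the claim dries up.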
Inoculation describes a response that involves not just sharing factual information but forewarning of future misinformation attacks by exposing people to an attack in a weakened form and refuting it with strong counter-arguments. Check out some resources and fun games about inoculation science.
Prebunking refers to the part of an inoculation approach that exposes people to weakened doses of the persuasive attack. Rather as a live vaccine contains a weakened form of a virus, it's important to weaken the misinformation attack before you expose people to it; otherwise you risk spreading the very misinformation you are trying to contain.
Truth sandwich is a method of forewarning about a misinformation attack while reducing the risk of spreading it. The idea is to break the intervention into a beginning, middle and end: you open with facts and expert sources, in the middle you share the attack in a weakened form, and at the end of the sandwich you finish with facts and credible alternative explanations.
Misinformer, a term I've taken a shine to, means someone who provides misleading or false information; in this specific context it's a more concise way of referring to a generator of misinformation.
There are many more technical terms to learn, but when I started out reading misinformation research these stood out to me as being either particularly useful or needing some unpacking before they made sense.