Pembroke College Cambridge

Pembroke People: Inoculating Against Misinformation, Melisa Basol

Melisa Basol (2017) is a PhD student at Pembroke and a Cambridge Gates Scholar.

Melisa is currently in the first year of her PhD in psychology, focusing on social psychology, and specifically on persuasion and attitudinal resistance to it. She is part of Dr Sander van der Linden’s Social Decision-Making Lab. The lab, in collaboration with DROG, developed the Bad News game to provide a psychological ‘vaccine’ against fake news by showing players how fake news is created and spread. In today’s blog Melisa discusses why it’s so important to ‘pre-bunk’ misinformation, the challenges of contemporary social technologies, and why psychology is just as important as technological solutions.

What brought you to Cambridge?

I came here for my MPhil, and I applied to work with my supervisor Dr Sander van der Linden, an exceptional mentor and researcher in social psychology. I was also drawn to his conviction, shared by the University of Cambridge more generally, that research should be used to overcome societal challenges; that really resonated with what I wanted to do with psychology.

What makes studying misinformation so important?

Anyone who looks back at our history will realise that deceptive strategies and propaganda are nothing new. The challenge isn’t new, but the environment in which we experience information is, thanks to the advancement of the internet and technology in general. Algorithms, filter bubbles and other tailored domains expose us to what we want to hear. We are often surrounded by like-minded people and our side of the truth, leaving people with polarised and skewed ideas of reality. Technology just reinforces and accelerates the psychological mechanisms that allow this type of information-processing to begin with. There was a report showing that misinformation travels further and faster than any other form of information. So even if we fact-check, retract false claims and debunk things, we will never keep up with the speed at which misinformation spreads and manifests itself.

Scientists have also found that misinformation is normally spread by humans rather than bots, even though we tend to think it’s some sort of automated process. It’s really the people. And several surveys within the UK and across the world show there’s a huge discrepancy between how well people think they can differentiate between true and false news and how accurately they really can. That combination led us to ask: if we follow the principle of ‘stealing thunder’ and pre-emptively expose people to misinformation, can we play offence rather than defence? Can we tackle misinformation before it spreads, before people click the share button and base important (societal) decisions on it?

Hence the term inoculation. How do you inoculate against misinformation?

There are different initiatives and different tools across the world, all conducting really exciting research. The work that we do is based on the theoretical foundations of inoculation theory, developed by William McGuire, a researcher at Yale in the 1960s who was trying to understand propaganda during the Cold War. Very much like the biological analogy of immunisation, it suggests that just as injecting a weakened virus triggers antibody production, injecting a weakened dose of a persuasive, deceptive argument can build attitudinal resistance to future attacks.

Together with DROG in the Netherlands, the Social Decision-Making Lab developed this game called Bad News. It’s freely accessible; anyone can play it. It’s a choice-based game that implements the core foundations of inoculation theory. In the game we’ve implemented the six most commonly used strategies in the spread of misinformation (e.g. polarisation and impersonation). The player is encouraged to drop all ethical pretences and walk a mile in the shoes of a fake news tycoon. By being challenged to keep their followers and credibility high, players learn these deceptive strategies in a politically neutral space. You can take it in either direction; it’s a playful environment where you’re shown strategies that differing political ideologies might use.
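To make that mechanic concrete, here is a minimal sketch in Python of a choice-based loop of this kind. It is not the actual Bad News implementation: the scenario text, choice wording and meter values are invented for illustration, and only the two meters mentioned above, followers and credibility, come from the interview.

```python
# Minimal sketch of a choice-based "fake news tycoon" loop, in the
# spirit of Bad News. All scenario text, choices and meter deltas
# below are hypothetical illustrations, not the real game's content.

FOLLOWERS, CREDIBILITY = "followers", "credibility"

# Each scenario names one deceptive strategy and offers choices that
# move the two meters the player has to keep high.
SCENARIOS = [
    {
        "strategy": "impersonation",
        "prompt": "Your account is small. How do you grow it?",
        "choices": {
            "Impersonate a news outlet": {FOLLOWERS: +200, CREDIBILITY: -5},
            "Post under your own name": {FOLLOWERS: +10, CREDIBILITY: +5},
        },
    },
    {
        "strategy": "polarisation",
        "prompt": "A local story is trending. What's your angle?",
        "choices": {
            "Frame it as 'us vs. them'": {FOLLOWERS: +300, CREDIBILITY: -10},
            "Report it neutrally": {FOLLOWERS: +20, CREDIBILITY: +5},
        },
    },
]

def play() -> None:
    meters = {FOLLOWERS: 0, CREDIBILITY: 50}
    for scenario in SCENARIOS:
        print(f"\n[{scenario['strategy']}] {scenario['prompt']}")
        options = list(scenario["choices"])
        for i, option in enumerate(options, start=1):
            print(f"  {i}. {option}")
        pick = options[int(input("Choose an option number: ")) - 1]
        for meter, delta in scenario["choices"][pick].items():
            meters[meter] += delta
        print(f"Followers: {meters[FOLLOWERS]}, Credibility: {meters[CREDIBILITY]}")
        if meters[CREDIBILITY] <= 0:
            print("Your credibility has collapsed. Game over.")
            return
    print("You've now seen the strategies from the inside: that's the inoculation.")

if __name__ == "__main__":
    play()
```

The design point is that the player is rewarded for recognising and deploying each strategy, which is what exposes its mechanics in a weakened, playful dose.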

We found that after playing the game, players could more reliably detect misinformation. This held up across ideologies, gender and educational background. DROG and the lab are continuing to extend those theoretical boundaries by conducting research and implementing findings around the role of social consensus, decay, and so on. The game itself is a very flexible research tool that we can easily adjust, localise and contextualise to whatever problem we’re dealing with. There’s a Brexit version that people can play in the British Design Museum at the moment, for example, and the game has been translated into 14 languages. It’s out there doing its work, and our findings so far are very promising.

So it’s the booster shots and herd immunity now?

Exactly! That is the overall goal. Can we outpace the spread of misinformation? Can we build enough resistance that even if a family member shares something with you, you won’t share it? If so, what is the longevity of that resistance and what kind of boosters are needed to stabilise resistance?

We tend to see fake news as a technological problem, but at its heart it’s about what people believe.

I think we do need a multi-layered defence against the spread of misinformation. I don’t think the debunking approach of moderating content, flagging it, fact-checking and retracting things is enough by itself, though ethically it’s an imperative step. What we find psychologically is that the continued influence effect blurs the line between myth and truth: continuously being exposed to an idea, whether true or false, makes you more likely to believe it in the future.

Once misinformation has manifested itself, it’s very difficult to dismantle, which is why we’re trying to push the idea that no media literacy programme in schools is going to solve the problem in its entirety, especially because a lot of people sharing misinformation are not in school and won’t be any time soon. I think tackling it at an individual level is important. How can we disrupt the automatic and habitual processing of (mis)information? Can we incentivise “inoculated” individuals to spread the resistance? Can we outpace the spread of misinformation and build herd immunity? I think pre-bunking is an important step, and more and more bodies and institutions are recognising it.

And governments can be limited in what they can do.

It’s a fine line between freedom of speech and censorship. We’re doing work with WhatsApp, who are facing a few difficulties in India with the spread of misinformation on their platform. The new challenge with apps like these is that content moderation isn’t necessarily possible or an option: messages are encrypted, so only the sender and receiver can read them, and it is very difficult to tease out who is spreading misinformation. In a lot of developing countries, like India, WhatsApp is the primary social network, meaning that you can go on any website and find a group aligned with your interest, whether that’s puppies or a political candidate. You join a group of hundreds of people you’ve never met, and you’re exposed to a very intense and isolated group of like-minded people. So it is important to understand the psychological motives behind spreading misinformation. WhatsApp has tried to deal with this problem by limiting the number of times you can forward a message and the sizes of the groups you can be part of, but they found that doesn’t stop the problem; people can still be part of five groups rather than one. We’re working with them to develop a WhatsApp version of the game that fits the Indian context.

It’s really important we push for pre-bunking. If we follow the inoculation analogy, the virus might continue to develop; new forms of misinformation might emerge. We hope that the ongoing work at the Social Decision-Making Lab will help keep up with this “virus”.

And has being a Gates Scholar been helpful for you?

Incredibly so. Although Cambridge is already filled with diverse, intellectual, terrifyingly accomplished people, the Gates cohort is distinctively driven by an urge to use their research for social good. That perspective and common purpose make for a great atmosphere to be around. Most of my closest friends are Gates Scholars, and the side projects I get involved with, like Laugh 4 Change, come through other Gates Scholars. It’s an incredible honour and privilege, and I don’t think I’ll ever feel worthy of it, but it’s a great and inspirational environment to be part of.
