Research by The Alan Turing Institute reveals who is most likely to believe online misinformation – and what can be done to tackle the problem. This research was funded by the Health Foundation. Dr Bertie Vidgen, a Research Fellow from The Alan Turing Institute, discusses the findings in this blog.

False information about COVID-19 has been described as an ‘infodemic’ by the WHO Director-General, and there are concerns that it can lead people to ignore official guidance, avoid getting vaccinated or even to use harmful ‘miracle’ cures. Numerous projects have been established over the past year to find, flag and monitor misleading online health-related content, but far less research has investigated who believes the content in the first place, and why.

In our project at The Alan Turing Institute, funded by the Health Foundation, we used a mix of surveys and assessments to address this gap. Identifying who is most vulnerable to misinformation is crucial for building a deeper understanding of the problem, and for developing more targeted, effective interventions to tackle its root causes. Otherwise, we risk deploying overly draconian, broad and restrictive policies to address misinformation, such as banning content from certain websites. Through our research, we hope to contribute to ongoing debates among policymakers and regulators about how best to tackle the problem.

In the study, we asked a panel of 1,700 people, representative of the UK in terms of age and gender, to complete a detailed survey about their personal background, outlook and experiences, and tested their personality, cognitive skills and different literacies. We then presented them with various headline-style health-related claims about COVID-19, measuring their vulnerability to misinformation based on how accurately they assessed the claims. Some of these claims were true (eg ‘COVID-19 can spread through the air’) and some false (eg ‘the COVID-19 virus can be treated by drinking lemonade’).

Our results show that individuals with lower digital literacy, numerical literacy, health literacy and cognitive skills are worse at assessing the veracity of health-related statements. Unexpectedly, most sociodemographic, socioeconomic and political factors made little or no difference. These are important results as they mean that developing people’s cognitive skills and literacies could make a big difference to their ability to identify misinformation. Improving these skills would also have benefits beyond tackling misinformation; digital literacy in particular (the ability to use digital technologies to find, evaluate and communicate information) is, research shows, crucial to navigating modern life.

The results of our study led us to three recommendations, as follows.

1. Digital literacy should be explored as a powerful tool for combating misinformation

In our study, we demonstrated the power of misinformation by asking people to assess the same claims, but preceded by different types of related content. As expected, those who had been shown true content before they assessed the claim fared better than those who had been shown false content. However, surprisingly, giving participants warnings about misinformation before they made their assessments had only a very small impact on their performance.

2. New strategies are urgently needed to communicate the severity of misinformation to the public

We found that individuals differ greatly in their knowledge of health, but that almost everyone has ‘room to improve’. Government policies should aim to enable people to better recognise health-related misinformation, and encourage them to scrutinise content that they are unsure about. While public discourse is often focused on reducing the supply of misinformation (an admittedly important way to tackle the problem), our research draws attention to reducing people’s vulnerability.

3. We must address the factors that make people susceptible to misinformation

In our project, we have started to identify the factors that make people susceptible, but there needs to be far more research if we are to stem the flow of misinformation online. The health, social and economic consequences of COVID-19 are already devastating – we need to minimise the potential of misinformation to make them even worse, and plan for how to limit the effects of misinformation in future public health crises.

Read our full research report here, and if you have any questions, please contact me.

Dr Bertie Vidgen is a Research Fellow in Online Harms at The Alan Turing Institute.

This blog was originally published on the website of The Alan Turing Institute.
