The Health Foundation is working with the FrameWorks Institute to design more effective ways of talking about the wider determinants of health. The aim of this work is to expand the way people think about health and build support for policies that create healthier lives.
In 2019, FrameWorks completed the first phase of this project, exploring patterns in how the UK public thinks about health. The research found people typically define health in terms of the absence of illness, primarily relate health to medicine and health care, and believe individual willpower and choice are key determinants. Now, during the unprecedented challenge of COVID-19, illness and access to health care are more likely than ever to be at the forefront of people’s minds.
The social, economic, and environmental conditions in which people live have powerful effects in shaping health over the longer term. These factors (known as the ‘wider determinants of health’) are much less prominent in public thinking. As a result, public health professionals often find it hard to communicate and garner support for the prevention and health-creation policies that could have the most significant impact on improving population health.
Experts from FrameWorks recommended several communications principles to help public health advocates deliver more effective and memorable messages. These included using step-by-step explanations, with examples, to describe how the wider determinants affect health, and avoiding ‘myth-busting’ that risks worsening misperceptions.
In this article we explore the research literature underpinning these communications principles. We describe how people assess the truthfulness of information they encounter, explain why communicators should avoid repeating myths or unhelpful ideas, and offer strategies that can help messages stick or counteract misinformation. This research literature could be helpful when considering how to communicate important public health messages now, in the current context of COVID-19, and in the future.
How do people assess truth?
People are constantly exposed to information that might affect their beliefs, attitudes and worldview. There is a tendency for people to accept information as true unless it is obviously implausible or they distrust the source. This may be because, during conversations, people usually exchange information that is relevant and interesting to the individuals involved, or because questioning and rejecting information requires more attention and effort. When assessing the validity of new information – whether consciously or not – people typically ask themselves at least one of four questions.1,2
1. Is it compatible with my beliefs?
People are more inclined to believe things that are consistent with their existing beliefs and ideology. This may be because people process such information easily and comfortably. When people are exposed to information that is inconsistent with their beliefs, it creates negative feelings, making them resistant to the new information and possibly more firmly fixed to their original view: a ‘backfire’ effect.1,2
For example, a US study explored the effect of people’s political orientation on their reaction to public health messaging. When participants were given information about how type 2 diabetes is linked to social circumstances, support for policies to address these conditions increased among Democrats but fell among Republicans compared to a control group who did not see the information.3 This demonstrates the risk of public health campaigns backfiring and actively damaging support for the messages they intend to promote, if they are framed in a way that challenges the audience’s worldview.
Once someone has accepted a piece of information that fits with their knowledge and beliefs, it is highly likely to stick, and they are less likely to accept counteracting information in the future.
2. Do other people believe it?
People are more likely to believe information they repeatedly encounter, which may be because repeated exposure to a message creates a sense of social consensus: shorthand for its validity. The power of repetition is so strong that repeated information influences people regardless of the number of sources it comes from, or of those sources’ credibility. As people become familiar with a message, they forget when and from whom they heard it.1,2 Even when people know they have been exposed to misinformation, such as in political campaigns, they often cannot differentiate between true and false claims later.4 Familiarity can override people’s memory of fact versus fiction.
3. Do I believe it comes from a credible source?
People are more likely to believe information from a source they believe to be credible. However, people may perceive untrustworthy sources to be credible for a variety of reasons. People are more likely to trust a source if they already believe or identify with its message.2 Familiar sources, such as household names, are more likely to be seen as honest and sincere than unfamiliar ones.5 And superficial characteristics, such as a spokesperson’s accent, can influence people’s perceptions. For example, a US study found participants were more likely to believe statements spoken in a native American accent than the same statements spoken in English by non-native speakers (including Polish and Italian speakers).6
On the other hand, people often don’t notice, or forget, details of the source yet remember the message later. If it’s a believable message, they may even incorrectly attribute it to a different, more credible source in their mind. This effect is more likely if the information is repeated, coherent and compatible with the person’s existing beliefs.2
4. Do I see sufficient evidence for it?
People are more likely to believe a claim when they can recall examples or arguments that support it. If it feels difficult to come up with examples, people may disregard the information. Counterintuitively, shorter lists of examples are more effective than longer lists. To put it another way, people are more likely to accept information when they can easily bring to mind a small number of arguments that support it.1
For example, a study asked participants to think of examples of their own assertive behaviour. Participants who were asked to think of a few examples subsequently rated their assertiveness higher than the participants who were instructed to come up with many examples.7
Counteracting incorrect or unhelpful ideas: don’t repeat them
Communicators often feel they should address misinformation head-on to correct it. For example, myth-busting – which involves stating the myth before rebutting it – is a common approach to providing people with facts and correcting flawed knowledge. Repeating inaccurate information, however, often inadvertently creates or strengthens people’s belief in it.1,2 Even when people understand, believe and remember misinformation being retracted, they often continue to recall the incorrect information as true.2
There are several reasons why repeating misinformation while attempting to correct it can be counter-productive. Repeating myths makes them more fluent and familiar. People quickly forget which information is fact and which is fiction, and simply recall familiar ideas, especially those that align with their prior knowledge and beliefs.1,2
For example, a study investigating a US Centers for Disease Control and Prevention (CDC) ‘myth-buster’ about the flu vaccine found that recipients correctly distinguished facts from fiction immediately after reading the hand-out, but after 30 minutes remembered more myths as facts than people in a control group who had not read the information.8 Worse still, recipients may have subsequently recalled that the erroneous information (which they thereafter believed) came from the CDC itself, further reinforcing their acceptance of it.9
Repeating unhelpful information can spread it to new audiences. For example, in 2000 the CDC set up a public hotline to dispel a hoax story about bananas being infected with necrotising bacteria. While the hoax was initially seen by relatively few people, the story was picked up by the mainstream media once the CDC got involved, and it spread widely. The myth persisted in the US for years after the incident.1
Myth-busting creates a sense of controversy, making people assume there is strong evidence on both sides. Without detailed explanation of why myths are inaccurate and why new information is more valid, attempts to correct misinformation can create coherence gaps in people’s understanding of the issue, leading them to defer judgement or even reject corrections.1 Vested-interest groups, such as those against action on climate change and the tobacco lobby, are known to exploit the powerful effect of stirring up controversy, using misinformation campaigns to foster uncertainty and undermine the case for public health action.10
What communication strategies can help messages stick?
Certain communication strategies can increase the likelihood of people believing and assimilating new information. The primary goal for communicators should be to increase the familiarity and fluency of valid information. Several communication techniques can help.1,2
Repeating information makes it more familiar to people and makes them more likely to believe it, even if it repeatedly comes from the same source. Public health communicators should therefore seek to repeat the same, core messages and, where appropriate, give a small number of clear, accessible explanations to back the information up.
Communicators should avoid repeating unhelpful or incorrect messages as this is more likely to reinforce these beliefs than to correct them.
Make messages accessible and coherent
To help people understand and make sense of new information, communicators should use clear, step-by-step explanations that leave minimal coherence gaps, especially on matters where people may readily revert to a more familiar and coherent, but less valid, narrative.
Communicators should also try to optimise the accessibility of information through its design and delivery, including using plain English, incorporating pictures and graphics, and considering other visual features such as text size and use of colour.
Design messages that reduce the risk of a backfire effect
Tailoring messages to different audiences’ worldviews can increase the chance that people with varying perspectives and priorities will accept the information. Framing messages in a worldview-affirming way reduces the impact of ‘social reactance’,11 where people react negatively against being told how to think or behave. An alternative approach is to find a commonly held value or idea that speaks to, and is accepted by, a broad audience.
Sometimes tailoring messages can be achieved through simple language adaptations. For example, one study in the US found Republicans are more supportive of ‘carbon offsetting’ than ‘taxes’ to tackle climate change, despite these being the same charge, while the different terminology had little effect on Democrats and Independents.12
‘Inoculate’ people against misinformation
In some situations, warning people they may be about to encounter misleading information can help to undermine it and make people more sceptical. This includes describing or highlighting the ways in which it is inaccurate, and explaining why the source may be motivated to spread myths.
For example, a US study exploring people’s perception of the scientific consensus on climate change tested two inoculation messages – one more detailed than the other – explaining that some politically motivated organisations try to convince the public that there is a lack of consensus among scientists. Both messages protected participants from being influenced by a statement denying the scientific consensus, with the more detailed message having a greater protective effect.13
Messages about the wider determinants of health can seem unfamiliar and complicated, and can challenge some people’s existing beliefs and attitudes. Greater awareness of common communication pitfalls, as well as of the strategies identified for delivering powerful messages, can help upstream approaches to improving health to be more effective.
The strategies summarised in this article offer some basic principles for health communication. Our current work with FrameWorks will build on these, and create specific frames and tools for communicating more effectively about the wider determinants. To sign up for updates, email Rachel.Cresswell@health.org.uk.
1. Schwarz N, Newman E, Leach W. Making the truth stick and myths fade: lessons from cognitive psychology. Behavioural Science and Policy. 2016;2(1):85-95.
2. Lewandowsky S, Ecker UKH, Seifert CM, Schwarz N, Cook J. Misinformation and its correction: continued influence and successful debiasing. Psychological Science in the Public Interest. 2012;13(3):106-131.
3. Gollust SE, Lantz PM, Ubel PA. The polarizing effect of news media messages about the social determinants of health. American Journal of Public Health. 2009; 99:2160-2167.
4. Ramsay C, Kull S, Lewis E, Subias S. Misinformation and the 2010 election: a study of the US electorate. 2010.
5. Brown AS, Brown LA, Zoccoli SL. Repetition-based credibility enhancement of unfamiliar faces. The American Journal of Psychology. 2001;111:199-209.
6. Lev-Ari S, Keysar B. Why don’t we believe non-native speakers? The influence of accent on credibility. Journal of Experimental Social Psychology. 2010;46:1093-1096.
7. Schwarz N, Bless H, Strack F, Klumpp G, Rittenauer-Schatka H, Simons A. Ease of retrieval as information: another look at the availability heuristic. Journal of Personality and Social Psychology. 1991;61:195-202.
8. Skurnik I, Yoon C, Schwarz N. Education about flu can reduce intentions to get a vaccination. Unpublished. 2007. In: Schwarz N, Sanna LJ, Skurnik I, Yoon C. Metacognitive experiences and the intricacies of setting people straight: implications for debiasing and public information campaigns. Advances in Experimental Social Psychology. 2007;39:127-161.
9. Schwarz N, Sanna LJ, Skurnik I, Yoon C. Metacognitive experiences and the intricacies of setting people straight: implications for debiasing and public information campaigns. Advances in Experimental Social Psychology. 2007;39:127-161.
10. Oreskes N, Conway EM. Merchants of Doubt. Bloomsbury Press. 2010.
11. Brehm SS, Brehm JW. Psychological reactance: a theory of freedom and control. New York, NY: Academic Press. 1981.
12. Hardisty DJ, Johnson EJ, Weber EU. A dirty word or a dirty world? Attribute framing, political affiliation and query theory. Psychological Science. 2010;21:86-92.
13. Van der Linden S, Leiserowitz A, Rosenthal S, Maibach E. Inoculating the public against misinformation about climate change. Global Challenges. 2017;1:1600008.