
Climate Change Misinformation Interventions

By Jeremy Wright

 

Misinformation has been shown to be a pervasive issue in Western society. This is a problem because people rely on accurate information to make well-reasoned and informed decisions about many aspects of their daily lives. For example, to make healthy choices about what to eat, or to decide whether to buy an electric vehicle, people need to understand the differences between the options available to them. To choose a well-balanced dinner or make sustainable purchasing decisions, people need to understand the nutritional value of food and the potential benefits of buying an EV. And to understand why a well-balanced meal is healthier than a bag of chips, or why an EV is more ecologically sustainable than a fossil-fuel-powered vehicle, they need access to accurate explanations of the evidence produced by nutritional and environmental scientists. If that evidence is misconstrued, people become more likely to make unhealthy and unsustainable choices with long-term negative impacts. Academics, businesses, and governments have therefore been working to understand how to limit the negative impacts of the misinformation that motivates people to behave in ways that are detrimental to themselves and others.

Misinformation in general has been described as a threat to many things, but climate change misinformation has been singled out by many experts as the largest threat that modern humans and wildlife have ever experienced1. As covered in more detail in the article The Origins of Climate Change Misinformation, the magnitude of this threat comes partly from its speed: it has been influencing people only since around the 1950s and has already contributed to irreversible damage to ecosystems and human settlements around the world2,3. Although 75 years may seem like a long time, it is a small blip compared to the geological age of the Earth and the length of natural climate cycles. 

So, what is being done to manage this threat? In academic discourse, management strategies are referred to as misinformation interventions, because the goal of the techniques being tested is to reduce the spread of misinformation and mitigate its impact on people and the planet. Before an intervention strategy can be constructed, however, it is important to understand how misinformation travels. This step is pivotal because false information can spread between individuals online up to six times faster, and reach far more people, than factual and accurate information about the same topic4. 

One of the ways the spread of misinformation is mapped is with an epidemiological approach, similar to predicting the likelihood that an illness will become a pandemic5,6. The first step in this process is to identify the source of the misinformation and label it ‘patient zero.’ Then, the individuals who engage with the misinformation are considered ‘hosts’ who share it and ‘infect’ others, who then become hosts that continue the spread. In this model, the figure ‘R0’ represents the average number of new hosts that each existing host infects. If R0 is less than 1, the outbreak fizzles out on its own, but if it reaches a value of 1 or greater, each host passes the misinformation on to at least one other person and it is likely to develop into a full-blown pandemic. By identifying where misinformation starts, who it is infecting, and how fast it travels, experts are better able to intervene and effectively inhibit the spread. 
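The role of R0 can be illustrated with a short simulation. This is a minimal sketch of the general idea, not code from the cited studies: it treats spread as a simple branching process in which, each "generation," every current host passes the claim on to R0 new people on average.

```python
# Minimal sketch of how the reproduction number R0 governs spread.
# (Illustrative only -- not taken from the cited studies.) Each
# generation, every current host passes the claim on to R0 new
# people on average, starting from a single "patient zero".

def simulate_spread(r0: float, generations: int) -> list[float]:
    """Expected number of newly 'infected' hosts in each generation."""
    new_hosts = [1.0]  # generation 0: patient zero
    for _ in range(generations):
        new_hosts.append(new_hosts[-1] * r0)
    return new_hosts

# R0 < 1: each generation is smaller than the last -- the rumour fizzles.
fizzle = simulate_spread(r0=0.8, generations=10)
# R0 > 1: each generation is larger than the last -- the rumour snowballs.
snowball = simulate_spread(r0=1.5, generations=10)

print(f"R0=0.8 final generation: {fizzle[-1]:.2f}")
print(f"R0=1.5 final generation: {snowball[-1]:.2f}")
```

Running the sketch shows the threshold behaviour described above: at R0 = 0.8 the final generation has shrunk to a fraction of a person, while at R0 = 1.5 it has grown to dozens, which is why interventions aim to push R0 below 1.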

Types of Interventions

 

The next step in designing a misinformation intervention is to understand what types can be used. Sundelson and colleagues (2023)7 reviewed the literature on misinformation interventions and put together the “4i Framework for Advancing Communication and Trust (4i FACT).” The 4i FACT model categorizes interventions so that future researchers can design effective strategies that can be implemented in the real world. As the name suggests, there are four main types of interventions currently being used: institutional, informational, individual, and interpersonal. 

An institutional approach is one where a business, government, or educational institution uses its resources and sphere of influence to plan and implement a strategy meant to reduce the impact of misinformation (e.g., policy/legislation, journalistic standards). An informational approach attempts to inform people about a topic that misinformation might distort, like climate change (e.g., fact-checking, info-sessions). Individual approaches are meant to increase people’s understanding of a topic so that they are less vulnerable to believing inaccurate information (e.g., enhancing climate change literacy). And interpersonal approaches motivate people to avoid infecting others with the falsities they come across, while supporting the transfer of accurate information between people (e.g., community engagement). Importantly, all four types of interventions have reciprocal and interconnected relationships: informational approaches affect individuals, their interpersonal interactions, and the institutions they are part of, and these exchanges can happen in any order. For example, institutional approaches use policy to shape the information used to change the attitudes of individuals, which in turn affects their interpersonal interactions.

Debunking and Prebunking

 

So, how are these interventions applied? There are two main ways to implement a misinformation intervention: before or after exposure8. When an intervention is applied after exposure, it is referred to as ‘debunking,’ as a piece of misinformation is, for example, fact-checked and corrected after the fact. Prebunking is the opposite: an intervention applied prior to exposure to reduce the likelihood that someone will engage with or believe the misinformation. 

Debunking has been one of the most common ways to intervene in the spread of misinformation about climate change. It can be an effective way to help people form more accurate beliefs, but it has some major limitations. For example, Lewandowsky (2021)9 discusses what has become known as the Continued Influence Effect of Misinformation (CIEM). The CIEM is a cognitive process whereby a piece of incorrect information continues to influence what people believe, even after it has been corrected. Once you have accepted a claim, it is incredibly difficult to fully update that belief. Basically, once a piece of information makes it into your head, it’s really difficult to get it out again! 

Debunking becomes even more difficult when the misinformation doesn’t go away. Repeated exposure produces what is known as the Illusory Truth Effect6,10: when a false claim is repeated over and over again, we start to believe it has factual merit, even if we know it is very clearly false.  

The combination of the CIEM and the Illusory Truth Effect makes debunking misinformation about climate change incredibly difficult, which has motivated researchers to develop pre-emptive strategies that reduce the likelihood that people will engage with or believe misinformation in the first place. Prebunking is a relatively new strategy compared to debunking and is being actively researched as a way to reduce the impact that misinformation about climate change can have on people’s beliefs and behaviours. Even though it is fairly new in the academic literature, prebunking has been shown to be more effective than debunking11,12.

Prebunking is based on a principle of belief described by the 17th-century philosopher Spinoza: the human default is to believe any new information we come across and to verify its truthfulness afterwards. If that is the case, the best way to stop the infection of misinformation about climate change is to prepare people to be skeptical and reject it from the beginning. Looping back to the epidemiological metaphor, prebunking is also referred to as ‘inoculation.’ This is because, like vaccinating against an illness, providing people with the relevant information and skills prior to exposure prepares their systems to recognize misinformation as false and refuse to engage with it8,9,13. And as more people refuse to engage with misinformation, the spread of the disease is stopped in its tracks. 

Conclusion

 

Let’s recap! Misinformation has been around for a long time, and it is so disruptive to Western society because people need accurate information to make good decisions. Climate change misinformation has been labelled the biggest threat to life on Earth because, in the comparatively short time it has been circulating, it has already contributed to irreversible and widespread damage that will only continue to get worse. To mitigate the threat, businesses, governments, and academics have been working on debunking and prebunking techniques to help people form more accurate beliefs about climate change so we can all make more ecologically sustainable choices. 

  So, let’s all do our part to be sure that we hold accurate beliefs about climate science by listening to experts and following the facts! 


 

References

  1. Lazer, D., Baum, M., Benkler, Y., Berinsky, A., Greenhill, K., Menczer, F., Metzger, M., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S., Sunstein, C., Thorson, E., Watts, D., & Zittrain, J. (2018). The science of fake news: Addressing fake news requires a multidisciplinary effort. Science, 359(6380). https://doi.org/10.1126/science.aao2998

  2. Supran, G., Rahmstorf, S., & Oreskes, N. (2023). Assessing ExxonMobil’s global warming projections. Science, 379(6628), 1-9. https://doi.org/10.1126/science.abk0063

  3. Supran, G., & Oreskes, N. (2020). Assessing ExxonMobil’s climate change communications (1977-2014). Environmental Research Letters, 15, 1-18. https://doi.org/10.1088/1748-9326/ab89d5 

  4. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. https://doi.org/10.1126/science.aap9559

  5. Cinelli, M., Quattrociocchi, W., Galeazzi, A., Valensise, C., Brugnoli, E., Schmidt, A., Zola, P., Zollo, F., & Scala, A. (2020). The COVID-19 social media infodemic. Scientific Reports, 10. https://doi.org/10.1038/s41598-020-73510-5

  6. Van der Linden, S., Roozenbeek, J., Maertens, R., Basol, M., Kacha, O., Rathje, S., & Traberg, C. (2021). How can psychological science help counter the spread of fake news? The Spanish Journal of Psychology, 24(25), 1-9. https://doi.org/10.1017/SJP.2021.23

  7. Sundelson, A. E., Jamison, A. M., Huhn, N., Pasquino, S. L., & Sell, T. K. (2023). Fighting the infodemic: the 4i framework for advancing communication and trust. BMC Public Health, 23(1), 1662. 

  8. Lewandowsky, S., Cook, J., Ecker, U. K. H., Albarracín, D., Amazeen, M. A., Kendeou, P., Lombardi, D., Newman, E. J., Pennycook, G., & Rapp, D. N. (2020). The Debunking handbook 2020. George Mason University Center for Climate Change Communication. 

  9. Lewandowsky, S. (2021). Climate change disinformation and how to combat it. Annual Review of Public Health, 42, 1-21. https://doi.org/10.1146/annurev-publhealth-090419-102409

  10. Dechêne, A., Stahl, C., Hansen, J., & Wänke, M. (2010). The truth about the truth: A meta-analytic review of the truth effect. Personality and Social Psychology Review, 14(2), 238-257. https://doi.org/10.1177/1088868309352251

  11. Cook, J., Lewandowsky, S., & Ecker, U. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLOS ONE, 12, e0175799. http://dx.doi.org/10.1371/journal.pone.0175799 

  12. Tay, L. Q., Hurlstone, M. J., Kurz, T., & Ecker, U. K. H. (2022). A comparison of prebunking and debunking interventions for implied versus explicit misinformation. British Journal of Psychology, 113(3), 591–607. https://doi.org/10.1111/bjop.12551 

  13. Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131. https://doi.org/10.1177/1529100612451018 
