Your Q&A format may be doing you more harm than good

Debunking misinformation is one of the toughest tasks a science communicator can face (National Academy of Sciences, 2017; Scheufele and Krause, 2019), not least because attempts to refute false beliefs can themselves do more harm than good. Understanding how corrective information can backfire is particularly important when communicating about vaccines, as small differences in compliance can significantly weaken the herd immunity effect (Scherer and McLean, 2002).
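To see why small differences in compliance matter, simple vaccination models of the kind reviewed by Scherer and McLean (2002) give a herd immunity threshold of 1 − 1/R₀. The sketch below is purely illustrative, not taken from any of the cited papers; the R₀ value used for measles is an assumed, textbook-style figure:

```python
# Illustrative herd-immunity threshold from a simple vaccination model:
# p_c = 1 - 1/R0. The R0 for measles (~15) is an assumed, textbook-style figure.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to block sustained spread."""
    return 1.0 - 1.0 / r0

r0_measles = 15.0  # assumed basic reproduction number for measles
threshold = herd_immunity_threshold(r0_measles)
print(f"Required coverage: {threshold:.1%}")

# A drop of just a few percentage points in coverage crosses the threshold:
for coverage in (0.95, 0.92):
    status = "holds" if coverage >= threshold else "fails"
    print(f"coverage {coverage:.0%}: herd immunity {status}")
```

Because the measles threshold sits so close to 100%, even a small dip in vaccination coverage can tip a population from protected to vulnerable.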

Recently published research by Pluviano, Watt, Ragazzini, and Della Sala (2019) finds that using the common “myths and facts” format to encourage people to ignore misinformation they may have heard about vaccinations can leave them more likely to believe the myths than before. This is in line with the familiarity backfire effect thesis, not the more controversial worldview backfire effect (see Nyhan, Porter, Reifler, & Wood, 2019).

This research is timely, as increasing numbers of people are contracting measles and other vaccine-preventable diseases, which the World Health Organisation attributes to gaps in vaccine coverage (World Health Organisation, 2018). These gaps include locations where vaccines are difficult to obtain due to factors like poverty and conflict, but they also reflect people becoming hesitant to vaccinate their families because of misinformation about the medical consequences of vaccinating children (Goldstein, MacDonald, & Guirguis, 2015).

To combat this hesitancy, many organisations, including the Australian Government and the Australian Academy of Science, try to encourage people to vaccinate using this “myths and facts” or “frequently asked questions” format (Australian Academy of Science, 2016; Department of Health, 2018; Paynter et al., 2019).

The core argument presented by Pluviano et al. (2019) is that “myths and facts” formats counterintuitively lead audiences to believe more strongly in the very misinformation communicators are trying to debunk. Their research is part of a broad area of psychological research seeking effective methods to debunk misinformation (Chan, Jones, Jamieson, & Albarracín, 2017). More specifically, it examines whether mentioning the misinformation, a necessary part of a “myths and facts” format, reminds audiences of the myths through repetition, leading them to remember the myth rather than the fact. This builds on Schwarz, Sanna, Skurnik, and Yoon (2007), who find that “attempts to inform people that a given claim is false may increase acceptance of the misleading claim” (Schwarz et al., 2007, p. 151).


The authors conducted quantitative research with sixty Italian parents recruited from paediatricians’ offices in Italy. Participants were divided into a control group, who read some general tips about healthcare, and an experimental group, who instead read a booklet containing ten myths and ten corresponding corrective facts. Both groups then completed questionnaires at two points in time to determine whether they believed in any myths about vaccines, such as the myth that they cause autism in children. The authors used mixed-model ANOVAs to test for differences in the groups’ levels of belief in the myths immediately after the stimulus and again seven days later.
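The key comparison in a 2 (group) × 2 (time) mixed design like this is the group-by-time interaction. As a minimal sketch, not the authors’ actual analysis or data, that interaction can be tested equivalently as an independent-samples t-test on each participant’s change score; all numbers below are simulated purely for illustration:

```python
# Hedged sketch of a 2 (group) x 2 (time) mixed design: the group-by-time
# interaction is equivalent to comparing change scores (time2 - time1)
# between groups. Data here are simulated, NOT the study's actual data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30  # per group, mirroring the study's sixty parents split in two

# Simulated myth-belief scores at the two time points.
control_t1 = rng.normal(3.0, 1.0, n)
control_t2 = control_t1 + rng.normal(0.0, 0.5, n)  # beliefs stay stable
exp_t1 = rng.normal(3.0, 1.0, n)
exp_t2 = exp_t1 + rng.normal(0.6, 0.5, n)          # myth belief rebounds over time

# Interaction test: do change scores differ between groups?
change_control = control_t2 - control_t1
change_exp = exp_t2 - exp_t1
t, p = stats.ttest_ind(change_exp, change_control)
print(f"group x time interaction: t = {t:.2f}, p = {p:.4f}")
```

A significant difference in change scores here corresponds to the group-by-time interaction a mixed-model ANOVA would report for this two-group, two-timepoint design.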


The authors found a statistically significant result: participants who had read the “myths and facts” material were more likely to believe the misinformation seven days later than immediately after reading the booklet. The authors suggest this is because repeating a myth that participants may have already heard makes the myth even more familiar, whereas a corrective fact introduced for the first time is not retained in the same way; over time it becomes harder to remember the fact than the myth.

This mechanism seems plausible, and most authors agree that there are better ways to counter misinformation (Schwarz et al., 2007). However, Cameron et al. (2013) found “no evidence that presenting both facts and myths is counterproductive to recall accuracy” when attempting to debunk vaccine misinformation with a sample twice as large as that of Pluviano et al. (2019) (Cameron et al., 2013, p. 387). Given this contrary evidence, more research on the topic is warranted.
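Sample size matters when weighing these conflicting results. As a rough sketch using the normal approximation to two-sample power (the “medium” effect size d = 0.5 is an illustrative placeholder, not an estimate from either study), doubling the per-group sample size noticeably raises the chance of detecting an effect:

```python
# Approximate power of a two-sample comparison via the normal approximation:
# power ~ Phi(d * sqrt(n/2) - z_{1-alpha/2}). The effect size d = 0.5 is an
# assumed "medium" placeholder, not a value taken from either cited study.
from scipy.stats import norm

def approx_power(d: float, n_per_group: int, alpha: float = 0.05) -> float:
    z_crit = norm.ppf(1 - alpha / 2)
    return float(norm.cdf(d * (n_per_group / 2) ** 0.5 - z_crit))

for n in (30, 60):  # roughly the per-group sizes of the two studies
    print(f"n = {n} per group: power ~ {approx_power(0.5, n):.2f}")
```

Under these assumptions, the smaller study has roughly a coin-flip chance of detecting a medium effect, which is one reason to read a single significant result cautiously.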


As a professional communicator, I have sometimes used the “myths and facts” format because it helps ensure I address myths directly. However, it seems there are better ways to address misinformation. For example, Cook and Lewandowsky (2012) emphasise that if you must address a myth, you should present the fact first and provide a full alternative explanation so audiences can update their mental models, helping to ensure the correct information is retained. Recent Australian research has also shown that using public figures as the message source can be an effective way to encourage people to vaccinate (Zhang, Chughtai, Heywood, & MacIntyre, 2019).

Broadly speaking, the research by Pluviano et al. (2019) is yet more evidence against the deficit model of science communication (Bucchi and Trench, 2008; National Academy of Sciences, 2018; Sturgis and Allum, 2004). Usefully, it warns science communicators that using the wrong format to debunk myths about topics like vaccines can be not only ineffective but also dangerous. Science communicators should consider debunking myths with approaches that don’t risk making the myths more familiar and the facts relatively less memorable.


  • Australian Academy of Science. (2016). The science of immunisation questions and answers.
  • Bucchi, M., & Trench, B. (2008). Handbook of public communication of science and technology: Routledge.
  • Cameron, K. A., Roloff, M. E., Friesema, E. M., Brown, T., Jovanovic, B. D., Hauber, S., & Baker, D. W. (2013). Patient knowledge and recall of health information following exposure to “facts and myths” message format variations. Patient Education and Counseling, 92(3), pp. 381-387. doi:10.1016/j.pec.2013.06.017
  • Chan, M.-p. S., Jones, C. R., Jamieson, K. H., & Albarracín, D. (2017). Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation. Psychological Science, 28(11), pp. 1531-1546. doi:10.1177/0956797617714579
  • Cook, J., & Lewandowsky, S. (2012). The Debunking Handbook. St Lucia, Australia: University of Queensland.
  • Department of Health. (2018). Questions about vaccination. Canberra, Australia: Commonwealth of Australia.
  • Goldstein, S., MacDonald, N. E., & Guirguis, S. (2015). Health communication and vaccine hesitancy. Vaccine, 33(34), pp. 4212-4214. doi:10.1016/j.vaccine.2015.04.042
  • Kaplan, D. (2004). The SAGE Handbook of Quantitative Methodology for the Social Sciences: SAGE Publications.
  • National Academy of Sciences. (2017). Communicating science effectively: A research agenda. Washington, DC: The National Academies Press.
  • National Academy of Sciences. (2018). The Science of Science Communication III: Inspiring Novel Collaborations and Building Capacity: Proceedings of a Colloquium. Washington, DC: The National Academies Press.
  • Nyhan, B., Porter, E., Reifler, J., & Wood, T. J. (2019). Taking Fact-checks Literally But Not Seriously? The Effects of Journalistic Fact-checking on Factual Beliefs and Candidate Favorability. Political Behavior, 1-22.
  • Overall, J. E., & Doyle, S. R. (1994). Implications of chance baseline differences in repeated measurement designs. J Biopharm Stat, 4(2), pp. 199-216. doi:10.1080/10543409408835083
  • Paynter, J., Luskin-Saxby, S., Keen, D., Fordyce, K., Frost, G., Imms, C., . . . Ecker, U. (2019). Evaluation of a template for countering misinformation—Real-world Autism treatment myth debunking. PLOS ONE, 14(1), e0210746. doi:10.1371/journal.pone.0210746
  • Pluviano, S., Watt, C., Ragazzini, G., & Della Sala, S. (2019). Parents’ beliefs in misinformation about vaccines are strengthened by pro-vaccine campaigns. Cognitive Processing, pp. 1-7.
  • Scherer, A., & McLean, A. (2002). Mathematical models of vaccination. British Medical Bulletin, 62(1), pp. 187-199. doi:10.1093/bmb/62.1.187
  • Scheufele, D. A., & Krause, N. M. (2019). Science audiences, misinformation, and fake news. Proceedings of the National Academy of Sciences, 116(16), pp. 7662-7669. doi:10.1073/pnas.1805871115
  • Schwarz, N., Sanna, L. J., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in experimental social psychology, 39, pp. 127-161.
  • Sturgis, P., & Allum, N. (2004). Science in society: Re-evaluating the deficit model of public attitudes. Public Understanding of Science, 13(1), pp. 55-74. doi:10.1177/0963662504042690
  • World Health Organisation. (2018). Measles cases spike globally due to gaps in vaccination coverage.
  • Zhang, E. J., Chughtai, A. A., Heywood, A., & MacIntyre, C. R. (2019). Influence of political and medical leaders on parental perception of vaccination: a cross-sectional survey in Australia. BMJ Open, 9(3), e025866. doi:10.1136/bmjopen-2018-025866
