The threats to public health posed by fake news and conspiracy theories are being taken seriously by health bodies, politicians, and social media companies. Gill Partington takes a critical look at the term ‘infodemic’, and unpicks the relationship between information and disease.
In February 2020, the World Health Organization announced the existence of a new and urgent threat to health: not a pandemic (that announcement would not be made until the following month) but an ‘infodemic’. The WHO’s Coronavirus Situation Report number 13 defined this threat as ‘an over-abundance of information – some accurate and some not – that makes it hard for people to find trustworthy sources and reliable guidance when they need it’. In response, the organisation declared that its technical risk communication and social media teams would be working to track misinformation, with the intention of refuting it with ‘evidence-based information’. A new platform, the WHO Information Network for Epidemics (EPI-WIN), was launched for precisely this purpose. One of its interventions – the ‘myth busters’ initiative – is a series of shareable infographics, each of which debunks a commonly repeated falsehood about coronavirus and asserts the facts in its place.
Misinformation, fake news and conspiracy theories have long flourished on social media, of course. Anxieties about their effects on civic life and democracy have been rising for some years now, but the context of coronavirus has given them a new aspect. They now constitute a new kind of threat, not only because the circulation of misinformation has intensified, but because of its intersection with health matters. A recognition of this new situation was signalled by a dramatic shift in the attitudes and policies of social media companies. Previously sanguine about the circulation of misinformation and reluctant to enact measures to restrict it, social media platforms have now become uncharacteristically proactive. Facebook has partnered with third-party fact-checkers and pledged to tackle any misinformation that poses a health risk. Popular search terms associated with coronavirus misinformation are either blocked, tagged as ‘false information’, or redirected to a WHO myth busters page. Twitter has also taken action, stating that it has removed thousands of tweets that ‘go directly against guidance from authoritative sources of global and local public health information’. Health priorities now trump those of free speech, it seems.
The infodemic is being taken seriously by health bodies, politicians and media. But what exactly is it, and what is the precise nature of the threat it poses? These are not straightforward questions. The term itself entered the OED very recently, introduced along with ‘self-isolate’ and ‘social distancing’ as a response to the pandemic. It is defined as a ‘proliferation of diverse, often unsubstantiated information relating to a crisis, controversy, or event, which disseminates rapidly and uncontrollably through news, online, and social media, and is regarded as intensifying public speculation or anxiety’. But the word is not a new invention. Its origin is traced back to 2003, when it was used to describe the explosion of information (and misinformation) associated with the SARS epidemic. It’s a twenty-first century word to describe a twenty-first century sense of informational overwhelm, one that puns on existing medical vocabulary. ‘Epidemic’ derives from the Greek terms epi (upon/above) and demos (people), denoting a disease that threatens a whole population, while the prefix ‘pan’ signifies a global reach. Infodemic is a compound that mangles this etymology. It’s a piece of improvised wordplay, effectively conveying the idea that a deluge of information can be unmanageable, that it can spread like a disease.
So is it simply an analogy? Sylvie Briand, architect of the WHO’s information risk strategy, told the Lancet that social media amplifies the problem of misinformation: ‘it goes faster and further, like the viruses that travel with people and go faster and further’. Here, the connection certainly seems to be figurative and metaphorical, presenting one thing (information) in the light of another, separate thing (a virus). And yet in other statements, we see a logic of causation, not analogy: certain rumours ‘can potentially harm the public’s health, such as false prevention measures or cures’. Some coronavirus misinformation may be benign – promoting the benefits of natural remedies, like ginger or lemon juice – but other crank cures are potentially harmful, such as the hydrogen peroxide inhalers marketed by new age healers. And other falsehoods pose a direct risk to people’s health: the much-touted but unproven benefits of chloroquine (as promoted by President Trump) led to one fatality, when a man ingested a toxic form of it marketed as a fish tank cleaner. Reports from Iran suggest many have died from drinking methanol, in the mistaken belief that alcohol prevents the virus.
Misinformation can also threaten collective health in more indirect ways – by promoting conspiracy theories. Three in ten Americans believe that the coronavirus was not an accidental occurrence but was made in a lab, according to a survey from the Pew Research Center. There is also widespread denial that the virus even exists. In a video that was watched by many thousands before being removed from Twitter and YouTube, the influential conspiracy theorist David Icke claims COVID-19 is actually a cover for the harmful effects of 5G radiation. Here, misinformation intersects with and gives new impetus to an already existing corpus of conspiracy theories about EMF technology as a plot to manipulate or even massacre whole populations. This has resulted in real-world actions, as telecommunications masts have been damaged and burned in the UK and the Netherlands. Such violence is dangerous in itself, of course, but as these conspiracies gain a foothold they pose a greater threat, eroding the trust in medical authority necessary to maintain lockdown measures. In the USA a new outbreak of protests, fuelled by this discourse, involves mass gatherings that deliberately flout social distancing guidelines.
Not only is misinformation analogous to a contagious disease in its exponential spread, therefore; it can also cause or exacerbate ill-health. But there is yet another kind of correlation. Manlio De Domenico, a statistical physicist at Italy’s Bruno Kessler Foundation, states that ‘infodemics present characteristics very close to those of epidemics’. The WHO director general, Tedros Adhanom Ghebreyesus, has told the Munich Security Conference that the war against COVID-19 must be fought on two fronts, since the infodemic is ‘just as dangerous’ as the virus itself. Meanwhile, the WHO’s website describes the infodemic as ‘a second “disease”’. This suggests a relationship of proximity and even synergy. Information and disease are conceived of as twin aspects of the same crisis, so closely entwined they are almost conflated. According to this logic, information itself is a de facto pathology. The infodemic doesn’t just resemble or exacerbate a disease, it effectively IS one.
How to fight such a multi-faceted menace? An infodemic demands a field of infodemiology to study and counter it, and this is what the WHO has called its strategy. It doesn’t gloss the term, but infodemiology was first identified as ‘a new research discipline and methodology’ in 2002 by Gunther Eysenbach, who defines it as ‘the study of the determinants and distribution of health information and misinformation’. It monitors and regulates information which specifically pertains to health, and which may therefore affect or be detrimental to it. But in the WHO’s myth busters initiative we can perhaps detect the influence of another field of study: inoculation theory. This has an earlier origin, emerging in the 1960s in social psychology and revolving around the notion of ‘vaccinating’ subjects against certain ideas. In early experiments, test subjects who were given prior, controlled exposure to certain ideas subsequently proved to be much more resilient to those same ideas than ‘non-vaccinated’ control subjects, and less open to persuasion. Where infodemiology views health through the lens of information, inoculation theory sees information in terms of disease.
It has found a new impetus in the age of social media. The Dutch research initiative ‘Bad News’, for instance, is based on the idea that through the controlled environment of a game simulation, players can gain an understanding of the way fake news is spread online, and are thus less susceptible to it when they encounter it in ‘real life’ situations. The WHO’s myth busters infographics seem to show the influence of this logic: prominent falsehoods are first raised and anticipated before being refuted. The strategy is designed to disseminate correct facts and advice, of course, but before doing so it repeats the misinformation in the form of a question: ‘Can an ultraviolet disinfection lamp kill the new coronavirus?’; ‘Are hand dryers effective in killing the new coronavirus?’ Sharing these infographics online as memes exposes social media users to misinformation, but in a way that builds resistance to it. Like the virus itself, the infodemic seems to call for vaccination.
Disentangling these threads is complicated by the fact that medicine and information are already knotted together through a shared vocabulary. ‘Transmission’ and ‘communication’ can apply to both diseases and ideas. The former was adopted by medicine in the seventeenth century only after it had already been used by physics and mechanics. But these terms and concepts have shuttled back and forth so often and for so long that it is impossible to say where they ‘really’ belong. Language is constantly migrating, and there’s no more powerful or current example of this than the term ‘viral’. The computer virus may have begun as a medical metaphor but it is now so entrenched and familiar that the term is amphibious, applying equally to health and to computing. In everyday usage, it’s as likely to relate to a disease as to a meme (another term which has travelled from biology to digital information). These relays between medicine and technology, between bodies and information, are so reciprocal and recursive that it’s difficult to unravel one from the other. Infodemic is an idea caught in these tangled webs.
Gill Partington is a research fellow on the project Indexofevidence.org at Exeter University’s Wellcome Centre for Cultures and Environments of Health. She has lectured at Birkbeck, University of London and Warwick University, and held research fellowships at the Bodleian Library, the Yale Center for British Art, the Beinecke Library and Cambridge University Library. Her work centres on strange books and unorthodox reading practices, and she has published work on book destruction, artists’ books, contemporary literature and media theory.
 ‘Infodemic, n. : Oxford English Dictionary’ <https://www.oed.com/view/Entry/88407009> [accessed 23 April 2020].
 John Zarocostas, ‘How to Fight an Infodemic’, The Lancet, 395.10225 (2020), 676 <https://doi.org/10.1016/S0140-6736(20)30461-X>.
 World Health Organization, ‘Novel Coronavirus (2019-nCoV) Situation Report – 13’ <https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf?sfvrsn=195f4010_6> [accessed 23 April 2020].
 ‘Fish Tank Cleaner Chloroquine Phosphate Killed a Man Who Ingested It as Coronavirus Cure’, The Washington Post <https://www.washingtonpost.com/nation/2020/03/24/coronavirus-chloroquine-poisoning-death/> [accessed 23 April 2020].
 ‘Hundreds Killed in Iran from Drinking Toxic Coronavirus “Cure”’, 7NEWS.com.au, 2020 <https://7news.com.au/news/health/false-virus-cure-kills-hundreds-in-iran-c-768920> [accessed 23 April 2020].
 ‘Nearly Three-in-Ten Americans Believe COVID-19 Was Made in a Lab’, Pew Research Center <https://www.pewresearch.org/fact-tank/2020/04/08/nearly-three-in-ten-americans-believe-covid-19-was-made-in-a-lab/> [accessed 23 April 2020].
 ‘Fake News in the Time of C-19’ <https://members.tortoisemedia.com/2020/03/23/the-infodemic-fake-news-coronavirus/content.html> [accessed 23 April 2020].
 ‘Munich Security Conference’ <https://www.who.int/dg/speeches/detail/munich-security-conference> [accessed 23 April 2020].
 ‘EPI-WIN’ <https://www.who.int/teams/risk-communication/infodemic-management> [accessed 23 April 2020].
 Gunther Eysenbach, ‘Infodemiology: The Epidemiology of (Mis)Information’, The American Journal of Medicine, 113.9 (2002), 763–65 <https://doi.org/10.1016/S0002-9343(02)01473-0>.
 John A. Banas and Stephen A. Rains, ‘A Meta-Analysis of Research on Inoculation Theory’, Communication Monographs, 77.3 (2010), 281–311 <https://doi.org/10.1080/03637751003758193>.
 gusmanson.nl, ‘Can You Beat My Score? Play the Fake News Game!’, Bad News <https://www.getbadnews.com/> [accessed 23 April 2020].