The COVID-19 pandemic has been accompanied by an infodemic: an overabundance of information – some accurate and some not – that makes it hard for people to find trustworthy sources and reliable guidance when they need it. In this context, misinformation and disinformation can spread alarmingly quickly, which can in turn influence public opinion, undermine or support public health responses, and affect the length and intensity of outbreaks. Infodemic management – which includes attempts to understand the tactics employed by malicious actors to spread information, and 'social listening', the regular and systematic aggregation, filtering, and monitoring of conversations and public discourse – has therefore become crucial. However, it raises a number of important ethical considerations and questions. In this seminar, speakers explored the nature and role of ethics as it relates to the infodemic, misinformation, and infodemic management.
Chair:
Professor Patricia Kingori, Professor of Global Health Ethics, Ethox Centre, University of Oxford, Oxford, UK
Panel:
Professor Timothy Caulfield LLM, FRSC, FCAS, Canada Research Chair in Health Law & Policy, Professor, Faculty of Law and School of Public Health, Research Director, Health Law Institute, University of Alberta
Ms Noran Adly, Community Engagement Officer, UNICEF, Jordan Country Office, Jordan
Dr Sam Martin, Senior Research Fellow and Consultant in Digital Sociology and Big Qualitative Data Analytics at: Ethox Centre, University of Oxford, UK; and RREAL (Rapid Research Evaluation and Appraisal Lab), University College London, UK
Dr Carolina Batista MD MPH, Member, International Board of Doctors Without Borders and Head of Global Health Affairs, Baraka Impact Finance
__________________________________________________________________________________
Professor Tim Caulfield began his presentation by calling the infodemic one of the great policy and communication challenges of our time. This has been recognised not only by governmental and intergovernmental agencies, but by citizens too: a recent United States survey indicates that 95% of people see the spread of misinformation as a serious problem; yet, in contrast, 78% of people believe at least one prominent COVID-19 conspiracy theory. Professor Caulfield argues that it would not be an exaggeration to say misinformation is killing people: it is shaping health policy and contributing to stigma, and is therefore immensely harmful. The rise in misinformation is attributable to social media. While this is not the only causal factor, a growing body of evidence suggests it is the main one: according to a recent study, 85% of misinformation has its origins in social media. While studying misinformation in this context is often tricky, the topic has now been explored through multiple methodologies and there is enough good-quality data to establish a link. Professor Caulfield argues that misinformation is currently more dangerous because of ideology. While ideology has always played a role in misinformation, it has most recently become tangled up with identity: a recent poll indicates that individuals respond differently to the war in Ukraine depending on their vaccination status. For example, 82% of vaccinated individuals support tougher sanctions on Russia in response to its invasion of Ukraine, whereas 75% of unvaccinated individuals do not. How can this be addressed? Professor Caulfield argues that the evidence suggests debunking misinformation – for example through fact-checking – works. However, who should do this remains an open question; opinion is currently split between those who believe it is the role of governments to regulate social media, those who believe it is the responsibility of the platforms themselves, and those who think neither should regulate it.
This shows the extent of the policy challenges that lie ahead. Ending on a more positive note, Professor Caulfield argued that we can all make a difference in countering misinformation. He invited attendees to join his #ScienceUpFirst initiative, which uses social media and creative communications to convey accurate public health messaging to the widest possible audience.
Dr Sam Martin followed up on Professor Caulfield's comments. Drawing on her experience as a digital sociologist and data analytics consultant, her work studying the role of social media misinformation in vaccine hesitancy among healthcare workers across four countries (the UK, South Africa, Kenya and Brazil) reveals that misinformation spread via social media does affect vaccine uptake. The key question, therefore, is what role governments and public health authorities should have in removing sources of misinformation. Dr Martin and her team put forward a range of solutions. Firstly, specialist taskforces – including fact-checkers, psychologists, and healthcare workers – could work collaboratively to debunk misinformation. Secondly, governments could work directly with platforms, leveraging algorithms to correct misinformation. Thirdly, de-platforming is a potential approach for serious and egregious offenders who persistently spread misinformation on social media. After all, misinformation spreads insidiously, and by seemingly innocuous means. Dr Martin highlighted the recent example of the hashtag 'informed consent', which has grown exponentially over the last two years: what was once a neutral concept has become wedded to a nexus of anti-vaccination messages on social media. This hashtag has influenced the viewpoints of healthcare workers, particularly in the context of vaccine mandates as a condition of continued employment. While vaccine hesitancy operates on a continuum, extreme positions are reinforced by misinformation; social media monitoring is therefore important. However, if monitoring social media is a prerequisite for tackling misinformation, there need to be ethical requirements around data collection. Dr Martin argued that researchers must consider the relevant guidelines, particularly in relation to informed consent, and mitigate the risk of harm by anonymising data sets. However, such safeguards are not foolproof, particularly on social media.
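Dr Martin's point about anonymising data sets can be illustrated with a brief sketch (not from the seminar; the field names and salting scheme are hypothetical assumptions). Before analysis, a researcher might replace author identifiers with salted hashes and mask @-mentions, so that stored or quoted posts cannot easily be traced back to individuals:

```python
import hashlib
import re

def anonymise_post(post: dict, salt: str) -> dict:
    """Replace the author ID with a salted hash and mask @-mentions,
    keeping only the fields needed for analysis."""
    # Salted hash: the same author maps to the same pseudonym within
    # one project, but cannot be looked up without the secret salt.
    author_hash = hashlib.sha256((salt + post["author"]).encode()).hexdigest()[:12]
    # Mask any @-mentions of third parties inside the post text.
    text = re.sub(r"@\w+", "@user", post["text"])
    return {"author": author_hash, "text": text, "hashtags": post.get("hashtags", [])}

# Hypothetical example record, not real data
posts = [
    {"author": "alice_123",
     "text": "Ask @drsmith about #informedconsent",
     "hashtags": ["informedconsent"]},
]
anon = [anonymise_post(p, salt="research-project-salt") for p in posts]
```

As Dr Martin cautioned, such safeguards are not foolproof: a distinctive quoted post can still be found by searching the platform, so pseudonymisation is a mitigation, not a guarantee.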
Ms Noran Adley, a community engagement officer for UNICEF in Jordan, argued that social listening is the best tool for monitoring and analysing conversations within community settings, including the spread of misinformation. Social listening can be used to inform strategic activities both online and offline, and is also a mechanism for gathering feedback on the implementation of public health interventions. Finally, it is a tracking tool that allows rumours and misinformation to be mapped through the clear identification of community concerns and challenges. Social listening is important because it helps develop insight into community practices while also countering the effects of misinformation, and it has been particularly valuable during the COVID-19 pandemic. It also enables community engagement officers to identify and verify conversations picked up from volunteers who work directly with communities. In Jordan, social listening amplifies the voice of the community and facilitates a two-way feedback mechanism with communities, who feel they are being 'heard'. It has also narrowed the gap between the community and UNICEF's Risk Communication and Community Engagement (RCCE) programmes, ensuring the taskforce designs interventions that are responsive to community needs. Ms Adley also noted how social listening has fed into community engagement activities, creating evidence-based initiatives and trainings that reached their target audience. There are, however, ethical dilemmas in the context of social listening. Firstly, researchers must be able to keep individual data sets confidential. They also have to be aware of who is being excluded: social media in general is difficult to monitor, so it is important to know the different channels through which individuals communicate. It is also difficult to disaggregate data by gender, geography and age group.
Finally, there can also be a wider inconsistency, in that different communities rely on different public health messages, which may be out of sync with the more uniform, centralised public health messaging system. Ms Adley noted that Jordan now has a vaccination rate of nearly 50%; some of this can be attributed to social listening in the context of community engagement.
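As a purely illustrative sketch of the 'tracking tool' idea described above (the records, channels and keywords here are invented, not UNICEF data), the core of a rumour-mapping pipeline often reduces to counting which concerns surface on which channels:

```python
from collections import Counter

# Hypothetical, simplified records gathered from different listening channels
feedback = [
    {"channel": "facebook", "text": "Heard the vaccine changes your DNA"},
    {"channel": "whatsapp", "text": "Where can I get the vaccine?"},
    {"channel": "facebook", "text": "Vaccine side effects worry me"},
]

# Keywords chosen to flag known rumours and common concerns
KEYWORDS = ["vaccine", "dna", "side effects"]

def tally_concerns(records):
    """Count keyword mentions per channel, to surface which rumours
    and concerns dominate where."""
    counts = Counter()
    for record in records:
        text = record["text"].lower()
        for keyword in KEYWORDS:
            if keyword in text:
                counts[(record["channel"], keyword)] += 1
    return counts

tally = tally_concerns(feedback)
```

Tallies like these could then be reviewed by engagement officers to decide which rumours need a response, and on which channel – though, as Ms Adley noted, disaggregating such data by gender, geography and age group is considerably harder.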
Speaking as a clinician, Dr Carolina Batista examined the role of communities in fighting misinformation during the COVID-19 pandemic. COVID-19 has disproportionately affected vulnerable individuals across the globe. The reasons for this are multifactorial; one is the historic mistrust of the biomedical community, a consequence of long-standing exclusion by healthcare systems, racism, lack of access to care, and neglect. Another is the volume of information and the speed at which it travels: it has recently been difficult for people to assess the accuracy of information, which, compounded by the spread of misinformation, poses a considerable threat to public health and pandemic management. Finally, public health officials have engaged inconsistently and ineffectively with populations who have low levels of trust in government. As a result, there is an urgent need to develop and implement culturally appropriate interventions to mitigate the impact of misinformation on vulnerable communities. Furthermore, building and sustaining trust remains vital to any public health initiative, and can only be done through trusted allies, including community leaders, faith-based groups and healthcare workers. Finally, messaging needs to be evidence-based but also co-produced with the communities it targets. Explaining this last point, Dr Batista noted that there is no one-size-fits-all approach; instead, governments and institutions must use social listening on the channels communities engage with, and communities should not be stigmatised on the basis of their stance towards particular public health measures. The future research agenda should address misinformation through three questions. Firstly, what factors make people susceptible to misinformation? Secondly, how does misinformation spread in social networks, and can we inoculate or immunise people against it?
And thirdly, which interventions can help boost psychological immunity to misinformation? Community leaders, scientific experts and public health authorities need to collaborate to develop public health messages that build and enhance community trust, choosing channels and messages accordingly.
Key questions
How do individuals and communities evaluate information in the context of uncertainty?
Professor Caulfield argued that scientific uncertainty was not handled particularly well during the pandemic. Science communication should be clear and accessible, and lead to specific actionable items; however, because COVID-19 was an unknown disease, this did not happen. The public wants to hear about scientific uncertainty, and if scientists are not honest about it, there is a potential to lose public trust. Those who peddle misinformation have weaponised scientific uncertainty, but uncertainty is a normal part of scientific discovery. Dr Batista followed up on this, arguing that in Brazil's partisan atmosphere there were, despite the activism of scientists, multiple attempts to disenfranchise and delegitimise scientists, scientific work and science in general. This is something we have to address, firstly by giving a voice to the science, and secondly by mobilising public health officials and policy-makers in their efforts to aggregate evidence to counteract misinformation. Ms Adley added that within the Jordanian context, if there is a sense of ownership within the community, there is also a sense of responsibility. Furthermore, community influence can be reinforced through community leaders, thereby facilitating solutions. The community response to uncertainty can in fact be positive, particularly if it aligns with capacity-building objectives.
How can we define misinformation when uncertainty exists even in relation to the definition?
Dr Sam Martin argued that during the COVID-19 pandemic, the definition of what counts as legitimate information changed considerably. Something labelled misinformation at one point may in fact become legitimate information later on, and vice versa. The way in which information was shared also changed: during the pandemic, the media often shared scientific literature that had not been peer-reviewed, which contributed to heightened uncertainty. Dr Martin argued that information should be reviewed in reference to the milestones and touchpoints that preceded it; that is to say, information needs to be seen in its wider context and in light of the critical evidence that emerges over time. Uncertainty is always present, so misinformation must be fact-checked carefully and on an ongoing basis before being dismissed.
Is misinformation more of a symptom or a cause of vaccine hesitancy?
Professor Caulfield argued that it is both. There is no doubt that misinformation causes vaccine hesitancy; at the same time, misinformation also taps into ideology and identity, which suggests it is a symptom too. In terms of defining misinformation, Professor Caulfield calls it information that is 'stuck': not information that is still being debated, but information that is categorically untrue and part of a wider culture of doubt and information chaos.
References
https://harris.uchicago.edu/news-events/news/pearson-instituteap-norc-poll-95-americans-say-spread-misinformation-problem
https://www.webmd.com/lung/news/20211110/americans-duped-by-covid-misinformation
https://journals.sagepub.com/doi/full/10.1177/03400352211041135
https://www.thestar.com/news/investigations/2022/03/19/how-vaccination-status-might-predict-views-on-the-russian-invasion-of-ukraine.html (popular press article)
https://www.ekospolitics.com/index.php/2022/03/public-attitudes-to-ukraine-conflict-by-vaccine-acceptance/
https://www.scienceupfirst.com/
https://www.corecommitments.unicef.org/rcce