Collective Service Documentation

(Mis)communication? Social Listening and the Exclusion of Marginalised Voices


In a crisis such as the COVID-19 pandemic, social media users turn to virtual networks to request and share information, locate loved ones, and find a sense of community. Increasingly, social science researchers use social media analytics (SMA) to understand community perceptions and guide public health and risk communication interventions. This paper argues that presenting social media data as an accurate depiction of community-wide insights has the potential to misrepresent community perceptions and to silence and marginalise vulnerable groups.

Designed to track brand insights and contribute to commercial marketing strategies, SMA tools were redeployed during the pandemic to understand sentiments related to COVID-19, vaccines, and trust in authority figures. These automated systems work by using artificial intelligence (AI) to collect and categorise publicly available social media data in vast quantities. The speed at which these tools can turn large data sets into simple visualisations has made them particularly attractive to busy humanitarian agencies. However, as argued here, “the rush to adopt this methodology may result in agencies not fully understanding, mitigating or being able to communicate the limitations of the data.” Some of the limitations include:

  • In every country, there are people who either choose not to use – or do not have access to – social media. Thus, if social media data are used to inform the design of humanitarian responses, we risk designing responses based on the needs of the privileged, while further marginalising vulnerable groups.
  • In most countries, the discourse on social media is held in the dominant language; English has become the default lingua franca for social media. An SMA system that cannot understand or does not recognise minority languages or scripts risks excluding certain voices.
  • The very public nature of social media platforms could make it intimidating for some users to openly engage. Readily accessible data from a potentially narrow portion of the community can present a skewed picture of a society and its perceptions.
  • A challenge impacting all analysis of social media data, either via AI or through manual collection, is the difficulty in determining the authenticity of the users who post.
  • Social media may not be the forum people choose for sharing their opinions on challenging issues that are of interest to researchers, such as political discourse, perceptions, or behavioural insights.
  • The selection of search terms introduces an inherent bias: research is guided by the researchers’ priorities rather than the community’s.

As outlined here, social listening in a humanitarian context is the process of monitoring and analysing community conversations in online spaces (such as social media) to understand needs and inform humanitarian responses. Not every organisation has the resources to perform social listening. So, during COVID-19, the World Health Organization (WHO), the United Nations Children’s Fund (UNICEF), and the International Federation of Red Cross and Red Crescent Societies (IFRC) joined forces to coordinate a network of risk communication and community engagement (RCCE) working groups and taskforces at the local, regional, and global levels. These coordination mechanisms allow insights, including those gleaned from social listening, to be shared among member agencies. The paper analyses 14 issues of two such report series, which draw on social media data collected using SMA tools:

  • COVID-19 Infodemic Trends in the African Region is a weekly social listening report created by the Africa Infodemic Response Alliance (AIRA), a regional network hosted by the WHO that brings together fact-checking, media, and non-governmental organisations. The assessment here finds that, while the AIRA team takes steps to explain the limitations in their methodology in regard to data sources, “a more nuanced approach to demographic data and in particular, language, is required to make this report a more practical tool to inform risk communication responses. While this element may be lacking, a positive and practical feature of these reports is the inclusion of guidance for practitioners near the end of the report.”
  • Social Listening report on COVID-19 Vaccination in Morocco is a weekly social listening report created by the UNICEF Communication for Development staff in the UNICEF Maroc office. The assessment here finds that, while the report authors discuss the limitations present in data collection and analysis, these mentions are very brief. Also, compared to the AIRA reports, the UNICEF reports provide far less risk communication guidance, requiring a leap from data (what people are talking about) to “something actionable that genuinely influences risk communication responses and policy decisions”.

Based on this analysis, the article concludes that “it is our responsibility as scholars to ensure the limitations of any data set are understood and clearly communicated to the audience…. As the aim of these social listening reports is to influence humanitarian policy and risk communication approaches, this deficit risks decisions being inadvertently made on imperfect or misrepresented data…. Further research is needed to assess the actions taken as a result of these reports and how practitioners understood and accounted for limitations and what impact the analysis had on policy and programming.”
