The world is becoming more and more polarized, and the Covid-19 pandemic seems to have worsened the situation. On this blog, my colleagues and I have addressed the challenges of AI, algorithms and data manipulation, misinformation and data responsibility, and privacy and ethics, and have highlighted that the Covid-19 pandemic is not only a health crisis but also a disinformation crisis. That social media also provide opportunities for development may not come as a surprise: listening to the online voices and opinions of people worldwide has become extremely easy, and this publicly available information can be crucial in responding to extremist narratives.
Social media are the places where people talk and exchange opinions about content and companies, often publicly. As “social life all over the globe becomes an “open” resource for extraction that is somehow “just there” for capital” (Couldry & Mejias, 2019, 337), organizations are tapping into that wealth of information by monitoring and listening to social media, as this can improve their understanding of public attitudes towards and awareness of their organization. They do this through Social Listening, defined as “…an active process of attending to, observing, interpreting, and responding to a variety of stimuli through mediated, electronic, and social channels” (Stewart & Arnold, 2018, 86). Originally a private-sector tool, it is employed by many companies in their digital marketing, but it is also becoming increasingly popular in social research and development organizations. This even leads some researchers to state that Social Listening, in certain contexts, is a tool ‘to assess user needs and evaluate programs as an alternative to focus groups or surveys in a way that was more informal, unbiased, and unsolicited’ (Pomputius, 2019, 184) and that ‘a social listening process could provide social marketers (and other market researchers) with an effective and efficient mechanism of measuring a campaign’s success in terms of engagement’ (Shaw, 2021, 456). I have shared a variety of examples of how to employ Social Listening, which you can access in my previous posts.
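To make the process Stewart & Arnold describe a bit more concrete, here is a minimal, purely illustrative sketch of one social-listening step: filtering public posts for topic keywords and tallying a crude keyword-based sentiment. Real social-listening tools rely on platform APIs and trained sentiment models; the keyword sets, sample posts, and the `listen` function below are all hypothetical.

```python
# Illustrative social-listening sketch: keep only on-topic posts,
# then assign a crude sentiment label from keyword matches.
# All keyword sets and sample posts are made up for this example.

KEYWORDS = {"vaccine", "lockdown"}
POSITIVE = {"great", "helpful", "thanks"}
NEGATIVE = {"hoax", "scam", "lies"}

posts = [
    "The vaccine rollout has been great and helpful",
    "Lockdown is a hoax pushed by lies",
    "Nice weather today",
]

def listen(posts):
    results = []
    for text in posts:
        words = set(text.lower().split())
        if not words & KEYWORDS:
            continue  # off-topic post: ignore it
        score = len(words & POSITIVE) - len(words & NEGATIVE)
        label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
        results.append((text, label))
    return results

for text, label in listen(posts):
    print(f"{label}: {text}")
```

Even this toy version shows the two moves that matter: attending only to relevant public conversation, then interpreting its tone before deciding how to respond.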
“The online space is a haven for extremists of all kinds.” (Lee, 2020, 66)
As social media are an open place to share opinions and content, there are, alas, also those who exploit them to spread misinformation for harmful purposes such as hate speech, conspiracy theories (e.g. QAnon), fake news, sexism, racism, violence and terrorism. Extremist narratives are prevalent and of concern in countries worldwide, and increasingly so: ‘in 2013 about 17% of Internet users aged 15 to 30 reported being exposed to extremist messages, in 2015, it was more than 60%’ (Schmitt et al., 2018). By 2021, in times of the COVID-19 lockdowns, it is safe to assume this number has risen further. In the Netherlands, a recent news report emphasized strong concerns over the online radicalization of youths with far-right, white nationalist beliefs, even linking it to accelerationism. In the Dutch NOS News report, Jelle van Buuren from Leiden University says (translated):
“There are hundreds of young people active in various places on the internet, and not everything they share is prohibited, so the services can’t keep an eye on all of it. Yet you have to do something with it. To protect society, but also to protect these often young boys from themselves.”
Jelle van Buuren here touches upon a few concerns that need to be considered. Obviously, this does not apply only to white nationalist ideologies but extends to broader extremist discourses concerning, for example, migration, climate change, elections, terrorism, and religious extremism. People on social media who engage with extremist content will, based on their clicking behavior, automatically receive more and more similar content, bit by bit. One of the issues that needs to be addressed is the role the social media algorithms play in this, which led the Washington Post to write about Facebook that ‘today’s algorithm can turn their feeds into echo chambers of divisive content and news, of varying reputability, that support their outlook’. The algorithms are part and parcel of the problem, but they cannot be held fully responsible, because “people seek out only information that reinforces their prior beliefs, offering ever more opportunities for the spread of hate, misinformation, and prejudice” (Aday et al., 2010), and thus other approaches to tackling these issues are necessary.
“Control of the social media narrative is, for many, equated to control of wider societal narratives and is therefore an end in itself.” (Lee, 2020, 84)
Social media are not just places to share content or disseminate information, but places to meet, interact and recruit. Extremist views spread through recruitment: people get drawn into these networks by others and are then encouraged to take action. Social media platforms have set firm terms and conditions, and they ‘are in charge of enforcing these guidelines and regularly remove content and block users that are in violation of guidelines that they have set on hate speech, inappropriate content, support or celebration of terrorism, or spam’ (Ganesh & Bright, 2020, 10-11). Despite their mission to moderate content and block users that violate their rules, moderation and bans also backfire: blocked users move to different platforms, and those banned by certain platforms gain status in their community or groups (Ganesh & Bright, 2020). Since simply blocking users does not solve the problem either, Ganesh & Bright suggest that, along with other policies and measures, it is important to develop ‘programs to counter the narratives on which extremists thrive while being conscious of rights to free expression and the appropriateness of restrictions on speech’ (Ganesh & Bright, 2020, 7). If the bans do not work, the content moderation does not suffice and the algorithm is not adjusted, there is a need to bring counter narratives to the online places where extremism is developing and leading the discourse. Or as Schmitt et al. put it, as ‘automated algorithms may define putative relationships between videos based on mutual topics, CM [counter messages] can appear directly linked to extremist content’ (Schmitt et al., 2018). This makes sense, as these counter narratives ‘aim to disrupt the communicative activities of terrorist and extremist political and religious groups, either by undermining extremist messengers or messages, or by convincing audiences with alternative ideological messages’ (Lee, 2020, 68).
Social Listening thus appears to be an effective tool for locating extremist discourses and taking counter narratives to them. As previously written, one organization that works on a variety of these issues is the Institute for Strategic Dialogue (ISD). Together with local NGOs, their Against Violent Extremism network has initiated a number of projects that draw on the social listening approach and, based on the insights gained, develop counter narratives to battle extremism. The ISD utilizes publicly available information on social media to take counter narratives to the places where extremism is taking place. The question remains how effective these programs are: is the message actually coming across, and is it effective in diminishing extremist narratives? Studying online initiatives in seven different countries, Davey et al. (2019) from the Institute for Strategic Dialogue state that the approach is effective, as their online interventions are cheap and easy to deliver, but they also conclude that measuring success remains one of the major challenges of these counter narratives, and they quote one of their interviewees:
“Reach does not tell you much. Interactions matter, how often contents are shared and with which sentiment. We do look at that very closely. Are the reactions negative or does it foster discussion and further communication? We think it is good when people interact with and comment on our contents.” (Davey et al. 2019, 24).
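The distinction this interviewee draws, between raw reach and the volume and sentiment of interactions, can be sketched in a few lines. The post data, field names, and the `engagement_report` function below are hypothetical illustrations, not the metrics any particular organization actually uses.

```python
# Hypothetical sketch of the evaluation logic described above:
# judge a campaign post by its interactions and the share of
# negative reactions, not by raw reach alone.

posts = [
    {"reach": 10000, "shares": 50, "comments": 120, "negative_reactions": 30},
    {"reach": 50000, "shares": 5, "comments": 10, "negative_reactions": 1},
]

def engagement_report(post):
    interactions = post["shares"] + post["comments"]
    rate = interactions / post["reach"]  # interactions per person reached
    # Share of interactions that were negative reactions (guard against /0).
    negative_share = post["negative_reactions"] / max(interactions, 1)
    return {"interactions": interactions, "rate": rate,
            "negative_share": negative_share}

for p in posts:
    print(engagement_report(p))
```

In this toy data, the second post reaches five times as many people but fosters almost no discussion, which is exactly why the interviewee dismisses reach as a success measure on its own.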
Because they are the most likely to receive engagement, or in contemporary vernacular to ‘go viral’, posts that contain humor, ridicule and satire have been coined as best practices in countering extremist narratives (Goodall Jr et al., 2012; Beutel et al., 2016; Ördén, 2018). This is in line with Denskus, who argues that “using (social) media to create provocative, contentious or divisive content can be more lucrative than working toward traditional values of dialogue and consensus often associated with the initiatives supported through peacebuilding efforts” (2019, 2). A particularly good and popular example that Ördén cites from the RAN network is that of ‘Exit Deutschland’. This German campaign ‘distributed 250 white power t-shirts at a neo-Nazi music festival that when washed just once changed their logo to “if your t-shirt can do it, so can you” and included the Exit brand’. ‘Receiving 30 million likes on Facebook, the humorous counter-narrative was appreciated by civil rights activists and neo-Nazis alike’ (RAN@2012, 2 in Ördén, 2018).
“What societies do and say in response to extremist views matters greatly.” (Lee, 2020, 85)
Social Listening could be an important asset for staying abreast of the latest developments on social media and for developing communications and counter narratives or messages that can quickly adapt to changing extremist discourses. However, Schmitt et al. also point out that counter messages and narratives could shift the focus onto, or even promote, the extremist narrative instead, as ‘attempts at using social media to spread CM might run the risk of guiding people to extremist, or at least problematic material, due to a certain thematic congruence’ (Schmitt et al., 2018). Yet even if these counter narratives are not successful, Lee proposes a different view: ‘even if it fails to tempt adherents away from violent extremist ideologies, counter messaging is likely to maintain societal rejection of violent extremist narratives’ (Lee, 2020, 69). It may not be ideal, nor a magical tool to combat extremism, but it remains a way for society to show that it is attempting to combat these views. In the end, ‘what societies do and say in response to extremist views matters greatly’ (Lee, 2020, 85).
Side note
This was the theme with which I had the least theoretical and practical experience (I am a member of Facebook and LinkedIn, but not very active on either), so for me personally the past weeks have been challenging and intense, probably the most intense of the Com4Dev master period so far. It has been a particularly steep learning curve: we learned from each other and referred to one another’s work, and it was really interesting to meet with our peers online and come to decisions each time. Everyone in our team was able to contribute to these discussions, and each team member’s expertise was very helpful. I think in the end we developed our ideas in quite a structured way and turned this jointly chosen theme into a blog with interesting ideas and insights. I was interested in the positive functions and good uses of big data from social media, and in highlighting successful projects and research articles in that field, so some of these projects were bundled into a few blog posts dedicated to them. I think it is important to understand the power of social media, not only in terms of algorithms, which is a hot topic, but also because it provides us with new ways of doing research and of aiming to mitigate social injustice. This may contrast with what the others have written, but it is all related. I hope that our readers can and will conclude the same.
Bibliography:
Aday, S., Farrell, H., Lynch, M., Sides, J., Kelly, J., & Zuckerman, E. (2010). Blogs and bullets. New media in contentious politics. Peaceworks No. 65. Washington DC: United States Institute of Peace.
Beutel, A., Weine, S.M., Saeed, A., Mihajlovic, A.S., Stone, A., Beahrs, J.O., & Shanfield, S.B. (2016). Guiding Principles for Countering and Displacing Extremist Narratives. Journal of Terrorism Research, 7, 35. doi:10.15664/jtr.1220
Davey, J., Tuck, H., & Amarasingam, A. (2019). An Imprecise Science: Assessing interventions for the prevention, disengagement and de-radicalization of left and right-wing extremists. The Institute for Strategic Dialogue.
Denskus, T. (2019): Social media and peacebuilding. In: S. Romaniuk, M. Thapa & P. Marton (Eds.), The Palgrave Encyclopedia of Global Security Studies. London: Palgrave Macmillan
Ganesh, B., & Bright, J. (2020). Countering Extremists on Social Media: Challenges for Strategic Communication and Content Moderation. Policy & Internet, 12(1), 6–19. doi:10.1002/poi3.236
Goodall Jr H. L., Cheong, P.H. Fleischer K. and Corman S.R. (2012) Rhetorical Charms: The Promise and Pitfalls of Humor and Ridicule as Strategies to Counter Extremist Narratives. Perspectives on Terrorism, March 2012, Vol. 6, No. 1,
Lee, B. (2020). Countering Violent Extremism Online: The Experiences of Informal Counter Messaging Actors. Policy & Internet, 12(1), 66–87.
Ördén, H. (2018). Instilling judgement: counter-narratives of humour, fact and logic. Critical Studies on Security, 6, 15–32. doi:10.1080/21624887.2017.1377593
Parekh, D., Amarasingam, A., Dawson, L., & Ruths, D. (2018). Studying jihadists on social media: A critique of data collection methodologies. Perspectives on Terrorism, 12, 3–21.
Pomputius, A. (2019) Can You Hear Me Now? Social Listening as a Strategy for Understanding User Needs, Medical Reference Services Quarterly, 38:2, 181-186, DOI: 10.1080/02763869.2019.1588042
Schmitt, J.B., Rieger, D., Rutkowski, O., & Ernst, J. (2018). Counter-messages as Prevention or Promotion of Extremism?! The Potential Role of YouTube. Journal of Communication, 68, 780–808. doi:10.1093/joc/jqy029
Shaw, A. (2021) Promoting Social Change – Assessing How Twitter Was Used to Reduce Drunk Driving Behaviours Over New Year’s Eve, Journal of Promotion Management, 27:3, 441-463, DOI: 10.1080/10496491.2020.1838025
Silverman, T., Stewart, C.J., Amanullah, Z., & Birdwell, J. (2016). The Impact of Counter Narratives. The Institute for Strategic Dialogue.
Stewart, M., & Arnold, C. (2018). Defining Social Listening: Recognizing an Emerging Dimension of Listening. International Journal of Listening, 32(2), 85–100. https://doi.org/10.1080/10904018.2017.1330656