Forecasting pitfalls – Facebook and the development of our digitized world

While I am writing this blog post, there is probably a tense atmosphere in Facebook’s offices in the United States. Facebook is struggling: struggling not to negatively affect the US presidential elections, struggling to prevent misinformation, voter suppression, and foreign manipulation. After harsh criticism of the social network’s role in the 2016 elections, the company, according to its own blog (Rosen, 2020), has prepared well this time. By the time I publish this post, the ballot count may be completed, but the studies on whether Facebook supported democracy instead of harming it will not be. Nevertheless, one thing is already clear: Facebook’s influence and power have not diminished since the last elections.

In my series of four blog posts, I presented four case studies on the role of Facebook in conflict, crisis, and development. I focused on topics that receive less worldwide attention than the US elections, and I selected only a few among many possible cases. In this final post, I want to take a broader view of Facebook, its influence, its possibilities, and its responsibilities.

The academic discourse – especially within development studies and the field of communication for development – mainly covers questions regarding social media as a tool in development. Poell and van Dijck (2018) give insights from different fields of research about social media in protest movements. They carve out how social media platforms transform and influence protests. In her book Twitter and Tear Gas, Tufekci (2017) also discusses how digital connectivity supports or changes protest movements. She uses the image of Sherpas to explain the role of the internet in protests. Sherpas are the Nepalese guides who help tourists on their mountaineering expeditions: they offer experience, company, and guidance, and even carry the mountaineers’ backpacks. In this analogy, a person aiming to climb Mount Everest is a protest movement aiming at political change. The Sherpa, who represents the internet, is a great help and makes it more likely that the goal will be reached, but he cannot take over completely. In the end, the mountaineer still has to climb the world’s highest mountain himself or herself. The same applies to the internet in protests: the internet can strengthen movements and accelerate social and political change, but it cannot replace protest in the real world. Just as the mountaineer needs muscles and physical and mental training to reach the summit, a successful political movement still needs leadership, structure, and active members despite the help of the internet and social media platforms (Tufekci, 2017; Poell & van Dijck, 2018).

But there is another angle that is less prominent in academic research: social media can be seen not only as a tool, but also as an actor. To stay within the image, the Sherpa can have a will of his own and his own ideas about how to support or hinder the mountaineer’s project. Let us have a look at the case studies I presented in my first four blog posts. In all four cases, Facebook interfered in national or world affairs:

  • In Myanmar, Facebook blocked pages and profiles after recognizing that the platform had become a dangerous catalyst in the genocide against the Rohingya minority.
  • In the COVID-19 crisis, Facebook marks and blocks fake news, intervenes when sanitizer and face masks are offered at exorbitant prices, and helps to spread trustworthy information on health issues.
  • In Thailand, Facebook is torn between complying with the harsh Thai lèse-majesté law by blocking the respective content and supporting the current pro-democracy movement.
  • With its own initiative, internet.org, Facebook acts as a player in development by providing internet access to the underprivileged.

While Tufekci (2017) names the advantages and disadvantages of the absence of gatekeepers on the internet, Facebook is increasingly becoming one of those gatekeepers. The Digital Forensic Research Lab at the Atlantic Council conducts research on how Facebook and other social media companies play their role as gatekeepers. As part of the Atlantic Council’s election program, Brooking et al. (2020) published an article about the online disinformation expected around the US elections. They state that “platforms must be held accountable for amplifying false information with serious offline implications, like lies about how to vote”. But is it that easy? Are social media platforms accountable for what their users publish, comment on, like, and share? Or is it more important to keep gatekeepers away and maintain absolute freedom of opinion and expression on social media?

These questions cannot be answered easily. How the operators of a platform should be held accountable differs from case to case. In cases like Myanmar, the US elections, or disinformation regarding COVID-19, there is a fairly broad consensus that Facebook had to intervene. But in cases of conflict, the question of who should be supported is not always easy to answer.

In the fields of sociology and communication, the public sphere is a widely discussed topic. Introduced by Habermas (1989) and others, the concept of an inclusive public space with a participatory discourse everyone can join has been an object of discussion and investigation for many scholars. Gerhards and Schäfer (2010) conducted a study of online and offline debates about human genome research to find out whether the internet provides a more discursive public sphere than the traditional mass media. The result: there is no significant difference regarding discussion and opinion leaders. According to Gerhards and Schäfer, search engines automatically favor well-known institutions and pages that generate a lot of traffic.

“In this way, search engines might actually silence societal debate by giving more space to established actors and institutions, to experts and to expert evaluations and views, thereby replicating pre-existing power structures online. This manner of actor and content selection might be even inferior compared to the old (and already often criticized) mass media, because the latter at least employ journalistic norms like balanced reporting and neutrality when selecting actors and statements, and thereby present a possibly better communication than the internet.” (p. 14)

At the time this study was conducted, social media was not yet as important as it is today. But the findings regarding search engines can be transferred to social media algorithms as well. Tufekci (2017) explains how the internet and social media have widened the public sphere and created new public spaces and possibilities. Nevertheless, she warns against idealizing the past, as Habermas tends to do, and against judging “the networked public sphere merely by comparing an in-depth investigative story in the New York Times or ProPublica with a viral fake story in the ‘Denver Guardian’” (p. 267).

Social media and the internet are not a completely free and open public space. Of course, access is almost unlimited – at least for everyone who lives in an area with an internet connection – and with its internet.org initiative, Facebook is working on giving access to everyone. But there is a difference between producing or publishing information and distributing or receiving it. Donner and Locke (2018) state that the “decades-old and optimistic frame of ICTs for development (ICT4D) is increasingly naïve in accounting for the impact of these platforms—while smartphones and apps may have massively democratized the means of production, they have correspondingly intensely focused ownership of the means of distribution” (pp. 39-40).

At the moment, the gatekeepers are mainly technological actors: search engines, algorithms, artificial intelligence. For Facebook, the algorithm mainly promotes content that seems likely to provoke clicks, likes, and shares. Because this mechanism has contributed to the spread of disinformation in recent years – for example in the US elections, the Rohingya crisis, and the COVID-19 pandemic – Facebook has been improving its control mechanisms and has enlarged its human moderation staff. Furthermore, in 2018 Mark Zuckerberg announced an Oversight Board for his company. In May 2020, the board members were introduced (Clegg, 2020).
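To make that gatekeeping logic a little more tangible, here is a minimal, purely illustrative sketch of engagement-based ranking with an optional moderation penalty. The scoring weights, the penalty factor, and the example posts are my own assumptions for demonstration only; Facebook’s actual news feed ranking is far more complex and not public.

```python
# Purely illustrative sketch of engagement-based ranking.
# Weights, penalty factor, and example posts are assumptions for
# demonstration only; Facebook's real ranking system is not public.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int
    likes: int
    shares: int
    flagged_as_misinfo: bool = False  # e.g. flagged by fact-checkers

def engagement_score(post: Post) -> float:
    """Score a post purely by the engagement it provokes."""
    return 1.0 * post.clicks + 2.0 * post.likes + 3.0 * post.shares

def moderated_score(post: Post, penalty: float = 0.1) -> float:
    """Same score, but strongly down-rank flagged posts."""
    score = engagement_score(post)
    return score * penalty if post.flagged_as_misinfo else score

posts = [
    Post("Verified health advice", clicks=120, likes=40, shares=10),
    Post("Sensational fake cure", clicks=400, likes=150, shares=90,
         flagged_as_misinfo=True),
]

# Pure engagement ranking puts the sensational post first;
# the moderated ranking pushes it down instead.
print([p.text for p in sorted(posts, key=engagement_score, reverse=True)])
print([p.text for p in sorted(posts, key=moderated_score, reverse=True)])
```

Even in this toy model, the sensational post wins under pure engagement scoring and only drops once a moderation penalty is applied – which is exactly the kind of gatekeeping decision the oversight boards discussed below are meant to scrutinize.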

For an initiative of the non-profit The Citizens, this is not enough. The activists, calling themselves The Real Facebook Oversight Board, criticize that the official Oversight Board is not yet operational and that its power is too limited. According to them, Facebook’s Oversight Board can only react after disinformation or hate speech has spread and cannot prevent it. As a volunteer board, the activists want to contribute their knowledge and skills to complement the official board and to prevent another US election from “going wrong” (The Citizens, 2020).

The success of both the official and the unofficial Facebook oversight boards will probably be observed and examined quite comprehensively. Maybe one of them will be able to offer meaningful solutions for how Facebook can improve the way it fulfills its role as a platform, a catalyst for good, and a worldwide public sphere.

Personal reflection

This is the last post of my little series on Facebook. Overall, this blogging exercise was a good experience for me. I enjoyed discovering Facebook’s role in different case studies, seeing the differences and similarities, and recognizing how differently a social network can be used and misused depending on the local situation and the digital literacy of a society. Analyzing the pitfalls Facebook is facing made me understand that development efforts like bringing connectivity to everyone in the world, as Facebook promotes with its internet.org project, are not enough. Digital progress is fast, often faster than the companies involved can anticipate problems. Tufekci (2017) puts the problem in a nutshell:

“Yet we have barely begun to understand what this all may mean. The transformation has been very rapid. There are many parts of the world where there was no electricity just a decade ago, and now where even children have cellphones—and there still may not be electricity, at least not regularly. One key lesson from the past is that our familiarity with a new and rapidly spreading technologies [sic] is often superficial, and the full ramifications of these technologies are far from worked out. Another lesson is that what appears to empower one group can also empower its adversaries, and introduce novel twists to many dynamics.” (p. 263)

ICT4D means a lot more than providing connections and devices. It means more than using new technologies and social media for a good cause. It also means developing the human capability to use them, to understand them, and to prevent misuse. Digital literacy is a keyword here. There might be a gap between the young and the elderly, between the educated and the underprivileged, between those who have had access to ICT for years and those who have just started using smartphones. But as long as technology keeps developing, digital literacy will remain a challenge for all of us.

I liked that this project gave me the chance to break free from academic conventions and try out different styles. As a journalist, I learned to write for my audience and to find angles that might catch readers’ attention. In academic discourse, I sometimes miss that attitude and the commitment to write for everyone. Especially in German academic literature, I have often had the impression that scholars primarily aim at stressing their expertise by using complicated sentence structures and uncommon words instead of making their topics and findings interesting and understandable to their readers. So far, I have mostly kept those two worlds and writing styles separate. But this blogging exercise gave me the opportunity to experiment, to write less academically about the topics of my studies, and to combine academic writing with anecdotal elements.

The blogging environment was also supportive of such experiments. On the one hand, the fact that the blog was public motivated the writing process. On the other hand, I assume that most of the readers were my fellow ComDev students, which made it easier to use the blog as a sandbox and to be creative. The encouraging comments and discussions, as well as the inspiration from my fellow students’ blogs, also helped me open up to less conventional writing styles and formats. I personally would have liked to extend the blogging project: with more time and less pressure, it would have been easier to experiment, and we would have had the opportunity to further elaborate our blog concepts and our personal styles.

References:

Brooking, E. T., Brookie, G., Kargar, S. & Kann, A. (2020). Five big questions as America votes: Disinformation – Elections 2020 by DFRLab. Atlantic Council. Retrieved from https://www.atlanticcouncil.org/blogs/new-atlanticist/five-big-questions-as-america-votes-disinformation/

Clegg, N. (2020). Welcoming the Oversight Board. Facebook. Retrieved from https://about.fb.com/news/2020/05/welcoming-the-oversight-board/

Donner, J. & Locke, C. (2018). Platforms at the margins. In Graham, M. (Ed.) Digital Economies at global margins. Cambridge: MIT Press.

Gerhards, J. & Schäfer, M. S. (2010). Is the internet a better public sphere? Comparing old and new media in the USA and Germany. New Media & Society, 12(1), 143–160.

Habermas, J. (1989 [1962]). The Structural Transformation of the Public Sphere. Cambridge: Polity Press.

Poell, T. & van Dijck, J. (2018). Social media and new protest movements. In Burgess, J., Marwick, A. & Poell, T. (Eds.), The SAGE Handbook of Social Media (pp. 546-561). London: Sage.

Rosen, G. (2020). Preparing for Election Day. Facebook. Retrieved from https://about.fb.com/news/2020/10/preparing-for-election-day/

The Citizens (2020). The Real Facebook Oversight Board. Retrieved from https://the-citizens.com/real-facebook-oversight/

Tufekci, Z. (2017). Twitter and Tear Gas: The Power and Fragility of Networked Protest. New Haven, CT: Yale University Press.