Fighting the Infodemic: Facebook and the fake news on COVID-19

Facebook is a giant. It is not only probably the best-known social media platform in the world, it is also a corporation that owns services such as WhatsApp and Instagram. As an online empire operating worldwide, Facebook is powerful. And it faces many pitfalls, for example regarding data security, hate speech, differing local laws, and censorship. In this series, I will examine Facebook’s not always glorious role in development, conflict, and crises around the world.

COVID-19 changed our lives. The virus entered the world, changed our working lives and our attitude towards physical closeness. For many of us, it turned the world upside down. The pandemic slowed down some parts of life but accelerated others. Digitalization efforts, for example, suddenly gained tailwind: companies discovered the advantages of remote work and home office, and online offerings for education and leisure sprang up like mushrooms. And being quarantined, many people started spending even more time on social media than before. That shift from offline to online favored misinformation as well.

On June 4th, 2020, Věra Jourová, Vice President of the European Commission, gave a speech titled “From pandemic to infodemic”. She addressed the role of social media in the COVID-19 crisis and stated clearly: “The COVID-19 pandemic is just a reminder about the huge problem of misinformation, disinformation and digital hoaxes.”

The term “infodemic” that Jourová used was coined, among others, by the World Health Organization (WHO). In a video, the WHO explains what an infodemic is and what we can do to stop it:

[Embedded WHO video]

But videos like that one often get lost in worldwide networks like Facebook. Algorithms favor posts that seem more attractive. Thus, instead of educational videos, conspiracy theories are spread internationally. For many Facebook users, it is obviously catchier to blame 5G, Bill Gates, or Chinese scientists for the pandemic than to help fight it.

The activist network Avaaz published a report in August 2020 analyzing the reach of health-related fake news on Facebook. The numbers are alarming: in April 2020, according to the activists’ estimates, websites spreading health misinformation reached 460 million views on Facebook. And although Facebook expanded its fact-checking network and worked on identifying and labelling misinformation, only 16% of the articles analyzed by Avaaz carried a warning label. As the report explains, one of the successful tactics of those who publish misinformation is extensive networking: the pages analyzed were interconnected and shared and republished each other’s content. By translating and republishing it, they evaded, or at least complicated, Facebook’s fact-checking mechanisms. According to the report, the top ten websites spreading fake news on health issues reached four times as many users as the ten leading health institutions.

Although the numbers cited in the Avaaz report and some of its methods are contested, the report drew attention to a huge problem. As to how this problem could be solved, or how the infodemic could be “quarantined”, Avaaz suggests two measures:

  • “Correct the record”: users who saw or interacted with posts containing false information should be shown corrections.
  • “Detox the algorithm”: Facebook should modify its algorithm so that misleading content and actors have fewer opportunities to increase their reach.

Facebook’s algorithm started in 2009 as a simple sorting order for posts and has been evolving continuously since. Today, it uses many factors and pieces of information to decide automatically which content is shown to which user. The number of interactions and the popularity of posts play an important role. The problem: many of the posts spreading misinformation about COVID-19 provoke interaction and gain popularity within their respective bubbles and networks. Thus, the algorithm favors them over reliable information.
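To make that dynamic more concrete, here is a minimal, purely illustrative sketch of engagement-weighted ranking in Python. It is not Facebook’s actual algorithm; the post fields, the weights, and the fact-check penalty used to model Avaaz’s “detox the algorithm” suggestion are all assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    shares: int
    flagged_by_fact_checkers: bool = False  # hypothetical moderation signal


def engagement_score(post: Post) -> float:
    # Toy weighting: shares and comments count more than likes.
    return post.likes * 1.0 + post.comments * 2.0 + post.shares * 3.0


def detoxed_score(post: Post, penalty: float = 0.05) -> float:
    # Modeling Avaaz's "detox" idea: strongly downweight fact-checked posts.
    score = engagement_score(post)
    return score * penalty if post.flagged_by_fact_checkers else score


feed = [
    Post("WHO explains the infodemic", likes=120, comments=15, shares=30),
    Post("5G causes COVID-19!!!", likes=800, comments=400, shares=900,
         flagged_by_fact_checkers=True),
]

# Pure engagement ranking puts the conspiracy post on top ...
print([p.title for p in sorted(feed, key=engagement_score, reverse=True)])
# ... while the penalized ranking pushes it below the WHO video.
print([p.title for p in sorted(feed, key=detoxed_score, reverse=True)])
```

In reality, Facebook’s ranking draws on a far larger set of signals and machine-learned predictions, but the basic tension is the same: as long as ranking rewards raw engagement, content engineered to provoke reactions starts with an advantage.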

Of course, Facebook did not react to accusations of supporting fake news only after the Avaaz report. On its blog, the company lists all the measures it took for “Keeping People Safe and Informed About the Coronavirus”. Facebook not only tried to give fake news less space; it also helped reliable sources spread accurate information and supported health and economic relief efforts by donating money, offering free ads, providing data and tools, and promoting petitions. The company also intervened when the market for sanitizers and face masks seemed to get out of control and Facebook ads led to scams and inflated prices. But were these measures enough? Did they come in time, or was Facebook too slow?

It would be interesting to read a follow-up report by Avaaz to see whether Facebook’s measures made an impact after a few months. As in the case of the Rohingya crisis, the COVID-19 pandemic shows once again how dangerous the dynamics of social networks like Facebook can be. Although Facebook took action in both cases, it only reacted after criticism grew loud. A goal for the future would be to recognize the role of social media right at the beginning of a crisis and to react before people get confused, incited, or even harmed. This task is not only Facebook’s duty; it is also one of politics and law. As Věra Jourová stated in her speech: “We must not move into censoring, we must not create ministries of truth. […] But we must equip ourselves better to address the challenges of our digital reality.“