Data’s Dirty Business

On SoDaDev we write about social media, data and development, and how these shape our world. In his recent post, my colleague Stratos analysed the intersection of data with inequalities and discussed the role of data in widening, rather than closing, development gaps. Stratos focused on Cinnamon’s (2020) conceptualisation of data inequalities along three dimensions: access to data, representation of the world as data, and control over data flows.

In my post I would like to focus on yet another dimension of data: the human rights and human dignity aspects of social media platforms’ appropriation of data for profit. In particular, I will look into the tech giants’ lack of accountability for their actions at the fringes of the imperium and at the bottom of the data pyramid (Milan and Treré, 2021, p. 321), and into the practices of digital colonialism. I delve into the platforms’ manipulative business models and their economic interest in preserving the status quo.

The New Oil

In 2017 the Economist published an article, “The world’s most valuable resource is no longer oil, but data”, popularising the phrase “data is the new oil”. The article generated a great deal of discussion about what happens to our data and the tech giants’ role in handling it, prompting legislators in the “North” to regulate how citizens’ data is handled (see the Digital Services Act or the American Data Privacy and Protection Act).

In the context of the “South”, much has since been said about the role of data in development and how it helps address inequalities: through improved monitoring capacities, better targeting of those in need, and identifying those needs in the first place. Data, and digitalisation at large, are presented as THE NEW OIL that is improving lives and is responsible for “lifting a large majority of the population of the world out of dire poverty”.

Setting aside this rather romanticised view of data, it is paramount to understand the digital platforms that play a crucial role in extracting this “new oil” and commodifying it.

It has been widely reported that social media sit at the heart of communication for and about development (Tufekci, 2017). They offer an inexpensive means and space for vulnerable communities to tell their own, otherwise untold, stories, and they support digital activism by allowing those communities to scale up their interventions. A fellow blogger from The New Media Activists recently wrote about an excellent example of the power of TikTok in activism.

Yet in the shadows of this activism sit the digital platforms, which benefit directly from our engagement online and our data relations (Couldry and Mejias, 2019). In fact, recent research by Access Now found that the more controversial the issues discussed online, the more user engagement they generate, and the more visible they become to a larger audience (Prikova, 2021). This is by design: social media ranking algorithms score controversial posts higher in order to maximise the platform owner’s ad revenue. Research commissioned by the European Parliament has called such trends “polarisation by design”, because it is often the highly divisive content, or pure disinformation, that scores best in terms of visibility (and, let us not forget, in direct revenue to the platforms).
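
To make the mechanism concrete, here is a minimal toy sketch of engagement-weighted ranking. This is my own simplification for illustration, not any platform’s actual code; the Post fields and the weights are invented:

```python
# Toy illustration of engagement-driven ranking - my own simplification,
# not any platform's actual algorithm. Posts that provoke more reactions,
# of any kind, float to the top of the feed.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Comments and shares are weighted more heavily than likes because they
    # tend to generate further impressions; these weights are invented.
    return post.likes + 3 * post.comments + 5 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Note what is absent: nothing here asks whether the engagement stems
    # from outrage, disinformation or genuine interest.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", likes=120, comments=8, shares=4),
    Post("Inflammatory hot take", likes=90, comments=60, shares=40),
])
print([p.text for p in feed])  # the divisive post ranks first
```

The point of the sketch is what the scoring function does not contain: any notion of harm or truthfulness. Divisive content wins simply because it generates more reactions.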

Colonialism 2.0

In light of this, is data the new oil that will bring millions out of poverty? A growing body of research shows that data instead turns out to be a new means of oppression, with researchers comparing its extraction to the practices of colonialism (Couldry and Mejias, 2019). Digital colonialism “is the use of digital technology for political, economic and social domination of another nation or territory” (Kwet, 2021), and it comes in many shapes and forms: dataveillance, dependence on foreign hardware and software manufacturers, bitcoin mines, and internet scams bordering on human slavery, to name just a few.

For the purposes of this post I will focus only on examples of data appropriation (Couldry and Mejias, 2019) and the impact of this practice on the subjects of colonialism by data, particularly those who operate on the margins of the social media platforms’ biggest markets. I will look into examples where the data of the colonised is appropriated without their consent, or left unprotected and vulnerable to exploitation – that is, where the social media giants are allowed to mine the new oil while disregarding human rights abuses.

Introducing: Forced Confessions on YouTube

The year was 2021, and Belarus’ Lukashenko had successfully quashed the mass protests galvanised by the sham elections that illegally gave him his sixth consecutive presidential term, fortifying his position as the longest-reigning dictator in Europe. The protests were grassroots in nature; opposition politicians and activists, many of whom were residing outside Belarus for their safety, were vocal in decrying the legitimacy of the election results through various social media channels.

Nonetheless, one notorious dissident, Roman Protasevich, was seen on YouTube broadcasting a different message. Just days after his Ryanair flight was forcibly landed in Belarus under a false bomb threat, the captured activist was forced, by means of torture, to recant his former allegations and legitimise the regime’s actions. The channel hosting the “confession” was clearly loyal to Lukashenko’s narrative, having set its logo to one resembling the Belarusian presidential flag, while the ads in question directed users to pro-Lukashenko Telegram channels.

Setting aside the bizarre circumstances of his abduction, the means of spreading his forced confession were equally outrageous. First brought to attention in a tweet by Tadeusz Giczan, the editor-in-chief of Nexta-TV – Belarus’ largest Telegram channel – the scheme involved gaming YouTube’s paid promotion tools to increase the reach of the forced confessions.

The mechanism was simple: the video of the forced confession was served to a designated audience by means of paid ads, in exactly the same way that ads are normally bought and disseminated among YouTube users. It is not known how much the ads cost to run, but the price cannot have been insignificant: at the time of the video’s premiere, the cost of promoting a video in Belarus was 0.01 euro per view. The view counts of the various propagandist videos on the channel run into the hundreds of thousands – adding up to a whopping 6 million views across all published videos, a number practically unfeasible to reach organically for a channel with around 1,000 subscribers.
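
For a rough sense of scale, here is a back-of-the-envelope estimate of the ad spend. This is my own arithmetic, assuming (as an upper bound) that every view was bought at the quoted promotional rate:

```python
# Back-of-the-envelope estimate of the Belarusian channel's ad spend.
# My own arithmetic: it assumes every view was bought at the quoted
# promotional rate, so this is an upper bound, not a confirmed figure.

cost_per_view_eur = 0.01   # quoted cost to promote a video in Belarus
total_views = 6_000_000    # approximate combined views on the channel

estimated_spend_eur = cost_per_view_eur * total_views
print(f"Estimated ad spend: ~{estimated_spend_eur:,.0f} EUR")  # ~60,000 EUR
```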

But the ad spend is not the crux of the matter in the case of Protasevich. Rather, it is worth considering Giczan’s initial tweet again: despite multiple takedown requests to YouTube, the Belarusian state news agency BelTA had been gaming the ads algorithm to broadcast propaganda and forced confessions from lower-profile dissidents for at least a year prior to Protasevich’s arrest.

As the case gained traction the ads were suspended; however, videos of Protasevich and other propagandist media remain available on the channel.

TikTok profits from livestreams of families begging

Not unlike YouTube in the case of Protasevich, TikTok has recently shown that it takes more than self-regulation to keep exploitation off its platform. It is not certain how long TikTok allowed exploitative begging in Syrian camps for displaced people before the BBC brought the scandal to light, but the broadcaster says it spent at least five months following 30 TikTok accounts and investigating.

The scheme involved Syrian children in the refugee camps, who would go live on TikTok streams, begging the viewers for hours on end to send gifts that can in turn be redeemed for actual cash. To end users these could easily pass as genuine pleas for help, but the reality behind channels transmitting children repeatedly screaming “Send me gifts” is insidious all the way through, and calls TikTok’s own integrity into question. The BBC established with certainty that TikTok takes up to 70% of the gifts’ value.

The investigative team’s suspicion was first raised by the fact that kids on various channels were often saying the same things; it seemed the children were being schooled in begging. Some streams also featured off-camera voices, confirming that the kids were being instructed in how to successfully elicit TikTok gifts. Some of these channels were said to make up to 1,000 USD per hour, gaining traction thanks to TikTok’s algorithms, but often also thanks to celebrities and influencers buying into the scheme and helping the kids go viral.

The author behind the BBC’s case-breaking piece notes that the investigation took months to assess the full scale of the exploitative phenomenon. There are middlemen in the camps who often sell off their livestock and spend their savings to acquire streaming hardware – phones, tripods and UK SIM cards – in order to start the enterprise. TikTok’s algorithms display streams to viewers based on geographical location – hence the UK SIM cards. In the course of the investigation it emerged that the UK audience is the most charitable one, and the scale of the phenomenon was so large that local stores doubled their prices for UK SIM cards in order to profit from this novel market.

After acquiring the hardware, you still need at least 1,000 followers to livestream on the platform; but TikTok comes with its own solution to this problem, advertising the services of its livestream management agencies, whose goal is to propel the spread of the platform among entrepreneurial users who lack the required minimum following.

From there, the scheme works as long as the middlemen can negotiate with displaced families to have their children beg live on stream. The BBC’s investigation showed that of every 100 USD gifted to the streaming children, up to 70% is taken by the social media giant itself (sic!). From what remains, the local money-transfer companies charge their fees and the middlemen take about 10 USD, leaving the family with less than 18% of the gifted sum.
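
Putting the BBC’s figures together, the split looks roughly like this. This is a sketch of my own: the money-transfer fee is assumed at around 10%, chosen so that the family’s share lands just under the reported 18%; the other cuts are as reported:

```python
# Worked breakdown of a 100 USD gift, following the BBC's figures.
# The money-transfer fee is my assumption (~10%), chosen so the family's
# share lands just under the reported 18%; the other cuts are as reported.

gifted_usd = 100.00
tiktok_cut = 0.70 * gifted_usd            # up to 70% retained by the platform
after_platform = gifted_usd - tiktok_cut  # 30.00 USD leaves TikTok
transfer_fee = 0.10 * after_platform      # assumed money-transfer fee (~10%)
middleman_cut = 10.00                     # roughly 10 USD to the middleman

family_share = after_platform - transfer_fee - middleman_cut
print(f"Family receives ~{family_share:.2f} USD "
      f"({family_share / gifted_usd:.0%} of the original gift)")
# -> Family receives ~17.00 USD (17% of the original gift)
```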

According to Access Now, the fundamental standards of dignity and human respect that TikTok claims to maintain ring hollow. Marwa Fatafta of the non-profit says the platform is “creating and enabling an ecosystem that runs on the exploitation of people’s suffering… in violation of its own policies and of human rights”, pointing to a quite contrary state of affairs.

It may be argued that TikTok, which ranks lowest among the major social media platforms in GLAAD’s Social Media Safety Index, is merely facilitating the process rather than encouraging it. Nonetheless, the streams were ignored by the giant for at least five months before the BBC and human rights actors brought the case to light.

Human Rights & Trafficking on Facebook

The higher-ranking platforms, Instagram and Facebook, both now belonging to Meta, are not without their own share of mishandled human exploitation. Leaked documents about Facebook’s internal investigations into the social giant’s handling of human trafficking, exploitation and even hitman recruitment caused an uproar and almost got the platform removed from Apple’s App Store.

In September 2021, The Wall Street Journal released a series entitled ‘The Facebook Files’ – an in-depth analysis of Facebook’s internal documents and staff interviews. The series paints a picture of a social giant unable or unwilling to address grievous cases of human rights violations, essentially enabling offending channels and users to thrive: soliciting sales of domestic slaves in the Gulf region, allowing Mexican cartels to recruit hitmen from among teenagers, inciting violence against minorities in Ethiopia and Myanmar, and allowing coordinated, state-sponsored attacks on dissenters in Vietnam.

The WSJ’s findings brought more than just revelations of wrongdoing in countless Facebook groups. The series uncovered that the corporation was acutely aware of its botched procedures and routinely showed carelessness in moderating its products in developing countries. Instead, Facebook prioritised attentive moderation in the US and ‘the Global North’, where failing to comply with stricter human rights regulation could mean losing access to lucrative markets.

Facebook’s own experts pointed out numerous flaws in their reports. Arab nations speak many different dialects, whereas Facebook’s moderating workforce mainly speaks the Moroccan dialect. This led to cases of mis-moderation in which users mentioning Palestinian holy sites were blocked for terrorism, adding fuel to an already tense situation.

In another example, the Ethiopian government was free to incite hatred towards the Tigrayan minority, with Facebook’s moderators unable to cope with the variety of Ethiopia’s languages.

These dialectal nuances, combined with negligence, allowed a bustling slave-trade market to exist among the Gulf’s Instagram users, where images of domestic slaves – women sometimes as young as 16 – were posted for sale. An internal Facebook memo argued for letting the listings stay on the site because ‘the company’s policies “did not acknowledge the violation”’, despite direct evidence of human trafficking in the groups and details of the deals being available in private messages.

In other cases, Facebook would intentionally overlook human rights violations in order not to lose access to a growing market. The social media giant virtually silenced Mr Bui Van Thuan, a prominent critic of the Vietnamese government, in exchange for that government ending its practice of slowing down Facebook’s local servers. In a statement about Facebook’s operations in Vietnam, the company’s COO Sheryl Sandberg told a Senate committee that “We would only operate in a country when we can do so in keeping with our values.”

The WSJ found that, in a largely PR-driven move to alleviate the backlash over the leaks, Facebook contracted outside assistance to assess and mitigate the damage. One of the resulting recommendations was that the company stop financially benefiting from ads supporting human exploitation.

My personal resolution

The examples above sit at the very heart of the online platforms’ business model: making profit. As we have seen, however, this often stands in contradiction to their duty of care towards, and protection of, their users. These companies make money by harnessing their users’ data and proposing content to their liking, in the hope of generating more digital connections and, ultimately, higher revenues.

The proliferation of COVID-19 mis- and disinformation brought a halt to this El Dorado, prompting the European Union to work on and, this year, finally announce its strengthened Code of Practice, signed by over 30 tech companies that agreed to self-regulate, introduce more tools to protect users, and increase transparency. All three companies analysed here are early signatories of the Code. We will have to wait until the end of 2022 to see the Code implemented, but the good news is that these platforms all operate on a single-interface basis – a change to their algorithms in Europe will be implemented across the globe, hopefully increasing protection for those who sit far from their Silicon Valley headquarters.

I will certainly continue to monitor the news about this legal initiative, and I hope to read stories about successful self-regulation by the social media and tech molochs in 2023. On this note, I am saying goodbye to the SoDaDev blogging experience. It was an interesting journey of making friends with my fellow SoDaDevs, learning from one another and picking up new digital tools. I was particularly engaged by our discussions about the role of social media in our lives and its impact on our wellbeing and safety. The blogging experience helped me reflect on my own digital habits and encouraged me to limit my personal digital carbon footprint by developing a more conscious relationship with my smartphone.

Thank you and see you on- and offline!


References: