ICT, Datafication, Covid-19, Social Listening and AI Technology in Development
Datafication and algorithms and data justice, oh my!

This essay concludes the work I have done with my colleagues on our blog, Big Data – Challenges or Chances in Development? The blog afforded us the opportunity for a deeper dive into the current #datarevolution. New technology has been questioned for centuries, and opinions about our digital arena still diverge. The purpose of this piece is to present an overview of my prior posts and their connection to development and the course literature.

COVID and #datajustice

The first piece I contributed to the blog attempted to show the connection between the COVID-19 pandemic and data justice, a term I had no prior knowledge of. Data justice is defined as fairness in the way people are made visible, represented and treated as a result of their production of digital data (Taylor 2017). The pandemic accelerated our dependency on technology, ICTs and the ability to 'live virtually.' What it also did was open the door for big data to weave its web further into our lives.

My curiosity about big data was sparked when the world began to reopen towards the end of May 2020 in Bangkok, where I was living at the time. Businesses other than grocery shops and pharmacies were once again open to the public, but with a twist: we were required to scan a QR code with our phones for contact tracing purposes. This was my first experience with the technology. Your location, the time you checked in and the time you checked out were all documented. I was skeptical of the data invasion then, and I am certainly no less skeptical about it now.
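
To make concrete what such a system collects, here is a minimal sketch in Python of the kind of record a QR-code check-in platform might store; the class and field names are my own invention, not taken from the Thai system or any real contact tracing platform.

```python
from dataclasses import dataclass
from datetime import datetime

# A minimal, hypothetical sketch of a check-in record; the field names
# are invented for illustration, not taken from any real platform.
@dataclass
class CheckInRecord:
    phone_id: str        # identifier tied to the device or phone number
    venue_id: str        # which shop, restaurant or office was visited
    check_in: datetime   # when the QR code was scanned on entry
    check_out: datetime  # when (or if) the visitor scanned out again

# One scan produces one record like this.
visit = CheckInRecord(
    phone_id="user-1234",
    venue_id="bangkok-grocery-017",
    check_in=datetime(2020, 5, 28, 9, 15),
    check_out=datetime(2020, 5, 28, 9, 40),
)
print(visit)
```

A single visit record looks harmless; weeks of them add up to a detailed movement profile, which is where the privacy question begins.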

The focus will be on two aspects of contact tracing technology – access and privacy.

Access to contact tracing and humanitarian resources

Even during a pandemic, when the world has become exponentially more hyperconnected, those who need humanitarian services the most can still be at the highest risk of exclusion. The use of location data to control the coronavirus pandemic can be fruitful and might improve the ability of governments and research institutions to combat the threat more quickly (Zwitter & Gstrein 2020). However, access to mobile devices often correlates with age and income, and the poor and very old are the least likely to have the resources to own one (Taylor, Sharma, Martin & Jameson 2020). They are also the group who can least afford to miss work, if they happen to be employed. Additionally, they have the least access to healthcare, which makes them the most likely to suffer as a result of the pandemic.

Contact tracing, used to mitigate an epidemic or pandemic, presents a wonderful opportunity to employ big data to help the most marginalized societies. Let us not forget those who would be excluded from data collection, and avoid what Gilman and Green refer to as 'the surveillance gap': the ways in which society's most vulnerable members' 'functional [invisibility]' to surveillance systems can cost them dearly when such systems govern access to resources (Taylor, Sharma, Martin & Jameson 2020).

Please, take my privacy – I wasn't using it anyway

A double-edged sword of access without privacy, or privacy without access, should not exist. At the end of the day, what is all your data really being used for? How long would an itemized list of every entity that has accessed your private data be? No matter how large a public health crisis may become, attempts to mitigate it should not come at the cost of citizens' privacy. Our civil liberties should not disappear because a virus is afoot that might affect many people, because there is something in the air, or because you become sick. I would prefer to avoid a pessimistic view here, as this is a wonderful chance for big data in development. However, this essay concurs with the notion that although networked digital technologies have invaluable and extensive capabilities that could seriously enhance the effectiveness of pandemic containment, their automaticity, speed, scale and data-generating capabilities mean that their potential for abuse and overreach is vastly greater than the risks of abuse associated with more conventional social instruments of control (Taylor, Sharma, Martin & Jameson 2020).

I will touch upon an alternative to this type of 'traditional' data collection further along in this essay.

Algorithms, policy and #datamanipulation

Recent news of the Facebook whistleblower opened minds to how policy decisions can be corrupted by social media platform algorithms, which are literally changing how countries are run. An algorithm is a series of instructions telling a computer how to transform a set of facts about the world into useful information. The facts are data, and the useful information is knowledge for people, instructions for machines or input for yet another algorithm (Denny 2020). An important point to remember about algorithms is that they are biased because they are modeled by humans: to create a model, choices get made about what is important enough to include. It is quite concerning to think of the volume of future policies that could be supported and/or implemented based on manipulated data.
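
As a toy illustration of that point, the sketch below shows a made-up feed-ranking 'algorithm' in Python. The features and weights are invented for the example and do not describe Facebook's actual system; they simply show how the modeler's choices about what matters are baked directly into the output.

```python
# Toy illustration: a feed-ranking "algorithm" is just the modeler's
# choices encoded as numbers. These weights are invented for the example;
# they do not describe any real platform.
FEATURE_WEIGHTS = {
    "predicted_engagement": 5.0,       # what this modeler decided matters most
    "recency_hours": -0.5,             # older posts score lower
    "is_from_friend": 1.0,
    "flagged_as_misinformation": 0.0,  # weight of zero: effectively ignored
}

def rank_score(post: dict) -> float:
    """Combine a post's features using the hand-picked weights above."""
    return sum(weight * post.get(feature, 0.0)
               for feature, weight in FEATURE_WEIGHTS.items())

posts = [
    {"predicted_engagement": 0.9, "recency_hours": 2, "is_from_friend": 0,
     "flagged_as_misinformation": 1},
    {"predicted_engagement": 0.3, "recency_hours": 1, "is_from_friend": 1,
     "flagged_as_misinformation": 0},
]

# The high-engagement post wins even though it was flagged,
# because engagement was weighted so heavily.
print(sorted(posts, key=rank_score, reverse=True))
```

Change one weight and a different kind of content rises to the top; every number in the table is a human decision, which is exactly where bias enters.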

The dark side of big data

Big data will profoundly change not only how governments work but also the nature of politics. It should cause alarm how easily credible movements or opposing policy stances can be discredited while inauthentic discourses are created. Human activity is being turned into data points that can be manipulated towards whatever combination suits a particular bias. It seems regulations should be put in place to curtail big data's influence. In the case of Facebook, if profits are the main concern, even at the expense of societies and governments, then such safeguards should rigorously address "power, politics, inclusion and interests, as well as established notions of ethics, autonomy, trust, accountability, governance and citizenship" (Dencik, Hintz, Redden & Treré 2019). Otherwise, if policy becomes hyper-data-driven, these algorithms will have immense effects on political and economic agendas that could be used to marginalize certain groups more easily. The poorest and most marginalized are also more likely to suffer disproportionately from some of the darker aspects of internet connectivity (Graham 2019).

Data has a performative power that is resignifying political life (Beraldo and Milan 2019). It is unfortunate that the world has leaders who can be manipulated just as easily as the algorithms. This brave new world of solving pressing problems through machine learning has several dark sides (Qureshi 2020). This new political life is one such dark side of big data: how easily it can be manipulated. The will of the people that drives radical positions is, in some ways, being molded by algorithms that promote misinformation and profit at the cost of civics and governance.

ICT and the ‘surveillance gap’

ICTs and the way we collect data will play a large role in who gains representation as well as access, not only to data but to humanitarian resources. A big challenge is to empower people to develop a critical consciousness of ICT (Bentley et al. 2019).

The 'surveillance gap' mentioned earlier in this essay means that the most impoverished groups, those with the least access to mobile phones and internet connections, are the most susceptible to exclusion. ICTs tend to be productivity biased, skill biased and voice biased; those who are already successful, talented or better connected tend to benefit most (Graham 2019). Where possible, development endeavors could aim to use technologies that operate with less bias, or none at all.

Augmenting ‘traditional data’

Data points for those who do not have access to ICTs simply do not exist. Digital technologies are a powerful accelerator of difference and inequality (Graham 2019). The 'traditional' way of collecting data through mobile devices and the internet can render the most marginalized groups essentially non-existent to policymakers and/or development organizations. Furthermore, a sizeable development problem is that high-quality, timely and accessible data are absent in many poor countries, precisely where development needs are highest. Conventional methods of data collection, which require substantial time to conduct and disseminate, have hindered efforts to implement change quickly and effectively. By the time reports are available to policymakers, the data on the ground have already changed.

Earth observation (EO) technology could augment the way data gets gathered and used in development. There is increasing recognition that EO data can support the 2030 UN Agenda for Sustainable Development, and it can also provide much-needed data for climate action and disaster risk reduction. It is an alternative to the expensive big data held by private telecommunications companies, which are not interested in making their data open to researchers.
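
As a rough sketch of what working with such data can look like, the Python example below computes NDVI, a standard vegetation index, from placeholder red and near-infrared bands. A real workflow would read these bands from open satellite imagery such as Landsat or Sentinel-2 scenes, but the logic is the same and no personal data is involved.

```python
import numpy as np

# Placeholder arrays standing in for the red and near-infrared bands of a
# satellite image; a real workflow would load these from open EO data
# (e.g. Landsat or Sentinel-2) rather than generating them randomly.
rng = np.random.default_rng(seed=0)
red = rng.uniform(0.05, 0.4, size=(100, 100))
nir = rng.uniform(0.1, 0.6, size=(100, 100))

# NDVI = (NIR - Red) / (NIR + Red): higher values indicate denser,
# healthier vegetation.
ndvi = (nir - red) / (nir + red)

# A crude regional summary: the share of pixels with sparse vegetation,
# the kind of aggregate signal useful for drought or crop monitoring.
sparse_share = float((ndvi < 0.2).mean())
print(f"Mean NDVI: {ndvi.mean():.2f}, share of sparse pixels: {sparse_share:.0%}")
```

Aggregate indicators like this describe places rather than people, which is part of what makes EO data less privacy-invasive than mobile or social media data.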

This technology, in a sense, brings big data to the most marginalized, is not as invasive to privacy as 'traditional data', and is possibly not as exposed to data manipulation. Perhaps more attention could be placed on this type of technology in the development world. For now, it seems most useful for macro issues like climate and disasters, but technology like this could certainly evolve and become useful for micro issues in the future.

Global digital connectivity is widely seen as essential for economic growth and as having significant potential to help attain the Sustainable Development Goals (Graham 2019). Obtaining accurate data to help the right people will go a long way towards achieving those goals.

Conclusions

Many decisions remain to be made with regard to big data. All of these decisions will implicate human rights and moral values, including rights to individual and collective privacy, to data protection, to freedom of movement and other fundamental rights, and to equality and distributive justice (Taylor, Sharma, Martin & Jameson 2020).

It is both scary and inspiring to consider what big data, and datafication in general, does in societies and what it will continue to do as it evolves. I tend to have a more pessimistic view of big data, but I undoubtedly see its potential to unlock limitless possibilities for the betterment of global development.

Reflections

This blogging assignment was very challenging. My colleagues and I were in three different parts of the globe and we each had our own (in)competencies when it came to social media and creating and running the technical side of the blog. I am not the most technical of people when it comes to social media and digital design, but I am glad to have the knowledge that I now have. The dynamic of our group worked well. We were all rather active and engaged with the blog. I was educated and motivated by every post my colleagues published. One of the more challenging aspects of this blog was adapting a writing style more suited for blogging. We were all admittedly very new to the topics surrounding datafication and I believe we each did well writing about our big data interests in unique ways that simplified understanding. I hope our readers gained knowledge and greater insight into big data and its global influence.

References  

Bentley, C.M., Nemer, D. & Vannini, S. 2019: "When words become unclear": unmasking ICT through visual methodologies in participatory ICT4D, AI & Society, 34, 477–493.

Beraldo, D. & Milan, S. 2019: From data politics to the contentious politics of data, Big Data & Society, 6:2.

Dencik, L., Hintz, A., Redden, J. & Treré, E. 2019: Exploring Data Justice: Conceptions, Applications and Directions, Information, Communication & Society, 22:7, 873–881.

Denny, J. 2020: What is an algorithm? How computers know what to do with data, The Conversation, 16 October 2020. https://theconversation.com/what-is-an-algorithm-how-computers-know-what-to-do-with-data-146665

Graham, M. (ed.) 2019: Digital Economies at Global Margins. Ottawa, ON/Boston, MA: IDRC/MIT Press.

Qureshi, S. 2020: Why Data Matters for Development? Exploring Data Justice, Micro-Entrepreneurship, Mobile Money and Financial Inclusion, Information Technology for Development, 26:2, 201–213.

Taylor, L. 2017: What is data justice? The case for connecting digital rights and freedoms on the global level, Big Data & Society, 4:2.

Taylor, L., Sharma, G., Martin, A. & Jameson, S. 2020: Data Justice and COVID-19: Global Perspectives, Meatspace Press.

Zwitter, A. & Gstrein, O.J. 2020: Big data, privacy and COVID-19 – learning from humanitarian expertise in data protection, Journal of International Humanitarian Action, 5:4.