The discourse around ICT for development (ICT4D) offers diverse and contradictory opinions about its role in development and in bridging the gap between the developed and developing worlds. On the one hand, some believe ICT has an almost magical power to empower people and address inequalities within countries and across borders. On the other hand, others believe that ICT, like other aspects of globalization, deepens those same inequalities, since ICT is not neutral or free from the influence of big corporations and political superpowers. In this article, I discuss and explore the critical questions surrounding ICT4D, with a focus on AI and big data, drawing on the NMICT course literature and the group’s insights shared through the blog’s posts and discussions.
AI Between Neutrality and Power
AI, big data, and algorithms are buzzwords that get thrown around carelessly, yet they have contested meanings across disciplines, including media studies, development studies, and computer science, and, like ICT4D, these terms are impossible to define conclusively.
Let’s take AI as an example. Exaggerating the capacity of AI and granting it a god-like power dominate the discussions around it. Moreover, AI tools are often presented as objective and value-free. Such approaches divert people from discussing the real dangers of AI in daily life and, consequently, in development (Birhane, 2019).
As Cave and Dihal argue, AI is not a neutral force free from cultural, ideological, and racial influences. As they analyse in their article, images of AI are not generic representations of human-like machines, but avatars of a particular rank within the hierarchy of the human. Most images of robots and AI products depict Westernized, white, and colonial persons and ideas; the heritage of colonization still plays a role in the racialization of AI (Cave and Dihal, 2020).
AI systems are not neutral. They can be biased: commercially biased, biased toward Western norms, adult-biased, white-biased, and male-biased. This is why governments and industries need to ensure that all AI systems are aligned with the rights, needs, and realities of children. Much of children’s interaction with AI-based systems will take place at home or at school, so parents and teachers need to be aware of the challenges, risks, and opportunities that AI can bring to children.
UNICEF, as a key player in the field of AI for development and AI for children, has engaged heavily in the discussions around AI biases and has developed policies and guidance on how to teach AI not to discriminate. One of the core recommendations was to consider the different realities of children. Children are not the same everywhere; we cannot separate them from their economic and social contexts. Children are poor and rich, empowered and marginalized, safe and endangered. Yes, global guidance is important and necessary, but it is also important to consider the specific realities of, for example, minorities or children with disabilities (Alwaday, 2021).
One of the barriers in the discussion around big data and AI is that the traditional approach toward AI and ICT for development treated “data” as something up for grabs. However, critical questions are now being asked of data, including questions of data justice, data ethics, and who gains access to and can extract value from data (Cinnamon, 2019).
Cinnamon (2019) articulated the critical questions that need to be asked when dealing with AI, big data, and ICT4D:
– How does data production and use shape the world around it?
– What mechanisms enable personal data to be controlled by corporations?
– How do data mining and algorithmic discrimination techniques shape individual and collective identities and life chances? (Cinnamon, 2019)
In addition to the growing concerns about increasing economic inequality between the global North and South, similar concerns have recently been expressed about data inequality. In this regard, data justice can mean several things. First, it means that people have fair and equal access to data in terms of technology and financial capacity. Second, it means that all have the media literacy to protect themselves from online abuse, along with a protective, human rights-based legal framework to safeguard vulnerable groups. Third, and most importantly, it means a balanced representation of races, genders, cultures, and socio-economic realities in AI “products”: a balance that moves beyond the white, male-dominant racialization of AI and algorithms (Kessel, 2021).
Data Justice and Information Poverty: New Challenges or Upcoming Opportunities?
The new world of information has coined the term “information poverty”. Britz defined it as follows:
“information poverty is defined as that situation in which individuals and communities, within a given context, do not have the requisite skills, abilities or material means to obtain efficient access to information, interpret it and apply it appropriately. It is further characterized [at societal/national scales] by a lack of essential information and a poorly developed information infrastructure” (Cinnamon, 2019).
Information poverty is a complex term that can be addressed from several angles. One of these angles is the balance between information and privacy, or the balance between the right to information and the right to protection. The right to information refers to the right of all people to generate, access, acquire, transmit, and benefit from information. The right to protection concerns protection from all harms that can arise from the misuse and unintended consequences of data and ICTs, particularly for vulnerable groups (Zwitter and Gstrein, 2020). Zwitter and Gstrein argue that what matters here is not the abstract existence and observance of these rights, but enabling effective application procedures for individuals and populations affected by crises, by enforcing national and global legal frameworks and laws that enable the vulnerable to protect themselves or to punish the abusers (Zwitter and Gstrein, 2020).
These elements of information access and information poverty have been highlighted during the COVID-19 crisis, bearing in mind that the pandemic makes vulnerable groups more vulnerable and more subject to data abuse and harassment. The COVID-19 pandemic is an unprecedented crisis not only on the health side, but also in the use of data and online information. Handling the crisis had three main parts: health (vaccines and treatment), social (social distancing and social responsibility), and information (tackling rumours and spreading accurate information).
One of the biggest data/AI elements of handling the pandemic was location data, used to collect information about people’s mobility and send warnings about possible contact with someone who has COVID-19. This technique has been discussed from both the “right to information” point of view and the “right to protection” point of view. Although location data provided governments and humanitarian actors with valuable information to track the outbreak, it also raised concerns about using these apps as mass surveillance apparatuses to monitor people’s movements. As Brad Smith, Microsoft’s president, argues: “When your technology changes the world, you bear a responsibility to help address the world you have helped create” (Smith, 2018). On one side, Big Tech can serve as a set of tools used to make the world better; on the other, it can be turned into weapons used to hack privacy, ruin security, and undermine democracies. When the new digital technologies, the so-called Big Tech, invaded the world and transformed life, making it more and more online, new challenges and unprecedented questions emerged, shedding light on the promises and perils of the new technology (Smith, 2018).
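To make the mechanism concrete, here is a minimal, purely illustrative sketch of the kind of proximity check a location-based exposure-notification app performs. Real systems use far more sophisticated methods (often Bluetooth signal strength rather than GPS coordinates), so the function names, the 10-metre radius, and the 15-minute window below are all illustrative assumptions, not any actual app’s logic:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in metres."""
    R = 6371000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def possible_exposure(ping_a, ping_b, radius_m=10, window_s=900):
    """Flag two location pings (timestamp, lat, lon) as a possible
    contact if they fall within radius_m metres and window_s seconds."""
    (t1, lat1, lon1), (t2, lat2, lon2) = ping_a, ping_b
    close_in_time = abs(t1 - t2) <= window_s
    close_in_space = haversine_m(lat1, lon1, lat2, lon2) <= radius_m
    return close_in_time and close_in_space

# Two pings five minutes apart, a couple of metres from each other
print(possible_exposure((0, 52.5200, 13.4050), (300, 52.52001, 13.40502)))  # True
```

The privacy debate follows directly from this sketch: to run such a check at population scale, someone must collect and compare a continuous trail of everyone’s timestamped locations, which is exactly the surveillance capacity that worries critics.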
The nature of digital technologies is that they operate in entirely new areas that have not been regulated by governments or even considered part of the legislative system. More challenging still, regulating Big Tech is not, and cannot be, the responsibility of one country or one government, for several reasons: a) digital technologies have no boundaries; Facebook is a US-based firm, yet its influence and users span the globe; b) political systems and legislative principles vary widely from country to country, which raises several questions, such as:
– Is it possible to come together and agree on international regulations for Big Tech operations (especially social media)?
– Is it possible that a democratic regime like Sweden and a totalitarian regime like China could agree on the same principles to regulate freedom of speech and content gatekeeping on social media platforms? (Alwaday, 2021)
The so-called “information revolution” is the product of two developments: the massive increase in data generation and the extraordinary possibilities for analyzing data. Together, these two developments created a new information sphere in which virtual reality cannot be separated from real reality, and in which virtual reality operates on real reality. This raises many ethical questions that go beyond traditional ethical standards. For example, robots and machines can be taught many task-related algorithms, but there is no algorithm for ethics and empathy. The relationship of sharing and caring among every living being in the biosphere is bioethics, and that is the one magical element that separates human intelligence from machine intelligence (Paul, 2021).
Moreover, the large scale of data generation and analysis is compounded by another factor: the production of vast amounts of information about vulnerable people who have never before been part of data production or consumption. Furthermore, the right to information as a legal instrument for using ICT for development is not a straightforward concept. Yes, there is this tension between information and privacy, but the right to information itself has its own internal challenges and contradictions (Jose, 2021).
Algorithms, the New Big Brother
One of the biggest challenges to freedom of thought and speech is the algorithm. An algorithm is a series of instructions telling a computer how to transform a set of facts about the world into useful information. For example, search algorithms determine which data and pages appear in your search results, and other algorithms (medical ones, for example) determine the criteria by which medical information or medical opinions appear. No matter how large and varied the data and information available on the Internet, you only see what the algorithms choose for you, based on the criteria and preferences set within them. Herein lies the deception and control.
The first customers interested in algorithms are the big advertisers, because their ads work better when the algorithms support the preferences that lead people to their products.
We live in a world where we come to know the “real world” through the “virtual world”. If you want a restaurant for dinner with your wife, you search for options through food applications. If you are looking for a doctor, you turn to a well-known medical app. But never think that these apps direct you to all the available restaurants or all the doctors out there. They operate on complex webs of code, software, and algorithms tied to advertisements and large investors, and accordingly you are directed to a very limited number of options, even as you move from site to site and page to page.
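The argument above can be illustrated with a toy ranking function. This is a deliberately simplified sketch, not any real platform’s algorithm: the restaurant names, the relevance scores, and the `ad_boost` weight are all invented for illustration. The point is that a hidden commercial weight, set by the platform rather than the user, can reorder what the user sees:

```python
# Toy example: results are ordered not only by relevance to the query
# but also by a paid "promotion" boost, so the first result reflects
# the platform's commercial preferences, not just the user's needs.
restaurants = [
    {"name": "Corner Bistro",   "relevance": 0.9, "promoted": False},
    {"name": "MegaChain Grill", "relevance": 0.6, "promoted": True},
    {"name": "Quiet Garden",    "relevance": 0.8, "promoted": False},
]

def score(item, ad_boost=0.5):
    # ad_boost is an opaque weight chosen by the platform; the user
    # never sees it, only its effect on the ordering.
    return item["relevance"] + (ad_boost if item["promoted"] else 0.0)

ranking = sorted(restaurants, key=score, reverse=True)
print([r["name"] for r in ranking])
# → ['MegaChain Grill', 'Corner Bistro', 'Quiet Garden']
```

Note that the promoted restaurant outranks a more relevant one; the user experiences this simply as “the best result”, which is exactly the deception the paragraph above describes.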
Although the debate about the relationship between data justice, privacy, and the ethics of AI/big data for development has been running for the last two decades, the technological responses to COVID-19 reignited the conversation about the relationship between privacy and the collection of personal information during public health crises. Zwitter and Gstrein’s argument is that the relationship between privacy and health should not be framed as a choice, and that the traditional framework of the right to privacy is not capable of addressing new challenges such as pandemics. When people refuse the collection of data about their movements to track the spread of the pandemic, and base their refusal on the right to personal privacy, they are oversimplifying the debate. This misuse of privacy rights has been mirrored by the misuse of personal freedom rights by anti-vax groups to reject vaccination.
Conclusion
The debate about big data for development and big data for humanitarian/pandemic response is crossing old lines into new, red, and hypersensitive ones. Datafication has put new weapons in the hands of institutions and corporations in the business of managing people, and it seems to hit hardest where people, laws, and human rights are most fragile. Regardless of the challenges and concerns, AI, big data, and algorithms have great potential to put forward new and effective practices for Digital for Development, Development in a Digital World, and Digital in Development (Roberts, 2019).
Beyond the “black and white” simplification, ICT4D, as an academic/research field and as a practice, requires deeper research and a holistic approach that can analyze both the tools and the weapons, the promises and the perils.
Finally, a Personal Reflection!
In the blog, I wanted to explore trending issues alongside the core and thematic readings for this course, and link them to my twelve years of field experience with UNICEF, Oxfam, and USAID. Blogging provides an effective tool to engage with a wide range of audiences. Although I am an active blogger on Facebook, with over 50,000 followers on my page, the experience with this blog on big data was a bridge between the academic and the practical parts of my ComDev career. In my posts I tried to simplify terms and issues, while also trying to avoid the risk of superficiality and of making complex things simpler than they really are. One of the key lessons from the blog experience is the importance of balancing simplicity of content with accuracy. It helped to start from my own experience of not knowing, and to imagine a reader or viewer who lacks basic knowledge of ICT4D, without compromising academic rigour.
Blogging about ICT4D is a critical activity that puts us face to face with deep ethical questions about engaging and protecting vulnerable groups in society. The war in my country eliminated the simplest forms of traditional journalism and mass media. But luckily, social media took us into a new era, replacing what we had lost in traditional journalism and moving us into the age of citizen journalism and social movements. I have realized that virtual reality is just as important as real reality. Big data and AI have huge potential to improve people’s lives. This is the key lesson I learned from the blog.
References
Alwaday, H., 2021. Artificial Intelligence and children. Abusing or Empowering? – Big Data – Challenges or Chances in Development?. [online] Big Data – Challenges or Chances in Development?. Available at: [Accessed 5 November 2021].
Alwaday, H., 2021. Beyond Facebook Controversy, The Promise and the Peril of the “Big-Tec for Development” – Big Data – Challenges or Chances in Development?. [online] Big Data – Challenges or Chances in Development?. Available at: [Accessed 5 November 2021].
Birhane, A., 2021. The Algorithmic Colonization of Africa — Real Life. [online] Real Life. Available at: [Accessed 5 November 2021].
Cave, S. and Dihal, K., 2021. The Whiteness of AI. [online] Springer Link. Available at: [Accessed 5 November 2021].
Cinnamon, J., 2019. Data inequalities and why they matter for development. [online] Taylor & Francis. Available at: [Accessed 5 November 2021].
Jose, M., 2021. Contact tracing – balancing access and privacy – Big Data – Challenges or Chances in Development?. [online] Big Data – Challenges or Chances in Development?. Available at: [Accessed 5 November 2021].
Kessel, S., 2021. How can algorithms, big data and AI cause biases and unfairness? – Big Data – Challenges or Chances in Development?. [online] Big Data – Challenges or Chances in Development?. Available at: [Accessed 5 November 2021].
Paul, S., 2021. Empathy in the Machine-led World – AI and Ethics – Big Data – Challenges or Chances in Development?. [online] Big Data – Challenges or Chances in Development?. Available at: [Accessed 5 November 2021].
Roberts, T., 2019. Digital Development: what’s in a name? | Appropriating Technology. [online] Appropriatingtechnology.org. Available at: [Accessed 5 November 2021].
Smith, B., 2018. Tools and Weapons: The Promise and the Peril of the Digital Age. 1st ed. Penguin Random House.
Zwitter, A. and Gstrein, O., 2020. Big data, privacy and COVID-19 – learning from humanitarian expertise in data protection. [online] Journal of International Humanitarian Action. Available at: [Accessed 5 November 2021].