
Welcome to the Anonymice’s one and only Afroxpter mouse, here to take you on a journey through the nature of data and its ‘fication’: how it affects various facets of life, and its continual impact on several social challenges. As the title says, the computer, which in this case is taken to mean the algorithms, has the power to determine the fate and course of events at both the individual and the global scale. Your data is collected, processed, and scored. Based on your score, you are denied … entry, a loan, a reduced sentence, health care, college admission, etc. This is more real than fiction, as Artificial Intelligence (AI) and Machine Learning (ML) systems take over more and more decision-making functions. But fear not! AI and ML systems are also increasingly being used to address various ills and challenges facing society. Our discussions cover how the collection of data affects societies: how it drives development, social cohesion, and cultural identities. Why is data important to ideas about culture, identity, power, and development? I hear the murmurs from the back. Perhaps the question should be: is technology neutral, or is it subject to the values and purposes of its creators, implementors, and users? Are datafication processes imaginative enough to predict and change the future to increase fairness? Or are they simply driven by the profit motives of the data companies?
Woah! That’s a lot of not-very-positive pokes at datafication and its current effects on societies, relationships, and consequent outcomes. Here is an example of AI and datafication being put to good use: Rwanda’s health services are using AI and datafication to improve the provision of health care to citizens. AI is now part of the UN’s arsenal of tools for meeting the Sustainable Development Goals (SDGs).
AI and Digital Poorhouses
The term “poorhouse” evokes Dickensian images of the workhouse in Oliver Twist. However, poorhouses were a common sight in the USA and Europe well into the 20th century. In Sweden, poorhouses were established from 1642 and maintained until around the 1950s. Ah! What is the relevance of poorhouses that have long been consigned to the dusty corners of history? The digital age has provided us with poverty-management systems that automate decision-making based on data mining and predictive analysis. Virginia Eubanks argues in Automating Inequality that our modern tools retain a punitive, moralistic view of poverty in which the poor are denied access to public resources, their spending is policed, their sexuality curtailed, and failure to comply is punished. For the poor and marginalised, the refrain “Comply or Die” is as common as breathing. Well, perhaps that is a little dramatic, but the consequences of datafication have life-changing impacts on thousands of people daily. Datafication results in categorisation into the deserving and the undeserving, thus justifying why parts of our societies continue to lag behind in development.
This raises the question of how to use digital tools in a manner that does not entrench the unfairness and cruelty of existing systems.
Does my face look good enough for the AI system?
AI systems such as facial recognition have proven inconsistent when identifying people with different skin tones. Is misidentification of lighter skin tones racist, or is it only racist when it applies to darker skin tones? How do AI and data systems handle such conflicts? There is an upward trend in the use of AI facial-recognition systems to tackle crime and migration and for general monitoring in the name of safe communities, creating what is being termed “algorithmic surveillance”. How are these datafication systems affecting democracy and citizens’ rights? Can governments be trusted to monitor, and not surveil, their citizens? What happens when social and development challenges are reduced to technical issues?

Humanitarian agencies are increasingly using digital tools to respond to various crises. This has raised questions about whether institutions can be trusted with people’s data. For example, data collected by the UNHCR from the Rohingya people was handed over to Myanmar’s government, a government accused of genocidal crimes against humanity. The use of biometric data in Northern Kenya has highlighted how identity, a colonial legacy, continues to create tensions between citizens, refugees, and people with unverifiable birth records (stateless people).
Datafication, Subalternity and Afro-pessimism
Afro-pessimism, a critical theory grounded in the experiences of people of African descent, questions conventional narratives of race and oppression. Does Afro-pessimism compound the idea of victimhood among black people? Datafication has emerged as an important lens for understanding the continued struggles faced by black communities worldwide. The question is how the changing digital landscape shapes, eliminates, or perpetuates racial inequalities and stereotypes. Is there exceptionalism in Afro-pessimism, i.e. does the same pessimism apply to both American and African experiences?

To whet your appetite even more, here is an extract from Weapons of Math Destruction by Cathy O’Neil, which articulates the impact of datafication and automation on the poor and marginalised: “Poor people are more likely to have bad credit and live in high-crime neighbourhoods, surrounded by other poor people. Once the dark universe of WMDs (Weapons of Math Destruction) digests that data, it showers them with predatory ads for subprime loans or for-profit schools. It sends more police to arrest them, and when they’re convicted it sentences them to longer terms. This data feeds into other WMDs, which score the same people as high risks or easy targets and proceed to block them from jobs, while jacking up their rates for mortgages, car loans, and every kind of insurance imaginable. This drives their credit rating down further, creating nothing less than a death spiral of modelling. Being poor in a world of WMDs is getting more and more dangerous and expensive”.
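O’Neil’s “death spiral of modelling” is, at its core, a feedback loop: a low score triggers penalties (higher rates, fewer opportunities) that push the score lower, while a high score attracts rewards that push it higher. Here is a minimal toy simulation of that dynamic; the threshold, penalty, and reward values are invented purely for illustration and do not come from the book:

```python
def update_score(score, threshold=600, penalty=25, reward=10):
    """One round of the loop: systems penalise low scorers and reward high scorers.

    Below the (made-up) threshold: predatory rates and blocked jobs drag
    the score down. Above it: cheap credit and stable work lift it up.
    """
    if score < threshold:
        return score - penalty
    return score + reward


def simulate(start, rounds=10):
    """Track a score through repeated rounds of automated decisions."""
    history = [start]
    for _ in range(rounds):
        history.append(update_score(history[-1]))
    return history


# Two people separated by a small initial gap end up on diverging paths.
print(simulate(580))  # starts just below the threshold: spirals downward
print(simulate(620))  # starts just above the threshold: climbs steadily
```

The point of the sketch is that neither trajectory reflects anything the individual did differently; the divergence is produced entirely by the scoring loop itself.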
Are our lives the sum of our data: biometric data to claim citizenship; academic, employment, and economic data to determine eligibility for public services? Are our humanitarian sympathies subject to the vagaries of datafication? Hmm, can we be certain of that?