Did you know that Big Tech companies, such as Meta, are building psychographic profiles of you?

You did not? Well, neither do most users.

This question was quietly waiting to be addressed in my previous post, where I discussed the power of datafication and the challenges and choices of users, who often, if not always, serve as the product themselves.

First of all, let us define the term: `Psychographics`.

“The term Psychographics refers to a quantitative method to describe and segment consumers on the basis of psychological attributes such as behavioral preferences, personality, beliefs, opinions, interest, attitudes, values, and to some extent habits and lifestyles.” (Hult, 2017)

It relates closely to “the datafication model”, which describes the process by which new personal information is deduced by employing predictive analytics on previously gathered data. This model complements the existing models of privacy, “the surveillance model” and “the capture model” (Mai, 2016: 2). It is no secret that market segmentation techniques have evolved over the past years, but still only a few people realize what those enhanced tactics actually mean in practice for the users, or `us` as the product, or, to use the more accurate term, `target groups`. By capturing data on our `interacting characteristics`, such as socio-economic status, gender, ethnicity and place of origin, we may become a source of information for different kinds of databases, and the level of influence they can have over us may vary (Taylor, 2017).

An individual’s psychographic profile can be inferred, to quite an accurate level, from the data the user leaves “behind” when surfing online and performing daily activities such as engaging on a social media site, making an online purchase or using a mobile application. In other words, even when someone just wants to exercise their freedom and privacy and enjoy their well-deserved free time as they please after, let’s say, a hard day’s work, they are not alone in their privacy. Not alone at all. And that is fine if you know of it, admit it and accept it, but what if you are not aware of it, or of the hidden pitfalls that are part of surfing the internet and using technology in general today? Would you be interested to know?

Using linear regression techniques, algorithms can predict psychological traits and states of individual users and, as a result, produce and present attractive tailored messages, web designs, even background colors to match individual `consumer` needs (Hult, 2017). “The online collection of demographic data (e.g. gender, age, income, nationality) and behavioral data (e.g. online purchasing behavior, clicks, likes), together with psychographic profiling, provide these platforms with highly detailed insights about their users and razor-sharp abilities to accurately predict (predictive analytics) and influence behavior online (targeted actions).” (Hult, 2017) Digital platforms now not only know ‘who’ you are and ‘what’ you do, but can also explain ‘why’ you do it, which is, if not outright worrying, at least questionable: is this for the better or for the worse of the individual user at stake and of society in general? A range of such `interacting characteristics` “determine how individuals become administrative and legal subjects through their data and, consequently, how that data can be used to act upon them by policymakers, commercial firms and both in combination.” (Taylor, 2017)
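To make the mechanics concrete, here is a minimal sketch of such trait prediction, assuming a platform already holds behavioral data and self-reported trait scores for some of its users. The feature names, numbers and trait scale below are entirely hypothetical, and real systems use far richer data than three columns and far more sophisticated models than plain linear regression:

```python
# A minimal, hypothetical sketch of psychographic inference via linear
# regression; all features, values and the trait scale are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row is one (hypothetical) user: hours online per day, "likes"
# per week, and the share of their activity happening at night.
behavior = np.array([
    [2.0,  15, 0.10],
    [6.5, 120, 0.45],
    [4.0,  60, 0.20],
    [8.0, 200, 0.60],
    [1.5,   5, 0.05],
])

# Training labels: a trait score (say, extraversion on a 0-1 scale)
# for users whose trait the platform already "knows".
extraversion = np.array([0.2, 0.8, 0.5, 0.9, 0.1])

model = LinearRegression().fit(behavior, extraversion)

# Infer the trait for a new user who never disclosed it.
new_user = np.array([[5.0, 90, 0.30]])
print(model.predict(new_user))  # an inferred psychographic score
```

The uncomfortable point is the last line: the new user never answered a personality questionnaire, yet the model assigns them a score anyway, and that inferred score can then drive the tailored messaging described above.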

Depending on which side you are on, and on your personal attitude towards such a dedicated form of predictive analytics, the benefits, or dangers, of using psychographics include:

  1. Predicting future product choice – Do you still think you saw that particular campaign by coincidence?
  2. Political campaign effectiveness – Was it really your vote, or did someone influence you to vote that way?
  3. Personality-based advertising campaigns – That tailored campaign messaging really resonated with you, didn’t it? It’s like it was written exactly for you, right? Well, perhaps it was, based on your personal data and the choices you made online.
  4. Crime management – Predicting a potential crime someone might commit sounds unrealistic? Well, not that much anymore. (Hult, 2017)

The surveillance and capture models are, at their core, concerned with privacy protection as a way of controlling data gathered through surveillance or capture activities. The datafication model, on the contrary, “assumes that data have been collected, amassed, stolen, bought, hacked, or otherwise acquired and that the privacy concerns at play are the construction of new knowledge, insights, or realities based on the available data.” (Mai, 2016: 8) Datafication aims to transform social behavior characteristics and patterns into quantified data. Used the right way, the model can of course open up new prospects in the sales world, increase marketing reach towards the target audience, help convert leads into opportunities, and serve as an efficient new business model for expanding and opening new horizons for companies. That is, provided data privacy is managed as intended and abides by the law, if not the legal one, then at least the moral one. `With great power comes great responsibility`, a famous quote from the Spider-Man comics that fits this topic of discussion all too conveniently. If only this were always the case, and users of all groups all over the world could exercise their rights fairly when needed; but then again, there are terms and conditions that have to be clicked on and accepted.

Consent to the world of surveillance

Predictive analytics was not really broadly known until Duhigg (2012) wrote a piece in The New York Times telling the story of a father who went to a Target store determined to complain about the fact that his daughter had received coupons for maternity clothing and infant items. It was then that big data and predictive analytics suddenly became exceptionally concrete for the public. Why? Because it just so happened that Target knew the girl was pregnant before anyone else, including her father, did. Target had collected information about the daughter’s purchase history for a few telling items, which, when analyzed together, delivered a surprise: a `pregnancy prediction` score. Individuals came to realize that their personal data is a product that is not just being used to tailor attractive messaging exactly for them, but is also being sold and exchanged among data markets and information brokers (Forbes, 2012). The principal assumption behind consent is that “data subjects make conscious, rational and autonomous choices about the processing of their personal data” (Schermer, Custers, and van der Hof 2014, 171, cited in Mai, 2016: 3); however, in today’s world it is almost impossible to run daily life activities without accepting the terms of digital data and sharing personal information in at least some way. On top of that, the language used to define the terms in most cases consists of legal wording that is lengthy and far beyond the level of comprehension of the majority of citizens, unless they have studied law or are licensed lawyers. Today, data is an “object whose production interests those who exercise power” (Ruppert et al., 2017: 3). It is contentious in two ways: it is strategically operated within generic episodes of contentious politics (e.g. counter-mapping and critical cartography), or it is put on the line in crucial contemporary struggles (e.g. against algorithmic discrimination) (Beraldo & Milan, 2019).
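Duhigg’s reporting did not disclose Target’s actual model, but the logic of such a score can be illustrated with a toy classifier. Everything below, the products, labels and numbers, is invented for illustration; the point is only that a handful of individually innocuous purchase signals, analyzed together, can yield a sensitive prediction:

```python
# A toy, invented illustration of a purchase-based prediction score,
# loosely inspired by the Target story; not Target's actual method.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns (1 = bought, 0 = not): unscented lotion, large bags of
# cotton balls, mineral supplements. All hypothetical signals.
purchases = np.array([
    [1, 1, 1],
    [1, 0, 1],
    [0, 0, 0],
    [0, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
])

# Label: whether the shopper later signed up for a baby registry.
registry = np.array([1, 1, 0, 0, 1, 0])

clf = LogisticRegression().fit(purchases, registry)

# The "score": the probability the model assigns to a new basket.
new_basket = np.array([[1, 1, 0]])
print(clf.predict_proba(new_basket)[0, 1])
```

The unsettling part is not any single column, nobody considers buying cotton balls private, but the sensitive inference that the combination licenses.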

Edward Snowden, the former NSA contractor who in 2013 “leaked thousands of classified National Security Agency documents, sparking a global conversation about citizens’ rights to privacy in the digital age” (NPR, 2020), was one of the first to draw noticeably wider audience attention to the matter of data politics and its unspoken terms and conditions for the ordinary web user today. His story even became the subject of the Oscar-winning documentary “Citizenfour”. Snowden claimed that his intention in leaking the documents was not to gain any personal reward or recognition but to be a good citizen and make people aware of what is really happening behind closed doors. One of his testimonies was that “even if you’re not doing anything wrong, you’re being watched and recorded” (PBS, 2015). His seemingly pure intentions to help people have left him “under U.S. Justice Department charges of espionage and theft of government property” (NPR, 2020) and forced to flee and gain temporary asylum in Russia, where he has lived ever since. That is not exactly a motivating happy ending for other whistleblowers who may wish to one day release the truth to the public for the common good, but that looks like the world we live in today. “The impacts of big data are very different depending on one’s socio-economic position. The work of Gilliom (2001) and more recently of scholars such as Eubanks (2014) and Masiero (2016) shows that the greatest burden of data surveillance (surveillance using digital methods) has always been borne by the poor.” (Taylor, 2017) So perhaps, if data justice were indeed an actual and objective part of each society at the governmental level, it would be much easier to take the discussion of big data to the next level and address its dangers and benefits. But to progress, basic, fundamental equality first needs to be reached for all groups, and rules or laws on the basic framework need to be agreed and set in stone, before trying to resolve other, more complex issues. First things first.

There are, however, also clear strengths and benefits of technological progress, especially if we look at humanitarianism in the Global South and the world of social change, which is in constant need of better, more efficient and sustainable solutions that can develop communication and trigger societal improvements for all groups. I would argue that, with both benefits and weaknesses in play, it is important for users to have equal rights and access to the information available, and that global tech companies have to ensure the same access to basic and fundamental sources of data. Perhaps having laws that require this would help trigger such social change and increase the chances of the `poor` getting the same access as the more fortunate groups.

Data Revolution

In August 2014, UN Secretary-General Ban Ki-moon asked an Independent Expert Advisory Group to make concrete recommendations on bringing about a data revolution in sustainable development. The prioritization of data is clear: the revolution “would draw on existing and new sources of data to fully integrate statistics into decision making, promote open access to, and use of, data”. The data revolution within humanitarianism needs to be seen within the purview of a sector subject to dynamic political, economic and cultural pressures. (UN Data Revolution Group, 2014)

The report highlighted two big global challenges for the current state of data:

  • The challenge of invisibility (gaps in what we know from data, and when we find out)
  • The challenge of inequality (gaps between those with and without information, and what they need to know to make their own decisions) (UN Data Revolution Group, 2014)

“If improvements in data are to lead to changes in peoples’ lives, then data must be accessible and able to be used. Governments need data for planning and monitoring what they do, and people need data to hold those governments, and other institutions, to account.”

(UN Data Revolution Group, 2014)

The UN has raised important items that need to be widely known and addressed at the highest level to ensure a chance of sustainable social change for all groups worldwide. The key items raised are (UN Data Revolution Group, 2014):

  • “creating norms, incentives and regulations to encourage and require the owners of data to make it publicly available, in ways that are useful to all potential users”
  • “increasing data literacy so that more people are able to use and interpret data”
  • “innovations in how, when and what data is collected and shared so that it is up to date, disaggregated and relevant to the concerns of people and policy makers”

According to the UN (2014), campaigners for open and transparent data have been successful in persuading many governments and nonprofits to open up their data and make it available in ways that data sets can easily be used. At the same time, the promise of technology is that “more accurate situational data can be gathered and conveyed about humanitarian needs and responses. Here the hope is that drones, sensors, intelligent warning systems and similar technologies can address the information deficit often found in conflict or disaster-affected areas and produce high resolution ‘actionable information’.” (Read et al., 2016: 7) It is also claimed that data can be gathered and delivered at greater speed, with an impact on the timeliness of humanitarian responses. This value shows, for example, in sudden-onset disaster “contexts in which static technologies (e.g. landlines) may be disabled.” (Read et al., 2016: 8)

Last but not least, technological advances in the humanitarian sector may bring opportunities for the “transformation of power relations between donors and ‘recipients’ in favour of ‘leaner’ and more horizontal networks”. All in all, in the words of one self-proclaimed ‘digital humanitarian’, “anyone can be a digital humanitarian, absolutely no experience is necessary; all you need is a big heart and access to the Internet”. (Read et al., 2016: 8)


Conclusion:

It can be argued that discrimination by such data systems, and by the way information is captured, is neither new nor extraordinary today, but the benefits and weaknesses of technological progress do raise concerns and doubts as to whether the ultimate Big Tech development is progress that will benefit all groups of users, whether based in the North or in the South. The challenges of development are at stake, and if we aim to live in just societies, where each country has laws and introduces an appropriate set of regulations that are respected by everyone, then the subject of data justice needs to be brought up more extensively and at a higher level, as “everyone has the right to be treated fairly by public (and private) authorities of all kinds”; if not, then these are not exactly the just societies we may think we live in. (Taylor, 2017)

At the same time, it is quite clear that even if qualified authorities, or newly created special bodies, put the effort and the right measures in place to control the information we provide and what is ultimately gathered by Big Tech today, it seems nearly impossible to regulate the knowledge that is ultimately derived from it.


Reflections:

This was the first assignment of its kind in my life: working in a group to create an engaging and informative blog while addressing challenging topics in the world of datafication. Reflecting on the topics I discussed and the discussions we had with other group members, I can say with certainty that I was challenged, in a good way, to look beyond my own views, opinions and experiences, and to open up to new visions, questions and ways of looking at things from my counterparts’ perspectives. Each member was more experienced in a certain area, and this helped me dive into unexplored subject matters that provided new knowledge and theories that can be applied both in study and in the real world. I am pleased to be able to connect the fundamentals of the previous post with the current academic post and to focus on the new subject matter in more detail while addressing difficult questions.

Diving into the depths of data privacy, datafication and power relations in the context of communication for development has grown my expertise and allowed me to acquire a new skill set that I can deploy in my daily job: working with Compliance Managers from all over the world, improving communication bridges externally and internally within the organization, finding middle ground on difficult subjects while still looking beyond differences to achieve a common goal, and aiming to create more sustainable communication with all parties.


References:

Mai, J.-E. (2016, April 13). Big data privacy: The datafication of personal information. http://jenserikmai.info/Papers/2016_BigDataPrivacy.pdf

Hult (2017). https://www.hult.edu/blog/psychographics-big-data-watching/

Beraldo, D. & Milan, S. (2019, November 7). From data politics to the contentious politics of data. https://journals.sagepub.com/doi/full/10.1177/2053951719885967

Taylor, L. (2017, November 1). What is data justice? The case for connecting digital rights and freedoms globally. https://journals.sagepub.com/doi/full/10.1177/2053951717736335

Forbes (2012, February 16). How Target figured out a teen girl was pregnant before her father did. https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/

NPR (2020, March 20). Edward Snowden: Why does online privacy matter? https://www.npr.org/2020/03/20/818341273/edward-snowden-why-does-online-privacy-matter

PBS (2015, February). Watching Snowden’s pivotal moments in ‘Citizenfour’. https://www.pbs.org/newshour/show/watching-snowdens-pivotal-moments-citizenfour

Read, R., Taithe, B. & Mac Ginty, R. (2016, February 29). Data hubris? Humanitarian information systems and the mirage of technology. Third World Quarterly. https://www.tandfonline.com/doi/full/10.1080/01436597.2015.1136208

UN Data Revolution Group (2014). A World That Counts: Mobilising the data revolution for sustainable development. https://www.undatarevolution.org/report/

Image credit:

https://pixabay.com/illustrations/entrepreneur-digital-marketing-7157631/

https://pixabay.com/photos/terms-and-conditions-of-use-7199836/
