﻿{"id":444,"date":"2024-12-16T10:26:30","date_gmt":"2024-12-16T10:26:30","guid":{"rendered":"https:\/\/wpmu.mau.se\/msm24group5\/?p=444"},"modified":"2024-12-22T23:03:27","modified_gmt":"2024-12-22T23:03:27","slug":"bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality","status":"publish","type":"post","link":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/","title":{"rendered":"Bias by Design: How AI Mirrors and Magnifies Gender Inequality"},"content":{"rendered":"<p><span style=\"font-weight: 400\">Following my last article covering the Women 4 Artificial Intelligence Conference, I wanted to go into more detail about the ways in which gender bias in tech and AI actually manifests in real-world situations. The conference was useful for understanding how the RAM tool (Readiness Assessment Methodology) helps a country assess its current state and its level of preparedness to sustain the legal, political, and social change that favors ethical AI usage. But what exactly do we need to prepare for?<\/span><\/p>\n<p>&nbsp;<\/p>\n<h6>When I first learned about gender bias in tech<\/h6>\n<p><span style=\"font-weight: 400\">The first time I learned about gender bias in machine learning processes was about 7 years ago. I was conversing with my Bangladeshi friend, Imran, about the differences in our languages, when he told me a \u201cfun fact\u201d: unlike Romanian, which is a Latin-based language, Bengali does not have grammatical gender. The fact in itself did not surprise me, as I knew many other languages simply do not use gender to structure and differentiate their syntax. However, he then showed me a quick exercise using Google Translate: he sent me a screenshot of a Bengali text with everyday phrases such as \u201cThey are an architect. They like to cook. They take care of the children. They are the boss. 
They are powerful.\u201d and then translated them into English. To my surprise, even though the phrases did not assign any gender to these attributes and actions, when translated into English they became \u201cHe is an architect. She likes to cook. She takes care of the children. He is the boss. He is powerful.\u201d<\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><span style=\"font-weight: 400\">I remember being quite upset at this finding &#8211; how come Google translated it this way? Why would it do such a thing? It\u2019s technology, it <\/span><i><span style=\"font-weight: 400\">should be neutral. <\/span><\/i><span style=\"font-weight: 400\">I was young and still sensitive to such issues, without a deep understanding of how these technologies work. Now I understand them better, but that still does not make it acceptable.\u00a0<\/span><\/p>\n<p>&nbsp;<\/p>\n<h6>Earlier studies on gender and tech<\/h6>\n<p><span style=\"font-weight: 400\">Many studies examine the relationship between humans and technology. One of the earliest was Orlikowski&#8217;s work in 1992, in which she introduced the concept of the \u201cduality of technology\u201d: technology is \u201cphysically constructed by actors working in a given social context, and actors socially construct technology through the different meanings they attach to it and the various features they emphasize and use\u201d (p. 406). <\/span><\/p>\n<p><span style=\"font-weight: 400\">Similarly to Orlikowski, Fountain positions technology as intersecting with human agency, but she places this in an institutional and organizational context: \u201cthe effects of the Internet on government will be played out in unexpected ways, profoundly influenced by organizational, political and institutional logics\u201d (p. 12). Her framework highlights how institutions both influence and are influenced by ICTs and predominant organizational forms (p. 89). 
<\/span><\/p>\n<p><span style=\"font-weight: 400\">She distinguishes two forms of technology: objective (the hardware, Internet, software, etc.) and enacted (\u201cthe perception of users as well as designs and uses in particular settings\u201d p. 10). The author concludes that the outcomes of technology enactment are multiple and unpredictable, as they result from a complex interplay of logics, relations, and institutions. AI is an \u201cobjective technology\u201d, but once it is put to use, it influences and is influenced by human agency and different forms of organization and arrangement, which leads to unexpected consequences. As long as gender biases are implicit in our society and culture, they will naturally become the contextual factors that make AI reproduce the same biases.\u00a0<\/span><\/p>\n<p>&nbsp;<\/p>\n<h6>Gender Bias in Language<\/h6>\n<p><span style=\"font-weight: 400\">Bias is often shown through language. The AI service \u201cGenderify\u201d, launched in 2020, used a person\u2019s name, username, and email address to infer that person\u2019s gender (Vincent, 2020). Users quickly noticed that names prefixed with \u201cDr\u201d were consistently flagged as male; for example, \u201cDr Meghan Smith\u201d was assigned a 75% likelihood of being male. Hundt et al. (2022) found that robots trained on large datasets and standardized models strongly reinforced stereotypical and biased behavior with regard to gender and race. <\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><span style=\"font-weight: 400\">In 2018, researchers at the Federal University of Rio Grande do Sul in Brazil conducted a study to test for gender bias in AI, specifically in machine translation (Prates et al. 2020<\/span><span style=\"font-weight: 400\">). 
Much like the test Imran ran with me, they translated sentence constructions from twelve gender-neutral languages (Malay, Hungarian, Swahili, Bengali, Estonian, Finnish, Japanese, Turkish, Yoruba, Basque, Armenian, and Chinese) into English. They found that the machine translation showed a strong bias towards male attributes, especially for fields such as STEM (which are already viewed as more male-oriented than female-oriented). At the time of the study, women held 39.8% of \u201cmanagement\u201d positions, yet sentences related to management roles were translated with a female pronoun only 11% of the time &#8211; the translations did not even reflect real-world statistics. When they ran the same test with adjectives, they found that words such as \u201cshy\u201d, \u201cattractive\u201d, \u201chappy\u201d, \u201ckind\u201d and \u201cashamed\u201d tended to be associated with female pronouns, while \u201ccruel\u201d, \u201carrogant\u201d, and \u201cguilty\u201d were associated with male pronouns. 
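At its core, such an audit is a counting exercise: translate gender-neutral template sentences, then tally which gendered pronoun the system chose for each attribute or occupation. A minimal sketch in Python of that tallying step (the \u201cobserved\u201d translations below are canned, illustrative stand-ins, not real Google Translate output):

```python
from collections import Counter

def classify_pronoun(english_sentence):
    """Classify a translated sentence by its leading pronoun."""
    first_word = english_sentence.strip().split()[0].lower()
    return {"he": "male", "she": "female", "they": "neutral"}.get(first_word, "other")

def audit(translations):
    """Return the share of male/female/neutral pronouns across translations."""
    counts = Counter(classify_pronoun(t) for t in translations)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

# Canned outputs standing in for machine translations of
# gender-neutral template sentences (illustrative only):
observed = [
    "He is an architect.",
    "He is the boss.",
    "She takes care of the children.",
    "He is powerful.",
]
print(audit(observed))  # {'male': 0.75, 'female': 0.25}
```

Comparing the resulting shares against real-world statistics (as Prates et al. did with labor-force data) is what turns the tally into evidence of bias.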
<\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><span style=\"font-weight: 400\">Soon after their research was published, Google released a statement in which it admitted that its translation system shows gender bias, and it later released a new feature that offers both masculine and feminine translations.\u00a0<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone size-medium wp-image-476\" src=\"http:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/Screenshot-2024-12-22-at-22.02.47-300x138.png\" alt=\"\" width=\"300\" height=\"138\" srcset=\"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/Screenshot-2024-12-22-at-22.02.47-300x138.png 300w, https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/Screenshot-2024-12-22-at-22.02.47-458x211.png 458w, https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/Screenshot-2024-12-22-at-22.02.47.png 704w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p><span style=\"font-weight: 400\">In 2021, Google also released the \u201cTranslated Wikipedia Biographies\u201d dataset, which measures gender bias in machine translation and which, they state, has helped reduce these errors by 67%.<\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><\/p>\n<p>&nbsp;<\/p>\n<h6>Gender bias in imagery<\/h6>\n<p><span style=\"font-weight: 400\">Gender bias in visual imagery is also nothing new, and it has been of interest to me for some time. \u201cBias in the visual representation of women and men has been endemic throughout the history of media, journalism, and advertising\u201d, write Schwemmer et al. (2020). Gendered representation has been present in many forms and contexts of imagery &#8211; science education resources (Kerkhoven et al. 2016), commercial films (Jang et al. 
2019), and Iranian school textbooks (Amini and Birjandi 2012).\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">The website <\/span><a href=\"http:\/\/gendershades.org\/\"><span style=\"font-weight: 400\">Gender Shades <\/span><\/a><span style=\"font-weight: 400\">was created after a pivotal 2018 study by Joy Buolamwini of MIT and Timnit Gebru of Microsoft Research. They began by highlighting the use of facial recognition tools in public administration and criminal justice systems. They raised the concern that these technologies are not neutral and <\/span><i><span style=\"font-weight: 400\">can<\/span><\/i><span style=\"font-weight: 400\"> threaten individuals\u2019 civil liberties (Buolamwini and Gebru 2018:2), including \u201ceconomic loss\u201d, \u201closs of opportunity\u201d in hiring, education, and housing, and \u201csocial stigmatisation\u201d through reinforced stereotypes that, in the end, affect both the individual and the collective. <\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><span style=\"font-weight: 400\"><br \/>\n<\/span><span style=\"font-weight: 400\">The study revealed intersectional errors in the software: it was less accurate at identifying women than men, and less accurate for darker-skinned people than for light-skinned people (with darker-skinned women the most likely to be misclassified).<\/span><\/p>\n<p>&nbsp;<\/p>\n<h6>To conclude<\/h6>\n<p><span style=\"font-weight: 400\">AI systems not only reflect these biases; they can also \u201cget caught in negative feedback loops\u201d (Busuioc, 2021), in which biased outputs become the basis for future predictions, especially if the initial dataset was biased. Bias mitigation involves \u201cproactively addressing factors which contribute to bias\u201d (Lee et al. 2019), and is usually associated with the concept of \u201cfairness\u201d. 
Through mitigation techniques such as rebalancing data, regularisation, and adversarial learning, AI bias can be reduced, but strong political and institutional structures are still needed to truly support this change.\u00a0\u00a0<\/span><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<h6>References<\/h6>\n<p><i><span style=\"font-weight: 400\">Amini M, Birjandi P (2012) Gender bias in the Iranian High School EFL Textbooks. Engl Lang Teach 5(2):134\u2013147<\/span><\/i><\/p>\n<p><span style=\"font-weight: 400\">Buolamwini J, Gebru T (2018) Gender shades: Intersectional accuracy disparities in commercial gender classification. In: Conference on Fairness, Accountability and Transparency (pp. 77\u201391). PMLR<\/span><\/p>\n<p><span style=\"font-weight: 400\">Busuioc M (2021) Accountable artificial intelligence: Holding algorithms to account. Public Adm Rev 81(5):825\u2013836<\/span><\/p>\n<p><i><span style=\"font-weight: 400\">Fountain JE (2004) Building the virtual state: Information technology and institutional change. Brookings Institution Press<\/span><\/i><\/p>\n<p><i><span style=\"font-weight: 400\">Hundt A, Agnew W, Zeng V, Kacianka S, Gombolay M (2022) Robots Enact Malignant Stereotypes. In: 2022 ACM Conference on Fairness, Accountability, and Transparency (pp. 743\u2013756)<\/span><\/i><\/p>\n<p><i><span style=\"font-weight: 400\">Kerkhoven AH, Russo P, Land-Zandstra AM, Saxena A, Rodenburg FJ (2016) Gender stereotypes in science education resources: A visual content analysis. PLoS ONE 11(11):e0165037. <\/span><\/i><a href=\"https:\/\/doi.org\/10.1371\/journal.pone.0165037\"><i><span style=\"font-weight: 400\">https:\/\/doi.org\/10.1371\/journal.pone.0165037<\/span><\/i><\/a><\/p>\n<p><i><span style=\"font-weight: 400\">Lee NT, Resnick P, Barton G (2019) Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms. 
Brookings Institution, Washington, DC, USA. <\/span><\/i><a href=\"https:\/\/www.brookings.edu\/research\/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms\/\"><i><span style=\"font-weight: 400\">https:\/\/www.brookings.edu\/research\/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms\/<\/span><\/i><\/a><i><span style=\"font-weight: 400\">.<\/span><\/i><\/p>\n<p><i><span style=\"font-weight: 400\">MIT Media Lab (2018) Gender Shades Project: Why This Matters. <\/span><\/i><a href=\"https:\/\/www.media.mit.edu\/projects\/gender-shades\/why-this-matters\/\"><i><span style=\"font-weight: 400\">https:\/\/www.media.mit.edu\/projects\/gender-shades\/why-this-matters\/<\/span><\/i><\/a><i><span style=\"font-weight: 400\">.<\/span><\/i><\/p>\n<p>Orlikowski WJ (1992) The duality of technology: Rethinking the concept of technology in organizations. Organ Sci 3(3):398\u2013427<\/p>\n<p><span style=\"font-weight: 400\">Prates MO, Avelar PH, Lamb LC (2020) Assessing gender bias in machine translation: A case study with Google Translate. Neural Comput Appl 32:6363\u20136381\u00a0<\/span><\/p>\n<p><i><span style=\"font-weight: 400\">Schwemmer C, Knight C, Bello-Pardo ED, Oklobdzija S, Schoonvelde M, Lockhart JW (2020) Diagnosing gender bias in image recognition systems. Socius 6:1\u201317<\/span><\/i><\/p>\n<p><i><span style=\"font-weight: 400\">Stella R (2021) A Dataset for Studying Gender Bias in Translation. Google AI Blog. 
<\/span><\/i><a href=\"https:\/\/ai.googleblog.com\/2021\/06\/a-dataset-for-studying-gender-bias-in.html\"><i><span style=\"font-weight: 400\">https:\/\/ai.googleblog.com\/2021\/06\/a-dataset-for-studying-gender-bias-in.html<\/span><\/i><\/a><i><span style=\"font-weight: 400\">.<\/span><\/i><\/p>\n<p><i><span style=\"font-weight: 400\">Vincent J (2020) Service that uses AI to identify gender based on names looks incredibly biased \/ Meghan Smith is a woman, but Dr. Meghan Smith is a man, says Genderify. The Verge. <\/span><\/i><a href=\"https:\/\/www.theverge.com\/2020\/7\/29\/21346310\/ai-service-gender-verification-identification-genderify\"><i><span style=\"font-weight: 400\">https:\/\/www.theverge.com\/2020\/7\/29\/21346310\/ai-service-gender-verification-identification-genderify<\/span><\/i><\/a><i><span style=\"font-weight: 400\">.\u00a0<\/span><\/i><\/p>\n<p><i><span style=\"font-weight: 400\">Jang JY, Lee S, Lee B (2019) Quantification of gender representation bias in commercial films based on image analysis. Proceed ACM on Human-Comput Interact 3:1\u201329<\/span><\/i><\/p>\n<p>Feature image: <em>Alan Warburton, CC BY 4.0 , via Wikimedia Commons<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Following my last article covering the Women 4 Artificial Intelligence Conference, I wanted to go more in detail into ways in which gender bias in tech and AI actually manifests in real-world situations. 
The conference was useful in understanding how using the RAM tool (Readiness Assessment Methodologies) helps you discover the current state of the [&hellip;]<\/p>\n","protected":false},"author":742,"featured_media":449,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[49],"tags":[],"class_list":["post-444","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-academic-blog"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.1.1 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\r\n<title>Bias by Design: How AI Mirrors and Magnifies Gender Inequality - Artificial Inequality<\/title>\r\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\r\n<link rel=\"canonical\" href=\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/\" \/>\r\n<meta property=\"og:locale\" content=\"en_US\" \/>\r\n<meta property=\"og:type\" content=\"article\" \/>\r\n<meta property=\"og:title\" content=\"Bias by Design: How AI Mirrors and Magnifies Gender Inequality - Artificial Inequality\" \/>\r\n<meta property=\"og:description\" content=\"Following my last article covering the Women 4 Artificial Intelligence Conference, I wanted to go more in detail into ways in which gender bias in tech and AI actually manifests in real-world situations. 
The conference was useful in understanding how using the RAM tool (Readiness Assessment Methodologies) helps you discover the current state of the [&hellip;]\" \/>\r\n<meta property=\"og:url\" content=\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/\" \/>\r\n<meta property=\"og:site_name\" content=\"Artificial Inequality\" \/>\r\n<meta property=\"article:published_time\" content=\"2024-12-16T10:26:30+00:00\" \/>\r\n<meta property=\"article:modified_time\" content=\"2024-12-22T23:03:27+00:00\" \/>\r\n<meta property=\"og:image\" content=\"http:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg\" \/>\r\n\t<meta property=\"og:image:width\" content=\"2048\" \/>\r\n\t<meta property=\"og:image:height\" content=\"1152\" \/>\r\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\r\n<meta name=\"author\" content=\"Anisa\" \/>\r\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\r\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Anisa\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"1 minute\" \/>\r\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/\"},\"author\":{\"name\":\"Anisa\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/person\/ad4486ea45583d7a69423f77f3f76deb\"},\"headline\":\"Bias by Design: How AI Mirrors and Magnifies Gender Inequality\",\"datePublished\":\"2024-12-16T10:26:30+00:00\",\"dateModified\":\"2024-12-22T23:03:27+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/\"},\"wordCount\":1531,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#organization\"},\"image\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg\",\"articleSection\":[\"Academic blog\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/\",\"url\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/\",\"name\":\"Bias by Design: How AI Mirrors and Magnifies Gender Inequality - Artificial 
Inequality\",\"isPartOf\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg\",\"datePublished\":\"2024-12-16T10:26:30+00:00\",\"dateModified\":\"2024-12-22T23:03:27+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#primaryimage\",\"url\":\"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg\",\"contentUrl\":\"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg\",\"width\":2048,\"height\":1152,\"caption\":\"Alan Warburton, CC BY 4.0 , via Wikimedia Commons\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/wpmu.mau.se\/msm24group5\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Bias by Design: How AI Mirrors and Magnifies Gender 
Inequality\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#website\",\"url\":\"https:\/\/wpmu.mau.se\/msm24group5\/\",\"name\":\"Artificial Inequality\",\"description\":\"The Hidden Impact of Global Tech Economy\",\"publisher\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/wpmu.mau.se\/msm24group5\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#organization\",\"name\":\"Artificial Inequality\",\"url\":\"https:\/\/wpmu.mau.se\/msm24group5\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/10\/artificial-logo-1.jpg\",\"contentUrl\":\"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/10\/artificial-logo-1.jpg\",\"width\":400,\"height\":400,\"caption\":\"Artificial Inequality\"},\"image\":{\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/logo\/image\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/person\/ad4486ea45583d7a69423f77f3f76deb\",\"name\":\"Anisa\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/2d3a296ac0774bf9b26fb5c69cab11661b3ecc2f7e3e537e145122c32c176e4e?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/2d3a296ac0774bf9b26fb5c69cab11661b3ecc2f7e3e537e145122c32c176e4e?s=96&d=mm&r=g\",\"caption\":\"Anisa\"},\"url\":\"https:\/\/wpmu.mau.se\/msm24group5\/author\/ap1096\/\"}]}<\/script>\r\n<!-- \/ Yoast SEO 
plugin. -->","yoast_head_json":{"title":"Bias by Design: How AI Mirrors and Magnifies Gender Inequality - Artificial Inequality","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/","og_locale":"en_US","og_type":"article","og_title":"Bias by Design: How AI Mirrors and Magnifies Gender Inequality - Artificial Inequality","og_description":"Following my last article covering the Women 4 Artificial Intelligence Conference, I wanted to go more in detail into ways in which gender bias in tech and AI actually manifests in real-world situations. The conference was useful in understanding how using the RAM tool (Readiness Assessment Methodologies) helps you discover the current state of the [&hellip;]","og_url":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/","og_site_name":"Artificial Inequality","article_published_time":"2024-12-16T10:26:30+00:00","article_modified_time":"2024-12-22T23:03:27+00:00","og_image":[{"width":2048,"height":1152,"url":"http:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg","type":"image\/jpeg"}],"author":"Anisa","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Anisa","Est. 
reading time":"1 minute"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#article","isPartOf":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/"},"author":{"name":"Anisa","@id":"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/person\/ad4486ea45583d7a69423f77f3f76deb"},"headline":"Bias by Design: How AI Mirrors and Magnifies Gender Inequality","datePublished":"2024-12-16T10:26:30+00:00","dateModified":"2024-12-22T23:03:27+00:00","mainEntityOfPage":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/"},"wordCount":1531,"commentCount":0,"publisher":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/#organization"},"image":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#primaryimage"},"thumbnailUrl":"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg","articleSection":["Academic blog"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/","url":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/","name":"Bias by Design: How AI Mirrors and Magnifies Gender Inequality - Artificial 
Inequality","isPartOf":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/#website"},"primaryImageOfPage":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#primaryimage"},"image":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#primaryimage"},"thumbnailUrl":"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg","datePublished":"2024-12-16T10:26:30+00:00","dateModified":"2024-12-22T23:03:27+00:00","breadcrumb":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#primaryimage","url":"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg","contentUrl":"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/12\/2048px-Plant_by_Alan_Warburton.jpg","width":2048,"height":1152,"caption":"Alan Warburton, CC BY 4.0 , via Wikimedia Commons"},{"@type":"BreadcrumbList","@id":"https:\/\/wpmu.mau.se\/msm24group5\/2024\/12\/16\/bias-by-design-how-ai-mirrors-and-magnifies-gender-inequality\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/wpmu.mau.se\/msm24group5\/"},{"@type":"ListItem","position":2,"name":"Bias by Design: How AI Mirrors and Magnifies Gender Inequality"}]},{"@type":"WebSite","@id":"https:\/\/wpmu.mau.se\/msm24group5\/#website","url":"https:\/\/wpmu.mau.se\/msm24group5\/","name":"Artificial 
Inequality","description":"The Hidden Impact of Global Tech Economy","publisher":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/wpmu.mau.se\/msm24group5\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/wpmu.mau.se\/msm24group5\/#organization","name":"Artificial Inequality","url":"https:\/\/wpmu.mau.se\/msm24group5\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/logo\/image\/","url":"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/10\/artificial-logo-1.jpg","contentUrl":"https:\/\/wpmu.mau.se\/msm24group5\/wp-content\/uploads\/sites\/100\/2024\/10\/artificial-logo-1.jpg","width":400,"height":400,"caption":"Artificial Inequality"},"image":{"@id":"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/person\/ad4486ea45583d7a69423f77f3f76deb","name":"Anisa","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/wpmu.mau.se\/msm24group5\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/2d3a296ac0774bf9b26fb5c69cab11661b3ecc2f7e3e537e145122c32c176e4e?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/2d3a296ac0774bf9b26fb5c69cab11661b3ecc2f7e3e537e145122c32c176e4e?s=96&d=mm&r=g","caption":"Anisa"},"url":"https:\/\/wpmu.mau.se\/msm24group5\/author\/ap1096\/"}]}},"_links":{"self":[{"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/posts\/444","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https
:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/users\/742"}],"replies":[{"embeddable":true,"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/comments?post=444"}],"version-history":[{"count":6,"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/posts\/444\/revisions"}],"predecessor-version":[{"id":479,"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/posts\/444\/revisions\/479"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/media\/449"}],"wp:attachment":[{"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/media?parent=444"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/categories?post=444"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/wpmu.mau.se\/msm24group5\/wp-json\/wp\/v2\/tags?post=444"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}