﻿{"id":354,"date":"2022-10-17T10:23:05","date_gmt":"2022-10-17T10:23:05","guid":{"rendered":"https:\/\/wpmu.mau.se\/nmict22group10\/?p=354"},"modified":"2024-04-19T10:24:34","modified_gmt":"2024-04-19T10:24:34","slug":"algorithm-bias-computers-can-be-racist-too-2","status":"publish","type":"post","link":"https:\/\/wpmu.mau.se\/nmict22group10\/2022\/10\/17\/algorithm-bias-computers-can-be-racist-too-2\/","title":{"rendered":"Algorithm Bias &#8211; Computers Can Be Racist Too"},"content":{"rendered":"<div id=\"fb-root\"><\/div>\r\n<p><span style=\"font-weight: 400\">We are surrounded by algorithms &#8211; they decide what we see in our Facebook Feeds, screen our job applications, calculate probabilities and drive facial recognition software. While they are faster and sometimes more accurate than us at processing large amounts of data, there is one area where they can be just as flawed as humans &#8211; they are biased. <\/span><\/p>\n<p><span style=\"font-weight: 400\">While it sounds strange to assign an inanimate object with attributes usually reserved for people, the core of AI is, ultimately, human. It is humans that design it, humans that feed it information to train it, and humans that decide what the algorithm is going to be used for. And in every step, they are inadvertently shaping the bias that one day might cost you a job interview or an extra hour at the airport due to an AI-powered, racially-profiled secondary search. <\/span><\/p>\n<h3><b>In what ways are algorithms biased?<\/b><\/h3>\n<p><span style=\"font-weight: 400\">In 2014 <\/span><a href=\"https:\/\/www.reuters.com\/article\/us-amazon-com-jobs-automation-insight\/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G\"><span style=\"font-weight: 400\">Amazon<\/span><\/a><span style=\"font-weight: 400\"> started working on a computer program that would screen job applications and suggest the top candidates. 
The program was trained to vet applicants by observing patterns in resumes submitted to the company over a ten-year period. Since technology is a male-dominated industry, the data contained a disproportionate number of resumes from men. From that data, the program taught itself that male candidates were preferable: it penalized resumes containing the word "women's" and downgraded candidates who had attended all-women's colleges.

Tools like Google Translate learn languages by crawling the web and analyzing billions of words, word combinations and sentences to determine how a specific word should be translated. I recently read about sexism in some of its translations, so I decided to test it myself. My native language, Croatian, has gendered nouns, so professions come in two versions of the same word – a male and a female form (for example, a male doctor is *liječnik* while a female doctor is *liječnica*). I entered various professions into Google Translate, and its bias aligned with common stereotypes: doctor, lawyer and engineer came out in the male form, while kindergarten teacher, secretary and cleaner came out in the female form.

While some cases of algorithm bias are more annoying than dangerous, it can also have life-altering consequences.
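The Amazon and Google Translate cases share one mechanism: a model trained on a skewed corpus absorbs the skew as if it were signal. The sketch below (entirely hypothetical data, not Amazon's actual system) shows how a word like "women's" can pick up a negative weight purely from historical hiring outcomes, even though gender is never an explicit feature:

```python
# Toy illustration of bias learned from skewed training data.
# All resumes and outcomes below are made up.
from collections import Counter

# Hypothetical history: (resume tokens, was the candidate hired?)
# Past hires skew male, so "women's" co-occurs mostly with rejections.
history = [
    ({"python", "engineer"}, True),
    ({"java", "engineer"}, True),
    ({"python", "women's", "chess", "club"}, False),
    ({"java", "women's", "college"}, False),
    ({"python", "sql"}, True),
    ({"women's", "soccer", "sql"}, False),
]

hired, rejected = Counter(), Counter()
for tokens, was_hired in history:
    (hired if was_hired else rejected).update(tokens)

def word_score(word):
    """Score in (-1, 1): >0 favors hiring, <0 penalizes (add-one smoothed)."""
    h, r = hired[word] + 1, rejected[word] + 1
    return (h - r) / (h + r)

def resume_score(tokens):
    return sum(word_score(w) for w in tokens)

# "women's" gets a negative weight purely from the skewed history,
# so adding it to an otherwise identical resume lowers the score.
print(word_score("women's"))                       # negative
print(resume_score({"python", "engineer"}))
print(resume_score({"python", "engineer", "women's"}))
```

Nothing in the code mentions gender; the penalty emerges from the data alone, which is why such bias survives even well-intentioned feature engineering.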
In 2016, ProPublica released a [report](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing) on bias in software used in US courtrooms to rate a defendant's risk of committing future crimes. Their research found that the software was particularly likely to falsely flag black defendants as future criminals, mislabeling them this way at almost twice the rate of white defendants, while white defendants were mislabeled as low risk more often than black defendants. Defendants whom the algorithm rated as higher risk were more likely to receive longer prison sentences, regardless of circumstances or facts. This particular program is biased because it was trained on decades' worth of police records that are riddled with systemic racism.

### The need for algorithmic transparency and accountability

One of the major issues surrounding algorithm bias is the lack of transparency. We know that the programs are making biased decisions, but we don't know **how** they reach those conclusions. Worse yet, sometimes we are not even aware that algorithms are being used to make decisions about us.

Algorithms are often developed by private companies that do not disclose their proprietary secrets – even when their methods and results are questioned. The aforementioned risk-rating system used in courtrooms was developed by a private company, Northpointe, and sold to the US judicial system. Neither the defendants whose lives the results alter nor the judges making rulings based on them know how the system works, which types of data it uses, or how it arrives at its predictions.
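This opacity matters because the audit itself is simple: given access to predictions and outcomes, the kind of disparity ProPublica measured reduces to comparing false-positive rates across groups. A minimal sketch with made-up numbers (not ProPublica's actual data):

```python
# Sketch of the audit that transparency would allow: comparing
# false-positive rates across groups. All records are fabricated.
def false_positive_rate(records):
    """Share of people who did NOT reoffend but were flagged high risk."""
    no_reoffense = [r for r in records if not r["reoffended"]]
    flagged = [r for r in no_reoffense if r["flagged_high_risk"]]
    return len(flagged) / len(no_reoffense)

# Hypothetical audit data: group, risk flag, actual outcome.
records = [
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": False, "reoffended": False},
    {"group": "A", "flagged_high_risk": True,  "reoffended": True},
    {"group": "B", "flagged_high_risk": True,  "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": True},
]

for group in ("A", "B"):
    subset = [r for r in records if r["group"] == group]
    print(group, false_positive_rate(subset))
# Group A's false-positive rate is twice group B's:
# the same software makes unequal mistakes.
```

A few lines of arithmetic are enough to expose the disparity; the barrier is not technical but legal, since vendors are not required to release the data.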
As with most things digital, lawmakers are lagging behind the pace at which new algorithm-driven technologies are developed, and they are struggling to come up with a framework that would ensure proper regulation of algorithmic systems. Various institutions and agencies are trying to address the issue (for example, the European Centre for Algorithmic Transparency, ECAT), but in the meantime it is unclear where the responsibility for algorithms' negative outcomes lies.

Who is at fault when a person becomes radicalized inside a social media filter bubble, shown only the content an algorithm deemed suitable, or when someone is racially discriminated against? Is it the coder, the data set that was fed to the AI, or the company that decided to use it?

Ultimately, there is a great need for regulation and openness when it comes to programs that affect our lives more and more every day. And just as the machines are learning about us, it is important that we learn about them, so that we as individuals can better understand how our data is used and what our rights are.

*Image credit: rawpixel.com/freepik.com*