{"version":"1.0","provider_name":"Big Data Big Responsibility","provider_url":"https:\/\/wpmu.mau.se\/nmict22group10","title":"Algorithm Bias - Computers Can Be Racist Too - Big Data Big Responsibility","type":"rich","width":600,"height":338,"html":"<blockquote class=\"wp-embedded-content\" data-secret=\"2BHwscqoR3\"><a href=\"https:\/\/wpmu.mau.se\/nmict22group10\/2022\/10\/17\/algorithm-bias-computers-can-be-racist-too-2\/\">Algorithm Bias &#8211; Computers Can Be Racist Too<\/a><\/blockquote><iframe sandbox=\"allow-scripts\" security=\"restricted\" src=\"https:\/\/wpmu.mau.se\/nmict22group10\/2022\/10\/17\/algorithm-bias-computers-can-be-racist-too-2\/embed\/#?secret=2BHwscqoR3\" width=\"600\" height=\"338\" title=\"&#8220;Algorithm Bias &#8211; Computers Can Be Racist Too&#8221; &#8212; Big Data Big Responsibility\" data-secret=\"2BHwscqoR3\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" class=\"wp-embedded-content\"><\/iframe><script type=\"text\/javascript\">\n\/* <![CDATA[ *\/\n\/*! 
This file is auto-generated *\/\n!function(d,l){\"use strict\";l.querySelector&&d.addEventListener&&\"undefined\"!=typeof URL&&(d.wp=d.wp||{},d.wp.receiveEmbedMessage||(d.wp.receiveEmbedMessage=function(e){var t=e.data;if((t||t.secret||t.message||t.value)&&!\/[^a-zA-Z0-9]\/.test(t.secret)){for(var s,r,n,a=l.querySelectorAll('iframe[data-secret=\"'+t.secret+'\"]'),o=l.querySelectorAll('blockquote[data-secret=\"'+t.secret+'\"]'),c=new RegExp(\"^https?:$\",\"i\"),i=0;i<o.length;i++)o[i].style.display=\"none\";for(i=0;i<a.length;i++)s=a[i],e.source===s.contentWindow&&(s.removeAttribute(\"style\"),\"height\"===t.message?(1e3<(r=parseInt(t.value,10))?r=1e3:~~r<200&&(r=200),s.height=r):\"link\"===t.message&&(r=new URL(s.getAttribute(\"src\")),n=new URL(t.value),c.test(n.protocol))&&n.host===r.host&&l.activeElement===s&&(d.top.location.href=t.value))}},d.addEventListener(\"message\",d.wp.receiveEmbedMessage,!1),l.addEventListener(\"DOMContentLoaded\",function(){for(var e,t,s=l.querySelectorAll(\"iframe.wp-embedded-content\"),r=0;r<s.length;r++)(t=(e=s[r]).getAttribute(\"data-secret\"))||(t=Math.random().toString(36).substring(2,12),e.src+=\"#?secret=\"+t,e.setAttribute(\"data-secret\",t)),e.contentWindow.postMessage({message:\"ready\",secret:t},\"*\")},!1)))}(window,document);\n\/\/# sourceURL=https:\/\/wpmu.mau.se\/nmict22group10\/wp-includes\/js\/wp-embed.min.js\n\/* ]]> *\/\n<\/script>\n","thumbnail_url":"https:\/\/wpmu.mau.se\/nmict22group10\/wp-content\/uploads\/sites\/75\/2022\/10\/technology-human-touch-background-modern-remake-creation-adam-scaled.jpg","thumbnail_width":2560,"thumbnail_height":1707,"description":"We are surrounded by algorithms &#8211; they decide what we see in our Facebook Feeds, screen our job applications, calculate probabilities and drive facial recognition software. 
While they are faster and sometimes more accurate than us at processing large amounts of data, there is one area where they can be just as flawed as humans &#8211; they are biased. While it sounds strange to assign an inanimate object with attributes usually reserved for people, the core of AI is, ultimately, human. It is humans that design it, humans that feed it information to train it, and humans that decide what &hellip;"}