HumanitarAI delves into the dynamic intersection of datafication, AI, and social media, exploring how these tools are reshaping the way we approach humanitarian efforts and communication for development.
 
Deepfakes: a data security issue or a gender issue?

I have never feared posting a selfie on Facebook or updating my headshot on LinkedIn.

However, deepfakes are a new phenomenon creating fear in everyday women just like me. Deepfakes are a recent development, the term having been coined on the popular social media platform Reddit (van der Nagel, 2020). Unfortunately, the creation of deepfakes is not only a data security issue but a gender issue, with women being the main target.

Over the summer, PBS (the Public Broadcasting Service) aired a special interview about the fears women now live with due to the rise of deepfakes. Deepfakes are taking pornography websites by storm, and 96% of them are non-consensual uses of a woman’s image. Nina Jankowicz, author of “How to Be a Woman Online: Surviving Abuse and Harassment, and How to Fight Back,” found herself in exactly that situation, with her face placed on a naked body that was not hers. After facing years of cyber-sexual harassment, she decided to help others in similar situations. In July 2023, she sat down with PBS to address these new fears (Barron-Lopez & Jankowicz, 2023).

Image by PBS; click here to watch the interview.

By a landslide, the main reason deepfakes are created is for sexual purposes, specifically to generate images of celebrities (Lucas, 2022). Several Twitch streamers have reported their images being used without their consent on popular websites like PornHub. Political figures like Hillary Clinton and Greta Thunberg are also experiencing this violating circumstance. Even more concerning, one poll shows that most men would like to see nude images of women they know (Reuther, n.d.).

Graphic by Reuther.

Predators defend their actions by claiming they have the right to create ‘art’ with the public images of public figures (Barron-Lopez & Jankowicz, 2023). But is this truly affecting only public figures? After watching the interview, I am beginning to wonder whether my Facebook photos of myself, my friends, and my children are truly safe from this form of sexual harassment. To make it worse, the photos used for deepfake porn are not only taken from social media posts. ‘Creepshots’ are on the rise, with women being photographed in public by strangers (van der Nagel, 2020). Reddit has been an active platform for predators to share images of neighbors, classmates, and other non-famous women for their pleasure, without repercussions.

If I were ever to find my image in a deepfake video, I would have no idea what to do. I would be embarrassed and ashamed; where would I even start to find the person who did this? Nina Jankowicz confirmed my worst nightmare when she mentioned that if the predator is outside your state or country, there is no jurisdiction (Barron-Lopez & Jankowicz, 2023). However, more verification programs are developing, like Reddit Gonewild, where you must confirm that the photo you are uploading is of yourself (van der Nagel, 2020). These verification systems aim to limit how often strangers can exploit someone’s image without their consent.

“The men are very concerned about their own privacy. They don’t want to be found out for making these videos, but they’re not concerned about the privacy of the women that they’re making videos of.” (Barron-Lopez & Jankowicz, 2023)

Jankowicz sat on the Disinformation Governance Board under the Biden administration. The board was dissolved in less than a month after the Republican party argued that it violated First Amendment rights to freedom of speech and promoted censorship (Barron-Lopez & Jankowicz, 2023). As I write this blog, I am residing in the United States, where I feel utterly unprotected and vulnerable as a woman on the internet.

Although deepfake detection technologies are available, once a victim’s image is published on a porn site, the emotional and mental damage has been done, whether or not the video can be proven fake. Add to that the professional and social reputations damaged by the sexual harassment, which forces women into hiding (Okolie, 2023).

In 2019, the BBC spoke with two Zimbabwean women affected by deepfakes. One woman lost her job, and the other was disowned by her community and removed from school (BBC, 2019). Deepfakes are debilitating women in both the Global North and the Global South.

Image by BBC; click here to watch the video.

How has this form of sexual harassment expanded to every corner of the world in recent years? Perhaps because of how easy it is. There are ‘nudifying’ websites and bots, including ones hosted on the messaging app Telegram, available to anyone. Once the picture or video has been generated, the website simply asks for a Visa or Mastercard to complete the checkout (Das, 2021). Telegram alone has been linked to the creation of more than 68,000 nude images (Lucas, 2022).

With the rise of social media platforms like TikTok creating a cultural movement, how can we know that our daughters’, our sisters’, and our own images and clips are safe from extortion through deepfakes? This is not only a sexual harassment concern; it has developed into a child pornography issue as well. With almost a third of TikTok users under the age of 14, several of these children have already been victims of deepfakes (Lucas, 2022). But that is another topic for another day.

Thanks for reading, and be sure to follow for more! 

References: 

Barron-Lopez, L., & Jankowicz, N. (2023, July 23). Women face new sexual harassment with deepfake pornography [Interview]. PBS.

BBC. (2019). What are the effects of revenge porn to victims. The SheWorld. Retrieved from https://www.bbc.co.uk/programmes/p07xs7qs.

Das, S. (2021, August 29). Credit card firms linked to deepfake ‘nudify’ site; An AI website that boasts it can ‘strip’ women and girls has attracted millions of users willing to pay for the privilege. Sunday Times [London, England], 21.

Lucas, K. T. (2022). Deepfakes and domestic violence: Perpetrating intimate partner abuse using video technology. Victims & Offenders, 17(5), 647-659. https://doi.org/10.1080/15564886.2022.2036656

Okolie, C. (2023). Artificial Intelligence-Altered Videos (Deepfakes), Image-Based Sexual Abuse, and Data Privacy Concerns. Journal of International Women’s Studies, 25(2), COV2+. https://link-gale-com.proxy.mau.se/apps/doc/A744256756/ITOF?u=mahogsk&sid=bookmark-ITOF&xid=29e42d2c

Reuther, J. (n.d.). Digital Rape: Women Are Most Likely to Fall Victim to Deepfakes. The Deepfake Report. https://www.thedeepfake.report/en/09-digital-rape-en

van der Nagel, E. (2020). Verifying images: Deepfakes, control, and consent. Porn Studies, 7(4), 424-429. https://doi.org/10.1080/23268743.2020.1741434