Tackling Deepfake Pornography in Peru

Author: Mariana del Pilar Apaza Calderon
San Martin de Porres University, Lima

Introduction:

Deepfake pornography[1], a term that might sound like a plot from a futuristic thriller, has become a stark reality in our digital age. This clandestine realm of manipulated videos and images blurs the line between reality and fabrication, ensnaring unsuspecting individuals in explicit content they never agreed to[2].

This article begins by revealing a case from Lima where deepfake pornography resurfaced, emphasizing the pressing need to address it. The author will then explore the world of deepfake pornography, focusing on its disproportionate impact on women. Finally, the author will discuss potential solutions, highlighting how the Peruvian government can combat it while addressing its underlying cause: online misogyny.

A Disturbing Discovery:

One day, a female student at St. George’s College, a school in Lima, Peru, gained access to a school computer. While browsing, she stumbled upon an open Instagram account belonging to one of her male classmates. There she discovered a chat containing before-and-after images in which the faces of her female classmates, her own included, had been superimposed onto bodies that did not belong to them. The chat also included information about the prices of these manipulated pictures, potential buyers, and references to the artificial intelligence program used[3].

Shocked and concerned, she promptly reported this discovery to the school office. After receiving no response, she told her parents about the situation. As a result, numerous parents organized a protest outside the school, demanding the expulsion of the classmates at fault[4].

According to a statement sent by St. George’s College to the parents of students in the eighth through eleventh grades, the institution responded by offering emotional support to the affected teenagers and contacting both the victims’ families and the alleged perpetrators, while emphasizing that the incident did not occur within the school’s educational environment, since it took place in a virtual space[5].

Meanwhile, the Family Prosecutor’s Office in Chorrillos is investigating the alleged violation of child pornography laws at St. George’s College, involving interactions that took place outside the school[6]. This investigation includes collaboration with various authorities and agencies, such as the Ministry of Women and Vulnerable Populations, the Ministry of Education, and the National Police[7].

Unmasking Deepfake Pornography: Tracing the Genesis

In 2014, Ian Goodfellow was celebrating his recent graduation from a doctoral program at a bar with friends when he had a remarkable insight: could two computer programs compete, one inventing data and the other judging its realism?[8] This idea gave birth to Generative Adversarial Networks (GANs)[9], a pair of computer programs in which one generates data, such as images, and the other evaluates how real it appears. By continually challenging each other to improve, they produce increasingly lifelike content over time[10].
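
To make this adversarial dynamic concrete, the following is a minimal, illustrative sketch of a generator-versus-discriminator training loop in Python with PyTorch. The framework, network sizes, and the random stand-in for "real" data are assumptions made for illustration only; this is not Goodfellow’s original implementation.

```python
# Minimal GAN training loop sketch (illustrative assumptions throughout).
import torch
import torch.nn as nn

noise_dim, data_dim = 16, 64

# Generator: maps random noise to a fake sample (e.g., a flattened image).
generator = nn.Sequential(
    nn.Linear(noise_dim, 128), nn.ReLU(), nn.Linear(128, data_dim)
)
# Discriminator: maps a sample to the probability that it is real.
discriminator = nn.Sequential(
    nn.Linear(data_dim, 128), nn.ReLU(), nn.Linear(128, 1), nn.Sigmoid()
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(1000):
    real = torch.randn(32, data_dim)              # stand-in for a batch of real data
    fake = generator(torch.randn(32, noise_dim))  # generator invents a batch of fakes

    # Discriminator learns to label real samples 1 and fakes 0.
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator learns to produce fakes the discriminator labels as real.
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

Each round, the two networks push against each other: the discriminator gets better at spotting fakes, which forces the generator to produce ever more realistic output, the same pressure that, with image data and larger networks, yields convincing deepfakes.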

The easy accessibility of tools for creating deepfakes, the technology stemming from Ian Goodfellow’s GANs, has made anyone capable of generating highly convincing yet entirely fabricated content. This accessibility raises significant concerns, not because of the technology’s inherent nature, but because it enables the crafting of deceptive content with harmful intentions, without the consent or involvement of the individuals portrayed.

The issue is notably prevalent in the pornography industry, since 96% of deepfakes are pornographic[11]. The inception of deepfake pornography can be traced back to 2017, when an anonymous Reddit user known as “deepfakes” circulated convincing videos featuring the faces of female celebrities overlaid onto explicit scenes[12].

Grasping the concept:

Before we explore deepfake pornography, it’s crucial to understand Image-Based Sexual Abuse (IBSA)[13]. IBSA comprises actions such as the creation, theft, extortion, threatened or actual distribution, or any use of sexualized or sexually explicit materials without the consent of the person portrayed[14].

Deepfake pornography fits into the IBSA category of “creating explicit content without consent”[15], since it involves the use of advanced technology to generate authentic-looking but entirely fake content, using a person’s images to portray them, without authorization, in sexual acts they never engaged in[16]. In essence, deepfake pornography represents a form of IBSA[17].

Creating deepfake content is not inherently wrong, as it involves the generation of content that is not real. Similarly, crafting fake erotic content is not intrinsically objectionable, because online spaces can provide a platform for individuals to explore and enjoy their sexuality consensually. The red flag emerges, however, when this creative process occurs without the individuals’ consent[18]. In both instances, the central concern revolves around the lack of consent.

The vulnerability of women:

Having noted earlier that 96% of deepfakes are pornographic, it is even more alarming that 99% of these deepfake pornographic materials exclusively target women[19]. This stark gender imbalance shows that women bear a disproportionate burden of this form of IBSA, making deepfake pornography an unquestionably gendered problem.

Deepfake pornography, particularly that targeting women, has flourished alongside the evolution of the internet and specific online spaces. Despite shifts in recent years, pornographic content still predominantly caters to male audiences. Online platforms like Reddit, Discord, 4chan, and 8chan have cultivated communities that objectify women and perpetuate harassment. Within these digital spaces, some men believe they have the right to dominate women, which has driven the creation of deepfake pornography and reduced women to dehumanized, manipulable images[20].

The impact on female victims is profoundly distressing, often resulting in severe psychological distress and physiological symptoms. Women subjected to deepfake pornography often feel sexually violated, which in some cases leads to depression or post-traumatic stress disorder. The intensity of these emotional and physical reactions varies based on factors such as the content’s nature, the victim’s background, and their prior experiences[21].

Combatting Deepfake Pornography:

Addressing non-consensual deepfake pornography is a multifaceted challenge, as Arwa Mahdawi discusses, given the risk of inadvertently promoting harmful websites while trying to combat the issue[22]. However, it is imperative to shift our focus toward effective strategies to tackle this problem, while simultaneously addressing the pervasive online misogyny that glorifies the non-consensual objectification and exploitation of women.

In Peru, addressing non-consensual deepfake pornography requires targeted criminal law reforms to safeguard adult victims, since the current legal framework focuses exclusively on child pornography. Legislators should consider enacting legislation that explicitly criminalizes the creation, distribution, and possession of non-consensual explicit content targeting adults. Such legislation would acknowledge the disproportionate harm inflicted on women and the unique challenges and vulnerabilities they face concerning deepfake pornography.

Incorporating IBSA into Comprehensive Sexual Education as part of Peru’s National Basic Education Curriculum is also vital, because it promotes students’ sexual education and socio-emotional well-being. Including topics like deepfake pornography ensures a comprehensive education covering the risks, consequences, and ethical aspects of IBSA. Equipped with this knowledge, students can navigate the digital landscape safely and responsibly, contributing to the eradication of online misogyny, a root cause of the deepfake pornography problem.

Conclusion:

Deepfake pornography is a distressing problem that has far-reaching effects, especially on women. To address this issue, we must consider legal reforms, comprehensive sexual education, and challenging the online misogyny that enables its creation. By taking these steps, we can work toward a safer digital world where consent is respected, and individuals are shielded from the harm of deepfake pornography. It’s a complex challenge, but one that we must confront for the well-being of all.


[1] Matt Burgess, Deepfake Porn is Out of Control, Wired, (Feb 10, 2024, 10:43 AM), https://www.wired.com/story/deepfake-porn-is-out-of-control/.

[2] Daniel Story, Ryan Jenkins, Deepfake Pornography and the Ethics of Non-Veridical Representations, 36 Philosophy and Technology, 56 (2023) https://doi.org/10.1007/s13347-023-00657-0.

[3] Andina, Chorrillos: Parents unaware of the number of girls affected by photos with sexual content, (August 29th, 2023), https://andina.pe/agencia/noticia-chorrillos-padres-desconocen-numero-ninas-afectadas-fotos-contenido-sexual-953181.aspx

[4] Infobae, Chorrillos: Parents report students for selling sexually explicit photos of teenagers, (August 28th, 2023), https://www.infobae.com/peru/2023/08/28/padres-denuncian-a-escolares-del-colegio-st-georges-en-chorrillos-por-comercializar-fotos-con-contenido-sexual/

[5] Perú 21, Students manipulate the photographs of their classmates and sell them, (August 28th, 2023), https://peru21.pe/lima/escolares-pornografia-infantil-ciberacoso-st-georges-college-chorrillos-escolares-manipulan-las-fotografias-de-sus-companeras-y-las-venden-noticia/

[6] Andina, An investigation has been opened against students who edited photos of their classmates to sell them, (August 28th, 2023), https://andina.pe/agencia/noticia-abren-investigacion-contra-escolares-editaron-fotos-sus-companeras-para-venderlas-953149.aspx

[7] Forbes Peru, Prosecutors investigate those who sold photos of minors manipulated with AI in Lima, (August 29th, 2023), https://forbes.pe/actualidad/2023-08-29/fiscalia-investiga-a-quienes-vendieron-fotos-de-menores-manipuladas-con-ia-en-lima

[8] MIT Technology Review, The creator of GANs: The man who gave machines imagination, (March 2nd, 2018), https://www.technologyreview.es/s/10016/el-senor-de-las-gan-el-hombre-que-dio-imaginacion-las-maquinas

[9] Overview of GAN Structure, Google Developers, (Feb 10, 2024, 10:53 AM), https://developers.google.com/machine-learning/gan/gan_structure.

[10] We Lab Plus, Deepfakes: Danger or evolution?, (December 1st, 2021), https://welabplus.com/2021/01/11/deepfakes-evolucion-o-peligro/

[11] MIT Technology Review, Forget fake news—nearly all deepfakes are being made for porn, (October 7th, 2019), https://www.technologyreview.com/2019/10/07/132735/deepfake-porn-deeptrace-legislation-california-election-disinformation/

[12] Security Intelligence, Don’t Believe Your Eyes: Deepfake Videos Are Coming to Fool Us All, (March 6th, 2019), https://securityintelligence.com/dont-believe-your-eyes-deepfake-videos-are-coming-to-fool-us-all/

[13] Image Based Sexual Abuse, National Center on Sexual Exploitation, (Feb 10, 2024, 10:56 AM), https://endsexualexploitation.org/issues/image-based-sexual-abuse/.

[14] National Center on Sexual Exploitation, Image Based Sexual Abuse, (2023), https://endsexualexploitation.org/issues/image-based-sexual-abuse/

[15] PROTECT Act Will aid Victims of Image-Based Sexual Abuse against Big Porn, Dr. Rich Swier, (Feb 10, 2024, 11:06 AM), https://drrichswier.com/2022/10/04/protect-act-will-aid-victims-of-image-based-sexual-abuse-against-big-porn/.

[16] Sophie Maddocks, Image-Based Abuse A Threat to Privacy, Safety, and Speech, MediaWell, (Feb 10, 2024, 11:04 AM), https://mediawell.ssrc.org/research-reviews/image-based-abuse-a-threat-to-privacy-safety-and-speech/.

[17] Asher Flynn, Anastasia Powell, Adrian J Scott, Elena Cama, Deepfakes and Digitally Altered Imagery Abuse: A Cross-Country Exploration of an Emerging Form of Image-Based Sexual Abuse, 62 The British Journal of Criminology 1622 (2023), https://doi.org/10.1093/bjc/azac012.

[18] Annenberg School for Communication of the University of Pennsylvania, What Is Deepfake Porn and Why Is It Thriving in the Age of AI?, (July 13th, 2023), https://www.asc.upenn.edu/news-events/news/what-deepfake-porn-and-why-it-thriving-age-ai

[19] Centre for International Governance Innovation, Women, Not Politicians, Are Targeted Most Often by Deepfake Videos, (March 3rd, 2021), https://www.cigionline.org/articles/women-not-politicians-are-targeted-most-often-deepfake-videos/

[20] Vice, Deepfakes Were Created As a Way to Own Women’s Bodies—We Can’t Forget That, (June 8th, 2019), https://www.vice.com/en/article/nekqmd/deepfake-porn-origins-sexism-reddit-v25n2

[21] Health News, The Damage Caused By Deepfake Porn, (April 6th, 2023), https://healthnews.com/mental-health/anxiety-depression/the-damage-caused-by-deepfake-porn/.

[22] The Guardian, Nonconsensual deepfake porn is an emergency that is ruining lives, (April 3rd, 2023), https://www.theguardian.com/commentisfree/2023/apr/01/ai-deepfake-porn-fake-images
