In recent years, deepfakes have advanced rapidly, infiltrating daily life and reshaping how society operates. While some use them as harmless entertainment, they have also been weaponised, emerging as a new form of technology-facilitated gender-based violence in which women, girls, and gender diverse people are disproportionately targeted.
What is a deepfake, and how is it made?
Deepfakes are created by advanced AI algorithms that analyse large datasets of images, video, and audio to generate synthetic audio-visual material. Deepfake technology can falsify an individual’s actions or words, or depict fabricated events, in a way that is indistinguishable from real footage. Creators of deepfakes therefore wield enormous power: they can make their subjects appear to say or do anything. And as the technology becomes increasingly sophisticated, deepfakes no longer demand extensive resources or specialised expertise, allowing the average internet user to create highly believable deepfake content.
What does a deepfake look or sound like?
Deepfakes take many forms, from superimposing an individual’s face onto a pornographic video to instructing a deepfake application to depict the subject performing specific acts. ‘Nudify’ applications, which are designed to function exclusively on women’s bodies, allow users to ‘undress’ a woman by uploading a non-sexual image of her and selecting a preferred body type. This indicates that such platforms are not merely entertainment: they perpetuate, and profit from, misogynistic attitudes and power dynamics rooted in the control and objectification of women’s bodies.
How are deepfakes being used to harm women, girls and gender diverse people?
The Victorian Women’s Trust has been investigating deepfakes as an emergent form of image-based sexual abuse. A 2020 survey from RMIT University found that over 1 in 3 Australians had experienced image-based sexual abuse, up from 1 in 5 in 2016, with women twice as likely as men to be victims.
There is compelling data suggesting that the creation and sharing of non-consensual sexualised deepfakes are on the rise online. Non-consensual pornographic videos make up 90-95% of the deepfake material currently online, and over 90% of it depicts women and girls. Many deepfakes are shared across social media platforms, often alongside personal data such as the victim’s name, age, location, and social media profiles, enabling viewers to further harass or stalk victims.
The new age of sexism
UK feminist and author Laura Bates recently published ‘The New Age of Sexism’, a book that draws attention to the way misogyny is coded into the design of AI. Bates explores AI girlfriends, chatbots, cyber brothels, and deepfakes, highlighting how deepfakes have emerged as a form of technology-facilitated gender-based violence that disproportionately targets women and girls.
The real-world impact of deepfakes on victim-survivors
The emotional and psychological toll of deepfake abuse is complex. Because many people are unaware of deepfake technology, even low-quality media can appear real and cause victims significant psychological harm. And because much sexualised deepfake content is not ‘real’ (the victim’s face has been superimposed onto another person’s body), victims may be reluctant to report the abuse to authorities, or may doubt whether they are entitled to feel traumatised. This is particularly common when the deepfakes were created or shared by someone close to the victim, such as a current or former intimate partner.
Deepfakes and young people
Alarmingly, deepfake abuse is not limited to adults. While many sites do not explicitly market themselves to minors, there are often no restrictions preventing access by users as young as ten. Recent reports of students in Melbourne schools creating and sharing sexualised deepfake content of their female peers and educators highlight how these harmful platforms lack adequate age restrictions. This disturbing trend demonstrates how deepfakes dehumanise women and perpetuate a culture of misogyny and gendered violence, with such attitudes now emerging at alarmingly early ages.
The deepfake industry is largely unregulated, with several legal ‘grey areas’ that hinder action on privacy, online safety, and data misuse. Because AI continues to advance at an unprecedented rate, the rapid development of this technology has made it difficult for policymakers to draft timely legislation.
While the eSafety Commissioner supports victims by removing deepfake content, media that has been shared privately is difficult to erase from the internet completely. And if victims go down the defamation route, a legal process few can afford, monetary compensation neither prevents the continued sharing of deepfakes nor redresses the social ramifications of the harm.
Preventing deepfake abuse through better legislation
Some legislative reforms to address deepfake abuse have, however, been implemented in Australia. The Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 seeks to criminalise the use of a carriage service (such as social media) to share non-consensual sexual material, with a penalty of up to six years’ imprisonment for sharing deepfake pornography on private or public platforms, and an additional year where the person also created the material. The Bill aims to address a larger issue: violence against women, and the role of social media in reinforcing harmful misogynistic content and representations.
While the Online Safety Amendment (Social Media Minimum Age) Bill seeks to ban social media for users under 16, many deepfake platforms remain accessible to underage users. Sciberras (2024) argues that age restrictions on social media platforms imply that misogyny occurs only in online spaces, rather than acknowledging it as a wider societal problem. Educating young people in digital literacy and online safety is considered more likely to bring about lasting change than simply restricting their access to social media.
What will banning deepfakes achieve?
Restricting deepfake technology will not prevent the creation of non-consensual sexualised deepfakes or image-based sexual abuse; the technology is merely the tool through which the abuse occurs. This form of gendered violence reflects misogynistic social and cultural attitudes that cast women as sexual objects to whom men are entitled. The statistics bear this out: women and girls are disproportionately the victims of non-consensual sexualised deepfakes, and men are overwhelmingly the perpetrators.
Tools for digital safety
Deepfake detection technologies allow consumers to make informed and critical decisions about the content they consume. Intel’s ‘FakeCatcher’, for example, analyses subtle signs of blood flow in the pixels of a video and claims to detect deepfakes with 96% accuracy. Projects such as ‘Detect DeepFakes’ by the Massachusetts Institute of Technology (MIT) aim to educate users on how to detect and counteract AI-generated misinformation, particularly political deepfakes.
However, detection technologies reveal deeper inequalities. The datasets used to train them are encoded with biases that exclude racially and ethnically diverse people, and many detection tools can detect deepfakes only on white bodies. This exemplifies how deeply layered and complex deepfakes, and the measures to regulate them, have become.
Deepfakes are not a standalone form of gendered violence; rather, they are a tool for technology-facilitated gender-based violence that has long existed in online and offline spaces. Their proliferation is damning evidence of an ongoing imbalance in cultural values, whereby the lives, experiences, and dignity of women and girls are deemed less worthy of respect. To truly eradicate the scourge of technology-facilitated abuse against women and girls, we first have to interrogate the society in which such misogyny is able to flourish, and ask: is this the future we want for women and girls?
Sofia is a human geographer passionate about placemaking as a catalyst for social change and community empowerment. Her research specialises in the interplay between social and built environments, with a particular focus on identity and belonging among LGBTQIA+ communities.