
The digital frontier: Why gender equality matters in the age of AI

At the Victorian Women’s Trust, our mission has always been to protect and advance the rights of women and girls. Today, that mission has a new digital frontier. Artificial Intelligence (AI) is no longer a ‘future’ technology; it is the silent engine behind our hiring processes, our healthcare, and our online social interactions.

Image: (from left) Dr. Kirsten Abernethy (Executive Director, Victorian Women’s Trust), Dr. Jessica Lake (Melbourne University), and Cara Shrivastava (Your Creative).

We recently gathered a small group of experts and supporters to pull back the curtain on these systems. We invited Cara Shrivastava (Senior Digital Strategist, Your Creative), a specialist in translating complex technology into clear digital experiences. With a Master’s in Marketing Communications and a decade of experience across journalism and digital strategy, she advocates for designs that consider the social impact too often ignored by the ‘tech bros’.

To bring a different lens to the discussion, we also invited renowned expert in media law and legal history, Dr Jessica Lake (Senior Lecturer, Melbourne Law School). Jessica’s research reveals that women have been fighting for image dignity since the 1850s. Her work, including her latest book, Special Damage: The Slander of Women and the Gendered History of Defamation Law, explores the gendered history of defamation and privacy.

The insights shared in our conversation are vital for our entire community. We are sharing these key learnings to ensure that technology serves equality, not exclusion.

 

The Myth of ‘Tech Neutrality’

It is a common misconception that algorithms are objective. Cara highlights that tech neutrality is a myth:

  • Encoded bias: Automation and generative AI reflect the biases of the data they are fed, which is often historically skewed against women.

  • Amplified outcomes: Shrivastava explains that AI is “basically a data set that’s just trying to predict the next outcome… but all of those data sets are from the world […] that clearly haven’t been equal to women in the slightest. And AI has taken that and amplified it.”

  • Hidden presence: AI has been part of our lives for years, quietly embedded in social media algorithms, financial services, GPS navigation, healthcare, government databases, and recruitment systems. Opting out is now extremely difficult: even if you do not personally use AI, the systems around you do.
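A minimal, purely illustrative sketch of Shrivastava's point above, using invented numbers (not any real system): a naive "predict the next outcome" model trained on a skewed historical record does not merely reflect the skew, it amplifies it into an absolute rule.

```python
from collections import Counter

# Hypothetical historical record: 80% of past hires were men.
history = ["hired_man"] * 80 + ["hired_woman"] * 20

counts = Counter(history)

def predict_next():
    # The model simply predicts the most frequent past outcome,
    # so an 80/20 historical imbalance becomes a 100% certain "prediction".
    return counts.most_common(1)[0][0]

print(predict_next())  # prints "hired_man" every single time
```

Real generative systems are vastly more complex, but the dynamic is the same: skewed training data in, skewed (and often sharpened) outcomes out.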

 

Justice and ‘Image Dignity’

Jessica notes that while deepfakes feel like a new phenomenon, they are the modern evolution of old harms; women have been fighting for image dignity since the invention of the camera.

“[Deepfakes] have been a problem since the beginnings of photography and cinema. What’s new now, however, is the speed and ease.” – Dr Jessica Lake

  • Deepfakes = real harm: This is a deeply gendered problem; 98% of deepfakes circulating online are pornographic, and 99% of those depict women or girls (eSafety Commissioner, 2024).

  • The need for education: We must build awareness of ‘image autonomy’ into sex education, starting as early as primary school.

  • Law reform: Australia can learn from countries like Denmark, which is leading a push to grant individuals stronger rights over their own faces and voices to prevent unauthorised deepfakes.

 

Defining ‘Feminist AI’

Creating a fair digital future requires more than just representation; it requires a specific framework. Feminist AI is built on:

  • Agency: Ensuring women have a say in how data is collected and used.

  • Accountability: Holding tech giants responsible for the “unintended consequences” of their products.

  • Intersectionality: Designing systems that work for all women, recognising that bias hits marginalised communities the hardest.

 

 

How you can take action

We can all advocate for a more equitable digital world by taking these practical steps:

  • Demand transparency: Support legislation that requires companies to explain their algorithms.

  • Audit your tools: Jessica recommends investigating the AI systems you currently use so you can make an informed, ethical choice. For instance, be aware that OpenAI (the maker of ChatGPT) works with the US military, and that Palantir (co-founded by PayPal co-founder Peter Thiel) is widely used for military purposes and the surveillance of civilians.

  • Consent matters: Cara emphasises reading Terms and Conditions (or using AI tools to summarise and dig into what you are signing up for) to ensure your data is not used without your consent.

 

Looking ahead: Our inaugural Feminist Researcher in Residence program 2026

This discussion was the first of many to come, and ties directly into our 2026 Feminist Researcher in Residence program. The inaugural researcher will spend the coming months translating complex tech issues into policy recommendations and practical insights for Australian women and girls. The role is currently open for submissions; applications close April 10.

 

Join the movement:

Together, we can create a stronger movement committed to social progress and equality in the digital age.
