Australia Targets AI-Driven Abuse with New Laws to Combat Deepfake Nudes and Stalking

September 2, 2025
  • Australia is moving forward with legislation to combat AI-generated non-consensual nude images and stalking, aiming to better protect individuals, especially children, from online abuse.

  • The government says it will restrict apps whose sole purpose is abuse, humiliation, or harm, focusing on 'nudify' apps and the sextortion scams that have surged, particularly against minors.

  • While the government plans to collaborate with industry stakeholders to develop this legislation, no specific timeline has been provided for its implementation.

  • These initiatives are part of broader efforts to combat online harm, especially targeting vulnerable groups such as children.

  • The government recognizes that legislation alone isn't enough and plans to work with tech platforms to address these issues more effectively.

  • A social media age verification system is under development, and authorities are pressuring tech companies to prevent the spread of harmful AI-generated content.

  • Industry groups like DIGI support these efforts, and companies such as Meta are actively removing deepfake content and sharing signals to combat AI-enabled abuse.

  • Recent studies show that effective age verification is feasible through a range of technologies, though no single method fits all contexts, informing authorities' efforts to improve online safety for minors.

  • Reports from the eSafety Commissioner reveal that incidents involving digitally altered intimate images of minors have doubled over the past 18 months, highlighting the growing threat.

  • Meta has taken legal action against the Hong Kong-based company behind CrushAI, an app that creates sexually explicit deepfakes, demonstrating proactive enforcement against non-consensual imagery.

  • Offenders often train AI models on images of children and then delete the training data, making it difficult for law enforcement to distinguish real from synthetic material and complicating investigations.

  • A 2022 survey found that 10% of young respondents knew someone who had had deepfake nude imagery created of them, and 6% reported being victims themselves.

Summary based on 16 sources

