
Independent Federal Member for Curtin, Kate Chaney, has introduced a Private Member’s Bill to criminalise the use of AI to create child sexual abuse material. Photo: Kate Chaney Facebook.
Independent federal MP Kate Chaney will begin the second week of the 48th Parliament introducing a Private Member’s Bill to better protect children against exploitation and sexual abuse.
It addresses loopholes in laws that have yet to catch up with the sinister uses of artificial intelligence.
The bill targets emerging technologies designed to create, train or facilitate the production of child sexual abuse material.
“We need to do everything we can to stop the alarming growth in child sexual abuse material,” she said.
“Deepfakes, AI-generated child sexual abuse material and child-like AI personas can be created by these sophisticated tools and are inundating law enforcement with more material.
“Right now, predators can access and download these sickening tools and train them to generate child sexual abuse material and then delete them before detection, to evade existing possession laws.
“The technology landscape is changing at such a rapid rate – we need to be across advancements in AI to make sure the laws keep up with the tools offenders use to create this horrific material.”
Ms Chaney’s bill responds to one of the key recommendations from the First National Roundtable on Child Safety in the Age of AI, hosted by the International Centre for Missing and Exploited Children Australia (ICMEC Australia) at Parliament House last Thursday (17 July).
Ms Chaney has worked with ICMEC Australia on the bill to ensure it addresses the identified critical gap in the Criminal Code Act 1995 (Cth).
ICMEC Australia CEO Colm Gannon said the organisation was proud to have convened national experts to the roundtable to help drive a shift from concern to action.
“There was strong consensus at last week’s roundtable that tools built to generate child abuse material have no place in our society,” he said.
“This bill is a clear and targeted step to close an urgent gap, and a strong signal that protecting children must be front and centre as AI evolves.”
Ms Chaney’s bill makes it an offence to download technology for creating child abuse material, and an offence to collect, scrape or distribute data with the intention of training or creating such technology.
A public benefit defence is included for law enforcement and intelligence officers.
“This Private Member’s Bill will limit the ability to generate on-demand unlimited child sexual abuse material at scale, often tailored to specific preferences, including the use of real children’s images and details,” the Member for Curtin said.
“The generation of this material impacts law enforcement’s ability to investigate offences due to the difficulty in distinguishing synthetic material from real material, diverting precious resources from investigating sexual abuse and exploitation of real children.
“This Private Member’s Bill represents a proactive and targeted legislative response, and we urge the government to act.”
The MP has been backed in her bid by Queensland’s 2019 Australian of the Year, Jon Rouse, a recipient of the Australian Police Medal.
A well-known champion for children, the retired Detective Inspector offered his support.
“For several years, offenders have been training and exploiting AI generation tools to create child sexual abuse material (CSAM) and to obscure their digital footprints,” he said.
“While existing Australian legislation provides for the prosecution of CSAM production, it does not yet address the use of AI in generating such material – an urgent legislative gap.
“This bill marks a crucial advancement, equipping investigators and the judiciary with the means to directly confront the misuse of these technologies.
“It is important to recognise that real children have been harmed – their abuse images used to train these AI systems.
“This is not a victimless crime.”
Senior lecturer in cybersecurity and privacy at the University of Tasmania, Joel Scanlan, described the bill as a “critical and proactive step” in addressing the threat of AI-generated child sexual abuse material.
“By criminalising the training of AI systems to produce this content, this legislation closes a dangerous loophole and is fundamental to reducing the amount of child abuse material available on the internet,” Dr Scanlan said.