8 December 2025

Search engines being made to redirect self-harm queries and blur ‘lawful but awful’ content

By Chris Johnson

Search engines will have to automatically redirect queries about self-harm to mental health support services. Photo: Michelle Kroll.

CONTENT WARNING: This story may distress some readers.

Search engines such as Google and Bing will soon have to automatically redirect Australian users seeking information about suicide, self-harm and eating disorders to mental health support services.

They will also have to blur pornographic images in search results to protect children from accidentally seeing them.

Adult users can then proceed by clicking through to view the images if they so choose.

The new rules will take effect from 27 December, just over two weeks after the social media ban for under-16s kicks in.

They form part of the Age-Restricted Material Codes that apply to online service providers such as app stores, social media services, equipment providers, online pornography services, and generative AI services.

eSafety has published regulatory guidance for the codes, which it says were drafted by industry and require online services to protect children from exposure to age-inappropriate content.


eSafety Commissioner Julie Inman Grant said more young people were unintentionally encountering age-inappropriate content.

“We know this is already happening to kids from our own research, with one in three young people telling us that their first encounter with pornography was before the age of 13 and this exposure was ‘frequent, accidental, unavoidable and unwelcome’ with many describing this exposure as being disturbing and ‘in your face’,” Ms Inman Grant said.

“We know that a high proportion of this accidental exposure happens through search engines as the primary gateway to harmful content, and once a child sees a sexually violent video, for instance, maybe of a man aggressively choking a woman during sex, they can’t cognitively process, let alone unsee that content.

“From 27 December, search engines have an obligation to blur image results of online pornography and extreme violence to protect children from this incidental exposure, much the same way safe search mode already operates on services like Google and Bing when enabled.

“And one of the most crucial requirements under the code will be automatic redirects to mental health support services for searches related to suicide, self-harm or eating disorders.

“These are important societal innovations that will provide greater protections for all Australians, not just children, who don’t wish to see ‘lawful but awful’ content.”

eSafety Commissioner Julie Inman Grant says the new codes are all about protecting children. (Photo: eSafety)

The eSafety Commissioner said it gave her some comfort to know that, thanks to these codes, a vulnerable Australian child thinking about taking their own life won’t be sent down harmful rabbit holes or directed to specific information about lethal methods.

They will instead now be directed to professionals who can help and support them.


“If this change saves even one life, as far as I’m concerned, I believe it’s worth the minor inconvenience this might cause some Australian adults,” she said.

“Suicide devastatingly reverberates across families and communities, and represents a point of no return.

“But let me be clear, what this code won’t do is require Australians to have an account to search the internet, or notify the government you are searching for porn.

“And while certain images of pornography or extreme violent material in search results might be blurred, adults who wish to view that content can still click through to see it if they choose.

“Again, this is about protecting our kids from accidental exposure to material they will never be able to unsee.”

The regulatory guidance covers both the new codes coming into effect that apply to age-restricted material and the pre-existing Unlawful Material Codes and Standards, which tackle the “worst-of-the-worst” unlawful online material, including child sexual exploitation and abuse material, as well as pro-terror content.

The Age-Restricted Material Codes will complement the upcoming social media minimum age obligations set to commence on 10 December.

The codes were developed following public consultation.

If this story has raised any issues for you, you can call Lifeline’s 24-hour crisis support line on 13 11 14.

