
The risk of malicious and deceptive use of AI in elections is increasing. Photo: Michelle Kroll.
The ACT Electoral Commission has called for new laws to catch up with the growth of artificial intelligence and the threat it poses to the democratic process.
Its official report on the conduct of the 2024 ACT Legislative Assembly election says no concerns were raised about the use of AI during the voting process. But if they had, the commission’s hands would be tied.
“It appears the current regulatory framework in the ACT to combat the use of AI to influence the outcome of an election is insufficient,” the report says.
It says the Electoral Act does not currently address the threat of deepfakes and other forms of manipulated images, audio or video in any direct way.
The two key sections – on misleading advertising and defamation of candidates – are limited in scope and difficult to enforce.
The report says that under present law a political deepfake breaches no provision as long as it carries the required authorisation.
“Consequently, an appropriately authorised standalone video containing an AI-generated image of a candidate making AI‑generated false statements is unlikely to fall within the enforcement parameters of any of these legislative provisions,” it says.
“If the commission received a complaint about an AI-generated replica of a candidate making falsified statements, it would largely be powerless to act against the creators or publishers.”
The report says examples of AI-generated imagery and audio seen during the federal election were obviously not real and were created either to make a political point or to be satirical.
But as the technology becomes more widely available and improves, the risk of malicious and deceptive AI-generated electoral advertising will increase, it says.
“Controversies around AI-generated deepfake content intended to deceive the public or influence public opinion have become increasingly prevalent globally,” the report says.
“AI-generated content involving images, audio or video manipulated to create fictitious depictions of people can damage an individual’s reputation, posing a threat to political candidates engaged in an election campaign.
“The ease of producing and spreading disinformation and misinformation across the internet makes it harder for the public to discern trustworthy information, undermining informed voting decisions.”
The commission recommended the Legislative Assembly investigate potential new laws to deal with AI-generated political deepfake content, with a view to enacting them before the 2028 ACT election.
The report pointed to South Australia's new laws as a starting point.
Its Electoral (Miscellaneous) Amendment Act 2024, passed last November, introduced offences relating to the distribution of artificially generated election ads.
It is now an offence to publish or distribute an election ad that includes a simulated depiction of a person performing an act they did not actually perform – unless the advertisement clearly states it has been artificially generated.