
Public servants’ use and understanding of GenAI varies significantly, according to new academic research. Photo: Michelle Kroll.
Public servants don’t yet fully understand or trust the applications and impacts of artificial intelligence (AI) in their work.
The use of generative AI (GenAI) varies widely across the Australian Public Service, as well as state and territory services. But technology failures and concerns over trust in government are casting a long shadow over how it is being embraced.
One of the biggest factors contributing to these concerns is the massive failure in adopting technology for the tragic and illegal Robodebt scheme.
UNSW Canberra’s Public Service Research Group has released a comprehensive study into GenAI and how it’s applied to policy work.
It found that while the use of GenAI was growing rapidly within the public service, there were wide disparities in attitudes, understanding and proficiency among public servants.
Report co-author Professor Helen Dickinson said opinions on GenAI varied greatly among public servants: some viewed it as a valuable tool to increase efficiency, while others thought the risks outweighed the potential benefits.
“Past challenges with implementing new technologies are clearly front of mind for many public servants,” Professor Dickinson said.
“Learnings from the massive failures identified in the Robodebt scheme are influencing how senior public servants perceive advancements in the application of GenAI.
“Public servants are wary of anything that might compromise citizen trust and confidence in government and so there is hesitance in allowing GenAI to be used in areas like external-facing service delivery.
“As such, there is widespread agreement that the use of GenAI in policy work requires adequate human oversight.”
The academics interviewed senior public servants in 22 state, territory and federal government agencies to gather perspectives from across the public sector.
The research found GenAI has so far been used in policy work mainly to speed up administrative tasks involving large datasets or to summarise large volumes of information or documents.
It said some public servants were using platforms such as ChatGPT to generate new ideas or content, in much the same way web searches have become commonplace.
Others reported the ability to use the technology to create a “subject specific knowledge base” by feeding in relevant policies, data, legislation and other agency material.
“Many government agencies are using GenAI to support their goals. However, opinions on GenAI’s potential vary among senior public servants,” the report stated.
“Some believe it can transform policy work, while others are cautious due to perceived risks. Past challenges with technology implementation, including learnings from policy failures… are influencing how senior public servants perceive advancements in the application of GenAI.
“There is widespread agreement the use of GenAI in policy work requires adequate human oversight.”
The report found GenAI programs – easily available online and often for free – allowed public servants to access functions on their personal phones or laptops.
This easy access allowed public servants to build their skills in using GenAI, but it also meant the level of proficiency varied greatly across the workforce.
Other concerns about GenAI use included its reliance on the data fed into it, which risks amplifying biases, and the possibility that it could compromise data sovereignty by ceding control of information governments hold on behalf of the people they serve.
Professor Dickinson said some also perceived the use of GenAI as a potential mechanism to conceal the workings of government agencies.
She said it was imperative for governments to formalise the intended role of GenAI in policy work.
“A statement regarding why governments are investing in GenAI is critical to building understanding of its intended contribution to high-quality policy work,” Professor Dickinson said. “Through a GenAI strategy, its value can be further identified, monitored and assessed.
“Another key goal should be to enhance AI literacy among public servants as it is going to remain a feature of the future workplace.
“Agencies must also ensure human policy crafting skills are maintained as current AI tools can only provide content based on historical data and some complex policy issues cannot be solved by what has been done before.”
The potential environmental impact of GenAI also raised questions about whether it could be used ethically in the public service, as the amount of energy and water needed to maintain rapidly expanding AI servers was expected to grow exponentially.
The report recommended that governments ensure their GenAI adoption strategies align with governance arrangements; develop plans and resources to build confidence in adoption; and re-examine workforce training, development and support for GenAI-augmented policy craft at all levels, including senior executives.