In June 2025 Deloitte exposed the innards of the Federal trough into which the consulting firms had inserted their snouts.

A report prepared for the Department of Employment and Workplace Relations was published on the department's website before anyone recognised that it was littered with errors: references to papers that could not be found, made-up quotes, and misinterpretations of fact. These were on top of errors of grammar, syntax, and simple spelling.

Deloitte was rightly forced to repay part of the $440,000 contract.

Why not all of it?

For this to happen at a top-of-the-tree consulting firm, compounded by the inattention of the department, certainly means the rest of us should be critically aware of AI's potential to hallucinate.

LLMs function by predicting the next word (token) that best fits the sentence or paragraph. They try to please, so when they identify a gap for which there is little or misleading information, they can simply make things up, and make them up in a way that looks completely authentic.
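The mechanics can be caricatured in a few lines. This is a toy sketch, not a real language model: the probabilities and candidate words are invented for illustration. The point is that the model emits whatever scores highest as a continuation, with no check on whether the result is true.

```python
# Toy sketch of next-token prediction (invented probabilities, not a real LLM).
# The model scores candidate continuations and emits the most likely one,
# whether or not the resulting sentence is factually accurate.
next_token_probs = {
    "study": 0.41,    # plausible-sounding, so it wins
    "report": 0.33,
    "guess": 0.02,
}

def pick_next(probs: dict) -> str:
    # Greedy decoding: choose the highest-probability candidate.
    # Fluency is what is being optimised here, not truth.
    return max(probs, key=probs.get)

print(pick_next(next_token_probs))  # -> study
```

A confident-sounding word beats an honest "I don't know" every time, which is exactly how a fabricated citation ends up reading like a real one.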

It's like a student trying to cover for what they do not know by sounding authoritative.

The only way to prevent this is to have your own robust verification protocols that require human review of every AI-generated citation, claim, or statistic somewhere in your workflow. This is now dramatically easier than it was a few months ago, as a URL can accompany each citation. You can also limit the sources from which information is pulled to specific, high-quality domains.
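A first-pass filter along those lines can be sketched as follows. The allow-list and function name here are hypothetical examples, not any particular tool's API; the check only confirms a citation's URL sits on an approved domain and flags everything else for a human.

```python
from urllib.parse import urlparse

# Hypothetical allow-list: restrict citations to domains you trust.
ALLOWED_DOMAINS = {"doi.org", "abs.gov.au", "dewr.gov.au"}

def citation_ok(url: str) -> bool:
    """Return True only if the citation URL is on an approved domain.

    This is a coarse filter: a passing URL still needs a human to
    confirm the page actually says what the AI claims it says.
    """
    host = urlparse(url).netloc.lower()
    # Accept an exact match or a subdomain of an allowed domain.
    return any(host == d or host.endswith("." + d) for d in ALLOWED_DOMAINS)

print(citation_ok("https://doi.org/10.1000/example"))  # -> True
print(citation_ok("https://plausible-journal.example/paper"))  # -> False
```

Anything that fails the filter goes straight to review; anything that passes still gets read. The filter replaces none of the human checking, it just concentrates it.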

At its core, AI is just like a very smart but inexperienced new intern. Trust its output only after critical human review. Wise humans are increasingly necessary in the new world of generative AI. As the sophistication of the AI tools increases, so does the potential for creating credibility out of nothing.

Somebody should have told Deloitte, but it does not really matter to them. The stuff-up was a tiny component of the millions in contracts they extract from government.

A junior probably got fired as a token of remorse.