Consulting firm Deloitte is in the doghouse after being exposed for using AI tools to prepare a report for the Australian government.
The firm will begrudgingly hand back part of its $440,000 fee after the report was found to contain numerous AI-generated errors, including references to non-existent court rulings and fictitious academic papers.
Depressingly, the report was on the use of IT to run welfare compliance systems, which is particularly telling given Australian governments still struggle to accept the consequences of the Robodebt scandal.
The fact the government seems quite relaxed about this, and isn't even going to press for the full contract amount to be returned, let alone sanction Deloitte in any meaningful way, shouldn't fill the Australian citizenry with any optimism that the Robodebt lessons have been learned when it comes to using AI or any other technology.
What is particularly dispiriting about the Deloitte mini-scandal is that, until recently, this sort of work was done by newly recruited graduates, or even interns.
Those graduates or interns would have had the importance of citing verifiable references drilled into them during their years at university.
Even then, that work would have been checked before being approved and sent to the client.
Now it appears the major consulting companies can't even be bothered checking their work, so confident are they that Australian governments will continue to use their services rather than rely on better-value local providers or – god forbid – internal advice from their own departments.
It’s long been a truism that governments bring in external consultants to tell them what they want to hear, so in many ways the new age of AI-slop is the perfect tool for the job.
Although it would be nice if they could at least be bothered checking their own work.