Deloitte Faces Scrutiny Over AI-Generated Citations in $1.6 Million Canadian Healthcare Report
A major report commissioned by the Newfoundland and Labrador government and produced by global consulting firm Deloitte has come under fire after journalists and researchers discovered that it contains a series of fake academic citations, likely generated by artificial intelligence.
The 526-page Health Human Resources Plan, released in May 2025, was intended to guide the province’s strategy for addressing persistent shortages of nurses and physicians. The report cost the provincial government approximately $1.6 million CAD ($1.13 million USD), making it one of the most expensive government-commissioned studies in recent memory.
Errors in the report were first flagged by The Independent, a Canadian news outlet, which found at least four citations referencing academic papers that do not exist. In some cases, real researchers were credited with studies they never authored, or listed as co-authors of papers on which they never collaborated. The report also cited a publication from the Canadian Journal of Respiratory Therapy that cannot be found in the journal's database.
Deloitte Canada has responded by stating that while AI was not used to write the report, it was used to assist in generating a limited number of research citations. The company says it is currently reviewing and revising the report to correct these errors, but maintains that the findings and recommendations remain valid.
“Deloitte Canada firmly stands behind the recommendations put forward in our report,” a spokesperson told Fortune. “We are revising the report to make a small number of citation corrections, which do not impact the report findings. AI was not used to write the report; it was selectively used to support a small number of research citations.”
The controversy comes on the heels of a similar incident involving Deloitte’s Australian branch. In July 2025, an Australian government report on welfare compliance, also produced by Deloitte, was found to contain fabricated academic references and even a fictitious quote from a federal judge. The Australian government received a partial refund after the errors were discovered.
Despite the similarities, no information has been released regarding whether the Newfoundland and Labrador government will seek a refund or compensation for the errors in the Canadian report. The Health Human Resources Plan remains posted on the provincial government’s website, with no disclosure about the use of AI in its production.
Experts and policymakers have expressed concern about the implications of using AI in government-funded research. “To whatever extent AI was used in both reports, it undermines confidence in the reports and in the decisions that will come,” said a provincial official, referencing the broader trend of AI-generated errors in high-stakes policy documents.
The incident has sparked a wider debate about the need for transparency and robust quality control when AI tools are used in research and policy development. Deloitte itself has previously published guidance on the responsible use of generative AI, emphasizing the importance of governance and upskilling to ensure that technology is used ethically and effectively.
As governments increasingly turn to AI to streamline research and reporting, the Deloitte case serves as a cautionary tale about the risks of relying on automated tools without rigorous oversight. The credibility of policy recommendations may hinge on the integrity of the evidence behind them, and for now the spotlight remains firmly on how AI is being used in the public sector.