Can AI Hallucinate Tax Advice?
Tax research has always required a verified answer, not just a plausible one. For accountants reaching for AI tax tools to support compliance work, that distinction has direct professional consequences.
General-purpose AI tools can generate tax references that appear credible but do not exist. That can include references to rulings, legislative sections, ATO guidance or cases. The output may read confidently, but confidence is not the same as verification.
For accountants, the issue is not whether AI can be useful. It can be. The issue is whether the answer can be traced back to source material the practitioner can open, read and verify before relying on it.
What AI Hallucination Means in Plain Terms
Large language models, the technology behind tools like ChatGPT, generate responses by predicting likely text based on patterns in their training data. They do not automatically check each answer against a verified database before responding.
In many use cases, that limitation is inconvenient. In tax research, it can become a professional problem.
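A toy sketch makes the mechanism concrete. This is not a real language model and not any vendor's code; it is a few lines of Python showing the core idea: the system continues text with the statistically likeliest next word from its training data, and at no point does it look anything up in a verified database.

```python
from collections import Counter, defaultdict

# Toy illustration only: a tiny "model" that continues text with the
# most frequent next word seen in its training text. It has no database
# of rulings to check against -- it only extends patterns.
training_text = (
    "the ruling confirms the deduction "
    "the ruling confirms the position "
    "the ruling supports the deduction"
).split()

# Count how often each word follows each word.
follows = defaultdict(Counter)
for prev, nxt in zip(training_text, training_text[1:]):
    follows[prev][nxt] += 1

def continue_text(prompt_word, length=4):
    out = [prompt_word]
    for _ in range(length):
        nxt_counts = follows.get(out[-1])
        if not nxt_counts:
            break
        # Pick the statistically likeliest continuation -- fluent, not verified.
        out.append(nxt_counts.most_common(1)[0][0])
    return " ".join(out)

print(continue_text("the"))  # -> "the ruling confirms the ruling"
```

The output reads fluently, yet nothing guaranteed that any "ruling" it names exists. Real models are vastly more sophisticated, but the verification gap is the same in kind.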
Australian courts have already dealt with the risk of AI-generated false citations in professional proceedings. In JML Rose Pty Ltd v Jorgensen (No 3) [2025] FCA 976, the Federal Court considered submissions containing AI-generated authorities and quotations that did not exist. Wheatley J described the references as fabricated and misleading, and adopted a redaction approach so false citations were not further propagated.
That case was about legal submissions, not tax advice. But the underlying issue matters for accountants: a general-purpose AI tool can produce an answer that sounds authoritative without giving the practitioner a reliable path back to the source.
If an accountant asks a general-purpose AI tool about an ATO ruling, a trust distribution question or a Division 7A issue, the question is not just "does the answer sound right?" The question is "where is the source, and can I verify it?"
Why the Professional Stakes Are Higher for Accountants
Tim Sandow, President of The Tax Institute, told Accounting Times that tax agents remain responsible for work produced with AI and must be careful about maintaining client confidentiality when using AI tools.
That is the practical point. AI does not remove the professional obligation sitting with the accountant or tax agent. It changes the workflow, but not the responsibility.
Australian tax practitioners must still meet their obligations under the Tax Agent Services Act 2009, including the Code of Professional Conduct. The Tax Practitioners Board's guidance on reasonable care also makes clear that practitioners must take reasonable care to ensure taxation laws are applied correctly to a client's circumstances.
For professional accountants, APES 110 Code of Ethics for Professional Accountants also remains relevant. The principles of professional competence and due care, confidentiality, integrity and professional behaviour do not disappear because a tool produced the first draft of an answer.
AI-generated output without verifiable source attribution is not enough to support professional tax advice. At best, it is a starting point for further research. At worst, it can introduce errors into work that should have been checked against authoritative source material.
Shadow AI Has Been in Accounting Firms Longer Than Most Partners Realise
Inaccurate output is not the only risk. The other is unmonitored use.
John Munnelly, KPMG Australia's Chief Digital Officer, told The Register that KPMG blocked ChatGPT after discovering an AI tool had located thousands of employee credit card numbers on internal servers. For a large firm with mature governance processes, that was a clear warning about how quickly AI use can create exposure when it is not properly controlled.
Most accounting firms will not discover shadow AI in such a dramatic way. The more common version is quieter: a graduate uses a free AI tool to summarise a client document, a manager asks a public chatbot to draft a response to a technical question, or a staff member uses an online tool to reword notes from a client file.
That behaviour is usually not malicious; staff are simply trying to work faster. But from a firm governance perspective, good intentions do not reduce the exposure.
CPA Australia's recorded webinar, "Can you use AI and comply with your ethical obligations?", frames AI use as an ethics issue for accountants and asks practitioners to consider how AI ethics interact with their professional responsibilities before using AI in business.
Unmonitored AI use can create undocumented processes, unclear data handling and weak quality controls. That matters for firms maintaining a quality management system under APES 320 Quality Management for Firms, which requires firms to establish policies and procedures for quality management across their work.
Data exposure and unsupported citations are different issues, but they come from the same underlying problem: using tools that were not built around the professional accountability standards accounting firms need to meet.
What Purpose-Built AI Tax Software Does Differently
What separates professional-grade AI tax software from general-purpose AI comes down to one thing: how the tool handles source material.
SavvyWise's tax research workflow is built around source-backed answers. It draws on a curated knowledge base of authoritative Australian tax sources and surfaces citations as part of the response, so accountants can open the source and verify it before relying on the answer.
That source link is the difference between a useful research workflow and a plausible answer that still needs to be checked from scratch.
For Australian tax research, relevant source material may include ATO rulings, legislative instruments, court decisions and expert commentary from practitioners, including Adrian Cartland of Cartland Law. When a tax question requires interpretation, the value is not simply that the tool gives an answer quickly. The value is that the answer points back to the underlying material a practitioner can assess.
This matters because accounting firms do not just need faster answers. They need answers that fit the way tax work is reviewed, documented and defended.
Pro Research with full citation visualisation reflects that design principle. Citations are not a footnote. They are part of the workflow. The accountant can see the source, open the source and decide whether the source supports the answer before using it in client work.
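As a hypothetical sketch (not SavvyWise's actual data model, and with invented names and a placeholder URL), the design principle can be expressed as a simple rule: an answer is usable in client work only when every citation has been opened and verified by the practitioner.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only -- illustrative names, not a vendor API.
@dataclass
class Citation:
    source_title: str       # e.g. an ATO ruling or legislative section
    url: str                # openable link back to the source material
    verified: bool = False  # set by the practitioner after reading it

@dataclass
class ResearchAnswer:
    question: str
    answer: str
    citations: list = field(default_factory=list)

    def ready_for_client_work(self):
        # No citations, or any unverified citation, blocks reliance.
        return bool(self.citations) and all(c.verified for c in self.citations)

answer = ResearchAnswer(
    question="Does Division 7A apply to this loan?",
    answer="Likely yes, subject to the minimum repayment rules.",
    citations=[Citation("ITAA 1936, Division 7A", "https://example.org/div7a")],
)
print(answer.ready_for_client_work())  # False: source not yet opened and checked
answer.citations[0].verified = True
print(answer.ready_for_client_work())  # True: every citation verified
```

The point of the sketch is that verification is a gate in the workflow, not an afterthought: an answer with no openable sources, or with sources nobody has read, never reaches client work.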
What to Look for in an AI Tax Tool
Before your firm adopts AI tax software for research, there are a few practical questions worth asking.
- Does the tool use authoritative Australian tax sources?
- Does the answer include direct, openable citations to the source material?
- Can the practitioner verify the source before relying on the output?
- Does the tool protect confidential client data by design?
- Does the vendor clearly explain whether firm inputs are used to train models?
- Is the tool built with accounting and tax workflows in mind?
- Does it support the firm's review, documentation and quality management obligations?
If a vendor cannot answer these questions clearly, the tool may still be useful for general drafting or productivity. That does not make it suitable for professional tax research.
Frequently Asked Questions
Can AI tools hallucinate in tax research?
Yes. General-purpose AI tools can generate references that appear authoritative but do not exist. This happens because they generate likely responses rather than verifying every claim against a confirmed database. In tax research, that is a serious limitation because the practitioner needs to verify the source before relying on the answer.
What makes AI tax software suitable for professional use?
AI tax software should be built around authoritative source material, clear citations and data protection. For Australian accountants, that means source-backed answers using Australian tax materials such as ATO guidance, legislation, court decisions and expert commentary. The practitioner should be able to open the source and confirm the answer before using it in client work.
What professional obligations apply to AI use by accountants?
Australian tax practitioners must comply with the Tax Agent Services Act 2009 and the Code of Professional Conduct. Professional accountants should also consider APES 110 and APES 320, particularly around confidentiality, professional competence and due care, and quality management. AI can support the workflow, but it does not remove the practitioner's responsibility for the advice given.
Is ChatGPT suitable for tax advice?
ChatGPT and other general-purpose AI tools can be useful for drafting, summarising or brainstorming. They should not be treated as a verified tax research source unless the answer is checked against authoritative material. For tax advice, the source matters as much as the wording of the answer.
The Distinction That Protects a Firm's Professional Standing
The documented cases of AI hallucination in professional settings are not arguments against AI in accounting. They are arguments for being precise about what kind of AI is being used, what source material it relies on and what controls sit around it.
General-purpose AI tools were built to be broadly useful. For many tasks, that is enough. For tax research, where every answer needs to be traceable to source material that actually exists, it is not.
Firms that want to use AI for tax research should start with three requirements: verified source material, visible citations and clear data protection. Those requirements do not slow the firm down. They are what make AI usable in a professional tax environment.
SavvyWise is available to trial for firms that want to see what source-backed AI tax research looks like in practice.