The Senior Accounting Officer (SAO) regime has always put personal accountability at the centre of large company tax compliance. Since its introduction in 2009, Finance Directors and Tax Directors at qualifying companies have been required to certify that their organisation has appropriate tax accounting arrangements in place — and to take personal responsibility if it does not.
In January 2026, HMRC's new AI governance standards significantly extended that accountability. For the first time, the use of AI in tax compliance processes is explicitly in scope.
What the SAO regime requires
For UK companies with turnover above £200m or a balance sheet total above £2bn, the SAO must:
- Take reasonable steps to ensure the company has appropriate tax accounting arrangements
- Certify annually that those arrangements are adequate, or disclose where they are not
- Accept personal liability for a £5,000 penalty if the certification is incorrect
"Appropriate tax accounting arrangements" are defined as those that allow tax liabilities to be calculated accurately. HMRC's guidance has always made clear that this includes the systems, processes, and controls that underpin tax compliance — not just the final figures.
The January 2026 standards extend this definition explicitly to include AI-assisted processes.
What the January 2026 AI standards require
HMRC's updated SAO guidance requires that, where AI tools are used in tax compliance processes, the SAO must be satisfied that:
The AI tool is fit for purpose. This means understanding what the tool does, what data it uses, and what its known limitations are. Using an AI tool without understanding its accuracy characteristics is not consistent with appropriate tax accounting arrangements.
Human review is applied to AI outputs. AI-generated tax analysis, disclosures, or calculations must be reviewed by a qualified person before being relied upon. The guidance specifically states that AI outputs cannot substitute for professional judgment without appropriate review.
A governance trail exists. The company must be able to demonstrate, if asked by HMRC, what AI tools were used in preparing any specific return or disclosure, what human review was applied, and how errors would have been identified.
The AI tool's use is disclosed where required. In certain circumstances — particularly in the context of uncertain tax treatment (UTT) notifications — the use of AI in identifying or assessing positions may need to be disclosed.
Where the SAO risk actually lies
The practical SAO risk is not primarily about using AI tools. HMRC is not discouraging AI use — they use it extensively themselves. The risk is in using AI tools without the governance infrastructure to ensure they are used appropriately.
The most common failure modes to watch for:
AI tools used informally without policy. Individual tax team members using AI tools for research or drafting, with no organisational awareness and no governance trail. If a disclosure is based partly on AI-assisted analysis with no documented review process, the SAO has a problem.
AI-generated outputs not reviewed adequately. Treating an AI-drafted disclosure as final without substantive professional review. The speed advantage of AI is real — but it must not come at the cost of the review step.
No documentation of limitations. Using an AI tool with a training cutoff without checking that its knowledge is current. A disclosure based on outdated legislative analysis is a compliance failure, regardless of whether a human or an AI drafted it.
No assessment of data quality. The AI's output is only as good as the data it's given. A company with unreliable source data feeding an AI analysis tool does not have "appropriate tax accounting arrangements" — it has the appearance of a systematic process producing unreliable outputs.
Practical steps for SAO compliance
Step 1: Identify where AI is being used. Survey your tax function — formally or informally — to understand which AI tools are in use, for what tasks, and by whom. You may be surprised.
Step 2: Establish a simple AI use policy. The policy does not need to be complex. It needs to cover: which AI tools are approved, what tasks they can be used for, what human review is required, and what data must not be entered into them.
Step 3: Create a governance trail. For significant compliance outputs — tax returns, UTT notifications, transfer pricing documentation — note where AI assistance was used and what review was applied. This does not need to be elaborate: a brief note in the working paper file is sufficient.
Step 4: Review your existing tools against the new standards. If your tax function has implemented any AI tools in recent years — document processing, contract review, workpaper analysis — review them against the January 2026 standards to confirm they meet the governance requirements.
Step 5: Include AI governance in your SAO certification process. The annual SAO certification should now explicitly consider AI governance. If you cannot answer "yes" to the governance requirements above, the certification should reflect that.
The broader point
The January 2026 AI governance standards are not a burden on tax functions that are already operating well. They are a formalisation of what good governance always required: that you understand your tools, review their outputs, and can explain your processes.
For tax directors who have been thoughtful about AI adoption, the new standards should require minimal additional work. For those who have allowed AI use to develop organically without governance, 2026 is the moment to catch up.
The SAO regime has always been about ensuring personal accountability creates an incentive for good process. The extension to AI governance is consistent with that logic.
Tax Haus publishes practical analysis on AI and UK corporate tax. If you'd like to discuss AI governance for your tax function, get in touch.