
McNall: Appropriate to say what he had done
A judge has used artificial intelligence (AI) to summarise documents and help him produce his ruling.
Judge Christopher McNall in the tax chamber of the First-tier Tribunal (FTT) said he had satisfied himself that the summaries were accurate and that he had not used AI for legal research.
“This decision has my name at the end. I am the decision-maker, and I am responsible for this material.
“The judgement applied – in the sense of the evaluative faculty, weighing-up the arguments, and framing the terms of the order – has been entirely mine.”
Judge McNall, ruling on an application by claimants for disclosure of documents from HMRC, said: “I have used AI in the production of this decision. This application is well-suited to this approach.
“It is a discrete case-management matter, dealt with on the papers, and without a hearing. The parties’ respective positions on the issue which I must decide are contained entirely in their written submissions and the other materials placed before me.
“I have not heard any evidence; nor am I called upon to make any decision as to the honesty or credibility of any party.”
Judge McNall, a part-time judge, is a barrister specialising in agricultural law as well as tax, based at St John Street Chambers in Manchester.
Delivering judgment in VP Evans (as executrix of HB Evans deceased) and others v HMRC [2025] UKFTT 1112 (TC), he referred to the practice direction on reasons for decisions, released in June 2024, in which the then Senior President of Tribunals, Sir Keith Lindblom, said judges should make “full use [of] any tools and techniques that are available to assist in the swift production of decisions”.
Judge McNall said: “I regard AI as such a tool, and this is the first decision in which I have grasped the nettle of using it.
“Although judges are not generally obliged to describe the research or preparatory work which may have been done in order to produce a judgment, it seems to me appropriate, in this case, for me to say what I have done.”
In April this year, the senior judiciary published updated guidance for judicial office-holders on the use of AI, which emphasised that “any use of AI by or on behalf of the judiciary must be consistent with the judiciary’s overarching obligation to protect the integrity of the administration of justice”.
Judge McNall said the guidance “mandated the use of a private AI tool”, Microsoft Copilot Chat, available to judicial office-holders through the eJudiciary platform.
“As long as judicial office holders are logged into their eJudiciary accounts, the data they enter into Copilot remains secure and private. Unlike other large language models, it is not made public.
“Principally, I have used AI to summarise the documents, but I have satisfied myself that the summaries – treated only as a first-draft – are accurate. I have not used the AI for legal research.”