
Holman: Companies looking for AI audit services
Law firms are in the race to offer companies artificial intelligence (AI) assurance services, according to a specialist AI and data lawyer.
The Financial Times reported this month that the Big Four accountants were working to create a new type of audit that verifies the effectiveness of AI tools.
Matt Holman, a partner at Kent and London firm Cripps, told Legal Futures that law firms – including his – were also exploring AI audit capabilities.
“Like most legal audits, they focus on the things that fit within the purview and safety of regulated law firms,” he said.
“That means, in practice, that law firms see whether the AI system is being used in a way that complies with the legal frameworks in which the client company operates, focusing often on retrospective analysis.
“It is very likely that in years to come, more law firms will explore providing this service.
“By 2030 it is increasingly likely that law firms will begin to offer more technical AI audits, such as analysis of whether the AI system is functioning properly and in accordance with the service level agreements and contract terms provided by the suppliers.”
Mr Holman – one of only 20 solicitors to hold accredited IT lawyer status with the Society for Computers and Law – explained that there was “increasing desire” amongst large corporations for AI audits.
For those organisations caught by the EU AI Act, this was “more than just a desire to do the right thing”, he went on.
The Act requires organisations to have risk management systems, monitoring of accuracy and robustness of AI systems and quality management systems regarding use of high-risk AI systems. There are also duties to ensure human oversight of key decision making by high-risk AI systems. The majority of these obligations fall onto AI providers – such as large tech corporations developing AI – although basic risk management duties also fall to the organisations using them.
“Aside from the legal requirements, there are many good practice and governance frameworks that mandate audit and oversight of AI systems,” Mr Holman went on.
“The Department for Science, Innovation and Technology published its draft AI Management Essentials (AIME) framework last year which, amongst other things, recommended keeping internal records for audit of performance of AI systems.”
Existing UK laws, such as the GDPR, require record keeping, documentation and assessment of data use, which is critical to the use of AI, he added.
“Whether by direct law such as the EU AI Act or indirect law such as the GDPR, or good practice recommendations such as AIME, businesses need to be aware of the increasingly burdensome requirement to carry out and monitor risk assessments.”
But Mr Holman predicted that AI audit services from professional firms were not likely to extend to “lifting the bonnet” on the inner workings of generative AI programs to make judgements about underlying mathematical accuracy, coding or system execution.
“This is, in part, because generative AI does not operate like standard computer programs and so it is not always possible to know how or why it responds in certain ways to certain prompts. Indeed, this is often known as the ‘black box problem’, which even generative AI developers are currently unable to crack.
“Instead, professional services auditing is likely to focus on the organisation’s procedures, methods and deployment of AI to determine whether it is using AI in a safe and legal manner.”