Computers using artificial intelligence (AI) could be given separate legal personalities, enabling them to own property, a Supreme Court justice has suggested.
Lord Hodge said although the idea “may sound far-fetched”, there was “no reason in principle why the law cannot create such personality”.
Lord Hodge said the separate legal personality of a ‘one-person’ company had been recognised in English law since 1897, and more recently it had recognised the separate legal personality in Indian law of a “ruined temple which was little more than a pile of stones”.
Delivering the first Edinburgh FinTech Law Lecture, the Supreme Court justice said giving computers a separate legal personality was “one option” for tackling the questions raised by AI in fintech.
He went on: “It would be possible for the machine as a separate legal person to own intellectual property and in turn to be owned by a financial institution.
“That institution’s licence or the general regulatory law could impose on the firm responsibility for any malfunction, if, for example, it had been involved in the design of the algorithm.
“The law could confer separate legal personality on the machine by registration and require it or its owner to have compulsory insurance to cover its liability to third parties in delict (tort) or restitution.
“And as a registered person the machine could own the intellectual property which it created.”
Lord Hodge, a Scottish lawyer, said it was “not practicable” for the common law to evolve through case law to create a “suitable legal regime” for fintech, and new legislation would be needed.
He welcomed the setting up by the Lord Chief Justice, Lord Burnett, of an advisory group on AI as a means of “alerting the judiciary and the court system to the opportunities and challenges” it posed.
“But a larger-scale collaboration involving the executive branch of government, focusing on AI and fintech and aiming to produce facilitating legislation is probably needed if the UK is to facilitate the development of fintech without harming the integrity of the markets or financial consumers.”
Lord Hodge said property law would need to adapt to cope with “assets which are the product of fintech”, whether they were digital currencies or intermediated securities.
“Another matter which needs to be addressed is whether the AI involved in fintech should give rise to intellectual property which the law should recognise. If machines act autonomously to create new contracts, should there be copyright, and who should own it?
“Similar questions arise in relation to patents if such machines invent things which have industrial application.
“In relation to copyright, UK law treats as the author of a computer-generated work the person by whom the arrangements necessary for the creation of the work are undertaken.
“This approach appears to have considerable potential to create disputes, particularly if a machine is involved in the arrangements.”
Lord Hodge said the law of tort (delict in Scotland) would need to be revised to attribute liability for harm caused by machines using AI, just as the use of driverless cars raised liability issues.
“Similar but more difficult questions will arise in relation to attribution of liability and causation in the context of transactions performed by fintech.
“Is liability for harm caused by the decisions of the machine to be imposed on the producer of the machine on a product liability model? Or should the owner or organisation which operates the machine be answerable for such harm?
“More fundamentally, in a market system in which it is entirely legal to impose economic harm on a competitor if one trades within the boundaries of the law, how do you define what is an economic wrong resulting from the autonomous acts of machines?”
On contract law, Lord Hodge said that although smart contracts could not be “unscrambled” in the same way as traditional contracts, “much greater problems” could arise if computers used machine learning to “optimise” transactions.
“If businesses were to use computers with machine-learning capability to deal with other computers with similar ability, they could autonomously generate transactions which would not fit easily into our contract law.
“If a financial institution could walk away from a machine-created transaction, that might create chaos in the commercial world.”
Lord Hodge concluded: “Data is power, and AI is power. Can the law cope? My answer is yes, but it will require legislation. There also needs to be innovative regulation.”
Meanwhile, the Information Commissioner’s Office is calling for contributions to its work on creating the first auditing framework for AI.
Simon McDougall, the office’s executive director for technology policy and innovation, said in a blog: “The framework will give us a solid methodology to audit AI applications and ensure they are transparent and fair, and to ensure that the necessary measures to assess and manage data protection risks arising from them are in place.”
He said a formal consultation paper was likely to be issued next January, with the final framework and associated guidance published by spring 2020.