Deepfake evidence – are you ready?


Guest post by Sinead O’Callaghan, managing partner at City firm Cooke Young & Keidan

O’Callaghan: Deepfake material could turn a case on its head

As litigators, we are familiar with evaluating and challenging the integrity of evidence presented in the courtroom. However, the growing prevalence of deepfake material, and public awareness of it, means lawyers will soon face challenges like never before: the mere existence of deepfakes can call into question the authenticity of evidence presented.

This was particularly notable in a recent Elon Musk court case concerning Tesla’s self-driving feature. After a man died whilst using this feature, his family brought a claim and cited a YouTube video of Mr Musk claiming the vehicles “at this point can drive autonomously with greater safety than a person”.

However, at trial, Mr Musk’s legal team disputed the evidence, claiming that “like many public figures, [Mr Musk] is the subject of many ‘deepfake’ videos and audio recordings that purport to show him saying and doing things he never actually said or did”.

In this case, Judge Evette Pennypacker wrote in her ruling: “Mr Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do. The court is unwilling to set such a precedent by condoning Tesla’s approach here.”

However, it’s easy to see how deepfake material has the potential to turn a court case on its head. Manipulated images or audio might be accepted as genuine evidence, undermining the quest for truth and justice; alternatively, the opposing side might claim that genuine evidence is in fact a deepfake, casting doubt on claims and disrupting, or at the very least slowing down, the case as lawyers seek to prove their evidence is genuine.

Writing in a 2018 paper, Deep Fakes: A Looming Challenge for Privacy, Democracy and National Security, US law professors Robert Chesney and Danielle Citron provided what is now a clearly prescient analysis of how deepfake technology can tap into and exacerbate existing problems: “the marketplace of ideas already suffers from truth decay as our networked information environment interacts in toxic ways with our cognitive biases.”

We have all seen how easy it can be nowadays to discredit genuine news by simply calling it fake.

Deepfakes are certainly an issue lawyers need to be alive to in all manner of cases: they could be coming up against very convincing but fake evidence, or their own material could be questioned.

In 2019, for example, it was reported that a Dubai man’s voice was ‘heavily doctored’ in a child custody case in an attempt to portray him as a violent man making threats against his wife.

We’re also increasingly seeing deepfakes used in fraud, whether audio doctored to convince people to transfer cash or digital likenesses, built from stolen images and personal information, that can bypass security checks.

Change is afoot to regulate this area, although it remains to be seen how any regulation will work and be policed. In November, Rishi Sunak will host a global conference to discuss the future of artificial intelligence and its regulation.

It seems that deepfakes, and how they should be labelled and identified, will form an important part of this conversation – the prime minister is reportedly considering plans to require any material created by artificial intelligence to be labelled or watermarked.

It feels like we’re on the cusp of change – not many lawyers will have come across deepfakes just yet, but many may well do so soon.

It remains to be seen how the law will need to transform to keep pace with this technological change, but it seems reasonable to expect that, in some cases, lawyers will have to spend more on support from digital forensics specialists to prove (or disprove) the authenticity of important evidence.



