Law firm plays down BBC story on restricting AI use

Commercial law firm Hill Dickinson has played down a BBC report that it has blocked general access to several artificial intelligence (AI) tools after it found a “significant increase in usage” by staff.

The BBC said it had seen an email from the firm’s chief technology officer which warned staff about the use of AI tools and said it was only granting access to AI tools via a request process.

It quoted an Information Commissioner’s Office spokesperson saying that firms should not discourage the use of AI in work.

The email said: “We have been monitoring usage of AI tools, particularly publicly available generative AI solutions, and have noticed a significant increase in usage of, and uploading of files to, such tools.”

Hill Dickinson had detected more than 32,000 hits to ChatGPT over a seven-day period in January and February, and more than 3,000 visits to the Chinese AI service DeepSeek, as well as almost 50,000 hits to Grammarly, the writing assistance tool.

However, it is understood that these figures represented individual prompts rather than separate visits, with multiple prompts likely to have been entered in a single session.

In a statement, the 200-partner practice said: “Like many law firms, we are aiming to positively embrace the use of AI tools to enhance our capabilities while always ensuring safe and proper use by our people and for our clients. AI can have many benefits for how we work, but we are mindful of the risks it carries and must ensure there is human oversight throughout.

“Last week, we sent an update to our colleagues regarding our AI policy, which was launched in September 2024.

“This policy does not discourage the use of AI but simply ensures that our colleagues use such tools safely and responsibly – including having an approved case for using AI platforms, prohibiting the uploading of client information and validating the accuracy of responses provided by large language models.

“We are confident that, in line with this policy and the additional training and tools we are providing around AI, its usage will remain safe, secure and effective.”

It is not believed that any client or internal files were uploaded during the period that was monitored.

The firm has since received and approved staff requests for use of AI tools.
