Law school sets minimum “secure” assessments to combat GenAI


UCL: Call for legal educators to come together

At least half of the assessments in the law degree at University College London (UCL) must now be in a form that can “reliably safeguard against the use” of generative artificial intelligence (GenAI), the law school has decided.

Academics in the law faculty urged the legal education sector to “come together to respond to the future”, rather than “passively responding” to the business models of AI and cloud companies.

They said: “There are voices arguing that heavy use of AI in, for example, coursework, is acceptable—these tools will be available in the workplace, so why should students not have access to them in a university? Perhaps they can be used in assessment, as long as they are acknowledged? We disagree.

“Students at UCL Laws are assessed on underlying abilities and skills rather than a capacity for ‘content creation’. This is part of what distinguishes tasks in higher education from tasks in workplaces.

“Materials produced in a workplace may look similar to those produced in a law school (e.g. a brief, a presentation, or a piece of legal analysis) but, in an educational context, the skills demonstrated in the process are more crucial than the output.”

While lawyers did need to be able to produce content, they went on, this was “just one skill amongst many”.

“It is a grave mistake to think that content creation proxies for the many foundational understandings and skills that legal education should be trying to impart.”

The academics set out their case for how the legal education sector should respond to GenAI in a discussion paper, Artificial Intelligence, Education and Assessment at UCL Laws: Current Thinking and Next Steps for the UK Legal Education Sector.

They explained that UCL Laws now had a policy that between 50% and 100% of every taught module, both undergraduate and postgraduate, was assessed “securely” – that is, in a way that guaranteed AI did not substitute for the skills or knowledge acquisition being evaluated.

“Secure assessments include in-person examinations (which also have advantages of fairness and reducing the mental stress of alternative assessment formats over more extended, overlapping time periods) and other in-person assessments, including oral assessments.

“In addition, for the remaining non-secure types of assessment, such as coursework, the use of AI that can create or alter content is prohibited by default unless explicitly authorised by the module convenor for a valid pedagogical reason.”

The paper said there was a “critical balance” between using AI as a true study aid, facilitating meaningful learning, and using it to “cognitively offload tasks in a way that hinders learning”.

“We believe this balance to be delicate, and that academic judgement is crucial to achieving it.”

The academics said they recognised “the value of diverse forms of assessment”, but “in assessing our degrees, integrity takes priority and should not be put at risk”.

They went on: “AI is an opportunity, but it is not an opportunity to abandon foundational legal education because Microsoft has decided to embed a chatbot into their ubiquitous office software suite.

“As a law school, we have shifted significantly to secured forms of assessment, but in many respects, this is also a return to recent practice following the huge shift to coursework necessitated by the coronavirus pandemic.”

The academics said law schools had to approach AI “critically” to avoid being part of the ongoing hype cycle.

Universities could not “fully escape the political economy of software platforms” and had to be pragmatic, they said.

“Yet, we also need to be mindful, and wary not to become an instrumental piece of a larger business model designed less for education than for ensuring a steady pipeline of future customers.

“There is a serious risk that if we go passively down this road, private firms, rather than our teachers and scholarly communities, will lead in determining pedagogy and practice.”

The academics added: “Universities must steer, and if necessary, themselves create, the technology they need for their missions. Working together makes this viable.”



