Lawyers should have to take online tests every 10 years to prove that they remain competent in their specialist fields, Sarah Chambers, chair of the Legal Services Consumer Panel (LSCP), said yesterday.
Meanwhile, new research by the Legal Services Board (LSB) has revealed strong public backing for a tougher approach to continuing competence across the profession.
Ms Chambers said it was “really hard” for consumers to assess whether lawyers were doing a good job.
“They have to assume that someone more senior is more skilled at their job than someone junior. They have no way of knowing.”
She said the current system, where lawyers “tick a box” to say they have reflected on their learning, “just does not cut it”.
Ms Chambers went on: “Lawyers tell me they simply cut and paste what they put on the form last year, perhaps changing a word or two. They do not pay any attention to this supposed scheme of self-assessment.”
Speaking at a Westminster Legal Policy Forum virtual seminar on legal education and training, she asked what was wrong with lawyers being subjected to a test of their technical knowledge every 10 years, which could be done online.
“If they don’t pass the first time, they should take it again.” If they still did not pass, Ms Chambers went on, “their accreditation should be taken away”.
She added that regulated lawyers could use taking the tests as a way of distinguishing themselves from unregulated law firms.
Chris Nichols, the director of policy and regulation at the LSB, said a survey of just over 1,000 members of the public, to be published in full next month, showed that 79% thought there should be more specific rules for checking the competence of lawyers throughout their careers.
Another large majority, 87%, thought legal regulators should do more to reduce the risk of incompetent lawyers, while 95% thought lawyers should have to demonstrate that they were competent throughout their career.
Mr Nichols said more detailed, qualitative research with a panel of 23 members of the public produced similar results.
There was “unanimous agreement” that there should be “more specific rules” for checking continuing competence, that existing arrangements left room for lack of competence, and that there were “too many gaps” in the system.
The LSB issued a call for evidence on continuing competence last autumn, and in a report on responses in February said it would be going ahead with action to ensure regulators introduced checks.
Mr Nichols said legal regulators tended to rely on continuing professional development (CPD) and there was a “real mismatch” between the public’s expectations and reality.
He said that while the call for evidence found particular competence concerns in the areas of immigration and criminal advocacy, it concerned him that “we cannot say with any certainty” what the level of competence was across the profession. “The status quo is not serving the public interest.”
Mark Neale, director general of the Bar Standards Board (BSB), said he disagreed with Mr Nichols that the legal profession’s “main assurance” on competence came from CPD.
He said CPD gave “some assurance” of the competence of barristers, but a “great deal” came from the competitive nature of the market and the scrutiny of instructing solicitors and clients.
“I would advocate a more targeted approach, and not the universal revalidation approach adopted by public services.”
This included ensuring “better flows of intelligence where problems exist”, an area in which the BSB was working very closely with the judiciary and the Solicitors Regulation Authority (SRA).
It also involved developing separate competencies for barristers in the youth courts and the coroners’ courts, and “looking afresh at the competence requirements” for new barristers.
Julie Brannan, director of education and training at the SRA, said the issues around ongoing competence varied from sector to sector and there were “different challenges in different areas”.
She said that the SRA believed that solicitors were “overwhelmingly” doing their training, though more work was needed to improve the quality of “reflection and reporting”.
Ms Brannan said it was much easier to “pile on lots of training” than to measure the impact, and attention should be targeted at “high risk areas”.