SRA response to the SQE1 pilot

31 July 2019

SQE1 is the first stage of the SQE assessment and is mainly focused on the application of a candidate's functioning legal knowledge. We ran the SQE1 pilot with more than 300 candidates in March 2019, aiming to test whether our proposed assessment works well. We wanted to know, for instance, whether it is fair, reliable and appropriately robust.

Our response is informed by two reports: one from our assessment provider, Kaplan, on the results of the pilot, and one from our independent reviewer, who provided external scrutiny of the pilot and of whether it achieved its purpose.

The SQE and its purpose

High standards in the legal profession are crucial. Our role is to make sure that both the public and employers can trust that anyone entering the solicitors' profession is competent and fit to practise.

In April 2017, following 18 months of extensive consultation, our Board agreed to change the way solicitors qualify by introducing a common assessment - the SQE.

The SQE will:

  • be taken by anyone wishing to qualify as a solicitor
  • be delivered by an independent assessment organisation
  • assess the application of a candidate's functioning legal knowledge (SQE1) and legal skills (SQE2)
  • make sure that those who pass it have met the consistent, high standards we expect from a solicitor.

We agreed to work with experts and key stakeholders to develop the SQE assessments in an open and transparent way, ahead of their introduction in autumn 2021.

It is vital that we develop an assessment that:

  • properly and consistently tests whether someone meets the high standards we expect
  • is fair, so that those who pass deserve to pass and those who fail deserve to fail
  • is cost-effective and manageable
  • does not place any unjustifiable requirements on candidates
  • has been designed with input from our stakeholders.

The SQE1 pilot is a key part of developing and testing the assessment to make sure the SQE is credible and fit for purpose.

Progress on developing the assessment

  • We appointed Kaplan as the independent assessment organisation in August 2018 following a one-year procurement process.
  • Since then we have worked closely with Kaplan - and a range of stakeholders - to refine the design and content of the SQE assessments. We have involved hundreds of stakeholders in the development of the SQE. This includes:
    • discussing the SQE1 pilot plans with the SQE Reference Group - a group of academics, training providers, firms and representative groups
    • feedback from education and training providers through direct engagement, meetings and events, as well as through our dedicated LinkedIn group, which is open to all stakeholders interested in the development of the SQE and currently has 1,147 members, including training providers and law firms
    • meeting with a group of subject specialists, including those from law firms, universities and training providers, to refine the Functioning Legal Knowledge (FLK) in the Assessment Specification.

Between August 2018 and April 2019 we contributed to 56 meetings or speaking events, attended by around 1,950 people. Kaplan have also conducted user research to find out what stakeholders want from the SQE website.

Ahead of running the SQE1 pilot, we appointed an independent reviewer, following an open recruitment process. The independent reviewer provides external scrutiny of the piloting of the SQE and, in due course, the running of the live assessment by Kaplan and the SRA.

Summary of our response to the pilot

We are pleased with the results of the pilot as:

  • both Kaplan and the independent reviewer confirmed that the pilot was a useful and valid exercise that achieved our aims
  • it showed it is possible to design a Functioning Legal Knowledge (FLK) assessment that is robust and manageable
  • most pilot candidates gave positive feedback
  • the operational aspects of the pilot went well.

There are two key areas where Kaplan, supported by the independent reviewer, have made recommendations for changes:

  • Kaplan have advised that amending the FLK assessment design from three 120-question assessments to two 180-question assessments will improve the reliability and accuracy of the assessment and make the SQE more robust. Good levels of reliability and accuracy mean we can be confident about pass/fail decisions. They are critical in a national licensing exam where consumers must be protected. So we have decided to accept Kaplan's recommendation.
  • Kaplan report that the results from the pilot do not give a sound basis for proceeding with the proposed assessment of skills in SQE1. They recommend removing the skills assessment from SQE1. We have decided to take some time to discuss this finding with stakeholders and to explore whether there are any other suitable ways to assess skills in SQE1. We will use the SQE2 pilot to help inform our thinking on this question, if necessary.

These decisions have also been informed by feedback from our SQE Reference Group. We have provided further details on these decisions at the end of this page.

Next steps

  • SQE1 pilot: We will continue to discuss the findings from the pilot with stakeholders. In particular, we will seek views on the final draft of the SQE1 Assessment Specification and on the inclusion of skills in SQE1.
  • SQE1 Assessment Specification: We plan to publish a final version of the SQE1 Assessment Specification later in the year. We will also be publishing some sample FLK questions this year. If we have not made a final decision on SQE1 skills by then, we will publish the final FLK section of the Assessment Specification so that universities and other training providers have the detail they need to plan their SQE1 training.
  • Other recommendations from the pilot: We will work with Kaplan and the independent reviewer to implement the operational recommendations from the pilot.
  • Quality assurance: We will continue to develop and document our quality assurance processes ready for the live assessments.
  • Timing of the live assessments: The first SQE1 live assessments are due to take place in 2021, with SQE2 assessments following in 2022. We will seek stakeholder views on the exact timing of the live SQE assessments in the autumn, with a view to publishing the timings for the first live assessments by the end of the year.
  • SQE2 pilot: We will continue to work with Kaplan on the design of the SQE2 pilot. Over the summer we will ask stakeholders for views on the assessment objectives for SQE2 in the Assessment Specification. We will open applications for the SQE2 pilot in August and the pilot will take place in December. With 14 skills tasks in SQE2 we do not anticipate the same issues that we encountered in the SQE1 skills pilot; however, we will be looking at this closely.
  • SQE implementation: Our Board will consider the findings from the SQE2 pilot and make a final decision on go-live for the SQE in summer 2020. After that we will apply to the Legal Services Board for approval of the regulatory arrangements needed to introduce the SQE.
  • Diversity: In light of the findings on the skills assessment, including the performance of candidates from protected groups, we are looking again at the place of the SQE1 skills assessments. We will also investigate attainment by candidates with protected characteristics in the SQE2 pilot. And we will make sure our quality assurance processes for the FLK scrutinise questions during writing and editing, and after use, to check for gender and ethnicity effects and to monitor performance by protected characteristic.

Further details of the pilot

What went well

  • Kaplan reported that it is possible to design and deliver an FLK assessment that meets our objectives.
  • The independent reviewer confirmed that the pilot was successful and achieved its purpose.
  • The 316 candidates who completed the assessment were broadly representative of those who would sit SQE1, both in terms of prior education and demographic characteristics.
  • 82% of candidates who responded to the post-pilot survey agreed/strongly agreed (63%) or were neutral (19%) that the FLK questions were clear. And 71%* agreed/strongly agreed (56%) or were neutral (16%) that the FLK questions covered appropriate knowledge.
  • Kaplan concluded that the operational aspects of the pilot went well. It is possible to run a computer-based SQE1 assessment both in the UK and abroad at Pearson VUE test centres. 89% of candidates who responded to the post-pilot survey agreed/strongly agreed (72%) or were neutral (17%) that the pre-exam information in the joining instructions was helpful. 85% agreed/strongly agreed (73%) or were neutral (12%) that the instructions provided on the day of the exams were clear.

*Because figures have been rounded, total figures may appear not to add up

Lessons learnt

  • Kaplan will review communication with candidates, particularly for those candidates with reasonable adjustments, to make sure there is clarity about all aspects of the exam process.
  • The independent reviewer made a number of recommendations for continuing to review and document question writing, quality assurance and standard-setting procedures for the live assessments.

Equality, diversity and inclusion

  • The performance of candidates with protected characteristics was monitored. In addition, Kaplan conducted exploratory analyses to give an indication of the best predictors of candidate performance. They cautioned that despite having a diverse spread of candidates, there are limitations to drawing conclusions from the results. Reasons for this included the small sample size, overlapping variables (eg completion of a GDL and ethnicity) and the fact that behaviour will be different in a pilot when compared to a live examination.
  • With those caveats, key things to note include:
    • The most significant predictors of FLK performance were completion of a GDL and completion of a law degree at a Russell Group university.
    • Male candidates performed marginally better in the FLK assessment, although there was little difference in the skills assessment. Overall Kaplan concluded that gender was of limited significance in determining performance.
    • White candidates generally performed better than Black, Asian and minority ethnic candidates in both the FLK and, particularly, the skills assessment. However, analysis suggested that completion of a GDL and a law degree at a Russell Group university were much more significant sources of score variance than ethnicity on the FLK. In the skills assessment, white candidates performed better even taking into account FLK scores. It also appeared that ethnicity was the most significant predictor of scores in the skills assessment.

Both Kaplan and the SRA are committed to making sure their processes are robust in scrutinising questions and assessment procedures so that they are fair for all candidates.

As Universities UK have recently pointed out in their report 'Closing the Gap', an attainment gap by ethnicity is reported in most universities, so the FLK results are, sadly, not surprising.

In light of the pilot findings, we are looking again at the place of SQE1 skills assessments. We will also investigate attainment by protected characteristics in the SQE2 pilot. And we will be reporting on performance by protected characteristic when the SQE assessments go live. The SQE, through changing to a consistent, single standard, will enable us to better understand how different groups perform.

More widely, our aim is for the SQE changes to help enable talented people from all backgrounds to become solicitors. This includes making sure there are more routes to qualification and more opportunities to earn as you learn.

The pilot - candidate results

The purpose of the pilot was to test assessment design and processes, not candidate performance. It was therefore neither appropriate nor necessary to set a pass mark.

Pilot candidates performed worse than would be expected in a live assessment. Scores on the pilot FLK assessment ranged from 17.5% to 85%, with an average mark of 50%. Marks for the skills assessment ranged from 8% to 100%.

The pass mark for the SQE will vary between exams, to make sure that the standard of the assessment remains consistent from one sitting to the next. However, on these pilot questions the pass mark would likely have been above 50%. It can be expected that the performance of candidates will improve significantly for a live licensing assessment as opposed to a pilot. In the live assessment, candidates will be more motivated and training aligned to the SQE will be available.

We will send candidates their full marks for the pilot assessments together with an indication of their performance as compared to the cohort. And we will publish an indicative range of the possible pass marks for the sample questions we publish.

Further detail – our response on the design of the FLK assessment

We have considered the recommendation that we amend the design of SQE1 to two 180-question assessments rather than three 120-question assessments. We are aware that some stakeholders have previously expressed concerns that having question papers covering a number of legal topics traditionally assessed separately means that some people could fail one part of the assessment but still pass overall. There is also a view that if topics were assessed separately, they would be covered in more depth.

We shared the findings from the pilot with our Reference Group. There were no strong feelings among the Group about the proposal to amend the FLK design. Kaplan and the independent reviewer advised that the evidence from the pilot showed that candidates would find it very difficult to pass without good functioning legal knowledge across all areas. Generally, where a candidate does well in an assessment, they do well across all subjects, not just in some.

Reliability is critical in a national licensing exam where consumers must be protected. Kaplan have advised that amending the FLK design will improve the reliability and accuracy of the assessment and make the SQE more robust. So we have decided to accept Kaplan's recommendation.
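The report does not say how Kaplan quantified the reliability gain from moving to two longer papers, but a standard psychometric way to reason about test length and reliability is the Spearman-Brown prophecy formula. The sketch below is illustrative only: the starting reliability of 0.80 and the helper name spearman_brown are assumptions for the example, not figures from the pilot.

```python
def spearman_brown(reliability: float, length_ratio: float) -> float:
    """Predict test reliability after scaling the number of questions
    by length_ratio, using the Spearman-Brown prophecy formula."""
    return (length_ratio * reliability) / (1 + (length_ratio - 1) * reliability)

# Illustrative (assumed) figures: if a 120-question paper had a
# reliability of 0.80, a 180-question paper (length ratio 1.5)
# would be predicted to reach roughly 0.86.
print(round(spearman_brown(0.80, 1.5), 3))
```

On this kind of model, lengthening each paper always raises predicted reliability, which is consistent with the advice that fewer, longer assessments support more dependable pass/fail decisions.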

Further detail - our response on the SQE1 skills assessment

Kaplan report that the results from the pilot do not give a sound basis for proceeding with an assessment of skills in SQE1. Their analysis showed that the assessment did not reach the levels of accuracy or reliability considered appropriate for a national licensing exam. This means that we could not be confident that it would assess candidates to a consistent standard at each exam sitting, or that it would accurately identify which candidates should pass or fail.

They also found that white candidates were likely to perform better in the skills assessment than candidates from other ethnic backgrounds, even taking into account their performance on the FLK. And they reported difficulties with determining the standard for the assessment because it is set at a different, lower standard than the rest of the SQE.

We were already aware that there were some challenges with the inclusion of a skills assessment in SQE1 since it does not assess candidates to the same standard as the other SQE assessments (ie the standard required for a 'day one solicitor'). Our original design for SQE1 did not include a skills assessment.

We decided to pilot a skills assessment in SQE1 in response to feedback from some stakeholders during consultation. They told us that an SQE1 assessment which relied purely on single best answer questions would not be robust and would lack credibility. Firms also felt strongly that trainees need to have the right knowledge and skills to contribute to the business during their qualifying work experience.

We shared the findings from the pilot with members of our reference group. The group felt that there is still a need for a preliminary assessment of writing and research skills in SQE1.

In view of the strength of feeling among stakeholders, we have decided to take some time to discuss this finding further with stakeholders and to explore whether there are any other suitable ways to assess skills in SQE1. We will use the SQE2 pilot to help inform our thinking on this question, if necessary.