SQE Independent Reviewer Annual Report 2025

Ricardo Lé, SQE Independent Reviewer

Purpose

The role of Independent Reviewer is to provide external assurance that the SQE assessments and outcomes are fair, defensible and will command public confidence.

This report comments on the progress of recommendations made in my previous report published in March 2025. Further recommendations for improvements or enhancements are made, as well as encouragement to continue good practice when it has been observed and identified. In making these, I am guided by my judgement based on what has been observed over the past two years in my role and applying my own assessment knowledge from other similar qualifications. The guiding principle is to establish that the SQE robustly assesses the competencies required of the level of a newly qualified solicitor in England and Wales, taking into account the best interests of all stakeholders, including the candidates.

Executive summary

The SQE assessments are high-stakes assessments that are complex and multi-faceted. This makes them challenging to deliver but results in an assessment that is robust and defensible, lending credibility to its outcomes.

Overall, the 2025 delivery of the SQE was good and demonstrated improvement against past annual reports, both my own last year and those of the previous post holder. This report covers assessments delivered in the 2025 calendar year – two SQE1 and four SQE2 sittings. For SQE2, October 2024 is included as the bulk of the results review and ratification process took place in 2025; October 2025 will be covered in a future report.

Overall, there were some small issues as expected with an assessment of this scale and complexity, but no significant issues that warrant a specific mention in this report.

Kaplan remains committed to continual improvement and enhancement of processes and there is no mentality of settling into a business-as-usual phase. The increase in candidate numbers drives many of these developments, with larger sittings and more assessment forms delivered in 2025 than previously.

The SRA and Kaplan teams work together to ensure openness and accountability, collaborating when issues arise to ensure optimal outcomes. Candidates, stakeholders and the public should have confidence that the SQE outcomes delivered in 2025 were fair and defensible and that there is a clear commitment to the continual enhancement of assessment design, development and delivery processes.


Evidence was gathered through a mixture of:

  • direct observation of a wide range of exam development, creation and delivery activities
  • interviews with key staff at the SRA and Kaplan
  • access to reports and information produced by the SRA and Kaplan
  • support and advice from the Independent Psychometrician
  • attendance at key meetings, including Assessment Boards and meetings that feed into Assessment Board preparations and the ongoing quality assurance of the SQE.

To provide a succinct overview of information gathered over the past year, this report is broken down into key activities which enable the delivery of the SQE exams:

  • exam creation and production
  • exam delivery and assessment
  • candidate support
  • SQE in Welsh
  • standard setting, determining the pass mark and issuing results
  • quality assurance.

Exam creation and production

The key processes for successful exam creation and production continue to be in place, as noted in previous reports. Assessment is viewed as an art: there is no such thing as a perfect assessment, because the various components of assessment utility, as defined by van der Vleuten – reliability, validity, acceptability, feasibility and educational impact – need to be balanced.

A crucial aspect of validity is that the assessments demand the knowledge and skills a day one solicitor is required to have. An appropriate mix of reliability and validity is observed across the single best answer multiple choice questions in SQE1 and the mix of written and live observed stations in SQE2. The diversity of item types is a strength of the assessment but makes it complex and technically demanding to create and deliver, especially given its high-stakes nature. I mentioned this approach to utility and the need to balance its components in my previous report, but repeat it here to highlight its importance; it also leads to several points later under assessment literacy.

Over the past year, the increased demand for seats has resulted in more assessment days within each sitting and more sittings overall. SQE2 April 2025 offered two written sittings for the first time to accommodate increased volumes. Kaplan has expanded its academic team to deal with this increased demand. While observing exam activities, I have met multiple members of this team and found staff to be knowledgeable and committed to ensuring assessments are as valid and reliable as possible.

The SRA engages subject matter experts (SMEs) who are solicitors tasked with providing assurance to the SRA that the assessment meets expected quality standards. They are trained in their roles to provide constructive feedback to the SRA, which is then shared with the academic teams at Kaplan.

There was previously a recommendation for SMEs and Kaplan's academic staff to communicate effectively around the provision and discussion of feedback, and for SMEs to be kept updated on exam build processes. Last year saw the introduction of some new SMEs, replacing predecessors whose contracts had finished. These new SMEs had a thorough onboarding to ensure an understanding of the assessment processes early in their roles, as they were quickly tasked with reviewing items. The onboarding was markedly deeper than the training previously provided, with sessions ranging from a role overview to the specifics of question writing and assessment build, reflecting the recommendation in the 2024 report. The overall process was also improved in 2025 by providing SMEs with more assessment data, such as item statistics from SQE1, to help inform their feedback. These onboarding sessions and additional data were useful, enabling the SMEs to provide review comments that aligned with the SME quality assurance role and complemented the input from Kaplan's academic team.

Overall delivery

The SQE comprises two parts:

  • SQE1 requires candidates to sit two assessments, each comprising 180 one-mark questions with no negative marking, assessing functioning legal knowledge (FLK); candidates must select the single best answer from five possible answers.
  • SQE2 requires candidates to sit four oral assessments, taken at a small number of locations across England and Wales, plus 12 written assessments taken at Pearson VUE test centres worldwide.

During the period covered by this report, there were two sittings of the SQE1 and four sittings of the SQE2 exams.

SQE1 and SQE2 written exams took place in many assessment centres in England and Wales, as well as international venues across Africa, America, Asia, the Middle East, Europe and Australia. For the vast majority of candidates, the exam delivery went without issue. A very small number of candidates experienced technical issues when using a Pearson VUE test centre.

SQE1 candidate responses are computer-marked and pass objectively through quality assurance and standard setting to arrive at the final outcome, a robust, effective and reliable method for assessing the FLK. SQE2 exams require examiners, a mix of solicitors and trained actors, to be versed in a common standard of marking and to use their professional judgement when applying the marking criteria.

With the exception of SQE2 oral exams and a small number of candidates who have arrangements in place for a particular requirement, usually to support a reasonable adjustment, all exams are delivered via computer-based testing at Pearson Professional Assessments test centres. This was effective in 2025; however, given the large number of candidates across a wide range of sites, a small number of on-the-day delivery issues were inevitable.

These issues mainly related to technical computer problems or unforeseen venue incidents, such as fire alarms. If candidates experienced an assessment delivery failure, Kaplan was usually able to reschedule their exam within the same window, so disruption in these instances was minimal.

The Mitigating Circumstances Policy includes consideration of assessment delivery failures, where delivery of the assessment has been prevented. This has proved successful at providing candidates with a clear set of options, allowing them to make more informed decisions about any rescheduling. In the majority of these cases the impact was minimal and quickly managed, so candidates could proceed with their scheduled examination, sometimes after a short delay.

Spell checker function

The absence of a spell checker function in the SQE2 written exams was highlighted by the previous Independent Reviewer and it was an area that I also commented on in 2024. Without a spell-checking function, the validity of the SQE2 exams is hindered as they do not accurately replicate the context in which a day one solicitor would operate.

To account for this in the marking process, additional guidance is given to markers to ensure candidates are not unfairly penalised for spelling errors that would have been picked up by a spell check function, and to ensure a consistent approach across markers when faced with spelling or grammatical errors.

Guidance and assurance on spelling is also given to candidates in the SQE2 assessment specification, which seems an adequate stopgap. However, the only way to eliminate the risk of crediting candidates who cannot communicate at the appropriate competency level is the provision of a spell check function. In my previous report, I recommended that urgency be placed on the development and rollout of the spell check function as an integral requirement of SQE2.

In 2025 the development and rollout of this function remained a key priority for Kaplan. Testing has now been carried out to confirm the feasibility of taking the assessment on a browser-based exam delivery platform with the option to activate spell check in the future. Evaluation of this testing will help establish how the platform performs for the SQE2 tasks. If the outcome is positive, this new platform with a spell check feature can replace the current one. At the time of writing, testing suggests that spell check could be implemented before the end of 2026.

SQE2 oral exams

SQE2 oral exams were delivered at test centres located in Cardiff, Manchester, Birmingham and three venues in London. The delivery of the assessments at these sites is fully managed by Kaplan. SQE2 oral assessments are logistically complex, requiring the assessor and candidate to be face-to-face in a space under appropriate exam conditions, with quarantining in place for the multiple sessions across the day to ensure no leakage of exam content.

Over the past year, I visited the oral exam venues in Manchester, London (Holborn) and London (Euston) and spoke to Kaplan and SRA staff to receive their feedback about other locations. Combined with my visits in 2024, I have now been to all venues and all are of an extremely high calibre.

Venues that are purpose-built for delivering in-person skills-based performance assessments meet the needs of the SQE2 oral exams, while those intended for more generic purposes undergo an extensive set-up process to ensure they are appropriate and provide a secure, efficient and professional environment for the delivery of these exams.

The exam venues are well-staffed to ensure:

  • the assessment runs smoothly
  • adequate calibration occurs at each venue
  • there is adequate calibration between the different venues running on the day.

This gives assurance that the candidate experience and the standard applied to the marking process are uniform across venues.

Marking and calibration sessions for SQE2 written exams

I observed several marking and calibration sessions for both solicitor and actor assessors for the SQE2 written assessments. These sessions are run online, which is an appropriate and efficient way to carry out this exercise, with a mix of large group sessions and breakout rooms to cover the calibration objectives. This year saw an improvement in the delivery of these sessions, with engagement from all participants thanks to smaller breakout rooms and more structure.

Reported IT issues are also very rare. I commend the improvement to the structure around these sessions and the expectations set for the facilitators running the sessions.

I note that 2026 will see the largest candidate numbers yet for SQE2. I recommend that Kaplan continue to monitor engagement with these marking and calibration sessions, to ensure that these large groups of assessors go through the calibration process with the same quality seen this reporting year.

Candidate services

Candidate services processes worked well and Kaplan has an excellent approach to obtaining candidate feedback. The team is constantly monitoring the demand for spaces and the geographical location of venues to ensure candidates get as close to their preferred location as possible.

In 2024, there was consideration of new venues for the oral assessments in the north of England beyond the Manchester venue. However, candidate demand for seats has been focused on London, which resulted in the addition of a third London venue. I am assured that this continued focus on London venues matches candidate demand.

Candidate feedback and preparation materials

There is feedback from candidates regarding preparation materials and the desire for more practice questions. This is common with candidates preparing for any high-stakes assessment.

For SQE1, there are currently 220 sample questions available. 90 of these were released prior to the first sitting, so while they were not used in a live SQE1 assessment they were written using the same process. A recent release of pre-tested questions this year brings the total of pre-tested questions available up to 130. The pre-tested questions are accompanied by difficulty data to help candidates and test preparation organisations, and to provide assurance that this sample is representative of the live assessments.

Additional practice content is also available for SQE2, with two new sample questions added in late 2025 alongside a refresh of the existing ones. These questions, together with performance indicators and guidance on how to use them, are an invaluable resource for candidates. Adding to these resources are FAQs with guidance on preparing for SQE2 and understanding the marking process, also published in late 2025.

Building on the requests for more practice questions and concerns about the quality of those developed by some test preparation organisations, Kaplan took a wider look at how to help candidates prepare more effectively for the SQE.

In July 2025, Kaplan delivered a series of workshops for organisations involved in producing candidate preparation materials. These focused on preparation of multiple choice questions (MCQs) for SQE1 to better align them with the style candidates face in SQE1 assessments. My observation of one of these workshops was that it was well-attended with excellent engagement from those present, so this should help with the alignment of future preparation materials. Feedback from attendees was also very positive.

My observations on the SQE2 oral assessments were noted above, but the candidate experience is also worth noting under this section. In the venues observed, the staff were extremely professional and polite in candidate interactions, and candidates were kept informed of the assessment processes and timings at all points throughout the day. In several candidate focus groups that I attended, candidates noted their positive interactions with staff during the oral assessments, which is to be commended.

Reasonable adjustments

Reasonable adjustments are available to candidates with a disability who request and need them. The most common adjustments are:

  • time adjustments (extra time, stop the clock)
  • low occupancy or sole use of assessment room
  • screen/text adjustments.

Kaplan treats each candidate's request individually, and further provisions beyond the above are also offered, provided adequate supporting information is available. Adjustment plans are generally well-communicated to candidates ahead of the assessment. During one of the observed candidate focus groups, a very small number of candidates reported some confusion on the day of the assessment, particularly for the SQE2 oral exams.

There have been improvements in the provision of reasonable adjustments in 2025, where assessments have been delivered using a range of assistive technologies including Braille display and text to speech software. There was a published change to the reasonable adjustment process in June 2025 to make it more streamlined. This change removed the need for separate applications for SQE1 and SQE2, while allowing the application status and details of the agreed adjustments to be viewed in an improved candidate account. These improvements are reflected in the substantial increase in satisfaction in the candidate survey, with responses to the statement 'it was a simple process to request a reasonable adjustment' improving from the 2023/24 to 2024/25 windows.

Mitigating circumstances and appeals

Claims can be submitted by candidates who believe they have encountered a material disadvantage while taking an exam. The majority of those who made a claim did so under 2.1(a) citing a 'mistake or irregularity in the administration or conduct of the assessment'.

I observed part of a mitigating circumstances panel, which involves the thorough consideration of each claim. These panels are scheduled across multiple consecutive days to ensure adequate time to:

  • discuss each case
  • align it with guiding principles
  • reference past decisions where available to ensure that consistency is maintained over time.

The outcome from each discussion is a recommendation to accept or reject the claim that is then considered by the Assessment Board.

Candidates can appeal a decision of the Assessment Board on the grounds of either:

  • there are mitigating circumstances which could not have been put before the Assessment Board before it made its decision; or
  • the decision of the Assessment Board, or the manner in which that decision was reached involved material irregularity and/or was manifestly unreasonable and/or irrational; or
  • the candidate disputes the Assessment Board's finding of malpractice or improper conduct.

Over the past year, across SQE1 (January 2025 and July 2025) and SQE2 sittings (October 2024, January 2025, April 2025 and July 2025) there were 323 First Stage Appeals submitted and reported to the Assessment Board.

Where appeals were upheld, the most common reason related to late mitigating circumstances submissions that could not have been put before the Assessment Board. From the information reported, the process and policy were appropriately followed and cases were given full consideration.

SQE in Welsh

Previous Independent Reviewer reports have thoroughly covered the pilot undertaken and the further processes to prepare SQE1 and SQE2 for delivery in Welsh. I did not directly observe any of these, but I understand the assessment is ready to deliver should a candidate request it. To date, there have been several expressions of interest in taking the SQE in Welsh, but in each case the assessment was ultimately taken in English.

Prospective SQE candidates fed back that there are currently no training providers offering full SQE preparation in Welsh, a key contributing factor in why no one has sat SQE1 or SQE2 in Welsh to date. It is hoped that there may be some demand for Welsh delivery in the near future.

I did not directly engage in any observations or discussions about Welsh exam preparedness in 2025. During my observation of the SQE2 oral exams in Cardiff in 2024, all candidates took the sitting in English, but there was evidence that the Welsh provision is ready. Candidates can opt to sit a Welsh version in two SQE1 sittings and two SQE2 sittings per year.

Standard setting, determining the pass mark and issuing results

Decisions on where to set the pass marks are clearly outlined in processes and policies, which are based on well-established standard setting techniques widely used in other high-stakes professional qualifications. The processes are supported by robust analyses of the psychometric data and comprehensive reports to support the Assessment Board in determining the pass marks. The outcomes appear to be fair and defensible.
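This report does not specify which standard setting technique is used, but as a hedged illustration, one of the most widely used methods in high-stakes professional exams is the (modified) Angoff procedure. The sketch below uses entirely invented ratings and item counts, purely to show the mechanics:

```python
# Hypothetical sketch of an Angoff-style standard set (illustrative only --
# the ratings below are invented and do not reflect actual SQE data).
# Each judge estimates, per item, the probability that a just-competent
# "day one" candidate would answer correctly; the recommended pass mark
# is the judges' averaged expected score.
judges = [
    [0.60, 0.70, 0.50, 0.80],   # judge 1's per-item estimates
    [0.50, 0.75, 0.55, 0.70],   # judge 2
    [0.65, 0.70, 0.60, 0.75],   # judge 3
]

per_judge_cutoff = [sum(ratings) for ratings in judges]  # expected raw score per judge
pass_mark = sum(per_judge_cutoff) / len(judges)          # averaged across judges

print(round(pass_mark, 2))            # 2.6 marks out of 4 items
print(round(100 * pass_mark / 4, 1))  # 65.0 (% cut score)
```

In practice a recommendation like this would be only one input to the Assessment Board, alongside the psychometric analyses described above.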

Scaled scores for SQE2 were introduced from January 2025. By using a common scale, scores can be accurately and fairly compared between test takers, both within an assessment window and between windows over time. The change does not affect the threshold standard: the expected level for a solicitor on 'day one' of practice remains constant under this change in the presentation of scores, so candidate outcomes are unaffected.

I observe the Assessment Board and have sight of the reports produced by Kaplan to support its preparation. These reports contain extensive psychometric data providing item and station level analysis in addition to measures of overall test performance.

The Kaplan academic team reviews the item level data to highlight any items that may need further scrutiny due to their performance. Options for intervention are considered in meetings ahead of the Assessment Board, where the technical data is scrutinised and further information or analyses can be requested ahead of the formal Board meeting to help the Board make the most robust decisions possible. I am satisfied with the processes and conduct of the Assessment Board.

Quality assurance

During 2024/25, Kaplan demonstrated comprehensive quality assurance procedures and took every opportunity to make continual enhancements to processes. Effective checks continue to be in place and feedback from all key stakeholders drives improvements.

Towards the end of 2024, a new process was introduced consisting of meetings dedicated to exploring issues that are not tied to any single SQE sitting and its associated results (the remit of the previously mentioned Assessment Board). This process has continued throughout 2025 and highlights the dedication to ongoing quality assurance of the assessment over time. It allows time, outside the pressured results-determination process, to consider overarching issues and to develop the assessment. This ensures that future developments are proactive and considered in light of assessment best practice and educational theory, rather than reactive responses to issues arising from individual sittings. It also provides an opportunity to explore data and trends across time, which is important as this data continues to build, with the 8th delivery of SQE1 (July) and 13th delivery of SQE2 (October) occurring in 2025.

A key quality assurance recommendation from my 2024 report was the development of a mechanism for analysing assessor behaviour in SQE2 and providing assessor feedback. There are several processes in place within each assessment window to identify any irregularities or problems with the marking process. However, a wider look at assessor performance over time would be useful to the Kaplan team and to individual assessors, particularly now that the 13th delivery has occurred and some assessors have been involved in many of these deliveries.

The development and dissemination of assessor performance reports is in progress at the time of writing. I had the opportunity to feed back on the initial report templates and they have now been produced for all assessors who have participated in a minimum number of assessments.

The psychometrics team will lead on the distribution of this information to ensure understanding of the content and what it means. The team has set out a timetable for future reporting cycles, ensuring this activity continues while allowing enough time between reports for more assessment opportunities and sufficient data to build up. I commend the approach and the speed at which this has been actioned. I recommend that next steps include gathering formal feedback on these reports, both from Kaplan staff leading assessor teams and from individual assessors themselves, as it would be valuable to analyse the impact on assessor behaviour and performance.

In the Exam Creation and Production section above, I referenced the concept of assessment utility and how the various components need to be balanced in order to design and deliver an effective assessment. I believe the SQE has this balance right. I feel there is scope to increase the assessment literacy of various stakeholders, particularly candidates, as feedback sometimes highlights a lack of understanding of key assessment processes.

For example, there has been some criticism of the format of SQE1 and of the lack of a clear link to professional practice. In response, information was added to the SQE website this year outlining why single best answer (SBA) multiple choice questions (MCQs) are used and why they are fair. It is recognised that solicitors are unlikely to be presented with a list of five options whenever a decision needs to be made; however, the format is appropriate as a measure of functioning legal knowledge, as SBA MCQs test the application of knowledge across a wide range of areas. Similarly, for SQE2, the website contains clear objectives for each station, providing the rationale for its design. I would encourage the continued emphasis on assessment literacy for stakeholders, particularly candidates, as an understanding of the objectives and rationale for the assessment design can help with preparation.

The SQE is a unique assessment in the field of solicitor qualification and the wider legal education sphere, so there is not always precedent when difficult decisions need to be made or new processes developed. However, Kaplan is open to using external expertise, and I have witnessed this at several points throughout the year. I continue to encourage external input from fields that run similar assessments, such as medical education or the wider education academic field. This highlights a culture that constantly looks to improve and to remain at the forefront of professional exams.

This report highlights areas of good practice and recommendations for future enhancements, including:

  • continue to place urgency on the rollout of the spell check function as an integral requirement of SQE2
  • continue to monitor the quality and engagement of participants in online marking and calibration sessions in 2026, ensuring large sessions maintain the same calibre observed in 2025
  • monitor the impact of assessor performance reports
  • emphasise the importance of assessment literacy for external stakeholders, particularly candidates, highlighting the extensive information that the SQE website contains about the assessment design and rationale.