SQE Quality Assurance Report 2024-25

Introduction

This is the fourth quality assurance report since the introduction of the Solicitors Qualifying Examination (SQE) in autumn 2021. Here we explain how we quality assured the SQE1 and SQE2 assessments delivered between October 2024 and July 2025.

The previous SQE Quality Assurance report was published in March 2025.

Kaplan is the sole assessment provider for the SQE. It publishes a statistical report for each assessment delivery providing information on the assessment, including the pass marks and pass rates. The relevant reports for this period are:

Kaplan has also published an annual report which provides a cumulative picture of the outcomes from SQE assessments delivered between October 2024 and July 2025.

The SQE is also the end point assessment (EPA) for solicitor apprentices. We were appointed by Skills England as the external quality assurance organisation for the EPA. Our quality assurance is in line with Skills England's principles.

Our quality assurance priority is to make sure that standards for entry into the profession are at the right level and consistently applied, and that the SQE is up to date and fit for purpose. This provides confidence in the SQE as a professional, high-stakes assessment and in the competence of those who have passed.

We have a quality assurance framework to make sure that effective controls are in place to manage the risks to the quality and standard of the SQE. Following the framework enables us to gather evidence that the SQE is a fair and robust assessment of day one solicitor competence. We do this through:

  • regular meetings with Kaplan, and with the Independent Psychometrician and the SQE Independent Reviewer
  • Kaplan and us fulfilling our contractual obligations
  • systematic monitoring, evaluation and analysis of assessment data
  • checking compliance with agreed policies and procedures
  • observations of live assessments and assessor and marking standardisation meetings
  • ensuring compliance with the SQE Assessment Regulations.

We engaged with three ‘Subject Matter Experts’ (SMEs) during the reporting period, one of whom was appointed in September 2024 for a 3-year term. Two SMEs had been in post since 2021, and their term came to an end in August 2025. Further SMEs were appointed in August 2025.

The SMEs provide expert, objective and independent judgment of the assessments and contribute to the quality assurance of the SQE. They all have experience in practice as a solicitor and in higher legal education, and bring a breadth and depth of knowledge and experience in the areas covered by the SQE's functioning legal knowledge and professional skills assessments.

In the past year, SMEs have reviewed a sample of questions for both SQE1 and SQE2 assessments, observed live assessments and attended and observed markers’ meetings and the standardisation and calibration of assessors.

The SMEs were provided with training prior to taking up their role. This covered:

  • their role as an SME
  • an overview of the quality assurance framework
  • SQE1 question writing and assessment build
  • SQE2 question writing and assessment build
  • SQE assessment methodology.

This ensured a good understanding of the SQE assessment arrangements and processes before they undertook any assessment reviews.

The SQE is subject to the oversight of an Independent Reviewer who, during the year, has:

  • observed live assessment deliveries
  • observed assessment standardisation and markers' calibration meetings
  • attended Assessment Board meetings
  • observed Mitigating Circumstances panels
  • observed candidate focus groups
  • interviewed members of the Kaplan management team
  • met with us
  • observed workshops facilitated by Kaplan
  • considered reports and information produced by Kaplan and the SRA.

The Independent Reviewer's latest report covers the performance of the SQE processes and outcomes delivered in 2025.

He states that: 'The SRA and Kaplan teams work together to ensure openness and accountability, collaborating when issues arise to ensure optimal outcomes. Candidates, stakeholders and the public should have confidence that the SQE outcomes delivered in 2025 were fair and defensible, and that there is a clear commitment to the continual enhancement of assessment design, development and delivery processes.'

An Independent Psychometrician continues to provide expert guidance on the psychometric analyses conducted on each SQE sitting. This includes:

  • regular meetings with Kaplan’s psychometricians
  • conducting checks for potential bias and question analysis
  • identifying trends over time and checking that the interpretation and reporting of these analyses are appropriate.

The Independent Psychometrician provides a written report after each assessment and has attended Assessment Board meetings as a member. She has also held regular meetings with us and provided guidance and assurance regarding how standards are set for both SQE1 and SQE2. We liaise with Kaplan’s academic heads to ensure that any recommendations made in her reports to us are followed up and actioned.

Candidates are asked to complete a feedback questionnaire after each delivery. This is carried out by Kaplan, and the findings are shared with the SRA.

Candidates' overall satisfaction levels for SQE1 ranged between 50% and 55%. This is a slight decrease from 2023-24 (51% in January 2024 and 60% in July 2024). However, this is not reflective of candidate satisfaction with particular aspects of the SQE, on which the scores are generally much higher. The overall score might reflect candidates' wider feelings about the experience of preparing for and taking a challenging, high-stakes assessment rather than how it was delivered. For example, when asked about the administration on the day of the assessment, 84% of candidates were satisfied or very satisfied.

There were also improvements in scores for the booking process and requesting reasonable adjustments for the reporting period compared to the previous year. When asked about each aspect of the assessment, satisfaction levels were all above 62%.

For SQE2, candidates' overall satisfaction levels ranged between 46% and 57%. This is comparable with the overall satisfaction for 2023-24 (43%-59%). However, when asked about specific aspects of the SQE, the percentage of candidates who indicated they were very satisfied or satisfied was much higher. Satisfaction with the administration on the day (oral and written assessments) and the instructions provided to candidates in relation to assessment tasks ranged between 82% and 86%. There was an increase in satisfaction with the booking process, scoring 67% positive compared with 53% positive in the previous year.

When asked about each aspect of the assessment, most satisfaction levels were above 70%.

Further, in relation to both SQE1 and SQE2, when asked about each aspect of the assessment, all satisfaction levels were higher than the overall satisfaction level.

Candidate feedback for those with reasonable adjustments continues to be below the overall cohort's satisfaction levels, indicating that the journey for these candidates requires improvement. However, scores given by reasonable adjustments candidates increased in several areas, indicating greater satisfaction amongst these candidates. These include:

  • the process for requesting adjustments (increase of 2.5 percentage points across the reporting period compared with the previous year)
  • information provided regarding requesting reasonable adjustments (increase of 9%)
  • reasonable adjustments received matching their plan (increase of 7.25%).

From the written comments provided by candidates, key themes remain in relation to:

  • the booking process
  • the assessment specification not being detailed enough
  • the sample questions provided on the website not being representative of the actual assessment
  • waiting times on the day for oral assessments
  • issues with computers (slow and ageing hardware).

Actions taken to address these issues are summarised below.

Candidates were also invited to attend focus groups after each SQE1 delivery, and after the April and October SQE2 assessments. These are run by Kaplan and usually observed by the SRA and/or the Independent Reviewer and are divided into two sessions: those with reasonable adjustments, and those without. Apprentices are also encouraged to attend the relevant focus group.

The focus groups allow more qualitative data to be collected. Similar questions are asked in each focus group, with candidates in the reasonable adjustments session also given the opportunity to feed back on reasonable adjustments should they want to. During the reporting period, the main themes discussed for improving the candidate experience included the booking experience and requests for more candidate information regarding the exam content.

The SQE Assessment Regulations set out the SQE Assessment Board's responsibilities. These include meeting after each delivery to approve the pass mark and make other checks on the reliability and fairness of the assessment. The Assessment Board also considers and approves recommendations referred to them by the Mitigating Circumstances Panels held after each SQE assessment delivery.

In keeping with the SQE Assessment Regulations, the Board is chaired by the SRA's Chief Executive Officer or their nominee. The CEO has delegated this function to the Executive Director of Strategy, Innovation and External Affairs at the SRA. The Board's members include senior personnel from the SRA and Kaplan, and the Independent Psychometrician. The Independent Reviewer attends as an observer.

In reaching its decisions, the Assessment Board receives:

  • a report on delivery and any adverse events
  • a report on any allegations of malpractice and improper conduct
  • minutes of the meeting and recommendations of the Mitigating Circumstances Panel
  • a statistical and qualitative report containing information on test quality, the profile of the cohort, assessment performance (validity and standard), provisional pass rates and demographic group performance.

We are confident that the evidence we have obtained through the above provides the following assurance:

  • the assessments were valid: they tested the competences expected of a newly qualified, day one solicitor to the correct standard and they were set in realistic contexts
  • each assessment was constructed according to the assessment blueprint and reflected the SQE assessment specification
  • the psychometric analyses provided evidence that the assessments were reliable: they measured candidates' performance consistently
  • appropriate methods for setting the pass mark for this high stakes professional exam were applied
  • the assessments were fair and free from bias
  • the assessments were secure
  • risk was appropriately identified and managed
  • there is a commitment to continuous improvement, and mechanisms are in place to learn from any assessment-related issues, including delivery failures, and to reduce or eliminate the risk that they are repeated.

Evidence that we have collected which supports the above assurances is listed in Annex 1.

In our previous reports, we identified some areas of SQE delivery which required improvement. The actions taken in those priority areas were in relation to:

  • requesting reasonable adjustments
  • information for candidates and training providers
  • greater transparency surrounding the SQE assessments
  • SQE website navigation
  • provision of a spell check function for SQE2 written assessments.

Action in these priority areas has continued in the period 2024/25:

  • Changes were made to the reasonable adjustments process in June 2025. A new reasonable adjustment application form was introduced allowing candidates to request reasonable adjustments for both SQE1 and SQE2, removing the need to make separate applications for each assessment window. This was introduced to reduce the number of applications and to provide peace of mind for candidates regarding their reasonable adjustment plans across both assessments.
  • Kaplan added a further 50 new sample SQE1 questions to the website, taking the total number of sample questions to 220. The questions were published in response to candidates' requests for more assessment materials which reflect the quality and content of the actual assessment and enable sufficient preparation for SQE1. The newly published questions had previously been used in SQE1 assessments. Performance data was also published for the 130 sample questions that have been used in SQE1 assessments, detailing the percentage of candidates who answered each question correctly.
  • Kaplan published more information and guidance on SQE1 single best answer multiple choice questions. This aims to provide greater clarity as to what these questions are and why they are appropriate for use in SQE1 assessments.
  • Further SQE2 sample question materials were added to the website, specifically sample written assessments for Wills and Probate (case and matter analysis) and Business Organisations (legal writing).
  • Changes were made to the structure and navigation of the SQE website to help candidates find and access information about the SQE. The changes were designed to make it easier for candidates to find what they need, with clearer signposting to key topics and more accessible, easy-to-read content.
  • Progress was made by Kaplan and Pearson VUE towards the introduction of spell check functionality for SQE2.

Before we introduced the SQE we commissioned the University of Exeter to research the potential causes of different levels of attainment for ethnic groups in professional assessments. We committed to a set of actions, in line with the findings. During the reporting period, we have continued to address these actions:

  • Kaplan published an analysis of candidate characteristics and factors that influence scores in the SQE. This highlighted the correlation between success and prior educational experience and attainment.
  • As part of developing targeted support to familiarise candidates with SQE assessments, Kaplan delivered workshops to training providers on writing single best answer questions, mirroring the SQE1 format. A video of the workshop content was published in September 2025.
  • In August we surveyed past SQE1 candidates to better understand how they prepared for SQE1, so that more targeted support can be offered in the future. 

In October 2025, Kaplan published a report on apprentice performance in the SQE. The report provided analyses of the cohort characteristics and performance of solicitor apprentice candidates in all SQE assessments to date.

The report stated that most apprentices sitting SQE1 and SQE2 have been graduate apprentices. However, analysis showed that there have been no overall significant differences in performance between graduate and non-graduate apprentices to date.

In summary, there are some differences in the characteristics and performance of apprentice candidates and non-apprentice candidates. These differences are:

  • apprentice candidates are more likely than non-apprentice candidates to come from lower socio-economic groups, be of white ethnicity and female
  • apprentices generally achieve higher scores and higher pass rates than non-apprentice candidates in SQE1, and particularly so in SQE2.

Assurance on valid assessments

The assessments are valid

Evidence:

  • sample of SQE1 questions reviewed by SMEs and SRA
  • sample of SQE2 assessments reviewed by SMEs and SRA
  • observation by SMEs and SRA at SQE2 oral assessments
  • composition of assessment checked by SRA
  • Independent Reviewer's report.

Assurance on weightings for blueprint and assessment specifications

Each assessment has been constructed according to the weightings within the assessment blueprints for SQE1 and SQE2, and reflects the assessment specifications.

Evidence:

  • sample of SQE1 questions reviewed by SMEs and SRA
  • sample of SQE2 assessments reviewed by SMEs and SRA
  • report from Kaplan's Head of Quality Assurance on each assessment confirming that all processes relating to question writing and assessment build have been followed
  • composition of assessments checked by SRA.

Assurance on reliability

The assessments are reliable.

Evidence:

  • Cronbach's alpha has been greater than 0.8 in all SQE2 assessments and greater than 0.9 in all SQE1 assessments in this period. Cronbach's alpha is a measure of test reliability; the gold standard for high-stakes assessments is 0.8
  • Independent Psychometrician checks.
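For illustration, Cronbach's alpha is calculated from the variance of candidates' scores on each question (item) relative to the variance of their total scores. The sketch below uses made-up scores and is not Kaplan's data or implementation; it simply shows how the reliability statistic cited above is defined.

```python
from statistics import variance

def cronbachs_alpha(item_scores):
    """Cronbach's alpha for a set of assessment items.

    item_scores[i][j] is candidate j's score on item i.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    where k is the number of items.
    """
    k = len(item_scores)
    item_vars = [variance(item) for item in item_scores]
    # Each candidate's total score across all items
    totals = [sum(candidate) for candidate in zip(*item_scores)]
    return k / (k - 1) * (1 - sum(item_vars) / variance(totals))

# Hypothetical marks for 4 items across 6 candidates
items = [
    [1, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 1, 1],
    [1, 1, 1, 0, 1, 0],
]
alpha = cronbachs_alpha(items)  # approximately 0.79 for this toy data
```

A higher alpha indicates that the items measure candidate performance consistently; values above 0.8 are conventionally treated as acceptable for high-stakes assessments.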

Assurance on fairness

The assessment is fair and free from bias, decisions about candidate performance are fair and methods agreed for setting the pass mark have been applied.

Evidence comes from:

  • adherence to a question writing methodology
  • assessor recruitment and training
  • reasonable adjustments policy – reported against at monthly contract meetings
  • SME review of a sample of the questions for each assessment
  • recognised appropriate standard setting methods for high stakes professional assessments applied
  • SME, SRA, Independent Psychometrician and Independent Reviewer observations of live delivery of SQE2 oral assessments
  • SME and SRA attendance (as observers) at assessor standardisation and markers' meetings
  • SME and SRA and Independent Psychometrician and Independent Reviewer attendance (as observers) at Angoff Panel training for SQE1 standard setting
  • analyses and evaluation of psychometric data reviewed by Independent Psychometrician and presented to the Assessment Board
  • SRA attendance and Independent Psychometrician and Independent Reviewer observations at Mitigating Circumstances Panel meetings
  • Independent Reviewer's report
  • SRA and Independent Psychometrician checks and requests for further analyses where appropriate.

Assurance on assessment security

The assessments are secure.

Evidence:

  • confirmation from Kaplan's Head of Quality Assurance, prior to signing off each assessment, that all processes relating to training, writing the individual assessments and the assessment build have been followed
  • processes are in place to ensure the security of assessment materials during delivery of the live assessments
  • confidentiality obligations imposed on all assessors
  • conflict of interest policy and process (reported on in monthly contract meeting).

Assurance on risk

Risk is appropriately and effectively identified and managed.

Evidence:

  • monthly meetings with Kaplan to check against service levels including those relating to progressing applications for reasonable adjustments, managing complaints and website accessibility
  • review of joint risk log at monthly contract meetings
  • checking Kaplan's internal audit plans
  • monitoring Kaplan's lessons learned log and action plan
  • reviewing and monitoring Kaplan's Business Continuity Planning
  • Independent Reviewer's report.

Assurance on commitment to continuous improvement

Continuous improvement is made to the SQE and assessment delivery and action taken where necessary because of lessons learnt.

Evidence:

  • lessons learnt log and actions taken are available to SRA
  • annual review of all processes
  • regular stakeholder engagement through meetings and focus groups
  • qualitative feedback obtained from candidates through focus groups
  • evidence of actions taken in response to issues.