Using multiple choice questions in legal education

By Julie Brannan, SRA Director of Education and Training, 25 April 2016

Our recent consultation on the introduction of a Solicitors Qualifying Examination (SQE) closed last month following much debate and discussion.

Multiple choice tests

One of the things we have proposed is the use of multiple choice tests (MCTs) to assess functioning knowledge in Part 1 of the SQE. This has been controversial.

In fact, the use of multiple choice questions (MCQs) in all sorts of tests, including professional legal assessment, is commonplace. For example, it is how the US Multistate Bar Examination (MBE) assesses the legal knowledge of intending attorneys.

This test forms the core of most US states' bar exams, from New York to California, the very jurisdictions which are often held up as providing the greatest competition to solicitors in England and Wales.

So I thought I would share an article which I have been reading, 'Developing High-Quality Multiple-Choice Questions for Assessment in Legal Education'1. It is written by Susan Case and Beth Donahue, who were both involved for many years in the US Multistate Bar Examination. In fact, Susan Case was the Director of Testing for the Multistate.

The authors recognize that multiple choice tests are commonly viewed as "less intellectually rigorous than essay questions and less realistic in their relationship to the actual practice of law". As they say, "Lawyers write briefs and interpret cases and construct arguments. They do not select advice or arguments from a list of choices".

This challenge is reflected in the criticisms made of the proposed SQE multiple choice test. People argue that it will neither test to the required standard nor assess candidates' abilities to analyse facts or build an argument.

Why do we want to use multiple choice tests?

In deciding whether to use MCTs, the starting point, as the article points out, must be: what do you want to test?

If you want to test writing skills, or in-depth analysis of a detailed point, then an MCT is not the right choice. The article is absolutely right that you cannot test writing, persuasion, or the formulation of a legal argument through MCTs.

That is why our proposed SQE Part 2 includes skills assessments which will test candidates’ ability to write, draft, build an argument, persuade, research, advise, and analyse facts from primary sources.

The authors advise that, if you want to test the application of legal knowledge, analysis, legal reasoning and problem-solving skills, then both essays and MCTs can work. And in choosing between them, multiple choice questions have particular advantages over essays.

While essay questions are relatively easy to develop, they are difficult to mark: it is hard to maintain a consistent standard across large numbers of papers. Essays are also less good at assessing breadth, so they play into the hands of candidates who are lucky at "question spotting".

MCTs, by contrast, are easy and cheap to mark. There is no bias in the marking, and a consistent standard can be identified and applied to all candidates.

The cost point is significant for the SQE because, whilst MCQs might be cheap to mark, they are hard, and therefore expensive, to write. But the large number of candidates involved will create the resource needed to write good questions, so MCQs will still be a cost-effective assessment method.

Indeed, one of the reasons for the prejudice against this type of question is that it is so easy to find poor examples. These include questions:

  • which test nitpicky facts
  • which test surface learning, or the candidate’s recall of isolated facts
  • where the wording is ambiguous
  • where more than one answer could be correct
  • where the scenario is unrealistic
  • which test obscure legal principles unlikely to be addressed by new lawyers.

The list of potential mistakes is a long one.

Ensuring an effective assessment

So, for our test to provide an effective assessment of candidates' ability to apply legal reasoning to facts, or their problem solving skills, the questions must be really well written.

The article advises that a good test depends first on a clear vignette, or fact pattern. This must set up a legal issue without giving too much away, or accidentally supporting an unintended option. Without this, MCQs simply test recall: What’s the rule? What’s true?

In the context of legal assessment, the vignette should set up a question which requires intending solicitors to undertake a range of core lawyering tasks.

The article provides examples of the types of questions used in the US Multistate Bar Examination. What is striking is how well these map on to the competences in the Solicitors' Competence Statement which we might assess through the proposed SQE Part 1.

Below are MCT questions which have appeared in the US Multistate Bar Exam, alongside the competences in the Solicitors' Competence Statement that they test.

Gathering information

  • What issue must the lawyer resolve before advising his client?
  • What information does the lawyer need, in order to proceed?
  • What question should the lawyer ask next?
  • Which source of information will be the most helpful?

A4: identification and application of legal principles

A5: applying understanding, critical thinking and analysis to solve problems, including recognizing gaps in information and evaluating the quality and reliability of information

B1 (c): recognizing when additional information is needed.

B3: developing and advising on relevant options, strategies and solutions

B7(a): applying relevant processes and procedures to progress the matter

Identifying issues and formulating strategy

  • Which of the following causes of action is most likely to be successful?
  • What is the best defence?
  • Whom should your client sue?
  • What test should the judge apply?
  • Which law governs the claim?

Synthesizing law and facts/predicting outcomes

  • Should the statement be excluded?
  • Should the conviction be overturned?
  • Is the opinion admissible?
  • Who owns which interest in land?

These questions are not testing knowledge recall. They are testing higher level cognitive skills. Critically, for us, they are testing the competences needed for practice as a solicitor, including critical thinking, analysis and application.

The experience of the authors on the US Multistate is that a valid, well-designed assessment drives students' learning: the assessment communicates to them what matters and what they need to learn. It also motivates them to learn it.

So a good assessment will make for better solicitors, not just because it should distinguish more accurately between candidates who are competent and those who are not, but because it focuses students' learning on what is needed for safe practice.


1. Susan Case and Beth Donahue, 'Developing High-Quality Multiple-Choice Questions for Assessment in Legal Education', Journal of Legal Education, Volume 58, Number 3 (September 2008)
