Incoming reports – risk assessment methodology


Last updated 20 October 2014

Introduction

Reports about alleged breaches of the SRA Handbook received by the SRA are assessed by the Risk Centre Assessment Team. The team applies a consistent classification and scoring mechanism in order to

  • understand the relative seriousness of each allegation in terms of direct impact on the public
  • understand the credibility of the evidence and its source
  • clearly and unambiguously express that understanding in a manner that is understood and accepted by the SRA's various directorates
  • help prioritise the regulatory response, ensuring that it is proportionate to the risk
  • provide a common way of describing what we do. This will improve reporting and communications across the SRA and to the regulated community and other stakeholders.

Process overview

The process consists of

  • categorising each allegation against an 'event' category with an inherent severity score
  • assessing the impact of the alleged conduct
  • grading the credibility of the source
  • grading the strength of the supporting intelligence or evidence

These factors are combined in a mathematical model to produce a ‘Red-Amber-Green’ scale of risk, which is then available to the business to support its decision-making process.
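By way of illustration only, the sketch below (in Python) shows how a total risk score could be banded onto such a scale. The actual thresholds are not set out in this document, so the values used here are hypothetical placeholders.

    # Hypothetical Red-Amber-Green banding. RED_THRESHOLD and AMBER_THRESHOLD
    # are placeholders for illustration only; the real thresholds are not
    # published in this methodology.
    RED_THRESHOLD = 1000
    AMBER_THRESHOLD = 250

    def rag_band(total_risk_score: float) -> str:
        """Map a total risk score onto the Red-Amber-Green scale."""
        if total_risk_score >= RED_THRESHOLD:
            return "Red"
        if total_risk_score >= AMBER_THRESHOLD:
            return "Amber"
        return "Green"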

Process – step by step

Each allegation is assigned one of around 180 different ‘event’ categorisations. These events are organised by reference to the SRA Handbook and are each given a severity score from 1 to 10. The relative score of each event represents the considered view of the SRA that certain risks are inherently more serious than others, before further impact factors are considered.

An event's 1-10 score places it relative to, and in context with, the other events within the same Level 1 grouping of the classification, and also relative to events in other sections. This ensures that similar event categories appearing in different sections of the classification consistently receive the same, or at least similar, scores.

Some examples of the current event classifications and their scores are given in Table 1 below. The SRA will review and if necessary amend the events and their relative scores from time to time, in response to emerging regulatory threats or changes in the SRA's perception of those risks.
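As an illustration, the sketch below shows how such a catalogue might be represented, using a handful of entries from Table 1. The entries and scores are taken from Table 1, but the data structure itself is only an assumption made for the example.

    # A small subset of Table 1, keyed by (Level 1, Level 2, Level 3).
    # The real catalogue holds around 180 events; this is illustrative only.
    EVENT_SEVERITY = {
        ("Integrity of firm / individual", "Financial dishonesty",
         "Misappropriation of client money"): 10,
        ("Integrity of firm / individual", "Financial dishonesty",
         "Overcharging"): 7,
        ("Integrity of firm / individual", "Conviction",
         "Serious motoring offence"): 4,
        ("Relationship with client", "Client care",
         "Failure to redeem mortgage"): 8,
        ("Firm / practice management", "Disorderly closure",
         "Abandonment of practice"): 10,
    }

    def event_severity(level1: str, level2: str, level3: str) -> int:
        """Return the 1-10 inherent severity score for a classified event."""
        return EVENT_SEVERITY[(level1, level2, level3)]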

In conclusion, the event scores are designed to broadly reflect the SRA's view of the relative severity of identified regulatory events. As we will see below, the Impact score helps to refine the relative seriousness of specific instances.

The Impact score is designed to take into account any one of several standpoints, depending on which is most relevant at the time:

  • the number of people affected or their relative vulnerability
  • the financial implications
  • the effect on public confidence

The Impact score has five levels: Minor, Limited, Moderate, Serious and Catastrophic. The description of each level can be seen in Table 2.
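Purely as an illustration, the sketch below scores the two quantifiable standpoints using the Table 2 thresholds. Treating the highest applicable score as the "most relevant" standpoint is an assumption made for the example; the document does not spell out how the standpoint is chosen.

    def impact_from_persons(persons_affected: int) -> int:
        """Score the 'number of persons affected' standpoint (Table 2)."""
        if persons_affected > 1000:
            return 5  # Catastrophic
        if persons_affected > 50:
            return 4  # Serious
        if persons_affected > 25:
            return 3  # Moderate
        if persons_affected > 5:
            return 2  # Limited
        return 1      # Minor

    def impact_from_financial(amount_gbp: float) -> int:
        """Score the 'financial implications' standpoint (Table 2)."""
        if amount_gbp > 500_000:
            return 5  # Catastrophic
        if amount_gbp > 100_000:
            return 4  # Serious
        if amount_gbp > 50_000:
            return 3  # Moderate
        if amount_gbp > 5_000:
            return 2  # Limited
        return 1      # Minor

    def impact_score(persons_affected: int = 0, amount_gbp: float = 0.0) -> int:
        # Assumption: the most relevant standpoint is taken to be the one
        # giving the highest score; this is not a documented rule.
        return max(impact_from_persons(persons_affected),
                   impact_from_financial(amount_gbp))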

The basic risk score is modified to take account of the credibility of the source and the quality of the intelligence or supporting evidence. The system for this is based on the National Intelligence Model 5 x 5 x 5 grading.
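The numeric weightings applied to these gradings are not set out here. Purely for illustration, the sketch below assumes a simple hypothetical mapping from the 5 x 5 x 5 gradings to multipliers; the values are placeholders, not the SRA's actual weightings.

    # Hypothetical mapping of National Intelligence Model 5x5x5 gradings to
    # numeric multipliers. The NIM grades source reliability A-E and
    # intelligence quality 1-5; the multipliers below are illustrative only.
    SOURCE_CREDIBILITY = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1}
    EVIDENCE_STRENGTH = {1: 5, 2: 4, 3: 3, 4: 2, 5: 1}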

The total risk score is the product of the Event severity score x Impact score x Source credibility x Strength of evidence.
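Putting the pieces together, the sketch below computes that product and shows a worked example; the credibility and evidence figures in the example are hypothetical.

    def total_risk_score(event_severity: int, impact: int,
                         source_credibility: int, evidence_strength: int) -> int:
        """Total risk score = Event severity x Impact x Source credibility
        x Strength of evidence."""
        return event_severity * impact * source_credibility * evidence_strength

    # Worked example (hypothetical figures): misappropriation of client money
    # (severity 10) affecting over 25 clients (impact 3), reported by a source
    # with a credibility multiplier of 4 and evidence strength of 4:
    # 10 * 3 * 4 * 4 = 480, which rag_band() would then place on the
    # Red-Amber-Green scale using the hypothetical thresholds sketched earlier.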

The risk assessment process is shown below in simple visual terms.

The SRA applies this methodology so that staff can apply a consistent, proportionate and transparent approach as they identify and react to risks being reported to us. However, it is only a starting point. There will be circumstances in which a different approach will be necessary. An example of such a situation is as follows:

  • The risk assessed by highly experienced staff is strongly at variance with the event risk score. This might be due to particular circumstances relating to the information received, or imperfections in the scoring mechanism itself. In those exceptional cases, experience and judgement will override the risk score. This will be clearly recorded so that lessons can be learned.

Risk assessments of events will not usually be disclosed in individual cases. There are a number of reasons for this: it may be necessary to protect the source of information; the assessment will usually be carried out before all evidence is obtained, and we know that different concerns often arise once we begin engaging or investigating; and the risk assessment is irrelevant to any final decision as to whether or not a solicitor is meeting the Principles or outcomes. From time to time, the methodology may also be adjusted to reflect thematic concerns of the SRA Regulatory Risk Committee or other stakeholders. It will also take account of the SRA's overall tolerance of risk.

Finally, the initial assessment of incoming information, important though it is to ensure we are consistent and proportionate, is only one element in the overall risk-based approach across the SRA.

Table 1 – event category severity scoring examples

Level 1 | Level 2 | Level 3 | Score
Integrity of firm / individual | Financial dishonesty | Misappropriation of client money | 10
Integrity of firm / individual | Financial dishonesty | Overcharging | 7
Integrity of firm / individual | Financial dishonesty | Taking undisclosed profits | 7
Integrity of firm / individual | Conviction | Non-dishonesty conviction - custodial sentence | 10
Integrity of firm / individual | Conviction | Dishonesty conviction | 10
Integrity of firm / individual | Conviction | Non-dishonesty conviction - non-custodial sentence | 6
Integrity of firm / individual | Conviction | Serious motoring offence | 4
Integrity of firm / individual | Conviction | Caution | 3
Integrity of firm / individual | Money laundering | Perpetrator / facilitator | 10
Integrity of firm / individual | Money laundering | Breach of Money Laundering Regulations / POCA | 8
Integrity of firm / individual | Money laundering | Suspicious activity | 5
Integrity of firm / individual | Bogus firm / individual | Unqualified person pretending, holding out or acting as a solicitor / REL | 10
Relationship with client | Client care | Failure to redeem mortgage | 8
Relationship with client | Client care | Failure to pay SDLT | 8
Relationship with client | Client care | Failure to register charge | 7
Relationship with client | Client care | Taking unfair advantage - of client for another | 6
Relationship with client | Client care | Failure to register transfer | 6
Relationship with client | Client care | Inappropriately acting / refusing instructions | 5
Relationship with client | Client care | Failure to provide client care information | 5
Relationship with client | Client care | Costs information deficiency | 5
Relationship with client | Client care | Unlawful fee arrangement | 4
Relationship with client | Client care | Incompetence / negligence / delay | 4
Relationship with client | Client care | Failure to release papers | 4
Relationship with client | Client care | Failure to deal with complaints | 4
Firm / practice management | Disorderly closure | Abandonment of practice | 10
Firm / practice management | Disorderly closure | Death / incapacity of sole practitioner | 10
Firm / practice management | Disorderly closure | Failure to manage closure of firm | 7
Firm / practice management | Disorderly closure | Failure to safeguard files following closure | 6

For a comprehensive table of risk scoring categories, please email us.

Table 2 – Objective impact scores

Impact factor | 1 Minor | 2 Limited | 3 Moderate | 4 Serious | 5 Catastrophic
No. of persons affected | 1 or a few clients or others | Over 5 clients or others affected | Over 25 clients or others affected | Over 50 clients or others affected | Over 1000 clients or others affected
Financial | £5,000 or less | Over £5,000 | Over £50,000 | Over £100,000 | Over £500,000
Public confidence | Little or no media/political attention; little or no effect on public confidence in the profession | Minor, short-term media/political attention; minor effect on public confidence in the profession | Longer-term media/political attention; significant effect on public confidence in the profession | National media/parliamentary attention for several days; major damage to public confidence in the profession | Sustained national media/parliamentary attention; massive damage to public confidence in the profession
Vulnerability of individuals | Little or no reason why unable to protect self against harm or exploitation | Minor reason why unable to protect self against harm or exploitation | Significant reason why unable to protect self against harm or exploitation | Major reason why unable to protect self against harm or exploitation | Unable to protect self against harm or exploitation

These examples are for guidance only; in practice it may not be clear how much money is involved or precisely how many people are affected.