Rethinking evaluation: the rigour of rapidity

3 May 2019

Across the NHS, teams are trying to improve patient services. This is challenging, particularly in such a resource-constrained system. Front-line teams often recognise that in a complex environment not everything is going to succeed first time, and evaluators are commissioned to help understand the impact that changes have. This demonstrates practitioners’ clear commitment to learning, even while under pressure to deliver quality health services.

In order for evaluators to inform decision-making and allow for appropriate course correction, it is important that they provide their findings in a timely manner and that evaluations are sometimes repeated. This type of evaluation is often referred to as ‘rapid’, or ‘rapid-cycle’ evaluation. Growing interest in this concept is reflected in the fact that three rapid evaluation teams have been established over the last few years: NIHR-funded centres RSET and BRACE, as well as the Improvement Analytics Unit (IAU) – a joint partnership between the Health Foundation and NHS England.

What does ‘rapid’ mean?

As an evaluator, it strikes me that there is no consensus yet about what ‘rapid’ means. Earlier in the year, I attended a Nuffield Trust conference to discuss with experts the challenges and opportunities of rapid evaluation. The morning session focused on possible tensions between an evaluation being rapid and being rigorous, suggesting that the two might need to be traded off.

But I passionately believe that rapid evaluation can be, and has to be, rigorous if it is going to meaningfully support teams making changes to health services on the front line. For instance, comparing hospital use by one patient group with a robust counterfactual provides a better understanding of the impact that changes are having than simply looking at that patient group’s hospital use over time.
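
To make the point concrete, here is a minimal sketch with entirely hypothetical numbers. A naive before/after comparison attributes any background trend (for example, admissions falling everywhere over winter) to the intervention; subtracting the change seen in a comparable counterfactual group nets that trend out. This is a simple difference-in-differences calculation, not a description of how the IAU builds its counterfactuals in practice.

```python
# Illustrative sketch with hypothetical data: why a counterfactual matters.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical monthly emergency admissions per 1,000 patients.
intervention_before = [52, 50, 51, 49]
intervention_after  = [46, 45, 47, 44]
control_before      = [53, 51, 52, 50]   # comparable group, no intervention
control_after       = [50, 49, 51, 48]

# Naive view: just compare the intervention group before and after.
naive_change = mean(intervention_after) - mean(intervention_before)

# Difference-in-differences: subtract the change the counterfactual
# group experienced anyway, isolating the effect of the intervention.
counterfactual_change = mean(control_after) - mean(control_before)
did_estimate = naive_change - counterfactual_change

print(f"naive before/after change: {naive_change:.2f}")   # -5.00
print(f"difference-in-differences: {did_estimate:.2f}")   # -3.00
```

In this toy example, two of the five admissions apparently “saved” per 1,000 patients would have happened anyway, which is exactly the kind of over-claiming a robust counterfactual guards against.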

To deliver rigorous evidence-based evaluation rapidly, we might need to re-think what rapid evaluation means. It is also worth considering how evaluation teams are set up. And while my background is in quantitative analysis, I also recognise the importance of combining quantitative and qualitative evidence. The principles I lay out here are with my ‘quants hat’ on but they can certainly translate to qualitative evaluation as well.

I think the term ‘rapid’ should not relate to the overall duration of an evaluation study – rather, certain steps should be considered outside the ‘rapid’ framework. These include commissioning the study, understanding the intervention and evaluation needs, and developing an analysis plan. A lot of time is also spent on data acquisition, and restructuring data that are often collected for administrative purposes to make them suitable for evaluation. Similarly, at the end of a study there is a lengthy peer review process. I would argue that the term rapid should refer solely to the stage of the evaluation process where data are analysed, results are reviewed and interpreted, initial findings are written up and presented to those involved in the change initiative, and evaluators engage with front-line teams to understand what the results mean.

I am not suggesting that other (non-rapid) aspects of a study are not important. In fact, the opposite is true: these aspects bring rigour and ensure the study adds value, but can be conducted without affecting the timeliness of the evaluation.

The rigour of rapid evaluation

It is worth investing time in planning and preparing for evaluation studies before the ‘rapid’ phase begins. For example, evaluators can:

  • Tailor evaluation questions to look at key assumptions of an intervention’s logic model, or to identify key metrics. Understanding the intervention, and the context around the changes to health services, and subsequently understanding exactly what is being measured (and why), will help with the interpretation of the findings later.
  • Specify a robust analysis plan up front. This reduces the risk of evaluators falling into the trap of data mining (looking for something in the data that might be the result of coincidence). A pre-specified analysis plan can also be peer-reviewed to make sure the proposed methods are rigorous.
  • Access the data as early as possible. Data acquisition and preparation can be very time-consuming, so it is useful to start this as early as possible and develop some of the syntax ahead of time.
  • Seek peer review – but be clear about the order of events. Peer review is crucial to demonstrate the rigour of a study, and should always be encouraged, but sometimes findings can be used before the peer review and publication process is completed.
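
The third bullet, developing syntax ahead of time, can be sketched as follows. The record layout here is an assumption for illustration; the point is that data-cleaning code can be written and tested against dummy rows before access to the real administrative extract is granted, then re-run unchanged when the data arrive.

```python
# Illustrative sketch: preparing analysis syntax before the real data arrive.
# The field names below (patient_id, admitted, discharged) are hypothetical.

from datetime import date

def prepare_admissions(raw_rows):
    """Turn raw administrative rows into analysis-ready records:
    parse dates, derive length of stay, drop rows missing key fields."""
    prepared = []
    for row in raw_rows:
        if not row.get("patient_id") or not row.get("admitted"):
            # In practice, exclusions would be counted and reported.
            continue
        admitted = date.fromisoformat(row["admitted"])
        discharged = date.fromisoformat(row["discharged"])
        prepared.append({
            "patient_id": row["patient_id"],
            "admitted": admitted,
            "length_of_stay_days": (discharged - admitted).days,
        })
    return prepared

# Dummy rows standing in for the real extract.
sample = [
    {"patient_id": "A1", "admitted": "2019-01-03", "discharged": "2019-01-07"},
    {"patient_id": "",   "admitted": "2019-01-04", "discharged": "2019-01-05"},
]
print(prepare_admissions(sample))
```

Writing and peer-reviewing this kind of pipeline during the (non-rapid) setup phase is what allows the later analysis phase to be genuinely rapid.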

Considering the above, I think the ‘rapid’ in rapid evaluation should refer to the period between data collection and findings being presented. In this context, rapidity effectively refers to the ‘freshness’ of the evidence. To use a crude analogy, when visiting my local bakery, my main concern is not the hour at which the baker has to get out of bed to make my bread, or how long the oven needs to warm up – what matters is that I can enjoy deliciously fresh bread as soon as it comes out of the oven.

Thinking about rapid evaluation this way does have implications for the way we organise evaluative capability in the health service, academia and the wider health sector. Rather than commissioning individual pieces of work that each require a team to be formed, staff to be recruited and data to be acquired, it is worth considering a more permanent resource. This would enable us to build on existing processes and develop capability that can be carried through and re-used across various evaluations. Having a dedicated long-term team in place is one of the major strengths of centres like BRACE, RSET and the IAU, and is what allows them to systematise evaluation. For me, rigorous evaluations require strategic planning but rapid delivery.

Arne Wolters (@4RN3W0L73R5) is Senior Analytics Manager at the Health Foundation

