
At the recent NHS Confederation annual conference, Simon Stevens’ speech emphasised the importance of independent evaluation of the new, local models of care he is championing.

Struck by certain features of how we organise care – features ‘whose original rationale is now either forgotten, irrelevant or on balance unhelpful’ – he advocated a new, ‘horses for courses’ approach. He argued that while mergers and service reconfiguration will continue to have a place within the NHS, there must also be room for complementary models to emerge – models designed for the local context and reflecting, in part, the heterogeneity of their patients.

Simon then went on to describe how these new models need to be evaluated: to demonstrate how they help solve particular issues within local communities, with periodic opportunities to decide whether to continue with, or amend, the arrangements.

This policy direction challenges the more traditional top-down blueprints for change. This is part of what makes it an exciting opportunity for health services, as well as for the evaluation community that works with them.

In recent years, evaluators have been responding to the challenges involved in describing, assessing and understanding interventions aimed at improving the quality of care. This work provides a strong starting point from which to tailor approaches to evaluation in a way which supports the development of these local models of care delivery.

Evaluation of quality improvement is rarely straightforward; interventions tend to be complicated and the NHS context is undeniably complex. Here at the Health Foundation our experience in quality improvement evaluation – 10 years, 21 evaluations – has sharpened our sense of what is involved and the potential pitfalls.    

We have tried and tested myriad ways of undertaking evaluation across the four countries of the UK, from conducting in-house studies to commissioning independent teams and supporting arrangements for self-evaluation or locally commissioned and partnered evaluation.

Our evaluated programmes cover many of the features of high quality health care. Our early evaluations focused on the leadership programmes we were initially known for. Since then, the Foundation’s large scale programmes in patient safety and person-centred care have been accompanied by large, independent studies. We have also been working on economic evaluation and efficiency.

So what have we – and the wide range of evaluation experts we work with – learned that will help us to understand how to effectively evaluate these new local models of care? Here are my six top tips:

  1. Planning for evaluation adds most value when included at the start of the programme, preferably in the intervention or service development phase. 
  2. A collaborative approach to agreeing a clear theory of change is helpful. It can ensure that all stakeholders have the same understanding about what change is being made and how it will work.
  3. It’s important to differentiate between what you are aiming to achieve within a specific timeframe and your longer term ambition. 
  4. Align evaluation design to programme design, review the evaluation model regularly, be flexible and plan for change.
  5. Understand the detail of the intervention, where in the system it is working and, last but by no means least, how the intervention interacts with its context. 
  6. Co-creating evaluations with those being evaluated can help to surface the tensions that can exist between local evaluation priorities and programme priorities.

And this is only scratching the surface.   

There is a lot still to learn from this material that can be drawn on to effectively evaluate the potentially diverse, local models of care that Simon Stevens is suggesting. With our interest in evaluability assessment, developmental evaluation and complexity, our commitment to improvement science and the creation of our new data analytics team, keep an eye out for more learning to come.

Louise is a Research Manager at the Health Foundation.
