How do you evaluate an ever-changing experiment?

30 August 2018

Most people can name a global franchise such as Starbucks or McDonald's. The franchising approaches used by companies like these are increasingly being developed for use in other sectors, with a focus on scaling social impact rather than commercial success.

How could social franchising help the NHS?

We know that scaling what works is hard to do across the NHS, which is why we're trying to find new and experimental approaches. We're exploring whether social franchising might support the effective spread and scale of interventions in the NHS.

While social franchising has proven effective in other sectors, such as international development, it is relatively underused (and under-tested) in the UK health care context. However, we can see potential for using it in the NHS. For example, social franchising could help support adopter teams to adapt and implement an intervention by providing training and resources as part of an ongoing relationship. It also has the potential to help create financial sustainability for the innovator team that is trying to scale the intervention. 

A chance to test a new approach to evaluation

As we adopt innovative approaches to tackling some of the long-standing challenges facing health care, approaches to evaluation also need to innovate.

There are two main objectives for our Exploring Social Franchising programme: to build social franchises in order to scale proven interventions more effectively; and to learn about how teams do this and whether social franchising as a scaling mechanism delivers the benefits we hope it might. This means that the design of the evaluation has been central to the success of the programme. 

We didn't know if social franchising would be a viable model for scaling in the NHS. So in the first instance we awarded four innovator teams funding to explore social franchising and begin to build and test a replication model specific to their needs. We expect the development of each franchise to work to different timescales, requiring different levels of support. The scale of each team's ambition will be different, as will their definition of success, so we knew this was not going to be a straightforward evaluation.

Agreeing the priorities

As the trajectories of the projects were unclear, our first challenge was to agree priorities for the evaluation. As the innovators already had evidence that their interventions worked, we decided not to focus our evaluation on demonstrating that the interventions were successful. Instead, we agreed we wanted to learn about social franchising as a mechanism for supporting scale, exploring some of the following questions:

  • does social franchising, as a mechanism to achieve scale, support the replication of outcomes (and not just the replication of interventions) in other contexts? 
  • does social franchising support balance between fidelity to the original intervention and local adaptation? 
  • does social franchising enable the creation of a learning network of adopters and innovators together, and what value does this add?
  • does the ongoing relationship between innovators and adopters (training and support in one direction, and fees and data in the other) mean that social franchising supports better and more sustainable implementation?

We also wanted to learn what it takes to set up a social franchise, what the experience of the innovators was as their franchises developed, and what skills they needed to build successful replication models.

Creating an evaluation that can cope with complexity

Because of this complexity, we worked closely with an independent evaluator (a partnership between Cordis Bright and the Innovation Unit) – from first exploration of the topic to ongoing management of the evaluation. We co-produced the evaluation design to ensure it met the needs of the programme. 

Collaborating on the initial design of an evaluation was a new way of working for us but meant we created an evaluation framework that met the needs of the programme as well as building a strong and constructive relationship with the evaluators. 

We created a joint governance structure to link us up with the evaluator in a timely way, including regular working and advisory groups. This has practical advantages: for example, as project timescales shifted, we quickly reflected this in evaluation plans, flexing our approach and use of resources. Equally, the evaluator has been able to discuss the challenges that the programme design poses, with early identification of risks and necessary mitigations.

Learning from the evaluation has fed into the design of the next phase of the programme. Having efficient mechanisms for managing the delivery of the programme is important, but perhaps the greatest value of this joint way of working is the opportunity for constructive challenge in both directions with early course correction to keep us all focused on the right questions. 

Building trust between the programme team and evaluators has been central to this, and perhaps our most important piece of learning has been not to underestimate the time and effort it takes to get this right. We know that projects can't be neatly designed and contained to enable simple evaluations. With experimental approaches, we need evaluation methods, and evaluators, that are flexible and can deal with emergence and uncertainty. We've learned that not only is this possible but, done well, it can be a rewarding learning opportunity for us, the evaluators, the innovators and adopters, and ultimately, we hope, the wider system.

Sarah Henderson (@sarahjhhen) is Assistant Director of Improvement Programmes at the Health Foundation.
