In April 2017, we announced the first Q Lab challenge would look at what it would take for peer support to be available to everyone who wants it, to help manage their long-term health and wellbeing needs.

Since then, the Lab has become real ‒ we’re now collaborating with over 130 people (our ‘Lab participants’) to make progress on this challenge over 12 months.

We have just completed research and discovery – the first of three phases. Our purpose was to understand what is already known about peer support, and bring new perspectives and insights together. In ‘Phase 2’, we will build on this to develop and test ideas over the next four months.

Reflecting on our first five months

It’s our pilot year, so we need to balance making progress on our challenge with testing the approach. Five months in is a good time to reflect: how are the core features of the Lab working in practice? And what can we learn for the rest of the pilot year?

1. Collaboration and diversity

Collaborating with a diverse group of people was always going to be a core part of our approach. It is key to how the Q community aims to create change, as well as being a common characteristic of other social innovation labs.

Having over 130 Lab participants from across the UK – including people at the front line of care, policy makers, patients, health care managers, as well as people from other sectors like housing, academia and charities – has surpassed our expectations.

We have been exploring ways to support this diverse group to collaborate and work on the Lab challenge together. We know that harnessing the insight and energy of communities and networks, with the skills and knowledge of people involved in improving care, can have a transformational impact. We are working hard to nurture these relationships through the Lab.

An important issue for us is striking the right balance between making the most of everyone’s experience and perspective, and moving the work on the Lab challenge forward.

2. Learning and evaluation

As good improvers, we know it is important to evaluate, reflect on and share the Lab’s learning during this pilot year. Our quick guide on evaluation is one of our most downloaded publications, and we understand why: evaluation doesn’t always feel easy.

External evaluators, RAND Europe, have been appointed to help assess whether the Lab shows potential to catalyse change. As with Q, the Lab has an independent and embedded evaluation approach. Members of the RAND team spend time in our office observing the Lab team, as well as carrying out more traditional forms of data capture, such as surveys and interviews.

We are also trialling new ways to become a learning team: not just relying on external evaluators to shine a light on our practice but testing new ways to reflect on what we are learning as openly as possible. We complete a weekly ‘learning log’ to give thought to what we have achieved, what we’ve learned, what we might do differently, and how creative and collaborative we feel we have been.

In the next phase, we will continue to tweak these approaches, while also acting on the findings from RAND’s feedback.

3. Using different forms of knowledge

There is a good amount of evidence and research on best practice in peer support already. We want to build on this existing evidence base, but also bring to bear different forms of knowledge that might not always be represented in, say, academic literature. Being able to tap into local, informal and experiential knowledge, and combine this with more formal evidence is challenging but can open up new insights to problems.

During Phase 1, we tried to do this by:

  • conducting interviews with a range of people who deliver, design, commission and/or receive peer support
  • gathering existing evidence and desk research (such as Realising the Value) about the benefits and challenges of high quality peer support
  • bringing together people with experience and expertise to interrogate the issue using quality improvement and innovation techniques.

In Phase 2, the question will be how we continue to embrace different forms of knowledge and combine them to generate new insights on complex issues.

What does Phase 2 have in store?

Phase 2 is a great opportunity to share the lessons we learn from co-designing, testing and iteratively developing ideas to drive local improvements.

Over the next four months, the Lab will be working on three specific areas identified in Phase 1:

  1. How can we improve the routine offering and promotion of peer support in a primary care setting?

  2. How can we generate sources of evidence that capture the holistic impact that peer support can have on people's lives?

  3. How can we support the sharing of knowledge, experience and evidence of what does and doesn’t work in peer support?

It’s exciting to make that shift into developing and testing practical ways to improve peer support, working at pace and with a wide range of people.

If you’d like to get involved, please check out the further information on Phase 2 available on the Q website, and get in touch. We look forward to hearing from you.

Tracy Webb is Head of Q Lab at the Health Foundation, and is leading the design and delivery of the Q Improvement Lab, @TracyWebb007

Comments

Rosalind Lovegrove



Nursing Director going through Accreditation in Bahrain Middle East. UK nationality and qualified. Very interested in looking at all that the Q lab are doing



David Trigger



In these days of integration and multi-discipline teams, it could be useful to see whether the move to more care being delivered at home and in the community has impacted on peer support delivery.
A major problem now for rural communities is social isolation, which can also affect delivery of peer support. For example, in Worcestershire, we have around 30,000 older folk living in social isolation and it would be useful to see how this has affected peer support.


