The health service is awash with data – data about performance, processes, outcomes, staff and patients. There is a centrally dictated requirement to measure certain aspects of care, such as waiting times in A&E (not more than 4 hours) and for start of treatment (not more than 18 weeks from the date of referral). These are just two of a whole suite of data that healthcare providers have to collect regularly, both for internal consumption and external reporting to the regulators.

No one can deny that measuring what we do and how we do it is important: we cannot optimise performance if we don't know how we're performing. However, being forced to measure the same things, whether or not they're relevant to the local context, is what becomes frustrating. I've often heard people comment that we're 'hitting the targets but missing the point'.

In mental health, things become even more complex. There are no valid, reliable or universally agreed quality indicators or outcome measures. We’ve often substituted process measures for outcomes as the former lend themselves better to scientific measurement. Outcome measures have to be more nuanced and subtle when we talk about mental health conditions which often have a chronic, remitting/relapsing course.

In January, we heard that much of the waiting times data from hospitals are unreliable, with the National Audit Office finding inaccurate and inconsistent recording. This is perhaps not surprising, given the volume and complexity of data that need to be collected and the number of staff involved in collecting and reporting them.

This news doesn't help the reputation of the NHS or the public's trust in it. In the last year, rarely has a week gone by when the NHS has not hit the headlines for the wrong reasons, sapping the confidence and morale of frontline staff.

What does it mean for patients if an organisation is meeting its target for waiting times at, say, 95%? If they happen to be in the other 5%, it is not much consolation that the organisation is where it is expected to be from a regulatory perspective – a clinical paradox that’s difficult to reconcile.

This is where statistically based measurements fall down and person-centred care matters. Unfortunately, blunt tools and targets do not recognise such differences. This is one of the problems with league tables of NHS trusts which, while providing a degree of assurance about standards, do not necessarily encourage excellence in patient care.

The enormous amount of performance data raises further questions: how much time and effort goes into collecting it? What useful information comes out of it? Is this 'knowledge' being turned into corrective action? If not, why not? There is no single or easy answer to these questions.

Historically, the standard response to any instance of poor performance has been to bring in greater oversight, regulation and control and, as a result, more bureaucracy. This is not unique to healthcare either: we saw similar responses to the financial and banking crisis of 2008. While this can be very comforting for those charged with accountability for the system's performance, it has its downsides. It disengages frontline staff and can disincentivise them from taking responsibility for continuous quality improvement.

So what’s the solution? Here are some of my ideas…

Allow freedom for local teams and organisations to define what needs to be measured in their local context, in consultation with patients and carers. This will make them more engaged and enthused, and give them a sense of control over their destiny. Drastically reducing the number of mandatory targets to just a handful would be a good way of empowering local providers and clinicians, in conversation with patients and carers, to come up with more meaningful quality and outcome measures.

The primary purpose of a health service is to improve the health of the population and add value for its customers – and the customer has to be actively involved in defining that value. We need to get better at measuring the softer side of the care experience.

Kallur Suresh is a Health Foundation GenerationQ Fellow and a consultant psychiatrist for older people in Essex, www.twitter.com/Kallur_Suresh
