Key points

  • New technologies in the fields of automation, artificial intelligence (AI), robotics and communications are creating a range of opportunities to improve health care, from diagnostics to remote monitoring. However, they could also create challenges for delivering empathetic, person-centred care by reducing or mediating human interaction.
  • Through an online YouGov survey in October 2020, we asked more than 4,000 UK adults how four different uses of technology (self-check-in, robotic care assistants, AI triage and communicating bad news by video link) might impact on the person-centredness of care and whether they would be comfortable with such approaches. Of these, self-check-in was the only one that a majority of respondents were comfortable with, and the only one where a majority thought the technology-based approach was better.
  • Our results suggest that, to get technology-enabled care right, policymakers and practitioners will need to: engage with the public and NHS workforce to inform decisions about how best to develop and deploy technology; co-design new approaches with patients and staff to ensure they are tailored to user needs and preferences; and allow for differentiated approaches, taking into account who technology works for and when, and ensuring alternative options are available where feasible.

New technologies in the fields of automation, artificial intelligence (AI), robotics and communications hold vast potential for improving health care. But their increasing use in patient-facing contexts could have significant implications for patient experience – in particular, for the ‘human dimension’ of health care.

In this long chart, we’ll explore the impact that some of these technology-based approaches could have and what the public think about them. We’ll consider four scenarios that may create particular challenges in delivering empathetic, person-centred care, in each case considering whether these uses of technology would facilitate or undermine empathy and person-centredness. In doing so, we hope to stimulate debate about where the boundaries between technology and human agency in health care should lie and highlight some important issues for policymakers and practitioners to consider.

Over the past year, we’ve seen the health service roll out new technologies at impressive pace and scale in response to COVID-19. And there are hopes that technology-based approaches will help the NHS recover from the impact of the pandemic and meet future demand. But, as we argued in our recent long read on the use of technology during COVID-19, it will be important to evaluate the impact of these approaches on care quality and patient experience before they get ‘locked in’ for the future.

Technology and the human dimension of care

Recent advances in data analytics, machine learning and communications are extending the reach of technology into a range of areas, with implications for patient experience. These include diagnosing conditions and predicting risks, recommending treatment options, and enabling remote monitoring and consultations, among others.

These growing uses of technology – the subject of a Health Foundation report out later this month – pose some interesting challenges for the human dimension of health care. In some cases, automated systems are undertaking patient-facing tasks previously carried out by health care workers, such as appointment booking. In other cases, technologies are supporting the delivery of health care in ways that mediate or otherwise affect clinician-patient relationships, such as the use of clinical decision support tools or the use of apps for monitoring health.

A key focus of the Health Foundation’s work over the past decade has been improving the person-centredness of health care – by ‘person-centred’ we mean personalised, coordinated care that affords people dignity, compassion and respect, and supports them to live an independent and fulfilling life. Person-centredness is recognised as a key domain of quality in health care, one that improves experience and outcomes (and can lead to better use of health care resources too).

So why might certain uses of technology pose challenges for person-centred care?

  • First, person-centred care requires traits that computers cannot replicate. Chief among these are social and emotional intelligence – the ability to read the feelings of others, know how to respond in an appropriate way, and manage relationships. Social and emotional intelligence include important attributes like empathy (a willingness to appreciate the patient’s perspective) and compassion.
  • Second, person-centred care requires treating people with dignity and respect, for example, giving people privacy when they want it, supporting them to be independent, and treating them as equals. While some uses of technology may strengthen the ability of care providers to treat patients with dignity and respect, others could undermine it.
  • Third, person-centred care requires strong clinician-patient relationships, which are fundamental to co-production and shared decision making. So there will be aspects of care, such as care planning, that cannot be delegated to a machine because they rely on a sense of partnership between the patient and clinician.

Exploring four technology-based approaches

To explore what impact technology might have on person-centred care, we looked at four scenarios with applications of technology that exist already or could be implemented in the near future:

  1. Self-check-in for appointments
  2. Communicating bad news by video link
  3. AI triage
  4. Robotic care assistants

Each of these scenarios presents interesting questions about the human dimension of health care. How important is human interaction and emotional intelligence in each case? Might the use of technology undermine possibilities for human interaction or emotional intelligence and make it harder to treat people with empathy, dignity and respect? Or might it help person-centredness by empowering patients and enabling better personalisation of care?

Working with YouGov in October 2020, we asked more than 4,000 adults in the UK for their views on how these four uses of technology might impact on the person-centredness of care, using an online survey. We also asked people whether they would personally be comfortable using them. For this latter question we would expect a variety of factors to influence their views, including the range of perceived advantages and disadvantages of the technology, how familiar people are with the technology, how easy or difficult they think it would be to use, and the degree of risk associated with the situation.

Scenario 1: Self-check-in for appointments – automation in the GP surgery reception

Automated systems are now able to perform some tasks traditionally carried out by a GP receptionist, such as check-in and appointment booking. This could free up receptionists for other work and potentially offer benefits for patients too, such as greater privacy. On the other hand, patients may value interacting with a receptionist: in a recent University of Oxford study looking at the opportunities for automation in primary care, the researchers often observed patients bypassing the touch screen to speak to the receptionist instead. And there may be a variety of benefits to this, such as enabling the receptionist to identify patient needs. So, what might the negative consequences be of reducing face-to-face contact in the GP surgery reception through technologies such as touch screen check-in, and do these outweigh the benefits that automated systems offer?

Figure 1

  • When asked to choose between two competing statements, 72% chose the statement that ‘Checking in with a touch screen is better – it's usually quicker, more convenient and private, and frees up the receptionist for other work’, with just 21% choosing ‘It's a shame if people have to check in with a touch screen – they may prefer to speak to the receptionist and ask them questions’.
  • When people were asked about how they would feel if they personally had to use a touch screen to check in rather than speaking to the receptionist, 86% said ‘comfortable’ compared with 10% ‘uncomfortable’.
  • Among the different demographic groups captured in our poll, one of the most striking differences on this question was for people with a carer. This group were somewhat less keen on the touch screen option than the population as a whole – though the majority (62%) still thought the touch screen option was better.

Scenario 2: Communicating bad news by video – virtual consultations

The NHS Long Term Plan signalled major ambitions to expand virtual consultations, and this process has since been rapidly accelerated by the need to adapt quickly to COVID-19. But what impact might greater use of virtual consultations have on empathy in health care?

We know empathy matters for effective clinician-patient interactions – for example, when clinicians are empathetic, patients disclose more – but being empathetic online can be challenging: for example, it can be more difficult to pick up non-verbal cues and make eye contact. So, are there situations where virtual consultations may not be appropriate? Consider, for example, the case of a doctor communicating a positive cancer test result to a patient. Does empathetic care here require face-to-face contact, as some have suggested following first-hand experience? Is it harder to do this sensitively and compassionately by video link? Or does it provide a better experience for the patient to receive the news in their own home, where they can be with family or friends, and not have to face a journey home afterwards, as some studies have shown?

Figure 2

  • When asked to think about a medical consultation where a doctor has to inform a patient that they have a serious illness, 71% preferred the statement that ‘This should be done in person as it's easier to communicate the news with sensitivity and respect and to provide more compassionate care when face-to-face’, compared with 18% who preferred the statement ‘It is better to do this over a video link from the patient's home, where they may feel more comfortable and be with family or friends – and they don't have to make a journey back from the hospital afterwards’.
  • Participants were asked to imagine a consultation where a doctor has to inform them that they have a serious illness. They were asked how comfortable or uncomfortable they personally would be if the consultation was done by video link rather than face-to-face. 60% said ‘uncomfortable’ and 31% said ‘comfortable’.
  • In terms of demographic differences, people aged 55 or older were more likely to want to speak in person in this situation, with 67% saying they would be ‘uncomfortable’ if it was done by video link as opposed to 26% saying ‘comfortable’.

Scenario 3: AI triage – using computers to prioritise patient needs

Advances in data analytics and machine learning are increasingly enabling computers to diagnose conditions and predict risks, allowing their use in triage (in A&E, for example). But even though these systems can come close to matching (and may eventually surpass) human performance, can they ever replace clinicians in triage?

A significant finding of social psychology in recent decades is that people can care about the processes of decision making independently of the outcomes. For example, people who lose court cases may nevertheless feel more satisfied if they believe they have had a fair chance to have their argument heard, compared with those who believe they haven’t. In a similar way, is the ability to have cases heard and considered by a human clinician important for patients to feel they have been treated with dignity and respect?

Alternatively, if AI triage can lead to quicker and more consistent decision making, potentially free from certain kinds of human bias, would patients prefer it? In short, there are significant issues of procedural acceptability concerning how decisions are made and resources are allocated in health care, where the increased use of automation and AI in decision making could have important ramifications. 

Figure 3

  • Even when imagining that a computer was as accurate as a human doctor in diagnosing their conditions, 54% chose the statement that ‘It's important that a patient's case is heard and considered by a human doctor in deciding their priority for treatment – it could feel disrespectful, uncaring or unfair for this to be done by a computer’, compared with 37% who chose ‘Provided it was as accurate as a human doctor, it's fine for a computer to determine a patient's priority for treatment, rather than a human doctor’.
  • When asked how they would feel if they personally were in a situation where a computer asked about their symptoms and decided where in the queue they should be seen (again, on the assumption that it was as accurate as a human doctor), opinion was more evenly split, with 46% saying ‘uncomfortable’ as opposed to 45% saying ‘comfortable’.
  • This was another area where people with a carer were less keen on the technology-based approach than the population as a whole, with only 37% saying they would be ‘comfortable’ as opposed to 55% saying ‘uncomfortable’.

Scenario 4: Robotic care assistants – using technology to support people at home

With rising demand for social care, robotic technology is increasingly being seen as a possible means of assisting people who might need support with daily living at home. As the sophistication of such technology grows, robots could assist with tasks such as getting dressed and going to the toilet, and also potentially provide an element of companionship. But does using robots to fulfil these functions undermine the ethos of care and support? Is it impersonal or degrading, weakening dignity and creating a feeling of objectification? Or could it actually strengthen dignity by giving people a greater sense of privacy and independence?

Figure 4

  • When asked to choose between two competing statements, 47% chose the statement that ‘Having a robotic carer rather than a human carer risks undermining a person's dignity – it's impersonal and de-humanising’, compared with 31% who chose ‘Having a robotic carer rather than a human carer could strengthen a person's dignity by giving them greater independence as well as privacy’.
  • When asked how comfortable or uncomfortable they personally would be with a robotic carer providing some of their care (rather than a human carer), 48% said ‘uncomfortable’ while 38% said ‘comfortable’.
  • The most striking demographic difference here was between men and women: more men said they would feel comfortable with a robotic carer providing some of their care (46%) than uncomfortable (39%), whereas many more women said uncomfortable (56%) than comfortable (30%).

Comparing results across these four scenarios

Figure 5

Of the four scenarios we explored, touch screen check-in was the only one that the majority of respondents said they would be comfortable with, and the only one where a majority thought the technology-based approach was better. Perhaps this is not surprising as it is also likely to be the approach that people are most familiar with (in contrast to technologies like robotic care assistants), as well as being a relatively ‘low-risk’ scenario (compared to, say, triage). It is possible that attitudes and norms will evolve over time as people become more familiar with some of these other uses of technology. Nevertheless, our results suggest there may be challenges with deploying some of these approaches in a way that patients feel comfortable with.

Interestingly, however, in all four scenarios, scores for respondents’ feelings towards these uses of technology for their own care were consistently more positive than scores for their views about these approaches in principle. This suggests that people don’t necessarily have to believe the technology-based approach is better to feel comfortable with using it personally.

Finally, it’s clear that public opinion is diverse. Despite touch screen check-in being the only intervention that the majority of respondents felt comfortable with, sizeable minorities were comfortable with each approach, and on some options views were fairly evenly divided. So rather than attempting to generalise about what patients do or don’t want, we need to find practical ways of capturing and responding to the diversity of preferences that exists.

What lessons are there for policymakers and practitioners?

First, it’s clear that engagement with the public and the NHS workforce on technology in health care will be very important – to understand people’s views, raise awareness of new technology-based approaches, and inform decisions about the best ways to develop and deploy technology.

Second, co-design of new approaches with patients and staff will be essential to ensure new technology-based approaches are properly built around the needs of users and reflect their views and preferences.

Third, differentiation is going to be important: rather than thinking in terms of blanket applications of technology, it will be a case of understanding who they work for and when, and ensuring alternative options are available for people who want them, where feasible.

Of course, our polling only scratches the surface of public attitudes towards these technologies in health care, and more research is needed to understand in more depth what people think and to track how norms change over time. Our aim here has been to shine a light on some of these interesting questions about how technology will shape patient experience and how we can ensure that health care remains person-centred as we take advantage of the benefits that new technologies offer.

Further reading

Download the polling data on technology and empathy
