Labor market modeling suggests that the impact of new technologies will be felt differently across industries and occupations. A 2021 report by PwC predicted that the risk of job displacement in health and social care from ‘AI and related technologies’ would be lower than that in many other sectors. Indeed, against the backdrop of increasing patient demand, the report expected health and social care to see the biggest net employment increases of any sector over the next 20 years, with technology proving largely ‘complementary’.
There are several potential factors behind this more positive outlook, including factors relating to the nature of healthcare work itself.
Technology struggles to replicate the attributes and competencies needed in healthcare
First, many tasks in healthcare are difficult to automate because they require characteristics or skills that AI and other technologies currently struggle to replicate. For example, research from OpenAI, OpenResearch and the University of Pennsylvania (2023) found that jobs requiring critical thinking skills are less likely to be affected by current large language models. Critical thinking is central to much of healthcare, where staff must weigh the benefits and risks of different possibilities, approaches and solutions. Important nuances may be needed, for example, to translate patient symptoms into diagnoses and treatments. While AI such as large language models can assist critical thinking – for example, by supporting clinicians’ training, education and professional development, or by distilling large volumes of research to generate health advice – assisting critical thinking is different from actually doing it. Other key competencies needed in healthcare, such as creativity and negotiation skills, are similarly difficult to automate.
Social and emotional intelligence are also essential components of high-quality care, enabling staff to empathize, communicate effectively and meet patient needs. Analysis by the Office for National Statistics (ONS) in 2019 – which found that medical practitioners were one of the three occupations at the lowest risk of automation – noted that health-related words such as ‘patient’ and ‘treatment’ appeared frequently in the task descriptions of jobs with a low risk of automation. The ONS suggested this reflects the dimension of ‘working with people’ and ‘the value that people add in these roles, which is difficult to computerise’. Again, emerging research suggests that AI can support empathic communication – for example, by generating draft responses to patient questions – but this is different from being empathic, which requires the ability to read and understand other people’s feelings and to respond with genuine emotion.
Healthcare is seen as intrinsically ‘human’
A second factor is that healthcare in the UK, as in many cultures, is seen as intrinsically ‘human’. Given the considerable value attached to the interpersonal dimension of care, some activities – such as communicating a diagnosis of serious illness or comforting a patient – cannot be delegated to machines without compromising the quality and ethos of care. To take another example, while some patients may be happy with AI making clinical decisions in areas such as triage, others may feel that having a human listen to and consider their case is an important part of being treated with respect and compassion. Healthcare is not a product but a service, co-designed between professionals and patients and built on trust. Human relationships therefore take on particular significance in areas such as care planning, where the need for genuine partnership can place important constraints on the use of automation.
A 2021 study from the University of Oxford looked not only at which healthcare activities could be automated, but also at what healthcare practitioners think about the desirability of automating them. Interestingly, several activities ranked high for automation potential but low for automation desirability – typically those involving a ‘high level of physical contact’ with patients (such as administering anesthetics or examining the mouth and teeth). Many healthcare tasks sit at the intersection of attending to a patient’s physical, mental, and social needs, and this likely influences attitudes toward automation.
Even where a task can be automated, it does not necessarily follow that it should be. In the study by OpenAI, OpenResearch and the University of Pennsylvania, the researchers noted that it is difficult to make predictions about the impact of large language models on activities where there is ‘currently a regulation or norm that requires or suggests human supervision, judgment or empathy’ – a description that characterizes much of healthcare.
Few healthcare roles consist entirely of automatable tasks
A third reason why there is a lower risk of widespread job displacement in health care is that automation applies to tasks, not roles per se, and few health care occupations consist entirely of automatable tasks. A Health Foundation-funded study on the potential of automation in primary care found that, while there were a small number of roles (such as a prescription clerk) likely to be heavily affected by automation, no occupation could be fully automated.
Where only specific tasks can be automated, staff can adapt by focusing on other tasks or expanding their roles. Research by Goldman Sachs (2023) on the exposure of different industries to automation and generative AI predicted that the occupational categories of ‘healthcare support’ and ‘healthcare practitioners and technicians’ would be largely complemented rather than replaced by AI – precisely because of the mix of tasks involved. Similarly, research by Accenture (2023) suggested that, compared with many other industries, a smaller proportion of healthcare has a high potential for automation, but a larger proportion has a high potential for ‘augmentation’ through technology.