News | Health & Safety
6 Dec 2025 6:26
NZCity News

    AI is beating doctors at empathy – because we’ve turned doctors into robots

    AI chatbots are outperforming doctors in empathy ratings. But the real story isn’t about robot superiority – it’s about how we’ve broken healthcare.

    Jeremy Howick, Professor and Director of the Stoneygate Centre for Excellence in Empathic Healthcare, University of Leicester
    The Conversation


    Artificial intelligence has mastered chess, art and medical diagnosis. Now it’s apparently beating doctors at something we thought was uniquely human: empathy.

    A recent review published in the British Medical Bulletin analysed 15 studies comparing AI-written responses with those from human healthcare professionals. Blinded researchers then rated these responses for empathy using validated assessment tools. The results were startling: AI responses were rated as more empathic in 13 out of 15 studies – 87% of the time.

    Before we surrender healthcare’s human touch to our new robot overlords, we need to examine what’s really happening here.

    The studies compared written responses rather than face-to-face interactions, giving AI a structural advantage: no vocal tone to misread, no body language to interpret, and unlimited time to craft perfect responses.

    Critically, none of these studies measured harms. They assessed whether AI responses sounded empathic, not whether they led to better outcomes or caused damage through misunderstood context, missed warning signs, or inappropriate advice.

    Yet even accounting for these limitations, the signal was strong. And the technology is improving daily – “carebots” are becoming increasingly lifelike and sophisticated.

    Beyond methodological concerns, there’s a simpler explanation: many doctors admit that their empathy declines over time, and patient ratings of healthcare professionals’ empathy vary greatly.

    Inquiries into fatal healthcare tragedies in the UK – from Mid Staffordshire NHS Foundation Trust to various patient safety reviews – have explicitly named lack of empathy from healthcare professionals as contributing to avoidable harm. But here’s the real issue: we’ve created a system that makes empathy nearly impossible.

    Doctors spend about a third of their time on paperwork and electronic health records, and they must also follow pre-defined protocols and procedures. While documentation and protocols have some benefits, they have arguably had the unintended consequence of forcing doctors to play the bot’s game. We shouldn’t be surprised, then, when the bot wins.

    The burnout crisis makes this worse. Globally, at least a third of GPs report burnout – exceeding 60% in some specialties. Burned-out doctors struggle to maintain empathy. It’s not a moral failing; it’s a physiological reality. Chronic stress depletes the emotional reserves required for genuine empathy.

    The wonder isn’t that AI appears more empathic; it’s that human healthcare professionals manage any empathy at all.

    [Image: a GP with his patient.] Doctors’ empathy declines over time. Stephen Barnes/Shutterstock.com

    What AI will never replicate

    No carebot, however sophisticated, can truly replicate certain dimensions of human care.

    A bot cannot hold a frightened child’s hand during a painful procedure and make them feel safe through physical presence. It cannot read unspoken distress in a teenager’s body language when they’re too embarrassed to voice their real concern. It cannot draw on cultural experience to understand why a patient might be reluctant to accept certain treatment.

    AI cannot sit in silence with a dying patient when words fail. It cannot share a moment of dark humour that breaks the tension. It cannot exercise the moral judgment required when clinical guidelines conflict with a patient’s values.

    These aren’t minor additions to healthcare; they’re often what make care effective, healing possible and medicine humane.

    Here’s the tragic irony: AI threatens to take over precisely those aspects of care that humans do better, while humans remain trapped doing tasks computers should handle.

    We’re heading toward a world where AI provides the “empathy” while exhausted humans manage technical work – exactly backward. This requires three fundamental changes.

    First, we must train doctors to be consistently excellent at empathic communication. This cannot be a brief module in medical school. It needs to be central to healthcare education. Since AI already matches humans in many technical skills, this should free doctors to focus on genuine human connection.

    Second, redesign healthcare systems to protect the conditions necessary for empathy. Dramatically reduce administrative burden through better technology (ironically, AI could help here), ensure adequate consultation time, and address burnout through systemic change rather than resilience training.

    Third, rigorously measure both benefits and harms of AI in healthcare interactions. We need research on actual patient outcomes, missed diagnoses, inappropriate advice, and long-term effects on the therapeutic relationship – not just whether responses sound empathic to raters.

    The empathy crisis in healthcare isn’t caused by insufficient technology. It’s caused by systems that prevent humans from being human. AI appearing more empathic than doctors is a symptom, not the disease.

    We can use AI to handle administrative tasks and free doctors’ time and mental space, and even provide tips to help healthcare professionals boost their empathy. Or we can use it to replace the human connection that remains healthcare’s greatest strength.

    The technology will continue advancing, regardless. The question is whether we’ll use it to support human empathy or substitute for it – whether we’ll fix the system that broke our healthcare workers or simply replace them with machines that were never broken to begin with.

    The choice is ours, but the window is closing fast.


    Jeremy Howick receives funding from the Stoneygate Trust.

    This article is republished from The Conversation under a Creative Commons license.
    © 2025 TheConversation, NZCity
