News | Health & Safety
8 Dec 2025 17:25
NZCity News


    AI is beating doctors at empathy – because we’ve turned doctors into robots

    AI chatbots are outperforming doctors in empathy ratings. But the real story isn’t about robot superiority, it’s about how we’ve broken healthcare.

    Jeremy Howick, Professor and Director of the Stoneygate Centre for Excellence in Empathic Healthcare, University of Leicester
    The Conversation


    Artificial intelligence has mastered chess, art and medical diagnosis. Now it’s apparently beating doctors at something we thought was uniquely human: empathy.

    A recent review published in the British Medical Bulletin analysed 15 studies comparing AI-written responses with those from human healthcare professionals. Blinded researchers then rated these responses for empathy using validated assessment tools. The results were startling: AI responses were rated as more empathic in 13 out of 15 studies – 87% of the time.

    Before we surrender healthcare’s human touch to our new robot overlords, we need to examine what’s really happening here.

    The studies compared written responses rather than face-to-face interactions, giving AI a structural advantage: no vocal tone to misread, no body language to interpret, and unlimited time to craft perfect responses.

    Critically, none of these studies measured harms. They assessed whether AI responses sounded empathic, not whether they led to better outcomes or caused damage through misunderstood context, missed warning signs, or inappropriate advice.

    Yet even accounting for these limitations, the signal was strong. And the technology is improving daily – “carebots” are becoming increasingly lifelike and sophisticated.

    Beyond methodological concerns, there’s a simpler explanation: many doctors admit that their empathy declines over time, and patient ratings of healthcare professionals’ empathy vary greatly.

    Inquiries into fatal healthcare tragedies in the UK – from Mid Staffordshire NHS Foundation Trust to various patient safety reviews – have explicitly named lack of empathy from healthcare professionals as contributing to avoidable harm. But here’s the real issue: we’ve created a system that makes empathy nearly impossible.

    Doctors spend about a third of their time on paperwork and electronic health records, and they must follow pre-defined protocols and procedures. While documentation and protocols have benefits, they have arguably had the unintended consequence of forcing doctors to play the bot's game. We shouldn't be surprised when the bot wins.

    The burnout crisis makes this worse. Globally, at least a third of GPs report burnout – exceeding 60% in some specialties. Burned-out doctors struggle to maintain empathy. It’s not a moral failing; it’s a physiological reality. Chronic stress depletes the emotional reserves required for genuine empathy.

    The wonder isn’t that AI appears more empathic; it’s that human healthcare professionals manage any empathy at all.

    [Image: a GP with his patient. Doctors' empathy declines over time. Stephen Barnes/Shutterstock.com]

    What AI will never replicate

    No carebot, however sophisticated, can truly replicate certain dimensions of human care.

    A bot cannot hold a frightened child’s hand during a painful procedure and make them feel safe through physical presence. It cannot read unspoken distress in a teenager’s body language when they’re too embarrassed to voice their real concern. It cannot draw on cultural experience to understand why a patient might be reluctant to accept certain treatment.

    AI cannot sit in silence with a dying patient when words fail. It cannot share a moment of dark humour that breaks the tension. It cannot exercise the moral judgment required when clinical guidelines conflict with a patient’s values.

    These aren’t minor additions to healthcare; they’re often what make care effective, healing possible and medicine humane.

    Here’s the tragic irony: AI threatens to take over precisely those aspects of care that humans do better, while humans remain trapped doing tasks computers should handle.

    We’re heading toward a world where AI provides the “empathy” while exhausted humans manage technical work – exactly backward. Reversing this requires three fundamental changes.

    First, we must train doctors to be consistently excellent at empathic communication. This cannot be a brief module in medical school. It needs to be central to healthcare education. Since AI already matches humans in many technical skills, this should free doctors to focus on genuine human connection.

    Second, we must redesign healthcare systems to protect the conditions necessary for empathy: dramatically reduce administrative burden through better technology (ironically, AI could help here), ensure adequate consultation time, and address burnout through systemic change rather than resilience training.

    Third, we must rigorously measure both the benefits and the harms of AI in healthcare interactions. We need research on actual patient outcomes, missed diagnoses, inappropriate advice, and long-term effects on the therapeutic relationship – not just on whether responses sound empathic to raters.

    The empathy crisis in healthcare isn’t caused by insufficient technology. It’s caused by systems that prevent humans from being human. AI appearing more empathic than doctors is a symptom, not the disease.

    We can use AI to handle administrative tasks and free doctors’ time and mental space, and even provide tips to help healthcare professionals boost their empathy. Or we can use it to replace the human connection that remains healthcare’s greatest strength.

    The technology will continue advancing, regardless. The question is whether we’ll use it to support human empathy or substitute for it – whether we’ll fix the system that broke our healthcare workers or simply replace them with machines that were never broken to begin with.

    The choice is ours, but the window is closing fast.


    Jeremy Howick receives funding from the Stoneygate Trust.

    This article is republished from The Conversation under a Creative Commons license.
    © 2025 The Conversation, NZCity
