News | National
23 Oct 2024 12:32
NZCity News

    Humanising AI could lead us to dehumanise ourselves

    As AI ‘companions’ permeate our most intimate spheres, we risk diminishing our own human essence by falsely attributing human qualities to them.

    Raffaele F Ciriello, Senior Lecturer in Business Information Systems, University of Sydney, Angelina Ying Chen, PhD student, University of Sydney
    The Conversation


    Irish writer John Connolly once said:

    The nature of humanity, its essence, is to feel another’s pain as one’s own, and to act to take that pain away.

    For most of our history, we believed empathy was a uniquely human trait – a special ability that set us apart from machines and other animals. But this belief is now being challenged.

    As AI becomes a bigger part of our lives, entering even our most intimate spheres, we’re faced with a philosophical conundrum: could attributing human qualities to AI diminish our own human essence? Our research suggests it can.

    Digitising companionship

    In recent years, AI “companion” apps such as Replika have attracted millions of users. Replika allows users to create custom digital partners to engage in intimate conversations. Members who pay for Replika Pro can even turn their AI into a “romantic partner”.

    Physical AI companions aren’t far behind. Companies such as JoyLoveDolls are selling interactive sex robots with customisable features including breast size, ethnicity, movement and AI responses such as moaning and flirting.

    While this is currently a niche market, history suggests today’s digital trends will become tomorrow’s global norms. With about one in four adults experiencing loneliness, the demand for AI companions will grow.

    The dangers of humanising AI

    Humans have long attributed human traits to non-human entities – a tendency known as anthropomorphism. It’s no surprise we’re doing this with AI tools such as ChatGPT, which appear to “think” and “feel”. But why is humanising AI a problem?

    For one thing, it allows AI companies to exploit our tendency to form attachments with human-like entities. Replika is marketed as “the AI companion who cares”. However, to avoid legal issues, the company elsewhere points out Replika isn’t sentient and merely learns through millions of user interactions.

    Some AI companies overtly claim their AI assistants have empathy and can even anticipate human needs. Such claims are misleading and can take advantage of people seeking companionship. Users may become deeply emotionally invested if they believe their AI companion truly understands them.

    This raises serious ethical concerns. A user will hesitate to delete (that is, to “abandon” or “kill”) their AI companion once they’ve ascribed some kind of sentience to it.

    But what happens when said companion unexpectedly disappears, such as if the user can no longer afford it, or if the company that runs it shuts down? While the companion may not be real, the feelings attached to it are.

    Empathy – more than a programmable output

    By reducing empathy to a programmable output, do we risk diminishing its true essence? To answer this, let’s first think about what empathy really is.

    Empathy involves responding to other people with understanding and concern. It’s when you share your friend’s sorrow as they tell you about their heartache, or when you feel joy radiating from someone you care about. It’s a profound experience – rich and beyond simple forms of measurement.

    A fundamental difference between humans and AI is that humans genuinely feel emotions, while AI can only simulate them. This touches on the hard problem of consciousness, which questions how subjective human experiences arise from physical processes in the brain.

    [Image: A child with spectacles looks closely at a monitor lizard through glass. Caption: Science has yet to solve the hard problem of consciousness. Shutterstock]

    While AI can simulate understanding, any “empathy” it purports to have is a result of programming that mimics empathetic language patterns. Unfortunately, AI providers have a financial incentive to trick users into growing attached to their seemingly empathetic products.

    The dehumanAIsation hypothesis

    Our “dehumanAIsation hypothesis” highlights the ethical concerns that come with trying to reduce humans to some basic functions that can be replicated by a machine. The more we humanise AI, the more we risk dehumanising ourselves.

    For instance, depending on AI for emotional labour could make us less tolerant of the imperfections of real relationships. This could weaken our social bonds and even lead to emotional deskilling. Future generations may become less empathetic – losing their grasp on essential human qualities as emotional skills continue to be commodified and automated.

    Also, as AI companions become more common, people may use them to replace real human relationships. This would likely increase loneliness and alienation – the very issues these systems claim to help with.

    AI companies’ collection and analysis of emotional data also poses significant risks, as these data could be used to manipulate users and maximise profit. This would further erode our privacy and autonomy, taking surveillance capitalism to the next level.

    Holding providers accountable

    Regulators need to do more to hold AI providers accountable. AI companies should be honest about what their AI can and can’t do, especially when they risk exploiting users’ emotional vulnerabilities.

    Exaggerated claims of “genuine empathy” should be made illegal. Companies making such claims should be fined – and repeat offenders shut down.

    Data privacy policies should also be clear, fair and without hidden terms that allow companies to exploit user-generated content.

    We must preserve the unique qualities that define the human experience. While AI can enhance certain aspects of life, it can’t – and shouldn’t – replace genuine human connection.

    The Conversation

    The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

    This article is republished from The Conversation under a Creative Commons license.
    © 2024 TheConversation, NZCity
