News | Law and Order
15 Jan 2025 2:51
NZCity News


    The dynamics that polarise us on social media are about to get worse

    Relying on social media users to police information accuracy could further polarise platforms and amplify extreme voices.

    Colin M. Fisher, Associate Professor of Organisations and Innovation and Author of "The Collective Edge: Unlocking the Secret Power of Groups", UCL
    The Conversation


    Meta founder and CEO Mark Zuckerberg has announced big changes in how the company addresses misinformation across Facebook, Instagram and Threads. Instead of relying on independent third-party factcheckers, Meta will now emulate Elon Musk’s X (formerly Twitter) in using “community notes”. These crowdsourced contributions allow users to flag content they believe is questionable.

    Zuckerberg claimed these changes promote “free expression”. But some experts worry he’s bowing to right-wing political pressure, and will effectively allow a deluge of hate speech and lies to spread on Meta platforms.

    Research on the group dynamics of social media suggests those experts have a point.

    At first glance, community notes might seem democratic, reflecting values of free speech and collective decisions. Crowdsourced systems such as Wikipedia, Metaculus and PredictIt, though imperfect, often succeed at harnessing the wisdom of crowds — where the collective judgement of many can sometimes outperform even experts.

    Research shows that diverse groups that pool independent judgements and estimates can be surprisingly effective at discerning the truth. However, wise crowds seldom have to contend with social media algorithms.
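The statistical intuition here is that independent errors cancel when pooled. A minimal simulation (a hypothetical sketch, not drawn from the research cited) shows a crowd's averaged estimate landing far closer to the truth than a typical individual's guess:

```python
import random

random.seed(42)
TRUE_VALUE = 100.0

# Each person makes a noisy but independent estimate of the true value.
estimates = [TRUE_VALUE + random.gauss(0, 20) for _ in range(1000)]

# Pooling: the crowd's judgement is the simple average of all estimates.
crowd_estimate = sum(estimates) / len(estimates)

# For comparison: how far off is a typical individual, on average?
avg_individual_error = sum(abs(e - TRUE_VALUE) for e in estimates) / len(estimates)

print("crowd error:", abs(crowd_estimate - TRUE_VALUE))
print("typical individual error:", avg_individual_error)
```

The effect depends on the errors being independent; when everyone draws on the same sources, as in an echo chamber, the errors correlate and the averaging advantage collapses.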

    Many people rely on platforms such as Facebook for their news, risking exposure to misinformation and biased sources. Relying on social media users to police information accuracy could further polarise platforms and amplify extreme voices.

    Two tendencies rooted in our psychological need to sort ourselves and others into groups are of particular concern: in-group/out-group bias and acrophily (love of extremes).

    In-group/out-group bias

    Humans are biased in how they evaluate information. People are more likely to trust and remember information from their in-group — those who share their identities — while distrusting information from perceived out-groups. This bias leads to echo chambers, where like-minded people reinforce shared beliefs, regardless of accuracy.

    It may feel rational to trust family, friends or colleagues over strangers. But in-group sources often hold similar perspectives and experiences, offering little new information. Out-group members, on the other hand, are more likely to provide diverse viewpoints. This diversity is critical to the wisdom of crowds.

    But too much disagreement between groups can prevent community fact-checking from even occurring. Many community notes on X, such as those related to COVID vaccines, were likely never shown publicly because users disagreed with one another. The benefit of third-party factchecking was that it provided an objective outside source, one that did not depend on widespread agreement among users across a network.
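X's actual ranking system is more elaborate, but the agreement requirement described above can be sketched as a toy "bridging" rule (a hypothetical illustration; the group labels and threshold are assumptions, not X's implementation):

```python
# Toy sketch (hypothetical, not X's actual algorithm): a "bridging"
# rule shows a note only when raters from BOTH sides find it helpful.

def note_is_shown(ratings, threshold=0.5):
    """ratings: list of (group, helpful) pairs, group in {"left", "right"}."""
    for group in ("left", "right"):
        group_votes = [helpful for g, helpful in ratings if g == group]
        if not group_votes:
            return False  # no ratings at all from this group
        if sum(group_votes) / len(group_votes) < threshold:
            return False  # this group, on balance, finds the note unhelpful
    return True

# A polarised note: one side loves it, the other rejects it -> never shown.
polarised = [("left", True)] * 9 + [("right", False)] * 9 + [("right", True)]
print(note_is_shown(polarised))   # False

# A bridging note: majorities on both sides find it helpful -> shown.
bridging = ([("left", True)] * 7 + [("left", False)] * 3
            + [("right", True)] * 6 + [("right", False)] * 4)
print(note_is_shown(bridging))    # True
```

The sketch makes the article's point concrete: a rule built to filter out partisan notes also silently suppresses any note, accurate or not, on which the two sides refuse to agree.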

    Worse, such systems are vulnerable to manipulation by well organised groups with political agendas. For instance, Chinese nationalists reportedly mounted a campaign to edit Wikipedia entries related to China-Taiwan relations to be more favourable to China.

    Political polarisation and acrophily

    Indeed, politics intensifies these dynamics. In the US, political identity increasingly dominates how people define their social groups.

    Political groups are motivated to define “the truth” in ways that advantage them and disadvantage their political opponents. It’s easy to see how organised efforts to spread politically motivated lies and discredit inconvenient truths could corrupt the wisdom of crowds in Meta’s community notes.

    Social media accelerates this problem through a phenomenon called acrophily, or a preference for the extreme. Research shows that people tend to engage with posts slightly more extreme than their own views.

    [Image: a keyboard with a dividing crack down the middle, with miniature figurines standing on either side. Caption: Extreme and negative views get more attention online, driving social media communities apart. evan_huang/Shutterstock]

    These increasingly extreme posts are more likely to be negative than positive. Psychologists have known for decades that bad is more engaging than good. We are hardwired to pay more attention to negative experiences and information than positive ones.

    On social media, this means negative posts – about violence, disasters and crises – get more attention, often at the expense of more neutral or positive content.

    Those who express these extreme, negative views gain status within their groups, attracting more followers and amplifying their influence. Over time, people come to think of these slightly more extreme negative views as normal, slowly moving their own views toward the poles.
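This ratcheting dynamic can be illustrated with a toy model (a hypothetical sketch, not the cited study's method): if each engagement pulls a user's view partway toward a slightly more extreme post, views drift steadily toward the pole.

```python
# Toy sketch of "acrophily" drift (a hypothetical model, not from the
# research): each step, a user engages with a post slightly more extreme
# than their own view and shifts partway toward it.

def simulate_drift(view=0.1, extremity_gap=0.05, pull=0.5, steps=50):
    """Views live on [0, 1], where 1.0 is the pole."""
    for _ in range(steps):
        post = min(1.0, view + extremity_gap)  # slightly more extreme post
        view += pull * (post - view)           # partial shift toward it
    return view

print(simulate_drift())  # starts moderate, ends near the pole
```

No single step looks dramatic, which is the point: each shift renormalises what feels "slightly more extreme", so moderation erodes gradually rather than all at once.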

    A recent study of 2.7 million posts on Facebook and Twitter found that messages containing words such as “hate”, “attack” and “destroy” were shared and liked at higher rates than almost any other content. This suggests that social media isn’t just amplifying extreme views — it’s fostering a culture of out-group hate that undermines the collaboration and trust needed for a system like community notes to work.

    The path forward

    The combination of negativity bias, in-group/out-group bias and acrophily supercharges one of the greatest challenges of our time: polarisation. Through polarisation, extreme views become normalised, eroding the potential for shared understanding across group divides.

    The best solutions, which I examine in my forthcoming book, The Collective Edge, start with diversifying our information sources. First, people need to engage with — and collaborate across — different groups to break down barriers of mistrust. Second, they must seek information from multiple, reliable news and information outlets, not just social media.

    However, social media algorithms often work against these solutions, creating echo chambers and trapping people’s attention. For community notes to work, these algorithms would need to prioritise diverse, reliable sources of information.

    While community notes could theoretically harness the wisdom of crowds, their success depends on overcoming these psychological vulnerabilities. Perhaps increased awareness of these biases can help us design better systems — or empower users to use community notes to promote dialogue across divides. Only then can platforms move closer to solving the misinformation problem.


    Colin M. Fisher does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    This article is republished from The Conversation under a Creative Commons license.
    © 2025 The Conversation, NZCity


