
    I investigated millions of tweets from the Kremlin’s ‘troll factory’ and discovered classic propaganda techniques reimagined for the social media age

    Tailored messaging, repeated exposure, and false grassroots campaigns are classic propaganda techniques, now used widely by state-backed trolls.

    Maksim Markelov, PhD Candidate, Russian and East European Studies, Manchester University
    The Conversation


    Gentlemen, we interfered, we interfere, and we will interfere … Carefully, precisely, surgically, and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once.

    These are the words of the architect of Russian online disinformation, Yevgeny Prigozhin, speaking in November 2022, just before the US midterm elections. Prigozhin founded the notorious Russian “troll factory”, the Internet Research Agency (the agency), in 2013.

    Since then, agency trolls have flooded social media platforms with conspiracy theories and anti-western messages challenging the foundations of democratic governance.

    I have been investigating agency tweets in English and Russian since 2021, specifically examining how they twist language to bend reality and serve the Kremlin. My research has examined around 3 million tweets, taking in three specific case studies: the 2016 US presidential election, COVID-19, and the annexation of Crimea. It seemed that wherever there was fire, the trolls fanned the flames.

    Though their direct impact on electoral outcomes so far remains limited, state-backed propaganda operations like the agency can shape the meaning of online discussions and influence public perceptions. But as another US election looms, big tech companies like X (formerly Twitter) are still struggling to deal with the trolls that are spreading disinformation on an industrial scale.

    The creation of the troll factory

    Research has suggested that the agency began experimenting with online propaganda and disinformation before 2013, employing a modest number of people tasked with spreading pro-Kremlin narratives and targeting opposition figures within Russia.


    Read more: Five disinformation tactics Russia is using to try to influence the US election


    By 2014, it had started to focus on international audiences, particularly the US. Since then, it has been implicated in numerous campaigns to influence public opinion in Russia and abroad, most notably attempting to interfere in the 2016 US presidential election. Prigozhin and his troll factory colleagues later confirmed these findings.

    The agency was founded and financed by Prigozhin and initially based in St Petersburg. It operated through several front entities, including Teka and Glavset, both of which were dissolved and reformed under new names so it could continue its operations seamlessly.

    Prigozhin first gained prominence as a Kremlin caterer but later expanded his activities significantly, leading the agency and then the Wagner Group – Russia’s state-funded private military company that, under Prigozhin, was active in Ukraine, Africa and several other countries, but is now reported to have been largely subsumed by the Kremlin. During the full-scale invasion of Ukraine, Prigozhin increasingly challenged Russia’s military top brass over its handling of the conflict.

    His fate took a decisive turn after a brief but highly publicised mutiny in June 2023, when Wagner forces marched toward Moscow, stopping less than 125 miles (200km) from the capital after negotiations. As part of the settlement, Prigozhin temporarily relocated to Belarus. Then on August 23, 2023, a jet carrying Prigozhin crashed near Kuzhenkino village, north of Moscow, killing everyone on board.

    Some sources claimed that Prigozhin’s death was orchestrated by the Kremlin, marking him out as another figure who challenged its authority and paid the ultimate price.

    As its operations scaled up between 2014 and 2015, the agency hired hundreds of online commentators (trolls) who were responsible for promoting and amplifying the narratives that suited the Kremlin’s agenda.

    According to whistleblowers, who started shedding light on the organisation in 2015, most agency employees were young people, often students, recruited through public job adverts, with applicants undergoing interviews involving English language and “political knowledge” tests. They earned modest salaries of around 40,000 rubles per month (roughly £324 or US$432).


    This article is part of Conversation Insights. Our co-editors commission long-form journalism, working with academics from many different backgrounds who are engaged in projects aimed at tackling societal and scientific challenges.


    On the job, they were heavily monitored through digital surveillance and direct supervision, and later reportedly asked to go through lie detector tests to prove loyalty, with those failing risking dismissal. The agency also provided training ranging from “political education classes”, resembling Soviet-era ideological indoctrination, to studying target audiences’ social media communication styles and the cultural nuances of target societies – the aim being to blend in seamlessly with local users.

    There were multiple teams, each with its own regional and thematic focus, which created fake social media personas, posted misleading content, and amplified divisive issues to polarise their target audiences. The agency’s activities were well funded and highly organised. Employees worked in 12-hour shifts using virtual private networks (VPNs) to encrypt their internet connection. Various other bits of technology were used to mask their origins and bypass security measures on the social media platforms they were infiltrating and to make them appear as “ordinary” users.

    Whistleblowers lift the lid

    Whistleblowers have shed light on Russia’s troll factory, often at great personal risk. Ludmila Savchuk, a freelance journalist and former agency employee, was one of the first to expose the organisation’s inner workings. After two months “undercover”, she sued the agency for non-payment of wages and moral damages, winning a symbolic settlement. Her actions drew broad public attention to the existence of the troll factory, revealing its strict hierarchy and far-reaching agenda.

    Savchuk described the agency as a multi-front operation with departments working on social media posts, commenting on news articles, creating fake news stories, and producing YouTube videos. Employees received daily instructions and faced penalties for non-compliance, including fines for lateness or ideologically incorrect posts. “It really is a factory for producing lies,” Savchuk concluded in one interview.

    Marat Mindiyarov, another former employee, likened the agency to an Orwellian prison, with total control enforced by security guards and constant surveillance. Both Savchuk and Mindiyarov noted that most employees were motivated by “easy money” rather than ideological commitment.

    “I observed the department that comments on news articles: they don’t really think the way they write. In fact, they don’t think at all,” Savchuk recalled. After exposing the agency, both reportedly faced severe repercussions. Savchuk endured a trolling campaign against her, while Mindiyarov was arrested on false bomb threat charges after speaking to US media.

    Dissenting voices like Savchuk and Mindiyarov in today’s Russia often find themselves cornered under mounting legal and police pressure. Many inside Russia who are critical of the Kremlin saw the death of Alexei Navalny, the longtime Russian anti-corruption campaigner, in Russia’s Arctic penal colony as a warning, leaving them with a choice: to flee into exile or speak up and pay the price.


    Read more: Navalny dies in prison - but his blueprint for anti-Putin activism will live on


    Inside the troll echo chamber

    As a linguist and researcher in Russian and East European Studies, I have always been interested in how states use language online in an attempt to popularise their agendas. But I was really convinced to investigate this relationship more closely during the COVID pandemic in 2021 when online conspiracy theories seemed to suddenly flourish. Then, in 2022, Russia invaded Ukraine and the troll factories started working overtime.

    I quickly found myself in an echo chamber: the comments on articles critical of Russia’s actions in Ukraine were flooded with pro-Kremlin remarks that seemed contrived. I couldn’t believe that on social media, where verbal spats are a daily occurrence, users would unanimously laud the Kremlin’s stance.

    So began the exhaustive process of analysing agency tweets and contrasting them with random users’ tweets. To do this, I used Twitter’s “troll” datasets, which contain tweets from accounts linked to the agency based on various “signals”, including account location and other credentials.

    I trawled through masses of tweets in Russian and English from 2014 to 2020, looking for ones which contained keywords such as “Crimea”, “Ukraine”, “Russia”, “Annexation”, and “Politics”. I then collected a sample of random users’ tweets from Twitter using the same keywords and dates. In all of the case studies I examined, I found examples of tailored messaging, repeated exposure, and false grassroots campaigns – all classic propaganda techniques reimagined for the social media age. However, not every case utilised all three techniques simultaneously; rather, they were employed selectively depending on the context and goals of each campaign.
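    To give a sense of what this filtering involved, here is a minimal sketch in Python. The file and column names (tweet_text, tweet_time) are hypothetical, the datasets released by Twitter may label these fields differently, and the keyword matching is deliberately simple.

```python
import pandas as pd

# Hypothetical file and column names (tweet_text, tweet_time); the datasets
# released by Twitter may label these fields differently.
KEYWORDS = ["crimea", "ukraine", "russia", "annexation", "politics"]

def filter_tweets(path, start="2014-01-01", end="2020-12-31"):
    """Keep tweets posted in the study window that mention any of the keywords."""
    df = pd.read_csv(path, parse_dates=["tweet_time"])
    in_window = df["tweet_time"].between(start, end)
    text = df["tweet_text"].fillna("").str.lower()
    has_keyword = text.apply(lambda t: any(k in t for k in KEYWORDS))
    return df[in_window & has_keyword]

troll_sample = filter_tweets("ira_tweets.csv")
control_sample = filter_tweets("random_user_tweets.csv").sample(
    n=len(troll_sample), random_state=42  # size-match the comparison sample
)
```

    Size-matching the comparison sample makes frequency-based contrasts between the troll tweets and random users’ tweets more meaningful.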

    Tailored messages are strategically crafted to resonate with the target audience’s beliefs and emotions, as seen in first and second world war leaflets designed to demoralise enemy soldiers by playing on their fears.

    Repeated exposure involves consistent repetition of propaganda messages to foster familiarity and acceptance, exemplified by US cold war TV and radio broadcasts that aimed to counter Soviet propaganda and ingrain anti-Soviet and anti-communist sentiments in the public mind.

    Meanwhile, false grassroots campaigns, or “astroturfing”, create the illusion of widespread public support for a political or social cause and can be illustrated by the Soviet Union’s use of front organisations in the US, such as The National Council of American-Soviet Friendship, to spread Soviet state propaganda.

    These tactics are deeply rooted in early-mid 20th-century propaganda theory, as articulated by the likes of the American political scientist and communications theorist Harold Lasswell and the French sociologist Jacques Ellul.

    The trolls employed these age-old methods to foster acceptance of the Kremlin’s far-fetched views domestically and to challenge western positions abroad.

    2016 US election

    The most prominent case of the agency’s interference was the 2016 US presidential election, which the US government investigated extensively.

    The election drew significant attention due to controversies surrounding the two candidates – Donald Trump and Hillary Clinton – including the allegations of Russian interference and collusion. This unfolded amid rising concerns about income inequality, unemployment, racial tensions, immigration, healthcare, and foreign policy.

    Despite high interest in the election and its perceived importance, most voters were dissatisfied with the candidate choices (58%) and the country’s direction (65%-71% in 2016). The agency exploited this, strategically timing posts to exacerbate tensions around issues of economy, immigration, gun rights, race, gender, sex, and religious identities. This mirrored significant voter concerns.

    My research of around 213,000 agency election tweets confirmed that the trolls “played both sides” and adapted to their target audience – US voters.

    The agency operatives tweeted both in favour of and against the candidates. But Trump received the most positive assessments from the agency, especially in the two years after the election. The sentiment towards Clinton was predominantly negative before and during the election year, but declined sharply afterwards. The trolls then focused on the pro-Trump and anti-Trump dynamic, amplifying sentiments like “Make America Great Again” while also engaging with post-election anti-Trump sentiments such as, “Not My President”.
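    The article does not say which sentiment tools produced these trends, so the following is only an illustrative sketch: it scores each tweet with NLTK’s off-the-shelf VADER analyser and averages the scores by candidate and year, assuming a hypothetical candidate label alongside the tweet_text and tweet_time columns used above.

```python
import pandas as pd
from nltk.sentiment import SentimentIntensityAnalyzer  # needs nltk.download("vader_lexicon")

def sentiment_by_year(df):
    """Mean VADER compound score per candidate per year, as a rough proxy for tone."""
    sia = SentimentIntensityAnalyzer()
    df = df.copy()
    df["score"] = df["tweet_text"].apply(lambda t: sia.polarity_scores(str(t))["compound"])
    df["year"] = pd.to_datetime(df["tweet_time"]).dt.year
    return df.groupby(["candidate", "year"])["score"].mean().unstack("year")

# Usage: pass a dataframe of election tweets with a "candidate" column (a hypothetical label).
```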

    Two months after Trump’s inauguration, an agency troll ironically tweeted, “#NeverTrump and Democrat back [sic] trolls are everywhere,” mocking the anti-Trump sentiment on social media.

    Another agency tweet criticised the Trump administration’s decision to introduce stricter work requirements for benefits recipients: “Trump/GOP moves to take food stamps away from millions of Americans. Boosting billionaires while busting the poorest.”

    These tweets exemplify the agency strategy to amplify divisions on both sides of the political spectrum, while downplaying the Kremlin’s involvement.

    “ADAM SCHIFF Admits why he’s conducting the sham #TrumpRussia investigation,” one agency troll commented as the “Russia investigation” gained pace, mocking the claims about Russian interference.

    But the agency’s tactics went beyond social media provocations, and involved the orchestration of real-world events like protests and rallies, as the US Senate Intelligence Committee has reported. These operations also targeted both sides of the political spectrum with the trolls posing as US political activists to manipulate Americans into organising and promoting events – thus creating a false grassroots campaign which heightened existing societal tensions.

    My topic analysis revealed that the trolls’ focus wasn’t random: it prioritised hot-button issues of concern to US voters, such as the economy, security, and immigration.
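    The topic-modelling method behind this analysis is not specified here. As an illustration only, a basic latent Dirichlet allocation (LDA) pass with scikit-learn can surface recurring themes, such as the economy, security and immigration, from a corpus of tweet texts.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def top_topic_words(texts, n_topics=10, n_words=8):
    """Fit a basic LDA topic model and list the most probable words per topic."""
    vec = CountVectorizer(stop_words="english", max_df=0.9, min_df=5)
    counts = vec.fit_transform(texts)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(counts)
    vocab = vec.get_feature_names_out()
    return [[vocab[i] for i in topic.argsort()[-n_words:][::-1]] for topic in lda.components_]

# e.g. top_topic_words(troll_sample["tweet_text"].fillna(""))
```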

    But while agency trolls attempted to “sound American”, their tweets were rhetorically rather dull. They had a more pronounced bias towards the extremes, focusing on inflammatory language and hot takes. This was not surprising, considering that elaborate rhetoric requires full competence in the target language, time, and effort – luxuries the trolls could not afford with their round-the-clock operation and strict quotas.

    My time-series analysis also revealed that the trolls’ interventions were strategic and time-sensitive, becoming more active during key election periods. Outside these periods, they targeted far-right media with outlandish conspiracy theories like Pizzagate, which falsely claimed the existence of paedophile and human trafficking rings linked to US Democratic party members. This tactic proved effective as their disinformation was easily picked up and circulated with minimal external correction.
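    One simple way to expose this kind of time-sensitivity, sketched here under the same hypothetical column names as earlier rather than as the study’s actual method, is to count tweets per week and flag weeks that sit well above the average.

```python
import pandas as pd

def weekly_activity(df):
    """Count tweets per week so surges around key election dates stand out."""
    weeks = pd.to_datetime(df["tweet_time"]).dt.to_period("W")
    return df.groupby(weeks).size()

counts = weekly_activity(troll_sample)  # troll_sample from the filtering sketch above
spikes = counts[counts > counts.mean() + 2 * counts.std()]  # crude surge detector
print(spikes)
```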

    These were all examples of the tailored messaging and astroturfing techniques used by the agency to target specific audiences and imitate genuine political activism. The agency used the same tailored messaging tactic and more during the pandemic.

    COVID-19

    The initial period of the pandemic was marked by fear and uncertainty, creating fertile ground for agency trolls to spread disinformation and conspiracy theories, exploiting people’s anxieties and intensifying confusion.

    In 2020, agency COVID tweets promoted falsehoods about the virus, its origins, and public health measures. They strategically avoided communicating messages about essential health and preventive measures. Instead, they focused on conspiracy theories, such as “COVID as a US bioweapon”, claiming the US created the virus in a secret military lab, and their own spin on the “Great Reset” conspiracy.

    Protester holding a placard reading ‘STOP THE GREAT RESET’ at the Vigil for the Voiceless anti-lockdown protest in London, March 2021. Shutterstock/JessicaGirvan

    One troll “foretold”, “The collapse of the #western centered world,” stating, “It is not humanity that is being destroyed, it is the #West, and the systems and values it has imposed on the #world.”

    Agency operatives also aimed to undermine public health efforts, exploiting technological anxieties and critiquing neoliberal economic policies. “The US is closer to adoption of an AI-driven mass surveillance system under the guise of combating coronavirus,” a troll tweeted, suggesting US government overreach in COVID containment measures.

    Surprisingly, the trolls used “authoritarian” as a self-description, comparing the perceived “effective” COVID response in authoritarian regimes like Russia or China to the “weak” and “incompetent” response in the west.

    This marked a departure from historical Russian state propaganda, which portrayed Russia as a “real democracy” compared to western “minority rule” democracies. By portraying modern autocracies as more effective in crisis management, the agency questioned the efficacy of democratic governance during crises, aiming to deepen mistrust in western institutions and promote pro-Kremlin narratives.


    Read more: Four experts investigate how the 5G coronavirus conspiracy theory began


    The trolls’ engagement with conspiracy theories spanned far-right (QAnon, “evil elites” manipulating the public) and far-left (The Grayzone, US government using COVID for profit) fringe communities, as well as general concerns across the political spectrum, such as the invasion of privacy by the state and the erosion of civil liberties. This revealed an opportunistic, multilayered approach.

    For Russian-speaking audiences, the trolls downplayed the seriousness of the pandemic at home while focusing on perceived failures of the US and the west, highlighting Russia’s economic resilience and avoiding health-related terms like infektsiya (“infection”) and profilaktika (“prevention”). At the same time, the “virus from the US lab” conspiracy theory was heavily promoted, shifting focus away from domestic problems using cold war-era “whataboutism.”

    Some of their Russian tweets attempted to be humorous and sardonic, mocking Russia’s own COVID response with memes, irony, and sarcasm to highlight inefficiencies. As with their English tweets, which leveraged existing conspiracy theories circulating in fringe western communities, the goal was probably to obscure the fact that the agency was behind the operation.

    Again, these were examples of the tailored messaging and repeated exposure techniques used by the agency to exploit the fears and uncertainties around COVID while consistently pushing falsehoods, such as COVID being a US-created bioweapon.

    Crimea

    It was clear to see how the trolls’ tactics evolved over the years. I began my analysis with the invasion of Crimea in early 2014 – a time when Twitter and other social media platforms were still relatively young. Back then, it seemed as though the trolls relied more on classic historical tropes to spread their lies.

    Following mass protests in Kyiv and the ousting of President Yanukovych, Russian soldiers seized control of Crimea. The region’s parliament later declared independence from Ukraine and a disputed referendum to join Russia was held. Election officials reported around 97% of votes in favour of joining Russia, but the referendum was internationally condemned as illegal.


    Read more: Ten years since its annexation, Crimea serves as a grim warning to any Ukrainian lands that fall under Russian occupation


    Some inside Russia doubted the official results, alleging election fraud. Despite the many sanctions imposed by the west, Russia maintained control of the peninsula.

    For English-speaking audiences, the agency operatives used euphemisms (mild terms for something harsh) and dysphemisms (harsh terms for something mild) to shape the narrative. For example, “return home” (euphemism) instead of “annexation” or “neo-Nazi regime” (dysphemism) instead of “Ukrainian government”.

    They framed the annexation as a “reunification” and a “restoration of historical justice” for Russia while portraying Ukraine as a “neo-Nazi regime” under western influence. This language aimed to undermine Ukraine’s sovereignty and promote the Kremlin’s agenda. Weeks before the Crimean referendum, a troll posted, “#Sevastopol yearns for mother #Russia”.

    “Western Powers Back Neo-Nazi Coup in Ukraine,” a troll tweeted in February 2014, echoing Russian state propaganda about the ousting of pro-Kremlin President Yanukovych. Another agency troll in January 2015 proclaimed, “Obama must stop supporting Nazis in Ukraine!” after President Obama condemned the annexation of Crimea and imposed sanctions on Russia.

    For Russian speakers, terms like vozvrashcheniye v rodnuyu gavan (returning to the home harbour) and vosstanovleniye istoricheskoy spravedlivosti (restoration of historical justice) were used to evoke a sense of historical continuity and legitimacy, making the annexation appear as a rightful correction.

    By 2015, the agency trolls began associating “Ukraine” with emotionally charged terms like “crisis” and “danger,” and by 2017, with terms like “Nazis” and “terrorists”. These strategic choices aimed to influence the negative perception of Ukraine while obscuring the reality of Russia’s annexation.
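    A shift like this can be traced by counting, year by year, how often emotionally charged terms appear in tweets that also mention “Ukraine”. The sketch below is illustrative only and reuses the hypothetical column names from earlier.

```python
import re
from collections import Counter
import pandas as pd

CHARGED_TERMS = {"crisis", "danger", "nazis", "terrorists"}

def cooccurrence_by_year(df, target="ukraine"):
    """For each year, count charged terms in tweets that also mention the target word."""
    years = pd.to_datetime(df["tweet_time"]).dt.year
    result = {}
    for year, group in df.groupby(years):
        counts = Counter()
        for text in group["tweet_text"].fillna(""):
            words = set(re.findall(r"[a-z]+", str(text).lower()))
            if target in words:
                counts.update(words & CHARGED_TERMS)
        result[year] = counts
    return result

# e.g. cooccurrence_by_year(troll_sample)
```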

    The agency even posted tweets with an apparently anti-Kremlin stance, since it wanted not only to generate engagement but also to polarise and confuse its target audiences.

    For Russian audiences, the trolls drew a line between “Russian” and “Ukrainian” identities. For instance, the Russian word for “reunification” (vossoedinenie) was used extensively to imply a voluntary and mutual return of Crimea to Russia, evoking historical associations with events like the Pereiaslav Council – a 1654 agreement under which Ukraine sought military protection from Russia. This was historically framed as “reunification” by Russia to suggest legitimacy and continuity for its claim to Ukraine.

    These reframing attempts persisted from 2014 to 2017, intensifying around key political events. In the Russian-language context, the agency trolls’ use of propaganda cliches far exceeded that of random Twitter users, who often employed these terms ironically.
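    A rough way to make that comparison, again only a sketch with hypothetical column names rather than the study’s actual method, is to count cliche phrases per 1,000 tweets in each sample; the transliterated phrases stand in for the Cyrillic originals a real analysis would match.

```python
import re

# Illustrative cliches (transliterated); matching real Russian tweets would use the Cyrillic forms.
CLICHES = ["vozvrashcheniye v rodnuyu gavan", "vosstanovleniye istoricheskoy spravedlivosti"]

def cliche_rate(df, phrases=CLICHES, per=1000):
    """Total cliche occurrences per `per` tweets, for comparing troll and control samples."""
    text = df["tweet_text"].fillna("").str.lower()
    hits = sum(int(text.str.count(re.escape(p)).sum()) for p in phrases)
    return per * hits / max(len(df), 1)

# e.g. compare cliche_rate(troll_sample) with cliche_rate(control_sample)
```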

    Despite ordinary users’ scepticism, the agency consistently followed the Kremlin’s agenda. Once again, this illustrates the role of repeated exposure as a state propaganda tactic, fostering the acceptance of even the most far-fetched state propaganda claims.

    The agency and Russia’s intelligence services

    Many in the west see Russia’s state apparatus as a monolithic entity, with state officials receiving direct orders from the Kremlin. However, researchers highlight the chaos and fragmentation within, with intelligence agencies like the FSB (Federal Security Service), GRU (foreign military intelligence), and SVR (Foreign Intelligence Service) often having conflicting roles and overlapping operations.

    Private contractors like the agency appear central to online trolling but are themselves a part of broader disruption efforts.

    The Kremlin may set agendas for these operations, but detailed coordination and a “masterplan” are unlikely. The outcome, however, usually favours the Kremlin. If the operation creates enough confusion and disarray – even if the Kremlin is blamed – it can boost its image as a strong player with a long reach.

    In all this “information disorder”, defeating state trolls requires confronting a sobering reality: both state and non-state actors increasingly engage in organised disinformation on social media, while platforms struggle to address the challenge effectively – despite having the technology to do so, including user reporting and moderation, content transparency tools, and third-party content review.

    While many view Russia as a prime example of state-sponsored online disinformation and trolling, it is joined by dozens of other countries, including several in the west. In 2020, Oxford researchers identified 81 countries that used social media to spread propaganda and disinformation.

    The key difference is the degree of state involvement and the scope of targeted countries. In many western countries, these efforts are sporadic and are led by politicians, political parties, private contractors, and influencers, targeting domestic audiences. In contrast, Russia’s campaigns are persistent, more centralised, and target many overseas countries.

    While Russia has developed a sophisticated approach to disinformation and propaganda, in many ways it has acted as a learner rather than an innovator, adapting and contextualising existing techniques while enjoying operational freedom thanks to a lack of the accountability found in liberal democracies.

    The building in Savushkina Street, St Petersburg, Russia, that was once home to the Internet Research Agency. Charles Maynes/Wikimedia Commons

    One high-profile example of using these digital authoritarian techniques in the west is the US military’s anti-vax campaign to undermine China’s Sinovac vaccine, launched under President Trump and continuing several months into Joe Biden’s presidency. Like the agency trolls, this campaign used hundreds of social media accounts to impersonate users in other countries, reportedly targeting Southeast Asia, Central Asia, and the Middle East.

    This does not in any way justify Russia’s systematic and widespread trolling operations against political opponents and democratic institutions, but raises concerns about digital transparency and trust in modern democracies.

    Despite increased awareness, the evolution of technology and the adaptability of state-backed disinformation means this threat is unlikely to diminish soon. Faking user activity and engagement on social media is becoming easier and quicker.

    Platforms such as X often limit the visibility of violating content instead of removing it, so phoney accounts might not get banned promptly. Even when reported, platforms often take minimal action.

    To address this complex and persistent challenge, there must be more transparency, ongoing vigilance from users and the media, and timely action from the big tech platforms.

    AI and a new troll leader

    According to reports, months after Prigozhin’s death his “troll factory” remained active, continuing to push pro-Kremlin views on the full-scale invasion of Ukraine. It is not clear who works for the latest version of Russia’s troll factory, as reports suggest it is now highly fragmented and consists of multiple organisations, each with its own hiring criteria.

    The identity of the new troll leader is also unclear. But there is speculation about increased involvement from Russian intelligence and the Kremlin.

    These efforts primarily target the west’s support for Ukraine, involving dozens of trolls and translators, as revealed by leaked internal Kremlin documents. Ilya Gambashidze, a little-known but seemingly well-connected political strategist from Moscow, is often cited as “Prigozhin’s successor”. Gambashidze’s role in online disinformation campaigns as head of the political PR firm Social Design Agency (SDA) has placed him on a US sanctions list.

    As artificial intelligence (AI) technologies evolve, state-sponsored disinformation is likely to use more sophisticated content generation and dissemination methods. OpenAI, the creator of ChatGPT, recently uncovered two covert influence operations originating in Russia. These operations used OpenAI tools to generate social media content – mainly short political comments – in multiple languages on platforms like X and Facebook. OpenAI claims to have identified and disrupted these operations after they failed to go viral.

    Although AI has allegedly increased the productivity of these operations, they remain prone to human error, such as posting messages that reveal the use of a particular AI model.

    In early September 2024, the US Department of Justice reported the disruption of a covert Russian state-sponsored operation that used fake websites and social media profiles to spread AI-generated disinformation. This disinformation targeted support for Ukraine, promoted pro-Kremlin narratives, and attempted to influence US voters ahead of the 2024 presidential election. The operation was traced to Russian firms that appear to be well-positioned to continue the agency’s operations.

    This trend is concerning. The use of AI to create realistic fake content, combined with advanced data analytics to target specific audiences, will pose significant challenges to efforts aimed at countering disinformation.




    Maksim Markelov does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    This article is republished from The Conversation under a Creative Commons license.
    © 2024 TheConversation, NZCity
