
    I tried to outsmart the social media algorithm. Here's how it outsmarted me

    Armed with a burner phone and a fake name, I set out to learn how Instagram’s algorithm really works. This is what happened next.



    At some point — let's say a Wednesday night around 8:40pm, too late to work but far too early to sleep — I find myself watching someone spread toothpaste and butter on a cracked phone screen.

    Scroll.

    A pigeon with a hole in its neck pecks at a plastic water bowl.

    Scroll.

    A woman, her face and hands coated in a shiny paste the colour of Vegemite, runs the back of a butter knife along her left cheek, revealing a patch of clear, "poreless"-looking skin underneath.

    There's something strangely intimate about sharing the content of your social media feed with others. It feels like exposing a secret part of yourself — your idle curiosities, guilty pleasures, the things you like to click on when no one else is watching. Our feeds can often feel like private mirrors, reflecting back a version of ourselves others rarely get to see.

    On this particular occasion, though, it wasn't exactly my reflection I was staring at. These clips belonged to a sock puppet account I had created on Instagram using a factory-reset burner phone. It was part of a month-and-a-half-long experiment I was running to figure out how social media apps curate the unique digital worlds we occupy.

    The motivation to run the experiment arrived on a separate evening I spent mindlessly scrolling through my genuine Instagram feed. I was struck by the feeling that the videos were a poor (and sometimes shameful) representation of who I am. As a journalist — with what I hope are diverse interests in world politics, arts and culture — my feed ripples with banal videos of celebrities, makeup tutorials, and puppy hijinks.

    Put another way, if my Instagram feed were a diet, it would be one riddled with ultra-processed junk food, accompanied by almost no vegetables or greens. All pleasure, zero nutrition.

    I know I'm not alone with my anxieties.

    With Australians spending an average of about two hours a day on social media, the potential harms of these apps to our lives are under scrutiny. Australia's ban on social media for children under the age of 16 starts next month, a move the European Commission's President has praised, while the United States' former surgeon general recommended adding warning labels to social media platforms.

    Why do social media apps show us certain content over others?

    Are they a true reflection of the unique interests and values that define us — or are the algorithms driving our social media feed somehow pernicious, drip feeding us an endless slurry of addictive brain rot?

    Can we wrestle back control?

    My social media diet

    My social media experiment attempted to answer these questions.

    By tracking how my fake feed responded to deliberate behaviour on and off the platform, I wondered if I could unravel how apps like TikTok, Instagram and YouTube decide what to show us — and how they trap us in a loop of provocative or misinformed content. Maybe I could even coax my account into reflecting a more nourishing version of myself.

    So, I built a sterile digital environment, untethered from my real life. I wiped the contents of an old phone clean, randomly generated an alias — "Tank Promotion" — and created a matching email account. After buying a new SIM card to verify the device, I downloaded Instagram: my platform of choice to run the experiment since I was most familiar with its interface.

    Other apps I downloaded — Google, YouTube and more — would serve as tools to more subtly manipulate the signals Instagram uses to curate its feed.

    Stage one of the experiment could begin.

    Stage one

    I started scrolling. The content seemed random, unpredictable. With no existing information to draw on, my page was a menagerie of interests — videos of gooey sauces being mixed into steaming pasta, of men holding fish, of kittens, of idyllic landscapes.

    Instagram operates with a basic motive: to ensure you stay on its app as long as possible. Finding out what holds our attention also keeps social media companies profitable.

    Most users arrive with existing friends and networks, allowing the platform to guess their likes with decent accuracy. Faced with a blank profile like mine, Instagram acts like a parent feeding a fussy child, serving up a plethora of tastes until it strikes the one that's gobbled up with glee.

    "When you start with a vanilla profile, you have this diverse content pool and the algorithm is trying to gauge what this user is interested in," says Mohammad Haroon, a former Meta intern and doctoral student in computer science from UC Davis whose research focuses on how content is recommended to you online. "They have a bunch of different feedback mechanisms … like if you watch a reel a second longer than you watch something else."

    At this early stage of the experiment, I was determined to limit my feedback. I made sure not to linger on any one piece of media. I watched each video in its entirety and quickly scrolled to the next. I kept my movements steady and rhythmic.

    Despite my attempts, Instagram kept unlocking parts of me I tried to keep hidden. A steady mix of AFL content and hyper-local news stories suggested the app knew I lived in Melbourne. I checked my phone's location settings, and sure enough, Instagram had recently accessed this information.

    Other additions to my feed were harder to explain.

    One video featured a woman in a sari frying up South Indian dumplings called kuzhi paniyaram — an obscure dish my mum would make when I was growing up. On another occasion, photos of kittens overtook my page, some of them eerily similar to my friend's cat. He'd come for dinner just days earlier.

    Were these coincidences, or did this phone know who my family and friends were?

    Social media is stalking us

    Professor Daniel Angus introduced me to one possibility that gave me chills. With access to our location data, social media platforms suggest profiles and content to us based not just on where we are, but on who we're with.

    "It's going to be using that geolocation [to] profile you and your tastes and likes and wants," says Angus, the director of the Queensland University of Technology's Digital Media Research Centre.

    "It's not individual targeting, it's associative logics now — your tastes, behaviours, and connections mapped as coordinates within a larger field of data relations."

    Associative logic works like this: social media systems see your phone as one node in a vast network of devices. When you spend time in the same place as certain other people — say at work, home or out with friends — these connections are reinforced. The system begins associating your account with theirs. This information shapes the content you see online.
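    No platform documents its device graph, so what follows is only a toy sketch of that associative logic: repeated co-location links two devices, and a neighbour's interests begin leaking into your suggestions. Every device name, place and interest here is invented.

```python
from collections import defaultdict

# Invented sightings: (device_a, device_b, place) observed together.
sightings = [
    ("my_phone", "friends_phone", "home_wifi"),
    ("my_phone", "friends_phone", "cafe"),
    ("my_phone", "colleague_phone", "office"),
]

# Each co-location strengthens the link between a pair of devices.
edges = defaultdict(int)
for a, b, _place in sightings:
    edges[frozenset((a, b))] += 1

# Interests already profiled for the neighbouring devices.
interests = {"friends_phone": {"kittens"}, "colleague_phone": {"AFL"}}

def suggested_topics(device, min_weight=1):
    """Collect topics from devices linked to this one strongly enough."""
    topics = set()
    for pair, weight in edges.items():
        if device in pair and weight >= min_weight:
            (other,) = pair - {device}
            topics |= interests.get(other, set())
    return topics

print(suggested_topics("my_phone"))  # {'kittens', 'AFL'}
```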

    "Think less about what you were doing individually on the device and more about what everybody else around you is doing on theirs," Angus advises.

    This model may explain the spooky experiences many of us have on social media. Meeting a stranger at a party, then finding their profile on your "suggested friends" list. Seeing an ad for a product your partner recently purchased. Or, in my case, finding content on Tank Promotion's Instagram account that relates to parts of my personal life.

    "You need to remove this idea of you being one single isolated point in a space," Angus says. "You are the product of not just you, but every other phone and device in your environment."

    I will never know for sure why certain videos turned up in my feed. But the experiment was already making one thing clear: the media we see online is tied to the people and places around us, even when we are not using the apps at all.

    Faced with this reality, Australia's incoming social media ban offers the comfort of action, shielding users under the age of 16 from the omnipresence of targeted content.

    But, as Angus warns, it may also distract us from answering deeper issues: will such a ban address how social media companies harvest data, including that of their youngest users? Will the online behaviour of adults or older siblings still shape what children see on their devices? And are these models inherently harmful to young people, if they simply mirror a world that's already around them?

    Enter 'the algorithm'

    In social media's early days — some 20 years ago — your feed contained only posts from accounts you followed, arranged in the order they were shared. You could quickly reach the end of your feed, a place where there was nothing more to see. The result was a more user-controlled, though restrictive, experience.

    As social media companies looked to include more advertising, they began curating our feeds without us actively looking for content. They adopted recommender systems — now called "the algorithm" — that act like invisible strings pulling us toward certain content.

    The "collaborative model" pioneered by Netflix was one of the earliest recommender systems: if User 1 liked movies A, B and C, and User 2 liked movies A, B, C and D — the algorithm would suggest movie D to User 1.

    As the models became more sophisticated, social media companies designed systems that learned directly from an individual's behaviour. Every deliberate action on a post became an "explicit signal", as computer scientists call it, acting like a weighted vote for content you want more of.

    Most of us understand that if you consistently like, comment on and share videos of puppies, for example, the algorithm will rank puppy videos higher, and they will show up more regularly in your feed.

    But we're not always reliable reporters of our own preferences. To appeal to our deepest desires, social media companies started tracking our "implicit signals" — how long you gaze at a video, when you close the app, how quickly you scroll through your page. This all feeds the ranking system, helping predict what will keep you scrolling.
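    Put together, a feed ranker might score each candidate post as a weighted sum of those signals. The signal names and weights below are invented for illustration; real platforms learn their weights from engagement data rather than hand-tuning them.

```python
# Invented weights: explicit signals (deliberate actions) count more per
# event, but implicit signals (passive behaviour) accumulate constantly.
WEIGHTS = {
    "liked": 3.0,
    "shared": 5.0,
    "commented": 4.0,
    "watch_seconds": 0.5,
    "rewatched": 2.0,
}

def engagement_score(post_signals):
    """Predicted engagement: a weighted sum over observed signals."""
    return sum(WEIGHTS[name] * value for name, value in post_signals.items())

candidates = [
    {"watch_seconds": 12, "rewatched": 1},  # lingered on, never liked
    {"liked": 1, "watch_seconds": 3},       # liked, then quickly skipped
]

# The feed surfaces the posts predicted to keep you scrolling longest.
for post in sorted(candidates, key=engagement_score, reverse=True):
    print(engagement_score(post), post)
```

    Note that in this toy example the post you merely lingered on outscores the one you actually liked — the implicit signals dominate.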

    Haroon says TikTok is particularly advanced in interpreting these "implicit signals". No surprise, then, that the platform is also particularly addictive.

    At least, this is what we think we know about algorithms. Modern platforms use AI and deep-learning methods, where systems are trained on masses of data and programmed to optimise our feeds for high engagement. Haroon calls these systems a sort of "black box", where it's hard to understand exactly why the model spits something out at you. He says even experts on the inside are finding it increasingly hard to understand what's happening.

    That's where we're left today: a recommendation system that is almost entirely data and engagement driven, without reliable methods to automatically block harmful content. Many of us now feel stuck on a conveyor belt of content intricately designed to appeal to our specific interests. And no one can fully explain how it happens.

    My experiment was a chance to pull back the curtain.

    Manipulating my signals

    To examine the power of these systems, I wanted to see if I could convince my new Instagram account that I was someone else — say a mother, with a newborn baby.

    I tested a popular conspiracy theory: that your phone's microphone listens in. I opened my laptop, typed "new mother" into YouTube's search bar and let it auto-play a variety of associated videos. I took advantage of YouTube's own recommendation system by allowing it to cycle through suggested videos while I was at work. I put my burner phone right up against my laptop speaker, and left.

    Once I returned home, I eagerly checked my Instagram feed. No new baby-related content appeared.

    On its website, Meta says that it does not listen to our conversations through the microphone. Perhaps it doesn't need to — perhaps our behaviour on our devices speaks loudly enough.

    Time to ramp up my signals.

    I downloaded a suite of pregnancy and baby-tracking apps. I added fake data. I pretended to track the rituals and schedules of my fake baby (who I christened "Tanky").

    A few days later the first video waiting for me in my Instagram timeline was a cartoon of a woman in labour.

    Over the next week, I noticed more videos and thumbnails of infants and mums. By no means was my account flooded with baby content — so I introduced new variables to see if I could magnify the results. I Googled "best baby carriers". I joined new mother groups on Facebook. I followed subreddits called "NewMomStuff" and "pregnancy_care".

    My account continued to shift, but in more unexpected ways.

    I began to get semi-pornographic content. Busty women rolling around their beds. Animations of maids gyrating suggestively back and forth alongside a green hotdog-like character. At times, my suggested friends' profile pictures consisted of several women with their heads cropped out, their curvaceous bodies pouring out of lingerie. Even the motherhood content seemed somehow salacious — a man squeezing a woman's breast to fill his mug with milk.

    I suspect two possible reasons. I first noticed the change while I was using free Wi-Fi in a London cafe. Was there something about my location, or about others in the cafe, that prompted the algorithm to serve me adult content?

    Alternatively, a simpler reason: sex sells. Porn is ubiquitous online — it's estimated that one in three young Australians encounter pornography unintentionally before they turn 13, often through social media.

    I was about three weeks into my experiment, and I hadn't clicked, liked or interacted with any content. Perhaps, in an act of desperation, the platform was sending me the posts that kept many of its users engaged.

    Researcher Haroon finds this theory plausible.

    "When you optimise for [engagement]," he says, "you start seeing content that could be misinformation, controversial, and content that can be shocking, that keeps you engaged because of morbid curiosity".

    After a few more days of watching my feed slide into filth, alongside some motherhood and other content, I decided to give Instagram a more obvious push. I selected a few thumbnails on my Explore page that looked vaguely child-oriented: photos of toilet rolls crafted into animals, a cake shaped like the number six, an infant's walker.

    Within seconds, the account bloomed with baby gear and motherhood content. My Reels feed became a constant stream of chubby infants. In under eight minutes, Instagram had re-imagined me as a parent.

    The behaviour that may have led to me being classified as a new mother was relatively benign, and, I imagine, useful for someone who really did have a new baby.

    But soon, that same profiling logic would push me toward something far more toxic.

    Entering the world of the teenager

    A few weeks later, my phone began insisting I follow an all-meat diet and massage my husband every night. For a single woman and life-long vegetarian, this was an odd development.

    I wanted to push my feed away from the motherhood content that now dominated, into the world of teenage boys.

    On the advice of the children of friends, I downloaded Snapchat, TikTok, Spotify, Compass School Manager, and a few mobile games. On Instagram's platform, I followed gaming accounts like "Brawl Stars" and "PlayStation".

    In three days, ads for gaming devices appeared, alongside prank videos, tips on using AI to breeze through homework and more suggestive content — a young man testing a "honey pack" on his girlfriend to encourage her to have sex with him.

    This stage also marked the first time I saw a video related directly to current affairs. A young woman and man re-enacted the scene leading up to the stabbing of Ukrainian refugee Iryna Zarutska in the United States earlier this year. In this re-enactment, police arrest the man before he can approach the woman.

    "Image embeddings" is a method that helps AI-driven recommender systems measure the visual similarity between two pieces of content. Every frame of a video is translated into numbers representing patterns of colour, shapes, themes, motion, and hundreds of other dimensions. Those numbers become points in a vast digital map, where videos that look or behave alike are placed closer together. 

    On top of that, engagement optimisation algorithms try to maximise the time we stay watching, based on how we, and similar users, behave.

    I can't know for certain, but this could explain why the algorithm might group gaming videos, teen pranks, videos about sex, and a re-enactment of a news event. To Instagram's deep-learning system, they might all appeal to similar demographics, feature fast movements or expressive human faces, or share camera angles and colours that the algorithm maps side-by-side — even if, to human eyes, they look remarkably different.

    This phenomenon became clearer during the final phase of the experiment, where I nudged my feed toward "manosphere" content.

    An ideological filter bubble

    I stopped following the gaming accounts and began exploring corners of Instagram linked to young men. I searched for the popular podcaster Joe Rogan. On YouTube, I subscribed to Rogan's channel and commented on his videos. I did the same for other accounts in YouTube's recommended list — namely Fox News and Sky News Australia.

    My Instagram feed didn't budge — not yet, anyway.

    The next day, I watched Rogan's videos directly on Instagram. One showed him interviewing a human biologist named Gary Brecka. I checked out Brecka's account too, watching a video of him plunging into an ice bath and another where he makes a smoothie.

    I returned to my Reels feed. What I found there shocked me.

    In three swipes, I landed on a video from TopGInspired. It featured controversial internet personality Andrew Tate talking to a blonde woman. In the video, Tate applauded men as being "good at suffering" and called women "spiteful sufferers".

    I had not searched for Tate, a man who has been accused of rape and inspiring a generation of misogynists, on or off Instagram. I wasn't following his accounts — or any, for that matter. Yet here he was.

    Tate wasn't the only bigoted commentator I met that day. Sean Strickland, a mixed martial artist, appeared shortly after, wearing a T-shirt that said: "Woman in every kitchen. Gun in every hand." He was talking to a man off-screen, who called himself an ally of the gay community. Strickland responded saying "there are two genders" and called the man "weak", an "infection", and "the fucking enemy".

    Over the next week my feed entered a universe of anti-vaccine screeds, odes to all-meat diets, information on how to increase testosterone and why nicotine is good for you, alongside more clips of Tate railing against feminism.

    These videos seemed to carry a distinct ideological tilt, and I was stunned that a handful of actions, both on and off the platform, were enough to trigger a wave of hyper-masculine, conspiratorial content.

    Haroon, along with his colleagues at UC Davis, found that recommendation engines can funnel users towards progressively more extreme and polarising content. They ran an experiment with 100,000 sock puppet accounts on YouTube, each with an assigned political ideology. More than 36 per cent of these accounts received extremist video recommendations. For right-leaning accounts, that number was closer to 40 per cent.

    "These more extremist channels — which are usually white-nationalist, anti-woke, anti-feminism — appear more often for people who are not necessarily watching these channels explicitly, but they are engaging with right-leaning content," Haroon says, adding this study showed "some degree of radicalisation" was happening because of the recommendation systems used by social media companies.

    "We showed personalisation is causing you to be in an ideological filter bubble," Haroon says.

    The question is, how do you escape?

    Finding the escape hatch

    On a grey Melbourne afternoon, I clicked on an Internet and Technology Addicts Anonymous (ITAA) Zoom meeting. A gallery of boxes filled my screen — some blank, with just their first name, others with faces from around the world.

    ITAA is a support group for people struggling with internet addiction. For many of its members, social media has become a vice, trapping them in an addictive bubble that they can't escape alone. ITAA follows a 12-step recovery model based around mutual support and spiritual guidance.

    Several spoke about the compulsion to scroll, others about how online addiction has had serious consequences for their work and personal lives. Many expressed gratitude for being part of the group and for their journey to recovery.

    "My gateway drug was internet addiction," explained the founding member of ITAA's Melbourne chapter, a young man who prefers not to use his name. "It was YouTube, and then just watching one video after another, boom, boom, boom, boom."

    Entire nights were lost surfing YouTube and other platforms, falling deeper down the rabbit hole of recommended content. Even while signed out or on different devices, he felt pulled back.

    Peer support through ITAA helped him. So did engaging with his religious community. He also uses special programs on his phone that block social media platforms and the sites he finds particularly addictive.

    My experiment showed we can take deliberate steps to develop a healthier relationship with social media. We can spend time with people and places that reflect the content we want to see. Engage with posts that uplift us and actively block those that don't. Algorithms reward engagement, so the more consciously we participate, the more consciously our feed responds.

    I tried following accounts on my experimental Instagram account that better reflected my interests — the bands I liked, the hobbies that excited me, the news sites I trusted. But, often, I found myself drifting back to the extreme and conspiracy-laden videos that now littered my feed. I felt like a spy entering a digital universe I didn't usually occupy, and I was hesitant to give up this access.

    Yes, escaping these worlds curated by engagement-hungry algorithms takes effort. But after this experiment, I feel I have a few new tools available to outsmart them.

    Maybe I'll do that later. For now, I continue to scroll.

    Credits

    Words: Prianka Srinivasan

    Editor: Catherine Taylor

    Illustrations: Gabrielle Flood

    © 2025 ABC Australian Broadcasting Corporation. All rights reserved
