Like almost all her friends, teenager Kai Warner has been using Instagram, TikTok, Facebook and Snapchat since she was 11.
The 14-year-old says she used to spend more than three hours a day scrolling through videos and chatting to friends on her phone.

"Without having to face anything on the outside, it's all right there," she says.
Kai, from Bendigo in Victoria, is one of a growing number of young Australians questioning their own use of social media amid concerns that inappropriate, algorithm-driven content is becoming increasingly widespread and harmful.
Ten early-career journalists from around the ACM network have teamed up to examine the often overlooked impact of social media on regional Australians in a new Young & Regional special series, The Algorithm Decides.
The investigations paint an alarming picture of regional communities and individuals battling misinformation online and ill-equipped to protect themselves from the real-world impact.
Reporters have been told of children as young as one spending up to 30 per cent of their waking hours online, and of problem gamblers and underage children being routinely served gambling ads and links to porn sites. Psychologists warn of AI-driven bots fuelling dangerous eating habits, and of verified, reputable information sources being replaced with factually incorrect information designed to stoke conflict.
Meta, owner of Facebook and Instagram, says it is investing heavily in protections designed to keep children safe, including making it harder for children to get around age restrictions and making it more difficult for adults to find and contact under 18s via their platforms.

"In addition to removing content that violates our policies, we also aim to make it harder for teens to see potentially sensitive content. For example, we have a sensitive content control that's defaulted to the most restrictive setting for people under the age of 18," Antigone Davis, Meta's global head of safety, told the Joint Select Committee on Social Media and Australian Society in early September.
To take the pulse of how Australians view social media algorithms, the Young & Regional reporters spoke to Australians aged 14 to 69 from across the country about their digital lives.
Tailor-made digital life
According to the Australian Communications and Media Authority, 99 per cent of Australian adults have access to the internet and 86 per cent use a mobile phone to go online multiple times a day.
Almost 70 per cent of children aged 12 to 17 have Snapchat or TikTok, according to a report on the online safety of young people by Sydney University.
As debate rages in federal parliament about raising the age for social media access and limiting online advertising for gambling, government committees and the nation's security watchdog ASIO are warning of increased risks of home-grown radicalisation fuelled by hateful online content.
After three years on social media, Kai says she became aware the content being presented to her was tailor-made to keep her scrolling.
Realising her screen time was getting out of hand, she downloaded a blocker app to her phone in an attempt to regain some control.
Impact felt far and wide
Teachers, psychologists, speech pathologists and occupational therapists have warned it's not only tweens and teenagers being exposed to dangerous messages on social media. Some experts interviewed have suggested a link between addiction to scrolling and skyrocketing numbers of ADHD diagnoses in children younger than school age.
But it's not just health professionals raising the alarm. In May the federal government announced a Parliamentary Select Committee looking into the influence and impact of social media on Australian society. The NSW and South Australian governments will also host a first-of-its-kind joint social media summit in October to examine growing concerns about the effect of social media on the mental health and social development of young people.
By the time Australian kids reach high school, they're likely to be using social media to message others and encounter explicit material, according to the Australian Centre to Counter Child Exploitation report.
'I could look like that, why don't I'
Hailey Gillard is wary of bots and fake accounts that send explicit or scam content on social media.
"Several of my friends have bots messaging them with links to (online porn site) PornHub," she says.

The 18-year-old from Wagga Wagga, NSW, says she also has trouble telling whether content on Instagram is real or trustworthy.
According to the University of Canberra's Digital News Report 2024, 60 per cent of Gen Z use social media as their main news source, despite Facebook and Instagram having de-prioritised news content on their platforms.
In Ms Gillard's age group, many young people are fed content about dieting, health and fitness from social media, and some begin following advice that has no scientific backing, she says.
"They've started all the workouts, they've been starving themselves, it gets pretty bad.
"I don't think people talk about it enough, it does turn into a bit of a shame thing.
"Seeing the bikini ads and the models, it does make me think, 'I could look like that, why don't I?' It becomes an obsession, trying to look like girls you see in the reels and ads."
According to the most recent census data, nearly 18,000 women and girls have experienced long-term mental health problems by the time they reach the age of 17, 1.7 times the rate for boys and men.
The Sydney University study also shows two-thirds of parents speak to their children about their online behaviour at least once a week, but only a fifth sit with their children while they use the internet, prompting concerns that parents aren't aware enough of what their children are exposed to.
Work against the algorithm
Newcastle 25-year-old Tim Hearne says he has cultivated an endless stream of sports videos that keeps him hooked.
"Once you start watching a couple of things of one creator, you start to get all that content," he says.
In the past, he tried to manipulate his algorithm to feed him funny content.

But he now tries to work against the algorithm to make sure he's not receiving explicit content, particularly sexual content.
Tony Lovett from Ballarat, Victoria, says he found his social media feed suddenly filled with messages encouraging him to gamble after downloading betting apps to his phone.
"I found it was getting out of control. I just needed to take control of gambling."
After deleting the online gambling platforms from his phone, 55-year-old Mr Lovett says the gambling ads suddenly stopped appearing in his social media feeds.

A 2023 federal parliamentary inquiry recommended a comprehensive ban on online gambling advertising within three years, but recent signals from the federal government suggest it may instead seek to limit the time slots and number of gambling ads that can be displayed on screens.
A future beyond people and pics
For 69-year-old Gloria Quick from Kootingal in NSW, Facebook is a place where she can connect with friends and family.
"I like posting photos when we're away on trips and things, so people can see what we're up to," she says.

"It's a great way to keep in touch with people and make contact with people you haven't seen in a while or are far away from you."
But social media is changing, and Mrs Quick is concerned by the onset of AI.
AI was added to Meta's platforms in early 2024, allowing users to chat with a bot and generate images. Other developers are also working to create AI recreations of dead people.
An earlier ACM investigation found Meta's AI chatbot making up facts and distorting the truth as it sought to answer questions from users.
"I don't like anything artificial intelligence," Mrs Quick says.



