It's the day we've all been waiting for, and by "we", I mean parents of children between the ages of 10 and 15.
And, of course, the dozens of experts, pundits, commentators, parents, children, young adults, politicians and general naysayers who have lined up to tear down the ban well before the official implementation date.
That date, December 10, has been anticipated and picked apart for the past few months, as the country moves towards the first steps in a world-first social experiment that could reshape an entire generation.

I haven't been paying attention - what does it all mean?
From Wednesday, children under the age of 16 will no longer be able to create or hold an account on a range of social media platforms. Existing accounts will be disabled - many have been already in the lead-up - and the platforms subject to the ban will have to take reasonable steps to ensure account holders are aged 16 or over.
Which platforms are covered by the ban?
So far, the list includes Facebook, Instagram, Kick, Reddit, Snapchat, Threads, TikTok, Twitch, X (formerly Twitter) and YouTube. Children will still be able to access publicly available content via these sites, but won't be able to set up or maintain accounts.
The list is likely to change, with new sites added as the effects of the ban take shape. So far, Meta, TikTok, Snapchat, Reddit, Twitch and Kick have all said they will comply with the ban.
Why these platforms?
The eSafety Commissioner, the regulator tasked with implementing the policy, is focusing on the services with the greatest number of Australian users under the age of 16, and will be watching how those services use features and functions associated with risks of harm to children.
Why the ban?
Well, a word about language: the eSafety Commissioner prefers to call it a "delay" to children's access to social media. The restrictions aim to protect under-16s from the pressures and risks they can be exposed to while logged in to social media accounts.
Many of these platforms are designed to keep people on screen for as long as possible, which for children is too long. Seemingly unending streams of notifications, alerts and disappearing content increase the likelihood of children being exposed to negative, upsetting or manipulative content.
These have all been linked to harms to health and wellbeing - including increased stress levels and reduced sleep and concentration, as well as cyberbullying and exposure to pornographic or violent material.
The idea is to give younger children time to develop digital, social and emotional skills before they're old enough to manage these platforms.
This is ridiculous - won't kids just find ways to get around the ban?
Yes, almost definitely, and the government is putting the onus on the very rich and powerful platforms listed above to find ways to keep ahead of them. These include preventing under-16s from faking their age with false IDs. It's usually quite obvious when a user is under 16, because of who they interact with, the content they view and how they engage with it. This is why social media is so insidious - it just knows so much about us.
Won't they just go on other apps?
Probably, but the Commissioner had to start somewhere. In the meantime, Australia has in place enforceable Online Safety Codes and Standards that require a large number of online services to implement safety protections to prevent and remove the most harmful illegal content and activity, including the sexual abuse of children and the encouragement of terrorism.
The codes and standards also require the online industry to prevent children from accessing or being exposed to age-inappropriate material such as pornography, and to ensure that all users have effective tools, information, and options to limit access to harmful content.
What about VPNs?
Has the eSafety Commissioner even thought of them?
It's been a full year since the legislation was passed, so you'd hope so. Again, this will be a problem for the platforms, which can use location-based signals to help work out whether an account holder usually lives in Australia and could be using a VPN to pretend they don't. These can include IP addresses, GPS or other location services, device language and time settings, and an Australian phone number.
How about parents?
Shouldn't they be part of the ban?
Despite the onus being placed fairly squarely on the social media platforms to keep under-16s off them, families have a huge role to play in this ban if it's to become anything other than a sweetly naive pipe dream we all once held back in the heady days of 2025.
For starters, Apple devices have family sharing features, which means parents can have full control of their children's phone use, including when they're on it and how long they spend on apps.
Children are crafty when it comes to protecting their time on the apps - there are reports of children simply changing the times on their devices, signing in as their parents, or wearing masks during age verification - it's all very clever and very predictable. But as of Wednesday, it will be completely legitimate to tell children that it is, in fact, illegal for them to be on the listed platforms, and so they'd better get off them or else.
So much of the opposition to the ban has been fuelled by our FOMO-driven society - that's Fear Of Missing Out, for those truly out of the loop. Whether it's kids worried they're missing out on what all the others are seeing and doing, or parents concerned about their children being socially excluded, the fact remains that social media platforms have us all right where they have always wanted us.
But if as many parents as possible are on board with the ban, it might just get this cohort over the line for the next few years. And there'll be an entire new generation - the ones who aren't yet on the apps - who will grow up without the same expectations.
The kids will probably be OK - right?
Some might not be. Those who are particularly addicted to their phones will find the ban challenging, as will their families. Many children use social media as their main way of connecting and getting support, including children who are neurodiverse, introverted or disabled, who are away from school for health reasons, who have family and friends overseas or face language barriers, or who simply have trouble being around people.
Others will be fine - better than fine, even. Imagine the blissful silence once those endless notifications are switched off, for all of them, all at once. We know, having spoken to young people who will be directly affected by the ban, that at least some are open to the possibility of their lives being changed, if only for the summer.
We know, too, that many parents are on board with the ban, if only so that they can tell their kids, with full authority, that their being on social media is now against the law.
"I didn't make the rules," they can say, throwing up their hands.
Other parents are against the ban, in principle or in practice or both, and plan to help their children circumvent it if necessary. Still others simply don't care.
But it should be obvious by now that these platforms all contain material that is harmful to young people, and until recently have shown no inclination whatsoever to mitigate these harms. Unlike, say, driving or drinking, the advent of social media into our lives has been a real-time journey we've all been a part of, and it's now clear that kids are bearing the brunt of forces we didn't clearly understand even just a few years ago.

