Whether your news diet is national broadcasters, online outlets or newspapers, politicians of seemingly all stripes and levels of government are in a fit about young people and social media.
If focused correctly on the serious, market-based sources of harm, this energy has the potential to generate lasting reforms that safeguard the security and wellbeing of young people for generations to come.
Executed poorly, it risks embedding a flash-in-the-pan moral panic into measures that are unreasonably punitive towards young people and continue to let the industrial-grade corporate harm producers (big tech, adtech, and everyone in between) off the hook.
In other words, attempting to intervene in individuals' use of devices without contemplating the underlying personal information that feeds the beasts of algorithms and advertising systems conjures images of twigs plugging dams.
This is not to dismiss the credible harms of extended screen time and technical features like endless scroll and dark patterns. But we do need to take a minute to consider who and what we're actually trying to regulate here.
Is the problem the devices themselves and the people who use them, or everything that's on them, and the features that grant them the dual roles of both addictive candies and extractive data-hungry strangers?
Missing from the public narrative is a powerful and genuinely impactful package of policy reforms that could go to the heart of the social media business model and guarantee regulatory oversight of children and young people's data protection.
Like any complex issue of tech and social policy, these are not either/or solutions, but contemplating blunt instrument end-user bans without progressing advanced upstream market regulation proposals is foolish.

One of the legacy pieces the Albanese government was handed from its predecessor was an advanced body of work on Australia's nearly 40-year-old Privacy Act. The Albanese government provided a response to the Review Report last year.
Regulatory impact assessments kicked off this year. Against the grain of industry lobbying and costings scaremongering, a proposal for a regulator-drafted Children's Privacy Code survived stage after stage. But in these final sitting weeks of policy horse-trading, an uncontroversial yet vital piece of reform may once again be cast aside in favour of quick fixes.
A child who was 13 at the beginning of the consultations on the Privacy Act and the Children's Privacy Code is now an adult. The longer we wait for privacy laws the deeper go the harms.
Some years ago, it was routinely said that a child would have an estimated 72 million data points held on them by the time they turned 13. Realistically, what this means is: a child's face can appear in AI-generated images (including sexualised images), their daily movement patterns can be purchased by any buyer, and their attention can be manipulated to sell ads targeting known vulnerabilities.
Children's privacy is far from a loveless cause. It attracts interest and strong support from across the political spectrum, and routinely polls well with the public. But it appears caught in the present legislative logjam, which threatens to sacrifice reams of key reforms on the altar of stasis and indifference.
Children and young people themselves urgently want legislative change.
We recently spoke to a group of teenagers who talked about all sorts of strategies they used to try to keep themselves private, from using VPNs ("most of my class use them") to keeping the few posts they make under lock and key (in "double private accounts").
And while older people cling to the myth that because young people use social media they must not care about privacy anymore, the reality is more complex.
As one young person put it: "sometimes there are choices you can make to protect your privacy - but it often feels like even if you take all these steps it doesn't really help. All of these websites can get data about you even if you're making conscious decisions to try not to. Sometimes it's hard to motivate yourself to make good decisions, if you know that you still aren't private".
This may be a parable of the power of big tech, adtech and vested commercial interests of Australia's vast data markets. Or it could be a simple political calculation that the vision of young people's right to freedom from digital tracking is too utopian, too esoteric, too hard. We can only wait and see.
- Alice Dawkins is executive director and Dr Rys Farthing is policy director at Reset.Tech Australia.
