Preparing for the Social Media Delay
- Kayelene Kerr Child Safety Educator & Advocate
- Sep 24
- 20 min read
Australia has introduced world-first legislation that will delay children’s access to social media until age 16. Platforms will be required to put age-assurance systems in place by December 2025, preventing children and young people from creating or maintaining accounts.
After the law takes effect on 10 December 2025, Australians can expect to see age-restricted social media platforms taking steps to stop children and young people under 16 years of age setting up or continuing to use accounts.
This is a significant change that will affect many families, and it’s where active parenting makes all the difference. The law can set the rules, but it can’t replace the guidance, support, conversations and boundaries you provide.
At its core the social media delay is about protection, not punishment.
Understanding the Why: Overview of Online Harms
The internet and technology have transformed the ways we learn, work, create, play, connect and are entertained. It’s given our children access to the world, but it has also given the world access to our children. Children gain immense benefits from being online, but there are also risks. The internet and digital environments, including emerging environments, were not and are not designed for children or with children’s safety in mind, yet they’re an integral part of their lives.
Globally, the increasing number of children online has seen corresponding upward trends in online grooming, online child sexual abuse and exploitation, sextortion, youth-produced sexual content, image-based abuse, cyberbullying, exposure to pornography and other illegal, hurtful, harmful and age-inappropriate content, to name but a few. Much of what children are exposed to and are navigating is too much, too soon. Digital harm is occurring on apps, platforms and online services at unprecedented levels.
Safety has been, and in many cases continues to be, an afterthought for technology companies. Sadly, and to the detriment of the health, wellbeing and safety of children, technology companies have clearly demonstrated that they put profits over people and profits over child safety.
Technology companies’ priority is revenue-generating activity, not children’s safety. These services have been developed in such a way that they create a supply chain of commercial activity, data processing, advertising and marketing, persuasive technology and design features that anticipate and often guide a child towards more extreme and harmful content. While I acknowledge the above-mentioned features may not have been intended to cause harm to children, experience has shown us they have ultimately facilitated and perpetuated it.
For years technology companies have engaged in wilful blindness, prioritising commercial gain ahead of children’s safety. Technology company revenue is generated from data collection and user engagement. This commercial priority can profoundly shape the design of online products and services resulting in sensationalised content, the spread of fake news, misinformation, disinformation, malinformation and harmful and illegal content.
The power of algorithms and recommender systems to curate feeds and influence the content a user sees and consumes ought to be approached with caution; transparency and independent oversight are imperative. Algorithms can have a significant impact on what content a user sees and interacts with. This in turn has the power to shape attitudes, expectations, behaviours, beliefs, perceptions and practices. Left to their own devices, technology companies do not effectively self-regulate.
Children and young people’s exposure to harmful content online is not marginal, it’s mainstream. An Australian study of 14–17 year olds found 62% reported exposure to harmful content online. Harmful content includes:
Self-harm
Suicide – ways to take one’s own life
Unhealthy eating – ways to be ‘very thin’
Hate speech
Gory or violent material
Drug taking
Violent sexual images or videos
It’s important to note the internet is not segregated: what young people see is also what children see. In my experience, the above-mentioned harms are regularly managed by primary schools.
For too long parents, carers, educators and other professionals have carried the responsibility of protecting children from online harms and of managing and mitigating the serious real-world consequences impacting children and young people’s health, wellbeing and personal safety.
The Australian government recognises it is manifestly unfair and unreasonable for children to be responsible for avoiding illegal and harmful content. It is also manifestly unfair and unreasonable for schools, parents/caregivers, educators, other professionals and community-based organisations to address these issues alone. The Australian government has intervened because technology companies have demonstrated time and time again that they will not adhere to their civic responsibility to ensure their networks, platforms and services are not used in ways that cause or contribute to violating children’s rights to be protected from harm.
What Apps, Platforms and Services Does the Legislation Apply To?
At this stage the age restrictions will apply to Facebook, Instagram, Snapchat, TikTok, X (formerly Twitter) and YouTube.
It's important to note that users will still be able to watch content on YouTube and TikTok as these platforms don't require users to have an account to view content. Additionally, children and young people will be able to share content to other platforms not covered by the delay, for example messaging apps. This loophole in the legislation may prove to be a challenge.
The expectation is these platforms will not undermine the intent of this law or expose Australian children and young people to illegal, hurtful, harmful or age-inappropriate content. Whether this happens remains to be seen; ‘Big Tech’ does not have a good track record when it comes to prioritising children’s health, wellbeing and safety.
It’s also important to note that illegal, hurtful and harmful content is not confined to social media platforms; it’s readily available via a browser search and is frequently shared with children via messaging apps that sit outside of social media. Age verification for social media won’t remove harmful content or keep children safe from unsafe content, contact and conduct.
Many of the mainstream social media platforms in the public discourse are amongst the safest. Albeit inadequate, they do offer family safety centres, reporting options and a degree of moderation. Many of the other platforms used by children and young people do not offer any of these features, and these platforms may be where children and young people move following the delay. If that does occur, there is a real possibility they’ll be in less safe online places and spaces.
The social media delay may be somewhat effective at managing some online harms, but it will be ineffective in addressing others, so we can’t rely entirely on this delay.
Plan and Prepare
If your child or young person has existing social media accounts, don’t wait until December 2025; plan and prepare now. Start age and developmentally appropriate conversations about these changes well before they take effect. Children under 16 don’t just need limits, they need guidance to prepare for what’s ahead.
Audit devices and accounts
Make a list of the apps on your child’s device. Identify the apps that will be affected.
If your child already has an account, back up content they want to keep, e.g. photos, videos and chats. Platforms usually have data-download tools, such as Google Takeout for YouTube or Instagram’s ‘Download your information’ option.
Check login email addresses and dates of birth on accounts.
Family Technology Plan
Revisit your Family Technology Agreement and make changes as necessary. Involve children and young people in this process and let them help design what comes next. It doesn’t mean stepping back from your role as the adult; it means building their sense of agency. Keep your child informed about what’s coming and your expectations.
Conversation Starters
"What do you think will be the hardest part when things change? Is anything worrying you about it?
“Which part of social media feels most important to you, chatting, posting or seeing what others are doing?”
“Because of the new age delay, we need to start winding back your social media. How would you like to do it?”
“How would you like to save your favourite online memories? If you already have accounts, let’s back up photos you want to save."
“Let's set up a plan so you can keep in touch with your friends.”
Be Curious: Talk With Them, Not At Them
The goal of conversations about the social media delay isn’t to demonise technology or create fear; it’s to help children become socially, emotionally and relationally prepared to navigate online spaces safely and confidently. Simply telling children what to do rarely works; taking the time to ask, listen and explore their experiences is more likely to build trust and understanding.
It’s equally important to recognise the real benefits social media offers. For young people, these platforms provide access to educational content, creative spaces to express themselves and opportunities to explore interests alongside like-minded peers.
Social media can also serve as a vital space for connection, belonging and identity development, especially for marginalised and neurodivergent children and young people.
Acknowledging these benefits doesn’t mean ignoring risks. It means approaching online activity with curiosity and respect. By asking questions, listening carefully and exploring their experiences together, parents can help children understand how to make safe, positive and balanced choices online.
Ultimately, the aim is not to restrict or shame, but to guide. When children feel heard and understood, they are far more likely to internalise safe digital practices, build meaningful relationships and develop the skills to thrive both online and offline.
Conversation Starters
“Which apps or platforms make you feel closer to your friends, and which ones just take up time?”
“Which apps or platforms make you feel good, and which ones feel draining?”
“Which online spaces let you show your creativity or share your work?”
“Who online do you feel really understands your interests?”
“If you could change one thing about your online experience, what would it be?”
“How could we make your online time more positive while still keeping the things you enjoy?”
Parent Reflections
“How do I understand my child’s online identity compared to who they are offline?”
“What positive things has social media added to my child’s life - learning, creativity, friendships?”
“What values do I want to guide our family’s use of technology?”
“Am I prepared to make this a collaborative process rather than setting only top-down rules?”
Recognise & Validate Feelings
Taking away something a child already depends on is far harder than never introducing it in the first place. Social media isn’t just another app on their phone, it’s where they socialise, express themselves creatively and feel connected to the world. For many young people, their online profiles are deeply tied to who they are.
Make space for the feelings that will likely come with this delay. Their sadness, frustration, anger and disappointment are real. Acknowledge and validate their feelings and the difficulty this change will bring to their life.
“I understand you are upset. This is a big change.”
“Let's find other ways to stay connected with your friends.”
“This is temporary. It’s a no for now, not forever.”
“The law applies to everyone under 16. This is not our rule, but we support it.”
Conversation Starters
“Do you feel like your online world is part of who you are? How so?”
“If you couldn’t use your favourite app for a while, what would you miss the most?”
“What do you think will be the hardest part of taking a break from social media you've already been using?”
“Have you ever felt worried about losing touch with friends if you weren’t online?”
“What helps you feel truly connected: likes, comments, private chats or something else?”
Parent Reflections
“If I suddenly removed my favourite daily tool or hobby, how would I feel?”
“Am I underestimating how important online identity and friendships are to my child?”
“Do I assume my child can ‘just stop’ using social media without recognising what they’d lose?”
“Do I view social media only as a risk, or can I also acknowledge its benefits?”
“Am I modelling healthy digital habits in my own life?”
Go Slow
Aim for a gradual transition rather than suddenly removing platforms and apps. Sudden removal often sparks pushback. Support your child to slowly reduce their time on social media.
Reduce in manageable steps, one app or half an hour at a time. Help them notice which apps consume most of their attention, and which ones they genuinely use to connect with friends.
Conversation Starters
“How’s it feeling to spend less time on [app]?”
“What’s easier than you expected? What’s harder?”
“What would help make this change more manageable?”
Parent Reflections
“Do I know which platforms my child uses the most, and why they prefer them?”
“When my child is using social media, am I paying attention to how it affects their mood, behaviours and choices?”
Alternatives & Opportunities
We can’t simply take social media away without providing children with meaningful alternatives. Social media is often more than just an app, it’s where they connect, create and feel part of a community.
Removing social media abruptly can leave a gap in their social, emotional, relational and creative lives, which can lead to boredom, frustration or anxiety. Instead, we need to offer replacement opportunities that fulfil the same needs. It’s important to understand why your child turns to social media and what they get from it.
Connection: Encourage and create more face-to-face interactions, group activities and alternative messaging platforms to maintain friendships.
Creativity: Provide outlets such as drawing, music, coding, video projects or DIY activities.
Belonging and identity: Support hobbies, sports, clubs or volunteer opportunities where they can feel valued and included.
Seeking approval and validation: Offer extra encouragement and positive feedback at home.
Experiencing anxiety when offline: This could signal unhealthy habits have developed. If concerned, seek professional support.
By providing thoughtful alternatives, we help children transition more smoothly, maintain their sense of identity and learn to balance online and offline experiences in ways that are safe, positive and sustainable.
Conversation Starters
"Is there a new activity we can take up as a family eg) cooking, walking, board games ...“
"Which friends would you like to spend more time with in person?”
“Are there ways we can stay in touch with your friends that don’t rely on social media?”
“What’s one fun activity we could do together or with friends instead of scrolling?”
“How could we make sure you still feel part of your peer group without relying on apps?”
“What’s one thing you’d like to get better at if you had more time offline?”
Parent Reflections
"If there will be extra time in their day how will it be filled?
"What did your child love before social media?"
"When social media is removed, where might your child end up online?"
Responding to Resistance
Change is hard and resistance is normal. When children are asked to adjust habits around social media, it’s common for them to push back, argue or show frustration. How parents respond can make all the difference in whether the change is successful and whether it strengthens the parent-child relationship.
Stay calm and empathetic: Resistance often reflects anxiety, fear of missing out or loss of control. Listen without judgment, acknowledge their feelings and validate that their frustration is real. Use simple phrases like, “I can see this feels unfair” or “It makes sense you’re frustrated”.
Offer choices and agency: Kids are far more likely to cooperate if they feel some control. Ask, “Would you like to reduce screen time by half an hour each day, or skip one app first?” Giving them options shifts the dynamic from restriction to collaboration.
Explain the ‘why’ clearly: Rather than only setting rules, explain the reasons for the change. For example, “We’re helping your brain and body rest, and making space for other things you enjoy.” When children understand the purpose, they are more likely to accept it.
Reinforce positive steps: Celebrate small wins and effort, not just compliance. Recognise their willingness to try.
It’s Too Hard
I know some of you might be thinking there is no way this will work – that it’s impossible. I get it, I really do. The prospect of daily arguments and moody tweens and teens can feel overwhelming, especially when you’re already balancing work, family and everything else life throws at you.
This is going to be particularly challenging with teens who have been on social media for years. I know that some families will allow their children to maintain their accounts to avoid conflict. If complete removal feels too hard, at least implement these changes:
Set firm time boundaries, even if you cannot remove access entirely.
Move devices out of bedrooms at night - Studies consistently link social media use to poor sleep quality in children.
Implement 'device free' hours each day - Some reduction is better than no reduction.
Have regular check-ins about what they are seeing and experiencing online.
Bypassing Restrictions
We know tech-savvy teens and tweens may attempt workarounds, using VPNs, fake accounts or other methods, to circumvent restrictions. This is why parental supervision, education, conversation and participation are essential. (Note: read more about this in the eSafeKids Members’ Community.)
Legislation and regulation cannot replace the vital role of parents and caregivers. What makes the real difference is the presence of parents and caregivers who guide, supervise and stay connected through open conversations.
The delay should be seen as a support for families, not a complete solution. The responsibility of parents, caregivers, educators and other professionals remains the same: support children and young people to develop the social, emotional, relational and technical skills they need for safe, positive, respectful and secure experiences in the online places and spaces they spend time.
Modelling
Your child is paying attention, closer than you might realise. They notice how often your phone is in your hand, how quickly you react to notifications and how frequently you scroll just to fill a quiet moment. They see the gap between what you say and what you do.
If you want them to change, start with yourself. Swap some of your scrolling for real presence: conversations, shared activities and moments together. You don’t need to be perfect, but try to show them the behaviour you hope to see.
Where to from here
For too long, technology companies have not carried the responsibility of protecting children from online harms, nor of managing and mitigating the serious real-world consequences that are impacting the health, wellbeing and personal safety of children and young people.
I still believe the focus should be on mandating designs that are age-appropriate and prioritise children’s health, wellbeing and safety over commercial interests. Robust legislation, regulation and effective enforcement measures that protect children from known and foreseeable emerging risks are needed. This will be difficult given technology companies’ relentless drive for market dominance, profits and anticompetitive behaviour, but it is necessary: the alternative is to leave the health, wellbeing and safety of children in the hands of technology companies, and that will not end well for anyone.
Frequently Asked Questions – Source: Australian eSafety Commissioner
Why are under-16s being ‘banned’ from social media?
It’s not a ban, it’s a delay to having accounts.
Age-restricted platforms won’t be allowed to let under-16s create or keep an account. That’s because being logged into an account increases the likelihood that they’ll be exposed to pressures and risks that can be hard to deal with. These come from social media platform design features that encourage them to spend more time on screens, while also serving up content that can harm their health and wellbeing.
For example, the pressure to view disappearing content and respond to a stream of notifications and alerts has been linked to harms to health - these include reduced sleep and attention and increased stress levels.
While most platforms currently have a minimum age of 13 for account holders, delaying account access until 16 will give young people more time to develop important skills and maturity. It’s breathing space to build digital literacy, critical reasoning, impulse control and greater resilience.
It also means there’s extra time to teach under-16s about online risks and the impacts of harms, as well as how to stay safer online and seek help when they need it. This will give young people a better chance to prevent and deal with issues once they turn 16 and can have full social media access.
Which platforms will be age-restricted?
While no formal assessments have been made, the age restrictions are likely to apply to Facebook, Instagram, Snapchat, TikTok, X (formerly Twitter) and YouTube, among other platforms.
eSafety will soon be releasing its industry engagement, compliance and enforcement priorities. We will focus on platforms where there is the highest risk, considering factors such as the level of use by Australians under 16 combined with the presence of features and functions associated with the harms that the restrictions aim to prevent.
More generally, age restrictions will apply to social media platforms that meet three specific conditions, unless they are excluded based on criteria set out in legislative rules made by the Minister for Communications in July 2025.
The conditions for age restriction are:
the sole purpose, or a significant purpose, of the service is to enable online social interaction between two or more end-users
the service allows end-users to link to, or interact with, some or all of the other end-users
the service allows end-users to post material on the service.
Online gaming and standalone messaging apps are among a number of types of services that have been excluded under the legislative rules. However, messaging services that have social-media style features which allow users to interact in other ways apart from messaging may be included in the age restrictions, as well as messaging features accessed through age-restricted social media accounts.
The way online services are used can change over time, many services have multiple purposes, and new services are constantly being developed. So the platforms which are age-restricted may change depending on whether they start to meet, continue to meet or no longer meet the legislative rules for exclusion. Our regulatory guidance provides more information about how eSafety will approach this.
Which platforms have been excluded from the age restrictions?
Legislative rules excluding certain types of online services were made by the Minister for Communications following advice from the eSafety Commissioner and consultation with youth groups, parents, carers, the digital industry and civil society groups, as well as experts in child development, mental health and law.
The exclusions apply to:
services that have the sole or primary purpose of messaging, email, voice calling or video calling
services that have the sole or primary purpose of enabling users to play online games with other users
services that have the sole or primary purpose of enabling users to share information about products or services
services that have the sole or primary purpose of enabling users to engage in professional networking or professional development
services that have the sole or primary purpose of supporting the education of users
services that have the sole or primary purpose of supporting the health of users
services that have the sole or significant purpose of facilitating communication between educational institutions and students or student families
services that have the significant purpose of facilitating communication between health care providers and people using those services.
The legislative rules are supported by an explanatory statement, which provides some details about how eSafety should assess a platform’s sole, primary or significant purpose. The factors eSafety is to consider include:
the features and functions of the platform
how they are deployed and influence user engagement and experiences
the actual use of the platform, in addition to what the platform may say its intended purpose is.
The explanatory statement acknowledges that a platform’s purpose(s) may change over time.
My child has never had a problem on social media, why should they miss out?
We know that young people are not all the same. They use a range of social media platforms in varying ways and with different exposure to risks of harm.
However, the Australian Parliament voted for the restrictions for the good of all Australians under 16. The delay is similar to other age-based laws, such as restrictions on the sale of alcohol and cigarettes.
As the law will apply to all of them, parents and carers will no longer need to choose between allowing their under-16s to set up accounts on platforms that may negatively affect their health, or making sure they are not socially excluded. No under-16s have to feel like they’re ‘missing out’.
Won’t under-16s still be able to see social media feeds without accounts?
Under-16s will still be able to see publicly available social media content that doesn’t require being logged into an account. As they won’t be logged in, they won’t be exposed to the more harmful design features of accounts.
For example, most content is currently available to view on YouTube without holding an account.
Another example is that anyone can see some of Facebook’s content, such as the landing pages of businesses or services that use social media as their host platform.
It’s the Australian Government’s intention that under-16s will also continue to have access to online services that provide crucial information and support if they are experiencing distress.
How will under-16s be stopped from finding a way around the age restrictions?
Most social media services currently have a minimum age requirement for account holders, but often they don’t enforce it. That won’t be acceptable anymore. The new law requires age-restricted social media platforms to take reasonable steps to make sure under-16s can’t create or keep accounts.
There are systems and technologies that make this possible while preserving the privacy of users. Some are already being used by social media platforms.
Of course, no solution is likely to be 100% effective 100% of the time. We know that some under-16s may find their way around the restrictions, or be missed when age checks are done. We also know that some users who are 16 or older may have their accounts suspended in error.
So under the new law, age-restricted platforms will be expected to take steps to:
find existing accounts held by under-16s, and deactivate those accounts
prevent under-16s from opening new accounts
prevent workarounds that may allow under-16s to bypass the restrictions
have processes to correct errors if someone is mistakenly missed by or included in the restrictions, so no one is removed unfairly.
Platforms should also provide clear ways for people to report underage accounts, or to request a review if they have been age-restricted by mistake.
Will there be penalties for under-16s if they get around the age restrictions?
There are no penalties for under-16s who access an age-restricted social media platform, or for their parents or carers.
This is about protecting young people, not punishing or isolating them. The goal is to help parents and carers support the health and wellbeing of under-16s.
On the other hand, age-restricted social media platforms may face penalties if they don’t take reasonable steps to prevent under-16s from having accounts on their platforms.
A court can order civil penalties for platforms that don’t take reasonable steps to prevent underage users from having accounts on their platforms. This includes court-imposed fines of up to 150,000 penalty units for corporations – at the current value of $330 per penalty unit, equivalent to a total of $49.5 million AUD.
'Reasonable steps' means platforms have to act to enforce the restrictions in a way that is just and appropriate in the circumstances. They will be in breach of the law if they show an unreasonable failure to prevent underage access to accounts.
eSafety is already working with the key platforms where we know Australian children are present in large numbers, and where there are features associated with risks to children. By working with platforms now, eSafety is taking steps to ensure they are getting ready for the social media age restrictions.
eSafety will monitor compliance and enforce the law. This will be done through a range of regulatory powers provided in the Online Safety Act.
Won’t the age restrictions stop under-16s from accessing important benefits of being online?
Under-16s will still be able to use online services, sites and apps that are not covered by the social media age restrictions.
The Australian Government is mindful of the need to balance safety with a broader range of digital rights. Under-16s will still be able to explore and express themselves on platforms that are not age-restricted, allowing connection, creativity, learning, health advice and entertainment. These include online games and standalone messaging apps.
Under-16s will also continue to have access to online services that provide crucial information and support if they are experiencing distress.
The social media age restrictions are designed to make sure under-16s are not over-exposed to negative experiences that can cause immediate and long-term harms to their health and wellbeing.
Will under-16s who already have accounts be allowed to keep using them?
No. Age-restricted social media platforms will have to take reasonable steps to find and deactivate accounts held by under-16s.
'Reasonable steps' means platforms have to act to enforce the restrictions in a way that is just and appropriate in the circumstances. eSafety has developed regulatory guidelines to help platforms deactivate accounts using an approach that is as safe and supportive as reasonably possible. The guidelines are informed by a broad evidence base, including lessons learned through the Australian Government’s Age Assurance Technology Trial and the outcomes of stakeholder consultations. The Office of the Australian Information Commissioner will provide guidance on privacy.
What proof of age methods will be allowed?
There is a range of technologies available to check age, at the point of account sign up and later. It will be up to each platform to decide which methods it uses.
eSafety has published regulatory guidance to help platforms decide which methods are likely to be effective and comply with the Online Safety Act. The guidelines draw on the Australian Government’s Age Assurance Technology Trial as well as stakeholder consultations, including our ongoing engagement with social media platforms that are likely to be restricted. The regulatory guidance also draws on our existing knowledge base, and includes principles that are consistent with similar international frameworks. The Office of the Australian Information Commissioner will provide guidance on privacy.
No Australian will be forced to use government ID (including Digital ID accredited under the Australian Government’s Digital ID System) to prove their age online – age-restricted social media platforms will have to offer reasonable alternatives to users.
Throughout 2025, eSafety will publish new and updated resources for parents and carers – and for educators and other youth-serving professionals – to help them understand how the age restrictions will work. These resources can be accessed here.
To learn more about eSafeKids workshops and training visit our services page.
To view our wide range of child friendly resources visit our online shop.
Join the free eSafeKids online Members' Community. It has been created to support and inspire you in your home, school, organisation and/or community setting.
About The Author
Kayelene Kerr is recognised as one of Western Australia’s most experienced specialist providers of Protective Behaviours, Body Safety, Cyber Safety, Digital Wellness and Pornography education workshops. Kayelene is passionate about the prevention of child abuse and sexual exploitation, drawing on over 28 years’ experience of study and law enforcement, investigating sexual crimes, including technology-facilitated crimes. Kayelene delivers engaging and sought-after prevention education workshops to educate, equip and empower children and young people, and to help support parents, carers, educators and other professionals. Kayelene believes protecting children from harm is a shared responsibility and everyone can play a role in the care, safety and protection of children. Kayelene aims to inspire the trusted adults in children’s lives to tackle sometimes challenging topics.
About eSafeKids
eSafeKids strives to reduce and prevent harm through proactive prevention education, supporting and inspiring parents, carers, educators and other professionals to talk with children, young people and vulnerable adults about protective behaviours, body safety, cyber safety, digital wellness and pornography. eSafeKids is based in Perth, Western Australia.
eSafeKids provides books and resources to teach children about social and emotional intelligence, resilience, empathy, gender equality, consent, body safety, protective behaviours, cyber safety, digital wellness, media literacy, puberty and pornography.
eSafeKids books can support educators teaching protective behaviours and child abuse prevention education that aligns with the Western Australian Curriculum, Australian Curriculum, Early Years Learning Framework (EYLF) and National Quality Framework: National Quality Standards (NQS).
