Could a ‘digital duty of care’ approach that focuses on improving online safety features through collaboration with social media platforms be more effective than restricting access?
The Australian Government made international headlines when it delivered on its highly publicised commitment to support parents and protect young people by setting a minimum age of 16 years for social media, with legislation passing Parliament on 29 November 2024.
The Online Safety Amendment (Social Media Minimum Age) Bill 2024 has been lauded as a landmark measure that will deliver greater protections for young Australians during critical stages of their development.
The laws place the onus on social media platforms – not young people or their parents – to take reasonable steps to prevent Australians under 16 years of age from having accounts, and ensure that systemic breaches will see platforms face fines of up to $50 million.
The minimum age will apply to ‘age-restricted social media platforms’ as defined in the bill, which includes Snapchat, TikTok, Facebook, Instagram, X and others.
Importantly, the bill ensures that the law is responsive to the ever-evolving nature of technology, while enabling continued access to messaging, online gaming, and services and apps that are primarily for the purposes of education and health support – like Headspace, Kids Helpline, Google Classroom and YouTube.
Dr Alexia Maddox has been following the passage of the legislation, from proposal to law, in her role in the School of Education at La Trobe University.
As a Senior Lecturer in Pedagogy and Education Futures, Dr Maddox is at the forefront of a rapidly evolving landscape. With a background as a sociologist of technology, she brings a unique perspective to the way technology is shaping the future of learning.
“My role is to look at the current tech trends and the technologies that are being used in our schools, both by our kids and by our teachers,” Dr Maddox explains.

This encompasses a wide range of digital tools, from social media and artificial intelligence to immersive environments like virtual and augmented reality. Her research delves into the social impacts and implications of these technologies, exploring how they are transforming the way we communicate, learn, and engage with the world around us.
“I look at the social impacts and implications of our encounters with tech, and in this instance, for communication and learning technologies,” she says.
While Dr Maddox’s expertise lies firmly in the digital realm, she also collaborates closely with colleagues who have direct experience in the teaching profession. By bridging the gap between technology and pedagogy, she aims to provide valuable insights to educators and policymakers navigating the rapidly evolving landscape of digital education.
Her research-driven approach offers a nuanced understanding of the challenges and opportunities presented by technology in the classroom and beyond.
“I get a lot of exposure to that practice,” Dr Maddox says, referring to her work with pre-service and in-service teachers. “I work with them to understand how they’re using these technologies and what the implications are.”
As the digital revolution continues to reshape the education landscape, Dr Maddox’s expertise and collaborative approach will be crucial in shaping the future of learning.
Two sides
At the heart of the debate around the social media age ban is a complex web of concerns and considerations, Dr Maddox says, including parental anxieties about the impact of social media on youth.
“The social media age ban is really a response to parents’ concerns about how social media is affecting their kids, and of course, teachers have a real exposure to that with the way that kids use social media in their personal lives and also how it affects their learning,” she says.
The ban, which aims to restrict access to social media platforms for children under the age of 16, is seen by some as a way to address issues like mental health, cyberbullying, and exposure to inappropriate content. However, Dr Maddox cautions that the effectiveness of such a ban has not been demonstrated in other jurisdictions.
She says the timing of the social media age limit legislation, ahead of the federal election, capitalised on parental anxieties rather than engaging with the nuanced evidence base.
“For parents who are anxious about how social media is affecting their kids, how much time their kids spend on social media, and managing the dysfunction and issues that can arise – like anxiety and bullying for example – an age ban feels safer. However, as it has played out in other countries, there has been no evidence of an age ban being effective in keeping kids safe.”
“In Australia, we’re seeing stakeholders like 36 Months advocating for the social media age ban whilst acknowledging the need for children’s access to educational content available through platforms such as YouTube. However, looking at what has happened in the US, where pornography age restrictions have been implemented in states such as Louisiana, there has been increased VPN usage. While we don’t know the ages of those using VPNs to access pornography, this practice is very likely to translate to the Australian context once social media age restrictions are in place.”
Dr Maddox says social media also has a lot of benefits, particularly for kids in regional and remote areas, and kids from migrant families whose grandparents might be overseas, helping them to stay connected.
“Kids who are marginalised, for example, the LGBTQI+ community, often use social media to find acceptance and safe places to explore their identity and sexuality.
“Social media is fundamental to how kids hang out with their peer groups and get access to knowledge and information. It’s embedded in their lives.”
Dr Maddox’s nuanced understanding of the social and educational implications of the ban will be crucial in shaping a balanced and evidence-based approach to protecting young people in the digital age. She emphasises the importance of digital literacy and the need for a comprehensive strategy that addresses the root causes of online harms, rather than simply restricting access.
“Social media is a way of life for young people and beyond that, it’s a way of life for all of us; we use it in our workplaces, we use it for professional networking. We understand how to manage our privacy settings, and to be alert to scams and phishing. That kind of digital literacy is important for us as adults, because it’s in our work, our citizen life, and our social lives,” she says.

Privacy concerns
The Online Safety Amendment (Social Media Minimum Age) Bill 2024 contains strong privacy provisions, with platforms required to ring-fence and destroy any data collected once it has been used for age assurance purposes. Failure to destroy data would be a breach of the Privacy Act, with penalties of up to $50 million.
Minister for Communications Michelle Rowland said the government has listened to young people, parents and carers, experts, and industry in developing these landmark laws to ensure they are centred on protecting young people – not isolating them.
“Good government is about facing up to difficult reform – we know these laws are novel, but to do nothing is simply not an option.
“Over the next 12 months, we’ll work closely with industry and experts to ensure the minimum age is effectively implemented, informed by the findings of the Age Assurance Technology Trial currently underway,” she said.
According to Dr Maddox, the age assurance technology trials are revealing significant technical challenges that weren’t apparent when the legislation was first proposed.
“We’re seeing concerning accuracy gaps in biometric age estimation, especially for young teenagers, and significant disparities across different demographic groups. These technical limitations could create unintended barriers for legitimate users while potentially failing to protect those the legislation aims to safeguard,” she says.
Garnering far less media attention than the social media age limit legislation is the Government’s Digital Duty of Care legislation, which will place the onus on digital platforms to proactively keep Australians safe and better prevent online harms.
Dr Maddox argues that the Digital Duty of Care legislation is a more evidence-based and collaborative approach, one that addresses the root causes of online harms rather than simply restricting access.
She says it has been developed through extensive consultation, unlike the age ban, which was not and does not stem from the recommendations of the Inquiry into Social Media and Online Safety, the final report of which was recently released.
“The Digital Duty of Care bill is evidence-based and it has been well considered. It’s taking a safety-by-design approach, working with platforms to improve their features to increase the safety for all of us online, including kids,” Dr Maddox says.
In an approach aligned with those of the United Kingdom and European Union, digital platforms will be required to take reasonable steps to prevent foreseeable harms on their platforms and services, with the framework underpinned by risk assessment and risk mitigation, and informed by safety-by-design principles.
Legislating a duty of care will mean services can’t ‘set and forget’. Instead, their obligations will mean they need to continually identify and mitigate potential risks, as technology and service offerings change and evolve.

Implications for educators
The social media age ban has significant implications for schools and educators. As Dr Maddox explains, the ban could severely limit the ways in which teachers and schools can leverage social media for educational purposes.
“Educators rely on social media for teaching and learning, including YouTube for educational content,” she says.
The ban could affect how educators set homework and use online resources, as many of these rely on access to social media platforms.
“The complexity of modern platforms means we can’t simply categorise them as ‘social media’ or ‘educational tools’. Take YouTube – while viewing content might remain accessible, the interactive features that make it valuable for education could be restricted. We need clearer frameworks for handling these hybrid platforms that serve multiple purposes in young people’s lives,” Dr Maddox says.
Her expertise in the field of digital pedagogy provides valuable insights into the challenges and opportunities that this legislation could present for the education sector.
“We recently came through a pandemic where all study moved online and teachers relied on existing digital resources, some of which they would have directed kids to social media to access. Under a social media age ban, a large swathe of that will not be available. It will come down to the definition of social media and which platforms are exempt from the age ban,” she says.
Dr Maddox also highlights the potential difficulties schools may face in enforcing the ban. Without clear guidelines and support, schools may struggle to navigate the complexities of monitoring and addressing student use of social media during school hours.
“Will schools be responsible for policing the ban in the classroom? What is the rule in schools? The government has said it won’t be punishing parents or kids. It will be punishing platforms. That means it will be the platforms that have to enforce this age restriction and age-restricted content,” she says.
“Rather than introducing an age ban that involves an age verification process and digital identification, isn’t it better to create safer environments for users in the first place? That’s where the Digital Duty of Care legislation is going to be focused.”
Dr Maddox says platform compliance in a global context has shifted significantly.
“With Meta’s pushback against EU regulations and move toward reduced platform-level moderation, we can’t assume platforms will simply extend their European compliance measures to Australia. As we saw with the news media code, platforms might choose to withdraw services rather than comply with national regulations they see as burdensome for smaller markets.”
She says the assumption that platforms will extend EU/UK safety standards to Australia may need reconsideration.
“Meta’s increasing pushback against EU restrictions, combined with a shifting US regulatory environment, suggests platforms may become more resistant to national-level regulations in smaller markets. While Australia’s Digital Duty of Care legislation aligns with European standards, platforms could choose to withdraw services rather than comply with additional age restrictions – especially if they’re simultaneously challenging similar requirements in larger markets.
“This context makes the age verification requirements particularly precarious, as they rely on platform cooperation at a time when major platforms are increasingly willing to contest or withdraw from regulatory requirements. Schools and educators, already facing unclear enforcement guidelines, may need to prepare for a scenario where platform accessibility becomes more uncertain or fragmented.”
“Schools face a complex challenge. While the legislation aims to protect young people, it could inadvertently disrupt established educational practices that rely on social media interaction. We need clear guidelines about how schools should handle these restrictions during school hours, especially for remote learning and homework that involves social media engagement. The pandemic showed us how integral these tools have become for teaching – we can’t simply remove them without considering the educational implications.”
The social media age limit is expected to come into effect within 12 months of the bill passing, meaning it should be enforced by the end of 2025.