Kids using a phone (Photo by Katerina Holmes/Pexels)

In a bold move to safeguard children online, Australia is poised to introduce unprecedented legislation that will ban social media use for children under the age of 16. Platforms such as Meta (which owns Facebook and Instagram), TikTok, and X will be given a year to implement these restrictions, a measure aimed at reducing exposure to online harms for minors.

Australia's Landmark Legislation for Online Safety

The Australian government plans to introduce this landmark legislation on November 18, barring all social media access for users under 16, even where parental consent is granted. According to The Daily Mail, Prime Minister Anthony Albanese has firmly endorsed the policy, stating that it empowers parents to safeguard their children against the potentially harmful effects of social media.

"We ban alcohol for under-18s for purchasing," Albanese explained. "This weekend, there'll be examples of someone under 18 getting access to alcohol. Doesn't mean we say, 'Oh well, it's too hard, let it rip.'" Albanese emphasised that while the law may not prevent every child from using social media, it aims to set a precedent for responsible online engagement.

Holding Tech Giants Accountable

Under this legislation, responsibility for enforcing the age restriction will lie squarely with tech giants rather than parents or children. Meta, TikTok, and X will have one year from the law's enactment to develop mechanisms for verifying users' ages. The minimum age requirement on these platforms is currently 13, but the Australian government is pushing to raise that threshold.

Meta has voiced concerns over the technical challenges of enforcing such a ban. Antigone Davis, Meta's global head of safety, highlighted the limitations of current technology for age verification, noting that many age-assurance tools rely on collecting personally identifiable information, potentially through facial recognition or ID verification, which raises privacy concerns. According to Fortune, Davis stated, "The idea that the industry can simply implement these requirements is probably a misunderstanding of our current technological capabilities."

Social Media's Dark Side: A Rising Crisis

The call for tighter regulation follows a series of tragic incidents linked to social media. According to Kidspot, 12-year-old Ella Catley-Crawford, from Brisbane, recently took her own life after enduring relentless cyberbullying. She had been catfished and harassed by classmates who spread her private photos online, and her story is just one of many highlighting the dark side of social media.

In May, 13-year-old Esra Haynes died after participating in the "chroming" trend, a social media challenge that involves inhaling chemicals. Esra's parents have since campaigned for stricter social media regulations, arguing that children often lack the maturity to handle online content responsibly. Her father stated, "Kids at 13 don't understand consequences fully. Social media exposes them to risks they're not equipped to navigate."

Australia Joins Global Push for Youth Online Safety

Australia's legislation is part of a broader global movement to regulate children's social media use. In China, the Regulations on the Protection of Minors in Cyberspace mandate strict controls on harmful content and cyberbullying while requiring tech companies to verify user ages rigorously. This legislation includes provisions to limit minors' online activity and prevent exposure to addictive content, similar in intent to Australia's planned restrictions.

Countries such as Japan have already introduced regulations aimed at safeguarding young users. In September, Instagram rolled out restrictions in Japan that limit messaging capabilities and app usage time for users aged 13 to 17. According to Fortune, these users receive reminders to log off after 60 minutes of use, and parental consent is required for account changes, with parents receiving insights into their children's online activities.

In the UK, Labour MP Josh MacAlister recently proposed legislation to raise the age of "internet adulthood" from 13 to 16, requiring parental consent for social media access for minors under 16. MacAlister has advocated for similar restrictions in schools to limit smartphone usage, arguing that reducing screen time can mitigate negative effects on mental health.

A Challenging Path Ahead for Tech Firms

Despite concerns over enforcing these measures, Meta has committed to complying with Australia's age restrictions, though it questions whether current technology can meet the requirements effectively. The company highlighted that facial recognition or ID-based age verification could be invasive and challenging to implement. As Davis noted, "Age assurance technology... often needs personally identifiable information, which raises questions about user privacy and data handling."

The legislation is expected to bring social media companies into alignment with Australia's child-safety goals. However, as seen elsewhere in the world, companies may face substantial logistical and ethical hurdles in enforcing these rules.