Meta, the tech giant behind Facebook, Instagram, and Threads, is preparing for a significant change in Australia. Starting December 4, Australians under 16 will lose access to these platforms under a new government regulation aimed at improving online safety for children. The law, announced two weeks ago, requires social media companies to take steps to prevent users younger than 16 from holding accounts.
Meta is the first company to act, reaching out to affected teens with warnings via SMS and email and offering a two-week window for young users to download their data and delete their accounts. In its announcement, Meta also highlighted the option to update contact information so users can regain access once they turn 16.
About 500,000 Australians aged 13 to 15 use Instagram and Facebook, according to Meta's estimates. For age checks, people over 16 who mistakenly receive notices can provide ID or a "video selfie" to Yoti, a company specializing in age verification. However, as Terry Flew of the University of Sydney's Centre for AI, Trust and Governance noted, such technology can have a failure rate of over 5%; applied to a pool the size of Meta's estimate, that could mean tens of thousands of people wrongly classified. This raises concerns about accuracy and fairness.
The Australian government has been deliberately flexible about how platforms verify users' ages, stressing that a blanket requirement for all users to prove they are over 16 would be unreasonable. In its view, social media companies already hold enough data to distinguish adult accounts from children's.
Fines for companies that fail to comply are steep, reaching 50 million Australian dollars (around $33 million). Amid these changes, Meta's vice president and global head of safety, Antigone Davis, suggested that age verification should instead happen at the app store level, which she argued would support a more standardized way of confirming users' ages.
Feedback on the new rules has been mixed. Dany Elachi, founder of the Heads Up Alliance, a parents' group that advocated for the age restriction, believes it is a step in the right direction but urges parents to encourage kids to explore activities beyond social media. He also expressed frustration over the timing of the announcement, saying clarity should have come sooner.
Critics, however, argue that an outright ban on young users could backfire. Mat Tinkler, CEO of Save the Children Australia, has called for stricter regulation of social media companies rather than a complete ban, favoring a proactive approach in which platforms build in safety features instead of imposing blanket restrictions. The Australian Human Rights Commission has raised similar concerns, suggesting that alternative measures could protect children without infringing on their rights.
As the law takes effect, it sets a historic precedent for social media regulation worldwide. Other countries will be watching closely to see how it unfolds and what it means for online safety and user rights.
For those interested, more about the law and its implications is available on the Australian Government's website.