Meta is making big changes in Australia. Starting December 4, the company will close Instagram and Facebook accounts for users under 16. The move comes in response to a new law aimed at limiting social media access for young people, with the goal of protecting kids from potential harm online.
Australia is not the only country taking action. The European Union is also considering age restrictions on social media, and the UK has introduced age checks on platforms with adult content.
In Australia, a significant 77% of people support the ban, according to a YouGov poll. This law comes as concern grows about the negative effects of excessive social media use on children’s mental and physical health. A recent report shows that 96% of kids aged 10 to 15 in Australia have used social media, and nearly 350,000 kids aged 13 to 15 are on Instagram alone.
Meta’s approach involves using “age assurance methods” to identify underage users. If teenagers are mistakenly banned, they can verify their age by uploading an ID or a video. However, experts warn this could lead to errors. Rahat Masood from the University of New South Wales pointed out that AI tools that estimate age from user behavior aren’t always reliable and can get it wrong.
Faith Gordon, an associate professor at the Australian National University, expressed concerns that many teens might find ways around the ban, such as using VPNs or borrowing accounts. This could push them toward less regulated parts of the internet, which might expose them to harmful content.
Andrew Przybylski, a professor at the University of Oxford, described outright bans as problematic. He believes they can stifle young people’s self-expression and growth. Instead of a ban, he advocates for open discussions between parents and kids about social media.
While this legislation may come from a place of concern, it raises questions about how effective it will truly be in safeguarding young users. Balancing protection with access to online spaces remains a complex challenge.

