The Australian government has raised concerns over major social media platforms, stating that Meta, Snapchat, TikTok, and YouTube are not fully complying with the country's ban on social media accounts for children under 16. Authorities say these platforms have failed to implement adequate measures to prevent underage users from creating accounts, potentially exposing children to inappropriate content and online risks.
The Office of the eSafety Commissioner, responsible for overseeing online safety in Australia, indicated that while progress has been made, enforcement gaps remain. The commissioner emphasized that platforms must adopt stronger verification methods and stricter age restrictions to comply with national regulations designed to protect minors online.
Australia’s digital safety laws, which include provisions to restrict child accounts, are part of broader efforts to curb cyberbullying, harmful content, and privacy violations affecting young users. Officials highlighted that social media companies have a responsibility to ensure that these legal safeguards are fully operational and effective.
Meta, which owns Facebook and Instagram, along with Snapchat, TikTok, and YouTube, was reportedly contacted by the government for updates on compliance measures. While the companies claim to have implemented age verification systems, authorities argue that these measures are insufficient and easily bypassed, allowing underage users to gain access to the platforms.
Legal experts say Australia’s firm stance reflects growing global pressure on social media companies to prioritize child safety. They note that the challenge lies not only in enforcing age restrictions but also in educating parents and users about online risks. The government’s warning serves as both a regulatory reminder and a public alert about the potential dangers of unmonitored social media access for children.
The eSafety Commissioner has indicated that further action, including stricter penalties or audits, could follow if platforms fail to comply fully. Australian authorities continue to stress that children's online safety is a shared responsibility among the government, tech companies, and families, and they have urged immediate improvements in digital safeguards and monitoring practices.