Ofcom, ICO Urge Meta, TikTok, Snap, YouTube to Enforce Strict Age Blocks

UK watchdogs urge Meta, TikTok, Snap, and YouTube to keep underage children off their platforms.

Britain's media regulator Ofcom and its privacy watchdog, the Information Commissioner's Office (ICO), on March 12, 2026, jointly pressed major social media platforms—including Meta's Facebook and Instagram, ByteDance's TikTok, Snap's Snapchat, and Alphabet's YouTube—to significantly strengthen measures preventing children from accessing their services. The regulators accused the companies of failing to enforce their own minimum age rules effectively, allowing underage users to bypass weak age verification systems and exposing them to potentially harmful or addictive content through algorithmic feeds.

The demands form part of the ongoing rollout of the UK's Online Safety Act, which requires platforms to protect children from illegal and harmful online material. Ofcom issued specific directives to the listed platforms (also including Roblox) to demonstrate by April 30, 2026, how they plan to implement tighter age checks, limit stranger interactions with minors, enhance feed safety, and halt the practice of testing new features on children. Failure to comply could trigger enforcement actions, including fines or other penalties, as Ofcom warned that current practices are inadequate and must change urgently.

Separately, the ICO sent an open letter to the same companies urging the adoption of "modern, viable" age-assurance technologies—such as those already required for adult content sites—to block access for users under 13, the minimum age on most major platforms. The ICO emphasized moving beyond self-declaration methods, which children can easily circumvent, and prioritizing robust verification to safeguard young users' data and well-being. The letter follows broader concerns about children's exposure to grooming, misinformation, and mental health harms in unregulated online environments.


The regulators' actions align with growing international momentum to curb children's social media use, including Australia's recent restrictions and ongoing UK government consultations exploring even stricter measures, such as potential bans for under-16s. Ofcom's chief executive highlighted that while these platforms are household names, they have not placed children's safety at the core of their product design, prompting the call for immediate improvements.

As the April deadline approaches, the platforms face mounting pressure to overhaul their age verification systems or risk regulatory intervention. The move underscores the UK's commitment to holding tech giants accountable under the Online Safety Act, with potential implications for user privacy, platform design, and global standards on child online protection. Industry responses are awaited as companies weigh compliance against user experience and the operational challenges of implementing advanced age checks.

