TikTok Rolls Out New EU Age Checks as UK Weighs Under-16 Social Media Ban

TikTok is set to deploy a sophisticated new age-verification system across the European Union in the coming weeks. The move comes amid growing international pressure, including from the UK, to implement stricter protections for children online, with some advocating for an Australia-style ban on social media for under-16s.

How TikTok's New Detection System Works

The technology, developed specifically for the EU regulatory landscape, has been piloted quietly over the past year. It employs a multi-faceted approach, analysing a combination of profile information, posted video content, and user behavioural signals to predict whether an account likely belongs to a user under the age of 13.

Rather than being banned automatically, accounts flagged by the AI system will be escalated to specialist human moderators for review, and only after this assessment will they be subject to removal. The approach has already proven effective; a similar pilot in the United Kingdom led to the removal of thousands of underage accounts.

Global Pressure Mounts for Stricter Child Protection

The rollout coincides with intense scrutiny from European data protection authorities on how platforms verify user ages. It also follows a significant policy shift in Australia, which in December implemented a social media ban for people under 16. The country's eSafety commissioner recently revealed that more than 4.7 million accounts have been removed from ten major platforms, including TikTok, YouTube, and Instagram, since the ban took effect.

In the UK, Prime Minister Keir Starmer has signalled a change in stance, telling Labour MPs he is now "open" to a social media ban for young people. His concern was sparked by reports of very young children spending excessive time on screens and by growing worries about the damage these platforms can inflict on under-16s. This marks a shift from his previous opposition, when he cited enforcement difficulties and the risk of driving teens to darker corners of the web.

A Regulatory and Personal Push for Safety

The drive for tougher measures is not just political. Ellen Roome, whose 14-year-old son Jools Sweeney died after an online challenge went wrong, has called for greater rights for parents to access their children's social media accounts in the event of their death. On a legislative level, the European Parliament is pushing for age limits, while Denmark is seeking to ban social media for those under 15.

TikTok stated that the new system was built in consultation with Ireland's Data Protection Commission, its lead EU privacy regulator, to ensure compliance. This development follows a 2023 Guardian investigation which found that TikTok moderators had previously been instructed to allow under-13s to remain on the platform if they claimed parental supervision.

As TikTok and other platforms, including Meta, which uses Yoti for age verification on Facebook, enhance their tools, the global debate over balancing child safety, digital access, and privacy continues to intensify.