
Discord rolls out age checks amid pressure to protect young users

All Discord users across the world will have to provide proof of age next month, or be relegated to a ‘teen appropriate’ experience by default. Will this help Discord to finally address safety concerns regarding young users?

Discord has proven notoriously hard to police, but rising pressure in recent years has prompted the platform to roll out stronger safety measures for 2026.

When March rolls around, all Discord users across the globe will be met with the same age verification window. Anyone who cannot adequately prove their adult status will be relegated to a ‘teen appropriate’ experience.

Those who aren’t age verified will not be able to access age-restricted servers – even if they were members prior to the change – and will lose the ability to speak in livestream ‘stage’ channels.

Content filters will also prevent teenage users from partaking in communities or viewing media deemed potentially graphic or sensitive, and messages from unfamiliar accounts will be automatically filtered into a separate inbox from normal DMs.

Discord implemented age checks in the UK and Australia last year, but the latest expansion will apply to all users globally – of which roughly 60 million are teens. The first of two options uses AI to analyse a video selfie and estimate the subject’s age. The second, more intuitive option is to send a photo of your ID to be analysed (and ‘immediately’ deleted) by a third-party vendor.

The platform’s head of product policy, Savannah Badalich, is confident that Discord’s embarrassing launch of safety measures in the UK, which saw hundreds easily bypass the checks using Death Stranding’s in-game photo mode, has served as a learning experience to make the global rollout a success.

Internally, there’s an acceptance that a fresh wave of age checks will be the final straw for some users, given hackers already acquired thousands of private Discord usernames, email addresses, and biometrics just last October through a third party. Discord claims to have cut ties with that vendor entirely.

Talking of a potential hit to engagement, Badalich explained, ‘We do expect that there will be some sort of hit there, and we are incorporating that into what our planning looks like. We’ll find other ways to bring users back.’

With several major platforms rushing out safeguards overnight to appease tech regulators, and others intent on introducing premium ad-free subscriptions (cough, Instagram), it’s understandable that people are becoming irked at their app experiences being degraded and suspicious of where their data is going.

In the case of Discord, however, it’s hard to deny that the ends justify the means. With no prior age safety compliance, its servers have been rife with predatory behaviour and the spread of some truly heinous material in recent years.

In some instances, extremist Discord communities have been linked to multiple murders, cases of self-mutilation, and even forced suicide – with young people often embroiled in them as both offenders and victims.

It’s inevitable that the most determined teenagers will find ways around the upcoming barriers, as they have with similar systems before. But this does at least signal that Discord is taking a more proactive (and hopefully less reactive) approach to moderation.

Ending the platform’s misuse problem completely is too tall an order, but it’s encouraging to see youth safety finally being treated as a core responsibility, and not an afterthought.
