Liv McMahon, Technology reporter

Discord will soon require all users globally to verify their age with a face scan or by uploading a form of ID if they want to access adult content.
The online chat service, which allows people to create and join groups based on their interests, says it has more than 200 million monthly users.
Its new safety measures are designed to protect people by placing everyone into a teen-appropriate experience "by default".
Discord already makes some users in the UK and Australia verify their age to comply with online safety laws - but it will roll out the age checks to all users worldwide from early March.
"Nowhere is our safety work more important than when it comes to teen users," said Discord policy head Savannah Badalich.
"Rolling out teen-by-default settings globally builds on Discord's existing safety architecture, giving teens strong protections while allowing verified adults flexibility."
The platform says the new default settings will restrict what people can see and how they can communicate, with only those who prove they are an adult able to access age-restricted communities and unblur material marked as sensitive.
Users will also not be able to see direct messages sent to them from someone they do not know unless they complete Discord's age checks.
Users can either upload a photo of their ID to confirm their age or take a video selfie, where AI will be used to estimate their facial age.
Discord said information used for age checks will not be stored by the platform or the verification company.
It said face scans would not be collected, and ID uploads would be deleted after the verification is complete.
Privacy campaigners have previously warned such methods could pose a risk to people's privacy.
Discord faced criticism in October after official ID photos of around 70,000 users were potentially leaked after a firm which helped it verify ages was hacked.
With its new measures - which include a teen safety council - Discord is also echoing Meta's Facebook and Instagram, TikTok and Roblox.
Social platforms have rolled out a slew of measures to protect teens and children on their sites in recent years after facing increased pressure from lawmakers.
Discord's boss Jason Citron was grilled about his company's child safety measures at a fiery US Senate hearing in 2024, alongside Facebook founder Mark Zuckerberg, Snap boss Evan Spiegel and TikTok's chief Shou Chew.