Want to use Discord from next month? You’ll have to hand over a photo of your ID or a scan of your face to verify you’re of age. It’s part of a new process the chat app is introducing, aimed at ensuring no one underage is using the platform. All new and existing users, the company says, will be given a “teen-appropriate experience” by default, including content filtering and limited access to spaces that host adult content.
To regain the experience they previously had, users will need to prove their age through one of several options, including video selfies or sharing a photo of an identity document. (Discord did not immediately respond to Fast Company’s request for comment.)
Users have reacted unfavorably to the proposal, with many saying they’re unhappy about sharing personal data with Discord, which suffered a massive data breach just months ago. In that incident, ID photos of 70,000 users were potentially leaked after a cyberattack. (Discord said the incident involved a third-party customer support provider, not its own systems.)
What worries privacy groups most is not just Discord’s plan, but the precedent it sets for other platforms. “It’s a reflection of growing concerns over the erosion of privacy online, and the slippery slope of mandating identity and age verification across the internet, making these systems a prime tool for surveillance and tracking,” says Rin Alajaji, associate director of state affairs at the Electronic Frontier Foundation (EFF). “Mandating age verification on a platform like Discord directly undermines the platform and the internet’s long-standing culture of anonymity.”
There are also broader concerns about the growing requirement that users prove who they are, and how old they are, to do things they previously did without scrutiny. U.K. polling suggests that while people may support age checks in principle, willingness to comply drops sharply once specifics are involved: the public is split on the use of face video and photographic ID, and only 23% of Brits say they’d hand over ID to access discussion forums like Discord. The same tension appears in the U.S.: people want children protected online, but are less comfortable when those protections infringe on their own rights.
Elinor Carmi, a senior lecturer in data politics and data justice at City St George’s, University of London, argues the backlash isn’t just about biometrics or ID checks in the abstract, but about whether people believe this kind of gatekeeping will actually work. “People just don’t think that age verification actually works,” she says, adding that users see policymakers and platforms reaching for a patch rather than a fix. “The social media platforms and the regulators are basically saying, ‘We have an issue, but let’s not deal with it. And let’s try to solve it in the most technical and easy solution, which is obviously also not working, because you can obviously fake it.’”
There’s also fatigue with the concept itself: users, teenagers and adults alike, feel the burden of verification is being shifted onto them rather than onto the platforms. And beyond that, there are worries about the consequences of a “papers, please” era of the web. “For many users—especially vulnerable groups like LGBTQ+ youth—having a space to connect without revealing their real identities is essential for safety and free expression,” says EFF’s Alajaji. “Age verification puts that at risk, forcing users to choose between privacy and participation.”
She calls it “reckless” to ask people to hand over more personal data after some users already lost theirs in last year’s Discord-linked data breach. People are wary because they’ve been burned before, and they know they’re being asked to trade their likeness and other sensitive information simply to participate online. “Many users are understandably alarmed about their data being exposed or misused,” Alajaji says.