There are those that will stay on Discord because the benefits of the first three outweigh the degradation of privacy. Then there are those that will leave because the first three aren't important enough to outweigh the privacy loss. There will be all sorts of people in between.
HN has a rather amplified showing of folks who won't trust anything unless it's completely decentralized using E2EE clients verifiably compiled from source that they've personally audited, running on hardware made from self-mined rare metals. The reality is that there is a spectrum of folks out there, all with different preferences, and while some folks will leave (in this case) Discord, others will remain because that's where the folks they want to chat/game/voice with are.
Back when I played games one friend in our group was banned from LoL arbitrarily so the whole group switched to Dota 2.
Honestly, all of these are documented probabilities at this point. SNS owners can make very decent predictions about what will happen if they introduce a certain kind of friction. Also, it’s not 2005 anymore; people are used to uploading their IDs everywhere. I mentioned it before as well: if you’ve used any large app, the chances are you’ve uploaded your ID (Airbnb, Tinder, etc.)
SimpleX seems trustworthy enough, with thoughtful design decisions, even if it fails my "forced Tor" requirement. I haven't spent the time to dive into Session's architecture, but it's on my to-do list; currently the marketing copy makes it look like the best choice.
I’ll be building a new platform on these two technologies and using Zoom or something else like Jitsi on the side for video/audio sharing.
It’s time to accept the loss of “features” and go back to something simpler but also something that can still be here in 38 years — like IRC has been.
I guess I have a hard time understanding these calls to switch to a platform that has even fewer features than an unverified Discord account. The blog post is incorrect in claiming that verification will be mandatory. It will only be necessary to access certain features and content. For simple IRC-style chats, or even for voice chats with gaming friends, no verification is required.
The average Discord user, or even the 98th percentile user, isn’t going to be looking to switch to a platform that isn’t a replacement for the features they use. They’re just going to not verify their accounts and move on.
Communities aren’t about the “platform features”; they’re about the environment. Something for-profit CEO after CEO fails to recognize, time after time.
Things like image embeds, "markdown lite" formatting, and cross-device synchronization are now considered table stakes. There are always going to be some EFnet-type grognards who resist progress because reasons, but they should be ignored.
IRCv3 and Ergo support some of what's needed already (and in a backwards-compatible way!) but client support just isn't there yet, particularly on mobile.
Coming from a former heavy IRC user who's not going back except for nostalgia trips.
This isn't really accurate. Age verification is not mandatory for all accounts. You will be able to join a Discord with your friends, chat, and do voice without age verification.
Here's the exact list of what's restricted if you don't verify:
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces – Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
Taken from the announcement https://discord.com/press-releases/discord-launches-teen-by-...
So the claim that Discord is making ID verification "mandatory" or that you need it for gaming chats is untrue.
For children - this mandate also still makes the decision on behalf of the parents that a child must submit a scan of their face to a third party. Moving to Persona for age verification means verification data is sent off the user's phone - in direct contradiction to Discord's initial promise. And these are third parties we've been given no reason to trust to delete the data, or to refrain from improper uses such as deriving information from the ID or facial scan beyond the sole stated purpose of verifying that an individual is an adult.
While we're at it - is there any legitimate reason why Discord is associating a person's actual or estimated age with their account as opposed to storing a value that states if they are or are not an adult? That sort of granularity seems unrelated to the stated purpose.
Also - the outcry here isn't from people who think they will no longer be able to use Discord in any way, shape, or form without going through an age verification process. That's a bizarre strawman that doesn't represent the main grievances being aired.
"Additionally, Discord will implement its age inference model, a new system that runs in the background to help determine whether an account belongs to an adult, without always requiring users to verify their age"[0]
0: https://discord.com/press-releases/discord-launches-teen-by-...
> Age verification is only one part of Discord’s broader age assurance approach. For the majority of adult users, we can assign their age group using information we already have (and this does not use message content), using age inference to determine whether an account belongs to an adult and allowing access to age-restricted experiences without completing an explicit age assurance flow.
1. https://support-dev.discord.com/hc/en-us/articles/3833839897...
You are correct. For now. But why would they stop there?
Supposedly this is to protect teens. If that's true, why would they continue letting teens chat with anonymous users? What if they get tricked into sharing sensitive images or video of themselves? Surely we need to know everyone's ID to ensure teens aren't unwittingly chatting with a known predator. It's for their safety. But for now that's a bridge too far. For now.
And why should we believe this even has anything to do with protecting teens? That's valuable data. Discord says they're not holding onto it... for now. But Discord is offering quite a lot to users for free. Why let such an obvious revenue source go unmonetized? They're doing this now because they're going public soon. Investors want an ROI and this action is sure to invite some competition. The people leaving want an alternative, so a competitor could get a foothold. Discord needs to stay ahead. And the users Discord keeps after this stunt are going to be the most resilient to leaving - the most exploitable. Surely they wouldn't care if the policy changes in the future.
The sky isn't falling. But the frog is boiling.
All so that we can post online about how Google is invading our privacy?
1. A way for politicians and the state to tie US citizens' porn habits to their identities and use that information against them in the future. Blackmail material with which those in power can coerce future politicians, business leaders, and the wealthy into doing what they want.
2. A way for conservatives to tighten the noose around non-chaste materials and begin to purge them from the internet. And if that works, that's hardly the last thing that will go. Next will be LGBT content, women's rights content, atheist content, pro-labor content, and more. (Or if you're on the other side of the political spectrum, consider that the same powers could be used to remove Christian content, 2nd Amendment content, etc. It doesn't really matter what is being removed, just that the mechanisms are in place and that those in power can put a lid on the populace.)
We aren't screaming loudly enough.
Do not try to sugarcoat this with pedantic corrections.
This is far worse.
It's a first step down a path the Big Brother state wants.
Yell.
Scream.
Protest.
This topic really brings out the crazy conspiracy theories.
No, politicians are not using Discord age verification to track constituents' porn habits and blackmail them with it later.
Imagine what will happen post-IPO.
Did they forget it's proprietary, and from the same person who made OpenFeint, which also faced a privacy lawsuit?
And not just that event: parents are roasting Roblox for kids getting groomed, but after the relationship is initiated, the groomers always immediately move the conversation to Discord.
What's really more distressing is that it got this far before people figured out the game. Maybe we should be reflecting on that part: the gullibility, and the enabling of those people by those who knew better.
> For the majority of adult users, we will be able to confirm your age group using information we already have. We use age prediction to determine, with high confidence, when a user is an adult. This allows many adults to access age-appropriate features without completing an explicit age check.
> Facial scans never leave your device. Discord and our vendor partners never receive it. IDs are used to get your age only and then deleted. Discord only receives your age — that’s it. Your identity is never associated with your account.
> We leverage an advanced machine learning model developed at Discord to predict whether a user falls into a particular age group based on patterns of user behavior and several other signals associated with their account on Discord. We only use these signals to assign users to an age group when our confidence level is high; when it isn't, users go through our standard age assurance flow to confirm their age. We do not use your message content in the age estimation model.
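To make the quoted "high confidence, otherwise fall back" flow concrete, here is a toy sketch of a confidence-gated inference pattern. Everything in it — the signal names, weights, and threshold — is invented for illustration; Discord's actual model and features are not public.

```python
# Toy sketch of a confidence-gated age-inference flow.
# All signals, weights, and the threshold below are hypothetical.

from dataclasses import dataclass

ADULT_CONFIDENCE_THRESHOLD = 0.95  # invented cutoff

@dataclass
class AccountSignals:
    account_age_days: int
    adult_server_ratio: float   # fraction of joined servers flagged 18+
    payment_method_on_file: bool

def predict_adult_confidence(s: AccountSignals) -> float:
    """Stand-in for an ML model: returns P(account belongs to an adult)."""
    score = 0.0
    score += min(s.account_age_days / 3650, 1.0) * 0.5  # long-lived account
    score += 0.3 if s.payment_method_on_file else 0.0   # payment implies 18+
    score += s.adult_server_ratio * 0.2
    return min(score, 1.0)

def assure_age(s: AccountSignals) -> str:
    """Infer silently when confidence is high; otherwise route to the
    explicit verification flow (ID upload / face scan)."""
    if predict_adult_confidence(s) >= ADULT_CONFIDENCE_THRESHOLD:
        return "inferred_adult"
    return "explicit_verification"

print(assure_age(AccountSignals(4000, 0.8, True)))   # high-confidence adult
print(assure_age(AccountSignals(30, 0.0, False)))    # falls back to explicit check
```

The interesting design property is the asymmetry: a low-confidence prediction never denies anything, it only falls back to the explicit flow, so the model's false negatives cost friction rather than access.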
I work with corporate privacy all of the time, and there is actually something really interesting going on here. We're basically never allowed to claim legal compliance using heuristics or predictive models. Like, never ever. They demand a paper trail on everything, and telling our legal team that we are going to leave it to an algorithm on a user device would make them foam at the mouth.
They are basically trusting a piece of software to look at your face or ID in the same way that, like, a server at a restaurant would check before serving you alcohol.
I am curious to see if this kind of software compliance in the long run is even allowable by regulators.
I feel like it has always been on this path to capture more and more of your data and personally link it to who you are.
herpdyderp•47m ago
I thought age verification was only required to access "adult" content?
jsheard•45m ago
(quotes the same restriction list from the announcement above: content filters, age-gated spaces, the message request inbox, friend request alerts, and stage restrictions)
cced•39m ago
Does this mean that in panel-like settings where hundreds of users are listening to a speaker, you need to be verified in order to ask questions or contribute by voice?
jsheard•26m ago
https://support.discord.com/hc/en-us/articles/1500005513722-...
Hell if I know why unverified users are allowed to speak in normal voice channels but not in stage channels.
roxolotl•40m ago
https://soatok.blog/2025/07/24/against-the-censorship-of-adu...