Platforms almost always begin with "90% of users will never notice," then gradually expand scope as regulatory pressure, liability, and competition increase. We saw similar dynamics with real-name policies at Facebook: initially narrow safeguards become infrastructure.
Auditability, fraud prevention, enforcement reporting, and the like have historically pushed platforms toward more persistent verification regimes.
"On-device only" promises often erode once regulatory audits demand server-side attestations.
This is the one thing Apple still gets right these days: they make backdoors technically impossible rather than just promising not to build them.
The second-order effect is normalization. Once large platforms operationalize age assurance, regulators point to them as proof that stricter mandates are "feasible," accelerating a global compliance cascade. Smaller platforms then adopt similar systems to avoid liability.
In past cycles (COPPA, GDPR, cookie banners, payment KYC) the burden disproportionately favored incumbents who could afford compliance. The likely long-term equilibrium is a stratified internet where meaningful participation in adult spaces increasingly requires some portable proof-of-age token, whether nominally anonymous or not.
It's not that Discord is acting in bad faith; they are simply the first domino to fall in what will become yet another GDPR-cookie-banner situation: a further erosion of privacy and another nail in the coffin of a free internet.
Discord should have just "poasted through it," Huberman-style, and the mob would have moved on to the next platform, which will inevitably be forced to enact similar policies anyway.