YouTube, Twitter, Bluesky, WhatsApp? Every app with a social aspect will be used by teens. And no, TikTok is not "only for teens" or "specially targeted at teens"; nowadays everyone uses it and creates content on it.
If you run (say) a restaurant, you get big spikes in business from TikTok videos in ways you don't get from Facebook or Instagram or others.
TikTok is the platform everyone is on right now.
It's a communication channel attached to the most popular social network for young people. Obviously they're going to use it a lot. They use it because it's extremely convenient.
I'm mindful that it's less secure than other apps, but for a lot of chats it doesn't matter.
It's uncontroversial amongst people who value their privacy.
The tension between the two camps (there are obviously nuances and this is a false dichotomy) is at a current peak. It's an ongoing controversy. It's a matter of public debate.
You might have liked it better if the angle had been "...which the government, controversially, wants to clamp down on" or something.
>“Parents should also be aware that players may want to find out more about the game using other platforms such as YouTube, Twitch, Reddit and Discord, where other game fans can discuss strategies and experiences.
After you notice it, you'll notice it everywhere.
So yeah, age verification should be struck down, along with the data mining these companies do and the opaque tuning of their algorithms. It baffles me: people are concerned about their children's DMs but not concerned about what companies serve them and what they do with their data.
That said, these platforms are making it impossible for parents to monitor anything. They're literally designed to profit off addiction in children.
Absolutely. But what responsibilities do megacorps have? Right now, everyone seems to avoid this question and makes do with megacorps not being responsible. This amounts to saying: "we'll allow megacorps to be as they are and not take any responsibility for the effects they have on society." Instead of them taking responsibility, we're collecting everyone's data, because there are many interests involved.
Instead, children would own special devices that are locked down and tagged with an "underage" flag when interacting with online services, while adults could continue as normal. We already heavily restrict the freedom of children, so there is plenty of precedent for this. Optionally we could provide service points to unlock devices when they turn 18, to avoid e-waste as well.
This way it's the point of sale where you provide your ID, instead of attaching it to the hardware itself and sending it out to every single SaaS on the planet to do what they wish with.
https://digitaldemocracynow.org/2025/03/22/the-troubling-imp...
And yet, it's even more complex than that, since it's now owned by cronies of the current US President. I've never had a TikTok account, but conceptually I was mostly pretty okay with being spied-upon by China. I'm never going to China.
Voluntarily.
Most large platforms rely heavily on server-side visibility for abuse detection, spam filtering, recommendation systems, and safety tooling. End-to-end encryption removes that visibility by design. Once a platform is built around centralized analysis of user content, adding strong E2EE later isn’t just a feature toggle — it conflicts with large parts of the existing architecture.
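The conflict described above can be sketched in a few lines. This is a toy illustration only (XOR with a random one-time key, not real cryptography, and all names are made up for the example): a hypothetical server-side keyword filter works when the server holds plaintext, but has nothing to inspect once clients encrypt end to end.

```python
# Toy sketch: why centralized content analysis and E2EE conflict.
# The XOR "cipher" stands in for a real E2EE scheme; the point is only
# that the server stores ciphertext it cannot read.
import secrets


def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """One-time-pad-style XOR; encryption and decryption are identical."""
    return bytes(p ^ k for p, k in zip(plaintext, key))


def server_spam_filter(message: bytes) -> bool:
    """Centralized filter: only useful if the server can read content."""
    return b"free money" in message.lower()


key = secrets.token_bytes(64)      # shared only between the two clients
msg = b"Claim your FREE MONEY now!"

# Without E2EE the server sees plaintext, so the filter catches spam.
assert server_spam_filter(msg)

# With E2EE the server only ever stores ciphertext: the filter is blind,
# and so are spam detection, recommendations, and safety tooling.
ciphertext = xor_encrypt(msg, key)
assert not server_spam_filter(ciphertext)
```

Retrofitting E2EE onto a platform built this way means replacing every server-side pipeline that assumed readable content, which is why it isn't just a feature toggle.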
You can’t moderate an E2EE platform.