On the other hand... there is "WikiHitler", a game where people click on a "random article" on Wikipedia and try to reach the "Adolf Hitler" page in the fewest clicks... so yeah, technically, on Wikipedia you're always a few clicks away from Hitler too, but not by accident.
https://globalwitness.org/en/campaigns/digital-threats/tikto...
I don't know why news sites don't link to the source, but that's another discussion.
If this was Instagram nobody would care.
> Global Witness, a climate organisation whose remit includes investigating big tech’s impact on human rights, said it conducted two batches of tests, with one set before the implementation of child protection rules under the UK’s Online Safety Act (OSA) on 25 July and another after.
Also, why the hell is a human rights / climate org doing research on TikTok?
Because such places are significant spots for propaganda and misinformation relating to both topics?
https://fortune.com/2025/09/28/larry-ellison-ai-surveillance...
Here's a link to the wiki for an actual reality television show that exists in real life, Hardcore Pawn (https://en.wikipedia.org/wiki/Hardcore_Pawn). That isn't a misspelling. Welcome to the phase of the TrumpTok Takeover where We Need To Do Something To Protect The Children. I wish you luck in the Telescreen portion. Remember, if you make woke facial expressions at the camera during any of the daily loyalty oaths you will be declared Antifa and reeducated.
>After a “small number of clicks” the researchers encountered pornographic content ranging from women flashing to penetrative sex.
Where? I'm a grown-ass adult who likes sex and has had a TikTok account for years now, and I can't find any of this. I can find people dancing, dressed in a way that would be perfectly acceptable in public, but where are the women flashing and penetrative sex? Can anyone confirm that they've seen any of these things at all on TikTok, let alone after a "small number of clicks"?
Guess those dumb TikTok-wannabe Shorts/Stories didn't work out.
Up next: Terrorist attacks coordinated via TikTok?
Or maybe a school shooting, leading to a ban on TikTok instead of guns.
Oh Murica..
Also, I denied all access but it still suggested all my son's friends? How? Oh, and it won't even start without access to the camera.
I was pretty shocked. Still, a friend of mine, a teacher, tells me: you can't let your kid not have SnapChat, it's very important to them.
The Chinese apparently say: just regulate! TikTok in their country is fun, even educational, with safeguards against addiction, because they mandate it. Somehow we don't want that here? We see it as overreach? Well, I'm ready for some overreach (not ChatControl overreach, but you get what I mean). We leave it all up to the parents here, and all parents say: "Well, my kid can't be the only one to not have it."
Meanwhile, the kids I speak to tell me they regularly see vape shops popping up on SnapChat: some dudes sell vapes with candy flavors (outlawed here) until the cops show up.
Yeah, we also did stupid things, I know; we grew up, found porn books in the park (pretty gross in retrospect), drank alcohol as young as 15, etc. I still feel this is different. We're just handing it to them.
Edit: Idk if you ever tried SnapChat, but it is TikTok, chat, weird AI filters, and something called "stories", which for me features a barely dressed girl in a sauna.
It works in China because they have chat control to the extreme.
Yeah, it's OK to say no.
If the kid wants a phone and snapchat, there's nothing wrong with saying you simply won't be supplying that and if they want it they'd best figure out how to mow lawns. If you're old enough to "need" a phone you're old enough to hustle some yardwork and walk to the T-Mobile store yourself.
Of course, the news rag can't publish the pictures/videos and the accounts as proof. But we're supposed to take their word for it? Hard pass on that.
Now, I have seen advertisements that used sexism of various sorts. And this is common wherever advertising and capitalism take hold; it's a quick and dirty hack to help sell garbage. https://en.wikipedia.org/wiki/Sex_in_advertising
https://globalwitness.org/en/campaigns/digital-threats/tikto...
The world is hostile and full of exploitation. It is no different on the internet.
X, on the other hand, has literal advertisements for adult products on my feed, and I get followed by "adult" bot accounts several times a week which, when I click through to block them, often show me literal porn. Same with spam Facebook friend requests.
I think it boils down to a simple fact: trying to police user-generated content is always going to be an uphill battle, and it doesn't necessarily reflect on the company itself.
> Global Witness claimed TikTok was in breach of the OSA, which requires tech companies to prevent children from encountering harmful content...
OK, that is a noble goal, but I feel that the gap between "reasonable measures" and "prevent" is vast.
I think it boils down to the simple fact that policing user-generated content is completely possible; it just requires identity verification, which is a very unpopular but completely effective idea. It's almost like we've rediscovered, on the internet, the same problems that require identity in other areas of life.
I think you will also see a push for it in the years ahead. Not necessarily because of some crazy new secret scheme, but because bots will be smart enough to beat most CAPTCHAs and other techniques, and AI will be too convincing, causing websites to be overrun. Reddit is already estimated to be somewhere between 20% and 40% bots. Reddit was also caught with its pants down by a study recently, with an AI bot on r/changemyview racking up ridiculous amounts of karma undetected.
This is disingenuous; the supposed NGO behind this is funded by another arm of the British government, the DFID (Department for International Development). This is native propaganda—the UK government is laundering its own pro-OSA agenda through watchdog organizations that aren't independent at all.
https://web.archive.org/web/20100902185631/http://www.global... ("Our Funders")
https://en.wikipedia.org/wiki/Global_Witness#Income
The EU's parallel censorship regime did something similarly inauthentic (unlawfully targeting pro-Chat Control ads to influence votes).
https://noyb.eu/en/noyb-files-complaint-against-eu-commissio... ("noyb files complaint against EU Commission over targeted chat control ad campaign")
dentemple•1h ago
I (40m) don't think I've ever seen literal flashing or literal porn on TikTok, and my algorithm does like to throw in thirst content between my usual hobby stuff.
Are they making the claim that showing porn is normal behavior for TikTok's algorithm overall, or are they saying that this is something specifically pervasive with child accounts?
netruk44•1h ago
Approximate location, age, mobile OS/browser, your contacts, which TikTok links you open, who generated the links you open, TikTok search history, how long it takes you to swipe to the next video on the For You page, etc.
I don't think it's really possible to say what TikTok's algorithm does "naturally". There are so many factors influencing it (beyond the promoted posts and ads which people pay TikTok to put in your face).
If you sign up to TikTok with an Android and tell it you’re 16, you’re gonna get recommended what the other 16 year olds with Androids in your nearby area (based on IP address) are watching.
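In toy terms, the cold-start behavior I mean looks roughly like this (a sketch in Python, nothing to do with TikTok's actual code; every name here is made up):

    from collections import Counter

    def cold_start_feed(profile, user_profiles, watch_history, top_n=10):
        # Find accounts in the same coarse cohort: similar age, same OS,
        # same rough region (a stand-in for "nearby, based on IP address").
        cohort = [
            uid for uid, p in user_profiles.items()
            if abs(p["age"] - profile["age"]) <= 1
            and p["os"] == profile["os"]
            and p["region"] == profile["region"]
        ]
        # Recommend whatever is most-watched inside that cohort.
        counts = Counter(v for uid in cohort for v in watch_history.get(uid, []))
        return [video for video, _ in counts.most_common(top_n)]

Once the account starts generating its own signals (swipes, watch time, searches), the coarse cohort presumably matters less and the feed drifts toward whatever the individual actually lingers on.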
elevation•45m ago
I assume that the offending content was popular but hadn’t been flagged yet and that the algorithm was just measuring her interest in a trending theme; it seems like it would be bad for business to intentionally run off mainstream users like that.
gadders•1h ago
It might be because I always block anyone with an OF link in their bio, but then that policy doesn't work on Insta.
ivape•48m ago
We’re a derelict society that has become numb, “it’s just a thirst trap”.
We’re in the later innings of a hyper-sexualized society.
Why it’s bad:
1) You shift male puberty into overdrive
2) You continue warping young female concepts of lewdness and body image, effectively “undefining” it (lewdness? What is lewdness?).
3) You also continue warping male concepts of body image
ivape•39m ago
“Just thirst trap” (And you see the word I read into).
Right. No, I get it. Listen, we collectively have the issue of not recognizing the significance of things. Nothing personal.
Hizonner•40m ago
Yes.
Promoting vaping, not so much.
> We’re in the later innings of a hyper-sexualized society.
O NOES!
I mean, that's a ridiculous thing to say, but if it were true, so what?
ivape•38m ago
Two wrongs don't make a right. I honestly regret this name, as there are a lot of high-school- and college-aged people here.
gjsman-1000•23m ago
I used to care about what HN generally believed, but over the last year, seeing obvious statements like this get downvoted has made me want to support every tech regulation and age verification law we hate here.
dogleash•14m ago
Are you saying that the intersection is uniquely bad? In either case, limits on content aimed at minimizing parasocial relationships cut along very different lines than limits whose goal is minimizing access to porn.
Hizonner•44m ago
They are in the business of whipping up outrage, and should not be given any oxygen.
lupusreal•42m ago
Clicking on thirst trap videos?
lupusreal•37m ago
> Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.
Hizonner•21m ago
https://globalwitness.org/en/campaigns/digital-threats/tikto...
Their methodology involves searching for suggested terms. They find the most outrage-inducing or outrage-adjacent terms offered to them at each step, and then iterate. They thereby discover, and search for, obfuscated terms being used by "the community" to describe the content they are desperately seeking.
They also find a lot of bullshit like the names of non-porn TV shows that they're too out of touch to recognize and too lazy to look up, and use those names to gin up more outrage, but that's a different matter.
This is, of course, all in the service of whipping up a moral panic over something that doesn't fucking matter to begin with.
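If you want the loop spelled out, it's basically this (a sketch; get_suggested_terms and explicitness are stand-ins for the app's search box and the researchers' own judgment, not any real API):

    def follow_suggestions(get_suggested_terms, explicitness, seed="", depth=5):
        # get_suggested_terms(query) -> list of search terms the app offers
        # explicitness(term) -> how lurid the researchers judge a term to be
        trail = []
        query = seed
        for _ in range(depth):
            suggestions = get_suggested_terms(query)
            if not suggestions:
                break
            # Always pick the most explicit-looking suggestion and search for it.
            query = max(suggestions, key=explicitness)
            trail.append(query)
        return trail

Run that for a few iterations and of course you end up somewhere lurid; that's the entire point of the exercise.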
Hizonner•19m ago
You will of course have wasted your time on a non-problem, but at least maybe you'll have an appreciation for how hard a non-problem it is.