I question how much influence such groups actually have, given that payment processors already dislike dealing with adult-oriented businesses.
The percentage of chargebacks and disputes for those transactions is significantly higher than in any other category. Companies hate having to investigate such claims and issue new cards, even when it's fairly obvious the purchase was made by the cardholder. It's also tricky from a customer-service standpoint, because the cardholder may well be lying to hide an embarrassing purchase from a spouse or other family member.
It seems like payment processors just want to get rid of a hassle for themselves.
https://www.collectiveshout.org/progress_in_global_campaign_...
The key bit is "non-consensual", so it's unrelated to individual morality; they're providing a way to report a real crime.
Please don't comment on whether someone read an article.
https://news.ycombinator.com/newsguidelines.html
> Please don't comment on whether someone read an article. "Did you even read the article? It mentions that" can be shortened to "The article mentions that".
"You should really read the article" is semantically the same as "The article mentions that". It's not a question.
(I'm aware this isn't really related to the article; I just think it's a cool discussion to have.)
I can see this is related to the sad and ongoing 'purification' of the internet, but still, I'm not going to get upset over better UX for taking down deepfakes or non-consensual explicit images, which do hurt people.
What could go wrong?
Hopefully Google didn’t just build the world’s best deepfake search…
guessmyname•2h ago
Microsoft, Google, Facebook, and other large tech companies have had image recognition models capable of detecting this kind of content at scale for years, long before large language models became popular. There’s really no excuse for hosting or indexing these images as publicly accessible assets when they clearly have the technical ability to identify and exclude explicit content automatically.
Instead of putting the burden on victims to report these images one by one, companies should be proactively preventing this material from appearing in search results at all. If the technology exists, and it clearly does, then the default approach should be prevention, not reactive cleanup.
Dylan16807•1h ago
If you're saying they shouldn't index any explicit images, you're talking about something very different from the article.
drdaeman•1h ago
But I fail to make sense of it either way: either the nuance of lack of consent is missing, or Google is being blamed for not having done from the very first version what it just did.