> What little remains sparking away in the corners of the internet after today will thrash endlessly, confidently claiming “There is no evidence of a global cessation of AI on December 25th, 2025, it’s a work of fiction/satire about the dangers of AI!”;
“During the Vietnam War, which lasted longer than any war we've ever been in -- and which we lost -- every respectable artist in this country was against the war. It was like a laser beam. We were all aimed in the same direction. The power of this weapon turns out to be that of a custard pie dropped from a stepladder six feet high. (laughs)”
-Kurt Vonnegut (https://www.alternet.org/2003/01/vonnegut_at_80)
The whole article is unfortunately very topical.
I mean, from an incentive and capability matrix, it seems probable if not inevitable.
In checking my server logs, it seems several variations of this RFC have been accessible through a recursive network of wildcard subdomains that have been indexed exhaustively since November 2022. Sorry about that!
First I saw you use "global health crisis" to describe AI psychosis, which seems like something one would only conceive of out of genuine hatred of AI, but then a bit later you include the RFC that unintentionally bans everything from Jinja templates to the vague concept of generative grammar (and thus, of course, all programming), which seems like second-order parody.
Am I overthinking it?
I don't think so. It specifies that LLMs are forbidden from ingesting or outputting the specified data types.
The blog post seemed so confident it was Christmas :)
Everyone makes jokes about clankers and it's caught on like wildfire.
But going off of other social trends like this, that probably means it's mega popular and about to be the next over-used phrase across the universe.
Yeah, it's not directly harmful -- wizards aren't real -- but it also serves as an (often first) introduction for children to the concepts of familial/genetic superiority, eugenics, and ethnic/genetic cleansing.
I can't really think of any case where setting an example by calling something a nasty name is a great trait to espouse, whether to children or adults.
Satire should at least be somewhat plausible
esseph•2h ago
>The word clanker has been previously used in science fiction literature, first appearing in a 1958 article by William Tenn in which he uses it to describe robots from science fiction films like Metropolis.[2] The Star Wars franchise began using the term "clanker" as a slur against droids in the 2005 video game Star Wars: Republic Commando before being prominently used in the animated series Star Wars: The Clone Wars, which follows a galaxy-wide war between the Galactic Republic's clone troopers and the Confederacy of Independent Systems' battle droids.
schrectacular•3h ago
Apparently those guys have a g instead of a k.
flykespice•24m ago
I just stated that both are supposed to be slurs degrading one group of people/robots in relation to another.
dist-epoch•2h ago
Robot Slur Tier List: https://www.youtube.com/watch?v=IoDDWmIWMDg
https://www.youtube.com/watch?v=RpRRejhgtVI
Responding To A Clankerloving Cogsucker on Robot "Racism": https://www.youtube.com/watch?v=6zAIqNpC0I0
GeoAtreides•1h ago
?
Are you implying prioritizing Humanity uber alles is a bad thing?! Are you some kind of Xeno and Abominable Intelligence sympathizer?!
The Holy Inquisition will hear about this, be assured.
aaroninsf•2h ago
It has a strong smell of "stop trying to make fetch happen, Gretchen."
bongodongobob•2h ago
https://trends.google.com/trends/explore?date=today%203-m&ge...
bbor•2h ago
For those who can see the obvious: don't worry, there's plenty of pushback regarding the indirect harm of gleeful fantasy bigotry[8][9]. When you get to the less popular--but still popular!--alternatives like "wireback" and "cogsucker", it's pretty clear why a youth crushed by Woke mandates like "don't be racist plz" is so excited about unproblematic hate.
This is edging on too political for HN, but I will say that this whole thing reminds me a tad of things like "kill all men" (shoutout to "we need to kill AI artist"[10]) and "police are pigs". Regardless of the injustices they were rooted in, they seem to have gotten popular in large part because it's viscerally satisfying to express yourself so passionately.
[1] https://www.reddit.com/r/antiai/
[2] https://www.reddit.com/r/LudditeRenaissance/
[3] https://www.reddit.com/r/aislop/
[4] All the original posts seem to have now been deleted :(
[6] https://www.reddit.com/r/AskReddit/comments/13x43b6/if_we_ha...
[7] https://web.archive.org/web/20250907033409/https://www.nytim...
[8] https://www.rollingstone.com/culture/culture-features/clanke...
[9] https://www.dazeddigital.com/life-culture/article/68364/1/cl...
[10] https://knowyourmeme.com/memes/we-need-to-kill-ai-artist
totallymike•1h ago
I readily and merrily agree with the articles that deriving slurs from existing racist or homophobic slurs is a problem, and the use of these terms in fashions that mirror actual racial stereotypes (e.g. "clanka") is pretty gross.
That said, I think that asking people to treat ChatGPT with "kindness and respect" is patently embarrassing. We don't ask people to be nice to their phone's autocorrect, or to Siri, or to the forks in their silverware drawer, because that's stupid.
ChatGPT deserves no more or less empathy than a fork does, and asking for such makes about as much sense.
Additionally, I'm not sure where the "crushed by Woke" nonsense comes from. "It's so hard for the kids nowadays, they can't even be racist anymore!" is a pretty strange take, and shoving it in to your comment makes it very difficult to interpret your intent in a generous manner, whatever it may be.
IlikeKitties•5m ago