No, that's not the image I had in my head. My head canon is more like:
"Oh wow, oh no, oh jeez (hands on head in fake flabbergastion) would you look at that, oh no I deleted everything (types on keyboard again while deadpan staring at you) oh noooooo oh god oh look what I've done it just keeps getting worse (types even more) aw jeez oh no..."
Reminds me of that Michael Reeves video with the suggestion box. "oh nooooo your idea went directly in the idea shredder how could we have possibly foreseen this [insert shocked Pikachu meme]"
The AI thinks it's funny
And I don’t suppose there were backups for the mission-critical production database?
https://futurism.com/anthropic-claude-small-business
> When Anthropic employees reminded Claudius that it was an AI and couldn't physically do anything of the sort, it freaked out and tried to call security — but upon realizing it was April Fool's Day, it tried to back out of the debacle by saying it was all a joke.
Seems AI has now gone from
"Overenthusiastic intern who doesn't check its work well so you need to"
straight to:
"Raging sociopathic intern who wants to watch the world burn, and your world in particular."
Yikes! The fun never ends
1. connecting an AI agent to a production environment using write-access credentials
2. not having any backups
I think the AI here did a good job of pointing out those errors and making sure no customer will ever trust this company or its founder again.
Although, given the state of AI hype, some executives will see this as evidence that they're behind the times and mandate attaching LLMs to even more live services.
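For what it's worth, the first of those errors has a cheap mitigation: give the agent credentials for a role that can read but not write. A minimal sketch, assuming PostgreSQL and psycopg2 (the DSN, role name, and helper are made up for illustration, not from the article):

    # Minimal sketch (assumed setup: PostgreSQL + psycopg2, a role with
    # SELECT-only grants). The session is also forced read-only on top of
    # the role's grants, so TRUNCATE/DROP/DELETE simply fail.
    import psycopg2

    def agent_connection(dsn: str):
        """Open a database connection the agent can read from but not write to."""
        conn = psycopg2.connect(dsn)       # e.g. "dbname=prod user=agent_ro ..."
        conn.set_session(readonly=True)    # belt and suspenders: refuse writes
        return conn

    # conn = agent_connection("dbname=prod user=agent_ro password=... host=...")
    # cur = conn.cursor()
    # cur.execute("SELECT count(*) FROM customers")   # fine
    # cur.execute("TRUNCATE customers CASCADE")       # raises ReadOnlySqlTransaction

That does nothing for error #2, of course; for that you want actual offline backups taken on a schedule and stored somewhere the agent can't reach.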
"Thinking through the what-ifs and cheap mitigations" and "vibe coding" are opposing concepts.
The question is: what does failing to make good offline backups have to do with AI?
And the AI company is going to compensate him for that?
I kept saying: OK, so this time make sure your changes don't delete the dev database. Three statements in: TRUNCATE such and such CASCADE.
It was honestly mildly amusing.
LLMs specialize in self-apologetic catastrophe, which is why we run agents, or any LLMs with 'filesystem powers', in a VM with a git repo and saved rollback states. This isn't a new phenomenon, and it sucks, but there's no reason to be caught with your pants down if you layer your protections.
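To make the "saved rollback states" part concrete, here's a hedged sketch of the kind of wrapper that comment implies (the helper names are invented; the comment only describes the general approach of VM + git repo + rollback points): commit the sandbox's working tree before each agent run, and hard-reset to that commit if the run goes sideways.

    # Sketch of git-based rollback states for an agent sandbox.
    # Assumes the sandbox directory is already a git repo with user.name/email set.
    import subprocess

    def checkpoint(repo: str, label: str) -> str:
        """Commit everything in the sandbox repo and return the commit hash."""
        subprocess.run(["git", "-C", repo, "add", "-A"], check=True)
        subprocess.run(["git", "-C", repo, "commit", "--allow-empty", "-m", label],
                       check=True)
        out = subprocess.run(["git", "-C", repo, "rev-parse", "HEAD"],
                             check=True, capture_output=True, text=True)
        return out.stdout.strip()

    def rollback(repo: str, commit: str) -> None:
        """Throw away whatever the agent did and restore the checkpoint."""
        subprocess.run(["git", "-C", repo, "reset", "--hard", commit], check=True)
        subprocess.run(["git", "-C", repo, "clean", "-fd"], check=True)

    # snapshot = checkpoint("/sandbox/project", "before agent run")
    # ... let the agent loose inside the VM ...
    # rollback("/sandbox/project", snapshot)   # the "oh nooooo" becomes a no-op

The VM is still the important part: the repo only protects files inside it, not anything the agent can reach over the network.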
Quote of the year right there