Unless we need to move to even harder languages? Haskell?
It'll be so hard to find anything in the chaff that you might get your old job as a dev back. :)
Stop blaming "AI" - whatever you mean by this. Whether it's an LLM, an LLM-based agent or something else - stop blaming AI and "AI" and LLMs and... you get the point.
It's not the AI that makes the decision to, sorry for being blunt, write worthless code that feels like a piece of useless, bloated trash. It's not the AI that decides to do something without even understanding the topic - however you define "understanding" in this context. It's not the AI that is responsible for this. Because whatever AI truly is right now - an autocomplete tool, an advanced chatbot or, maybe, an agent - whatever it is, the decisions are made by humans. AI is not responsible for anything that is happening right now.
Humans and humans only are responsible for what's happening. It's their choice. It's their qualities that are clearly visible now. It's their behaviour.
Stop blaming kitchen knives for murders.
Well, yeah, stop blaming the knives. Blame the cooks ("vibecoders") who think they can manage a kitchen because the knife cuts everything in half automatically. But also don't forget to blame the knife manufacturers ("AI" companies) who market automated knives to people who don't know you shouldn't cut toward yourself.
I kind of agree. Some people don't understand how to code because they're lazy or have other issues, while others are trying to make a profit from it. I suppose you can tell who's who. But AI is directed by humans anyway. Instead of copy-pasting, a human could choose to try and write the code themselves, and then ask AI to review it and highlight areas for improvement. A human could choose to ask AI how to do things and then try to do it themselves. But if a human chooses to do things the other way, that's their choice. AI is not to blame here. It's still a human choice, and the person making it is the one who is actually responsible.
Some people smoke. Smoking kills, and not only can smokers die from it, but other people can be harmed by passive smoking as well. It's very easy to start smoking. But blaming cigarettes themselves, as objects/entities/etc., isn't the answer, I guess. It was a certain person's choice to try smoking. It was also the choice of another person to advertise smoking in one way or another, however...
Or rather, it's like blaming a car. Yes, a bad driver is way more dangerous than a good driver, but even the best driver can make a mistake. Like cars, it's an inherently flawed piece of technology, and like cars, its benefits are too high for most of us to ignore. Way better analogy than my auto turret one.
Well, if you put it this way... even the best programmer in the world, who doesn't use AI at all, can also make a mistake. Of course, their mistakes would probably be less frequent, but I guess they wouldn't blame the IDE for poor syntax highlighting (if it's good enough, of course), or the compiler or interpreter for failing to spot a logical error unrelated to syntax rules. They would say "it was my mistake". The problem with AI-generated code, though, is that those who generate it almost never take responsibility for it. They'll say something like, "AI made a mistake here and there." I have never seen someone who generated flawed code using AI take responsibility for it. And that's the main problem.
It doesn't matter whether you're a bad driver or the best driver. If you cause an accident, you must be held responsible. As simple as that.
> Like cars, it's an inherently flawed piece of technology
Sorry, but what exactly do you mean when you say that cars are "an inherently flawed piece of technology"? I'm just curious.
https://old.reddit.com/r/Python/comments/1qpq3cc/rant_ai_is_...