So it's from a place of wisdom when I say it's been impossible for me to ignore how absolutely world-destroying AI/LLM technology is on so very many levels. Not just for coders. The whole of it. The environmental impact, the global job market, education: it's truly catastrophic. Yet our industry is embracing it, and I don't understand why. We of all people should know better.
That's not even taking into account their efficacy, or lack thereof. I do my best to keep up. I've listened with an open mind to what advocates have to say. I've held my nose and experimented with them in a variety of ways: vibe coding, small fns, proposals, research, etc. I've tried to engineer better prompts with rubrics and XML-based formatting. I remain unconvinced any of this is worth the global cost.
As a coding tool, it's often flat-out wrong and produces unusable or destructive results. Search "cursor delete files" and read about vibe coders who have had swaths of code deleted because Claude got silly and these dummies don't know about version control. Look up the AI 12x problem: coding time has dropped significantly, but code review and maintenance time has skyrocketed (https://webmatrices.com/post/vibe-coding-has-a-12x-cost-problem-maintainers-are-done).
It gets worse the further you zoom out.
How many hallucinations and "use krazy glue to stick cheese to your pizza" answers until people realize LLMs _suck at knowing things_ and they're not improving? We've reached the theoretical limits of the tech. Beyond small tasks like grammar and natural-language search, they get brittle af.
So why is everyone hyping AI? Why is everyone gung-ho to put half of the world's knowledge-workers out of a job and push our power grid to its absolute limits for something that has the success rate of a freshman intern?
--
Economically it's no better.
The ghouls who run these AI companies and their infestors are betting on AGI. But as the engineers in the field know, AGI is not possible with the LLM modality. They hope building massive data centers to hold historic amounts of processing power and training data will somehow crack it. But "morer and betterer" is a hail mary — even without new limitations re: copyrighted material.
Instead, they'll continue accruing massive circular debts: Nvidia investing in (loaning money to) OpenAI so OpenAI can buy more Nvidia chips so Nvidia can invest more, until it spins out of control. AI companies are losing _billions of dollars_ every year with no path to solvency.
It's no wonder Apple peaced the fuck out of their LLM efforts and just offloaded to Google. There's no win scenario in dumping billions into a plagiarism-and-lying machine.
--
In the end, I just don't understand how people can continue to advocate for these things. With all we know about how they work and what it all costs and will cost society on a global scale, you'd have to be deluded, ignorant, or callous to advocate for this shit, or even casually use it. I understand some folks feel like they _have_ to learn these things to ensure they still have a role in our industry. But none of this is sustainable.
I don't get it.