But LLMs took it a notch further: coders have started morphing into LLM prompters, and that is increasingly how software gets produced today. For now they still have to babysit these LLMs, reviewing and testing the code thoroughly before pushing it to the repo for CI/CD. Give it a few more years and even that may not be needed, once the more advanced LLM capabilities like "reasoning", "context determination", "illumination", etc. (maybe even "engineering"!) have become part of gpt-9 or whatever the hottest flavor of LLM is at that time.
The problem is that even though the end result would be a very robust running program that appears to drip with creativity, there won't be any human creativity in it. The phrase "dismal science" was coined in reference to economics by the Victorian writer Thomas Carlyle. We can only guess at his motivations for using that term, but maybe people of that time felt economics was somehow draining the life force from human society, much like the way many feel about AI/LLMs today?
Now, I understand the need to put food on the table. To survive this cutthroat IT job market, we must adapt to changing trends and technologies, and that includes getting skilled with LLMs. Nonetheless, I can't help but get a very dismal feeling about this new way of software development, don't you?
lordkrandel•2h ago