There's a reason we still (generally) teach people how to do arithmetic with pencil and paper instead of jumping straight to calculators. Learning basic algorithms for performing the computations helps solidify the concepts and the rules of the game.
We'll need to do the same thing eventually with respect to LLMs and software engineering. People who skip the foundations or let their comprehension atrophy will eventually end up in a spot in which they need those skills. I basically never do arithmetic using pen and paper now, but I could if I had to, and, more importantly, the process ingrained some basic comprehension of how the integers relate under the group operations.
I totally agree re: SQL specifically, by the way. SQL is basically already natural language; it's probably the last thing I'd need to offload to some natural language prompt. I think it's a bit of a vicious-circle problem: there are a lot of people who only need to engage with SQL from time to time, so working with it is a bit awkward each time for lack of practice. This incentivizes them to offload it to the LLM just to get it out of the way, which in turn further atrophies their SQL skills.
This was actually the whole point of SQL in the first place: to be a query language close enough to natural language that non-specialists could easily learn to use it.
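To make that concrete, here is a minimal sketch using Python's built-in sqlite3 and a made-up `orders` table (the table and column names are purely illustrative): the English question and the SQL query line up almost word for word.

```python
import sqlite3

# In-memory database with a hypothetical `orders` table, just for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "alice", 40.0), (2, "bob", 75.5), (3, "alice", 12.25)],
)

# "What is the total amount each customer has ordered, highest first?"
# reads almost the same when written as SQL:
rows = conn.execute(
    """
    SELECT customer, SUM(total) AS total_spent
    FROM orders
    GROUP BY customer
    ORDER BY total_spent DESC
    """
).fetchall()
print(rows)  # [('bob', 75.5), ('alice', 52.25)]
```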
I have seriously considered hanging out my shingle to do this freelance. I don't think the time is quite ripe yet, but maybe in a few months.
People are programming out on a limb - and the blame goes to the library maintainer when the user lacks the fundamental skills to troubleshoot.
It seems like the future is converging on a world where there will be 5 Matrix-savant architects who make $1B/yr and keep things operating, while everyone else lives in a shanty or a pod.
The OP is essentially a (white collar) labor version of this. What is evidently valued is an appearance of expertise, rather than expertise itself. Just like capitalists who want to make money while skipping the production of actual goods to accomplish it, "professionals" are going to skip actual learning in order to appear knowledgeable.
For 200 years, people have hoped that the "free market" would sort out the problem that Marx saw. It didn't happen - we still get financial bubbles that cause trouble for many people. So I suspect it's a mistake to assume the learning problem will fix itself either. I suspect people (society at large) will have to consciously value the hard work of learning for this to be fixed.
Seriously, you might want to actually do a sniff check before taking Marx's word for anything.
In my understanding, Marx was actually appreciative of the free market building stuff. The question he asked was this: if the workers pay for the goods produced with an increasingly bigger share of their property going over to the private property of the capitalist class, what happens once there is no more property to give away to the capitalists? (And he was actually the first to ask this question and to treat capitalism as a dynamical system, which makes him one of the greatest economists of all time.)
Of course, capitalism didn't fail completely, for two reasons. One is that people actually instituted social democratic and Keynesian reforms that prevented the collapse. (Marx was an optimist in thinking that people would learn and eventually reject capital ownership completely, but they didn't.)
The other reason is that we continuously invent new forms of capital (private property) to inject into flailing capitalism. In other words, we find more and more public goods that can be privatized - things such as mortgages and other consumer debt, intellectual property, new financial instruments, health care insurance, private and behavioral personal information ("big data"), new forms of money such as Bitcoin, openly available information sucked up by generative AI, etc.
But that doesn't mean capitalism (especially its financial arm) isn't parasitic - it just hasn't managed to kill the host yet.
The first point is nonsensical ("sure, the free market system keeps making new stuff, but what happens when we run out of stuff to pay for it with?"); it's like a bad Yogi Berra pastiche, expecting that we as a society will become so wealthy that we can't afford to do it anymore. He tries to paper over this with a lot of hand-waving about classes, but fundamentally he's trying to use a zero-sum model to explain wealth creation.
If it weren't for this "parasitic" system you wouldn't even have been able to post this drivel, for several dozen reasons, including the fact that you likely would never have been born or would have died in your youth.
The same is true of every single other instruction produced.
We need to stay vigilant; otherwise we will pay the cost of fixing LLM bugs later.
I saw a post on Twitter about how game devs were using ChatGPT for localization, and when you translated the text back to English it said something like "as a chat assistant I'm unable to translate this concept", or gave an explanation instead of the translation.
This is exactly the sort of future I imagine with AI - not that the grunts on the ground will be sold on it, but that management will be convinced they can fire the people who know what they're doing and replace them with interns armed with a ChatGPT subscription.
That's because if you know a little bit of SQL and know how to validate the answer LLMs give you, this becomes a non-issue.
A better example would be an ambiguous prompt where the LLM can either use an array or a map to solve your problem, and it chooses an array. But down the road you hit a feature where direct (keyed) access is what you need, and your code is already using arrays. In this situation what tends to happen is the LLM ends up making some hack on top of your arrays to create the new feature, and the code gets real bad.
By not understanding the difference between these 2 data structures you aren't able to specify what needs to be done and the LLM ends up implementing your feature in an additive way. And when you add enough features in this way things get messy.
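A toy sketch of that trade-off in Python (the record shape and helper names are made up for illustration): once the data lives in a list, a later "look up by id" feature tends to get bolted on as a linear scan, whereas a map keyed by id gives direct access from the start.

```python
# Hypothetical user records, stored the way the LLM originally chose: a list.
users_list = [
    {"id": 17, "name": "alice"},
    {"id": 42, "name": "bob"},
]

# The "hack on top of arrays": every lookup by id becomes a linear scan.
def find_user_in_list(user_id):
    for user in users_list:
        if user["id"] == user_id:
            return user
    return None

# If the data had been keyed by id from the start, direct access is trivial.
users_by_id = {u["id"]: u for u in users_list}

def find_user_in_map(user_id):
    return users_by_id.get(user_id)  # O(1) lookup instead of O(n)

print(find_user_in_list(42))  # {'id': 42, 'name': 'bob'}
print(find_user_in_map(42))   # {'id': 42, 'name': 'bob'}
```

If you can't name the difference between the two structures, you also can't tell the LLM which one the feature actually calls for.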
What is still not clear to me is which "abstraction layer" we should use to learn things in this new world where LLMs are available.