I'm not sure if it's just me getting older or what, but something strikes me as odd about the future of programming and software engineering: LLMs are impressive, but you have to pay to use them. I can't recall another core tool or technology in the software industry—something central not just to the field, but to the world—that isn't free or open source. Think TCP/IP, the Linux kernel, Postgres, Git, FFmpeg, QEMU, LaTeX, Kubernetes, and so on. Sure, there's plenty of proprietary software out there, but it's not the backbone of the internet or the computing industry.
Now, LLMs have the potential to become part of that backbone, yet nobody seems particularly concerned that they’re not open source (I'm talking about GPT, Claude, Copilot, Gemini). I know there are open source alternatives, but they’re not nearly as capable—and it seems most people here are perfectly fine using and paying for the proprietary ones.
I don’t like a future where I have to pay for every token just to write a program. And don’t tell me, "Well, just don’t use LLMs"; they’re going to become what Linux is today: ubiquitous.
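To put "paying per token" in concrete terms, here's a back-of-the-envelope sketch of what metered pricing can add up to. All the rates and usage figures below are hypothetical placeholders, not any vendor's actual prices:

```python
# Rough cost of LLM-assisted coding under per-token pricing.
# Rates and usage numbers are hypothetical placeholders for illustration.

def session_cost(input_tokens, output_tokens,
                 usd_per_m_input=3.0, usd_per_m_output=15.0):
    """USD cost of one session at per-million-token rates."""
    return (input_tokens / 1e6) * usd_per_m_input + \
           (output_tokens / 1e6) * usd_per_m_output

# Say a day of heavy use sends 2M tokens of context and gets 500K back:
daily = session_cost(2_000_000, 500_000)
print(f"${daily:.2f} per day, ~${daily * 22:.0f} per working month")
# → $13.50 per day, ~$297 per working month
```

Even at modest per-million-token rates, heavy daily use compounds into a real recurring bill—which is the crux of the complaint above.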
bigyabai•4h ago
Strictly speaking, we don't really know if this is true. There's no study establishing how far LLM capability actually scales. It might keep improving indefinitely, or one day we might unknowingly hit the soft ceiling of LLM intelligence. I think assertions like the one you're making require specific evidence.
For comparison's sake, proprietary models like GPT-3 now pale next to the results you get from a 7B open-source LLM. The open-source stuff really does move along, if not at the pace everyone would prefer.