>If an idiot like me can clone a product that costs $30k per month — in two hours — what even is software development? (...) The new software developer is no longer a craftsman. It’s the average operator, empowered
If entire industries' employee counts are decimated and/or their work commodified, this means the "new software developer" won't find people to pay for what they create (whether software or a software-driven service). For the majority, it also means further degradation of the kinds of jobs one will be able to find.
Right now most AI-related posts are themselves slop, but most other posts aren't, so I could get by with ignoring all the AI-related posts. Unfortunately I'm quite interested in LLMs, and what other people think about them and are doing with them.
"IMO it should be considered quite rude in most contexts to post or send someone a wall of 100% AI-generated text. “Here, read this thing I didn’t care enough about to express myself." - https://x.com/littmath/status/2010759165061579086?s=20
Rather than ignore it, I'd deem it rude that something as low-effort as an AI generated blog post was shared here. I may not be able to set rules, but I wish we could flag posts like these. Some faux-gineer told their agent of choice to write up another fearmongering post about software developers and AI; I feel like my time was stolen from me.
The value of a Bloomberg Terminal isn't the UI (which is famously terrible/efficient); it's the latency, the hardware, the proprietary data feeds, the chat network, and the reliability.
Building a React frontend that fetches some JSON from an API in 2 hours is impressive, sure, but it’s not the hard part of fintech. We need to stop conflating "I built a UI that looks like X" with "I rebuilt the business value of X."
Calling a Polymarket bot a "Bloomberg Terminal" is like calling a boiling kettle a rocket engine.
That bot can be written in a few hours without AI.
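To make that concrete, here is roughly all it takes: a fetch loop over a JSON endpoint plus some printing. A minimal sketch in TypeScript; the endpoint and response shape below are invented placeholders, not Polymarket's actual API.

```ts
// Minimal "terminal": poll a JSON endpoint and print the fields you care about.
// ENDPOINT and the Market shape are made-up placeholders for illustration.
const ENDPOINT = "https://api.example.com/markets";

interface Market {
  question: string;
  price: number; // implied probability, 0..1
}

async function poll(): Promise<void> {
  const res = await fetch(ENDPOINT);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const markets: Market[] = await res.json();
  for (const m of markets.slice(0, 10)) {
    console.log(`${m.question}: ${(m.price * 100).toFixed(1)}%`);
  }
}

// Refresh every 30 seconds. Everything that makes a real terminal hard
// (real-time feeds, entitlements, reliability) is exactly what's missing here.
poll().catch(console.error);
setInterval(() => poll().catch(console.error), 30_000);
```

Runs as-is on Node 18+ (built-in fetch). The hard parts of the real product never appear in code like this.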
Beware the AI + climate change Armageddon!
^^ It's been a rough couple of years with these kinds of takes being posted constantly.
Yes, things have changed. Is the entire software development world about to collapse because of LLMs? Sorry, no. I'm impressed by the capabilities we now have at our disposal, but the LLMs still do a bunch of dumb stuff. Some of the worst bugs and edge cases I've dealt with this year were almost unnoticeable things an agent added. With that said, there is still a ton of juice to squeeze from this paradigm. I really just wish everyone would stop pretending the sky is falling.
What's dying is the programmer-first job: the guy whose primary value is that he knows how computers work, and only secondarily that he's a human who can understand how some business works and do the translation.
The other type of programmer is the business programmer. I started on this end before an incredibly long rabbit hole swallowed up my life. This is the person who thinks he's a finance guy, or an academic, or an accountant, or any number of things, who realizes that he can get a computer to help him.
This type of person is grounded in the specific business they come from, and has business-level abstractions for what he wants the computer to do.
AI is still imperfect, so it is still in your interest to know how the computer works, especially as you dive into things where your model of the machine actually matters. But it allows the person with the business view to generate code that would previously have been a second job to write, and to QA that code at a business level. This used to just be called Excel, which produced horrors in the eyes of anyone who could actually program, but Excel is still the glue behind a huge number of business systems, and it still works, because ugly often works.
I liken this to previous revolutions in IT. At one point schools had begun churning out literate people, and they started spilling out into the business world as clerks. You could learn how to read and write, and that would get you a job sending correspondence to India, that sort of thing. And that would be your way into the organization, and maybe you'd eventually learn the business itself.
People who typed stuff had a similar fate. There used to be rooms of people who would type letters and send them. Now the executive just types the letters and sends them off by email.
If you're a translator first, AI is not great for you. If you managed to turn your translation skills into executive skills, then you are happy to pull the ladder up.
I work in ERP. It is full of people like this: accountants who learned SQL and some VB, and with that you can get incredibly far.
They're also smart enough to know when they need an actual programmer, just as I am smart enough to call them when it's time to do year-end close / financial reporting.
This is the part that terrifies me. Generating code has never been the bottleneck; understanding it has. If you aren't reviewing the code, you are effectively introducing a black box into your stack that you are responsible for but do not understand.
The point stands, and I like the patterns Effect is introducing, but I have been using them for a long time now, both pre- and post-AI (especially in TypeScript). It's not an innovation. We shouldn't be surprised when LLMs are better at working on robust code bases with modular design.
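For anyone unfamiliar, this is roughly the kind of pattern meant here: typed errors and declarative retries instead of thrown exceptions scattered through the code. A minimal sketch using Effect's API; the URL and response shape are invented for illustration.

```ts
import { Effect, pipe } from "effect";

// Tagged error type: failure is visible in the function's type signature.
class PriceError {
  readonly _tag = "PriceError";
  constructor(readonly reason: string) {}
}

// Wrap an untyped promise in an Effect with a typed failure channel.
// The endpoint and body shape are placeholders, not a real service.
const fetchPrice = (symbol: string): Effect.Effect<number, PriceError> =>
  Effect.tryPromise({
    try: () =>
      fetch(`https://api.example.com/price/${symbol}`)
        .then((res) => res.json())
        .then((body) => body.price as number),
    catch: (e) => new PriceError(String(e)),
  });

// Retries and fallbacks are declared in one place instead of try/catch nests.
const program = pipe(
  fetchPrice("AAPL"),
  Effect.retry({ times: 3 }),
  Effect.catchAll(() => Effect.succeed(0)) // fall back to 0 if all retries fail
);

Effect.runPromise(program).then(console.log);
```

The draw for LLM-generated code is that the failure paths are spelled out in the types, so a reviewer (human or model) can see them without reading every call site.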
Yes and no.
Obviously I have nowhere near the wealth to afford a Bloomberg Terminal, but an ex of mine had a father who has one. He told me that the terminal itself is kinda bullshit but the real value from it came from the fact that it essentially provides a way for you to network with other wealthy investors. The terminal itself, according to him, is pretty ornamental these days.
I won't actively defend his opinion if someone here knows better, but what he said made sense based on what I saw of it.
I have some experience with the Bloomberg Terminal and this was laugh-out-loud funny to me. This is like someone saying that they vibe-coded a rudimentary text editor with a spell checker, because that's all they really use, and that they would therefore be able to pop out Microsoft Office 365 (or whatever it's called now) in a couple more days.
Um, so in other words, there’s no evidence to support this?
This article though, is so disappointing. It's pure LLM-lingo, which makes it awful to read.