Maybe the time value of time is only increasing as we go.
Knowing that GTK n-1 will soon be obsolete is enough reason to not put effort into learning it.
Incidentally, this is how you can distinguish a good CS curriculum from a bad one. A good one focuses heavily on principles; the particular technical trappings are mostly just a medium, like Latin used to be in academia, now replaced by English. You pick up what you need to do the job.
The answer to "should we just sit around and wait for better technology" is obviously no. We gain a lot of knowledge by building with what we have; builders now inform where technology improves. (The front page has an article about Voyager being a light day away...)
I think the more interesting question is what would happen if we induced some kind of 2% "technological inflation" - every year it gets harder to make anything. Would that push more orgs to build more things? Everyone pours everything they have into making products now because their resources will go less far next year.
Government bonds already do this for absolutely everything. If I can put my money in a guaranteed bond at X%/year then your startup that's a risky investment has to make much better returns to make it worth my while. That's why the stock market is always chasing growth.
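To put rough numbers on that hurdle-rate point, here's a back-of-envelope sketch; the 5% bond yield and the coin-flip survival odds are made up purely for illustration:

    # A guaranteed bond sets the bar a risky bet has to clear.
    risk_free = 0.05   # hypothetical bond yield per year
    p_survive = 0.5    # hypothetical chance the startup doesn't go to zero
    # Expected payoff has to at least match the bond: p_survive * multiple >= 1 + risk_free
    required_multiple = (1 + risk_free) / p_survive
    print(f"needs to return at least {required_multiple:.2f}x to beat the bond")
    # -> needs to return at least 2.10x to beat the bond

The higher the risk-free rate, the bigger that multiple gets, which is exactly why growth is always being chased.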
Yeah, and it will be done by somebody else. I think this is the main problem, and if you get rid of it, you'll have a completely sensible strategy. I mean, there are many government contractors who, through corrupt connections, can guarantee that work will be awarded to them, and very often they do just that.
> Used to be, you had to find a customer in SO much pain that they'd settle for a point solution to their most painful problem, while you slowly built the rest of the stuff. Now, you can still do that one thing really well, but you can also quickly build a bunch of the table stakes features really fast, making it more of a no-brainer to adopt your product.
There was an article[1] going around about that recently, and I'm sure there are more, but it's also a trend I've seen first-hand. (I don't particularly care for the article's framing, I'm just linking to it to illustrate the underlying data.)
[1]: https://www.cnbc.com/2025/11/23/how-device-hoarding-by-ameri...
Don't confuse technical deflation with the Osborne effect:
But at least with iPhones, there is a deflationary effect because Apple has, since the 3GS in 2009, kept the old phone around and reduced the price. For instance, my son wanted an iPhone 16 Plus. I told him to wait until the 17 was announced, and he bought one cheaper from T-Mobile.
RTX doesn't count to me either, because that's some bullcrap pushed by GPU manufacturers that requires the aforementioned upscaling and frame generation techniques to fake actually being anywhere close to what GPU manufacturers want gamers to believe.
The generational gains haven’t been as great as in past generations, but it’s getting silly to claim that GPUs aren’t getting faster for gaming.
Intentionally ignoring frame generation and DLSS upscaling also feels petty. Using those features to get 150-200fps at 4K is actually an amazing experience, even if the purists turn their noses up at it.
The used GPU market is relatively good at calibrating for relative gaming performance. If new GPUs weren’t actually faster then old GPUs wouldn’t be depreciating much. Yet you can pick up 3000 series GPUs very cheaply right now (except maybe the 3090 which is prized for its large VRAM, though still cheap). Even 4000 series are getting cheap.
Doing it for a whole screenful of pixels, for the majority of frames (with multi-frame generation) is even less of it.
It does help that he has a small screen and regular DPI. Seems like everyone wants to run with 4x the pixels in the same space, which needs about 4x the GPU.
Why not? Sounds like a pretty reasonable strategy.
> Nobody seems to be putting off buying GPUs
Many people are doing exactly that.
Now a new computer barely does anything faster for me.
I have the legal structure, I know my colleagues, I potentially have employees and more capacity.
The problem is not that a startup is starting after you, but that you don't give yourself time to keep an eye on AI and don't leverage it when it's helpful.
We leverage AI and ML progress constantly and keep an eye on advances. Segment Anything? Yep, we use it. Claude? Yes sir!
That is why we are all waiting to buy our first personal computers and our first cell phones.
Economists have managed to be ludicrous for a very long time and yet we still trust them.
Let’s say you have a fusion rocket and can hit 5% the speed of light. You want to migrate to the stars for some reason.
So do you build a generational ship now, which is possible, or… do you wait?
Because if you build it now someone with a much better drive may just fly right past you at 20% the speed of light.
In this case the answer is to work it out under the assumption that there is no totally undiscovered major physics that would allow, say, FTL, and plot the curves of drive advancement against the cost of waiting.
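As a minimal sketch of that wait calculation, with every number invented for illustration (a 20-light-year target, drives starting at 0.05c and improving 3% a year, and a hard physics cap at 0.2c):

    # When to launch if drive speed keeps improving but physics caps it eventually.
    def arrival_year(wait_years, dist_ly=20, v0=0.05, growth=0.03, cap=0.20):
        v = min(v0 * (1 + growth) ** wait_years, cap)  # speed as a fraction of c
        return wait_years + dist_ly / v                # launch delay + travel time

    best_wait = min(range(200), key=arrival_year)
    print(best_wait, round(arrival_year(best_wait), 1))
    # -> 47 147.0: waiting ~47 years (until drives hit the cap) arrives around year 147,
    #    versus year 400 if you launch immediately at 0.05c. Past the cap, every extra
    #    year of waiting just costs a year.

The shape of the curve is the interesting part: while progress is steep, waiting pays for itself; once it flattens, it never does.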
So can we do this with software? We have the progress of hardware, which is somewhat deterministic, and we know something about the progress of software from stats we can make via GitHub.
The software equivalent of someone discovering some “fantasy” physics and building a warp drive would be runaway self-improving AGI/ASI. I’d argue this is impossible for information theoretical reasons, but what if I’m wrong?
Say I have a startup that vibe codes “AI for real estate”. What about customer acquisition?
On the other hand, if I’m Zillow, why can’t I just throw a developer on the same feature and automatically have a customer base for it?
If you look at most of the YC-funded startups these days, they are just prompt engineers with no go-to-market strategy; some don't even have any technical people and are looking for “technical cofounders” they can underpay with a promise of equity that will statistically be meaningless.
At the end of the day, OpenAI is losing billions of dollars while Google has caught up, still seeing record revenues and profits using its own infrastructure and TPUs.
Even the laggard Apple is reportedly just going to throw a billion (chump change) at Google for its model and keep selling phones and other hardware while OpenAI is reportedly working on a “smart egg”
Don't conflate easy with simple. I'd argue they are actually easier and far more complex.
> writing functioning application code has grown easier thanks to AI.
> It's getting easier and easier for startups to do stuff.
> Another answer might be to use the fact that software is becoming free and disposable to your advantage.
For me, the logical conclusion here is: don't build a software startup!
I left an AI startup to do tech consulting. What do I do? Build custom AI systems for clients. (Specifically clients that decided against going with startups' solutions.) Sometimes I build it for them, but I prefer to work with their own devs to teach them how to build it.
Fast forward 3+ years and we're going to see more everyday SMBs hiring a dev to just build them the stuff in-house that they were stuck paying vendors for. It won't happen everywhere. Painful enough problems and worthwhile enough solutions probably won't see much of a shift.
But startups that think the market will lap up whatever they have to offer as long as it looks and sounds slick may be in for a rude surprise.
You aren’t doing it to get customers; it’s for investors and maybe a decent acquisition.
I don't see this happening. Businesses generally want familiar tools that work reliably with predictable support patterns.
The charged cost of a frontier model is ~200x lower than 2 years ago, and the ones we are using now are much better - although measuring that and how much is challenging. Building a "better than GPT-4" model is also vastly cheaper than building GPT-4 was... perhaps 1/100th?
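Taking that ~200x-over-two-years figure at face value (it's an estimate, not a measured number), the implied annual rate is roughly:

    factor, years = 200, 2
    annual = factor ** (1 / years)   # geometric mean per year
    print(f"~{annual:.0f}x cheaper per year, i.e. ~{1 - 1/annual:.0%} price drop per year")
    # -> ~14x cheaper per year, i.e. ~93% price drop per year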
Ugh. I don't like that kind of 'desktop' apps. Huge bloat with a blip of actual app.
I disagree with this statement. It has become simpler, provided you don't care about it actually being correct, don't care whether you really have tests that test what you think you asked for, don't care about security, and so on.
Building the same thing involves doing the things that LLMs have proved time and again they cannot do. But instead of writing it properly in the first place, you now need to look for the needle in the haystack that is the subtle bug that invariably gets inserted by LLMs every single time I have tried to use them. Which requires you to deeply understand the code anyway, understanding you would have gotten automatically (and more easily) if you had written the code in the first place. Developing the same thing at the same level of quality is harder with an LLM.
And the "table stakes" stuff is exactly what I would not trust an LLM with, because the risk of getting it wrong could be fatal (to the company, not the dev, depending on his boss' temperament).
Let's say there are 10 subtasks that need to be done.
Let's say a human has 99% chance of getting each one of them right, by doing the proper testing etc. And let's say that the AI has a 95% chance of getting it right (being very generous here).
0.99^10 = a 90% chance of the human getting it to work properly. 0.95^10 = only a 60% chance. Almost a coin toss.
Even with a 98% success rate, the compounded success rate still drops to about 82%.
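The compounding in a few lines of Python, using the same hypothetical per-step rates:

    # Overall success across n independent subtasks is per_step ** n.
    n = 10
    for per_step in (0.99, 0.98, 0.95):
        print(f"{per_step:.0%} per step -> {per_step ** n:.0%} overall")
    # 99% per step -> 90% overall
    # 98% per step -> 82% overall
    # 95% per step -> 60% overall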
The thing is that LLM's aren't just "a little bit" worse than humans. In comparison they're cavemen.
Have you been following the homepage of this site lately?
I've been able to use LLMs to build things in a weekend that I would not have been able to do in the past, without putting in months of serious effort.
I recently rewrote from scratch, in a weekend, a project that I had made a couple of years ago. In a single weekend I now have a better product than I did back then, when I spent maybe 20x the amount of time on it.
I'm not so sure that's the reason. I mean, to believe LLMs replace engineers you first need to believe engineers spend the bulk of their time typing frantically churning out code in greenfield projects. That's not compatible with reality. Although LLMs excel at generating new code from scratch, that scenario is the exception. Introducing significant changes to existing projects still requires long iterations which ultimately end up consuming more development time than actually rolling out the changes yourself.
The truth of the matter is that we are not observing an economic boom. The US is either stagnant or in a recession, and LLMs are not the cause. In an economic downturn you don't see spikes in demand for skilled workers.
On recession: the cost of living is becoming crisis-level. I read recently that 67% of Americans live paycheck-to-paycheck. 150k/yr is 12.5k/month. If groceries go from 500 to 1000/month, a 150k wage-earner saves less for retirement. For someone making 30-40k (basically minimum wage), it's a huge hit. Then consider it's the same story for cars, housing, medical care...it goes on and on. It doesn't look "recessionary" because GDP keeps going up, but we're getting so much less for it with every passing year.
I also agree that we need to consider what brownfield dev looks like. It's where the vast majority of my time has gone over 15+ years in software and I'm not convinced all the coordination / sequencing / thinking will be assisted with LLMs. Particularly because they aren't trained on large proprietary codebases.
What we might both be missing is that for most people, writing the actual code is hard. LLMs help with that a lot. That's what a lot of junior/entry-level work actually is (not as much planning/thinking as seniors do).
They are definitely not needed anymore.
The market is flooded with seniors.
So no problem there either.
Tech is dividing society and driving the wedge deeper. There is a huge population being thrown to the wayside by the high-speed tech highway, which means tech is getting more and more unreachable.
AI assistants are only going to make this worse, by removing the direct touch between users and the tech. Tech becomes simply unmanageable for the average person.
Just like how you were able to do all the repairs on your bike as a kid, but you can't do the same for your car now. Tech never gets easier or more reachable.
Not going to lie, it looks like a bleak future.
If you want to build a successful AI company, assume the product part is easy. Build the support network: guarantee uptime, fast responses, direct human support. These are the shovels desperately needed during the AI gold rush.
Compared with reading books you might already have, or need to order or get from the library, you bet.
There still are some interesting problems to tackle. Maybe more than before. So who knows.
The previous tools made it easier to learn things, while LLMs do things for you.
As a result of Google and SO, I got really bad at what they replaced.
As a student I memorized all the utilities I had to use frequently, but since the search revolution I keep googling the simplest stuff, "how to check if a character is in a string", every day, because I don't have to remember it.
I might be an exception, but I clearly got much worse at knowing things by heart because they are one search away.
And I think it's plausible that a person who will only ever work with an LLM will never learn to actually code. And if they don't, then what will they be able to reliably deliver?
I think the analysis around inflation, deflation, and consumer prices is valid, but it is part of an understanding from the economies of 100 years ago. Money loses value when you can't do anything with it. Tech and AI run on debt, an extraordinary amount of it. Is that really money? I don't think so.
Deflation may suffer from Goodhart's law. Because we've repurposed all available human resources toward mitigating it, the variables we use to measure it cease to be useful. Our central measures for the economy are things like the stock market and the unemployment rate, which have prevalent and valid criticisms that policymakers ignore. They truly don't indicate what's occurring on Main Street, and I'm afraid we may be in a deflationary spiral without knowing it.
Otherwise I think you make good and interesting points. Genuine economics has a host of measuring sticks, but we, the non-economists, really only talk about, hear about, or even basically understand a handful. I had an economist effectively point this out to me when asking a question about GDP versus the broken window fallacy.
> Giga AI, a company building AI customer support agents, claims to have sworn off the "forward deployed engineer" model of custom software favored by many other successful startups, in favor of software that customizes itself—only possible because of coding agents.
Giga AI is not a publicly traded company, and they have zero legal liability or other downside for lying, and massive upside. They also don't have real customers and are not revenue-positive. The trend is that everyone who has said this was lying.
When there's tangible evidence of this, I think it will be an important part of the discussion. Until then, saying "claims" and "but I don't really know" but then paraphrasing their press release without analysis is about as sophisticated and as honest as tweeting "people are saying."
The author should take their own advice and wait six months when these claims will be easier to substantiate and support the analysis far more strongly.
Models keep getting cheaper and more capable every few months. However, the underlying compute economics do not deflate at the same rate. GPU provisioning, inference orchestration, bandwidth constraints, latency guarantees, regulatory requirements, and failure handling do not become magically simple because a new model improved its reasoning. In reality, each improvement on the model side increases pressure on the infrastructure side. Bigger context windows, heavier memory footprints, more parallel requests, and more complex agentic workflows all increase the operational burden.
For infrastructure teams, waiting does not help. The surface area of what needs to be built only grows. You cannot delay autoscaling, observability, scheduling, routing, or privacy guarantees. Applications will always demand more from the infrastructure, and they will expect it to feel like a commodity.
My view is that technical deflation applies much more to application startups than to infrastructure startups. App founders can benefit from waiting. Infra founders have to build now because every model improvement instantly becomes a new expectation that the infra must support. The baseline keeps rising.
The real moat in the next era is not the speed of feature development. It is the ability of your infrastructure to absorb the increasing chaos of more capable models while keeping the experience simple and predictable for the user.
Joker_vD•2mo ago
Or, you know, technological improvements that increase efficiency of production, or bountiful harvests, or generally anything else that suddenly expands the supply at the current price level across the economy. Thankfully, we have mechanisms in place that keep the prices inflating even when those unlikely events happen.
marcosdumay•2mo ago
Anyway, WTF, economics communication has a huge problem. I've seen the article's explanation repeated in plenty of places, it's completely wrong and borderline nonsense.
The reason deflation is bad is not because it makes people postpone buying things. It's because some prices, like salaries or rent just refuse to go down. That causes rationing of those things.
jdasdf•2mo ago
A common argument, but one that doesn't bear out in the absence of regulation enforcing it.
anonymouskimmer•2mo ago
This is basically what a bunch of people did during and following the Great Depression. Deflation was continuing and the money they had lent to the banks was being written off in bank defaults. And so an entire generation learned to just stick it under the mattress (or stick it in T bills, which reliably didn't default).
Also, it's not just a literal increase of money that causes inflation; an increase in money velocity also increases inflation. Debt write-offs decrease velocity, while debt issuance increases velocity. IANAE.
pjc50•2mo ago
The reverse of this is that high inflation tends to cause a lot of strikes, because salaries refuse to go up and very high levels of inflation need salary repricing every month or even week.
igleria•2mo ago
It got old really quick having to negotiate with the boss every 6 months.