
The world could run on older hardware if software optimization was a priority

https://twitter.com/ID_AA_Carmack/status/1922100771392520710
126•turrini•2h ago

Comments

AndrewDucker•1h ago
Well, yes. It's an economic problem (which is to say, it's a resource allocation problem). Do you have someone spend extra time optimising your software, or do you have them produce more functionality? If the latter generates more cash, then that's what you'll get them to do. If the former becomes important to your cash flow, then you'll get them to do that.
tgv•1h ago
It's the kind of economics that shifts financial debt into accumulating waste and technical debt, which is paid for by someone else. It's basically stealing. There are, of course, many cases in which thorough optimizing doesn't make much sense, but the idea of just adding servers instead of rewriting is a sad state of affairs.
xyzzy123•1h ago
It doesn't seem like stealing to me? Highly optimised software generally takes more effort to create and maintain.

The tradeoff is that we get more software in general, and more features in that software, i.e. software developers are more productive.

I guess on some level we can feel that it's morally bad that adding more servers or using more memory on the client is cheaper than spending developer time, but I'm not sure how you could shift that equilibrium without taking away people's freedom to choose how to build software.

esperent•57m ago
> It's basically stealing

This feels like hyperbole to me. Who is being stolen from here? Not the end user, they're getting the tradeoff of more features for a low price in exchange for less optimized software.

skydhash•10m ago
From what I see people do on their computers, it has barely changed from what they were doing on their Pentium 4 machines. But now, with Electron-based software and the general state of Windows, you can't recommend something older than 4 years. It's hard not to see it as stealing when you have to buy a $1000+ laptop, when a $400 one could easily do the job if the software were a bit better.
victorbjorklund•51m ago
Not really stealing. You could of course build software that is more optimized and has the same features, but at a higher cost. Would most buyers pay twice the price for a web app that loads in 1 sec instead of 2? Probably not.
skydhash•7m ago
Try loading Slack and YouTube on a 4 year old laptop. It's more like 10 seconds, and good luck if you only have 8GB of RAM.
inetknght•15m ago
> It's basically stealing.

This is exactly right. Why should the company pay an extra $250k in salary to "optimize" when they can just offload that salary to their customers' devices instead? The extra couple of seconds, extra megabytes of bandwidth, and shittery of the whole ecosystem has been externalized to customers in search of ill-gotten profits.

nottorp•1h ago
Unfortunately, bloated software passes the costs to the customer and it's hard to evaluate the loss.

Except your browser taking 180% of available ram maybe.

By the way, the world could also have some bug free software, if anyone could afford to pay for it.

jillesvangurp•56m ago
What cost? The hardware is dirt cheap. Programmers aren't cheap. The value of being able to use cheap software on cheap hardware is basically not having to spend a lot of time optimizing things. Time is the one thing that isn't cheap here. So there's a value in shipping something slightly sub optimal sooner rather than something better later.

> Except your browser taking 180% of available ram maybe.

For most business users, running the browser is pretty much the only job of the laptop. And using virtual memory for open tabs that aren't currently in view is actually not that bad. There's no need to fit all your gazillion tabs into memory; only the ones you are looking at. Browsers are pretty good at that these days. The problem isn't that browsers aren't efficient but that we simply push them to the breaking point with content. Content creators simply expand their resource usage whenever browsers get optimized. The point of optimization is not saving cost on hardware but getting more out of the hardware.

The optimization topic triggers the OCD of a lot of people, and sometimes those people do nice things. John Carmack built his career when Moore's law was still in full effect. Everything he did to get the most out of CPUs was super relevant and cool, but it also became dated in a matter of a few years. One moment we were running Doom on simple 386 computers and the next we were running Quake and Unreal on a Pentium II with shiny new Voodoo GPUs. My first GPU was actually the Riva 128, one of the first products Nvidia shipped, and it ran Unreal and other cool stuff. And while CPUs have increased enormously in performance, GPUs have increased even more, by some ridiculous factor. Nvidia has come a long way since then.

I'm not saying optimization is not important; I'm just saying that compute is a cheap commodity. I actually spend quite a bit of time optimizing stuff, so I can appreciate what that feels like and how nice it is when you make something faster. And sometimes that can really make a big difference. But sometimes my time is better spent elsewhere as well.

wtetzner•45m ago
> Time is the one thing that isn't cheap here.

Right, and that's true of end users as well. It's just not taken into account by most businesses.

I think your take is pretty reasonable, but I think most software is too far towards slow and bloated these days.

Browsers are pretty good, but developers create horribly slow and wasteful web apps. That's where the optimization should be done. And I don't mean they should make things as fast as possible, just test on an older machine that a big chunk of the population might still be using, and make it feel somewhat snappy.

The frustrating part is that most web apps aren't really doing anything that complicated, they're just built on layers of libraries that the developers don't understand very well. I don't really have a solution to any of this, I just wish developers cared a little bit more than they do.

nottorp•44m ago
> The hardware is dirt cheap.

It's not, because you have to multiply that 100% extra CPU time by all of an application's users; only then do you arrive at the real extra cost.

And if you want to pick on "application", think of the widely used libraries and how much any missed optimization costs once they get into everything...

kreco•41m ago
Your whole reply is focused on the business level, but not everybody can afford 32GB of RAM just to have a smooth experience in a web browser.
vermilingua•1h ago
Related: https://duskos.org/
freddie_mercury•1h ago
The world DOES run on older hardware.

How new do you think the CPU in your bank ATM or car's ECU is?

kunley•1h ago
Related: I wonder what CPU Artemis/Orion is using
jsheard•1h ago
IBM PowerPC 750X apparently, which was the CPU the Power Mac G3 used back in the day. Since it's going into space it'll be one of the fancy radiation-hardened versions which probably still costs more than your car though, and they run four of them in lockstep to guard against errors.

https://www.eetimes.com/comparing-tech-used-for-apollo-artem...

ngangaga•1h ago
I'm not sure what artemis or orion are, but you can blame defense contractors for this. Nobody ever got fired for hiring IBM or Lockheed, even if they deliver unimpressive results at massive cost.
rescbr•33m ago
Put a 4 nm CPU into something that goes to space and see how long it would take to fail.

One of the tradeoffs of radiation hardening is increased transistor size.

Cost-wise it also makes sense - it’s a specialized, certified and low-volume part.

ngangaga•26m ago
I don't disagree that the engineering can be justified. But you don't need custom hardware to achieve radiation hardening, much less hiring fucking IBM.

And to be clear, I love power chips. I remain very bullish about the architecture. But as a taxpayer reading this shit just pisses me off. Pork-fat designed to look pro-humanity.

dreamcompiler•6m ago
> fancy radiation-hardened versions

Ha! What's special about rad-hard chips is that they're old designs. You need big geometries to survive cosmic rays, and new chips all have tiny geometries.

So there are two solutions:

1. Find a warehouse full of 20-year old chips.

2. Build a fab to produce 20-year old designs.

Both approaches are used, and both approaches are expensive. (Approach 1 is expensive because as you eventually run out of chips they become very, very valuable and you end up having to build a fab anyway.)

scrapheap•1h ago
Well, I know the CPU in my laptop is already over 10 years old and still works well enough for everything I do.
forinti•33m ago
My daily drivers at home are an i3-540 and an Athlon II X4. Every time something breaks down, I find it much cheaper to just buy a new part than to buy a whole new kit with motherboard/CPU/RAM.

I'm a sysadmin, so I only really need to log into other computers, but I can watch videos, browse the web, and do some programming on them just fine. Best ROI ever.

hiq•23m ago
> I can watch videos

Can you watch H.265 videos? That's the one limitation I regularly hit on my computer (I got it for free from some company; it's pretty old, but otherwise good enough that I don't think I'll replace it until it breaks). I don't think I can play videos recorded on modern iPhones.

forinti•15m ago
Yes, they play just fine with Gnome Videos or VLC.
PUSH_AX•1h ago
Some of it does.

The chips in everyone's pockets do a lot of compute and are relatively new though.

ngangaga•1h ago
Sure, if you think the world consists of cash transactions and whatever a car needs to think about.
leonheld•1h ago
If we're talking numbers, there are many, many more embedded systems than general purpose computers. And these are mostly built on ancient process nodes compared to the cutting edge we have today; the shiny octa-cores in our phones are supported by a myriad of ancillary chips that are definitely not cutting edge.
ngangaga•1h ago
We aren't talking numbers, though. Who cares about embedded? I mean that literally. This is computation invisible by design. If that were sufficient we wouldn't have smartphones.
dsego•1h ago
Doom can run on Apple's Lightning to HDMI adapter.
KolmogorovComp•52m ago
Powerplants and planes still run on 80s hardware.
yonisto•1h ago
I'm not much into retro computing. But it amazes me what people are pulling out of dated hardware.

Doom on the Amiga, for example (many consider the Amiga's inability to run it a main factor in its demise). Thirty years and a lot of optimization later, it finally arrived.

busterarm•1h ago
Let's keep the CPU efficiency golf to Zachtronics games, please.

I/O is almost always the main bottleneck. I swear to god 99% of developers out there only know how to measure cpu cycles of their code so that's the only thing they optimize for. Call me after you've seen your jobs on your k8s clusters get slow because all of your jobs are inefficiently using local disk and wasting cycles waiting in queue for reads/writes. Or your DB replication slows down to the point that you have to choose between breaking the mirror and stop making money.

And older hardware consumes more power. That's the main driving factor behind server hardware upgrades, because you can fit more compute into your datacenter.

I agree with Carmack's assessment here, but most people reading are taking the wrong message away with them.

wtetzner•1h ago
> I/O is almost always the main bottleneck.

People say this all the time, and usually it's just an excuse not to optimize anything.

First, I/O can be optimized. It's very likely that most servers are either wasteful in the number of requests they make, or are shuffling more data around than necessary.
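
To make "I/O can be optimized" concrete, here's a minimal sketch of one common fix: issuing independent requests concurrently instead of awaiting them one by one. The endpoints are hypothetical; the pattern is the point.

    // Sequential: three round trips, paid one after another.
    async function loadDashboardSlow(userId: string) {
      const user = await fetch(`/api/users/${userId}`).then(r => r.json());
      const orders = await fetch(`/api/orders?user=${userId}`).then(r => r.json());
      const prefs = await fetch(`/api/prefs/${userId}`).then(r => r.json());
      return { user, orders, prefs };
    }

    // Concurrent: the same three requests overlap, so total latency is
    // roughly the slowest single request instead of the sum of all three.
    async function loadDashboardFast(userId: string) {
      const [user, orders, prefs] = await Promise.all([
        fetch(`/api/users/${userId}`).then(r => r.json()),
        fetch(`/api/orders?user=${userId}`).then(r => r.json()),
        fetch(`/api/prefs/${userId}`).then(r => r.json()),
      ]);
      return { user, orders, prefs };
    }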

Beyond that though, adding slow logic on top of I/O latency only makes things worse.

Also, what does I/O being a bottleneck have to do with my browser consuming all of my RAM and using 120% of my CPU? Most people who say "I/O is the bottleneck" as a reason to not optimize only care about servers, and ignore the end users.

busterarm•55m ago
I/O _can_ be optimized. I know someone who had this as their full-time job at Meta. Outside of that, nobody is investing in it though.

I'm a platform engineer for a company with thousands of microservices. I'm not thinking on your desktop scale. Our jobs are all memory hogs and I/O bound messes. Across all of the hardware we're buying we're using maybe 10% CPU. Peers I talk to at other companies are almost universally in the same situation.

I'm not saying don't care about CPU efficiency, but I encounter dumb shit all the time like engineers asking us to run exotic new databases with bad licensing and no enterprise features just because it's 10% faster when we're nowhere near experiencing those kinds of efficiency problems. I almost never encounter engineers who truly understand or care about things like resource contention/utilization. Everything is still treated like an infinite pool with perfect 100% uptime, despite (at least) 20 years of the industry knowing better.

_aavaa_•1h ago
There's servers and there's all of the rest of consumer hardware.

I need to buy a new phone every few years simply because the manufacturer refuses to update it. Or they add progressively more computationally expensive effects that make my old hardware crawl. Or the software I use only supports 2 old versions of macOS. Or Microsoft decides that your brand new CPU is no good for Win 11 because it's lacking a TPM. Or god help you if you try to open our poorly optimized Electron app on your 5 year old computer.

busterarm•53m ago
But Carmack is clearly talking about servers here. That is my problem -- the main audience is going to read this and think about personal compute.

All those situations you describe are also a choice made so that companies can make sales.

_aavaa_•42m ago
It shows up in different ways, and I agree that some of my examples are planned obsolescence.

I'm not so sure they're that different though. I do think that in the end most boil down to the same problem: no emphasis or care about performance.

Picking a programming paradigm that all but incentivizes N+1 selects is stupid. An N+1 select is not an I/O problem, it's a design problem.
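
For anyone unfamiliar with the term, a minimal sketch of the difference, with a hypothetical `db.query` helper standing in for a real parameterized SQL client:

    // A hypothetical query helper, for illustration only.
    type Db = { query: (sql: string, params?: unknown[]) => Promise<any[]> };

    // N+1: one query for the posts, then one more per post for its author.
    // 100 posts means 101 database round trips.
    async function postsWithAuthorsNPlusOne(db: Db) {
      const posts = await db.query("SELECT id, author_id, title FROM posts LIMIT 100");
      for (const post of posts) {
        post.author = (await db.query(
          "SELECT name FROM users WHERE id = $1", [post.author_id]
        ))[0];
      }
      return posts;
    }

    // The same data in a single round trip, using a join.
    async function postsWithAuthorsJoined(db: Db) {
      return db.query(
        "SELECT p.id, p.title, u.name AS author " +
        "FROM posts p JOIN users u ON u.id = p.author_id LIMIT 100"
      );
    }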

pdhborges•44m ago
I'm looking at our Datadog stats right now. It's 64% CPU, 36% I/O.
titzer•1h ago
I like to point out that since ~1980, computing power has increased about 1000X.

If dynamic array bounds checking cost 5% (narrator: it is far less than that), and we turned it on everywhere, we could have computers that are just a mere 950X faster.

If you went back in time to 1980 and offered the following choice:

I'll give you a computer that runs 950X faster and doesn't have a huge class of memory safety vulnerabilities, and you can debug your programs orders of magnitude more easily, or you can have a computer that runs 1000X faster and software will be just as buggy, or worse, and debugging will be even more of a nightmare.

People would have their minds blown at 950X. You wouldn't even have to offer 1000X. But guess what we chose...

Personally I think the 1000Xers kinda ruined things for the rest of us.

ngangaga•1h ago
I don't think it's that deep. We are just stuck with browsers now, for better and worse. Everything else trails.
slowmovintarget•54m ago
We're stuck with browsers now until the primary point of contact with the internet is assistants / agent UIs / chat consoles.

That could end up being Electron (VS Code), though that would be a bit sad.

scotty79•49m ago
I don't think we are gonna go there. Talking is cumbersome. There's a reason, besides social anxiety, that people prefer self-checkout and ordering fast food electronically. There are easier ways to do a lot of things than with words.

I'd bet on maybe ad hoc, AI-designed UIs that you click, with a voice search for when you are confused about something.

bluGill•6m ago
If you know what you want, then not talking to a human is faster. However, if you are not sure, a human can figure it out. I'm not sure I'd trust a voice assistant - the value in the human is an informed opinion, which is hard to program, while it is easy to program a recommendation for whatever makes the most profit. Of course humans often don't have an informed opinion either, but at least sometimes they do, and they will also sometimes admit it when they don't.
ngangaga•39m ago
I think it'd be pretty funny if to book travel in 2035 you need to use a travel agent that's objectively dumber than a human. We'd be stuck in the eighties again, but this time without each other to rely on.

Of course, that would be suicide for the industry. But I'm not sure investors see that.

vrighter•1h ago
Don't forget the law of large numbers. A 5% performance hit on one system is one thing; 5% across almost all of the current computing landscape is still a pretty huge value.
titzer•1h ago
It's about 5%.

Cost of cyberattacks globally[1]: O($trillions)

Cost of average data breach[2][3]: ~$4 million

Cost of lost developer productivity: unknown

We're really bad at measuring the secondary effects of our short-sightedness.

[1] https://iotsecurityfoundation.org/time-to-fix-our-digital-fo...

[2] https://www.internetsociety.org/resources/doc/2023/how-to-ta...

[3] https://www.ibm.com/reports/data-breach

pron•1h ago
But it's not free for the taking. The point is that we'd get more than that 5%'s worth in exchange. So sure, we'll get significant value "if software optimization was truly a priority", but we get even more value by making other things a priority.

Saying "if we did X we'd get a lot in return" is similar to the fallacy of inverting logical implication. The question isn't, will doing something have significant value, but rather, to get the most value, what is the thing we should do? The answer may well be not to make optimisation a priority even if optimisation has a lot of value.

vrighter•52m ago
Depends on whether the fact that software can be finished will ever be accepted. If you're constantly redeveloping the same thing to "optimize and streamline my experience" (please don't), then yes, the advantage is dubious. But if not, then the saved value in operating costs keeps increasing as time goes on. It won't make much difference in my homelab, but at datacenter scale it does.
pron•34m ago
Even the fact that value keeps increasing doesn't mean it's a good idea. It's a good idea if it keeps increasing more than other value. If a piece of software is more robust against attacks then the value in that also keeps increasing over time, possibly more than the cost in hardware. If a piece of software is easier to add features to, then that value also keeps increasing over time.

If what we're asking is whether value => X, i.e. to get the most value we should do X, you cannot answer that in the positive by proving X => value. If optimising something is worth a gazillion dollars, you still should not do it if doing something else is worth two gazillion dollars.

_aavaa_•1h ago
Except we've squandered that 1000x not on bounds checking but on countless layers of abstractions and inefficiency.
pydry•58m ago
Most of it was exchanged for abstractions which traded runtime speed for the ability to create apps quickly and cheaply.

The market mostly didn't want 50% faster code as much as it wanted an app that didn't exist before.

If I look at the apps I use on a day to day basis that are dog slow and should have been optimized (e.g. Slack, Jira), the core problem isn't really a lack of the industry's engineering capability to speed things up; it is just an instance of the principal-agent problem - i.e. I'm not the one buying, I don't get to choose not to use it, and dog-slow is just one of the many dimensions in which they're terrible.

fsloth•55m ago
I don’t think abundance vs speed is the right lens.

No user actually wants abundance. They use few programs and would benefit if those programs were optimized.

Established apps could be optimized to the hilt.

But they seldom are.

pydry•48m ago
>No user actually wants abundance.

No, all users just want the few programs which they themselves need. The market is not one user, though. It's all of them.

skydhash•26m ago
But each vendor only develops a few pieces of software and generally supports only three platforms, plus or minus one. It's so damning when I see projects reaching for Electron when they only support macOS and Windows. And software like Slack has no excuse for being this slow on anything other than a latest-gen CPU and a 1Gb internet connection.
bluGill•16m ago
Users only want 5% of the features of the few programs they use. However everyone has a different list of features and a different list of programs. And so to get a market you need all the features on all the programs.
infogulch•25m ago
> They use few programs

Yes but it's a different 'few programs' than 99% of all other users, so we're back to square one.

ffsm8•40m ago
> Most of it was exchanged for abstractions which traded runtime speed for the ability to create apps quickly and cheaply.

Really? Because while abstractions like that exist (i.e. web server frameworks, reactivity, SQL and ORMs, etc.), I would argue that these aren't the abstractions that cause the most maintenance and performance issues. Those usually live in the domain/business logic, and often they aren't something that made anything quicker to develop; they were instead created by a developer who just couldn't help themselves.

tonyarkles•33m ago
I think they’re referring to Electron.

Edit: and probably writing backends in Python or Ruby or JavaScript.

Zak•21m ago
The backend programming language usually isn't a significant bottleneck; running dozens of database queries in sequence is the usual bottleneck, often compounded by inefficient queries, inappropriate indexing, and the like.
grumpymuppet•55m ago
This is something I've wished to eliminate too. Maybe we just cast the past 20 years as the "prototyping phase" of modern infrastructure.

It would be interesting to collect a roadmap for optimizing software at scale -- where is there low hanging fruit? What are the prime "offenders"?

Call it a power saving initiative and get environmentally-minded folks involved.

Gigachad•46m ago
Am I taking crazy pills or are programs not nearly as slow as HN comments make them out to be? Almost everything loads instantly on my 2021 MacBook and 2020 iPhone. Every program is incredibly responsive. 5 year old mobile CPUs load modern SPA web apps with no problems.

The only thing I can think of that’s slow is Autodesk Fusion starting up. Not really sure how they made that so bad but everything else seems super snappy.

mjburgess•43m ago
People conflate the insanity of running a network cable through every application with the poor performance of their computers.
sorcerer-mar•42m ago
I think it's a very theoretical argument: we could of course theoretically make everything even faster. It's nowhere near the most optimal use of the available hardware. All we'd have to give up is squishy hard-to-measure things like "feature sets" and "engineering velocity."
high_na_euv•41m ago
Yup, people run software on shitty computers and blame all the software.

The only slow (local) software I know of is LLVM and C++ compilers.

Others are pretty fast.

flohofwoe•40m ago
I guess you don't need to wrestle with Xcode?

Somehow the Xcode team managed to make startup and some features in newer Xcode versions slower than older Xcode versions running on old Intel Macs.

E.g. the ARM Macs are a perfect illustration that software gets slower faster than hardware gets faster.

After a very short 'free lunch' right after the Intel => ARM transition we're now back to the same old software performance regression spiral (e.g. new software will only be optimized until it feels 'fast enough', and that 'fast enough' duration is the same no matter how fast the hardware is).

Another excellent example is the recent release of the Oblivion Remaster on Steam (which uses the brand new UE5 engine):

On my somewhat medium-level PC I have to reduce the graphics quality in the Oblivion Remaster so much that the result looks worse than 14-year old Skyrim (especially outdoor environments), and that doesn't even result in a stable 60Hz frame rate, while Skyrim runs at a rock-solid 60Hz and looks objectively better in the outdoors.

E.g. even though the old Skyrim engine isn't nearly as technologically advanced as UE5 and had plenty of performance issues at launch on a ca. 2010 PC, the Oblivion Remaster (which uses a "state of the art" engine) looks and performs worse than its own 14 year old predecessor.

I'm sure the UE5-based Oblivion remaster can be properly optimized to beat Skyrim both in looks and performance, but apparently nobody cared about that during development.

mschild•39m ago
A mix of both. There are a large number of websites that are inefficiently written, using up unnecessary amounts of resources. Semi-modern devices make up for that by just having a massive amount of computing power.

However, you also need to consider 2 additional factors. Macbooks and iPhones, even 4 year old ones, have usually been at the upper end of the scale for processing power. (When compared to the general mass-market of private end-consumer devices)

Try doing the same on a 4 year old 400 Euro laptop and it might look a bit different. Also consider your connection speed and latency. I usually have no loading issue either. But I have a 1G fiber connection. My parents don't.

_aavaa_•37m ago
I'd wager that a 2021 MacBook, like the one I have, is stronger than the laptops used by the majority of people in the world.

Life on an entry or even mid level windows laptop is a very different world.

josephg•26m ago
Yep. Developers make programs run well enough on the hardware sitting on our desks. So long as we’re well paid (and have decent computers ourselves), we have no idea what the average computing experience is for people still running 10yo computers which were slow even for the day. And that keeps the treadmill going. We make everyone need to upgrade every few years.

A few years ago I accidentally left my laptop at work on a Friday afternoon. Instead of going into the office, I pulled out a first generation raspberry pi and got everything set up on that. Needless to say, our nodejs app started pretty slowly. Not for any good reason - there were a couple modules which pulled in huge amounts of code which we didn’t use anyway. A couple hours work made the whole app start 5x faster and use half the ram. I would never have noticed that was a problem with my snappy desktop.
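
The fix in cases like this is often embarrassingly small. A sketch of the pattern, with a hypothetical heavy module standing in for the real offenders:

    // Before: a top-level `import { exportToPdf } from "heavy-report-lib"`
    // forces every startup to load and compile the whole dependency, even
    // on runs that never touch this code path. ("heavy-report-lib" is made up.)

    // After: a dynamic import defers that cost until first use.
    async function exportReport(data: unknown) {
      const { exportToPdf } = await import("heavy-report-lib");
      return exportToPdf(data);
    }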

thfuran•13m ago
I've found so many performance issues at work by booting up a really old laptop or working remotely from another continent. It's pretty straightforward to simulate either poor network conditions or generally low performance hardware, but we just don't generally bother to chase down those issues.
tjader•37m ago
I just clicked on the network icon next to the clock on a Windows 11 laptop. A gray box appeared immediately, about one second later all the buttons for wifi, bluetooth, etc appeared. Windows is full of situations like this, that require no network calls, but still take over one second to render.
subjectsigma•32m ago
I have a 2019 Intel MacBook and Outlook takes about five seconds to load and constantly sputters
CelestialMystic•31m ago
You are using a relatively high-end computer and mobile device. Go and find a cheap x86 laptop and try doing the same. It will be extremely painful. Most of this is due to a combination of Windows 11 being absolute trash and JavaScript being used extensively in applications/websites. JavaScript is a memory hog and can be extremely slow depending on how it is written (how you deal with loops massively affects the performance).

What is frustrating, though, is that until relatively recently these devices would work fine with JS-heavy apps and really well with anything using a native toolkit.

g-mork•22m ago
Lightroom non-user detected
alnwlsn•20m ago
It depends. Can Windows 3.11 be faster than Windows 11? Sure, maybe even in most cases: https://jmmv.dev/2023/06/fast-machines-slow-machines.html
xnorswap•13m ago
It vastly depends on what software you're forced to use.

Here's some software I use all the time, which feels horribly slow, even on a new laptop:

Slack.

Switching channels on slack, even when you've just switched so it's all cached, is painfully slow. I don't know if they build in a 200ms or so delay deliberately to mask when it's not cached, or whether it's some background rendering, or what it is, but it just feels sluggish.

Outlook

Opening an email gives a spinner before it's opened. Emails are about as lightweight as it gets, yet you get a spinner. It's "only" about 200ms, but that's still 200ms of waiting for an email to open. Plain text emails were faster 25 years ago. Adding a subset of HTML shouldn't have caused such a massive regression.

Teams

Switching tabs on Teams has the same delayed feeling as Slack. Every interaction feels like it's waiting 50-100ms before actioning. Clicking an empty calendar slot to book a new event gives 30-50ms of what I've mentally internalised as "Electron blank-screen", but there's probably a real name out there for basically waiting for a new dialog/screen to even have a chrome, let alone content. Creating a new calendar event should be instant; it should not take 300-500ms or so of waiting for the options to render.

These are basic "productivity" tools in which every single interaction feels like it's gated behind at least a 50ms debounce waiting period, with often extra waiting for content on top.

Is the root cause network hops or telemetry? Is it some corporate antivirus stealing the computer's soul?

Ultimately the root cause doesn't actually matter, because no matter the cause, it still feels like I'm wading through treacle trying to interact with my computer.

maccard•3m ago
I’d take 50ms but in my experience it’s more like 250.
makeitdouble•11m ago
To note, people will have wildly different tolerance to delays and lag.

On the extreme end, my retired parents don't feel the difference between 5s and 1s when loading a window or clicking somewhere. I offered a switch to a new laptop, cloning their data, and they didn't give a damn and just used whichever laptop was closest to them.

Most people aren't that desensitized, but for some a 600ms delay is instantaneous, while for others it's 500ms too slow.

maccard•4m ago
Slack, Teams, VS Code, Miro, Excel, Rider/IntelliJ, Outlook, and Photoshop/Affinity are all applications I use every day that take 20+ seconds to launch. My corporate VPN app takes 30 seconds to go from a blank screen to deciding if it's going to prompt me for credentials or remember my login, every morning. This is on an i9 with 64GB RAM and 1Gb fiber.

On the website front - Facebook, Twitter, Airbnb, Reddit, most news sites - all take 10+ seconds to load or be functional, and their core functionality has regressed significantly in the last decade. I'm not talking about features that I prefer, but as an example: if you load two links in Reddit in two different tabs, my experience has been that it's 50/50 whether they'll actually both load or whether one gets stuck partway, showing loading skeletons.

fsloth•57m ago
The problem is 1000xers are a rarity.

The software desktop users have to put up with is slow.

HappMacDonald•4m ago
You can always install DOS as your daily driver and run 1980's software on any hardware from the past decade, and then tell me how that's slow.

1000x referred to the hardware capability, and that hardware is not a rarity; it's here.

The trouble is how software has since wasted a majority of that performance improvement.

Some of it has been quality of life improvements, leading nobody to want to use 1980s software or OS when newer versions are available.

But the lion's share of the performance benefit got chucked into the bin with poor design decisions, layers of abstractions, too many resources managed by too many different teams that never communicate making any software task have to knit together a zillion incompatible APIs, etc.

justincormack•54m ago
Most programming languages have array bounds checking now.
scotty79•52m ago
Since 1980, maybe. But since 2005 it increased maybe 5x, and even that's generous. And that's half of the time that has passed - two whole decades.

https://youtu.be/m7PVZixO35c?si=px2QKP9-80hDV8Ui

ngneer•43m ago
I agree with the sentiment and analysis that most humans prefer short term gains over long term ones. One correction to your example, though. Dynamic bounds checking does not solve security. And we do not know of a way to solve security. So, the gains are not as crisp as you are making them seem.
HappMacDonald•12m ago
You don't have to "solve" security in order to improve security hygiene by a factor of X, and thus risk of negative consequences by that same factor of X.
bluGill•10m ago
Bounds checking solves one tiny subset of security. There are hundreds of other subsets that we know how to solve. However, these days the majority of the bad attacks are social, and no technology is likely to solve them - as more than 10,000 years of history of the same attack has shown. Technology makes the attacks worse because they now scale, but social attacks have been happening for longer than recorded history (well, there is every reason to believe that - there is unlikely to be evidence going back that far).
dist-epoch•19m ago
It's more like 100,000X.

Just the clockspeed increased 1000X, from 4 MHz to 4 GHz.

But then you have 10x more cores, 10x more powerful instructions (AVX), 10x more execution units per core.

dardeaup•1h ago
It could also run on much less current hardware if efficiency was a priority. Then comes the AI bandwagon and everyone is buying loads of new equipment to keep up with the Joneses.
caseyy•44m ago
Suddenly, American towns are running out of water, and we're building out nuclear power for text auto-complete. It's one for the history books, for sure.
SilverSlash•1h ago
The title made me think Carmack was criticizing poorly optimized software and advocating for improving performance on old hardware.

When in fact, the tweet is absolutely not about either of the two. He's talking about a thought experiment where hardware stopped advancing and concludes with "Innovative new products would get much rarer without super cheap and scalable compute, of course".

Cordiali•1h ago
It's related to a thread from yesterday, I'm guessing you haven't seen it:

https://news.ycombinator.com/item?id=43967208 https://threadreaderapp.com/thread/1922015999118680495.html

MrBuddyCasino•41m ago
This is exactly the point. People ignore that "bloat" is not (just) "waste"; it is a developer productivity increase motivated by economics.

The ability to hire and have people be productive in a less complicated language expands the market for workers and lowers cost.

ngangaga•36m ago
> "Innovative new products would get much rarer without super cheap and scalable compute, of course".

Interesting conclusion—I'd argue we haven't seen much innovation since the smartphone (18 years ago now), and it's entirely because capital is relying on the advances of hardware to sell what is to consumers essentially the same product that they already have.

Of course, I can't read anything past the first tweet.

wiz21c•1h ago
I'd much prefer Carmack to think about optimizing for energy consumption.
threetonesun•1h ago
Obviously, the world ran before computers. The more interesting part of this is what would we lose if we knew there were no new computers, and while I'd like to believe the world would put its resources towards critical infrastructure and global logistics, we'd probably see the financial sector trying to buy out whatever they could, followed by any data center / cloud computing company trying to lock all of the best compute power in their own buildings.
Mindwipe•1h ago
Probably, but we'd be in a pretty terrible security place without modern hardware based cryptographic operations.
margorczynski•58m ago
The priority should be safety, not speed. I prefer, e.g., a slower browser or OS that isn't riddled with exploits and attack vectors.

Of course that doesn't mean everything should be done in JS and Electron as there's a lot of drawbacks to that. There exists a reasonable middle ground where you get e.g. memory safety but don't operate on layers upon layers of heavy abstraction and overhead.

wtetzner•54m ago
Unfortunately currently the priority is neither.
bob1029•55m ago
We've been able to run order matching engines for entire exchanges on a single thread for over a decade by this point.

I think this specific class of computational power - strictly serialized transaction processing - has not grown at the same rate as other metrics would suggest. Adding 31 additional cores doesn't make the order matching engine go any faster (it could only go slower).

If your product is handling fewer than several million transactions per second and you are finding yourself reaching for a cluster of machines, you need to back up like 15 steps and start over.
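
As a toy illustration of why a single thread suffices: a price-time matching loop is just a tight pass over an in-memory book, with no locks or cross-core coordination to pay for. All names here are hypothetical, not any real exchange's engine.

    type Order = { id: number; side: "buy" | "sell"; price: number; qty: number };

    const bids: Order[] = []; // resting buys, best (highest) price first
    const asks: Order[] = []; // resting sells, best (lowest) price first

    function submit(order: Order) {
      const book = order.side === "buy" ? asks : bids; // match the opposite side
      const crosses = (top: Order) =>
        order.side === "buy" ? top.price <= order.price : top.price >= order.price;

      // Fill against the best resting orders while the prices cross.
      while (order.qty > 0 && book.length > 0 && crosses(book[0])) {
        const top = book[0];
        const fill = Math.min(order.qty, top.qty);
        order.qty -= fill;
        top.qty -= fill;
        if (top.qty === 0) book.shift(); // fully filled, drop from the book
      }

      // Whatever remains rests on its own side, kept sorted best-first.
      if (order.qty > 0) {
        const rest = order.side === "buy" ? bids : asks;
        rest.push(order);
        rest.sort((a, b) => (order.side === "buy" ? b.price - a.price : a.price - b.price));
      }
    }

A real engine adds order types, price levels, and persistence, but the strictly serialized core is the same shape, which is why throughput tracks single-core speed rather than core count.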

HolyLampshade•49m ago
> We've been able to run order matching engines for entire exchanges on a single thread for over a decade by this point.

This is the bit that really gets me fired up. People (read: system “architects”) were so desperate to “prove their worth” and leave a mark that many of these systems have been over complicated, unleashing a litany of new issues. The original design would still satisfy 99% of use cases and these days, given local compute capacity, you could run an entire market on a single device.

VagabundoP•53m ago
I've installed macOS Sequoia on 2015 iMacs with 8 gigs of RAM and it runs great. More than great, actually.

Linux runs well on 10-15 year old laptops, and if you beef up the RAM and SSD, actually really well.

So for everyday stuff we can and do run on older hardware.

voidUpdate•47m ago
I mean, if you put win 95 on a period appropriate machine, you can do office work easily. All that is really driving computing power is the web and gaming. If we weren't doing either of those things as much, I bet we could all quite happily use machines from the 2000s era
therealmarv•34m ago
Tell me about it. Web development has only become fun again for me since upgrading from an Intel Mac to an M4 Mac.

Just throw in Slack chat, a VS Code editor in Electron, a Next.js stack, 1-2 Docker containers, and one browser, and you need top-notch hardware to run it fluidly (Apple Silicon is amazing, though). I'm doing no fancy stuff.

Chat, an editor in a browser, and Docker don't seem like the most efficient things when all put together.

caseyy•24m ago
There is an argument to be made that the market buys bug-filled, inefficient software about as well as it buys pristine software. And one of them is the cheapest software you could make.

It's similar to the "Market for Lemons" story. In short, the market sells as if all goods were high-quality but underhandedly reduces the quality to reduce marginal costs. The buyer cannot differentiate between high and low-quality goods before buying, so the demand for high and low-quality goods is artificially even. The cause is asymmetric information.

This is already true and will become increasingly more true for AI. The user cannot differentiate between sophisticated machine learning applications and a washing machine spin cycle calling itself AI. The AI label itself commands a price premium. The user overpays significantly for a washing machine[0].

It's fundamentally the same thing when a buyer overpays for crap software, thinking it's designed and written by technologists and experts. But IC1-3s write 99% of software, and the 1 QA guy in 99% of tech companies is the sole measure to improve quality beyond "meets acceptance criteria". Occasionally, a flock of interns will perform an "LGTM" incantation in hopes of improving the software, but even that is rarely done.

[0] https://www.lg.com/uk/lg-experience/inspiration/lg-ai-wash-e...

eth0up•24m ago
Yeah, having browsers the size and complexities of OSs is just one of many symptoms. I intimate at this concept in a grumbling, helpless manner somewhat chronically.

There's a lot today that wasn't possible yesterday, but it also sucks in ways that weren't possible then.

I foresee hostility for saying the following, but it really seems most people are unwilling to admit that most software (and even hardware) isn't necessarily made for the user or its express purpose anymore. To be perhaps a bit silly, I get the impression of many services as bait for telemetry and background fun.

While not an overly earnest example, looking at Android's Settings/System/Developer Options is pretty quick evidence that the user is involved but clearly not the main component in any respect. Even an objective look at Linux finds manifold layers of hacks and compensation for a world of hostile hardware and soft conflict. It often works exceedingly well, though as impractical as it may be to fantasize, imagine how badass it would be if everything was clean, open and honest. There's immense power, with lots of infirmities.

I've said that today is the golden age of the LLM in all its puerility. It'll get way better, yeah, but it'll get way worse too, in the ways that matter.[1]

Edit: 1. Assuming open source doesn't persevere

nashashmi•21m ago
The world could run on older hardware if rapid development did not also make money.

Rapid development is creating a race towards faster hardware.

1970-01-01•19m ago
The idea of a hand-me-down computer made of brass and mahogany still sounds ridiculous because it is, but we're nearly there in terms of Moore's law. We have true 2nm within reach and then the 1nm process is basically the end of the journey. I expect 'audiophile grade' PCs in the 2030s, and then PCs become works of art, furniture, investments, etc. because they have nowhere to go.

https://en.wikipedia.org/wiki/2_nm_process

https://en.wikipedia.org/wiki/International_Roadmap_for_Devi...

ubermonkey•11m ago
The increasing longevity of computers has been impressing me for about 10 years.

My current machine is 4 years old. It's absolutely fine for what I do. I only ever catch it "working" when I futz with 4k 360 degree video (about which: fine). It's a M1 Macbook Pro.

I traded its predecessor in to buy it, so I don't have that one anymore; it was a 2019 model. But the one before that, a 2015 13" Intel Macbook Pro, is still in use in the house as my wife's computer. Keyboard is mushy now, but it's fine. It'd probably run faster if my wife didn't keep fifty billion tabs open in Chrome, but that's none of my business. ;)

The one behind that one, purchased in 2012, is also still in use as a "media server" / ersatz SAN. It's a little creaky and is I'm sure technically a security risk given its age and lack of updates, but it RUNS just fine.

WJW•19m ago
Well obviously. And there would be no wars if everybody made peace a priority.

It's obvious for both cases where the real priorities of humanity lie.

don_searchcraft•16m ago
100% agree with Carmack. There was a craft in writing software that I feel has been lost with access to inexpensive memory and compute. Programmers can be inefficient because they have all that extra headroom to do so which just contributes to the cycle of needing better hardware.
myth_drannon•7m ago
Software development has been commoditized and is directed by MBAs and others who don't see it as a craft. The need for fast project execution is placed above the craft of programming; hence, the code is bug-riddled and slow. There are some niche areas (vintage computing, PICO-8, Arduino...) where people can still practise the craft, but that's just a hobby now. When this topic comes up I always think about Tarkovsky's Andrei Rublev movie, and the artist's struggle.
ricardo81•13m ago
Minimalism is excellent. As others have mentioned, using languages that are more memory safe (assuming the language is written in such a way) may be worth the additional complexity cost.

But surely, with burgeoning AI use, efficiency savings are being gobbled up by the brute-force nature of it.

Maybe model training and the likes of Hugging Face can keep different groups from reinventing the same AI wheel, using more resources than a cursory search of an existing resource would have cost.

armchairhacker•13m ago
Is there, or could we make, an iPhone-like device that runs 100x slower than conventional phones but uses much less energy, so it powers itself on solar? It would be good for the environment and useful in survival situations.

Or could we make a phone that runs 100x slower but is much cheaper? If it also runs on solar it would be useful in third-world countries.

Processors are more than fast enough for most tasks nowadays; more speed is still useful, but I think improving price and power consumption is more important. Also cheaper E-ink displays, which are much better for your eyes, more visible outside, and use less power than LEDs.

noobermin•6m ago
I'm already moving in this direction in my personal life. It's partly nostalgia, but it's partly practical. It's just that work requires working with people who only use what HR and IT foist on them, so I need a separate machine for that.

Recursive Becoming: Theory of Everything

https://twitter.com/dvcoolster/status/1922266236173365313
1•dvcoolster•1m ago•1 comments

Footage Captures Ground Shifting Live During Myanmar Earthquake

https://twitter.com/Geo_Risk/status/1921735829199679868
1•Ridius•2m ago•0 comments

Show HN: Stickr.Shop – Buy Stickers via SSH

https://stickr.shop
2•f4n4tiX•2m ago•0 comments

Match to lay off 13% of staff as number of paid users fall

https://techcrunch.com/2025/05/08/match-to-lay-off-13-of-staff/
1•Funes-•5m ago•0 comments

The End of Sierra as We Knew It, Part 3: The Dog Days of Oakhurst

https://www.filfre.net/2025/05/the-end-of-sierra-as-we-knew-it-part-3-the-dog-days-of-oakhurst/
1•doppp•6m ago•0 comments

Why Apple can’t just quit China

https://restofworld.org/2025/apple-china-dependence-tariffs-india-shift/
1•vinnyglennon•7m ago•0 comments

Argonne team uses Aurora to investigate potential dark energy breakthrough

https://www.anl.gov/article/argonne-team-uses-aurora-supercomputer-to-investigate-potential-dark-energy-breakthrough
1•rbanffy•9m ago•0 comments

China Makes High-Speed Laser Links in Orbit

https://spectrum.ieee.org/satellite-internet-china-crosslink
1•pseudolus•9m ago•0 comments

Beyond Incentives: How to Build Durable DeFi

https://www.coindesk.com/opinion/2025/04/22/beyond-incentives-how-to-build-durable-defi
1•PaulHoule•10m ago•0 comments

The Scouring of the Shire: a letter from ex-Palantir staff to tech workers [pdf]

https://s3.documentcloud.org/documents/25930212/the-scouring-of-the-shire.pdf
2•DonnyV•11m ago•0 comments

A calculator app? Anyone could make that

https://twitter.com/ChadNauseam/status/1890889465322786878
1•FjordWarden•11m ago•0 comments

In a high-stress work environment, prioritize relationships

https://wqtz.bearblog.dev/high-stress-job-relationships/
2•wqtz•12m ago•0 comments

Polkadot Naming Service

https://pns-swart.vercel.app/
2•andre15silva•13m ago•0 comments

Is Free-Threading Our Only Option?

https://discuss.python.org/t/is-free-threading-our-only-option/91775
1•Yiling-J•14m ago•0 comments

Chinese Weapons Gain Credibility After Pakistan-India Conflict

https://www.bloomberg.com/news/articles/2025-05-13/success-of-chinese-jets-against-india-raises-alarm-in-asia
1•belter•14m ago•0 comments

How the United States Gave Up Being a Science Superpower

https://steveblank.com/2025/05/13/how-the-united-states-became-a-science-superpower-and-how-quickly-it-could-crumble/
27•enescakir•16m ago•2 comments

Find Your Early Adopters

https://www.lennysnewsletter.com/p/consumer-business-find-first-users
1•harperlee•17m ago•0 comments

Lua for Elixir

https://davelucia.com/blog/lua-elixir
2•davydog187•17m ago•0 comments

The FreeBSD-native-ish home lab and network

https://antranigv.am/posts/2024/06/freebsd-server-network-homelab/
2•bediger4000•18m ago•0 comments

Google Search Engineer Rants on DOJ's Anti-Trust Case

https://www.seroundtable.com/google-search-engineer-x-doj-rant-39397.html
2•rexbee•20m ago•1 comments

Why the Poor Vote for the Right (and Stop Demanding More Equality)

https://www.unibocconi.it/en/news/why-poor-vote-right-and-stop-demanding-more-equality
23•rbanffy•21m ago•8 comments

Tower Defense: Cache Control

https://www.jasonthorsness.com/26
1•jasonthorsness•22m ago•0 comments

Claude Code as one-shot MCP server

https://github.com/steipete/claude-code-mcp
1•tosh•22m ago•0 comments

As US vuln-tracking falters, EU enters with its own security bug database

https://www.theregister.com/2025/05/13/eu_security_bug_database/
1•voxadam•22m ago•0 comments

Launch HN: Miyagi (YC W25) turns YouTube videos into online, interactive courses

2•bestwillcui•24m ago•0 comments

Eldercare robot helps people sit and stand, and catches them if they fall

https://news.mit.edu/2025/eldercare-robot-helps-people-sit-stand-catches-them-fall-0513
1•LorenDB•27m ago•1 comments

NASA Live on Twitch 3pm EDT Today: 'Design Artemis II Moon Mascot'

https://www.twitch.tv/nasa
1•bookofjoe•27m ago•1 comments

NASA turns the screams of a dying star into music

https://www.space.com/astronomy/nasa-turns-the-screams-of-a-dying-star-into-music
1•dylan604•28m ago•0 comments

Tannapfel, a German Success Story

https://forbetterscience.com/2025/05/13/tannapfel-a-german-success-story/
1•Tomte•30m ago•0 comments

We can no longer run Microsoft Store on 1809/LTSC 2019

https://github.com/fernvenue/microsoft-store
3•fernvenue•31m ago•0 comments