this way all the RAM that AI data centers scoop up will be used to lessen the demand for RAM that those same data centers created
net-zero RAM!
Node.js and Python were used in 2012; why is now any different?
I do enjoy golang, but Rust gives me nightmares. I make my living in higher level languages.
When I started learning to program, JavaScript was just starting to gain popularity outside of the browser. It was the first language I could actually grasp, and I largely thank it for giving me a career.
No more evictions for me!
The only real downside to JavaScript being used as a tool for native apps with stuff like Electron is that it eats RAM. Everything needs to ship a full Chrome binary.
But if we go back to native applications, we don't get things like quality Linux ports. If you had told me 15 years ago that Microsoft would create the most popular IDE on Linux, I'd have assumed that you had misspoken.
Yes, even though the JavaScript was written using doubles and the WASM was written using 64-bit ints. It just means that it's possible to write optimized JavaScript (mainly by reducing object allocations: reuse objects instead of creating new ones).
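The allocation-avoidance idea can be sketched like this (a minimal made-up example; the function names and the vector shape are mine, not from either article):

```javascript
// Allocating a fresh result object on every call puts pressure on the GC
// when the function runs in a hot loop.
function addVecAlloc(a, b) {
  return { x: a.x + b.x, y: a.y + b.y }; // new object per call
}

// Writing into a caller-supplied scratch object reuses one allocation
// for the whole loop, which is the optimization being described.
function addVecInto(a, b, out) {
  out.x = a.x + b.x;
  out.y = a.y + b.y;
  return out;
}

const scratch = { x: 0, y: 0 };
const r = addVecInto({ x: 1, y: 2 }, { x: 3, y: 4 }, scratch); // r === scratch
```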
Of course it is always possible to write highly optimised code. But that's not what people actually do, because of time, skill and maintenance constraints. Here's a case study: in 2018 Mozilla ported some code from JS to Rust + WASM and got a 6x speed up [1]. An expert in V8 responded to this with highly optimised JavaScript, in "Maybe you don't need Rust and WASM to speed up your JS" [2]. Both articles are worth reading! But it is worth remembering that it's a lot quicker and easier to write the code in [1] than [2], and it is easier to maintain as well.
[1] - https://hacks.mozilla.org/2018/01/oxidizing-source-maps-with...
[2] - https://mrale.ph/blog/2018/02/03/maybe-you-dont-need-rust-to...
Performance sucked when I used native JavaScript BigInts. When I made my own BigInt by using an array of doubles, and pretended that the doubles were 48-bit integers, performance was much better. Using the arrays meant that all allocation of temporary values completely stopped. I had to write my own multiply-and-add function that would do bigint = bigint * 48-bit number + other bigint + other 48-bit number.
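A minimal sketch of that limb-array approach (all names are mine; I use 24-bit limbs so every intermediate product stays exact within a double's 53-bit mantissa, whereas the 48-bit limbs described above need a split multiply that would roughly double the code):

```javascript
// Big integer stored as an array of doubles, each holding a 24-bit "limb",
// least significant limb first.
const LIMB_BITS = 24;
const LIMB_BASE = 1 << LIMB_BITS; // 16777216

// In-place fused op: limbs = limbs * m + add, for small ints m, add < 2^24.
// Mutating the array avoids allocating temporaries, which is the whole point.
function mulAddSmall(limbs, m, add) {
  let carry = add;
  for (let i = 0; i < limbs.length; i++) {
    const t = limbs[i] * m + carry; // exact: below 2^48 + 2^25 < 2^53
    limbs[i] = t % LIMB_BASE;
    carry = Math.floor(t / LIMB_BASE);
  }
  while (carry > 0) { // grow the array only when the number outgrows it
    limbs.push(carry % LIMB_BASE);
    carry = Math.floor(carry / LIMB_BASE);
  }
  return limbs;
}

// Example: build 123456789 digit by digit, base-10 Horner style.
const n = [0];
for (const d of "123456789") mulAddSmall(n, 10, +d);
```

The key property is that `mulAddSmall` mutates the array in place, so a long-running computation allocates nothing after the initial array.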
Idle power usage is what matters.
The 15 seconds it takes to launch Discord and install updates isn't going to be driving the overall efficiency of your PC.
Not when Windows gets its grubby mitts on them. I will frequently hear the fans spin up on my Win10 laptop when it should be doing nothing, only to find the Windows Telemetry process or the Search Indexer using an entire fucking CPU core.
Incidentally, cars are also a lot more fuel efficient these days than they used to be.
Make the cutoff 2026. If you really need more cycles than we have today to solve a problem you're doing something wrong! Stop creating waste and forcing us to buy new trash all of the time.
I have an old laptop with 8GB of RAM and an ancient CPU that I haul around when I need something small for basic work. I can run Discord, Visual Studio Code, and Chrome just fine.
Something else was going on with that PC.
Or the kid did an excellent job of socially engineering his parents into an upgrade.
Biggest ones are Book Off (books, comics...), Hard Off (electronics, computers, musical instruments...) and Hobby Off (toys, collectibles, video games...).
They even have a Liquor Off! (not second hand, just discount/overstock)
Also: Mode Off (fashion).
As far as gaming is concerned, the "gaming" parts of Akihabara mainly concerned locally produced pastel-toned 2D slideshow pornographic games ("visual novels"), the genre that led to gacha games like FGO. The local populace is horrible at handling 3D first-person content in general, and that never helped.
Non-console gaming in Japan is growing somewhat, but a lot is also going into phones, namely Genshin. So where the trend is headed is still pastel-toned soft-porn games without much PvP.
I also wish I'd built a new gaming rig last summer.
That's a perfectly fine modern PC. Similar to or faster than what some of my casual gaming friends use.
The RAM alone would probably sell for $350 or more.
Wow RAM prices have gotten absolutely insane.
The specs are still more than enough for any of the development I do. My main issue is just the form factor - a laptop is so much better for my current situation. The power consumption also kind of bugs me - things have improved a LOT in that regard since 2019 - although it's kind of nice for heating my office in winter. It also doesn't help that I built it with a full ATX motherboard and a giant case (Fractal Design Define R6), which is kind of ugly and takes up a ton of space.
wrxd•19h ago
kasane_teto•19h ago
nehal3m•19h ago
embedding-shape•18h ago
georgefrowny•14h ago
lysace•18h ago
Software takes a lot of time to build. Codebases live for decades. There's often an impossibly large cost in starting over with a less wasteful architecture/language/etc. Think going from an Electron/Chromium app to something built using some compiled language and native OS GUI constructs that uses 10x less resources.
Workaccount2•18h ago
Hardware by nature forces redesigns, whereas with software it's always possible to just keep building on top of old bad designs, and it's so much cheaper and faster to do so. That's why hardware is 10,000x faster than 30 years ago, while even simple word processors are only debatably faster than 30 years ago. Maybe even slower.
immibis•18h ago
Krutonium•17h ago
The ARM core starts up, does crypto, loads the SecureOS and the BIOS, then it starts the x86 CPU, in 16-bit mode! Which then bootstraps itself through 32- and then 64-bit mode.
So in the first couple seconds of power-on, your CPU is at various points ARM, i386, x86, and x86_64.
lysace•17h ago
immibis•15h ago
At best, they might have been able to confine the needed logic patches to the instruction-decoding front end.
palmotea•17h ago
Well, what if I want to run a 16-bit OS?
Also, I wonder if the transistor count of a literal entire 8086 processor is so small relative to the total that they just do that.
According to https://en.wikipedia.org/wiki/Transistor_count#Microprocesso...:
So you could fit 200,000+ 8086s on that not-so-cutting-edge silicon.
immibis•15h ago
Compatibility mode doesn't work by having a separate 16-bit core. It's random bits of spaghetti logic to make the core work like a 16-bit core when the 32-bit flag isn't set.
latentsea•17h ago
This is the first I'm learning about this, and I'm curious why it needs to be the case. Seems so wild that it works this way, but I'm sure there's a logic to it.
immibis•17h ago
> it's always possible to just keep building on top of old bad designs, and so much cheaper and faster to do so
ChoGGi•16h ago
pwg•15h ago
It began life as an "out of band" way to administer servers, so that an ops team could do everything (other than actual hardware changes) remotely that would otherwise need a person standing in front of the server in the datacenter poking commands into a keyboard.
It then grew in responsibilities to also support the "secure boot" aspect of system startup, and beyond some Intel CPU version point (I do not remember which point), it exists in every Intel CPU produced.
UltraSane•15h ago
serpent•16h ago
graemep•15h ago
What does the ARM CPU do?
jerrysievert•9h ago
2OEH8eoCRo0•18h ago
Hamuko•18h ago
Game developers might have to do something though if high-end GPUs are going to end up being $5000.
wincy•18h ago
Hamuko•17h ago
https://www.techpowerup.com/344578/leaks-predict-usd-5000-rt...
isk517•16h ago
mananaysiempre•16h ago
tverbeure•15h ago
Which makes sense: the latency is determined by the underlying storage technology and the way that storage is accessed, which is the same for both.
kube-system•14h ago
RAM manufacturers are switching lines over from DDR to make HBM.
wolvoleo•11h ago
duffyjp•14h ago
I've declined the refresh I'm overdue for. My 2021 model MBP has 32gb and a 1TB SSD. They're currently giving out the base model Air: 16gb and 256gb. No thanks.
We used to get whatever was most appropriate for our role, now we get whatever is cheapest.
UltraSane•15h ago