This is a lot better than my memories of forcing a Pentium MMX 200 MHz PC with 32 MB SDRAM and an ATI All-in-Wonder Pro to run games from the early 2000s.
It was a bit faster than software rendering (but hey, I suppose if you weren't doing any transparency, that makes it easier lol).
I remember what a huge difference it was having a dedicated 3D card capable of fast 2D and 3D vs the software rasterizer. Yes, NovaLogic games ran better. Yes, you could play Doom at a decent FPS. Yes, SpecOps ran at full monitor resolution. They had a LOT to brag about.
As a developer, I'm sure Glide was great.
But as a kid that really wanted a 3dfx Voodoo card for Christmas so I could play all the sweet 3D games that only supported Glide, I was upset when my dad got me a Rendition Verite 2200. But I didn't want to seem ungrateful, so my frustration was directed at 3dfx for releasing a proprietary API.
I was glad that Direct3D and OpenGL quickly surpassed Glide's popularity.
But yeah, then 3dfx failed to innovate. IIRC, they lagged behind in 32-bit color rendering support and let themselves get caught with their pants down when NVIDIA released the GeForce and introduced hardware transform and lighting, which allowed the GPU to be more than just a texturing engine. I think that was the nail in 3dfx's coffin.
Thanks for the laugh about your disappointment with your dad. I had a similar thing happen with mine: I asked for Doom and, him being a Mac guy, he came back with Bungie's Marathon. I was upset until I played Marathon… I then realized how wise my father was.
Single-digit FPS can _absolutely_ be playable if you're a desperate enough ten-year-old...
This would have been on some kind of Pentium 4 with integrated graphics. Not my earliest PC, but the first one I played any games on more advanced than the Microsoft Entertainment Packs.
I had to look at the ground and get the camera as close as possible to cross between the AH and the bank in IF. Otherwise I’d get about 0.1 fps and had to close the game, which meant waiting in line to get back. Those were the days.
> So with the right UI layout made from addons I could still be a pretty effective healer.
I got pretty good with the timings and could almost play without looking at the screen. But I was DD and it was vanilla so nobody cared if I sucked as long as I got far away with the bombs.
> I don't even remember what the dungeons looked like, just a giant grid of health bars, buttons and threat-meter graphs.
I was talking a couple of weeks ago with a mate who was MT at the time and told me he knew the feet and legs of all the bosses but never saw the animations or the faces before coming back with an alt a couple of years later. I was happy as a warlock, enjoying the scenery. With a refresh rate that gave me ample time to admire it before the next frame :D
Absolutely, sweet memories of playing at less than 10 fps using ZSNES on a 486 DX2 back in 1999...
Countless kids played Morrowind below par spec on family computers all across America.
I have fond memories of playing Diablo II at 16 fps on an old (even at the time) PowerMac. I am not sure I could do it now.
And somehow, it was more mesmerizing than playing games feels now. To be a kid again.
Wasn't AoE1 released for PPC Mac natively? AoE2 was probably the best Mac game ever.
The DGX Spark and Mac Studio are currently the two best Arm-based platforms for running that game; it seems to like a lot of CPU to feed a decent GPU.
To your point about 'meaningful' though: indeed, the ole College Try at running Crysis on a Samsung NC-10 would be far more glorious! But I assure you this was very fun for me.
Nothing. It’s just fun.
> It would have been more meaningful if the author tried the GPU card with an old machine, rather than a Raspberry Pi
But then it would have been lame. Who cares? If your old machine is an x86 less than 10 years old, it's most likely faster than the Pi. But that's not the point. The point is to pair a cheap fun computer with a humongous and expensive card and see if it works. Because it's fun.
Also, it doesn't seem like it would be all that much more expensive for these high-end GPUs to start getting x86-64 SoCs with midrange specs baked in, and these AIO GPUs could be tailor-made for standalone AI and gaming applications. If it's the equivalent of a $10 bit of gear in terms of cost, they could charge an additional $100 for the feature, with a SoC optimized for the specs of the GPU: get rid of the need for an eGPU altogether and stream from the onboard host?
That and the prices never really came back down to earth after the chip shortage hikes.
No. There are a bunch of alternatives with partial to full pin compatibility, some of them many times faster [1]. No new projects should use a new Raspberry Pi.
- high-current 5 V USB power supply you probably don't have
- micro-HDMI port you have like 1 cable for
- PCIe through a very fragile ribbon cable + a hodgepodge of adapters
- more adapters needed for an SSD
- no case, but needs ample airflow
- power input is on the side and sticks out
GPIO is the killer feature, but I'll be honest: 99% of the hardware hacking I do is with microcontrollers much cheaper than a Pi that provide a serial port over USB anyway (and the commonly-confused-for-a-full-Pi Pi Pico is pretty great for this; see the sketch below).
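For anyone curious, the serial-over-USB workflow on a Pico really is minimal. A sketch in MicroPython, assuming the stock MicroPython firmware is flashed (the "LED" pin alias is board-dependent; on an original Pico, Pin(25) also works):

```python
# Toggle the onboard LED and report its state over the USB serial console.
from machine import Pin
import time

led = Pin("LED", Pin.OUT)  # "LED" aliases the onboard LED on Pico / Pico W

while True:
    led.toggle()
    print("led:", led.value())  # print() lands on the USB CDC serial port
    time.sleep(0.5)
```

Save it as main.py on the board and anything it prints shows up on the USB serial console, no extra adapter needed.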
We had a problem trying to bring up a couple of Pi 5s, hoping they'd represent something reproducible we could deploy on multiple sites as an isolation stage for remote firmware programming. Everything looked great, until we brought one somewhere and untethered it from Ethernet, and we started getting bizarre hangs. Turned out the wifi was close enough to the PCIe ribbon cable that bursts of wifi broadcasts were enough to disrupt the signal to the SSD and essentially unmount it (taking root with it). Luckily we were able to find better-shielded cables, but it's not something we were expecting to have to deal with.
The only case I can think of is very heavy compute that relies on low latency GPIO related to that compute?
This blog post shows a $2000 GPU attached to a slow SBC that costs less than 1/10th of the GPU.
It’s interesting. It’s entertaining. It’s a fun read. But it’s not a serious setup that anyone considers optimal.
I tried a lot of things, including a full Windows reinstall, a driver rollback, cleaning out the dust, etc. The crash reason is listed as an "other" Nvidia driver error code.
On Bazzite using Proton it works flawlessly: God of War, KCD2 and others. I guess it will be Linux gaming for me from now on.
I am still puzzled how this situation can even happen. If you have ideas, be my guest.
Doom The Dark Ages is a single-player game, so I'm not sure who you'd be cheating against, aside from maybe some real Buzz Killingtons saying you're "cheating Microsoft by pirating it".
I know that sounds a little pedantic, but typically DRM involves an identity layer (who is allowed to access what?). Denuvo doesn't care about that; it's even theoretically possible to make a Denuvo-protected binary anyone could use.
So plugging a RasPi into a 5090 is "just" swapping the horse for one 10,000x bigger (someone correct my ratio of the RasPi5 GPU to the RTX5090)
It's a quirk of the Broadcom chips that the RPi family uses; the GPU is the first bit of silicon to power up and do things. The GPU specifically is a bit unusual, but the general idea of "smaller thing does initial bring-up, then powers up $main_cpu" is not unusual once $main_cpu is powerful enough to run Linux.
Interesting
Pi 4: 20 FPS (the same when using ffmpeg to stream to Twitch). 5 W
Pi 5: 40 FPS (same as above). 10 W
RK3588: 300+ FPS, and rock-solid 60 FPS streaming to Twitch. 15 W
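For the streaming half of those numbers, the pipeline is roughly this shape. A sketch in Python with everything hedged: the x11grab capture source, resolution, bitrate and STREAM_KEY placeholder are all assumptions to adapt per board (on a Pi you'd likely swap libx264 for a hardware encoder such as h264_v4l2m2m):

```python
# Capture the screen with ffmpeg and push H.264 over RTMP to Twitch's ingest.
import subprocess

STREAM_KEY = "live_xxxxxxxx"  # hypothetical placeholder, not a real key

subprocess.run([
    "ffmpeg",
    # input 1: X11 screen capture (assumes an X session on display :0)
    "-f", "x11grab", "-framerate", "30", "-video_size", "1280x720", "-i", ":0",
    # input 2: silent audio track; Twitch ingest generally expects one
    "-f", "lavfi", "-i", "anullsrc=channel_layout=stereo:sample_rate=44100",
    # video: software x264 at roughly 3 Mbps with a 2-second keyframe interval
    "-c:v", "libx264", "-preset", "veryfast",
    "-b:v", "3000k", "-maxrate", "3000k", "-bufsize", "6000k",
    "-g", "60", "-pix_fmt", "yuv420p",
    # audio: AAC, then wrap everything in FLV for RTMP
    "-c:a", "aac", "-b:a", "128k",
    "-f", "flv", f"rtmp://live.twitch.tv/app/{STREAM_KEY}",
], check=True)
```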
So a 5090 is not even interesting for gameplay. More polygons and larger textures do not make games more fun to play.
AAA has peaked, and C++ does not even deliver interesting games anymore. C#/Java are way better alternatives for modding.
I think the sweet spot for the Pi 5 is 4GB (cost vs functionality you can use it for). But if you're like me, you don't care about value quite as much as fun/exploration. And for that, the more RAM, the merrier...
Managed to complete the games with decent graphics and framerate at the time. It wasn't an ideal setup, but I didn't care. In fact, I thought it was a cool hack to play games at the time without forking out a lot of money to build a gaming PC.
There are probably better options now for gaming than attaching a dedicated GPU to whatever hardware you already have, but I can verify that external GPUs are really cool and useful (though a 5090 is definitely not needed). You also don't have to care about cooling the GPU, since it's "atmosphere" cooled (though headphones and/or ANC are a must).