But, a plain answer: Via Eden boards still use a north/southbridge architecture, and are from the mid-2000s.
It's just that modern Windows/Linux have dropped the ability. Or perhaps the confusion is between 16/32-bit and 32/64-bit: you can't run 16-bit code on a 64-bit OS, which still boils down to "operating system."
By far the biggest issue, though, is that even the Via Eden processor is significantly faster than a 486, and lots of software (especially games) from that era used no-op instruction loops for timing. This results in things like The Incredible Machine's level timer running out in half a second or less.
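To make the failure mode concrete, here's a minimal sketch of that era's timing idiom in C (hypothetical code; DELAY_COUNT is an invented, hand-tuned constant, not from any real game):

    #include <stdio.h>

    /* Delay loop hand-tuned for one CPU speed (say, a 33 MHz 486).
     * 'volatile' stops a modern compiler from deleting the empty loop. */
    static void delay_one_tick(void)
    {
        enum { DELAY_COUNT = 500000 };  /* assumed calibration constant */
        for (volatile long i = 0; i < DELAY_COUNT; i++)
            ;  /* burn cycles: wall-clock time scales with CPU speed */
    }

    int main(void)
    {
        /* A "ten second" countdown that is really just loop iterations:
         * on a CPU ten times faster it expires in roughly one second. */
        for (int seconds = 10; seconds > 0; seconds--) {
            printf("Time left: %d\n", seconds);
            delay_one_tick();
        }
        return 0;
    }

Nothing in the loop ties it to wall-clock time, so the faster the CPU, the faster the "timer" runs.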
Neither is a fit; SDRAM was a Pentium/K6-era standard (PC66). The DIMMs ran faster than a non-overclocked 486 bus, which on clock-doubled parts ran at half the CPU clock. The "natural fit" for a 486 would be FPM or EDO, if you wanted to be era-correct.
There were probably some off-the-wall 486 motherboards back then that supported SDR (post-1993...), but those would have been toward the very end of the 486's consumer life cycle. Hybrid boards did exist in the 486 era: some could run (or had an embedded) 386 using FPM alongside an open 486 socket, with the option, but not the requirement, to use EDO.
Anyway, this is someone's project, so they can do whatever the heck they want.
robinsonb5•26m ago
Quite apart from the increased complexity, the most important difference is that there's a minimum speed as well as a maximum speed for modern DDR RAM, which means there's usually quite a narrow window of achievable clock rates when getting an FPGA to talk to DDR3.
I suspect that's why the author chose to use the DDR for video: it's usually easy to keep plain old SDRAM in lockstep with a soft CPU, since you can run it at anything from walking pace up to 133MHz (sometimes even more), so there's no need to deal with messy, latency-inducing clock domain crossing.
Streaming video data in bursts into a dual-clock FIFO and consuming it on the pixel clock is a much more natural fit.
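As a software analogy of that arrangement (a sketch under assumptions: real designs use the FPGA vendor's dual-clock FIFO primitives, and FIFO_DEPTH here is an invented figure), a single-producer/single-consumer ring buffer captures the idea that the two clock domains only ever meet at the FIFO's read and write pointers:

    #include <inttypes.h>
    #include <stdatomic.h>
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdio.h>

    #define FIFO_DEPTH 1024  /* power of two, so indices wrap with a mask */

    static uint32_t fifo[FIFO_DEPTH];
    static atomic_size_t wr = 0, rd = 0;  /* monotonically increasing counters */

    /* Memory-clock side: push a burst, return how many pixels actually fit. */
    static size_t fifo_push_burst(const uint32_t *px, size_t n)
    {
        size_t w = atomic_load_explicit(&wr, memory_order_relaxed);
        size_t r = atomic_load_explicit(&rd, memory_order_acquire);
        size_t free_slots = FIFO_DEPTH - (w - r);
        if (n > free_slots)
            n = free_slots;
        for (size_t i = 0; i < n; i++)
            fifo[(w + i) & (FIFO_DEPTH - 1)] = px[i];
        atomic_store_explicit(&wr, w + n, memory_order_release);
        return n;
    }

    /* Pixel-clock side: pop exactly one pixel per tick; false on underrun. */
    static bool fifo_pop_pixel(uint32_t *px)
    {
        size_t r = atomic_load_explicit(&rd, memory_order_relaxed);
        size_t w = atomic_load_explicit(&wr, memory_order_acquire);
        if (r == w)
            return false;  /* FIFO ran dry: on screen, a visible glitch */
        *px = fifo[r & (FIFO_DEPTH - 1)];
        atomic_store_explicit(&rd, r + 1, memory_order_release);
        return true;
    }

    int main(void)
    {
        uint32_t burst[4] = {1, 2, 3, 4}, px;
        fifo_push_burst(burst, 4);   /* burst in at the "memory clock" */
        while (fifo_pop_pixel(&px))  /* drain one per "pixel clock" tick */
            printf("pixel %" PRIu32 "\n", px);
        return 0;
    }

The producer can burst at whatever rate the memory side manages, while the consumer drains at a steady pixel rate; as long as the FIFO never runs dry, neither side needs to know the other's clock.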