Why is PS3 emulation so fast: RPCS3 optimizations explained [video]

https://www.youtube.com/watch?v=19ae5Mq2lJE
106•alexjplant•1d ago

Comments

seam_carver•1d ago
Happy that RPCS3 has added native apple silicon support
snvzz•1d ago
Hopeful for RISC-V.
leshokunin•1d ago
What’s particularly interesting here is that Sony and IBM spent a billion dollars to make the Cell. It was designed to be completely different from previous console CPUs. Even more so than the PS2’s “emotion engine” combo. So the fact that it’s so well emulated and also performant is remarkable!
deaddodo•1d ago
> It was designed to be completely different from previous console CPUs.

Sure, but they used a PowerPC core with specialized SIMD processors that they then extensively documented:

https://arcb.csc.ncsu.edu/~mueller/cluster/ps3/SDK3.0/docs/a...

If they hadn't done the latter, it would probably have taken a lot longer to reverse engineer. It also would have made it near impossible for developers to code effectively for it, however.

phoe-krk•1d ago
It's unbelievable that over the whole course of the PS3's lifespan we've gone from "we will never be able to emulate this at full speed, the hardware is too slow and the Cell architecture too alien" to "why is PS3 emulation so fast, optimizations explained". I've been loosely tracking various emulators' progress, and hats off to the ingenuity behind all of the mechanisms that make it possible to emulate things fast enough.
FirmwareBurner•1d ago
To be fair, PC CPUs and GPUs have evolved leaps and bounds from the beginning of PS3 emulation until today.
deaddodo•1d ago
I don't think anyone with knowledge of emulation (from a research and development side) would say it's impossible. The Cell is a standard PPC core with some well-documented[1] coprocessors.

A more realistic response would be: "computing hardware would not be powerful enough to emulate the PS3 in its lifetime". We're now two decades out from its release, and a decade out from its final phase-out, so it seems that was a fair assessment.

1 - https://arcb.csc.ncsu.edu/~mueller/cluster/ps3/SDK3.0/docs/a...

magic_hamster•1d ago
Sony learned their lesson from what happened with the PS1, where commercial emulators like Bleem became available during the product's lifetime. It was probably not a huge deal in terms of lost sales, but Sony really didn't like it, as evidenced by their lawsuit (which also failed).

The PS2 with its Emotion Engine was a huge leap that was pretty hard to emulate for a while, and the PS3 was even harder. Yes, developers hated the Cell architecture, but overall Sony managed to create a pretty good system that spawned incredible games while also being so hard to emulate that it took over a decade to reach the point where it's done properly, and almost 20 years to reach the point where it's considered really fast.

Compare this to the Switch, which was emulated pretty well from the get-go. This allowed some people to just make do with an emulator instead of buying the console (and the games). Actually, this goes for pretty much all Nintendo devices.

glimshe•1d ago
Sony didn't create the Cell architecture to prevent efficient emulation. At the time, manufacturers tried to get as much performance as possible from the manufacturing dollar, under the assumption that developers would optimize their games for the machine. It was actually a partial failure, as few third-party titles made full use of the architecture.
whizzter•1d ago
Kinda. In many respects the PS3 SPUs that many hated were just the PS2 VUs taken to the next level, as the programming model was very similar (shuffle blocks of data via DMA to fast vector units).

As a former PS2 developer I mostly thought "cool, VUs with more memory".
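
For a rough idea of that programming model, here's a minimal, hypothetical SPU kernel in the style of the Cell SDK samples (the mfc_*/spu_* names come from the SDK's spu_mfcio.h and spu_intrinsics.h headers; everything else is made up and not taken from RPCS3): DMA a block from main memory into local store, do SIMD work on it, DMA the result back.

    /* Illustrative sketch only -- sizes, tags and layout are arbitrary. */
    #include <stdint.h>
    #include <spu_intrinsics.h>
    #include <spu_mfcio.h>

    #define CHUNK 4096                      /* bytes per DMA block */
    static float buf[CHUNK / sizeof(float)] __attribute__((aligned(128)));

    void scale_block(uint64_t ea, float k)
    {
        const unsigned tag = 1;

        mfc_get(buf, ea, CHUNK, tag, 0, 0);      /* main memory -> local store */
        mfc_write_tag_mask(1 << tag);
        mfc_read_tag_status_all();               /* wait for the transfer */

        vector float vk = spu_splats(k);
        vector float *v = (vector float *)buf;
        for (unsigned i = 0; i < CHUNK / sizeof(vector float); i++)
            v[i] = spu_mul(v[i], vk);            /* SIMD work on local data only */

        mfc_put(buf, ea, CHUNK, tag, 0, 0);      /* local store -> main memory */
        mfc_write_tag_mask(1 << tag);
        mfc_read_tag_status_all();
    }

The PS2 VU version of this is the same shape: DMA a packet into VU data memory, run the vector micro-program, DMA the result out.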

Few games really utilized the PS2 to its fullest either (there's a port of GTA3 and GTA:VC to the older Dreamcast that's coming along very well).

The thing that really bit Sony for the PS3 was that many PS2 titles (the PS2 GTA games being the prime example!) used the Renderware engine (a few others were available, but it was the most popular AFAIK), so the complexity of the PS2 never really hit developers who were making games just below the absolute top tier.

When EA bought up Renderware slightly before the PS3's release, they closed off new sales while honoring existing licenses only, so the most-used cross-platform engine was suddenly off limits to most third parties for the PS3 (IIRC this is why Rockstar released that ping-pong game as an engine test before GTA 4 and 5).

Perceptions of third-party engines also took a hit, so not only was the most popular engine closed off, bigger developers also became wary of relying on third-party engines at all (during the PS3 period), until Unity later took off from indie usage.

pipes•1d ago
That is really interesting, thanks. I always wondered what happened to Renderware and why I stopped seeing it after the PS2.
rounce•1d ago
> Actually this goes for pretty much all Nintendo devices.

Roughly 30 years later and N64 emulation is not fully solved.

mrguyorama•1d ago
Fully solved how? It's in a great state.

Angrylion brought a new paradigm to n64 emulation, which is "fuck it, Low Level Emulation is fully doable now", and then that incredibly successful work was ported to run as a GPU shader, where it works a million times better! Now even medium powered devices, like the Steam Deck, can run low level emulation of n64 games at upscaled resolution and never run into graphics bugs, have fewer other bugs, have great performance, etc.

Classic bugs like the perfect dark remote camera that always had trouble on High Level Emulator plugins are just gone, no tweaks required. Games that wrote their own microcode run with no trouble. The crazy shit Rare and Factor 5 did at the end of the console's lifecycle just works in emulators.

https://www.libretro.com/index.php/category/parallel-n64/

Notably, Modern Vintage Gamer released a video titled "Why is Nintendo 64 emulation still a broken mess in 2025" and to make that video he had to contrive dumb scenarios: Running n64 emulation on the Playstation Vita and a Raspberry Pi.

Efficient and accurate high level emulation of the n64 is just not possible. You can't properly emulate the complex interactions going on in the n64 without huge amounts of overhead; it's too interconnected. Angrylion and Parallel-n64 proved that, with that amount of overhead, you might as well do pixel-accurate low level emulation and just eliminate an entire class of emulation bugs. When Angrylion came out, around 2017, even a shitty laptop with a couple of cores could run n64 games pixel-accurate at native resolution and full speed.

In fact, on the Raspberry Pi in the above-mentioned video, MVG is complaining that "n64 emulation is a broken mess" because he is getting 50fps in Conker's Bad Fur Day, because he is running it upscaled. He's complaining that "n64 emulation is a broken mess" because the Raspberry Pi has a garbage GPU. Laptop integrated GPUs, even budget laptop integrated GPUs, have no problems with parallel-n64.

High level emulation was always a crutch, and never a good approach for n64 emulation. Even in its heyday, it relied on per-game patches.

Notably, the Dolphin team ended up finding the same reality. The GameCube has a mostly fixed-function graphics pipeline whose state can be updated at any time, a situation that does not translate at all to modern graphics systems that expect individual shader programs to call for certain materials. What finally solved some serious emulator problems there was to write a giant shader that literally emulates the entire GameCube graphics hardware and use it while you wait for the specialized shader to compile. Ubershaders, they call it.
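
Roughly, the dispatch side of that idea looks like this (an illustrative sketch only, not Dolphin's actual code; all names here are made up): draws use the generic "interpret the fixed-function state" ubershader until the specialized shader for that exact pipeline configuration has finished compiling on a background thread.

    // Illustrative sketch of "ubershader as fallback while specializing" -- not Dolphin code.
    #include <chrono>
    #include <cstdint>
    #include <future>
    #include <unordered_map>

    struct PipelineConfig { uint64_t key; };   // packed TEV/blend/etc. state (hypothetical)
    struct Shader {};                          // stand-in for a compiled GPU program

    class ShaderCache {
    public:
        // Returns the specialized shader if it has finished compiling;
        // otherwise kicks off (or polls) compilation and falls back to the ubershader.
        const Shader& get(const PipelineConfig& cfg) {
            auto done = ready_.find(cfg.key);
            if (done != ready_.end())
                return done->second;

            auto pending = pending_.find(cfg.key);
            if (pending == pending_.end()) {
                pending_[cfg.key] = std::async(std::launch::async,
                                               [cfg] { return compileSpecialized(cfg); });
            } else if (pending->second.wait_for(std::chrono::seconds(0)) ==
                       std::future_status::ready) {
                ready_[cfg.key] = pending->second.get();
                pending_.erase(pending);
                return ready_[cfg.key];
            }
            return ubershader_;   // slow but correct: no compile stutter on new configs
        }

    private:
        // Placeholder: would generate and compile a shader for this exact config.
        static Shader compileSpecialized(const PipelineConfig&) { return Shader{}; }

        Shader ubershader_;   // the giant shader that emulates the whole pipeline
        std::unordered_map<uint64_t, Shader> ready_;
        std::unordered_map<uint64_t, std::future<Shader>> pending_;
    };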

pipes•1d ago
What evidence is there that Sony designed their hardware to be hard to emulate? As an aside: the n64 is hard to emulate, and yet UltraHLE appeared right in the middle of its commercial life.
whizzter•1d ago
Back in those days we didn't have that many cores, etc., so the raw computational power of the PS3 was an issue in itself, and the SPU was a kind of middle ground between shaders and CPUs, so you probably had to use a regular CPU to emulate it.

We have multicore machines with far more cores today so we can match the computation unit count and performance.

The other part was that older consoles (the 8- and 16-bit era) really needed a lot of cycle-exact emulation to not fail, and that requires an order-of-magnitude faster CPU to emulate properly; with CPUs hitting GHz limits around the same time, we thought it'd be impossible to reach the level of cycle-exactness needed.

Luckily though, because the PS3 needed optimal multi-core programming, and the way to achieve maximum throughput was to use DMA channels to shuffle data between the CPU/SPU parts, emulator authors can probably use those transfers as choke-points, handle emulation at a slightly higher level, and avoid trying to manage cycle-exact timings.
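
Purely as an illustration of that idea (the structure and names below are mine, not RPCS3's): each emulated SPU runs recompiled code freely, with no cycle counting, and only synchronizes with the rest of the machine when the guest issues a DMA command, because that is the moment its data becomes visible to other units.

    // Illustrative sketch: DMA commands as the synchronization choke-point.
    #include <cstdint>
    #include <cstring>
    #include <vector>

    struct DmaCommand {
        uint64_t effective_addr;   // address in emulated main memory
        uint32_t local_addr;       // offset into the SPU's 256 KiB local store
        uint32_t size;
        bool     to_main_memory;   // put vs. get
    };

    class EmulatedSPU {
    public:
        explicit EmulatedSPU(std::vector<uint8_t>& main_memory)
            : main_memory_(main_memory), local_store_(256 * 1024) {}

        void runOnce() {
            // Run recompiled SPU code at full speed until the guest issues an
            // MFC command; only then touch state shared with other units.
            DmaCommand cmd = runUntilNextDmaRequest();
            uint8_t* ls  = local_store_.data() + cmd.local_addr;
            uint8_t* ram = main_memory_.data() + cmd.effective_addr;
            if (cmd.to_main_memory)
                std::memcpy(ram, ls, cmd.size);    // emulated "put"
            else
                std::memcpy(ls, ram, cmd.size);    // emulated "get"
            // Tag completion and the like would be signalled here; between DMA
            // commands the local store is private, so nothing else can observe
            // intermediate timing and no cycle-exact interleaving is needed.
        }

    private:
        DmaCommand runUntilNextDmaRequest() { return {}; }  // placeholder for the recompiler
        std::vector<uint8_t>& main_memory_;
        std::vector<uint8_t>  local_store_;
    };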

ahartmetz•1d ago
The nice thing about more recent architectures is that no one (including the original authors) can rely on cycle-exactness because of the basically unpredictable effects of caches and speculative execution and bus contention and (...).

The Cell running code on data in its local memory is a big exception where most of these don't apply, but fortunately, as you say, it still looks different from the perspective of other system components.

A short history of Greenland, in six maps

https://www.economist.com/graphic-detail/2025/06/04/a-short-history-of-greenland-in-six-maps
1•bookofjoe•2m ago•1 comments

Don't Settle for Mediocre Front End Testing

https://blog.thinkst.com/2025/06/dont-settle-for-mediocre-frontend-testing-build-stable-reliable-systems-instead.html
1•mslaviero•3m ago•0 comments

CEO Sundar Pichai says Google to keep hiring engineers

https://timesofindia.indiatimes.com/technology/tech-news/ceo-sundar-pichai-says-google-to-keep-hiring-engineers-because-/articleshow/121647784.cms
2•msolujic•3m ago•0 comments

Facet: Reflection for Rust

https://www.youtube.com/watch?v=0mqFCqw_XvI
1•todsacerdoti•6m ago•0 comments

Ask HN: Is GPU nondeterminism bad for AI?

1•ramity•7m ago•0 comments

Discord's CTO Is Just as Worried About Enshittification as You Are

https://www.engadget.com/gaming/discords-cto-is-just-as-worried-about-enshittification-as-you-are-173049834.html
2•m463•8m ago•0 comments

What LLMs Don't Talk About: Empirical Study of Moderation & Censorship Practice

https://arxiv.org/abs/2504.03803
2•superpupervlad•13m ago•0 comments

Ask HN: Should movie theaters allow you to watch movies in 30 minute chunks?

1•amichail•14m ago•3 comments

Soviet Radio Manufacturer Logos

http://oldradio.ru/logos/index.shtml
4•NaOH•21m ago•0 comments

Vapor: Swift, but on a Server

https://vapor.codes
2•nateb2022•23m ago•0 comments

$300 Ukrainian drones vs. $100M Russian bombers

https://www.gzeromedia.com/300-ukrainian-drones-vs-100-million-russian-bombers
6•WillDaSilva•23m ago•0 comments

Show HN: YOYO – AI Version Control for Vibe Coding

https://www.runyoyo.com/
1•itgelganbold•24m ago•0 comments

Trump and Musk enter bitter feud – and Washington buckles up

https://www.bbc.co.uk/news/articles/c3wd2215q08o
5•mellosouls•26m ago•1 comments

Musk: SpaceX will ground Dragon spacecraft used to shuttle astronauts to ISS

https://thehill.com/business/5335638-musk-spacex-will-ground-spacecraft-used-to-shuttle-astronauts-cargo-to-iss/
6•ilamont•27m ago•0 comments

Tokasaurus: An LLM Inference Engine for High-Throughput Workloads

https://scalingintelligence.stanford.edu/blogs/tokasaurus/
18•rsehrlich•27m ago•0 comments

Technical Interviews in the Age of LLMs

https://www.fractional.ai/blog/technical-interviews-in-the-age-of-llms
3•StriverGuy•29m ago•0 comments

APL Interpreter – An implementation of APL, written in Haskell (2024)

https://scharenbroch.dev/projects/apl-interpreter/
12•ofalkaed•32m ago•0 comments

Ask HN: Validating a Tool to Help Founders Stay Focused and Build What Matters

1•mmarvramm•33m ago•1 comments

U.S. Research Stock Returns Data

https://mba.tuck.dartmouth.edu/pages/faculty/ken.french/data_library.html
1•Bluestein•34m ago•0 comments

Meta Advertising Manual

https://proxima-wiki.notion.site/meta-advertising-manual-q2-2025
1•handfuloflight•34m ago•0 comments

Remote Development with X2Go

https://reemus.dev/article/jetbrains-remote-development-with-x2go
1•indigodaddy•35m ago•0 comments

Intel: New products must deliver 50% gross profit to get the green light

https://www.tomshardware.com/tech-industry/semiconductors/intel-draws-a-line-in-the-sand-to-boost-gross-margins-new-products-must-deliver-50-percent-to-get-the-green-light
3•Scramblejams•37m ago•3 comments

I made a list of free stuff for college hackers

https://www.buildincollege.com
2•createdbymason•37m ago•0 comments

Why Texas Won't Force Companies to Use E-Verify for Employment Authorization

https://www.texastribune.org/2025/06/05/texas-e-verify-requirements-immigration/
5•hn_acker•44m ago•3 comments

The Rarest Signature [video]

https://www.youtube.com/watch?v=aFHxsS6uv5g
1•Bluestein•46m ago•0 comments

600 years before Europeans arrived, Great Lakes farmers transformed the land

https://www.science.org/content/article/600-years-europeans-arrived-great-lakes-farmers-transformed-land
2•rbanffy•47m ago•0 comments

Measuring the elastic properties of the Gibeon meteorite using laser ultrasound

https://www.sciencedirect.com/science/article/pii/S1359646225001290
1•PaulHoule•47m ago•0 comments

A curated list of available fantasy consoles/computers

https://github.com/paladin-t/fantasy
2•90s_dev•47m ago•2 comments

AWS Plunks Down $10B for Datacenters in North Carolina

https://www.nextplatform.com/2025/06/05/aws-plunks-down-10-billion-for-datacenters-in-north-carolina/
3•rbanffy•48m ago•0 comments

A private company wants to build a city on the moon

https://abcnews.go.com/US/private-company-build-city-moon-land-probe/story?id=122515680
2•domofutu•48m ago•1 comments