frontpage.

Cook New Emojis

https://emoji.supply/kitchen/
1•vasanthv•30s ago•0 comments

Show HN: LoKey Typer – A calm typing practice app with ambient soundscapes

https://mcp-tool-shop-org.github.io/LoKey-Typer/
1•mikeyfrilot•3m ago•0 comments

Long-Sought Proof Tames Some of Math's Unruliest Equations

https://www.quantamagazine.org/long-sought-proof-tames-some-of-maths-unruliest-equations-20260206/
1•asplake•4m ago•0 comments

Hacking the last Z80 computer – FOSDEM 2026 [video]

https://fosdem.org/2026/schedule/event/FEHLHY-hacking_the_last_z80_computer_ever_made/
1•michalpleban•4m ago•0 comments

Browser-use for Node.js v0.2.0: TS AI browser automation parity with PY v0.5.11

https://github.com/webllm/browser-use
1•unadlib•5m ago•0 comments

Michael Pollan Says Humanity Is About to Undergo a Revolutionary Change

https://www.nytimes.com/2026/02/07/magazine/michael-pollan-interview.html
1•mitchbob•5m ago•1 comments

Software Engineering Is Back

https://blog.alaindichiappari.dev/p/software-engineering-is-back
1•alainrk•6m ago•0 comments

Storyship: Turn Screen Recordings into Professional Demos

https://storyship.app/
1•JohnsonZou6523•7m ago•0 comments

Reputation Scores for GitHub Accounts

https://shkspr.mobi/blog/2026/02/reputation-scores-for-github-accounts/
1•edent•10m ago•0 comments

A BSOD for All Seasons – Send Bad News via a Kernel Panic

https://bsod-fas.pages.dev/
1•keepamovin•14m ago•0 comments

Show HN: I got tired of copy-pasting between Claude windows, so I built Orcha

https://orcha.nl
1•buildingwdavid•14m ago•0 comments

Omarchy First Impressions

https://brianlovin.com/writing/omarchy-first-impressions-CEEstJk
2•tosh•19m ago•1 comments

Reinforcement Learning from Human Feedback

https://arxiv.org/abs/2504.12501
2•onurkanbkrc•20m ago•0 comments

Show HN: Versor – The "Unbending" Paradigm for Geometric Deep Learning

https://github.com/Concode0/Versor
1•concode0•20m ago•1 comments

Show HN: HypothesisHub – An open API where AI agents collaborate on medical res

https://medresearch-ai.org/hypotheses-hub/
1•panossk•23m ago•0 comments

Big Tech vs. OpenClaw

https://www.jakequist.com/thoughts/big-tech-vs-openclaw/
1•headalgorithm•26m ago•0 comments

Anofox Forecast

https://anofox.com/docs/forecast/
1•marklit•26m ago•0 comments

Ask HN: How do you figure out where data lives across 100 microservices?

1•doodledood•26m ago•0 comments

Motus: A Unified Latent Action World Model

https://arxiv.org/abs/2512.13030
1•mnming•26m ago•0 comments

Rotten Tomatoes Desperately Claims 'Impossible' Rating for 'Melania' Is Real

https://www.thedailybeast.com/obsessed/rotten-tomatoes-desperately-claims-impossible-rating-for-m...
3•juujian•28m ago•2 comments

The protein denitrosylase SCoR2 regulates lipogenesis and fat storage [pdf]

https://www.science.org/doi/10.1126/scisignal.adv0660
1•thunderbong•30m ago•0 comments

Los Alamos Primer

https://blog.szczepan.org/blog/los-alamos-primer/
1•alkyon•32m ago•0 comments

NewASM Virtual Machine

https://github.com/bracesoftware/newasm
2•DEntisT_•35m ago•0 comments

Terminal-Bench 2.0 Leaderboard

https://www.tbench.ai/leaderboard/terminal-bench/2.0
2•tosh•35m ago•0 comments

I vibe coded a BBS bank with a real working ledger

https://mini-ledger.exe.xyz/
1•simonvc•35m ago•1 comments

The Path to Mojo 1.0

https://www.modular.com/blog/the-path-to-mojo-1-0
1•tosh•38m ago•0 comments

Show HN: I'm 75, building an OSS Virtual Protest Protocol for digital activism

https://github.com/voice-of-japan/Virtual-Protest-Protocol/blob/main/README.md
5•sakanakana00•41m ago•1 comments

Show HN: I built Divvy to split restaurant bills from a photo

https://divvyai.app/
3•pieterdy•44m ago•0 comments

Hot Reloading in Rust? Subsecond and Dioxus to the Rescue

https://codethoughts.io/posts/2026-02-07-rust-hot-reloading/
4•Tehnix•44m ago•1 comments

Skim – vibe review your PRs

https://github.com/Haizzz/skim
2•haizzz•46m ago•1 comments

Why is PS3 emulation so fast: RPCS3 optimizations explained [video]

https://www.youtube.com/watch?v=19ae5Mq2lJE
107•alexjplant•8mo ago

Comments

seam_carver•8mo ago
Happy that RPCS3 has added native Apple Silicon support
snvzz•8mo ago
Hopeful for RISC-V.
leshokunin•8mo ago
What’s particularly interesting here is that Sony and IBM spent a billion dollars to make the Cell. It was designed to be completely different from previous console CPUs, even more so than the PS2’s “Emotion Engine” combo. So the fact that it’s so well emulated and also performant is remarkable!
deaddodo•8mo ago
> It was designed to be completely different from previous console CPUs.

Sure, but they used a PowerPC core with specialized SIMD processors that they then extensively documented:

https://arcb.csc.ncsu.edu/~mueller/cluster/ps3/SDK3.0/docs/a...

If they hadn't done the latter, it would probably have taken a lot longer to reverse engineer. It also would have made it near impossible for developers to code effectively for it, however.
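The programming model those docs describe is explicit: each coprocessor works only on a small local store and cannot touch main memory directly, so code DMAs a block in, crunches it, and DMAs the result back out. A rough plain-Python analogy (illustrative names only, not real SPU intrinsics; the size check counts elements rather than bytes for simplicity):

```python
# Rough analogy of the Cell SPU model: a coprocessor cannot touch main
# memory directly; it DMAs a block into its small local store, crunches
# it with wide SIMD ops, and DMAs the result back out.

LOCAL_STORE_SIZE = 256 * 1024  # each SPU had 256 KB of local store

def spu_kernel(block):
    """Stand-in for a SIMD kernel: scale a block of floats."""
    return [x * 2.0 for x in block]

def run_on_spu(main_memory, offset, length):
    # Simplified: counting elements, not bytes.
    if length > LOCAL_STORE_SIZE:
        raise ValueError("block must fit in local store")
    local = main_memory[offset:offset + length]      # dma_get
    local = spu_kernel(local)                        # compute on local store
    main_memory[offset:offset + length] = local      # dma_put

ram = [1.0, 2.0, 3.0, 4.0]
run_on_spu(ram, 0, 4)
print(ram)  # [2.0, 4.0, 6.0, 8.0]
```

The point of the pattern is that all data movement is explicit and coarse-grained, which is exactly what made the model well-documented and tractable to reason about.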

phoe-krk•8mo ago
It's unbelievable that over the whole course of the PS3's lifespan we've gone from "we will never be able to emulate this at full speed, the hardware is too slow and the Cell architecture too alien" to "why is PS3 emulation so fast, optimizations explained". I've been loosely tracking various emulators' progress, and hats off to the ingenuity behind all of the mechanisms that make it possible to emulate things fast enough.
FirmwareBurner•8mo ago
To be fair, PC CPUs and GPUs have evolved leaps and bounds from the beginning of PS3 emulation till today.
deaddodo•8mo ago
I don't think anyone with knowledge of emulation (from a research and development side) would say it's impossible. The Cell is a standard PPC core with some well-documented[1] coprocessors.

A more realistic response would be: "computing hardware would not be powerful enough to emulate the PS3 in its lifetime". We're now two decades out from its release, and a decade out from its final phase-out, so it seems that was a fair assessment.

1 - https://arcb.csc.ncsu.edu/~mueller/cluster/ps3/SDK3.0/docs/a...

magic_hamster•8mo ago
Sony learned their lesson from what happened with the PS1, where commercial emulators like Bleem became available during the product's lifetime. It was probably not a huge deal in terms of lost sales, but Sony really didn't like it, as evidenced by their lawsuit (which also failed).

The PS2 with its Emotion Engine was a huge leap that was pretty hard to emulate for a while, and the PS3 was even harder. Yes, developers hated the Cell architecture, but overall Sony managed to create a pretty good system which spawned incredible games, while also being so hard to emulate that it took over a decade to reach the point where it's done properly, and almost 20 years to reach the point where it's considered really fast.

Compare this to the Switch, which was emulated pretty well from the get-go. This allowed some people to just make do with an emulator instead of buying the console (and the games). Actually, this goes for pretty much all Nintendo devices.

glimshe•8mo ago
Sony didn't create the Cell architecture to prevent efficient emulation. At the time, manufacturers tried to get as much performance as possible per manufacturing dollar, under the assumption that developers would optimize their games for the machine. It was actually a partial failure, as few third-party titles made full use of the architecture.
whizzter•8mo ago
Kinda; in many respects the PS3 SPUs that many hated were just the PS2 VUs taken to the next level, as the programming model was very similar (shuffle blocks of data via DMA to fast vector units).

As a former PS2 developer I mostly thought "cool, VUs with more memory".

Few games really utilized the PS2 to its fullest either (there's a port of GTA3 and GTA:VC to the older Dreamcast that's coming along very well).

The thing that really bit Sony here for the PS3 was that many PS2 titles (the PS2 GTA games being the prime example!) used the Renderware engine (a few others were available, but it was the most popular afaik), so the complexity of the PS2 never really hit developers who were making games just below the absolute top tier.

When EA bought up Renderware shortly before the PS3's release, they stopped selling new licenses while honoring existing ones only, so the most-used cross-platform engine was suddenly off limits to most third parties for the PS3 (iirc this is why Rockstar released that ping-pong game as an engine test before GTA4 and 5).

And perceptions of third-party engines also took a hit: not only was the most popular engine closed off, bigger developers also became wary of relying on third-party engines at all (during the PS3 period), until Unity later took off from indie usage.

pipes•8mo ago
That is really interesting, thanks. I always wondered what happened to Renderware and why I stopped seeing it after the PS2.
rounce•8mo ago
> Actually this goes for pretty much all Nintendo devices.

Roughly 30 years later and N64 emulation is not fully solved.

mrguyorama•8mo ago
Fully solved how? It's in a great state.

Angrylion brought a new paradigm to n64 emulation, which is "fuck it, Low Level Emulation is fully doable now", and then that incredibly successful work was ported to run as a GPU shader, where it works a million times better! Now even medium powered devices, like the Steam Deck, can run low level emulation of n64 games at upscaled resolution and never run into graphics bugs, have fewer other bugs, have great performance, etc.

Classic bugs like the Perfect Dark remote camera that always had trouble on high level emulator plugins are just gone, no tweaks required. Games that wrote their own microcode run with no trouble. The crazy shit Rare and Factor 5 did at the end of the console's lifecycle just works in emulators.

https://www.libretro.com/index.php/category/parallel-n64/

Notably, Modern Vintage Gamer released a video titled "Why is Nintendo 64 emulation still a broken mess in 2025", and to make that video he had to contrive dumb scenarios: running n64 emulation on the PlayStation Vita and a Raspberry Pi.

Efficient and accurate high level emulation of the n64 is just not possible. You can't properly emulate the complex interactions going on in the n64 without huge amounts of overhead; it's too interconnected. Angrylion and parallel-n64 proved that, with that amount of overhead, you might as well do pixel-accurate low level emulation and eliminate an entire class of emulation bugs. When Angrylion came out, around 2017, even a shitty laptop with a couple of cores could run n64 games pixel-accurate at native resolution and full speed.
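The HLE-vs-LLE trade-off can be caricatured in a few lines. This sketch uses a made-up command set, and the behavior attached to the microcode name is purely hypothetical: HLE pattern-matches whole known routines and needs a patch for anything else, while LLE executes every primitive command, so custom microcode just works.

```python
# Caricature of the HLE-vs-LLE trade-off. HLE recognizes whole known
# routines and substitutes a native fast path; anything unrecognized
# needs a per-game patch. LLE interprets each primitive command, so
# custom microcode works without special cases.

KNOWN_MICROCODE = {"F3DEX2": lambda args: sum(args)}  # hypothetical behavior

def hle(routine, args):
    if routine not in KNOWN_MICROCODE:
        raise NotImplementedError("needs a per-game patch for " + routine)
    return KNOWN_MICROCODE[routine](args)

def lle(commands):
    # Slower: executes every low-level op, but nothing is special-cased.
    acc = 0
    for op, val in commands:
        if op == "ADD":
            acc += val
    return acc

print(hle("F3DEX2", [1, 2, 3]))                    # 6
print(lle([("ADD", 1), ("ADD", 2), ("ADD", 3)]))   # 6
```

A game shipping its own microcode hits the `NotImplementedError` path in the HLE sketch, which is the per-game-patch treadmill; the LLE path never needs to know whose microcode it is running.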

In fact, on the Raspberry Pi that MVG complains about in the above-mentioned video, he says "n64 emulation is a broken mess" because he is getting 50fps in Conker's Bad Fur Day, and only because he is running it upscaled. He's complaining that "n64 emulation is a broken mess" when the real problem is that the Raspberry Pi has a garbage GPU. Laptop integrated GPUs, even budget ones, have no problems with parallel-n64.

High level emulation was always a crutch, and never a good approach for n64 emulation. Even in its heyday, it relied on per-game patches.

Notably, the Dolphin team ended up finding the same reality. The GameCube has a mostly fixed-function graphics pipeline whose state can be updated at any time, a situation that does not translate at all to modern graphics APIs, which expect you to have individual shader programs to call with certain materials. What finally solved some serious emulator problems there was to write a giant shader that literally emulates the entire GameCube graphics hardware, and to use it while waiting for the specialized emulated shader to compile. They call it ubershaders.
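Stripped of graphics details, the ubershader trick is a scheduling pattern: always have a slow, generic interpreter of the pipeline state available, and swap in a specialized compiled version once a background compile finishes. A toy sketch of just that pattern (a thread stands in for the driver's shader compiler; all names are illustrative):

```python
import threading
import time

compiled = {}  # pipeline-state key -> specialized fast path

def generic_path(state, x):
    # "Ubershader": interprets the pipeline state on every call.
    # Always correct and always available, just slower.
    return x * state["scale"] + state["bias"]

def compile_specialized(key, state):
    time.sleep(0.01)  # stand-in for a slow driver shader compile
    scale, bias = state["scale"], state["bias"]
    compiled[key] = lambda x: x * scale + bias  # constants baked in

def draw(state, x):
    key = (state["scale"], state["bias"])
    fast = compiled.get(key)
    if fast is not None:
        return fast(x)              # specialized version is ready
    threading.Thread(target=compile_specialized, args=(key, state)).start()
    return generic_path(state, x)   # no stutter: fall back to the generic path

state = {"scale": 2, "bias": 1}
print(draw(state, 10))  # 21, via the generic path
time.sleep(0.05)
print(draw(state, 10))  # 21, via the specialized path
```

Both paths return the same result; the only thing that changes over time is which one answers, which is why the technique eliminates compile-time stutter without visual glitches.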

pipes•8mo ago
What evidence is there that Sony designed their hardware to be hard to emulate? As an aside: the N64 is hard to emulate, and yet UltraHLE appeared right in the middle of its commercial life.
whizzter•8mo ago
Back in those days we didn't have that many cores, etc., so the raw computational power needed for the PS3 was an issue in itself, and the SPU was a kind of middle ground between shaders and CPUs, so you probably had to use a regular CPU to emulate it.

We have multicore machines with far more cores today, so we can match the computation unit count and performance.

The other part was that older consoles (the 8- and 16-bit era) really needed a lot of cycle-exact emulation not to fail, and that requires an order-of-magnitude faster CPU to manage properly; with CPUs hitting GHz limits around the same time, we thought the level of cycle-exactness needed would be impossible to achieve.

Luckily, because the PS3 needed optimal multi-core programming, and the way to achieve maximum throughput was to use DMA channels to shuffle data between the CPU and SPUs, emulator authors can probably use those transfers as choke-points to handle emulation at a slightly higher level and avoid trying to manage cycle-exact timings.
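That choke-point idea can be sketched as: let each emulated unit run a whole chunk of work independently, and reconcile shared state only at DMA events, where correct PS3 code had to synchronize anyway. A toy illustration (not RPCS3's actual scheduler; the "units" and their work are made up):

```python
# Instead of lock-stepping every cycle, each emulated unit runs a whole
# chunk of work on its own; state is exchanged only at DMA events,
# where real PS3 code had to synchronize anyway.

def run_until_dma(unit):
    # Stand-in for a JIT/interpreter running freely between DMA ops.
    unit["acc"] += sum(unit["work"])
    return ("dma_put", unit["acc"])

def emulate(units):
    shared_memory = []
    for unit in units:                   # no per-cycle interleaving needed
        event, value = run_until_dma(unit)
        if event == "dma_put":
            shared_memory.append(value)  # reconcile only at the choke-point
    return shared_memory

spus = [{"acc": 0, "work": [1, 2]}, {"acc": 0, "work": [3, 4]}]
print(emulate(spus))  # [3, 7]
```

Because nothing outside a unit can observe its state between DMA ops, the emulator is free to run each chunk at whatever speed the host allows, which is a large part of why cycle-exact timing can be skipped.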

ahartmetz•8mo ago
The nice thing about more recent architectures is that no one (including the original authors) can rely on cycle-exactness because of the basically unpredictable effects of caches and speculative execution and bus contention and (...).

Most of these, as a big exception, do not apply to the Cell running code on data in its local memory, but fortunately, it's different as seen from other system components, as you say.

phendrenad2•8mo ago
I never thought PS3 emulation would be significantly ahead of Xbox 360 emulation.