frontpage.

Nestlé couldn't crack Japan's coffee market. Then they hired a child psychologist

https://twitter.com/BigBrainMkting/status/2019792335509541220
1•rmason•21s ago•0 comments

Notes for February 2-7

https://taoofmac.com/space/notes/2026/02/07/2000
2•rcarmo•1m ago•0 comments

Study confirms experience beats youthful enthusiasm

https://www.theregister.com/2026/02/07/boomers_vs_zoomers_workplace/
1•Willingham•8m ago•0 comments

The Big Hunger by Walter J Miller, Jr. (1952)

https://lauriepenny.substack.com/p/the-big-hunger
1•shervinafshar•10m ago•0 comments

The Genus Amanita

https://www.mushroomexpert.com/amanita.html
1•rolph•14m ago•0 comments

We have broken SHA-1 in practice

https://shattered.io/
1•mooreds•15m ago•1 comments

Ask HN: Was my first management job bad, or is this what management is like?

1•Buttons840•16m ago•0 comments

Ask HN: How to Reduce Time Spent Crimping?

1•pinkmuffinere•17m ago•0 comments

KV Cache Transform Coding for Compact Storage in LLM Inference

https://arxiv.org/abs/2511.01815
1•walterbell•22m ago•0 comments

A quantitative, multimodal wearable bioelectronic device for stress assessment

https://www.nature.com/articles/s41467-025-67747-9
1•PaulHoule•24m ago•0 comments

Why Big Tech Is Throwing Cash into India in Quest for AI Supremacy

https://www.wsj.com/world/india/why-big-tech-is-throwing-cash-into-india-in-quest-for-ai-supremac...
1•saikatsg•24m ago•0 comments

How to shoot yourself in the foot – 2026 edition

https://github.com/aweussom/HowToShootYourselfInTheFoot
1•aweussom•24m ago•0 comments

Eight More Months of Agents

https://crawshaw.io/blog/eight-more-months-of-agents
3•archb•26m ago•0 comments

From Human Thought to Machine Coordination

https://www.psychologytoday.com/us/blog/the-digital-self/202602/from-human-thought-to-machine-coo...
1•walterbell•27m ago•0 comments

The new X API pricing must be a joke

https://developer.x.com/
1•danver0•28m ago•0 comments

Show HN: RMA Dashboard fast SAST results for monorepos (SARIF and triage)

https://rma-dashboard.bukhari-kibuka7.workers.dev/
1•bumahkib7•28m ago•0 comments

Show HN: Source code graphRAG for Java/Kotlin development based on jQAssistant

https://github.com/2015xli/jqassistant-graph-rag
1•artigent•33m ago•0 comments

Python Only Has One Real Competitor

https://mccue.dev/pages/2-6-26-python-competitor
4•dragandj•34m ago•0 comments

Tmux to Zellij (and Back)

https://www.mauriciopoppe.com/notes/tmux-to-zellij/
1•maurizzzio•35m ago•1 comments

Ask HN: How are you using specialized agents to accelerate your work?

1•otterley•37m ago•0 comments

Passing user_id through 6 services? OTel Baggage fixes this

https://signoz.io/blog/otel-baggage/
1•pranay01•37m ago•0 comments

DavMail Pop/IMAP/SMTP/Caldav/Carddav/LDAP Exchange Gateway

https://davmail.sourceforge.net/
1•todsacerdoti•38m ago•0 comments

Visual data modelling in the browser (open source)

https://github.com/sqlmodel/sqlmodel
1•Sean766•40m ago•0 comments

Show HN: Tharos – CLI to find and autofix security bugs using local LLMs

https://github.com/chinonsochikelue/tharos
1•fluantix•41m ago•0 comments

Oddly Simple GUI Programs

https://simonsafar.com/2024/win32_lights/
1•MaximilianEmel•41m ago•0 comments

The New Playbook for Leaders [pdf]

https://www.ibli.com/IBLI%20OnePagers%20The%20Plays%20Summarized.pdf
1•mooreds•41m ago•1 comments

Interactive Unboxing of J Dilla's Donuts

https://donuts20.vercel.app
1•sngahane•43m ago•0 comments

OneCourt helps blind and low-vision fans to track Super Bowl live

https://www.dezeen.com/2026/02/06/onecourt-tactile-device-super-bowl-blind-low-vision-fans/
1•gaws•44m ago•0 comments

Rudolf Vrba

https://en.wikipedia.org/wiki/Rudolf_Vrba
1•mooreds•45m ago•0 comments

Autism Incidence in Girls and Boys May Be Nearly Equal, Study Suggests

https://www.medpagetoday.com/neurology/autism/119747
1•paulpauper•46m ago•0 comments

Why is PS3 emulation so fast: RPCS3 optimizations explained [video]

https://www.youtube.com/watch?v=19ae5Mq2lJE
107•alexjplant•8mo ago

Comments

seam_carver•8mo ago
Happy that RPCS3 has added native Apple silicon support.
snvzz•8mo ago
Hopeful for RISC-V.
leshokunin•8mo ago
What's particularly interesting here is that Sony and IBM spent a billion dollars to make the Cell. It was designed to be completely different from previous console CPUs, even more so than the PS2's "Emotion Engine" combo. So the fact that it's so well emulated and also performant is remarkable!
deaddodo•8mo ago
> It was designed to be completely different from previous console CPUs.

Sure, but they used a PowerPC core with specialized SIMD processors that they then extensively documented:

https://arcb.csc.ncsu.edu/~mueller/cluster/ps3/SDK3.0/docs/a...

If they hadn't done the latter, it would probably have taken a lot longer to reverse engineer. It also would have made it near impossible for developers to code effectively for it, however.

phoe-krk•8mo ago
It's unbelievable that over the course of the PS3's lifespan we've gone from "we will never be able to emulate this at full speed, the hardware is too slow and the Cell architecture too alien" to "why is PS3 emulation so fast, optimizations explained". I've been loosely tracking various emulators' progress, and hats off to the ingenuity behind all of the mechanisms that make it possible to emulate things fast enough.
FirmwareBurner•8mo ago
To be fair, PC CPUs and GPUs have evolved by leaps and bounds from the beginning of PS3 emulation until today.
deaddodo•8mo ago
I don't think anyone with knowledge of emulation (from a research and development side) would say it's impossible. The Cell is a standard PPC core with some well-documented[1] coprocessors.

A more realistic response would be: "computing hardware would not be powerful enough to emulate the PS3 in its lifetime". We're now two decades out from its release, and a decade out from its final phase-out, so it seems that was a fair assessment.

1 - https://arcb.csc.ncsu.edu/~mueller/cluster/ps3/SDK3.0/docs/a...

magic_hamster•8mo ago
Sony learned their lesson from what happened with the PS1, where commercial emulators like Bleem became available during the product's lifetime. It was probably not a huge deal in terms of lost sales, but Sony really didn't like this, as evidenced by their lawsuit (which also failed).

The PS2 with its Emotion Engine was a huge leap that was pretty hard to emulate for a while. And the PS3 was even harder. Yes, the developers hated the Cell architecture, but overall Sony managed to create a pretty good system that spawned incredible games, while also being so hard to emulate that it took over a decade to reach the point where it's done properly, and almost 20 years to reach the point where it's considered really fast.

Compare this to the Switch, which was being emulated pretty well from the get go. This allowed some people to just make do with an emulator instead of buying the console (and the games). Actually this goes for pretty much all Nintendo devices.

glimshe•8mo ago
Sony didn't create the Cell architecture to prevent efficient emulation. At the time, manufacturers tried to get as much performance as possible per manufacturing dollar, under the assumption that developers would optimize their games for the machine. It was actually a partial failure, as few third-party titles made full use of the architecture.
whizzter•8mo ago
Kinda; in many respects the PS3's SPUs that so many hated were just the PS2's VUs taken to the next level, as the programming model was very similar (shuffle blocks of data via DMA to fast vector units).
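
For readers who never touched the hardware, that model looks roughly like the sketch below: DMA a block into the SPU's local store, crunch it with SIMD, DMA the result back out. This is a minimal illustration using the Cell SDK's SPU-side MFC intrinsics; CHUNK_BYTES, the tag number, and the toy math are invented for the example.

    /* Hedged sketch of the canonical SPU pattern, not production code. */
    #include <spu_intrinsics.h>
    #include <spu_mfcio.h>

    #define CHUNK_BYTES 16384   /* hypothetical block size */
    #define TAG 3               /* DMA tag group used for both transfers */

    static vector float buf[CHUNK_BYTES / sizeof(vector float)];

    int main(unsigned long long spe_id, unsigned long long argp,
             unsigned long long envp)
    {
        (void)spe_id; (void)envp;
        unsigned long long ea = argp;               /* main-memory address handed over by the PPU */

        mfc_get(buf, ea, CHUNK_BYTES, TAG, 0, 0);   /* DMA into local store */
        mfc_write_tag_mask(1 << TAG);
        mfc_read_tag_status_all();                  /* wait for the transfer */

        for (unsigned i = 0; i < CHUNK_BYTES / sizeof(vector float); i++)
            buf[i] = spu_mul(buf[i], spu_splats(2.0f));   /* toy SIMD work */

        mfc_put(buf, ea, CHUNK_BYTES, TAG, 0, 0);   /* DMA back to main memory */
        mfc_write_tag_mask(1 << TAG);
        mfc_read_tag_status_all();
        return 0;
    }

The PS2's VUs worked on the same principle, with the VIF/DMAC feeding blocks into VU memory instead of the MFC.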

As a former PS2 developer I mostly thought "cool, VUs with more memory".

Few games really utilized the PS2 to its fullest either (there's a port of GTA3 and GTA:VC to the older Dreamcast that's coming along very well).

The thing that really bit Sony here for the PS3 was that many PS2 titles (The PS2 GTA games being the prime example!) used the Renderware engine (a few others were available but it was the most popular afaik), so the complexity of the PS2 never really hit developers who were making games just below the absolute top tier.

When EA bought up Renderware shortly before the PS3's release, they closed off new sales while honoring only existing licenses, so the most-used cross-platform engine was suddenly off limits to most third parties for the PS3 (IIRC that's why Rockstar released that ping-pong game as an engine test before GTA 4 and 5).

And perceptions of third-party engines also took a hit, so not only was the most popular engine closed off, bigger developers also became wary of relying on third-party engines at all (during the PS3 period) until Unity later took off from indie usage.

pipes•8mo ago
That is really interesting, thanks. I always wondered what happened to Renderware, or why I stopped seeing it after the PS2.
rounce•8mo ago
> Actually this goes for pretty much all Nintendo devices.

Roughly 30 years later and N64 emulation is not fully solved.

mrguyorama•8mo ago
Fully solved how? It's in a great state.

Angrylion brought a new paradigm to n64 emulation, which is "fuck it, Low Level Emulation is fully doable now", and then that incredibly successful work was ported to run as a GPU shader, where it works a million times better! Now even medium powered devices, like the Steam Deck, can run low level emulation of n64 games at upscaled resolution and never run into graphics bugs, have fewer other bugs, have great performance, etc.

Classic bugs like the Perfect Dark remote camera, which always had trouble on high-level emulation plugins, are just gone, no tweaks required. Games that wrote their own microcode run with no trouble. The crazy shit Rare and Factor 5 did at the end of the console's lifecycle just works in emulators.

https://www.libretro.com/index.php/category/parallel-n64/

Notably, Modern Vintage Gamer released a video titled "Why is Nintendo 64 emulation still a broken mess in 2025", and to make that video he had to contrive dumb scenarios: running N64 emulation on the PlayStation Vita and a Raspberry Pi.

Efficient and accurate high-level emulation of the N64 is just not possible. You can't properly emulate the complex interactions going on in the N64 without huge amounts of overhead; it's too interconnected. Angrylion and Parallel-n64 proved that, with that amount of overhead, you might as well do pixel-accurate low-level emulation and eliminate an entire class of emulation bugs. When Angrylion came out, around 2017, even a shitty laptop with a couple of cores could run N64 games pixel-accurate at native resolution and full speed.

In fact, in the above-mentioned video, MVG complains that "N64 emulation is a broken mess" because he is getting 50fps in Conker's Bad Fur Day on a Raspberry Pi, while running it upscaled. He's effectively complaining that N64 emulation is a broken mess because the Raspberry Pi has a garbage GPU. Laptop integrated GPUs, even budget ones, have no problem with parallel-n64.

High-level emulation was always a crutch, and never a good approach for N64 emulation. Even in its heyday, it relied on per-game patches.

Notably, the Dolphin team ended up reaching the same conclusion. The GameCube has a mostly fixed-function graphics pipeline whose state can be updated at any time, a situation that does not translate at all to modern graphics systems, which expect you to have individual shader programs to call with certain materials. What finally solved some serious emulator problems was to write a giant shader that literally emulates the entire GameCube graphics hardware and to use it while waiting for the generated shader to compile. Ubershaders, they call it.
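
As a rough illustration of that idea (this is not Dolphin's code, and the real thing is a large GPU shader rather than C): instead of generating and compiling a new specialized shader every time the game changes the fixed-function state, one big shader treats that state as data and branches on it, so any state combination renders immediately.

    /* Illustrative only: a fixed-function combiner "interpreted" from state
       rather than baked into per-configuration shader programs. The enum,
       struct, and math here are invented for the example. */
    #include <stdint.h>

    typedef enum { STAGE_REPLACE, STAGE_MODULATE, STAGE_ADD } stage_op;

    typedef struct {
        int      num_stages;
        stage_op op[8];          /* set by the emulated game at runtime */
    } combiner_config;

    /* One "ubershader": branches on the configuration instead of being
       specialized for it, so nothing new has to compile on a state change. */
    static uint8_t shade_channel(const combiner_config *cfg,
                                 uint8_t tex, uint8_t vtx)
    {
        unsigned c = vtx;
        for (int i = 0; i < cfg->num_stages; i++) {
            switch (cfg->op[i]) {
            case STAGE_REPLACE:  c = tex;                           break;
            case STAGE_MODULATE: c = (c * tex) / 255;               break;
            case STAGE_ADD:      c = c + tex > 255 ? 255 : c + tex; break;
            }
        }
        return (uint8_t)c;
    }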

pipes•8mo ago
What evidence is there that Sony designed their hardware to be hard to emulate? As an aside: the N64 is hard to emulate, and yet UltraHLE appeared right in the middle of its commercial life.
whizzter•8mo ago
Back in those days we didn't have that many cores, etc., so the raw computational power of the PS3 was an issue in itself, and the SPU was a kind of middle ground between shaders and CPUs, so you probably had to use a regular CPU to emulate it.

We have multicore machines with far more cores today, so we can match the PS3's computation-unit count and performance.

The other part was that older consoles (8- and 16-bit era) really needed a lot of cycle-exact emulation to not fail, and that requires an order-of-magnitude faster CPU to emulate properly; with CPUs hitting their GHz limits around the same time, we thought it'd be impossible to reach the level of cycle-exactness needed.

Luckily though, because the PS3 needed optimal multi-core programming, and the way to achieve maximum throughput was to use DMA channels to shuffle data between the CPU/SPU parts, emulator authors can probably use those DMA transfers as choke points, handle emulation at a slightly higher level, and avoid trying to manage cycle-exact timings.
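
Conceptually (this is not RPCS3 code; every name below is invented for illustration), that choke-point approach looks something like this: each emulated SPU thread runs ahead freely, and only reconciles with the rest of the system when the guest code issues an MFC/DMA command.

    /* Hedged sketch: use DMA commands as synchronization points instead of
       enforcing cycle-exact timing against every other unit. The types and
       helper functions are hypothetical stand-ins for a real emulator core. */
    #include <stdint.h>

    typedef struct spu_thread spu_thread;    /* registers, local store, ... */
    typedef struct mfc_cmd    mfc_cmd;       /* DMA direction, addresses, size */

    mfc_cmd *spu_run_until_mfc_cmd(spu_thread *t);   /* run guest code freely */
    void     system_reconcile(spu_thread *t);        /* sync shared memory/timing */
    void     mfc_perform(spu_thread *t, mfc_cmd *c); /* carry out the transfer */

    void spu_thread_loop(spu_thread *t)
    {
        for (;;) {
            /* No per-cycle bookkeeping against the PPU or other SPUs here. */
            mfc_cmd *cmd = spu_run_until_mfc_cmd(t);

            /* The DMA command is the choke point: only now do we need a
               consistent view of main memory and the other processors. */
            system_reconcile(t);
            mfc_perform(t, cmd);
        }
    }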

ahartmetz•8mo ago
The nice thing about more recent architectures is that no one (including the original authors) can rely on cycle-exactness because of the basically unpredictable effects of caches and speculative execution and bus contention and (...).

Most of these, as a big exception, do not apply to the Cell running code on data in its local memory; but fortunately, as you say, it looks different from the point of view of the other system components.

phendrenad2•8mo ago
I never thought PS3 emulation would be significantly ahead of Xbox 360 emulation.