
Rob Pike's 5 Rules of Programming

https://www.cs.unc.edu/~stotts/COMP590-059-f24/robsrules.html
98•vismit2000•1h ago•45 comments

OpenAI Has New Focus (On the IPO)

https://om.co/2026/03/17/openai-has-new-focus-on-the-ipo/
25•aamederen•50m ago•20 comments

JPEG Compression

https://www.sophielwang.com/blog/jpeg
235•vinhnx•4d ago•47 comments

Write up of my homebrew CPU build

https://willwarren.com/2026/03/12/building-my-own-cpu-part-3-from-simulation-to-hardware/
89•wwarren•2d ago•13 comments

Mistral AI Releases Forge

https://mistral.ai/news/forge
509•pember•14h ago•114 comments

How the Eon Team Produced a Virtual Embodied Fly

https://eon.systems/updates/embodied-brain-emulation
25•LopRabbit•2d ago•6 comments

A Decade of Slug

https://terathon.com/blog/decade-slug.html
632•mwkaufma•16h ago•62 comments

Nightingale – open-source karaoke app that works with any song on your computer

https://nightingale.cafe/
62•rzzzzru•3h ago•11 comments

Microsoft's 'unhackable' Xbox One has been hacked by 'Bliss'

https://www.tomshardware.com/video-games/console-gaming/microsofts-unhackable-xbox-one-has-been-h...
702•crtasm•20h ago•255 comments

Celebrating Tony Hoare's mark on computer science

https://bertrandmeyer.com/2026/03/16/celebrating-tony-hoares-mark-on-computer-science/
41•benhoyt•5h ago•3 comments

Python 3.15's JIT is now back on track

https://fidget-spinner.github.io/posts/jit-on-track.html
391•guidoiaquinti•17h ago•217 comments

Show HN: Pgit – A Git-like CLI backed by PostgreSQL

https://oseifert.ch/blog/building-pgit
54•ImGajeed76•1d ago•19 comments

More than 135 open hardware devices flashable with your own firmware

https://openhardware.directory
244•iosifnicolae2•4d ago•29 comments

(Media over QUIC) on a Boat

https://moq.dev/blog/on-a-boat/
23•mmcclure•4d ago•2 comments

Ndea (YC W26) is hiring a symbolic RL search guidance lead

https://ndea.com/jobs/search-guidance
1•mikeknoop•4h ago

The pleasures of poor product design

https://www.inconspicuous.info/p/the-pleasures-of-poor-product-design
138•NaOH•10h ago•47 comments

Ask HN: What breaks first when your team grows from 10 to 50 people?

21•hariprasadr•3d ago•18 comments

Get Shit Done: A meta-prompting, context engineering and spec-driven dev system

https://github.com/gsd-build/get-shit-done
352•stefankuehnel•15h ago•177 comments

Judge orders restoration of Voice of America

https://apnews.com/article/voice-of-america-kari-lake-trump-cd6d1ef05272f842705da0ed38d3de24
13•geox•42m ago•2 comments

Show HN: Sub-millisecond VM sandboxes using CoW memory forking

https://github.com/adammiribyan/zeroboot
179•adammiribyan•21h ago•46 comments

Have a fucking website

https://www.otherstrangeness.com/2026/03/14/have-a-fucking-website/
474•asukachikaru•7h ago•258 comments

Unsloth Studio

https://unsloth.ai/docs/new/studio
298•brainless•20h ago•58 comments

Animation 10k Starlink Satellites

https://spaceweather.com/archive.php?view=1&day=18&month=03&year=2026
13•MeteorMarc•4h ago•18 comments

Honda is killing its EVs

https://techcrunch.com/2026/03/14/honda-is-killing-its-evs-and-any-chance-of-competing-in-the-fut...
340•sylvainkalache•2d ago•752 comments

A tale about fixing eBPF spinlock issues in the Linux kernel

https://rovarma.com/articles/a-tale-about-fixing-ebpf-spinlock-issues-in-the-linux-kernel/
109•y1n0•10h ago•8 comments

Leviathan (1651)

https://www.gutenberg.org/files/3207/3207-h/3207-h.htm
87•mrwh•3d ago•26 comments

Why AI systems don't learn – On autonomous learning from cognitive science

https://arxiv.org/abs/2603.15381
127•aanet•13h ago•66 comments

Forget Flags and Scripts: Just Rename the File

https://robertsdotpm.github.io/software_engineering/program_names_as_input.html
41•Uptrenda•7h ago•39 comments

It Took Me 30 Years to Solve This VFX Problem – Green Screen Problem [video]

https://www.youtube.com/watch?v=3Ploi723hg4
252•yincrash•4d ago•99 comments

Electron microscopy shows ‘mouse bite’ defects in semiconductors

https://news.cornell.edu/stories/2026/03/electron-microscopy-shows-mouse-bite-defects-semiconductors
80•hhs•4d ago•20 comments

Doom GPU Flame Graphs

https://www.brendangregg.com/blog/2025-05-01/doom-gpu-flame-graphs.html
107•zdw•10mo ago

Comments

forrestthewoods•10mo ago
Neat.

I’ll be honest, I kinda don’t get flame graphs. I mean I understand what they are. I have just always strictly preferred a proper timeline view à la Superluminal or Tracy.

Using 20ms chunks for a game is fine but also super weird. Ain’t no game using 20ms frames! So if you were using this for real you’d get all kinds of oddities. Just give me a timeline and call it a day plz.

tibbar•10mo ago
Flame graphs are definitely less sophisticated than Superluminal/Tracy/etc, but that's part of the attraction - you can visualize the output of many profiling tools as a flamegraph without prior setup. I also think it's a pretty good UX for the "which function is the performance bottleneck" game.
Veserv•10mo ago
The difference between a flame graph and a trace visualization is that a flame graph is an aggregate/summary visualization. It helps visualize total runtime attributed to functions.

It is like the difference between seeing the mean of a distribution and seeing a plot of every datapoint in the distribution. They are useful for different purposes.

An example of how you might use it in conjunction with a trace visualizer is that you would select a time span in a trace and generate a flame graph for the selection. This would show you which functions and call stacks were responsible for most of the execution time in the selection. You would then use that to find one of those call stacks in the trace to examine how they execute to see if it makes sense.
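The aggregation step described above, collapsing many stack samples into per-call-stack totals, can be sketched in a few lines of Python. This is a minimal illustration of the idea, not any particular profiler's code, and the sample stacks are hypothetical:

```python
from collections import Counter

def fold_stacks(samples):
    """Collapse a list of call stacks (root -> leaf) into
    flame-graph 'folded' lines: 'root;child;leaf count'."""
    counts = Counter(";".join(stack) for stack in samples)
    return [f"{stack} {n}" for stack, n in counts.most_common()]

# Hypothetical stack samples as a profiler might capture them.
samples = [
    ["main", "render", "draw_walls"],
    ["main", "render", "draw_walls"],
    ["main", "physics", "collide"],
]

for line in fold_stacks(samples):
    print(line)
```

A flame graph renderer then draws each folded line as a stack of boxes whose widths are proportional to the counts; per-sample timing information is gone, which is exactly the trade-off the comment describes.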

gerdesj•10mo ago
The game model might involve 20ms time slices. The frame rate is simply the best available visualisation of the "action" that the machine can manage.

So, you have your game model, input and output. Output needs to be good enough to convince you that you are in control and immersive enough to keep you engaged and input needs to be responsive enough to feel that you are in control. The model needs to keep track of and co-ordinate everything.

I'm old enough to still own a Commodore 64 and before that I played games and wrote some shit ones on ZX 80, 81 and Speccies. I typed in a lot of DATA statements back in the day (40 odd years ago)!

When you pare back a game to the bare basics - run it on a box with KB to deal with instead of GB - you quite quickly get to understand constraints.

Things are now way more complicated. You have to decide whether to use the CPU or the GPU for each task.

fennecbutt•10mo ago
I think flame graphs are perfect for what they do, compressing multi dimensional data down into fewer dimensions.

It makes it a lot easier to visualise at a glance, and sometimes an issue is obvious from the flame graph.

But you're right, for complex issues I find I need to dig deeper than that and view everything linearly.

They're just nice for glaring issues, it's like a mini dashboard almost.

bobmcnamara•10mo ago
Loads of older console games used 20ms fields in Europe.

Edit: also my laptop can, but I'm not into that sort of thing.

hyperman1•10mo ago
Makes sense. 20ms is 50Hz, the European mains frequency. All TVs synced with it, so old game consoles had to as well.
forrestthewoods•10mo ago
If the game runs at 20ms frames you don’t want to sample an arbitrary sequence of 20ms slices.
brendangregg•10mo ago
The origin problem for flame graphs was MySQL server performance involving dozens of threads: as a timeline view you need dozens of timelines, one for each thread, since if you render them on one timeline (I know this is probably obvious) you have samples from different threads from one moment to the next, turning the visualization into hair. Flame graphs scale forever and always show the aggregate: any number of threads, servers, microservices, etc.

I think great UI should do both: have a toggle for switching between flame graphs (the summary) and timelines (aka "flame charts") for analyzing time-based patterns. I've encouraged this before and now some do provide that toggle, like Firefox's profiler (Flame Graphs and Stack Charts for timeline view).

As for 20ms, yes, we do want to take it down. A previous HN comment from years ago, when I first published FlameScope, was to put a game frame on the y-axis instead of 1 second, so now each column shows the rendering of a game frame, and you can see time-offset patterns across the frames (better than a time-series timeline). We started work on it and I was hoping to include it in this post. Maybe next one.
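The frame-per-column idea Brendan describes, putting a game frame on the y-axis instead of one second, boils down to binning each sample by (frame index, offset within the frame). The sketch below is a toy illustration of that binning, not FlameScope's actual code; the fixed `FRAME_MS` period and the sample timestamps are assumptions:

```python
FRAME_MS = 20.0  # assumed fixed frame period for the illustration

def frame_offset(timestamp_ms):
    """Map a sample timestamp to (frame index, offset within the frame),
    i.e. a (column, row) cell in a heatmap where each column spans one
    game frame rather than one second of wall-clock time."""
    frame = int(timestamp_ms // FRAME_MS)
    offset = timestamp_ms - frame * FRAME_MS
    return frame, offset

# Samples at 7 ms, 23 ms and 39 ms land in frames 0, 1 and 1,
# at offsets 7 ms, 3 ms and 19 ms within their frames.
cells = [frame_offset(t) for t in (7.0, 23.0, 39.0)]
```

With enough samples per frame, hot offsets recur as horizontal bands across columns, which is what makes time-offset patterns across frames visible.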

forrestthewoods•10mo ago
I’ve never actually seen a profiler that shows quite what I want. I have lots of subsystems running at different rates. Gameplay at 30Hz, visual render at 90Hz, physics at 200Hz, audio at some rate, network, some device, etc.

So what I want is the ability to view each subsystem in a manner that lets me see when it didn’t hit its update rate. I have many many different frame rates I care about hitting.

Of course things even get more complex when you have all the work broadly distributed with a job system…
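A per-subsystem missed-deadline check of the kind wished for above might look like the following sketch. The subsystem rates, slack value, and timestamp list are hypothetical; a real tool would work from captured trace events rather than a list of floats:

```python
def missed_updates(timestamps_ms, period_ms, slack_ms=0.5):
    """Given the start times of successive updates for one subsystem,
    return the indices of updates whose gap from the previous update
    exceeded the target period by more than slack_ms."""
    return [
        i
        for i in range(1, len(timestamps_ms))
        if timestamps_ms[i] - timestamps_ms[i - 1] > period_ms + slack_ms
    ]

# Hypothetical 200 Hz physics subsystem (5 ms period): the fourth
# update arrives ~11 ms after the third, i.e. a missed tick.
physics = [0.0, 5.0, 10.1, 21.0, 26.0]
late = missed_updates(physics, period_ms=5.0)
```

Running one such check per subsystem (gameplay at 30 Hz, render at 90 Hz, physics at 200 Hz, ...) gives a per-rate view of missed updates, independent of any single global frame rate.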

foota•10mo ago
Timelines are good when things happen once, but when you have repeated calls to functions from different places etc., a flame graph helps a lot.

Sandwich views supporting collapsing recursion are the secret sauce for flame graphs imo. See e.g,. https://pyroscope.io/blog/introducing-sandwich-view/

coherentpony•10mo ago
> Ain’t no game using 20ms frames!

A frame every 20ms equates to 50 frames per second. Doesn't seem too unreasonable for a modern game.

60 frames per second would be one frame every ~16 ms.

forrestthewoods•10mo ago
Correct. Which means that every 20ms pixel slice straddles two or three frames. Which is a really really bad way to profile!
brendangregg•10mo ago
I could just regenerate these heat maps with 60 rows instead of 50. I'm limited by the sampling rate that was captured in the profile data file. To provide even more resolution (so you had many samples within a game frame) I'd need to re-profile the target with a higher frequency.

When Martin, my colleague at Netflix at the time, built a d3 version of FlameScope, he put a row selector in the UI: https://github.com/Netflix/flamescope

wtallis•10mo ago
It sounds like your problem might be not with the visualization itself, but with the underlying idea of a sampling profiler as opposed to tracing every single call from every single frame.
forrestthewoods•10mo ago
No. Sampling profilers are great. Most powerful is of course a mix of sampling and instrumentation. But nothing beats the feeling of a sampling profiler fixing big issues in under 5 minutes.

Flamegraphs are a nice tool to have in the bag I suppose. But they’re more tertiary than primary or even secondary to me.

coherentpony•10mo ago
> Correct. Which means that every 20ms pixel slices two or three frames. Which is a really really bad way to profile!

If 20 ms is a reasonable frame time for a modern game, why is it an unreasonable thing to profile?

I understand other, shorter, frame times may be interesting to profile too. My point is that if you want to understand a reasonable or realistic workload, then it should also be reasonable to profile that workload.

forrestthewoods•10mo ago
The issue isn’t that 20ms is an unreasonable slice size. The issue is you can’t perform an arbitrary slice.

Imagine a game that runs at 50Hz/20ms frame. Unusual but let’s go with it because the exact value doesn’t matter. Ideally this update takes AT MOST 20ms. Otherwise we miss a frame. Which means most frames actually take maybe 15ms. And some may take only 5ms. If you drew this on a timeline there would be obvious sleeps waiting for the next frame to kick off.

If you take an arbitrary sequence of 20ms slices you’re not going to capture individual frames. You’re going to straddle frames. Which is really bad and means each pixel is measuring a totally different body of work.

Does that make sense?
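The straddling effect can be demonstrated with a toy model: frames kick off every 20 ms but finish their work early, and we sample with fixed 20 ms slices at different offsets. The frame work intervals below are made up for illustration:

```python
# Frames kick off every 20 ms; actual work per frame varies
# (hypothetical work intervals in ms as (start, end) pairs).
frames = [(0, 15), (20, 25), (40, 58), (60, 72)]

def frames_touched(slice_start, slice_len=20):
    """Return the indices of frames whose work overlaps an
    arbitrary fixed-size profiling slice."""
    s0, s1 = slice_start, slice_start + slice_len
    return [i for i, (a, b) in enumerate(frames) if a < s1 and b > s0]

# Slices aligned to frame starts each see exactly one frame's work...
aligned = [frames_touched(t) for t in (0, 20, 40, 60)]
# ...but slices offset by 10 ms mostly mix work from two frames.
offset = [frames_touched(t) for t in (10, 30, 50)]
```

An arbitrary slice boundary therefore measures a blend of different frames' work, which is the "totally different body of work per pixel" problem described above.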

coherentpony•10mo ago
Ah yes. Ok.

Yes, that makes perfect sense. Thanks.

rawling•10mo ago
A few comments, 2 days ago

https://news.ycombinator.com/item?id=43846283

gitroom•10mo ago
Pretty cool seeing people actually care so much about profiling tools. You think we ever get one tool that really covers enough to keep everyone happy?
saagarjha•10mo ago
Considering that I hate flame graphs, probably not
benpoulson•10mo ago
What’s your preferred way of visualising performance?
saagarjha•10mo ago
Depends but if it's samples I'd usually reach for a hierarchical outline view. If it's time series data then probably a bunch of tracks.
badsectoracula•10mo ago
Flamegraphs are neat, but the call graph in Luke Stackwalker[0] was more immediately obvious to me than they are (especially since it draws a thick red line for the hottest path).

Another approach is one i used for a profiler i wrote some time ago (and want to port to Linux at some point)[1] which displays the hottest "traces" (i.e. callstacks). One neat aspect of this is that you can merge multiple traces using a single function as the "head" (so, e.g., if there are two separate traces that contain a function "Foo" somewhere, you can select to use "Foo" as the starting point and they'll be treated as the same trace with their hitcounts combined).

[0] https://lukestackwalker.sourceforge.net/

[1] http://runtimeterror.com/tools/fpwprof/index.html
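The head-function merge described above can be sketched as follows. This is a reimplementation of the idea for illustration, not fpwprof's code; the trace data and the function name "Foo" are hypothetical:

```python
from collections import Counter

def merge_by_head(traces, head):
    """Merge hot call stacks that contain `head`: truncate each stack
    to start at `head` and combine the hit counts, so traces reached
    via different callers are treated as the same trace."""
    merged = Counter()
    for stack, hits in traces:
        if head in stack:
            merged[tuple(stack[stack.index(head):])] += hits
    return dict(merged)

# Two hypothetical traces reach Foo via different callers.
traces = [
    (["main", "Update", "Foo", "Bar"], 30),
    (["main", "Render", "Foo", "Bar"], 12),
]
combined = merge_by_head(traces, "Foo")
```

After the merge, both traces collapse to the single tail starting at Foo, with their hit counts summed.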

MathMonkeyMan•10mo ago
Love the screenshot with the million shotgunners and cyberdemons, and a chainsaw.
victor_xuan•10mo ago
Maybe I am too young, but can someone explain this obsession with Doom on Hacker News?
prox•10mo ago
It’s not an obsession; it’s probably because Doom is easy to understand from a code perspective, and it also addresses a lot of graphical/game code techniques, aka it’s a perfect hobby, from a coding perspective, to learn, adapt and tweak. It’s actually the perfect example of hacking.

Also, this hacking started early on, so there is probably tons and tons of documentation and data, again making it a great candidate to work with from a hacking perspective.

hyperman1•10mo ago
I've lived through the Doom craze. My perspective:

It was a game that clearly advanced the state of the art. Even in the month before it came out, people were claiming it simply could not be done technically on the underpowered PC technology of that time. Even after it came out, it took others a year to catch up.

Expectations at release were sky high, and Doom still overdelivered. The tech, the gameplay, the esport aspect, the gritty graphics theming. It was all new and daring.

The BSP was an obscure thing nobody understood. It took the DEU people months to wrap their heads around the NODES datastructure.

When the modding tools were finally available, the whole world became 1 big modding community for the same game. That amount of focus on 1 game has never happened before or since. Modding continues to this day, 30 years later.

Then the source code was delivered, for free, to everyone. It was easily readable and told every secret. We all went crazy. It was studied by every programming student. Even today, no other codebase has received the same amount of scrutiny, with every bug and detail documented.

Myhouse.wad was a great example of how far a doom mod can be pushed. But it is also a testament to the collective midlife crisis of the doomers from that age, all of us yearning for the good old days.

YuxiLiuWired•10mo ago
Is it possible to make Doom in the GPU flame graphs?