frontpage.

Made with ♥ by @iamnishanth


Ask HN: How do you deal with people who trust LLMs?

95•basilikum•3h ago•104 comments

Ask HN: What breaks first when your team grows from 10 to 50 people?

89•hariprasadr•3d ago•69 comments

Spotify playing ads for paid subscribers

101•IncandescentGas•12h ago•82 comments

Ask HN: How is AI-assisted coding going for you professionally?

427•svara•3d ago•611 comments

Ask HN: Which router makers do you trust most?

7•general_reveal•6h ago•5 comments

Ask HN: How are you protecting yourself from skill atrophy?

4•xpnsec•9h ago•5 comments

Ask HN: What is it like being in a CS major program these days?

204•tathagatadg•2d ago•195 comments

Ask HN: Is vibe coding a new mandatory job requirement?

30•newswangerd•1d ago•65 comments

Ask HN: Why is everyone on HN obsessed with Rust?

9•goldkey•6h ago•8 comments

Ask HN: Is Claude down Again?

6•rreyes1979•17h ago•5 comments

Open AI is actively censoring information about voting today in the US

10•resters•1d ago•14 comments

Tell HN: AI tools are making me lose interest in CS fundamentals

97•Tim25659•3d ago•91 comments

Skills Manager – manage AI agent skills across Claude, Cursor, Copilot

3•evergreenxx•18h ago•5 comments

Claude Code 500s

16•bavarianbob•1d ago•5 comments

Knowledge workers managing AI show collapsed productivity, not just a plateau

6•dfordp11•1d ago•2 comments

Ask HN: How do you handle payments for AI agents?

2•bahaghazghazi•1d ago•3 comments

Ask HN: Have you successfully treated forward head posture ("nerd neck")?

57•trashymctrash•4d ago•45 comments

Apple Screen Sharing High Performance

7•chapoly1305•1d ago•0 comments

Ask HN: We need to learn algorithm when there are Claude Code etc.

11•JasonHEIN•1d ago•2 comments

Claude Is Having an Outage

49•theahura•1d ago•17 comments

Ask HN: How are you doing technical interviews in the age of Claude/ChatGPT?

6•jonjou•1d ago•6 comments

I'm 60 years old. Claude Code killed a passion

242•fred1268•3d ago•188 comments

Ask HN: Did GitHub remove Opus and Sonnet from their Copilot Pro subscription?

13•lgl•2d ago•7 comments

Ask HN: How to Learn C++ in 2026?

14•creatorcoder•3d ago•14 comments

It feels like Claude goes down almost daily now

27•mrprincerawat•1d ago•7 comments

Ask HN: The trickiest bug you've encountered?

4•chistev•8h ago•6 comments

Do you really need an agent?

9•g_br_l•3d ago•7 comments

Tell HN: Godaddy DNS resolution down for 2+ hours

9•codegeek•2d ago•0 comments


Ask HN: The trickiest bug you've encountered?

4•chistev•8h ago

Comments

latexr•8h ago
Selection of stories.

https://500mile.email

chistev•8h ago
Cool!
ksherlock•8h ago
A buffer underflow that overwrote a pointer that overwrote 1 byte of code in the multiplication library (no hardware multiplication or memory protection, by the way) that caused unsigned multiplication to be handled as signed multiplication (or perhaps vice versa). This didn't manifest until much later, of course.
AnimalMuppet•7h ago
We had a function that looked like

  void f()
  {
    bool run = true;
    while (run)
    {
      g();
    }
  }
This function was exiting, and not because something threw. So the loop was terminating. And when it did, run was false.

The obvious answer is that g() was smashing the stack. But I ruled that out, because g() was returning to f(), and if g() was smashing the stack, I would expect it to destroy the return address before it destroyed a variable in f().

I tried to solve it for a month, off and on. Every time I tried to get more information, the problem disappeared.

Finally I got desperate enough to look at the assembly output of the compiler, and light dawned. (This was g++ on an ARM, by the way.)

run, the bool in f(), had no address. It lived in register R12. When f() called g(), it pushed the return address. In the implementation of g(), the first thing it did was push R12 so that it could place its own variable in the scratchpad register. So f()'s local variable wound up in g()'s stack frame...

And g() was smashing the stack. Duh.

In particular, it called msgrcv(int msqid, void *msgp, size_t msgsz, long msgtyp, int msgflg), which has a highly misleading API. (For those not in the know, this is a System V message queue call.) It expects msgp to point to a structure like

  struct msgbuf {
      long mtype;       /* message type, must be > 0 */
      char mtext[1];    /* message data */
  };
and msgsz is the size of the msgbuf.mtext array, *not* the size of msgbuf.

The contractors who wrote this code used the size of msgbuf, which is 4 bytes too high, so they were writing four bytes too many, which happened to overwrite the pushed value of R12, which was f()'s run variable. (The queue did not get out of sync, because they made the same mistake on the other side, and wrote four bytes too many as well.)

One more twist: The message queue was actually wrapping communication with another CPU. So whenever an unrelated four bytes on a different CPU were zero, then the loop would exit and f() would terminate.

TacticalCoder•7h ago
I've already posted this here in the past, answering the same question (some people seemed to like it, so here we go).

Around 1991 I was writing a DOS game... In very rare circumstances the game would crash, but it could happen after playing for 15 minutes or more. Sometimes not at all. I couldn't make sense of it.

At some point I decided to rewrite my entire game loop to make the engine fully deterministic: recording each input and the time (frame) at which it happened, so that I could record myself playing the game and replay it fully deterministically.

Except this was in 1991, and deterministic game engines did not exist back then. The first time I read about one was in a postmortem about Age of Empires on Gamasutra (IIRC). I even wrote to the article's author, telling him: "Oh wow, it's the first time I've read about a deterministic game engine. I made one in 1991 but had never since heard of anybody using one." He answered, as excited as I was, saying he didn't know of any game doing that in 1991 either and he liked why I came up with it.

Since then it has become extremely common: a game like Warcraft III, for example, where there can be hundreds of units, has tiny save-game files because it only records the inputs and the times at which they happened (and it of course requires the same version of the game engine, or a backward-compatible one, to replay the save files).

But Age of Empires (1997) is the first one I remember describing such an engine.

Back to my 1991 DOS game... I rewrote the game engine, wrote a simple recorder for the inputs, and played and played and played until it crashed. I then replayed the game (seeing that now I could): and, sure enough, the game crashed again. Huge relief. At that point I knew the bug was dead: once I could reproduce it, I knew I'd smash it.

Turns out: when the hero had picked up a power-up allowing him to fire two shots at once, and the first shot killed the last thing on the level, the second shot would keep living its life into the next level (my logic kept updating that shot and overwriting memory it wasn't supposed to access), happily corrupting memory until something made the game crash.

It was tricky because it required a special combination of conditions.

And the only way I found to be able to reproduce the bug was to basically invent the concept of a deterministic game engine. Or at the very least independently discover it.

The game was never published, but it's how my career started (very long story, for a blog post or something).

P.S.: if anyone knows of a game using a deterministic engine from before 1991, I'm all ears (especially if it's an arcade one: that'd really make my day).

chistev•6h ago
Cool story