Love C, Hate C: Web Framework Memory Problems

https://alew.is/lava.html
50•OneLessThing•6h ago

Comments

jacquesm•4h ago
There are many, many more such issues with that code. The person who posted it is new to C and had an AI help them write it. That's a recipe for disaster: it means the OP does not actually understand what they wrote. It looks nice but it is full of footguns, and even though it is a useful learning exercise, it is also a great example of why it is better to run battle-tested frameworks than to inexpertly roll your own.

As a learning exercise it is useful, but it should never see production use. What is interesting is that the apparent cleanliness of the code (it reads very well) is obscuring the fact that the quality is actually quite low.

If anything, I think the conclusion should be that AI+novice does not create anything that is usable without expert review, and that this probably adds up to a net negative, other than that the novice will (hopefully) learn something. It would be great if someone could put in the time to do a full review of the code; I have just read through it casually and already picked up a couple of problems, and I'm pretty sure that a thorough job would turn up many more.

drnick1•3h ago
> What is interesting is that the apparent cleanliness of the code (it reads very well) is obscuring the fact that the quality is actually quite low.

I think this is a general feature and one of the greatest advantages of C. It's simple, and it reads well. Modern C++ and Rust are just horrible to look at.

messe•3h ago
I slightly unironically believe that one of the biggest hindrances to rust's growth is that it adopted the :: syntax from C++ rather than just using a single . for namespacing.
jacquesm•3h ago
I believe that the fanatics in the rust community were the biggest factor. They turned me off what eventually became a decent language. There are some language particulars that were strange choices, but I get that if you want to start over you will try to get it all right this time around. But where the Go authors tried to make the step easy and kept their ego out of it, it feels as if the rust people aimed at building a new temple rather than just making a new tool. This created a massive chicken-and-egg problem that did not help adoption at all. Oh, and toolchain speed: for the longest time, the rust toolchain was terribly slow on non-trivial projects.

I don't remember any other language's proponents actively attacking the users of other programming languages.

imtringued•2h ago
Software vulnerabilities are an implicit form of harassment.
messe•2h ago
I'm hoping that's meant to satirise the rust community, because it's horseshit like this that makes a sizeable subset of rust evangelists unbearable.
01HNNWZ0MV43FF•2h ago
> I don't remember any other language's proponents actively attacking the users of other programming language.

I just saw someone on Hacker News saying that Rust was a bad language because of its users

jacquesm•1h ago
Yawn. Really, if you have nothing to say don't do it here.
citbl•3h ago
The safer the C code, the more horrible it starts looking though... e.g.

    my_func(char msg[static 1])
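For what it's worth, the `[static 1]` qualifier does buy a real diagnostic. A minimal sketch (the function name and body are hypothetical, following the snippet above):

```c
#include <string.h>

/* [static 1] declares that the caller must pass a pointer to at
 * least one valid char, so compilers can warn when NULL is passed -
 * something a plain `char *msg` parameter does not communicate. */
size_t my_func(char msg[static 1])
{
    /* my_func(NULL) compiles, but gcc/clang emit a nonnull warning */
    return strlen(msg);
}
```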
uecker•3h ago
Compared to other languages, this is still nice.
jacquesm•1h ago
It is - like everything else - nice because you, me and lots of others are used to it. But I remember starting out with C and thinking 'holy crap, this is ugly'. After 40+ years looking at a particular language it no longer looks ugly simply because of familiarity. But to a newcomer C would still look quite strange and intimidating.

And this goes for almost all programming languages. Each and every one of them has warts and issues with syntax and expressiveness. That holds true even for the most advanced languages in the field - Haskell, Erlang, Lisp - and more so for languages that were originally designed for 'readability'. Programming is by its very nature more akin to solving a puzzle than to describing something. The puzzle is how to get the machine to do something, to do it correctly, to do it safely and to do it efficiently, all while satisfying the constraint of how much time you are prepared (or allowed) to spend on it. Picking the 'right' language will always be a compromise on some of these; there is no programming language that is perfect (or even just 'the best' or 'suitable') for all tasks, and no language is better than every other for any subset of tasks unless that subset is very small.

OneLessThing•3h ago
I agree that it reads really well, which is why I was surprised that the quality is not high when I looked deeper. The author claims to have only used AI for the json code, so your conclusion may be off; it could just be a novice doing novice things.

I suppose I was just surprised to find this code promoted in my feed when it's not up to snuff. And I'm not hating, I do in fact love the project idea.

lifthrasiir•3h ago
Yeah, I recently wrote a moderate amount of C code [1] entirely with Gemini, and while it was much better than I initially expected, it needed constant steering to avoid inefficient or less safe code. It took extensive fuzzing to reach a minimal amount of confidence, and that caught at least two serious problems---seriously, it's much better than most C programmers, but still.

[1] https://github.com/lifthrasiir/wah/blob/main/wah.h

jacquesm•3h ago
I've been doing this the better part of a lifetime and I still need to be careful, so don't feel bad about it. Just like rust has an 'unsafe' keyword, I realize all of my code is potentially unsafe. Guarding against UB, use-after-free, array overruns and so on is a lot of extra work, and you only need to slip up once to have a bug - and, if you're unlucky, something exploitable. You get better at this over the years. But if I knew something needed to be bulletproof, the C compiler would not be my first tool of choice.

One good defense is to reduce your scope continuously. The smaller you make your scope the smaller the chances of something escaping your attention. Stay away from globals and global data structures. Make it impossible to inspect the contents of a box without going through a well defined interface. Use assertions liberally. Avoid fault propagation, abort immediately when something is out of the expected range.
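A tiny illustration of those rules (hypothetical names, my own sketch): an opaque box behind a well-defined interface, liberal assertions, and an immediate abort instead of fault propagation.

```c
#include <assert.h>
#include <stdlib.h>

/* The struct is private to this file; callers go through the
 * interface below and can never poke at the fields directly. */
struct counter { int value; int max; };

struct counter *counter_new(int max)
{
    assert(max > 0);                     /* reject out-of-range input */
    struct counter *c = calloc(1, sizeof *c);
    if (!c)
        abort();                         /* abort, don't propagate */
    c->max = max;
    return c;
}

int counter_bump(struct counter *c)
{
    assert(c && c->value < c->max);      /* assert liberally */
    return ++c->value;
}

void counter_free(struct counter *c) { free(c); }
```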

uecker•3h ago
A strategy that helps me is to not use open-coded pointer arithmetic or string manipulation, but to encapsulate those behind safe, bounds-checked interfaces. Then essentially only lifetime issues remain, and for those I usually have a simple policy and clearly document any exception. I also use signed integers and the sanitizer in trapping mode, which turns any such issue I may have missed into a run-time trap.
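A minimal sketch (my own, not uecker's actual code) of what such a bounds-checked interface can look like, using signed (ptrdiff_t) lengths:

```c
#include <stddef.h>
#include <stdlib.h>

/* A span hides the raw pointer behind checked accessors. */
typedef struct { char *data; ptrdiff_t len; } span;

static char span_at(span s, ptrdiff_t i)
{
    if (i < 0 || i >= s.len)
        abort();                /* trap instead of silent corruption */
    return s.data[i];
}

static span span_sub(span s, ptrdiff_t off, ptrdiff_t len)
{
    if (off < 0 || len < 0 || off > s.len - len)
        abort();
    return (span){ s.data + off, len };
}
```

With this in place, open-coded `s.data[i]` never appears in application code, and every out-of-bounds access becomes a deterministic trap.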
OneLessThing•3h ago
This is why I love C. You can build these guard rails at exactly the right level for you. You can build them all the way up to CPython and do garbage collection and constant bounds checking. Or keep them at just raw pointer math. And everywhere in between. I like your approach. The downside being that there are probably 100,000+ bespoke implementations of similar guard rails where python users for example all get them for free.
jacquesm•36m ago
It definitely is a lot of freedom.

But the lack of a good string library is by itself responsible for a very large number of production issues, as is the lack of foresight regarding dereferencing pointers that are no longer valid. Lack of guardrails seems to translate into 'do what you want', not necessarily 'build guard rails at the right level for you'; most projects simply don't bother with guardrails at all.

Rust tries to address a lot of these issues, but it does so by tossing out a lot of the good stuff as well and introducing a whole pile of new issues and concepts that I'm not sure are an improvement over what was there before. This creates a take-it-or-leave-it situation, and a barrier to entry. I would have loved to see that guardrails concept extended to the tooling in the form of compile-time flags, resulting in either compile-time flagging of risky practices (there is some of this now, but I still think it is too little) or runtime errors for clear violations.

The temptation to 'start over' is always there. I think C, with all of its warts and shortcomings, is not the best language for a new programmer to start with if they want to do low-level work. At the same time, I would - still, maybe that will change - hesitate to advocate for rust; it is a massive learning curve compared to the kind of appeal that C has for a novice. I'd probably recommend Go or Java over both C and rust if you're into imperative code and want to do low-level work. For functional programming I'd recommend Erlang (if only because of the very long-term view of the people who built it) or Clojure, though the latter seems to be in decline.

OneLessThing•2h ago
This is exactly my problem with LLM C code: lack of confidence. On the other hand, when my projects get big enough that I cannot keep the codebase generally loaded into my brain's cache, my confidence eventually has to come from extensive testing regardless. So maybe it's not such a bad approach.

I do think that LLM C code, if made in concert with great testing tooling, has great promise.

jacquesm•10m ago
That generalizes to anything LLM related.
citbl•3h ago
The irony is that AI could also have been used to audit the code and find these issues. All the author had to do was ask.
nurettin•2h ago
> should never see production use.

I have an issue with high-strung opinions like this. I wrote plenty of crappy delphi code while learning the language that saw production use, and I made a living from it.

Sure, it wasn't the best experience for users, it took years to iron out all the bugs and there was plenty of frustration during the support phase (mostly null pointer exceptions and db locks in gui).

But nobody would be better off now if that code never saw production use. A lot of business was built around it.

zdragnar•2h ago
Buggy code that just crashes or produces incorrect results is a whole different category. In C, a bug can compromise a server and your users. See the OpenSSL Heartbleed vulnerability as a prime example.

Once upon a time, you could put up a relatively vulnerable server, and unless you got a ton of traffic, there weren't too many things that would attack it. Nowadays, pretty much anything Internet facing will get a constant stream of probes. Putting up a server requires a stricter mindset than it used to.

jacquesm•1h ago
There are minimum standards for deployment to the open web. I think - and you're of course entirely free to have a different opinion - that those are not met with this code.
nurettin•21m ago
Yes, I have lots of opinions!

I guess the question in the spotlight is: at what point would your custom server's buffer overflow when reading a header actually matter, and would that bug even exist at that point?

Could a determined hacker get to your server without even knowing what weird software you cooked up and how to exploit your binary?

We have a lot of success stories born from bad code. I mean look at Micro$oft.

Look at all the big players like discord leaking user credentials. Why would you still call out the little fish?

Maybe I should create a form for all these ahah.

lelanthran•3h ago
I can't completely blame the language here: anyone "coding" in a language new to them using an LLM is going to have real problems.
OneLessThing•3h ago
It's funny the author says this was 90% written without AI, and that AI was mostly used for the json code. I think they're just new to C.

Trust me I love C. Probably over 90% of my lifetime code has been written in C. But python newbies don't get their web frameworks stack smashed. That's kind of nice.

lelanthran•45m ago
> But python newbies don't get their web frameworks stack smashed. That's kind of nice.

Hah! True :-)

The thing is, smashed stacks are difficult to exploit deterministically or automatically. Even Heartbleed, as widespread as it was, was not a guaranteed RCE.

OTOH, an exploit in a language like Python is almost certainly going to be easier to exploit deterministically. Log4j, for example, was a guaranteed exploit and the skill level required was basically "Create a Java object".

This is because of the ease with which even very junior programmers can create something that appears to run and work and not crash.

messe•3h ago
> Another interesting choice in this project is to make lengths signed:

There are good reasons for this choice in C (and C++) due to broken integer promotion and casting rules.

See: "Subscripts and sizes should be signed" (Bjarne Stroustrup) https://open-std.org/jtc1/sc22/wg21/docs/papers/2019/p1428r0...

As a nice bonus, it means that ubsan traps on overflow (unsigned overflows just wrap).
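The difference can be pinned down in a few lines (a sketch; `-fsanitize=signed-integer-overflow` is the gcc/clang flag that enables this UBSan check):

```c
#include <limits.h>

/* Build with: cc -fsanitize=signed-integer-overflow demo.c */

unsigned wraps_silently(void)
{
    unsigned u = UINT_MAX;
    return u + 1;       /* defined behaviour: wraps to 0, UBSan is silent */
}

int traps_under_ubsan(int i)
{
    return i + 1;       /* UB when i == INT_MAX: UBSan reports/traps here */
}
```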

uecker•3h ago
I do not agree that the integer promotion or casting (?) rules are broken in C. That some people make mistakes because they do not know them is a different problem.

The reason you should make lengths signed is that you can use the sanitizer to find or mitigate overflow, as you correctly observe, while unsigned wraparound leads to bugs which are basically impossible to find. But this has nothing to do with integer promotion, and wraparound can also cause bugs in - say - Rust.

OneLessThing•3h ago
It's interesting to hear these takes. I've never had problems catching unsigned wrap bugs with plain old memory sanitizers, though I must admit to not having a lot of experience with ubsan in particular. Maybe I should use it more.
jacquesm•3h ago
I've had some fun reviewing some very old code I wrote (1980's) to see what it looked like to me after such a long time of gaining experience. It's not unlike what the OP did here, it reads cleanly but I can see many issues that escaped my attention at the time. I always compared C with a very fast car: you can take some corners on two wheels but if you make a habit of that you're going to end up in a wall somewhere. That opinion has not changed.
uecker•2h ago
I think the correct comparison is a sharp knife. It is extremely useful, and while there is a risk, it is fully acceptable. The idea that we should all use plastic knives because there are often accidents with knives is wrong, and so is the idea that we should abandon C because of memory safety. I have followed computer security issues for several decades, and while I think we should have memory safety, IMHO the push and the arguments are completely overblown - and they are especially not worth the complexity and issues of Rust. I have never personally been impacted by a security exploit caused by memory unsafety, nor do I know anybody in my personal vicinity who was. I know many cases where people were affected by other kinds of security issues. So I think those are what we should focus on first. And having timely security updates is a hell of a lot more important than memory safety, so I am not happy that Rust now makes this harder.
jacquesm•1h ago
That's an interesting point you are making there. The most common exploits are of the human variety. Even so, it is probably a good idea to minimize the chances of all kinds of exploits. One other problem - a pet peeve of mine - is that instead of giving people just security updates, manufacturers will happily include a whole bunch of new and 'exciting' stuff in their updates, which in turn will (1) introduce new security issues and (2) inevitably try to extract more money from the updaters. This is extremely counterproductive.
simonask•1h ago
I’m sorry, but there is an incredible amount of hard data on this, including the number of CVEs directly attributable to memory safety bugs. This is publicly available information, and we as an industry should take it seriously.

I don’t mean to be disrespectful, but this cavalier attitude towards it reads like vaccine skepticism to me. It is not serious.

Programming can be inconsequential, but it can also be national security. I know which engineers I would trust with the latter, and they aren’t the kind who believe that discipline is “enough”.

jacquesm•1h ago
So what do you propose to do?
uecker•2h ago
GCC's sanitizer does not catch unsigned wraparound. But the bigger problem is that a lot of code is written assuming that unsigned wraps around and that this is ok, so if you used a sanitizer you would get a lot of false positives. Signed overflow, by contrast, can always be considered a bug in portable C.

Of course, if you consistently treat unsigned wraparound as a bug in your code, you can also use a sanitizer to screen for it. But in general I find it more practical to use signed integers for everything except for modular arithmetic where I use unsigned (and where wraparound is then expected and not a bug)

messe•3h ago
I meant implicit casting, but I guess that really falls under promotion in most cases where it's relevant here (I'm on a train from Aarhus to Copenhagen right now to catch a flight, and I've slept considerably less than usual, so apologies if I'm making some slight mistakes).

The issues really arise when you mix signed/unsigned arithmetic and end up promoting everything to signed unexpectedly. That's usually "okay", as long as you're not doing arithmetic on anything smaller than an int.

As an aside, if you like C enough to have opinions on promotion rules then you might enjoy the programming language Zig. It's around the same level as C, but with much nicer ergonomics, and overflow traps by default in Debug/ReleaseSafe optimization modes. If you want explicit two's complement overflow it has +%, *% and -% variants of the usual arithmetic operations, as well as saturating +|, *|, -| variants that clamp to [minInt(T), maxInt(T)].

EDIT to the aside: it's also true if you hate C enough to have opinions on promotion rules.
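The small-type promotion pitfall can be made concrete with a couple of lines (illustrative code, not from the thread):

```c
#include <stdint.h>
#include <stddef.h>

/* Operands smaller than int are promoted to *signed* int before
 * arithmetic, so the product of two uint16_t values can overflow
 * int - undefined behaviour - even though every declared type here
 * is unsigned. */
uint32_t square_u16(uint16_t x)
{
    /* return x * x;            UB for x > 46340: int overflow */
    return (uint32_t)x * x;  /* convert to unsigned first: defined */
}

/* Mixing signedness goes the other way: the signed operand converts
 * to unsigned, so -1 compares as SIZE_MAX. */
int minus_one_below(size_t n)
{
    int i = -1;
    return i < n;            /* always 0: -1 becomes SIZE_MAX */
}
```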

jacquesm•3h ago
Yes, this is one of the more subtle pitfalls of C. What helps is that in most contexts the value of 2 billion is large enough that a wraparound would be noticed almost immediately. But when it isn't, it can lead to very subtle errors that propagate for a long time before anything noticeably goes off the rails.
uecker•3h ago
I prefer C to Zig. IMHO all the successor languages throw out the baby with the bathwater and add unnecessary complexity. Zig is much better than Rust, but, still, I would never use it for a serious project.

The "promoting unexpectedly" is something I do not think happens if you know C well. At least, I can't remember ever having a bug because of this. In most cases the promotion prevents you from having a bug, because you do not get unexpected overflow or wraparound because your type is too small.

Mixing signed and unsigned is problematic, but I see issues mostly in code from people who think they need to use unsigned when they shouldn't, because they heard signed integers are dangerous. Recently I saw somebody "upgrading" a C codebase to C++ and also changing all loop variables to size_t. This caused a bug, which he blamed on the "legacy C code" he was working on, although the original code was just fine. In general, there are compiler warnings that should catch sign-conversion issues.

Sukera•2h ago
Could you expand on how these wraparound bugs happen in Rust? As far as I know, integer overflow panics (i.e. aborts) your code when compiled in debug mode, which I think is often used for testing.
01HNNWZ0MV43FF•2h ago
> That some people make mistakes because they do not know them is a different problem.

We can argue til we're blue in the face that people should just not make any mistakes, but history is against us - People will always make mistakes.

That's why surgeons are supposed to follow checklists and count their sponges in and out

bringbart•49m ago
>while unsigned wraparound leads to bugs which are basically impossible to find.

What?

unsigned sizes are way easier to check, you just need one invariant:

if(x < capacity) // good to go

Always works, regardless of how x is calculated, and you never have to worry about undefined behavior when computing x. And the same invariant is used for forward and backward loops - some people bring up i >= 0 as a problem with unsigned, but that's because you should use i < n for backward loops as well: The One True Invariant.
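For the backward-loop case, that single invariant looks like this (a sketch with assumed names):

```c
#include <stddef.h>

/* Backward loop with an unsigned index: decrementing past zero wraps
 * to SIZE_MAX (defined behaviour for unsigned), so i < n terminates
 * the loop in both directions - no tautological i >= 0 to trip over.
 * Also handles n == 0 correctly: i starts at SIZE_MAX, loop is skipped. */
int sum_backwards(const int *a, size_t n)
{
    int total = 0;
    for (size_t i = n - 1; i < n; i--)
        total += a[i];
    return total;
}
```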

user____name•2h ago
I just put assertions checking the ranges of all sizes and indices at function entry; it doubles as documentation, and I mostly don't have to worry about signedness as a result.
kstenerud•1h ago
Yup, unsigned math is just nasty.

Actually, unchecked math on an integer is going to be bad regardless of whether it's signed or unsigned. The difference is that with signed integers, your sanity check is simple, always the same, and requires no thought for edge cases: `if(index < 0 || index >= max)`. Plus ubsan, as mentioned above.

My policy is: Always use signed, unless you have a specific reason to use unsigned (such as memory addresses).

bringbart•18m ago
unsigned is easier: 'if(index >= max)' and has fewer edge cases because you don't need to worry about undefined behavior when computing index.
bluetomcat•3h ago
Good C code will try to avoid allocations as much as possible in the first place. You absolutely don't need to copy strings around when handling a request. You can read data from the socket into a fixed-size buffer, do all the processing in-place, and then process the next chunk in-place too. You get predictable performance and the thing will work like precise clockwork. Reading the entire thing just to copy the body of the request to another location makes no sense.

Most of the "nice" javaesque XXXParser, XXXBuilder, XXXManager abstractions seen in "easier" languages make little sense in C. They obfuscate what really needs to happen in memory to solve a problem efficiently.
01HNNWZ0MV43FF•2h ago
Can you do parsing of JSON and XML without allocating?
bluetomcat•2h ago
Yes, you can do it with minimal allocations - provided that the source buffer is read-only or is mutable but is unused later directly by the caller. If the buffer is mutable, any un-escaping can be done in-place because the un-escaped string will always be shorter. All the substrings you want are already in the source buffer. You just need a growable array of pointer/length pairs to know where tokens start.
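As a sketch of that pointer/length approach (a hypothetical helper, not bluetomcat's code): splitting a buffer into tokens without copying a single byte out of it.

```c
#include <stddef.h>

/* A token is just a view into the caller's buffer. */
typedef struct { const char *p; size_t len; } tok;

/* Record up to max pointer/length pairs for sep-delimited fields.
 * The source buffer is never modified or copied. */
size_t tokenize(const char *buf, size_t n, char sep,
                tok *out, size_t max)
{
    size_t count = 0, start = 0;
    for (size_t i = 0; i <= n && count < max; i++) {
        if (i == n || buf[i] == sep) {
            out[count++] = (tok){ buf + start, i - start };
            start = i + 1;
        }
    }
    return count;
}
```

Tokenizing `"a=1&b=2"` on `'&'` yields two views, `{buf, 3}` and `{buf+4, 3}`, with zero allocations; only the fixed `out` array needs storage.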
gritzko•2h ago
Yep, no problem. In place parsing only requires a stack. Stack length is the maximum JSON nesting allowed. I have a C dialect exactly like that.
veqq•2h ago
Of course. You can do it in a single pass/just parse the token stream. There are various implementations like: https://zserge.com/jsmn/
Ygg2•2h ago
Theoretically yes. Practically there is character escaping.

That kills any non-allocation dreams. The moment you have "Hi \uxxxx isn't the UTF nice?" you will probably have to allocate. If the source is read-only you have to allocate. If the source is mutable you have to spend CPU rewriting the string.

lelanthran•17m ago
> Moment you have "Hi \uxxxx isn't the UTF nice?" you will probably have to allocate.

Depends on what you are doing with it. If you aren't displaying it (and typically you are not in a server application), you don't need to unescape it.

lelanthran•19m ago
> Can you do parsing of JSON and XML without allocating?

If the source JSON/XML is in a writeable buffer, with some helper functions you can do it. I've done it for a few small-memory systems.

lock1•1h ago
Why does "good" C have to be zero alloc? Why should "nice" javaesque make little sense in C? Why do you implicitly assume performance is "efficient problem solving"?

Not sure why many people seem fixated on the idea that using a programming language must follow a particular approach. You can do minimal alloc Java, you can simulate OOP-like in C, etc.

Unconventional, but why do we need to restrict certain optimizations (space/time perf, "readability", conciseness, etc) to only a particular language?

bluetomcat•46m ago
Because in C, every allocation incurs a responsibility to track its lifetime and to know who will eventually free it. Copying and moving buffers is also prone to overflows, off-by-one errors, etc. The generic memory allocator is a smart but unpredictable beast that lives in your address space: it can mess up your CPU cache, introduce undesired memory fragmentation, etc.

In Java, you don't care because the GC cleans after you and you don't usually care about millisecond-grade performance.

lelanthran•24m ago
> Why does "good" C have to be zero alloc?

GP didn't say "zero-alloc", but "minimal alloc"

> Why should "nice" javaesque make little sense in C?

There's little to no indirection in idiomatic C compared with idiomatic Java.

Of course, in both languages you can write unidiomatically, but that is a great way to ensure that bugs get in and never get out.

lelanthran•26m ago
> Good C code will try to avoid allocations as much as possible in the first place.

I've upvoted you, but I'm not so sure I agree though.

Sure, each allocation imposes a new obligation to track it, but passing around already-allocated blocks imposes a burden of its own: every call has to be clear about which permissions the callee has over the block (modify it, reallocate it, free it, etc.).

If you're doing any sort of concurrency this can be hard to track - sometimes it's easier to simply allocate a new block and give it to the callee, and then the caller can forget all about it (callee then has the obligation to free it).

jqpabc123•2h ago
Reads like an indictment of vibe coding.

LLMs are fundamentally probabilistic --- not deterministic.

This basically means that anything produced this way is highly suspect. And this framework is an example.