
C++26: Erroneous behaviour

https://www.sandordargo.com/blog/2025/02/05/cpp26-erroneous-behaviour
41•todsacerdoti•4h ago

Comments

gblargg•3h ago
> It’s well-defined, yet incorrect behaviour that compilers are recommended to diagnose. Is recommended enough?! Well, with the growing focus on safety, you can rest assured that an implementation that wouldn’t diagnose erroneous behaviour would be soon out of the game.

Is this to cover cases that would be hard/costly to detect? For example you pass the address of an uninitialized variable to a function in another source file that might read or just write to it, but the compiler can't know.

bluGill•3h ago
Right. The compiler needs to diagnose where it can, but there are many cases where it cannot be sure.
mcdeltat•3h ago
As a long time user of C++, I want to propose a question: do we think C++ will ever reach a point where it is significantly ergonomic and safe enough to use (in comparison to e.g. Python or Rust) via adding new features?

I've used C++ for so long and I'm a good way into thinking that the language is just over. It missed its mark because backwards compatibility locks in fundamental language flaws. I think we can continue to add features - which are usually decent ideas given the context, kudos to the authors for the effort - but the language will decline faster than it can be fixed. Furthermore, the language seems to be continually becoming harder to implement, support, and write due to the constant feature addition and increase in semantic interconnectivity. To me it's mostly a theoretical exercise to add features at this point: practically we end up with oddly specced features which mostly work but are fundamentally crippled because they need to dodge an encyclopedia of edge cases. The committee are really letting the vision of a good C++ down by refusing to break backwards compatibility to fix core problems. I'm talking fundamental types, implicit conversions, initialisation, preprocessor, undefined / ill-formed NDR behaviour. The C++ I'm passionate about is dead without some big changes I don't think the community can/will handle.

bluGill•3h ago
That doesn't matter - it cost a billion dollars to write the current project I work on. There is no way I can ask for that much to rewrite it all in whatever. Thus we are stuck on c++ which was the best choice 15 years ago. Anything that makes new code easier to write is great help.

sure we are looking at options - but rust and c++ don't interoperate well (c api is too limiting). D was looking interesting for a while but I'm not sure how it fits (d supports c++ abi)

AnimalMuppet•2h ago
60 million dollars a year in development for 15 years? What are you working on?
goalieca•2h ago
There are many large projects with ~1000 engineers that will end up costing more than that.
paulddraper•2h ago
Chrome doesn’t fit those figures exactly but is close.
foobar10000•2h ago
Many large financial operations - HFT such as Hudson River Trading or Optiver or IMC or Citadel, or the big C++ systems in the banks - 200-400 developers working on an ecosystem of components can easily clear that.
quotemstr•3h ago
> As a long time user of C++, I want to propose a question: do we think C++ will ever reach a point where it is significantly ergonomic and safe enough to use (in comparison to e.g. Python or Rust) via adding new features?

Memory safety semantics aside (needed and will be disruptive, even if done gradually) ---

You could get 80% of the way to ergonomic parity via a 1:1 re-syntaxing, just like Reason (new syntax for OCaml) and Elixir (new syntax for Erlang). C++ has good bones but bad defaults. Why shouldn't things be const by default? Why can't we do destructuring in more places? Why is it so annoying to define local functions? Why do we have approximately three zillion ways of initializing a variable?

You can address a lot of the pain points by making an alternative, clean-looking "modern" syntax that, because it's actually the same language as C++, would have perfect interoperability.

dwattttt•2h ago
It's an interesting proposal: a new grammar that can express everything (legal) that C++ can, that existing code can be automatically translated into, and that avoids C++'s ambiguity (https://stackoverflow.com/a/794083).

You'd have an existing language with a new syntax; it can perfectly interact with existing C++ code, but you could make those suggested changes, and could also express things in the new syntax that couldn't be done in the old one.

EDIT: taking an example elsewhere in this thread; taking an address of an uninitialised variable and passing it to a function. Today the compiler can't (without inter-procedure analysis) tell whether this is a use of uninitialised data, or whether it's only going to write/initialise the variable.

A new syntax could allow you to express that distinction.

aw1621107•1h ago
> You can address a lot of the pain points by making an alternative, clean-looking "modern" syntax that, because it's actually the same language as C++, would have perfect interoperability.

Sounds like Herb Sutter's cpp2/cppfront: https://github.com/hsutter/cppfront

CoastalCoder•3h ago
On the bright side, any sufficiently motivated team can fork the language to try out ideas like yours.

Perhaps a better language could even get some traction without major corporate sponsorship. I think (?) rust and zig are examples of that.

aw1621107•1h ago
> Perhaps a better language could even get some traction without major corporate sponsorship. I think (?) rust and zig are examples of that.

Rust might not count, depending on whether you consider Mozilla's sponsorship a major corporate sponsorship.

ghosty141•3h ago
Absolutely spot on in my opinion.

Also things often just don’t compose well. For example if you have a nested class that you want to use in an unordered_set in its parent class then you just can’t do it because you can’t put the std::hash specialization anywhere legal. It’s just two parts of the language which are totally valid on their own but don’t work together. Stuff like this is such a common problem in c++ that it drives me nuts

Kranar•33m ago
>For example if you have a nested class that you want to use in an unordered_set in its parent class then you just can’t do it because you can’t put the std::hash specialization anywhere legal.

This is not true. From within your parent class you use an explicit hashing callable, and then from outside of the parent class you can go back to using the default std::hash.

The result looks like this:

    #include <cstddef>
    #include <functional>
    #include <unordered_set>

    struct Foo {
      struct Bar {
        int id = 0;
        bool operator==(const Bar&) const = default;  // needed by unordered_set
      };
      struct BarHasher {
        std::size_t operator()(const Bar& b) const noexcept {
          return std::hash<int>{}(b.id);
        }
      };
      std::unordered_set<Bar, BarHasher> bar_set;
    };

    namespace std {
      template<>
      struct hash<Foo::Bar> {
        std::size_t operator()(const Foo::Bar& b) const noexcept {
          return Foo::BarHasher()(b);
        }
      };
    }

The std::hash specialization at the end is legal and allows other users of Foo::Bar to use std::unordered_set<Foo::Bar> without needing the explicit BarHasher.
saghm•3h ago
> do we think C++ will ever reach a point where it is significantly ergonomic and safe enough to use (in comparison to e.g. Python or Rust) via adding new features?

This is an interesting perspective to me, because my view as someone who's been using Rust since close to 1.0 and hasn't done much more than dabble in C++ over the years is basically the opposite. My (admittedly limited) understanding is that this has never really been a goal of the committee, because if someone is willing to sacrifice backwards compatibility, they could presumably just switch to using one of those other languages at that point. Arguably the main selling point of C++ today is the fact that there's a massive set of existing codebases out there (both libraries that someone might want to use and applications that might still be worked on), and for the majority of them, being rewritten would be at best a huge effort and more realistically not something that's going to be seriously considered.

If the safety and ergonomics of C++ are a concern, I guess I'm not sure why someone would pick it over another language for a newly-started codebase. In terms of safety, Rust is an option that exists today without needing C++ to change. Ergonomics are a bit less clear-cut, but I'd argue that most of the significant divergences in ergonomics between languages are pretty subjective, and it's not obvious to me that there's a significant enough gap between Rust's and C++'s respective choices to warrant a new language that's not compatible with C++ but is far enough from Rust for someone to refuse to use it on the basis of ergonomics alone. It seems to me like "close enough to C++ to attract the people who don't want to use Rust but far enough from Rust to justify breaking C++'s backwards compatibility" is just too narrow a niche for it to be worth it for C++ to go after.

mcdeltat•3h ago
It's true that without backwards compatibility concerns, Rust looks like a great alternative.

However I think C++ still has some things going for it which may make it a useful option, assuming the core issues were fixed. C++ gives ultimate control over memory and low level things (think pointers, manual stack vs heap, inline assembly). It has good compatibility with C ABIs. It's very general purpose and permissive. And there are many programmers with C++ (or C) knowledge out there already.

Further, I think C++ started on its current feature path before Rust really got a big foothold. Consider that C++ has been around a really long time, plenty long enough to fix core features.

Finally, I reckon the whole backwards compatibility thing is a bit weird, because if the code is so ancient and unchangeable, why does it need the latest features? Like, you desperately need implicit long-to-int conversion but also coroutines?? And for regular non-ancient code, we already try to avoid the problematic parts of C++, so fixing/removing/changing them wouldn't be so bad. IMO it's a far overdone obsession with backwards compatibility.

Of course without a significant overhaul to the language you'd probably say "screw it" and start from scratch with something nicer like Rust.

saghm•2h ago
I think I might be getting confused about the point you're making here. To me, pre-existing knowledge and the decades-long legacy of C++ feel like much stronger arguments against changing anything in a breaking way compared to making breaking changes to improve the language. I do agree with you around a lot of the new features being introduced not feeling super necessary, but I'm guessing that the stance of people in favor of them is that adding them doesn't feel like it's a huge problem either given that they can do them without breaking anything. My perception is that C++ has already been a fairly large language for a while, and that most codebases already develop a bit of a dialect of which features to use or not use (which you allude to as well), so I could imagine that they expect people who don't like the new features to just ignore them.

I think I'm most confused about the last part of what you're saying. A significant overhaul to the language in a breaking way feels pretty much the same as saying "screw it" and starting from scratch, just with specific ergonomic choices being closer to C++ than to Rust. Several of the parts that you cite as strengths of the language, like inline assembly and pointers, are still available in Rust, just not outside of explicitly unsafe contexts, and I'd imagine that an overhaul of C++ to enhance memory safety would end up needing to make a fairly similar compromise for them. It just seems like the language you're wishing for would end up with a fairly narrow design space, even if it is objectively superior to the C++ we have today, because it would have to give up the largest advantage that C++ does have without enough unoccupied room to grow into. The focus on backwards compatibility doesn't seem to be that it would necessarily be the best choice in a vacuum, but a reflection of the state of the ecosystem as it is today, and a perception that sacrificing it would be giving up its position as the dominant language in a well-defined niche to try to compete in a new one. This is obviously a subjective viewpoint, but it doesn't seem implausible to me, and given the fact that we can't really know how it would work out unless they do try, sticking with compatibility feels like the safer option.

dwattttt•2h ago
The biggest question I have around the viability of breaking changes in C++ is whether you can compile some code with a newer breaking standard, some with an older standard, and link them.

Headers would be a problem given their text inclusion in multiple translation units, but it's not insurmountable; you're currently limited to the oldest standard a header is included into, and under a new standard that breaks compatibility you'd be limited to a valid subset of the old & new standard.

EDIT: ironically modules (as a concept) would (could?) solve the header problem, but they've not exactly been a success story so far.

TuxSH•1h ago
> EDIT: ironically modules (as a concept) would (could?) solve the header problem, but they've not exactly been a success story so far.

Because they are little different from precompiled headers. import std; may be nice, but in a large project you are likely to have your own defines.hpp file anyway (one that is going to be precompiled for a double-digit reduction in compile times).

Ironically too, migrating every header in an executable project to modules might slow down build times, as dependency chains reduce the parallelism factor of the build.

isaacremuant•3h ago
Very little signal to noise in your post.

It's widely used, and you can use it effectively if you need it and know what you're doing.

kachapopopow•3h ago
I just don't see a reason to use c++ anymore when rust does quite literally everything better. For prototyping, hacking, firmware and native interfacing though? c++ any time of the day.
TuxSH•1h ago
> quite literally, everything better

Like retained mode GUIs, games, intrusive containers or anything that can't be trivially represented by a tree of unique_/shared_ptr?

anon-3988•3h ago
Based on Hyrum's Law (since they preserve backward compatibility), any line of "modern" C++ code is also basically just C code. There's no way around this. I don't think there will ever be a point where C++ is so safe and rigid that it feels comfortable to write in without thinking of the hundreds of different ways the language screws you over.

They love to say C++ is for everyone, but it is clearly not. Only wizards and nerds burdened by the sunk cost fallacy are willingly writing this modern C++ code. I personally just use C++ as a "nicer" C.

wffurr•1h ago
>> The committee are really letting the vision of a good C++ down by refusing to break backwards compatibility to fix core problems

IIUC this is what Profiles are. It's an opt-in, per-source-file method to ban certain misfeatures and require certain other safe features.

Kranar•1h ago
Safety profiles don't exist and there are so many issues with them that it's unlikely they will ever get added to the language. For example, you mention how it's a method applied to a source file, but C++ doesn't have the concept of a source file, it only knows about translation units.

But then the problem becomes where exactly do you opt-in to this feature? If you do it in a header file then this can result in a function being compiled with the safety profile turned on in one translation unit and then that exact same function is compiled without that safety profile in another translation unit... which ironically results in one of the most dangerous possible outcomes in C++, the so-called ODR violation.

If you don't allow safety-profiles to be turned on in header files, then you've now excluded a significant amount of code in the form of templates, constexpr and inline functions.

jandrewrogers•1h ago
The saving grace of C++ is that 80% of my complaints are really about the standard library rather than the language itself. I can replace the standard library with something else more consistent and modern without regard for legacy compatibility, and many people do.

Most of the rest of my complaints could be addressed by jettisoning backward compatibility and switching to more sensible defaults. I realize this will never happen.

C++ still has some unique strengths, particularly around metaprogramming compared to other popular systems languages. It also is pretty good at allowing you to build safe and efficient abstractions around some ugly edge cases that are unavoidable in systems programming. Languages like Rust are a bit too restrictive to handle some of these cases gracefully.

Kranar•1h ago
There is a version of C++ that adds complete memory safety to the language by adding features to the language in a way that preserves complete backwards compatibility with existing C++ source code. That version of C++ is called Circle/Safe C++, it represents a monumental amount of effort that was written by a single individual, and it's a complete disgrace that the C++ committee has informally shut the door on that individual:

https://safecpp.org/draft.html

LoganDark•11m ago
> it's a complete disgrace that the C++ committee has informally shut the door on that individual

What do you mean?

Kranar•9m ago
The committee very politely showed him the door.
benreesman•1h ago
I dramatically prefer modern C++ to either of Python or Rust in domains where it's a toss up. It's really nice these days.

Like any language that lasts (including Python and Rust) you subset it over time: you end up with linters and sanitizers and static analyzers and LSP servers, you have a build. But setting up a build is a one-time cost and maintaining a build is a fact of life, even JavaScript is often/usually the output of a build.

And with the build done right? Maybe you don't want C++ if you're both moving fast and doing safety- or security-critical stuff (as in, browser, sshd, avionics critical), but you shouldn't be moving fast on avionics software to begin with.

And for stuff outside of that "even one buffer overflow is too many" Venn?

C++ doesn't segfault more than either of those after it's cleared clang-tidy and ASAN. Python linking shoddy native stuff crashes way more.

socalgal2•49m ago
I think it's like 50/50. There have been proposals for various things like --perfectly-safe-subset-only (making that up). But there's so much to fix and so much arguing about whether it should actually be fixed or not.

I would personally like a safe (and fast) subset that doesn't require me to be vigilant but catches everything I could do wrong to the same level as rust. Then, like rust, you could remove that flag for a few low-level parts that for some reason need to be "unsafe" (maybe because they call into the OS).

There was a good talk from the WebKit team about stuff they did to get more safety.

https://www.youtube.com/watch?v=RLw13wLM5Ko

Some of it was AST-level checks. IIRC, they have a pre-commit check that there is no pointer math being used. They went over how to change code with pointer math into safe code with zero change in performance.

A similar one was Ref usage checking, where they could effectively see that a ref-counted object was being passed as a raw pointer to a function that might free the ref and then still be used in the calling function. They could detect that with an AST-based checker.

That said, I have no idea how they (the C++ committee) are going to fix all the issues. --no-undefined-behavior would be a start. Can they get rid of the perf bombs with std::move? Why do I have to remember that shit?

almostgotcaught•28m ago
> I've used C++ for so long and I'm a good way into thinking that the language is just over

"no one goes there anymore it's too crowded"

dwattttt•3h ago
> with the growing focus on safety, you can rest assured that an implementation that wouldn’t diagnose erroneous behaviour would be soon out of the game.

Unless they were incumbent and inertia keeps them in. Or they're the only choice you have for a niche target. Or you have some other reason to keep them, such as (thinking?) the performance they bring is more important.

webdevver•2h ago
C++26, C++29, C++32, ... C++50?

Surely all good things come to an end, but where? I reckon there will be a C++29. What about C++38? C++43 sounds terrifying. Mid-century C++? There is no way in hell I will still be staying up to date with C++43. Personally, I've already cut the cord at C++11.

aw1621107•1h ago
> Surely all good things come to an end, but where?

As long as there are people willing to put in the work to convince the standards committee that the proposals they champion are worth adding to C++ (and as long as there is a committee to convince, I suppose), then new versions of C++ will continue to be released.

> there is no way in hell i will still be staying up to date with C++43. Personally I've already cut the cord at C++11.

Sure, different developers will find different features compelling and so will be comfortable living with different standards. That one group is fine with existing features shouldn't automatically prevent another from continuing to improve the language if they can gain consensus, though.