
Is C++26 getting destructive move semantics?

https://stackoverflow.com/questions/79817124/is-c26-getting-destructive-move-semantics
28•signa11•2mo ago

Comments

andrewmcwatters•2mo ago
That commenter’s syntax notation looked pretty neat and intuitive.
staplung•2mo ago
Sounds like the answer is no.

"For trivial relocatability, we found a showstopper bug that the group decided could not be fixed in time for C++26, so the strong consensus was to remove this feature from C++26."

[https://herbsutter.com/2025/11/10/trip-report-november-2025-...]

ogoffart•2mo ago
I’m curious what that showstopper bug actually was.

I was really looking forward to this feature, as it would've helped improve Rust <-> C++ interoperability.

pwdisswordfishy•2mo ago
https://quuxplusone.github.io/blog/2025/11/18/kona-trip-repo... goes into more detail.
anordal•2mo ago
Thanks for the link!
CosmicZombie•2mo ago
After learning Rust, C++ just seems unnecessarily complicated.

I'm not even going to try to understand this.

Using C++ these days feels like continuing to use the earth-centric solar system model to explain the motion of planets vs a much simpler sun-centric model: unnecessarily over-complicated.

drnick1•2mo ago
Rust just doesn't work for a lot of applications. Things like GUI toolkits, web browsers, and game engines are a pain to write without true OOP. Yes, it's "overly complicated" at this point after about 40 years of development, but it's still in the top 3 of the TIOBE index after all these years for a reason.
ogoffart•2mo ago
Of course Rust can handle those use cases fine (GUIs, web browsers, and game engines).

C++ is still high on the TIOBE index mainly because it is indeed old and used in a lot of legacy systems. For new projects, though, there's less reason to choose C++.

drnick1•2mo ago
It's still high because it solves real world problems, so it's still the gold standard for anything ranging from systems programming to scientific computing.
Calavar•2mo ago
Web browsers, yes. With GUIs and games, it's less clear. Of course you can write GUIs and games in any Turing-complete language, but there's still a lot of work to be done in finding the right ergonomics in Rust [1, 2].

[1] https://www.warp.dev/blog/why-is-building-a-ui-in-rust-so-ha...

[2] https://loglog.games/blog/leaving-rust-gamedev/

koakuma-chan•2mo ago
You are saying "With GUIs and games" as if there is any GUI framework or game engine that doesn't suck.
steveklabnik•2mo ago
The TIOBE index does not measure anything useful.

I’d say this if Rust were at the #1 spot too.

vlovich123•2mo ago
I think you’ve misidentified the reason this stuff is harder in Rust and it has nothing to do with “true OOP” if by that you mean class based inheritance. The primary challenge is mapping how GUIs are traditionally mutated onto Rust semantics. Even then, efforts like Slint show it’s eminently feasible so I’m not sure your argument holds.

It’s important to remember that C++ has a 30-year head start on Rust, especially at a crucial growth phase of computing. That’s why it tops the TIOBE index. But I fully expect it to go the way of COBOL sooner rather than later, where most new development does not use C++.

Profan•2mo ago
If anything, OOP might actually be detrimental to many game engine applications, given the data layouts and implementation patterns it encourages in a modern computing context. And traits and "traditional" OOP (if you exclude implementation inheritance, which is largely cursed anyway) are really close together anyway. I think Rust is a great fit specifically for game engines at least. For gameplay programming I'm not as certain, but for anything where you're mostly managing what are essentially data pipelines that need to go very fast, be reliable, and not crash, Rust is a great fit.
Pet_Ant•2mo ago
I feel like C++ is the counter-argument to backwards compatibility. Even Java is loosening its obsession with it.

Sometimes you just need to move forward.

Python 3 should be studied for why it didn't work as opposed to a lesson not to do it again.

0xcafefood•2mo ago
> Python 3 should be studied for why it didn't work as opposed to a lesson not to do it again.

I'm curious about this in particular. It seems like the Python 2 to 3 transition is a case study in why backwards compatibility is important. Why would you say the lesson isn't necessarily that we should never break backwards compatibility? It seems like it almost could've jeopardized Python's long-term future. From my perspective it held on just long enough to catch a tail wind from the ML boom starting in the mid 2010s.

Pet_Ant•2mo ago
Because you end up accumulating so much cruft and complexity that the language starts to fold under its own weight.

Often you hear the advice that when using C++ your team should restrict itself to a handful of features. ...and then it turns out that the next library you need requires one of the features you had blacklisted, and now it's part of your codebase.

However, if you don't grow and evolve your language you will be overtaken by languages that do. Your community becomes moribund, and more and more code gets written that pays for the earlier mistakes. So instead of splitting your community, it just starts to atrophy.

Python 2 to 3 was a disaster, so it needs to be studied. I think the lesson was that they waited too long for breaking changes. Perhaps you should never go too many releases without breaking something so the sedentary expectation never forms and people are regularly upgrading. It's what web browsers do. Originally people were saying "it takes us 6 months to validate a web browser for our internal webapps, we can't do that!" ...but they managed and now you don't even know when you upgrade.

steveklabnik•2mo ago
Part of the issue is that you can't just generalize it to "breaking changes." Ruby, very similar to Python, underwent a similar break between Ruby 1.8 and Ruby 1.9, but didn't befall the same fate.

The specifics really matter in this kind of analysis.

pjmlp•2mo ago
C++ has already broken backwards compatibility a few times, and the way GCC changed their std::string implementation semantics is one of the reasons why nowadays features that require ABI breaks tend to be ignored by compiler vendors.

Backwards compatibility is C++'s greatest asset; even so, I already took part in a few rewrites away from C++, exactly because the performance of compiled managed languages was good enough, and the whole thing was getting rebooted anyway.

swatcoder•2mo ago
Like C always will, C++ mostly still echoes the diversity of hardware architectures and runtime environments out there. That diversity is still broader than many people who don't work in weird specialties and fringes realize, and it's useful to have a "modern" language that respects the odd quirks of those systems and how you sometimes need to leverage them to meet requirements.

If you're just writing application software for consumers or professionals, or a network service, and it's destined to run on one of the big three families of operating systems using the one of the big few established hardware architectures at that scale, there are definitely alternatives that can make your business logic and sometimes even your key algorithms simpler or clearer, or your code more resistant to certain classes of error.

If you look at Rust and see "this does everything I could imagine doing, and more simply than C++", there's nothing wrong with that, because you're probably right for yourself. But there are other projects out there that other people work on, or can expect to find themselves working on someday, that still befit C++, and it's nice for the language to keep maturing and modernizing for their sake, while maintaining its respect for all the underlying weirdness they have to navigate.

creata•2mo ago
> If you're just writing application software... there are definitely alternatives...

Tangentially, is there a good alternative to Qt or SDL in Rust yet?

pjmlp•2mo ago
Slint, done by ex-Qt employees.
bryanlarsen•2mo ago
IMO, even better would just be good Qt bindings that take advantage of the benefits of Rust. I haven't checked. The GNOME bindings are pretty good, but the abstraction does leak through.
duped•2mo ago
> C++ mostly still echoes the diversity of hardware architectures and runtime environments out there

It doesn't though, or at least none of those echoes are why C++ is complex. Here are some examples of unnecessary complexity.

The rules of 3/5 exist solely due to copy/move/assign semantics. These rules do not need to exist if the semantics were simpler.

Programmers need to be aware of expression value categories (lvalue, rvalue, etc). These concepts that most languages keep as internal details of their IRs are leaked to error messages, because of the complex semantics of expression evaluation.

SFINAE is a subtle rule of template expansion that is exploited for compile time introspection due to the fact the latter is just missing, despite the clear desire for it.

The C++ memory model for atomics is a particular source of confusion and incorrectness in concurrent programs, because it decomposes a fairly simple problem domain into an (arguably too small) set of orderings that are easy to misuse, and that create surprisingly complex emergent behaviors when misused, which become hard to debug.

These are problems with the language's design and have nothing to do with the hardware and systems it targets.

The thing that bugs me about this topic is that C++ developers have a kind of Stockholm syndrome for their terrible tools and devex. I see people routinely struggle with things that other languages simply don't have (including C and Rust!) because C++ seems committed to playing on hard mode. It's so bad that every C++ code base I've worked on professionally is essentially its own dialect and ecosystem with zero cross pollination (except one of abseil/boost/folly).

There is so much complexity in there that creates no value. Good features and libraries die in the womb because of it.

pjmlp•2mo ago
SFINAE in 2025 is only reasonable in existing old code, or people stuck in old compilers.

Since C++17 there are better options.

Despite all its warts, most C++ wannabe replacements depend on compiler tools written in C++, and this isn't going to change in the foreseeable future, judging by the roughly two decades it took to replace C with C++ in compiler development circles, even though there is some compatibility.

psyclobe•2mo ago
I think Rust is unnecessarily safety focused.

Opinions are cool.

dehrmann•2mo ago
But that's its thing. C++ with improved syntax (think Kotlin for C++) isn't really enough to gain popularity. It's like how Dvorak is sorta better than QWERTY, but it doesn't offer anything new, and it isn't so much better that it justifies migrating.
pwdisswordfishy•2mo ago
No need to imagine hypotheticals – this exists: https://github.com/hsutter/cppfront
TuxSH•2mo ago
> Unnecessarily over-complicated.

Most of the complexity comes from the fact that C++ trivially supports consuming most C code, but with its own lifetime model on top, and that it also offers great flexibility.

Of course things become simpler when you ditch C source compat and can just declare "this variable will not be aliased by anyone else".

AFAIK C++'s constexpr and TMP are less limited than Rust's.

pjmlp•2mo ago
Also, with all its warts, I find it easier to stay within C++ itself, instead of two macro systems that additionally depend on an external crate.
wrathofmonads•2mo ago
The real barrier is the C++ ecosystem. It represents the cost of losing decades of optimized, highly integrated, high-performance libraries. C++ maps almost perfectly to the hardware with minimal overhead, and it remains at the forefront of the AI revolution. It is the true engine behind Python scientific libs and even Julia (ex. EnzymeAD). Rust does not offer advantages that would meaningfully improve how we approach HPC. Once you layer the necessary unsafe operations, C++ code in practice becomes mostly functional and immutable, and lifetimes matter less beyond a certain threshold when building complex HPC simulations. Or even outsourced to a scripting layer with Python.
amluto•2mo ago
> C++ maps almost perfectly to the hardware with minimal overhead

Barely.

The C++ aliasing rules map quite poorly into hardware. C++ barely helps at all with writing correct multithreaded code, and almost all non-tiny machines have multiple CPUs. C++ cannot cleanly express the kinds of floating point semantics that are associative, and SIMD optimizations care about this. C++ exceptions have huge overhead when actually thrown.

wrathofmonads•2mo ago
In simulations or in game dev, the practice is to use an SoA data layout to avoid aliasing entirely. Job systems or actors are used for handling multithreading. In machine learning, most parallelism is achieved through GPU offloading or CPU intrinsics. I agree in principle with everything you’re saying, but that doesn’t mean the ecosystem isn’t creative when it comes to working around these hiccups.
jcalvinowens•2mo ago
> The C++ aliasing rules map quite poorly into hardware.

But how much does aliasing matter on modern hardware? I know you're aware of Linus' position on this, I personally find it very compelling :)

As a silly little test a few months ago, I built whole Linux systems with -fno-strict-aliasing in CFLAGS, everything I've tried on it is within 1% of the original performance.

amluto•2mo ago
Even with strict aliasing, C and C++ often have to assume aliasing when none exists.
jcalvinowens•2mo ago
If they somehow magically didn't, how much could be gained?

I've never seen an attempt to answer that question. Maybe it's unanswerable in practice. But the examples of aliasing optimizations always seem to be eliminating a load, which in my experience is not an especially impactful thing in the average userspace widget written in C++.

The closest example of a more sophisticated aliasing optimization I've seen is example 18 in this paper: https://dl.acm.org/doi/pdf/10.1145/3735592

...but that specific example with a pointer passed to a function seems analogous to what is possible with 'restrict' in C. Maybe I misunderstood it.

This is an interesting viewpoint, but is unfortunately light on details: https://lobste.rs/s/yubalv/pointers_are_complicated_ii_we_ne...

Don't get me wrong, I'm not saying aliasing is a big conspiracy :) But it seems to have one of the higher hype-to-reality disconnects for compiler optimizations, in my limited experience.

Measter•2mo ago
Back in 2015 when the Rust project first had to disable use of LLVM's `noalias` they found that performance dropped by up to 5% (depending on the program). The big caveat here is that it was miscompiling, so some of that apparent performance could have been incorrect.

Of course, that was also 10 years ago, so things may be different now. There has presumably been interest from the Rust project in improving the optimisations `noalias` enables, as well as work in Clang to improve optimisations under C and C++'s aliasing model.

jcalvinowens•2mo ago
Thanks! I've heard a lot of anecdotes like this, but I've never found anyone presenting anything I can reproduce myself.
steveklabnik•2mo ago
Strict aliasing is not the only kind of aliasing.
jcalvinowens•2mo ago
Yes, that's why I described it as "silly" :)

Is there a better way to test the contribution of aliasing optimizations? Obviously the compiler could be patched, but that sort of invalidates the test because you'd have to assume I didn't screw up patching it somehow.

What I'm specifically interested in is how much more or less of a difference the class of optimizations makes on different calibers of hardware.

steveklabnik•2mo ago
Well, the issue is that "aliasing optimizations" means different things in different languages, because what you can and cannot do is semantically different. The argument against strict aliasing in C is that you give up a lot and don't get much, but that doesn't apply to Rust, which has a different model and uses these optimizations much more.

For Rust, you'd have to patch the compiler, as they don't generally provide options to tweak this sort of thing. For both Rust and C this should be pretty easy to patch, as you'd just disable the production of the noalias attribute when going to LLVM; gcc instead of clang may be harder, I don't know how things work over there.

jcalvinowens•2mo ago
Thanks!
jawilson2•2mo ago
> C++ exceptions have huge overhead when actually thrown

Which is why exceptions should never really be used for control flow. In our code, an exception basically means "the program is closing imminently, you should probably clean up and leave things in a sensible state if needed."

Agree with everything else mostly. C/C++ being a "thin layer on top of hardware" was sort of true 20? 30? years ago.

pjmlp•2mo ago
CUDA hardware is now specifically designed around the C++ memory model.

It wasn't initially, and then NVIDIA went through a multi-year effort to redesign the hardware.

If you're curious, there are two CppCon talks on the matter.

colonwqbang•2mo ago
Your post could be (uncharitably) paraphrased as: "once you have written correct C++ code, the drawbacks of C++ are not relevant". That is true, and the same is true of C. But it's not really a counterargument to Rust. It doesn't much help those of us who have to deliver that correct code in the first place.
feelamee•2mo ago
I think Rust has a lot of slippery points too, and their number will grow with the years. So yes, Rust made a lot of good choices over C++, but this does not mean it has no problems of its own. Therefore I can't say Rust is simple in this sense. But of course, it is a good evolutionary step.
amluto•2mo ago
IMO destructive move is rather tangled up with another language feature: the ability to usefully have a variable of type T where all values must be actual values that meet the constraints of a T.

Suppose T is a file handle or an owning pointer (like unique_ptr), and you want to say:

    T my_thing = [whatever]
and you want a guarantee that T has no null value and therefore my_thing is valid so long as it’s in scope.

In C++, if you are allowed to say:

    consume(std::move(my_thing));
then my_thing is in scope but invalid. But at least C++ could plausibly introduce a new style of object that is neither copyable nor non-destructively-movable but is destructively movable.

Interestingly, Go is kind of all in on the opposite approach. Every instance of:

   my_thing, err := [whatever]
creates an in-scope variable that might be invalid, and I can’t really imagine Go moving in a direction where either this pattern is considered deprecated or where it might be statically invalid to use my_thing.

I actually can imagine Python moving in a direction where, if you don’t at least try to prove to a type checker that you checked err first, you are not allowed to access my_thing. After all, you can already do:

    my_thing: T | None = [whatever]
and it’s not too much of a stretch to imagine similar technology that can infer that, if err is None, then T is not None. Combining this in a rigorous way with Python’s idea that everything is mutable if you try hard enough might be challenging, but I think that rigor is treated as optional in Python’s type checking.