
C++ Modules Are Here to Stay

https://faresbakhit.github.io/e/cpp-modules/
78•faresahmed•2w ago

Comments

whobre•1w ago
> auto main() -> int {

Dude…

on_the_train•1w ago
It's been the go-to syntax for 15 years now
Night_Thastus•1w ago
Go-to? I've never seen a project use it, I've only ever seen examples online.
whobre•1w ago
Same here
on_the_train•1w ago
It's still been the standard since C++11, and I've been using it ever since in all the teams I've worked in.
cpburns2009•1w ago
Now I haven't touched C++ in probably 15 years but the definition of main() looks confused:

> auto main() -> int

Isn't that declaring the return type twice, once as auto and once as int?

yunnpp•1w ago
No. The auto there is doing some lifting so that you can declare the type afterwards. The return type is only defined once.

There is, however, a return type auto-deduction in recent standards iirc, which is especially useful for lambdas.

https://en.cppreference.com/w/cpp/language/auto.html

auto f() -> int; // OK: f returns int

auto g() { return 0.0; } // OK since C++14: g returns double

auto h(); // OK since C++14: h’s return type will be deduced when it is defined

maccard•1w ago
What about

auto g() -> auto { return 0.0; }

yunnpp•1w ago
0.0 is a double, so I would assume the return type of g is deduced to be double, if that is what you're asking.
maccard•1w ago
I was more pointing out that the syntax was dumb for that particular example!
maccard•1w ago
I really wish they had used func instead, it would have saved this confusion and allowed for “auto type deduction” to be a smaller more self contained feature
zabzonk•1w ago
The standard C++ committee is extremely resistant to introducing new keywords such as "func", so as not to break reams of existing code.
maccard•1w ago
Indeed. I am a frequent critic of the C++ committee's direction and decisions. There's no direction other than "new stuff", and that new stuff pretty much has to be in the library, otherwise it would require changes that may break existing code. That's fine.

But on the flip side, there's a theme of ignoring the actual state of the world to achieve the theoretical goals of a proposal when it suits. Modules are a perfect example of this: when I started programming professionally, modules were the solution to compile times and to symbol visibility. Now that they're here, they are neither. But we got modules anyway. The version that was standardised refused to accept the existence of the toolchains and build tools that exist, and as such refused to place any constraints that might make implementation viable or easier.

At the same time, we can't standardise #pragma once because some compiler may treat network shares or symlinks differently.

There's a clear indication that the committee doesn't want to address this; epochs are a solution that has been rejected. It's clear the only real plan is to shove awkward functional features into libraries using operator overloads, just like we all criticised Qt for doing 30 years ago. But at least it's standardised this time?

CamperBob2•1w ago
It's like calling a Ford Mustang Mach-E the "Model T++."
few•1w ago
And their code example doesn't actually return a value!
Davidbrcz•1w ago
For main it's explicitly allowed by the standard, and no return is equal to return 0
direwolf20•1w ago
which is super weird. If they can tell the compiler to allow no return, only for main, they can also tell it to pretend void return is int return of 0, only for main.
webdevver•1w ago
I was sincerely hoping I could get

    auto main(argc, argv) -> int
         int argc;
         char **argv;
to work, but alas it seems C++ threw pre-ANSI argument type declarations out.
zabzonk•1w ago
> c++ threw pre-ansi argument type declarations out

they never were in C++.

sethops1•1w ago
As someone who quit C++ over 15 years ago, it's been comical to watch what this language has become.
rovingeye•1w ago
This has been valid C++ since C++ 11
direwolf20•1w ago
It's unusual. Some (unusual) style guides require it. It's useful in some cases, even necessary in some, which is why it was introduced, but not for a simple "int".
rovingeye•1w ago
it's literally the exact same thing. We use trailing return types to be consistent across the language.
direwolf20•1w ago
We use trigraphs to be consistent across the language.
cocoto•1w ago
In my opinion this syntax is super good: it allows all function/method names to start at the same column, which makes code way easier to read; a huge readability improvement imo. Sadly nobody uses this, and you still have the classic way, so there are multiple ways to do the same thing…
vitaut•1w ago
This style is used in {fmt} and is great for documentation, especially on smaller screens: https://fmt.dev/12.0/api/#format_to_n
cmovq•1w ago
Can someone using modules chime in on whether they’ve seen build times improve?
nickelpro•1w ago
import std; is an order of magnitude faster than using the STL individually, if that's evidence enough for you. It's faster than #include <iostream> alone.

Chuanqi says "The data I have obtained from practice ranges from 25% to 45%, excluding the build time of third-party libraries, including the standard library."[1]

[1]: https://chuanqixu9.github.io/c++/2025/08/14/C++20-Modules.en...

luke5441•1w ago
Yeah, but now compare this to pre-compiled headers. Maybe we should be happy with getting a standard way to have pre-compiled std headers, but now my build has a "scanning" phase which takes up some time.
direwolf20•1w ago
Modules are a lot like precompiled headers, but done properly and not as a hack.
nickelpro•1w ago
The OP does this and measures ~1.2x improvement over PCH.
vitaut•1w ago
We did see build time improvements from deploying modules at Meta.
feelamee•1w ago
why use modules if PCH on your diagram is not much worse in compile times?
nickelpro•1w ago
Macro hygiene, static initialization ordering, control over symbol export (no more detail namespaces), slightly higher ceiling for compile-time and optimization performance.

If these aren't compelling, there's no real reason.

feelamee•5h ago
We have lived with that for *decades*. For me this is not a daily problem. So yes, this is not compelling, unfortunately.
bluGill•1w ago
Modules are the future, and the rules for them are well thought out. Every compiler has its own version of PCH, and they all work differently in annoying ways.
Maxatar•1w ago
Modules are the future... and will always be the future.
feelamee•5h ago
> Every compiler has its own version of PCH, and they all work differently in annoying ways.

I don't care, because I use CMake.

WalterBright•1w ago
Having implemented PCH for C and C++, it is an uuugly hack, which is why D has modules instead.
reactjs_•1w ago
Here’s the thing I don’t get about module partitions: They only seem to allow one level of encapsulation.

    Program
    - Module
      - Module Partition
whereas in module systems that support module visibility, like Rust’s, you can decompose your program at multiple abstraction levels:

    Program
    - Private Module
      - Private Module
        - Private Module
        - Public Module
      - Public Module
Maybe I am missing something. It seems like you will have to rely on discipline and documentation to enforce clean code layering in C++.
pdpi•1w ago
Rust's re-exports also allow you to design your public module structure separate from your internal structure.
groby_b•1w ago
I don't think you're missing something. The standards committee made a bad call with "no submodules", ran into insurmountable problems, and doubled down on the bad call via partitions.

"Just one more level bro, I swear. One more".

I fully expect to sooner or later see a retcon on why really, two is the right number.

Yeah, I'm salty about this. "Submodules encourage dependency messes" is just trying to fix substandard engineering across many teams via enforcement of somewhat arbitrary rules. That has never worked in the history of programming. "The determined Real Programmer can write FORTRAN programs in any language" is still true.

bluGill•1w ago
The C++ committee tries to do features with room for future extension. They believe that whatever you want from sub-modules is still possible in the future - but better to have a small (as if modules are small) thing now than to try for perfection. We can argue about submodules once we have the easy cases working and hopefully better understand the actual limitations.
groby_b•1w ago
Not to put too fine a point on it: The world has 35 years of experience with submodules. It's not rocket science. The committee just did what committees do.

And sure, "future extension" is nice. But not if the future arrives at an absolutely glacial pace and is technically more like the past.

This may be inevitable given the wide spread of the language, but it's also what's dooming the language to be the next COBOL. (On the upside, that means C++ folks can write themselves a yacht in retirement ;)

bluGill•1w ago
That is 35 years of different things tried, some that work better than others, some that are not compatible with others. Trying to figure out what is the best compromise while also making something that doesn't break existing code is hard when there are a lot of people who care.
pornel•1w ago
Just getting to this barely-working state took C++ longer than it took to create all of Rust, including a redesign of Rust's own module system.
pklausler•1w ago
FWIW, Fortran does have submodules.
groby_b•1w ago
I suppose we shall amend to "The determined Real Programmer will fix FORTRAN" ;)

But, for the folks who didn't grow up with the Real Programmer jokes, this is rooted in a context of FORTRAN 77. Which was, uh, not famous for its readability or modularity. (But got stuff done, so there's that)

pklausler•1w ago
I'm so old, those jokes were about me.
groby_b•1w ago
Fogeys unite! ;) (They're about me, too)
zabzonk•1w ago
I wrote a lot of F77 code way back when, including an 8080 simulator similar to that written by Gates and Allen used to build their BASIC for Altair. I don't know what language they wrote theirs in, but mine was pretty readable, just a bit late. And it was very portable - Dec10, VAX, IBM VM/CMS with almost no changes.

I think F77 was a pretty well designed language, given the legacy stuff it had to support.

groby_b•1w ago
It was well designed. Hence the "it got stuff done".

But it was also behind the times. And, if we're fair, half of its reputation comes from the fact that half of the F77 code was written by PhDs, who usually have... let's call it a unique style of writing software.

zabzonk•1w ago
Indeed. Two PhD students came to see me when the polytechnic I worked for switched from a Dec10 to two IBM 4381s.

[them] How can we get our code to work on the IBM?

[me] (examines code) This only looks vaguely like Fortran.

[them] Yes, we used all these wonderful extensions that Digital provides!

[me] (collapse on the floor laughing) (recover) Hmm. Go see Mike (our VAX systems programmer). You may be able to run on our VAXen, but I can't imagine it running on the IBMs without a major rewrite. Had they stuck to F77 there would have been few problems, and I could have helped with them.

Portability is always worth aiming for, even if you don't get all the way there.

fl0ki•1w ago
Fascinatingly, I am not aware of any real issues with how Rust did nested modules. It even treated crates as top-level modules for most language-level purposes. I am sure there are nuanced reasons that C++ can't do quite the same, but the developer experience consequences can't be worth it.
pjmlp•1w ago
Like most languages with modules.

Rust, Modula-2 and Ada are probably the only ones with module nesting.

IsTom•1w ago
Notably many languages in ML family have first class modules.
pjmlp•1w ago
Only Standard ML and OCaml, as far as I am aware.

However this is a different kind of modules, with them being present on the type system, and manipulated via functors.

w4rh4wk5•1w ago
https://arewemodulesyet.org/ gives you an overview which libraries already provide a module version.
srcreigh•1w ago
Wow, the way this data is presented is hilarious.

Log scale: Less than 3% done, but it looks like over 50%.

Estimated completion date: 10 March 2195

It would be less funny if they used an exponential model for the completion date to match the log scale.

w4rh4wk5•1w ago
Yeah, my personal opinion is that modules are dead on arrival, but I won't waste my time arguing with C++ enthusiasts on that.
mcdeltat•1w ago
Nah, I'm a C++ (ex?) enthusiast, and modules are cool, but there are only so many decades you can wait for a feature other languages have had from day one, and then another decade for compilers to actually implement it in a usable manner.
w4rh4wk5•1w ago
I am fine with waiting for a feature and using it when it's here. But at this point, I feel like C++ modules are a ton of complexity for users, tools, and compilers to wrangle... for what? Slightly faster compile times than PCH? Less preprocessor code in your C++.. maybe? Doesn't seem worth it to me in comparison.
ziml77•1w ago
I would think they don't want to hear that because of how badly they want modules to happen. Don't kill their hope!
Kelteseth•1w ago
Author here. Sadly, this had to be done, otherwise you would not see anything on the chart. I added an extra progress bar below, so that people would not get a wrong impression.
w4rh4wk5•1w ago
Hey, I really appreciate this site! Independent from my personal opinion on modules, I think it's extremely helpful to everyone to see the current state of development; and you do an excellent job reflecting that.
Kelteseth•1w ago
Thanks <3 Working on this project also made me realize that C++ needs something like crates.io. We are using vcpkg as a second-best proxy for C++ library usage, because it has more packages than sites like Conan. Also, adding support for things like the import statement list shows that there needs to be a naming convention, because now we have this wild mix:

- import boost.type_index;

- import macro-defined;

- import BS.thread_pool;

srcreigh•1w ago
Hey, sorry about that. I find your site very charming. Yeah it takes a few seconds to understand, but that's completely fine imo.

You are excused if the site misleads anybody, just because you published "Estimated completion date: 2195". That's just so awesome. Kudos.

TimorousBestie•1w ago
I can’t deploy C++ modules to any of the hardware I use in the shop. Probably won’t change in the near-to-mid future.

It seems likely I’ll have to move away from C++, or perhaps more accurately it’s moving away from me.

bluGill•1w ago
If your tools are not updated, that isn't the fault of C++. You will feel the same about Rust when forced to use a 15-year-old version too (as I write this, Rust 1.0 is only 10 years old). Don't whine to me about these problems; whine to your vendors until they give you the new stuff.
Joker_vD•1w ago
> whine to your vendors until they give you the new stuff.

How well does this usually work, by the way?

krior•1w ago
Nobody is "whining" to you. Nobody is mentioning rust. Your tone is way too sharp for this discussion.
TimorousBestie•1w ago
If C++ libraries eschew backward compatibility to chase after build time improvements, that’s their design decision. I’ll see an even greater build time improvement than they do (because I won’t be able to build their code at all).
juliangmp•1w ago
My experience with vendor toolchains is that they generally suck anyway. In a recent bare-metal project I chose not to use the vendor's IDE and toolchain (which is just an old version of GCC with some questionable CMake scripts around it) and instead just cross-compile with Rust manually. And so far it's been a really good decision.
TimorousBestie•1w ago
Yep, this aligns with my experience. I’ve yet to take the plunge into cross compiling with rust though, might have to try that.
juliangmp•1w ago
It's been a comfortable journey for me. There's a support library for the arm cortex I'm using so it was very easy to get some LEDs to blink. Obviously we had to implement some drivers manually (UART for example) and there's lots of unsafe code, but overall the language makes a lot of things very nice on bare metal.
jcranmer•1w ago
> If you tools are not updated that isn't the fault of C++.

It kinda is. The C++ committee has been getting into a bad habit of dumping lots of not-entirely-working features into the standard and ignoring implementer feedback along the way. See https://wg21.link/p3962r0 for the incipient implementer revolt going on.

20k•1w ago
It's happening again with contracts. Implementers are raising implementability objections that are being completely ignored. Senders and receivers are claimed to work great on a GPU, but without significant testing (there's only one super-basic CUDA implementation), and even a basic examination shows that they won't work well.

So many features are starting to land which feel increasingly DoA; we seriously need a language fork.

direwolf20•1w ago
Please make one.
amluto•1w ago
Even some much simpler things are extremely half baked. For example, here’s one I encountered recently:

    alignas(16) char buf[128];
What type is buf? What alignment does that type have? What alignment does buf have? Does the standard even say that alignof(buf) is a valid expression? The answers barely make sense.

Given that this is the recommended replacement for aligned_storage, it’s kind of embarrassing that it works so poorly. My solution is to wrap it in a struct so that at least one aligned type is involved and so that static_assert can query it.

bluGill•1w ago
The only people who write code like that have plenty of time to understand those questions, and why the correct answer is what it is is critically important to that line of code working correctly. The vast majority of us would never write a line like that; we let the compiler care about those details. The vast majority of the time, 'just use vector' is the right answer, with essentially zero real-world exceptions.

But in the rare case you need code like that, be glad C++ has you covered.

amluto•1w ago
> and why the correct answer is what it is is critically important to that line of code working correctly.

> but in the rare case you need code like that be glad C++ has you covered

I strongly disagree. alignof(buf) works correctly but is a GCC extension. alignof(decltype(buf)) is 1, because alignas is a giant kludge instead of a reasonable feature. C++ only barely has me covered here.

crote•1w ago
When one of the main arguments people use to stick with C++ is that it "runs everywhere", it actually is the fault of C++. After all, what use is there for a C++ where the vast majority of the library ecosystem only works with a handful of major compilers? If compatibility with a broad legacy ecosystem isn't important, there are far more attractive languages these days!

Just like Python was to blame for the horrible 2-to-3 switch, C++ is to blame for the poor handling of modules. They shouldn't have pushed through a significant backwards-incompatible change if the wide variety of vendor toolchains wasn't willing to adopt it.

nxobject•1w ago
OP isn't whining to you; I'm not sure where you're getting this defensiveness from. They're simply observing the state of their toolchain and, likely, have better things to do with their 24 hours in their day.
maccard•1w ago
This is not an argument against modules. This is an argument against allowing areas that don't upgrade to hold modern C++ back.
direwolf20•1w ago
Nobody uses all features of C++.

But you might not be able to use libraries that insist upon modules. There won't be many until modules are widespread.

rienbdj•1w ago
From the outside looking in, this all feels like too little, too late. Big tech has decided on Rust for future infrastructure projects. C++ will get QoL improvements… one day, and the committee seems unable either to keep everyone happy or to disappoint any single stakeholder. C++ will be around forever, but will it be primarily legacy?
20k•1w ago
Yes. Unfortunately, the committee has completely abandoned safety at this point. Even the memory/thread safety profiles have been indefinitely postponed. The latest ghost safety lifetimes thing is completely unimplementable.

There literally isn't a plan or direction in place to add any way to compete with Rust in the safety space currently. They've got maybe until C++29 to standardise lifetimes, and then C++ will transition to a legacy language.

direwolf20•1w ago
Using containers and std::string for everything eliminates the majority of safety bugs.
pornel•1w ago
The safety bar is way way higher.

The C++ WG keeps looking down at C and the old C++ sins, sees their unsafety, and still thinks that's the problem to fix.

Rust looks the same way at modern C++. The std collections and smart pointers already existed before the Rust project was started. Modern C++ is the safety failure that motivated the creation of Rust.

AlotOfReading•1w ago
If only the standard differentiated between programs that are "mostly" free of UB and programs that aren't.
cataphract•1w ago
Not really. We keep getting pointer-like types like std::string_view and std::span that can outlive their referents.
Conscat•1d ago
You need `GSL` and `lifetimebound` to approach most modern safety bugs.
mathisfun123•1w ago
> Big tech has decided on Rust for future infrastructure projects. C++ will get QoL improvements…

When people say this, do they have any perspective? There are probably more C++ projects started in one week (in big tech) than Rust projects in a whole year. Case in point: at my FAANG we have probably O(10) Rust projects and hundreds of C++ projects.

ofrzeta•1w ago
> Big tech has decided on Rust for future infrastructure projects.

as they say "citation needed"

Night_Thastus•1w ago
The fact that precompiled headers are nearly as good for a much smaller investment tells you most of what you need to know, imo.
fooker•1w ago
C++ templates and metaprogramming are fundamentally incompatible with the idea of treating your code as modules.

The current solution chosen by compilers is to basically have a copy of your code for every dependency that wants to specialize something.

For template heavy code, this is a combinatorial explosion.

pjmlp•1w ago
It has worked perfectly fine while using VC++, minus the usual ICEs that still come up.
fooker•1w ago
It works perfectly when it comes to `import std` and making things a bit easier.

It does not work very well at all if your goal is to port your current large codebase to incrementally use modules to save on compile time and intermediate code size.

pjmlp•1w ago
Office has made a couple of talks about their modules migration, which is exactly that use case.
WalterBright•1w ago
D has best-in-class templates and metaprogramming, and modules. It works fine.
amluto•1w ago
I think that SFINAE and, to a lesser extent, concepts are fundamentally a bit odd when multiple translation units are involved, but otherwise I don't see the problem.

It’s regrettable that the question of whether a type meets the requirements to call some overload or to branch in a particular if constexpr expression, etc, can depend on what else is in scope.

direwolf20•1w ago
This is one of those wicked language design problems that comes up again and again across languages, and they solve it in different ways.

In Haskell, you can't ever check that a type doesn't implement a type class.

In Golang, a type can only implement an interface if the implementation is defined in the same module as the type.

In C++, in typical C++ style, it's the wild west and the compiler doesn't put guard rails on, and does what you would expect it to do if you think about how the compiler works, which probably isn't what you want.

I don't know what Rust does.

pornel•1w ago
Rust's generics are entirely type-based, not syntax-based. They must declare all the traits (concepts) they need. The type system has restrictions that prevent violating the ODR. It's very reliable, but some use cases that would be basic in C++ (numeric code) can be tedious to define.

Generic code is stored in libraries as MIR, which is halfway between AST and LLVM IR. It's still monomorphized and slow to optimize, but at least it doesn't pay the reparsing cost.

direwolf20•1w ago
How does it handle an implementation of a trait being in scope in one compilation unit and out of scope in another? That's the wicked problem.
amluto•1w ago
It’s impossible (?) due to the “coherence” rule. A type A can implement a trait B in two places: the crate where A is defined or the crate where B is defined. So if you can see A and B, you know definitely whether A implements B.

The actual rule is more complex due to generics:

https://github.com/rust-lang/rfcs/blob/master/text/2451-re-r...

and that document doesn’t actually seem to think that this particular property is critical.

fooker•1w ago
Rust gets around the shortcomings of its generics by providing an absurdly powerful macro engine.

It's a great idea when not abused too much for creating weird little DSLs that no one is able to read.

direwolf20•1w ago
The compiler is supposed to put the template IR into the compiled module file, isn't it?
fooker•1w ago
Exactly, that's no better than #including transitive dependencies to compile large translation units.
yunnpp•1w ago
I recently started a pet project using modules in MSVC, the compiler that at present has the best support for modules, and ran into a compiler bug where it didn't know how to compile my code and asked me to "change the code around this line".

So no, modules aren't even here, let alone here to stay.

Never mind using modules in an actual project when I could repro a bug so easily. The people preaching modules must not be using them seriously, or otherwise I simply do not understand what weed they are smoking. I would be happy to stand corrected, however.

senfiaj•1w ago
I still hope that modules become mature and safe for production code. Initially I coded in C/C++, and this header #include/#ifndef approach seemed OK at the time. But after using other programming languages, this approach started to feel boilerplate-heavy and archaic. No sane programming language should require duplication in order to export something (for example, the full function and its prototype); you should write something once and export it easily.
kccqzy•1w ago
> No sane programming language should require a duplication in order to export something (for example, the full function and its prototype)

You are spoiled by the explosive growth of open source and the ease of accessing source code. Lots of closed-source commercial libraries provide some .h files and a .so file. And even when open source, when you install a library from a distribution package or just a tarball, it usually installs some .h files and a .so file.

The separation between interface and implementation into separate files was a good idea. The idea seemed to be going out of vogue but it’s still a good idea.

senfiaj•1w ago
> Lots of closed source commercial libraries provide some .h files and a .so file.

I'm mostly talking about modules for internal implementation, which is likely to be the bulk of the exports. Yes, it's understandable that for .dll/.so files, exporting something for external executables is more complicated, also because of ABI compatibility concerns (we use things like extern "C"). So yes, the header approach might be justified in that case, but as I stated, such exports are probably a fraction of all exports (if they are needed at all). I'll still prefer modules whenever it's possible to avoid headers.

AgentME•1w ago
In most situations, auto-generating the equivalent of .h files for a library based on export statements in the source code would be fine and a useful simplification.
johannes1234321•1w ago
> The separation between interface and implementation into separate files was a good idea. The idea seemed to be going out of vogue but it’s still a good idea.

However, as soon as you do C++ that goes away. With C++ you need the implementation of templates available to the consumer (except in cases with a limited set of types, where you can extern them), and in many cases you get many small functions (basic operator implementations, begin()/end() for iterators in all variations, etc.) which benefit from inlining and thus need to be in the header.

Oh, and did I mention class declarations and the class size ... or more generally, even with plain C: as soon as the client needs to know the size of a type (to be able to allocate it, have an array of those, etc.) you can't provide the size by itself; you have to provide the full type declaration, with all its member types down the rabbit hole, until somewhere you introduce a pointer-to-opaque-type indirection.

And then there are macros ...

Modules attempt to do this better by providing just the interface in a file. But the C++ standard doesn't "know" about those files, so built module interface files aren't a portable thing ...
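The pointer-to-opaque-type indirection mentioned above can be sketched in C-style C++ (names are hypothetical; both files are inlined here so the sketch compiles on its own):

```cpp
// widget.h: the client sees only an incomplete type, so sizeof(Widget) is
// unknown and clients cannot allocate one directly; only pointers work
struct Widget;
Widget* widget_create();
int     widget_value(const Widget* w);
void    widget_destroy(Widget* w);

// widget.c: the full definition lives here; its layout can change
// without recompiling clients, since they only ever hold a pointer
struct Widget { int value; };
Widget* widget_create()               { return new Widget{42}; }
int     widget_value(const Widget* w) { return w->value; }
void    widget_destroy(Widget* w)     { delete w; }
```

The price of the indirection is that every access goes through a function call and the object must be heap-allocated by the library.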

nextaccountic•1w ago
The .h could have been a compiler output, like the .so
kccqzy•1w ago
The C language is too complicated and too flexible to allow that. If you are starting from scratch and creating a new language, this could be a design goal from the beginning.
quuxplusone•1w ago
> The C language is too complicated and too flexible to allow that.

I disagree. In fact, I would expect the following could be a pretty reasonable exercise in a book like "Software Tools"[1]: "Write a program to extract all the function declarations from a C header file that does not contain any macro-preprocessor directives." This requires writing a full C lexer; a parser for function declarations (but for function and struct bodies you can do simple brace-matching); and nothing else. To make this tool useful in production, you must either write a full C preprocessor, or else use a pipeline to compose your tool with `cpp` or `gcc -E`. Which is the better choice?

However, I do see that the actual "Software Tools" book doesn't get as advanced as lexing/parsing; it goes only as far as the tools we today call `grep` and `sed`.

I certainly agree that doing the same for C++ would require a full-blown compiler, because of context-dependent constructs like `decltype(arbitrary-expression)::x < y > (z)`; but there's nothing like that in K&R-era C, or even in C89.

No, I think the only reason such a declaration-extracting tool wasn't disseminated widely at the time (say, the mid-to-late 1970s) is that the cost-benefit ratio wouldn't have been seen as very rewarding. It would automate only half the task of writing a header file: the other and more difficult half is writing the accompanying code comments, which cannot be automated. Also, programmers of that era might be more likely to start with the header file (the interface and documentation), and proceed to the implementation only afterward.

[1] - K&P's "Software Tools" was originally published in 1976, with exercises in Ratfor. "Software Tools in Pascal" (1981) is here: https://archive.org/details/softwaretoolsinp00kern/

kccqzy•1w ago
> Write a program to extract all the function declarations from a C header file that does not contain any macro-preprocessor directives

There you go. You just threw away the most difficult part of the problem: the macros. Even a medium-sized C library can have maybe 500 lines of dense macros with ifdef/endif/define which depend on the platform, the CPU architecture, as well as user-configurable options at ./configure time. Should you evaluate the macro ifdefs or preserve them when you extract the header? It depends on each macro!

And your tool would still be highly incomplete because it only handles function declarations, not struct definitions or the typedefs you expect the users to use.

> the other and more difficult half is writing the accompanying code comments, which cannot be automated

Again disagree. Newer languages have taught us that it is valuable to have two syntaxes for comments, one intended for implementation and one intended for the interface. It’s more popularly known as docstrings but you can just reuse the comment syntax and differentiate between // and /// comments for example. The hypothetical extractor tool will work no differently from a documentation extractor tool.

Maxatar•1w ago
I interpreted OP's post to say that you take a C file after the preprocessor has translated it. How you perform that preprocessing can simply be by passing the file to an existing C preprocessor, or you can implement it as well.

Implementing a C preprocessor is tedious work, but it's nothing remotely complex in terms of challenging data structures, algorithms, or requiring sophisticated architecture. It's basically just ensuring your preprocessor implements all of the rules, each of which is pretty simple.

kccqzy•1w ago
And you had the same misunderstanding as OP. Because you have eliminated all macros during the preprocessor step, you can no longer have macro-based APIs, including function-like macros you expect library users to use, #ifdef blocks where you expect user code to either #define or #undef, and a primitive form of maintaining API compatibility but not ABI compatibility for many things.

It’s a cute learning project for a student of computer science for sure. It’s not remotely a useful software engineering tool.

quuxplusone•1w ago
Our points of view are probably not too far off, really. Remember this whole thought-experiment is counterfactual: we're imagining what "automatic extraction of function declarations from a .c file" would have looked like in the K&R era, in response to claims (from 50 years later) that "No sane programming language should require a duplication in order to export something" and "The .h could have been a compiler output." So we're both having to imagine the motivations and design requirements of a hypothetical programmer from the 1970s or 1980s.

I agree that the tool I sketched wouldn't let your .h file contain macros, nor C99 inline functions, nor is it clear how it would distinguish between structs whose definition must be "exported" (like sockaddr_t) and structs where a declaration suffices (like FILE). But:

- Does our hypothetical programmer care about those limitations? Maybe he doesn't write libraries that depend on exporting macros. He (counterfactually) wants this tool; maybe that indicates that his preferences and priorities are different from his contemporaries'.

- C++20 Modules also do not let you export macros. The "toy" tool we can build with 1970s technology happens to be the same in this respect as the C++20 tool we're emulating! A modern programmer might indeed say "That's not a useful software engineering tool, because macros" — but I presume they'd say the exact same thing about C++20 Modules. (And I wouldn't even disagree! I'm just saying that that particular objection does not distinguish this hypothetical 1970s .h-file-generator from the modern C++20 Modules facility.)

[EDIT: Or to put it better, maybe: Someone-not-you might say, "I love Modules! Why couldn't we have had it in the 1970s, by auto-generating .h files?" And my answer is, we could have. (Yes it couldn't have handled macros, but then neither can C++20 Modules.) So why didn't we get it in the 1970s? Not because it would have been physically difficult at all, but rather — I speculate — because for cultural reasons it wasn't wanted.]
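For comparison, a C++20 named module interface might look like the following sketch (the `.ixx` extension is MSVC convention; GCC and Clang accept others, and building it requires a modules-aware toolchain). The point about macros is visible directly: the macro is usable inside the module but is never exported to importers.

```cpp
// math.ixx: a named module interface unit (hypothetical names)
export module math;

#define INTERNAL_HELPER(x) ((x) + 1)   // macros are never exported by modules

export int add_one(int x) { return INTERNAL_HELPER(x); }
```

A consumer writes `import math;` and can call `add_one`, but `INTERNAL_HELPER` does not exist on the importing side.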

AnimalMuppet•1w ago
Not for things like public/private, static, virtual. Not for inheritance, either.
nextaccountic•1w ago
You add all those things in a single .c or .cpp source, without manual authoring of .h files (this may require language changes, maybe in C30 and C++30 or something)

Then whatever is relevant to the public interface gets generated by the compiler and put in a .h file. This file would not be put in the same directory as the .c file, so as not to encourage people to put the .h file in version control.

Maxatar•1w ago
I think everyone hopes/hoped for a sane and useful version of modules, one that would provide substantial improvements to compilation speed and make things like packaging libraries and dealing with dependencies a lot more sane.

The version of modules that got standardized is anything but that. It's an incredibly convoluted mess that requires an enormous amount of effort for little benefit.

senfiaj•1w ago
> It's an incredibly convoluted mess that requires an enormous amount of effort for little benefit.

I'd say C++ as a whole is a complete mess. While it's powerful (including OOP), it's a complicated and inconsistent language with a lot of historical baggage (40+ years). That's why people and companies still search for (or even already use) viable replacements for C++, such as Rust, Zig, etc.

malfmalf•1w ago
They are using modules in the MS Office team:

https://devblogs.microsoft.com/cppblog/integrating-c-header-...

Maxatar•1w ago
This is untrue. The MS Office team is using a non-standard MSVC compiler flag that turns standard #include into header units, which treats those header files in a way similar to precompiled header files. This requires no changes to source code, except for some corner cases they mention in that very blog post to work around some compiler quirks.

That is not the same as using modules, which they have not done.

starfreakclone•1w ago
There's nothing non-standard happening there. The compiler is allowed to translate #include -> import. Here's the standardese expressing that: https://eel.is/c%2B%2Bdraft/cpp.include#10.

I do agree, it's not _exactly_ the same as using _named modules_, but header units share an almost identical piece of machinery in the compiler as named modules. This makes the (future planned) transition to named modules a lot easier since we know the underlying machinery works.

The actual blocker for named modules is not MSVC, it's other compilers catching up--which clang and gcc are doing quite quickly!

bluGill•1w ago
Modules are still in the early adopter phase, despite 3 years. There are unfortunately bugs, and we still need people to write the "best practices for C++ modules" books. Everyone who has used them says overall that they are a good thing and worth learning, but there is a lot about using them well that we haven't figured out.
alextingle•1w ago
Best practice for C++ modules: avoid.

(Buy my book)

vitaut•1w ago
Modules have been working reasonably well in clang for a while now but MSVC support is indeed buggy.
throw_sepples•1w ago
I'm afraid things will continue very much sucking for a long time and will still be less-than even when they become broadly supported since sepples programmers, being real programmers™, are not entitled to have nice things.
up2isomorphism•1w ago
“C includes show its age.” But C++ is stagnating not because there is a “++” there, but because there is a “C”.
direwolf20•1w ago
Decades-old parts of the ++ also contribute.
fasterik•1w ago
I get by without modules or header files in my C++ projects by using the following guidelines:

- Single translation unit (main.cpp)

- Include all other cpp files in main

- Include files in dependency order (no forward declarations)

- No circular dependencies between files

- Each file has its own namespace (e.g. namespace draw in draw.cpp)

This works well for small to medium sized projects (on the order of 10k lines). I suspect it will scale to 100k-1M line projects as long as there is minimal use of features that kill compile times (e.g. templates).
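A self-contained sketch of that layout (in a real project each namespace would live in its own .cpp file, pulled into main.cpp with #include in dependency order; all names here are hypothetical):

```cpp
// main.cpp: the single translation unit
// In a real project these sections would be separate files:
//   #include "mathx.cpp"   // no dependencies, so it comes first
//   #include "draw.cpp"    // depends on mathx, so it comes second
// They are inlined below so the sketch compiles on its own.

namespace mathx {            // would live in mathx.cpp
    int square(int x) { return x * x; }
}

namespace draw {             // would live in draw.cpp
    int area(int side) { return mathx::square(side); }
}
```

Because everything is one translation unit, the compiler sees each definition exactly once, which is where the unity-build speedup comes from.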

indil•1w ago
I believe that's called a unity build. Really nice speedup.
zabzonk•1w ago
SQLite calls this an "amalgamation". It is easy and convenient for users (not developers) of SQLite code.

https://sqlite.org/amalgamation.html

zabzonk•1w ago
This might be OK for someone using your files (a bit like a header-only library) but not so great for team development.
w4rh4wk5•1w ago
You still organize the big file into sections to keep things together that are semantically related. For Git it mostly doesn't matter whether it's 100 small files or a single big one.
jokoon•1w ago
I am curious to know if that 8.6x speedup is consistent.

I don't see many "fair" benchmarks about this, but I guess it is probably difficult to properly benchmark module compilation, as it can depend on the case.

If modules can reach that sort of speedup consistently, it's obviously great news.