I wonder why nobody configures this. Isn't this something they could configure themselves to use a more relevant image, like the GCC logo or something?
How do you think Anubis should dress?
That said, more on topic, I am really glad that C++ actually considers the implications of switching default targets and only does this every 5 years. That's a decent amount of time, and longer than most distros' release cycles.
When a language changes significantly faster than release cycles (e.g., rustc being effectively a different compiler every 3 months), distros cannot self-host if they use Rust code in their software. For example, with Apt now having Rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt.
It's the type of dogfooding they should be doing! It's one reason people care so much about self-hosted compilers: it's a demonstration of the maturity of the language/compiler.
So you can never be perfectly bleeding edge: using bleeding-edge features would keep you from being able to build your compiler with an older compiler that doesn't support them.
Imagine, for example, that you are Debian and you want to prep for the next stable version. It's reasonable that for the next release you'd bootstrap with the prior release's toolset. That gives you a stable starting point.
I've worked on a number of pretty large projects. If the target for the source code changes, it can be really hard to keep C++20 features from creeping in. It means that you either need to explicitly build targeting C++11, or whoever does code reviews needs encyclopedic knowledge of whether a change leaked in a future feature.
It is "doable" but why would you do it when you can simply keep the compiler targeting 11 and let it do the code review for you.
It doesn't appear to me that the parent comment was implying otherwise.
The default is changing for any compilation that doesn't explicitly specify a standard version. I would have thought the build process for a compiler is careful enough to explicitly specify a version.
A compiler is perfectly capable of compiling programs which use features that its own source does not.
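For what it's worth, you can check which standard a given invocation actually selected (a quick sketch, not something from the thread): __cplusplus tracks the effective -std=, whether it was passed explicitly or came from the default.

    #include <iostream>

    // Prints the language-standard macro: 201103L for C++11, 201402L for
    // C++14, 201703L for C++17, and so on. Compiling without -std= shows
    // what the compiler's default actually is.
    int main() {
        std::cout << __cplusplus << '\n';
        return 0;
    }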
So why was it posted as a reply, and why label it a counterpoint?
C++ standards support and why C++23 and C++26 are not the default: https://gcc.gnu.org/projects/cxx-status.html
In this email thread they are discussing whether it is already properly supported.
> It's one reason why people care so much about self-hosted compilers
For self-hosting and bootstrapping you want the compiler to be compilable with as old a version as possible.
Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.
[1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not. Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.
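To make the keyword point concrete, a minimal sketch (my example): this is valid C++11 but rejected under C++20, where those identifiers are reserved.

    // Compiles with g++ -std=c++11, fails with g++ -std=c++20, where
    // "requires" and "consteval" are keywords rather than identifiers.
    int requires = 1;
    bool consteval = false;

    int main() { return requires + consteval; }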
[2] Something where the C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, whereas C really doesn't (new keywords are added in an underscored namespace, and you have to include new headers to expose them under the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.
[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.
I love that C++ leaves a long enough gap between changing default targets to actually be useful, and that its culture is about stability and usefulness for users trying to compile things rather than dev-side improvements above all else.