I wonder why nobody configures this. Isn't it something they can configure themselves to a more relevant image, like the GCC logo or something?
But open source generally isn't treated as a product. It's just a bunch of volunteers having fun writing code. It's natural that they will include their other interests in it in some way, because it makes working on a project more fun. First impressions matter a lot, but I don't think FOSS projects should optimize for that instead of having fun.
How do you think Anubis should dress?
I think the fact that people bring up things that the Anubis mascot isn't when talking about Anubis is more telling of their own harmful (and potentially racist) biases against Japanese-styled media than it is about the idea of having anime-styled mascots for free software projects.
That said, more on topic, I am really glad that GCC actually considers the implications of switching default targets and only does this every 5 years. That's a decent amount of time, and longer than most distros' release cycles.
When a language changes significantly faster than release cycles (i.e., rustc being a different compiler every 3 months), it means that distros cannot self-host if they use Rust code in their software. E.g., with APT now having Rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile APT.
It's the type of dogfooding they should be doing! It's one reason why people care so much about self-hosted compilers: it's a demonstration of the maturity of the language/compiler.
A good example is the C++11 standard's garbage collection! It was explicitly optional, but AFAIK no one implemented it.
So you can never be perfectly bleeding edge, as that would keep you from being able to build your compiler with an older compiler that doesn't support those bleeding-edge features.
Imagine, for example, that you are Debian and you want to prep for the next stable version. It's reasonable that for the next release you'd bootstrap with the prior release's toolchain. That allows you to have a stable starting point.
I've worked on a number of pretty large projects. If the target for the source code changes, it can be really hard to keep C++20 features from creeping in. It means that you either need to explicitly build targeting C++11, or whoever does code reviews needs encyclopedic knowledge of whether or not a change leaked in a feature from a future standard.
It is "doable" but why would you do it when you can simply keep the compiler targeting 11 and let it do the code review for you.
It doesn't appear to me that the parent comment was implying otherwise.
The default is changing for any compilation that doesn't explicitly specify a standard version. I would have thought that the build process for a compiler is likely careful enough that it does explicitly specify a version.
I could be misreading this, but unless they have a different understanding of what dogfooding means than I do, it seems like the proposal is to use C++20 features in the compiler bootstrapping.
The email mentions that the last time they changed it was 5 years ago in GCC 11, and the link <https://gcc.gnu.org/projects/cxx-status.html#cxx17> indeed says
> C++17 mode is the default since GCC 11; it can be explicitly selected with the -std=c++17 command-line flag, or -std=gnu++17 to enable GNU extensions as well.
which does not imply a change in an obscure feature (bootstrapping) that would only affect a few users.
Even if you only target C++11, there may be advantages to setting a newer version anyway. Sometimes the standard finally allows some optimization that would work, or disallows something that was always error-prone anyway. I would recommend you set your standard to the latest the compiler supports and fix any bugs. Solve your "we have to support older standards" problem by having your CI system build with an older compiler (and also the newest one). C++ is very good at compatibility, so this will rarely be a problem.
For example, in CMake the natural variable is CMAKE_CXX_STANDARD, but it's implemented backwards: if you set it to 14 but your compiler supports only C++11, CMake will add -std=gnu++11. You also have to set CMAKE_CXX_STANDARD_REQUIRED to ON, which not many projects do. I don't think there's an easy way to say "this project requires C++14 or higher".
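A minimal sketch of the workaround (project name hypothetical): without the REQUIRED knob, CMake silently decays the standard to whatever the compiler supports.

    cmake_minimum_required(VERSION 3.10)
    project(example CXX)

    set(CMAKE_CXX_STANDARD 14)          # "prefer C++14"...
    set(CMAKE_CXX_STANDARD_REQUIRED ON) # ...and fail instead of silently
                                        # falling back to -std=gnu++11
    set(CMAKE_CXX_EXTENSIONS OFF)       # -std=c++14 rather than -std=gnu++14

    add_executable(example main.cpp)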
A compiler is perfectly capable of compiling programs which use features that its own source does not.
So why was it posted as a reply, and why label it a counterpoint?
The prior post seemed to be claiming that this required any form of a bootstrapping process, when it does not.
Building your compiler in another language doesn't help at all. In fact, it just makes it worse. Dogfooding C++20 in your compiler that isn't even built in C++ is obviously impossible.
My original point is that you can write a compiler for any language in any language.
> This particular compiler does require bootstrapping, and that's obviously what "the compiler" is referring to in that comment.
You have to pick an option: either it requires bootstrapping, or it doesn’t.
As it’s possible to write the C++20 compiler features in C++11 (or whatever GCC or Clang are written in these days), it factually does not require bootstrapping.
> So you can never be perfectly bleeding edge as it'd keep you from being able to build your compiler with an older compiler that doesn't support those bleeding edge features.
…as though building the new version of the compiler depended on the features it’s implementing already existing. This is clearly not the case.
The person you responded to answered the question posed by the person that they responded to. And they answered it correctly. Your "counterpoints" are counterpoints to an imaginary argument/claim that no one has actually made. The reason why it's not part of the quote that you pulled out of the other comment is that there's no way to quote the other person saying what you're trying to frame them as having said, because it's not what they were saying. This entire subthread is the result of an unnecessary attempt at a correction that doesn't manage to correct anyone about anything.
Which, as you say, is clearly not the case.
I have no idea how you managed to misread the comment so badly, but there we are.
A perfectly fine observation on its own—but it's not on its own. It's situated in a conversational context. And the observation is in no way a counterpoint to the person you posted your ostensible reply to.
Aside from that, you keep saying "bootstrapping", as in whether or not this or that compiler implementation strategy "requires bootstrapping". But writing a compiler in a different source language than the target language it's intended to compile, and using that to build the final compiler, doesn't eliminate bootstrapping. The compiler in that other language is just part of the bootstrapping process.
C++ standards support and why C++23 and C++26 are not the default: https://gcc.gnu.org/projects/cxx-status.html
They are discussing in this email thread whether it is already properly supported.
> It's one reason why people care so much about self-hosted compilers
For self-hosting and bootstrapping, you want the compiler to be compilable with as old a version as possible.
Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.
[1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not (see the sketch below). Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.
[2] Something where C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, whereas C really doesn't (new keywords are added in an underscored namespace, and you have to use new headers to expose them with the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.
[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.
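Footnote [1]'s keyword point as a minimal sketch (GCC invocations shown; file name is mine):

    // keywords.cpp -- valid C++11, rejected by C++20
    //
    //   g++ -std=c++11 -c keywords.cpp   // OK: ordinary identifiers
    //   g++ -std=c++20 -c keywords.cpp   // error: both are keywords now
    int requires = 0;
    int consteval = 1;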
Like, think about it: if you think the defaults should be good for greenfield projects, then greenfield projects won't be using the correct flags (because if they are, then the whole argument is specious anyway). And when C++35 shows up, they're going to be broken and we'll have this argument again.
Compatibility is hard. But IMHO C++ and gcc are doing this wrong and C is doing it much better.
Please don't spread misinformation. Breaking changes are actually almost nonexistent in C++. The last one was the COW std::string and std::list change ~15 years ago, with the big and major switch from C++03 to C++11. And heck, even then GCC wouldn't let your code break, because it supported dual ABIs: you could mix C++03 and C++11 code and link them together.
So C++ actually tries really hard _not_ to break your code, and that is the philosophy behind a language adhering to something called backwards compatibility, you know? Something many, such as Google, opposed and left the committee/language over. I thank the C++ language for that.
Introducing new features or new keywords, or making stricter implementations of existing ones (such as narrowing integral conversions), is not a breaking change.
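For anyone curious, the dual-ABI mechanism mentioned above is libstdc++'s _GLIBCXX_USE_CXX11_ABI macro; a minimal sketch (the function is illustrative):

    // Define before including any standard header. 1 (the default since
    // GCC 5) selects the C++11-conforming std::string/std::list; 0 keeps
    // the old COW ABI so new code can still link against old binaries.
    #define _GLIBCXX_USE_CXX11_ABI 0
    #include <string>

    std::string greet() { return "hello"; }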
This is some kind of semantic prestidigitation around a definition for "breaking" that I'm not following. Yes, obviously it is. New keywords were valid symbol names before they were keywords.
Makes me wonder if the "don't spread misinformation" quip was made in good faith.
That use case is probably less than 20% of all the C++ development cycles out there. This is a 40-year-old language that the bulk of the industry has decided to abandon for new work. The large majority of people doing work on this code are doing minimal-change updates, and nonsense like this is how you end up with rules like "We have to deploy on Ubuntu 20.04 still because the AbandonWare 4.7 library doesn't work on later versions".
And it's avoidable, but not if you run around lying to people and yourself about what a breaking change is. Again, look at how C does this. The C standard writers actually know that they're updating a legacy environment and care deeply about full backwards compatibility.
I love that C++ has a long enough time between changing targets to actually be useful, and that its culture is about stability and usefulness for users trying to compile things rather than just dev-side improvements über alles.
This is nonsense. APT devs can target a rustc release, and that release can be the same release that ships with Debian. Moreover, since those APT devs may have some say in the matter, they can choose to update the compiler in Debian!
> The entire language culture is built around this rapid improvement.
... Because this is a cultural argument about how some people really enjoy having their codebase be 6 years behind the latest language standard, not about any actual practical problem.
And I can understand how someone may not be eager to learn C++20's concepts or to add them immediately to a code base, but upgrades to your minimum Rust version don't really feel like that. It's much more like: "Wow, that's a nifty feature in the std lib that I immediately understand and would like to use. That's a great alternative to [much more complex thing]." See, for example, OnceLock, added in 1.70.0: https://doc.rust-lang.org/std/sync/struct.OnceLock.html
Warnings becoming errors would be scoped to gcc itself only, and they can fix them as part of the upgrade.
The issue with defaults is that people have projects that implicitly expect the default to be static.
So when the default changes, many projects break. This is maybe fine if it’s your own project but when it’s a few dependencies deep, it becomes more of an issue to fix.
It's not an end user problem, anyway. The issue is the language didn't change in a backwards compatible way and also didn't require setting a language version.
"Properly supported" is the key here. Does GCC currently properly support C++23, for example? When I checked a few months ago, it didn't.
We're starting to need caniuse.com for C++.
> cursing because the old program does not compile anymore

No.
Of course we were all ADHD pedantic nerds so take this with a grain of salt.
And this is ignoring the fact that none of GCC, clang, or MSVC have a remotely good implementation of modules that would be worth using for anything outside of a hobby project.
I agree with the other commenter who said modules are a failure of a feature, the only question left is whether the standards committee will learn from this mistake and refrain from ever standardizing a feature without a solid proof of concept and tangible use cases.
Office does not use C++ modules; what Office did was make use of a non-standard MSVC feature [1] which reinterprets #include preprocessor directives as header units. Absolutely no changes to source code are needed to make use of this compiler feature.
This is not the same as using C++20 modules which would require an absolutely astronomical amount of effort to do.
In the future, read more than just the headline of a blog post if you wish to actually understand a topic well enough to converse in it.
[1] https://learn.microsoft.com/en-us/cpp/build/reference/transl...
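To make the distinction concrete, a named module means restructuring sources like this (names illustrative), whereas header units leave existing #includes untouched:

    // math.cppm -- a C++20 named module; the interface must be rewritten
    // around 'export', unlike a header unit, which wraps an existing header.
    export module math;
    export int square(int x) { return x * x; }

    // main.cpp
    import math;
    int main() { return square(3); }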
Feel free to roam around on my GitHub account.
Also go read the C++ mailings regarding what is standard or not in modules.
The committee is full of very smart and talented people, no dispute about that, but it's also very siloed, where people just work on one particular niche or another based on their personal interests and then trade support with each other. In discussions it's almost never the case that features are added with any consideration for the broader C++ audience.