This isn't meant to make myself seem smart or to try and make you seem dumb, I'm just curious what was confusing about this even from a high-level perspective. It felt like a clever but not too atypical metaprogramming thing.
Maybe I've just done too much Clojure.
Smart pointers are a great example. shared_ptr has its issues, and it isn't the most performant choice in most cases, but it removes far more footguns than it introduces.
Compare that to something like std::variant in the C++17 standard, which comes with performance issues severe enough that it's rarely a good fit.
No way anything std::meta gets into serious production; too flexible in some ways, too inflexible in others, too much unpredictability, too high an impact on compilation times - just like always with newer additions to the C++ standard. One look at the coding standards of real-world projects shows how irrelevant this stuff is.
And like always, the problem std::meta is purported to solve has been solved for years.
> ... just like always with newer additions to the C++ standard.
This is objectively laughable.
Build tools that generate C++ code from some other source. Interface description languages, for example, or (going back decades here) even something like lex and yacc.
Which is far better than relying on a party which, as I said, has precisely nothing to do with what anyone needs, and which will inevitably produce solutions that can only partially (I am being generous here) be used in any particular situation.
As for "possibly buggy" - look, I can whip up a solid *DL parser complete with a C++ code generator in what, a week? And then polish it from that.
The committee will work for several years, settle on a barely working design, then it will take some years to land in major compilers, then it will turn out it is unusable because someone forgot a key API or it was unfeasible on VAX or something like that.
And my build process is not complicated, and never will be. It can always accommodate another step. Mainly because I don't use CMake.
We are at C++20 and I wouldn't like to work for a company that uses an earlier standard.
IDL/DDL is the source of truth; moving the type definitions there is the whole point. There is only one definition for each type, which lives in the *DL; the corresponding C++ headers are generated, and everything is statically known.
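For a toy illustration (the *DL syntax and file names here are hypothetical), a single definition such as:

  // point.idl -- the single source of truth
  // struct Point { int32 x; int32 y; }

would have the generator emit a header like:

  // point.generated.h -- emitted by the code generator, never edited by hand
  #pragma once
  #include <cstdint>

  struct Point {
      std::int32_t x;
      std::int32_t y;
  };

  // statically-known metadata generated alongside the type
  inline constexpr const char* point_field_names[] = { "x", "y" };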
And even if it's true that some things can be done already with specific compilers and implementation-specific hacks, it would be really nice to be able to do those things more straightforwardly.
My experience has been that the recent additions to compile-time metaprogramming improve compile times rather than worsen them, because you no longer have to reach for std::enable_if<> hacks and recursive templates to do things that a simple generic lambda or constexpr conditional can do; the old tricks are harder on both you and the compiler.
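A minimal sketch of the kind of simplification meant here, with a toy twice function (the names are illustrative):

  #include <type_traits>

  // C++11 style: two SFINAE overloads via std::enable_if
  template <class T>
  typename std::enable_if<std::is_integral<T>::value, T>::type
  twice(T x) { return x * 2; }

  template <class T>
  typename std::enable_if<std::is_floating_point<T>::value, T>::type
  twice(T x) { return x * 2; }

  // C++17 style: one template, one constexpr conditional; far less
  // work for both the reader and the compiler
  template <class T>
  T twice2(T x) {
      if constexpr (std::is_integral_v<T>)
          return x << 1;     // integral path
      else
          return x * T{2};   // floating-point path
  }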
1. So many necessary common practices of C++ are far too complicated!
2. Std committee adds features to make those practices simpler.
3. C++ keeps adding features. It’s too big. They should cut out the old stuff!
4. The std committee points at the decade-long Python 3 fiasco.
5. Repeat.
To me it feels like they have fleshed out the key paradigms so that it is not a mess anymore. They are not there yet with compile-time evaluation (constexpr, consteval, ...), at least as of C++20; I'm not sure if it's mostly finished with C++23/26.
The language itself and the std is quite bloated, but writing modern C++ isn't that complicated anymore in my experience.
Which, precisely, additions do not fit my points?
Then again Scott Meyers said he's never written a C++ program professionally.
I think you're inadvertently misrepresenting Scott Meyers' claim.
Cited from somewhere else:
> I'll begin with what many of you will find an unredeemably damning confession: I have not written production software in over 20 years, and I have never written production software in C++. Nope, not ever.
He went on to clarify that he made a living out of consultancy, not writing software. He famously retired from C++ in 2015, too.
No one needs a billion dollars; it is practically irrelevant unless you are running on greed.
So either I and others misread you or it is just a matter of different views on value.
Facebook famously felt compelled to hire eminent C++ experts to help them migrate away from their PHP backend. I still recall reading posts on the Instagram Engineering blog on how and where they used C++.
What point do you think you're making?
And no, reflection hasn’t “been solved for years” unless you have a very misleading definition of “solved”. A lot of the C++ code I work with is heavily codegen-ed via metaprogramming. Despite the relative expressiveness and flexibility of C++ metaprogramming, proper reflection will dramatically improve what is practical in a strict and type-safe way at compile-time.
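To make that concrete, here is a hedged sketch in the style of the P2996 draft (the API is still in flux, e.g. recent revisions add an access-context argument to nonstatic_data_members_of, and none of this ships in compilers yet):

  #include <meta>      // proposed reflection header, not yet standard
  #include <cstddef>

  struct Point { int x; int y; };

  // Count a type's non-static data members at compile time:
  // no macros, no external codegen step, fully type-checked.
  consteval std::size_t member_count(std::meta::info type) {
      return std::meta::nonstatic_data_members_of(type).size();  // draft API
  }

  static_assert(member_count(^^Point) == 2);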
Anecdata: a year or so ago I was in a discussion about whether beta C++20 features were good to use at scale across our platforms. Multi-platform support means you get not a union of features but an intersection of partial implementations. Anyway, it looked positive until we needed a pilot project to try it out. One of the projects came back with 'just flipping the C++20 switch with no changes causes a significant regression in build times'. After confirming that it was indeed not an error on our side, the conclusion was kind of obvious: a proportional increase in remote-compilation cloud costs for a few minor features is a 'no'. A year later, the beta support is no longer beta but is still partial across platforms, and there have been no build-time improvements from the community. YMMV, of course, because gamedev mostly targets closed-source platforms with a closed set of build tools.
Given that C++20 introduced modules, which are intended to make builds faster, I think just flipping the C++20 switch with no changes and checking build times should not be the end of evaluating whether C++20 is worth it for your setup.
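For reference, the mechanics are simple; it's the build-system support that is hard. A minimal sketch:

  // hello.cppm -- module interface unit
  export module hello;
  export int answer() { return 42; }

  // main.cpp -- import replaces textual #include
  import hello;
  int main() { return answer(); }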
Turning on modules effectively requires that all of your project dependencies themselves have turned on modules. Fail to do so, and a lot of the benefits start to become hindrances (Clang is currently debating going to 64-bit source locations because modularizing in this manner tends to exhaust the current 32-bit source locations).
I think this just proves that your team is highly inexperienced in C++ projects, which you implicitly attest to by admitting this was the first C++ upgrade you had to go through.
Let me be very clear: there is never an upgrade of the C++ version targeted by a project that does not require full regression tests and a few bugs to squash. Why? Because even if the C++ side of things is perfectly fine, libraries often introduce all sorts of unexpected issues.
For example, once I had to migrate a legacy project to C++14, and flipping the compiler flag to c++14 caused a wall of compiler errors. It turned out our C++ code was perfectly fine, but a single library behaved very poorly with a constexpr constructor it enabled conditionally under C++14.
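Something along these lines (a hypothetical reconstruction, not the actual library):

  // library header (hypothetical): the constructor is only constexpr
  // when compiled as C++14 or later
  #if __cplusplus >= 201402L
  #  define LIB_CONSTEXPR14 constexpr
  #else
  #  define LIB_CONSTEXPR14
  #endif

  struct Widget {
      int id;
      LIB_CONSTEXPR14 Widget(int i) : id(i) {}
  };

  // Client code that compiled cleanly as C++11 can suddenly hit
  // constant-evaluation errors once the flag flips to -std=c++14.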
You should understand that upgrades to the core language and standard library are exceptionally stable, and a clear focus of the standardization committee. But they only have a say in how the core language and standard libs should be. The bulk of the code any relatively complex project consumes is not core lang + stdlib but third-party libraries and frameworks. These are often riddled with flags that toggle whole components only in specific versions of the C++ language, mainly for backwards compatibility. Once you target a new version of C++, that often means you replace whole components of upstream dependencies, which in turn often requires fixing your code. This happens very frequently, even with the likes of Boost.
So, what you're complaining about is not C++ but your inexperience in software engineering in general. I mean, what is the rule of thumb about major version upgrades?
How high are those compilation costs compared to the developer time that might be saved with even minor features?
Good, but I think what happens is there are people on the bleeding edge of C++, usually writing libraries that ship with new code. Each new feature is a godsend for them -- it's the reason why the features are proposed in the first place. It allows you to write libraries more simply, more generally, more safely, and more efficiently.
The rest of us are dealing with old code that is a hodgepodge of older standards and toolchains, that has to run in multiple environments, mostly old ones. It's like yeah, this C++26 feature will come in handy for me someday, but if that day comes then it will be in 2036, and I might not be writing C++ by then.
Things seem to be catching up. I had the same view up until recently, but now I'm able to use most of the C++23 features on an embedded platform (granted, some are still missing, as we're limited to GCC 11.2).
This line of thinking is not productive. It is a mistake to see yourself as what you do, because then you're cornering yourself into defending it, no matter what.
People just don't want to maintain two completely different stacks (one on the server, one on the client).
It's kind of neat that it works. It's also a bit fidgety: the cannibalized code can cause issues (which, e.g. prevented C++11 adoption for a while in some experiments), and now CERN depends on bits of an old C++ compiler to read their data. Some may question the wisdom of making a multi-billion dollar dataset without a spec and dependent on internals of C++ classes (indeed experiments are slowly moving to formats with a clear spec), but for sure having a standard for reflection is better than the home-grown solution they rely on now.
[1]: https://indico.cern.ch/event/408139/contributions/979831/att...
Since then, a lot has changed, and now it is all based on cling ( https://root.cern/cling/ ), which originates from clang and LLVM. cling is responsible for generating the serialization / reflection of the classes needed within the ROOT framework.
Hint: it's C++, and yes, it will eventually use stuff like std::meta heavily.
Rust proc macros get used in serious production, even though they're quite slow to compile. Sure, std::meta is probably a bit clunkier, but that's expected from new C++ features as you say.
Compile-time reflection with a good built-in API, akin to C#'s Roslyn, would be a real boon.
I don't think the "legos" vs "shipping" debate here is really valid. One can write any type of code in any language. I'm a freak about C++, but if someone wants to ship in Python or JS, more power to them; one can write code that's fast enough that it doesn't matter, while taking advantage of those languages' special features.
I really think reflection + annotations will give us the chance to have much better serialization and probably something closer to Python decorators.
That will be plenty useful, and it is going to transform part of the C++ ecosystem; for example, I am thinking of editors that need to reflect on data structures, or web frameworks such as Crow or Drogon, or database access libraries...
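A speculative sketch of what that could look like, based on the annotations proposal (P3394) layered on top of P2996; none of the syntax is final, and json_key here is a made-up annotation type:

  struct User {
      [[=json_key("user_name")]] std::string name;  // hypothetical annotation
      int internal_id;                              // unannotated: skipped
  };

  // A serializer could walk std::meta::nonstatic_data_members_of(^^User)
  // and read each member's annotations at compile time to build the JSON
  // mapping: no macros, no codegen step, no runtime cost.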
It is rare to read something more moronic than that
The Rust equivalent of std::meta (procedural macros) is heavily used everywhere, including in serialization frameworks, debuggers and tracers.
And that's not surprising at all: compile-time introspection is much more powerful and lightweight than codegen for exactly the same usage.
It's not actually wrong, though, is it? Real codebases have been implementing reflection and introspection through macro magic and the like for decades at this point.
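For the unfamiliar, a minimal sketch of the classic X-macro flavor of that macro magic:

  // one list of fields, expanded several ways
  #define POINT_FIELDS(X) X(int, x) X(int, y)

  struct Point {
  #define DECLARE_FIELD(type, name) type name;
      POINT_FIELDS(DECLARE_FIELD)
  #undef DECLARE_FIELD
  };

  // "reflection": field names recovered by re-expanding the same list
  inline constexpr const char* point_field_names[] = {
  #define FIELD_NAME(type, name) #name,
      POINT_FIELDS(FIELD_NAME)
  #undef FIELD_NAME
  };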
I guess it's cool that they want to fix it in the language, but as always the approach is to make the language even more complex than it already is, e.g. two new operators (!) in the linked article.
Having a flaky pile of junk as an alternative has never been an excuse not to fix the problem properly.
Every proper modern language (Rust, Kotlin, Zig, Swift, even freaking Golang) has a form of runtime reflection or static introspection.
Only C++ does not. Historically it was done with a mess of macros or a pre-compiler (Qt's moc), all of which come with an entire pile of issues.
> the approach is to make the language even more complex than it already is - e.g. two new operators
The problem of rampant complexity in C++ is not so much the new features, when they bring something useful and make sense.
It is the language's inability to remove the old stuff even when there is consensus that it is garbage (e.g. iostreams).
Thank you. Some people use the phrases "real projects" and "production code" as if they imply some standard of high quality.
I think this is the most clueless comment I have ever read on HN. I hope the site is not being hit with its own blend of Eternal September.
I was going to explain to you how fundamentally wrong your comment was, but it's better to just kindly ask you to post on Reddit instead.
What solution is that? A Python script that spits out C++ code?
But I agree that one doesn't have to learn everything, or nearly-everything, to write decent-to-good modern-C++ code.
The fact that C++ is a multi-layered language with assured backwards compatibility really helps in slowly migrating to newer design paradigms and performant techniques while being sure/stable every step of the way.
These are the kind of features many folks skip over, as they are niche and require a bit of boilerplate.
Old:

  template<class...> struct list {};
  using types = list<int, float, double>;

  constexpr auto sizes = []<template<class...> class L, class... T>(L<T...>) {
      return std::array<std::size_t, sizeof...(T)>{{ sizeof(T)... }};
  }(types{});
New:

  constexpr std::array types = {^^int, ^^float, ^^double};

  constexpr std::array sizes = []{
      std::array<std::size_t, types.size()> r;
      std::ranges::transform(types, r.begin(), std::meta::size_of);
      return r;
  }();
I'm so tired of parameter packs, as useful as they are. Just give me a regular range-based for loop or something similar like this. Thank you, this can't come soon enough.

  constexpr std::array types = {^^int, ^^float, ^^double};
  auto sizes = std::whatever::transform(types, std::meta::size_of);

which would have been even nicer.

BTW, I continue to maintain some C++ software, and I like cryptopp [1]. I know people now use libsodium.
https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2025/p29...
The examples section was pretty helpful for me.
This has built up into a culture where people who have little to no experience with C++, but who have been told about and seen only bad headlines, join in with those who have legitimate concerns, those who are promoting their favorite language, and those who are trolls, producing a general mood of negativity.
It is almost perfectly predictable that if you open the discussion on a link to a C++ article on this site, there will be someone promoting Zig, Rust or Circle. There will also be a comment on the bloat of the language, and someone venting their trauma from some horrible codebase.
tombert•6mo ago
I'm not 100% convinced that UML is actually useful at all. Obviously if you find value from it, don't let me take that from you, by all means keep doing it, but all it seemed to provide was boxes pointing to other boxes for stuff that really wasn't unclear from looking directly at the code anyway. It's really not that hard to look directly at the class and look directly at the "extends" keyword (or the equivalent for whatever language you're using) and then follow from there. Maybe if you had like ten layers of inheritance it could be valuable, but if you're doing ten layers of inheritance there's a good chance that your code will be incomprehensible regardless.
I'm not against visual diagrams for code, I draw logic out with Draw.io all the time and I've been hacking on the RoboTool [1] toolkit a bit in my free time, but what UML offers always felt more masturbatory than useful.
Maybe I'm wrong, it certainly wouldn't be the first time, but every time I've tried to convince myself to like it I've left a little disappointed. It always kind of felt like stuff the enterprise world does to look like they're working hard and creating value.
[1] https://robostar.cs.york.ac.uk/robotool/
ETA:
[2] By "class", I meant like an education class, not a Java class.*
devjab•6mo ago
Personally, I view doing architecture in UML, ArchiMate or draw.io, rather than building it with something like icepanel.io, as a complete waste of my time. But that's just me.
burnt-resistor•6mo ago
And not to satisfy documentation requirements for critical safety systems.
rramadass•6mo ago
It is not just drawing boxes but a visual modeling language providing both static/structural and dynamic/behavioural views of a complete system. You will only understand its value when you actually deal with large systems consisting of many interconnected modules with dependencies. In such large codebases it is almost impossible to understand all the structural/behavioural aspects by browsing code, whereas a tool like Doxygen generating UML diagrams from code becomes a godsend. You can map from UML to code or from code to UML. As with any language, you don't have to know all of it but can focus only on what you need, e.g. the class diagram, activity diagram and state machine diagram are the ones I have found most useful.
Finally, UML is now being used as a modeling/specification language frontend to Formal Methods which is the ultimate proof of its usefulness.
ok123456•6mo ago
It got pushed on everyone, so there could be a layer of "software architects" who didn't have to know how to code and could have endless meetings where the final product was a Bayeux Tapestry of UML.
UML captures inheritance and composition well, but a program is more than the sum of its schema. Also, real programming languages all have their idioms, and using UML as the design space creates a significant impedance mismatch.
rramadass•6mo ago
Automatic Formal Model Generation from UML Diagrams – An Implementation Experience - https://ieeexplore.ieee.org/document/9753518
UML-B: Formal modelling and design aided by UML - pdf at https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&d...