https://docs.carbon-lang.dev/docs/project/roadmap.html
What _is_ interesting is that I get the impression that Carbon is being workshopped with the C++ community, rather than the wider PLT community -- I worry that they won't benefit from the broader perspectives that'll help it avoid well-known warts elsewhere.
Compatibility with C++ is fine, but so far it seems Carbon's safety story is entirely a wishlist rather than anything concrete yet. Seems like Carbon might be more of a place to demonstrate features for C++ committees than a real language?
Personally I have had it up to here with lousy programming languages that make it easy for me to write bugs.
Honestly seems like a dubious idea. The C++ community that remains is even more "just get good" than before. They still think UB all over the place is fine.
The remaining people driving where the language goes have other priorities in mind like reflection.
Of the profiles that were supposed to be so much better than the Safe C++ proposal, none made it into C++26, and it remains to be seen whether we will ever see a sensible preview implementation for C++29.
If WG21 were handling Rust instead, f64 would implement Ord, and people would just write unsafe blocks with no explanation in the implementation of supposedly "safe" functions. Rust's technology doesn't care but their culture does.
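To make the f64 point concrete, here is a small sketch (example and function name mine, not from the thread) of what Rust's refusal to give f64 an Ord impl looks like in practice: you cannot call plain `sort` on floats, and the standard library instead provides `f64::total_cmp` for a total order.

```rust
// f64 implements only PartialOrd, because NaN breaks totality
// (NaN is neither less than, equal to, nor greater than anything).
// So Vec<f64> has no plain `sort`; you must supply a comparator.
fn sort_floats(mut v: Vec<f64>) -> Vec<f64> {
    // total_cmp implements the IEEE 754 totalOrder predicate.
    v.sort_by(|a, b| a.total_cmp(b));
    v
}

fn main() {
    let sorted = sort_floats(vec![2.0, f64::NAN, 1.0]);
    // Under total_cmp, NaN orders after every finite value.
    assert_eq!(&sorted[..2], &[1.0, 2.0]);
    assert!(sorted[2].is_nan());
    println!("ok");
}
```

The point being made above is that this friction is deliberate: the type system refuses to pretend floats have a total order, and the escape hatch (`total_cmp`) makes the choice explicit at the call site.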
Beyond that though, the profiles idea is dead in the water because it doesn't deliver composition. Rust's safety composes. Jim's safe Activity crate, Sarah's safe Animals crate and Dave's safe Networking crate compose to let me work with a safe IPv6-capable juggling donkey even though Jim, Sarah and Dave have never met and had no idea I would try that.
A hypothetical C++29 type-safe Activity module, combined with a thread-safe Animals module and a resource-leak-safe Networking module, doesn't even get you something that will definitely work, let alone deliver any particular safety.
But Rust allows pattern matching on floats.
https://play.rust-lang.org/?version=stable&mode=debug&editio...
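For reference, a self-contained version of the kind of thing the playground link demonstrates (example mine): float literal patterns do compile in Rust, relying on PartialEq semantics, even though f64 implements neither Eq nor Ord.

```rust
// Matching on float literals compiles; this is the wart being pointed
// at, given that f64 deliberately lacks Eq and Ord.
fn classify(x: f64) -> &'static str {
    match x {
        0.0 => "zero", // also matches -0.0, because 0.0 == -0.0
        _ => "other",  // NaN always falls through here: NaN != NaN
    }
}

fn main() {
    println!("{} {}", classify(-0.0), classify(f64::NAN));
}
```

Note the two surprises in the comments: `-0.0` matches the `0.0` arm, and NaN never matches any literal pattern, both direct consequences of IEEE 754 equality.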
Rust Zulip is C++ WG21 confirmed?
https://github.com/rust-lang/rust/issues/41620#issuecomment-...
https://github.com/rust-lang/rust/pull/84045#issuecomment-82...
Some parts of it want C++ to be Rust, with a focus on compile-time safety. Others take "C++" literally as "C with extra stuff" and value performance over safety.
Companies like Google are likely to be in the former camp, as for what they are doing, security is critical. Unsurprisingly, Carbon is a Google project.
Video game companies on the other hand are likely to be in the latter camp. Most of the time, security is not as critical, especially for offline games, and memory corruption usually doesn't go further than a game crash. Tight memory management however is critical, and it often involves raw pointers and custom allocation schemes.
Thus there is an opening for a faster language. And still for a safer one. And for an easier one to use. So all C++ has going for it is inertia. It's moribund unless the committee reconsiders its stance on intentionally losing the performance competition.
A major role that C plays today is being the common protocol all languages speak[0]. C++ can't fill this role, and neither can Rust.
There is a huge opportunity for some language to become the next common protocol, the common ABI, that all languages share in common.
(Maybe Rust could do this, but they haven't stabilized their ABI yet, and I don't know the details.)
It would be nice if there was a somewhat higher level ABI that languages could use though. The C ABI is very low level and tedious.
Whether it ever goes beyond that remains to be seen.
The Carbon team is the first to point out that anyone doing green field development should reach for Rust or any managed language that fits the project scope.
FWIW, we're working hard whenever looking at an aspect of the language to look at other languages beyond C++ and learn any and everything we can from them. Lots of our design proposals cite Swift, Rust, Go, TypeScript, Python, Kotlin, C#, Java, and even Scala.
That all-consonant keyword always makes it seem like I'm reading Hungarian notation when reading Rust, for instance. Another option I've seen, for instance in Pony, is "fun", which is already an English word with a completely different meaning.
Even the "function" from Javascript seems fine to me.
const add = (a: i32, b: i32): i32 => a + b;
...or any variation of the arrow-function idea... A "function" keyword often exists just to help the parser. C3, for example, a language that's a superset of C, adds an "fn" keyword for this very purpose of disambiguation, to simplify its parser.
function add(a: i32, b: i32): i32 {
return a + b;
}
I find that easier to read than the example you provided, and it is approximately the same length. I used to use arrow functions everywhere in TS/JS and it made the code difficult to read IME, and there was zero benefit. They are fine for things like event handlers, promise chains, etc. But I'd rather just use function when I don't have to worry about the value of this.

auto my_function(int, double) -> int;
They probably want to use the same arrow signature and need something in place of auto, as omitting it completely would complicate parsing.

In practice everybody just uses class, because who has the time to type the full keyword, and signature declarations in C++ are already unwieldy as it is.
I agree, I hate fn. Also not a fan of func though.
Unfortunately we keep designing languages for people using notepad.
Nowadays my editor even writes full blocks at a time.
Mind you, I'm not saying that your solution doesn't work. Just that it doesn't work for the GP.
Such small things as using __ __ in Python, and small inconveniences (Lua's 1-based indexing instead of 0-based), really have a lot of people, what can I say... yeah, polarized on this matter.
What is inconvenient about it?
* I might be slightly exaggerating.
I like the use of [] though; it reminds me of Scala, which I liked before they did the Scala 3 fork.
But then defining a type constructor itself still uses `()`, like `class UnsafeAllowDelete(T:! Concrete) { ... }`. It does seem somewhat inconsistent.
https://en.wikipedia.org/wiki/Generic_programming - Worth studying up on if you're unfamiliar with it.
If they can't get safety right at the design stage, they'll never get it right. We already have D and Zig in this space.
As to "getting it right" - things are not so simple. The emphasis on memory-safety soundness is based on some empirical hypotheses, some better founded than others, and it's unclear what "getting it right" means.
From a software correctness perspective, the road to sound memory safety is as follows: 1. We want to reduce the amount of costly bugs in software as cheaply as possible, 2. Memory unsafe operations are a common cause of many costly bugs, 3. Some or all memory bugs can be eliminated cheaply with sound language guarantees.
The problem is that 1. memory safety refers to several properties that don't all contribute equally to correctness (e.g. out-of-bounds access causes more serious bugs than use-after-free [1]), and 2. soundly guaranteeing different memory safety properties has different costs. It gets more complicated than that (e.g. there are also unsound techniques that have proven very effective to consider), but that's the overview.
It is, therefore, as of yet unclear which memory safety properties are worth it to soundly guarantee in the language, and the answer may depend on the language's other goals (and there must be other goals that are at least as important, because the empty language guarantees not only all memory safety properties but all (safety [2]) correctness properties, yet nobody uses it as it's useless, while a language like ATS can be used to write many useful programs, but few use it because it's just too costly to use well). The goal is always to find the right balance.
For example, Java soundly guarantees lack of use-after-free at the cost of increased memory footprint; that may be "getting it right" for some programs but not all. Rust soundly guarantees lack of use-after-free at the cost of imposing strong and elaborate typesystem constraints (that, as is often the case, are more constraining than the property they guarantee); that, too, may be "getting it right" for some programs, though not all. Zig guarantees lack of out-of-bounds access in a simple language at the cost of not guaranteeing lack of use-after-free, and that may also be "getting it right" for some programs but not all.
So what "getting it right" means always depends on constraints other than safety (Rust and Zig want to consume less memory than Java; Java and Zig want to be simpler than Rust; Java and Rust want to guarantee more memory safety properties than Zig). If Carbon wants to be more interoperable with C++ than Java, Rust, or Zig, then it will have to figure out what "getting it right" means for Carbon.
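As a minimal sketch of the Rust tradeoff described above (example mine, not from the thread): the use-after-free guarantee comes from borrow-checking constraints enforced entirely at compile time, with no runtime footprint, which is exactly the "strong and elaborate type-system constraints" cost being weighed.

```rust
// Returning a borrow ties the result's lifetime to the input; this is
// how the compiler rules out use-after-free without a GC.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    let owned = String::from("hello world");
    let w = first_word(&owned);
    // drop(owned); // compile error if uncommented: `owned` is still
    //              // borrowed by `w`, so freeing it here is rejected
    assert_eq!(w, "hello");
    println!("ok");
}
```

The constraint is conservative, as the comment notes: some programs that would in fact be fine are rejected, which is the "more constraining than the property they guarantee" point.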
[1]: https://cwe.mitre.org/top25/archive/2024/2024_cwe_top25.html
[2]: https://en.wikipedia.org/wiki/Safety_and_liveness_properties
It means eliminating undefined behavior, and unplanned interaction between distant parts of the program.
Don't get me wrong - less undefined behaviour is better, but drawing a binary line between some and none makes for a convenient talking point, while not necessarily being the sweet spot for the complicated and context-dependent series of tradeoffs that is software correctness.
Second, I would be surprised if the static analyses in the tool are precise enough for real-world Zig programs. For example, it is undecidable to determine whether a function “takes ownership” of an argument pointer. In particular, if you want to avoid false negatives, the “free after transfer” case needs to be conservative, but then you almost certainly will flag false positives.
It keeps adding keywords and has become way harder to keep in your head; it's over 220 keywords at this point. Don't take my word for it: Swift's creator doesn't agree with its current direction either.
I don't even watch fireship anymore. I actively resist the urge to. There are some other better channels like typecraft or primagen or dreams of code and so many other enthusiasts; there is this one bash guy I watch who's having fun in life doing side quests like going to the gym and gardening, and I am all for that too.
Carbon exists so that it's possible to migrate a large C++ code base, like Chrome, from C++ to something saner, incrementally.
The most important attribute of Carbon is not the specifics of the syntax but the fact that it's designed to be used in a mixed C++ / Carbon code base and comes with tooling to convert as much of C++ as possible to Carbon.
That's what makes Carbon different from any other language: D, Zig, Nim, Rust etc.
It's not possible to port a multi-million line C++ code base, like Chrome, to another language wholesale, so large C++ projects are stuck with an objectively pretty bad language and are forced to continue to use C++ even though a better language might exist.
That's why Carbon is designed for incremental adoption in large C++ projects: you can add Carbon code to existing C++ code and incrementally port C++ over to Carbon until only Carbon code exists.
Still a very large investment but at least possible and not dissimilar to refactoring to adopt newer C++ features like e.g. replacing use of std::string with std::string_view.
That's why it's a rational project for Google. Even though it's a large investment, it might pay off if they can write new software in Carbon instead of C++ and refactor old code into Carbon.
Also, FWIW, it is very ergonomic for Nim to call C (though the reverse is made complex by GC'd types). { I believe similar can be said for other PLangs you mention, but I am not as sure. } It's barely an inconvenience. Parts of Nim's stdlib still use libc and many PLangs do that for at least system calls. You can also just convert C to Nim with the c2nim program, though usually that requires a lot of hand editing afterwards.
Maybe they should write a C++2carbon translator tool? That would speed things up for them. Maybe they already have and I just haven't heard of it? I mean the article does say "some level of source-to-source translation", but I couldn't find details/caveats poking around for a few minutes.
Hypothetically you could importcpp fns, classes, etc when compiling with nim cpp
Elaborating on this cross-talk, any academic taxonomy says reference counting is a kind of GC. { See, the subtitle or table of contents of Jones 1996 "Garbage Collection: Algorithms for Automatic Dynamic Memory Management", for example. } Maybe you & I (or Nim's --mm?) can personally get the abbreviation "AMM" to catch on? I doubt it, but we can hope!! :) Sometimes I think I should try more. Other times I give up.
Before the late 90s, people would say "tracing GC" or "reference counting GC" and just "GC" for the general idea, but somehow early JavaVM GC's (and their imitators) were so annoying to so many that "The GC" came to usually refer, not just to the abstract idea of AMM, but to the specific, concrete separate tracing GC thread(s). It's a bit like if "hash table" had come to mean only a "separately chained linked list" variant because that's what you need for delete-in-the-middle-of-iterating like C++ STL wants and then only even the specific STL realization to boot { only luckily that didn't happen }.
So that's maybe a bad example. In the same way I think it's fine that "Structured programming" is about the need to use structured control flow, not the much later idea of structured concurrency even though taken today you might say they both have equal claim to this word "structured".
In contrast it is weird that people decided somehow "Object oriented" means the features Java has, rather than most of what OO was actually about when it was invented. I instinctively want to blame Bjarne Stroustrup but can't think of any evidence.
Honestly, while I find the syntax terse, I welcome more low level languages able to push performance.
But we need to get the language and interop into good shape to be able to thoroughly test and evaluate the migration.
Printing as in the example from Carbon's GitHub repository does not work: 'Print("Test");' gives a complaint about not finding 'Print'.
For all of C++'s faults, it is an extremely stable and vendor-independent language. The kind of organisation that's running on some C++ monolith from 1995 is not going to voluntarily let Apple become a massive business risk in return for marginally nicer DX.
(Yes, Swift is OSS now, but Apple pays the bills and sets the direction, and no one is seriously going to maintain a fork.)
I guess, some Mac apps? In that case I think most platform independent "guts" would be in C or C++, and the Obj-C++ part is tied to the frameworks, so the devs would have to rewrite it anyway.
It is perfectly feasible for companies that are fully committed to the Apple ecosystem.
I know you can compile C++ files to object files, pass them to the D compiler, and have them call each other's functions. I've never tried it though.
--------
g++ -c foo.cpp
dmd bar.d foo.o -L-lstdc++
--------
But D and C++ have just enough differences to make extern(C++) not be automatic. It can take some pretty arcane metaprogramming to get things to work, and some things are impossible.
It's also worth pointing out that D isn't trying to be fully compatible with C++.
_Incrementally_: a C++ project can also be made more sane incrementally, using lists of constructs to avoid and constructs to use once the problem domain is confined. In my past, I successfully carried out this quest on 3 different fairly large C++ projects. This is not a strong selling point for Carbon.
You could do this with Nim, Nim 2’s ARC model is compatible with c++’s RAII. Nim supports moves, destructors, copies, etc. see https://nim-lang.org/docs/destructors.html
You can import C++ classes, member functions, free functions, etc. easily with importcpp
importcpp for the code you are incrementally porting over. You could write a libclang script to do this for you. exportcpp for any code that has been ported but still has dependencies in C++ land.
My best guess is they want C++ compatibility and a new language due to preferences, more control over the compiler, etc. which are all valid reasons
Maybe the page was updated recently, but there is a "why" link near the top:
https://docs.carbon-lang.dev/#why-build-carbon
What I would like to see is more documentation on the "why not" that summarizes why other languages and proposals are not sufficient. For example, Safe C++ proposal[1] appears to satisfy all requirements, but I can't find any reference to it.
FWIW, the biggest challenge with Safe C++ is that WG21 rejected[1] that direction. And it was developed without building a governance model or way to evolve outside of WG21, and so doesn't seem to have a credible path forward.
[1]: FWIW, some members of WG21 don't agree with this characterization, but both the author's impression and the practical effect was to reject the direction.
I believe that getting WG21 to actually say "No" was very useful to have non-technical leadership people understand that C++ can't be the solution they need.
One other use case I could think of is gaming, where there is an incredible amount of load-bearing C++ code that's never realistically going to be rewritten, and strict memory safety is not necessarily a sine qua non in the way it is in other fields.
I find it hard to trust Google to maintain any software, or to write software that is maintainable by a community. They write software for themselves and themselves alone.
If it (purportedly?) exists so that Google can move multi-million line code bases from C++ to something better bit-by-bit, because it's otherwise infeasible to do so, why would Google drop it after they have ported the first million?
You can simply wait to see if Chrome adopts it.
One good aspect about C++ is its backwards compatibility or stability. Also a drawback, but companies not having to spend huge amounts of time, expertise and money rewriting their whole codebases all the time is something they appreciate.
Rust is often somewhat stable, but not always.
https://internals.rust-lang.org/t/type-inference-breakage-in...
https://github.com/rust-lang/rust/issues/127343
300 comments on Github.
https://github.com/NixOS/nixpkgs/pull/332176
Rust has editions, but that's a feature that will probably take years to really be able to evaluate.
What kind of compatibility story will Carbon have? What features does it have to support compatibility?
Carbon is not a programming language (sort of) - https://news.ycombinator.com/item?id=42983733 - Feb 2025 (97 comments)
Ask HN: How is the Carbon language going? - https://news.ycombinator.com/item?id=40480446 - May 2024 (1 comment)
Will Carbon Replace C++? - https://news.ycombinator.com/item?id=34957215 - Feb 2023 (321 comments)
Carbon Programming Language from Google - https://news.ycombinator.com/item?id=32250267 - July 2022 (1 comment)
Google Launches Carbon, an Experimental Replacement for C++ - https://news.ycombinator.com/item?id=32223270 - July 2022 (232 comments)
Carbon Language: An experimental successor to C++ - https://news.ycombinator.com/item?id=32151609 - July 2022 (504 comments)
Carbon: high level programming language that compiles to plain C - https://news.ycombinator.com/item?id=4676789 - Oct 2012 (39 comments)
superficial details matter - people who stayed on C++ instead of transitioning to flashy new languages have type-before-name as part of their programming identity
you can have all the features in the world (and be recognized by it), but if the code doesn't _look_ like C++, then it's of no interest
I don't think it will reach the same distribution as other languages, as the niche is "large C++ projects which want to transition to something else without a rewrite"; for anybody else there is a huge number of alternatives.
Yeah, agree that it sounds slightly off initially.
That would either be a wholesale conversion or emitting a translation shim style thing at the boundary between legacy c++ and the new language.
I'm not sure Carbon is necessary to achieve such a conversion.
You wouldn't get idiomatic code out but with some effort you'd get rust/d/c/other which clang compiles to the same IR as the original.
How much refactoring is warranted afterwards would depend on how much effort you put in to recreating templates / header files / modules etc on the fly.
I'm not sure I'd choose to do this myself if I was in Google's position but it would be tempting.
the point of carbon is that you can incrementally migrate your c++ program to it in place, and the migrated code will end up easier to maintain than the original c++.
importcpp what you need. exportcpp for the other way around
As to things ABI prevents:
- scoped_lock was added to not break ABI by modifying lock_guard
- int128_t has never been standardized because modifying intmax_t is an ABI break. Although if you ask me, intmax_t should just be deprecated.
- unique_ptr could fit in register with language modifications, which would be needed to make it zero-overhead, compared to a pointer
- Many changes to error_code were rejected because they would break ABI
- status_code raised ABI concerns
- A proposal to add a filter to recursive_directory_iterator was rejected because it was an ABI break
- A proposal to make most of <cstring> constexpr (including strlen) will probably die because it would be an ABI break.
- Adding UTF-8 support to regex is an ABI break
- Adding support for realloc or returning the allocated size is an ABI break for polymorphic allocators
- Making destructors implicitly virtual in polymorphic classes
- Return type of push_back could be improved with an ABI break
- Improving shared_ptr would be an ABI break
- [[no_unique_address]] could be inferred by the compiler should we not care at all about ABI
Not sure what to think of this one. Either one introduces a new keyword to opt out (not great), or all public destructors of an abstract base class are implicitly marked virtual (not great, and another "hidden" language feature like thread-safe statics).
After all, an abstract base class does not need its destructor to be public.
Isn't it just a way of controlling the language vs using normative bodies?
I can imagine the thought process behind the designers of the language went as follows:
"It's not possible to improve C++ without breaking backwards compatibility"
"That's correct, but if we're going to break backwards compatibility anyways, why not use this as an opportunity to change a bunch of things?"
aka the Python 3 mentality, where necessary changes were combined with unnecessary ones, causing pointless migration costs. The fallacy derives from treating a backwards-compatibility break as one massive fixed cost (since libraries have to be updated anyway), so that adding small incremental changes supposedly won't meaningfully increase the overall cost. In reality, the fixed cost of breaking backwards compatibility can be reduced massively if proper care is taken, which means all the "just because" changes thrown in as a bonus end up representing a much larger share of the migration cost than initially anticipated.
pjmlp•18h ago
Basically there should be a 1.0 sometime towards the end of 2026.
https://github.com/carbon-language/carbon-lang/blob/trunk/do...
This is a talk from last year's CppNorth; there should be one this year as well:
https://youtu.be/8SGMy9ENGz8?si=reukeBjxAOivX6qI
Jtsummers•18h ago
https://docs.carbon-lang.dev/docs/project/roadmap.html
Even on the submitted page, the oldest you could claim it represents is 2024. But I stand by my earlier remark. When linking to an active project's documentation or home page, unless it's to a specifically dated version of it, a date doesn't make sense. For instance, linking to something specific in Python 2.6 documentation, maybe add a date. But if it's just to python.org, it would be absurd to tag it with [1991].