Rust is the present and the future and it's quite logical that it becomes a key requirement in Linux distributions, but I'm really not convinced by the wording here… This last sentence feels needlessly antagonistic.
A nostalgia-fuelled Linux distro, maybe using a deliberately slimmed-down or retro kernel and chosen software, could make a lot more sense than continuing to squeeze Debian onto hardware that was already obsolete at the turn of the century while also promoting Debian as a viable choice for a brand-new laptop.
Solved problem:
United States Patent Application 3127321, Date of Patent March 31, 1964: NUCLEAR REACTOR FOR A RAILWAY VEHICLE
alpha, hppa, m68k and sh4
To be fair, lots of people did use Motorola 68xxx CPUs when those were new, it's just that it was 40+ years ago in products like the Commodore Amiga. The SH4 is most popularly connected to the Dreamcast, Sega's video game console from back when Sega made video game consoles.
The Alpha and PA-RISC were seen in relatively recent and more conventional hardware, but in much tinier numbers; by relatively recent I mean early this century. These are not products anybody bought five years ago, and when they were on sale they were niche products for a niche that, in practical terms, was eaten by Microsoft.
> Rust is already a hard requirement on all Debian release architectures and ports except for alpha, hppa, m68k, and sh4 (which do not provide sqv).
Wonder what this means for those architectures then?
https://mastodon.social/@juliank
>Senior Engineer at Canonical.
sure, some are also paid by a foundation, which is in turn paid by companies, but with a degree of decoupling of influence.
and some pay themselves, i.e. do fully voluntary work, but most devs can't afford to do so in a long-term, high-time-commitment manner. So a lot of major changes and contributions end up coming from people directly or indirectly paid by some company.
and that's pretty common across most "older, larger, sustainable and still developed OSS"
They will be rebranded as "retro computing devices"
I’m not in the Debian world, but those do seem to me like the types of systems that could use their own specialized distros rather than being a burden to the mass market ones. It’s not as if you could run a stock configuration of any desktop environment on them anyway.
Does or should Debian care? I don't know.
I don’t get the fuss around the “retro computing” verbiage. I doubt anyone is actually running Debian on these devices out of necessity; someone who plays baroque music on reconstructed period instruments won’t balk at being called an “early music” enthusiast.
But I'm not sure. I think the new Rust dependencies are good. In an ideal world, the people who care about niche systems step up to help Rust target those systems.
I’m actually the person who added the m68k target to the Rust compiler and was also one of the driving forces of getting the backend into LLVM.
Generally speaking, getting a new backend into the Rust compiler is not trivial as it depends on LLVM support at the moment which is why asking someone to just do it is a bit arrogant.
Luckily, both rustc_codegen_gcc and gccrs are being worked on, so this problem will be resolved in the future.
I'll try to rephrase: if we never want to give up support for a platform we've supported in the past, then I think we only have two options: (1) never adopt new technology where support for said platforms doesn't come for free, or (2) leave it up to those who care about the niches to ensure support.
Neither is pain-free, but the first seems like a recipe for stagnation.
It's lovely to see the two alternative compiler paths for Rust moving forward though! Thank you!
(LLVM even used to have an in-tree DEC Alpha backend, though that was back in 2011 and not relevant to any version of Rust.)
[0] Looks like there is basic initial support but no 'core' or 'std' builds yet. https://doc.rust-lang.org/rustc/platform-support/m68k-unknow... This should potentially be fixable.
yes, from a pure code generation aspect
no, as all conditional-compiled platform specific code is missing.
So using it with `#![no_core]` should work (assuming the WIP parts of the backend aren't a problem). But beyond that you have to first port libcore (should be doable) and then libstd (quite a bunch of work).
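For context, the core/std split being described is visible in any freestanding crate; a minimal `#![no_std]` sketch (nothing here is m68k-specific, and the linker flags needed for a real target are omitted):

```
#![no_std]  // depend only on libcore: no OS, no allocator, no libstd
#![no_main] // no std-provided entry point either

use core::panic::PanicInfo;

// libcore requires the program to say what panicking means.
#[panic_handler]
fn panic(_info: &PanicInfo) -> ! {
    loop {}
}

// A bare entry point, as used on targets without std support.
#[no_mangle]
pub extern "C" fn _start() -> ! {
    loop {}
}
```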
Who is actually _running_ Debian Trixie on these platforms now?
It is counter-intuitive to me that these platforms are still unofficially supported, but 32-bit x86 [edit: and all MIPS architectures!] are not!
I am emotionally sad to see them fall by the wayside (and weirdly motivated to dig out a 68k Amiga or ‘very old Macintosh’ and try running Trixie…) but, even from a community standpoint, I find it hard to understand where and how these ports are actually used.
It’s just a bit annoying that Rust proponents are being so pushy in some cases as if Rust was the solution to everything.
It looks like the last machines of each architecture were released:
Alpha in 2007
HP-PA in 2008
m68k in pre-2000 though derivatives are used in embedded systems
sh4 in 1998 (though possible usage via "J2 core" using expired patents)
This means that most are nearly 20 years old or older.
Rust target triples exist for:
m68k: https://doc.rust-lang.org/nightly/rustc/platform-support/m68... and https://doc.rust-lang.org/nightly/rustc/platform-support/m68... both at Tier 3.
(Did not find target triples for the others.)
If you are using these machines, what are you using them for? (Again, genuinely curious)
Everything else is at least i686 and Rust has perfectly adequate i686 support.
Is there any major distro left with pre i686 support?
Cars, airplanes, construction equipment, etc.
you mainly find that with systems needing certification
these are the kinds of situations where having a C language spec isn't enough; you instead need a compiler-version-specific spec of the compiler
similarly, they tend to run the same checkout of the OS with project-specific security updates back-ported to it, instead of doing generic system updates (because every single update needs to be re-certified)
but that is such a huge effort that companies don't want to run a full OS at all. Just the kernel and the most minimal set of packages you really need, and not one more binary than that.
and they might have picked Debian as an initial source for their packages, kernel, etc., but it isn't really Debian anymore
You could make this argument for so many use cases, but apparently people just enjoy bashing retrocomputing here.
Well, there are so many things where you could argue about the relevance of a userbase.
If the size of a userbase were the only argument, Valve could just drop support for the Linux userbase, which is just 2-3% of their overall userbase.
Here is one famous example of a dude who’s managed to get PRs merged in dozens of packages, just to make them compatible with ancient versions of nodejs https://news.ycombinator.com/item?id=44831811
But yeah, those can figure out how to keep their own port
But the people who use the language have an amazing talent to make people on the fence hate them within half a dozen sentences.
They remind me of Christian missionaries trying to convert the savages from their barbarous religions with human sacrifice to the civilised religion with burning heretics.
Rust people for some reason are.
There is no guarantee that other bugs do not flourish in the Rust ecosystem. There are no publicly known code quality checks of Rust programs except a big "trust us" (see Firefox with all its CVEs, despite "Rust"). And combined with the Cargo ecosystem, where every malicious actor can inject malware, that is a big warning sign.
If I got that right, how is "it's still not perfect" an argument?
Agree with the Cargo objection.
Use Rust for evergreen projects by all means, just leave mature tested systems alone, please.
Or maybe Debian should never rely on any software written after 2015?
> There is no guarantee that other bugs do not flourish in the Rust ecosystem.
well, less likely than in C, thanks to an advanced type system, e.g. allowing authors of abstractions to make their APIs much more foolproof.
> where every malicious actor can inject malware is a big warning sign.
Very much doubt that is the case...
And just an anecdote: Asahi Linux devs said that Rust made it very easy (maybe relative to working with C) to write the drivers for the Apple M1 and M2 series, so it seems that the language has its merits, even without the Cargo ecosystem.
Also, Rust will only minimize certain kinds of bugs; eliminating the others is impossible. A few years ago a study (I believe it was from Microsoft) said that 70% of the bugs found were memory related [0], which means that Rust would have prevented most of those.
Maybe Rust is not the best answer, but as of now it is the most proven answer for this particular problem; who knows if Zig or another language will replace both C and Rust in the future.
[0] https://www.zdnet.com/article/i-ditched-linux-for-windows-11...
Firefox is not even close to 100% Rust.
This is a wildly misinformed comment.
https://www.cvedetails.com/vulnerabilities-by-types.php is a bit more clear. It's XSS, SQL injection, then memory. The first two are not possible to enforce a fix on: you can always make a decision to do something bad with no visible annotation. Even then, rich types like Rust's make safe interfaces easier to produce. But Rust tackles the next class of issues, one where you can verify code to be safe or require an explicit "unsafe" around it.
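A toy sketch of that last point: the unchecked access is only reachable through a function that performs the verification, so the `unsafe` stays contained and auditable (names here are made up):

```
/// Safe wrapper: callers get a checked API and never write `unsafe`.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        None
    } else {
        // SAFETY: emptiness was checked above, so index 0 is in bounds.
        Some(unsafe { *bytes.get_unchecked(0) })
    }
}

fn main() {
    assert_eq!(first_byte(b"abc"), Some(b'a'));
    assert_eq!(first_byte(b""), None);
}
```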
To add on to that, with declarations the programmer can tell the Lisp compiler that (for example) a variable can be stack allocated to help improve performance. The fact that Lisp code is just data is another benefit towards performance as it means macros are relatively easy to write so some computation can be done at compile time. There are also various useful utilities in the spec which can be used to help profile execution of a program to aid in optimization, such as time and trace.
Fast forward 5 centuries, and it turns out they were in fact pretty successful, as South America and central Africa are the places where Catholicism is the most active today, far more than in Europe.
Not sure how that’s relevant when CL is basically dead and no one wants to work with it, while Rust is flourishing and delivering value
Citation needed.
Or, what can be asserted without evidence can be dismissed by pointing to ripgrep.
It's insane that x86 Debian is still compiling all software targeting Pentium Pro (from 1995!).
x64 Debian is a bit more modern, and you must splurge for a CPU from 2005 (Prescott) to get the plethora of features it requires
The cost of supporting this old hardware for businesses or hobbyists isn’t free. The parties that feel strongly that new software continue to be released supporting a particular platform have options here, ranging from getting support for those architectures in LLVM and Rust, pushing GCC frontends for rust forward, maintaining their own fork of apt, etc.
There's a non-negligible amount of "handed-down" refurbished hardware moving from developed to developing countries: PCs and servers that are already 5+ years old and out of market at installation.
See a (relatively recent) list of manufacturers here:
https://en.wikipedia.org/wiki/List_of_x86_manufacturers
and scroll down for other categories of x86 chip manufacturers. These have plenty of uses. Maybe in another 30 years' time they will mostly be a hobby, but we are very far from that time.
(In my second-tier university in my developing country, the Sun workstation hadn’t been turned on in years by the late 2000s, and the minicomputer they bought in the 1980s was furniture at the school)
Edit: As for big businesses, they have support plans from IBM or HP for their mainframes, nothing relevant to Debian.
But you are also completely ignoring limited-capabilities hardware, like embedded systems and micro-controllers. That includes newer offerings from ST Microelectronics, Espressif, Microchip Technology etc. (and even renewed 'oldies' like eZ80's which are compatible with Zilog's 8-bit Z80 from the 1970s - still used in products sold to consumers today). The larger ones are quite capable pieces of hardware, and I would not be surprised if some of them use Debian-based OS distributions.
Note that Debian no longer supports x86 as of Debian 13.
BTW, today is the Pentium Pro's 30th anniversary.
Debian 13 raised the x86 requirement to Pentium 4 because LLVM required SSE2 and Rust required LLVM.
The target before was not Pentium Pro in my understanding. It was Pentium Pro equivalent embedded CPUs. Servers and desktops since 2005 could use x86-64 Debian.
why not? I still want to run modern software on older machines for security and feature reasons
I would be worried if even C++ dependencies were added for basic system utilities, let alone something like Rust.
Now, granted, I'm not an expert on distro management, bootstrapping etc. so maybe I'm over-reacting, but I am definitely experiencing some fear, uncertainty and doubt here. :-(
This is the status quo and always has been. gcc has plenty of extensions that are not part of a language standard that are used in core tools. Perl has never had a standard and is used all over the place.
For example, IIUC, you can build a perl interpreter using a C compiler and GNU Make. And if you can't - GCC is quite bootstrappable; see here for the x86 / x86_64 procedure:
https://stackoverflow.com/a/65708958/1593077
and you can get into that on other platforms anywhere along the bootstrapping chain. And then you can again easily build perl; see:
https://codereflections.com/2023/12/24/bootstrapping-perl-wi...
apt is so late in the process that these bootstrapping discussions aren’t quite so relevant. My point was that at the same layer of the OS, there are many, many components that don't meet the same criteria posted, including perl.
Here's a thread of them insulting upstream developers & users of the Debian packages. https://github.com/keepassxreboot/keepassxc/issues/10725
It is our responsibility to our users to provide them the most secure option possible as the default.
Removing features is not the most secure option possible. Go all the way then and remove everything; only when your computer cannot do anything will it be 100% secure. We should really place more value on keeping existing user setups working. Breakages are incredibly damaging and might very well have a bigger impact than insecure defaults.
Security is there to keep the features usable without interruptions or risks.
E.g. unplugging the computer from the network is not about security if the service needs to be accessible.
A very concrete example: the whole Log4j vulnerability issue was basically a direct implication of a feature that allowed for arbitrary code execution. Nearly no user of Log4j intentionally used that feature, yet they were all vulnerable because Log4j had it.
The fix to the CVE was effectively to remove the feature. If someone had the foresight to try to reduce Log4j to only the features that ~everyone actually used, and publish a separate Log4j-maximal for the fringe users that intentionally use that feature, it would have prevented what was arguably the worst vulnerability that has ever happened.
In the case this thread is about, no one seems to deny that there should be 'minimal' and 'full' versions and that the 'minimal' version is going to be more secure. The entire flame war seems to be over whether it's better to take a preexisting package name and have it be the minimal one or the full one.
That is simply a tradeoff between "make preexisting users who don't use ancillary features be as secure as possible by default going forward" or "make preexisting users who do use ancillary features not broken by upgrades".
In this case it is not clear at all whether the feature is obscure. For most people it could be actually essential and the primary requirement for the whole software.
This is literally the same as helping a relative to make their computer more secure by turning it off. Problem solved I guess?
If you made a mistake by shipping insecure defaults you could fix it e.g. by including a banner to use the minimal version to users that don't use the extra features. But simply rug-pulling everybody for "security" and doubling down by insulting the affected users? I really do not understand people that act like this.
If I have a program that encrypts and decrypts passwords, then the surface area is way smaller than if it also has browser integrations and a bunch of other features. Every feature has the potential to make this list longer: https://keepass.info/help/kb/sec_issues.html which applies to any other piece of software.
At the same time, people can make the argument that software that's secure but has no useful features also isn't very worthwhile. From that whole discussion, the idea of having a minimal package and a full package makes a lot of sense - I'd use the minimal version because I don't use that additional functionality, but someone else might benefit a bunch from the full version.
As teh64 helpfully pointed out in https://news.ycombinator.com/item?id=45784445 some hours ago, 4-ish years ago my position on this was a total 180, and I'd have had the same reaction to now-me's proposal.
Unnecessary drama as usual...
The one demanding it is the maintainer of KeePassXC; it would’ve been better to just close the issue, saying that this is a Debian-only problem and he should install it like that.
now, this is separate from being open for discussion if someone has some good arguments (which aren't "you break something which isn't supported and is only niche-used"), and some claim he isn't open to arguments
and tbh, if someone exposes users to an actually relevant security risk (1) because the change adds a bit of defense-in-depth security (2), and then implicitly denounces them for "wanting crap", that raises a lot of red flags IMHO.
(1): Copy-pasting passwords is a very bad idea; the problem is phishing attacks with "look-alike" domains. Your password manager won't fill them out, but your copy-paste is prone to falling for it. In addition there are other smaller issues related to clipboard safety and similar (hence why KeePassXC clears the clipboard after a short time).
(2): Removing unneeded functionality which could have vulnerabilities. Except we're speaking about code from the same source which, if not enabled/set up, does pretty much nothing (it might still pull in some dependencies, though).
but yes very unnecessary drama
In fact, not having it encourages copy-and-paste, which reduces security.
What's next? Strip JavaScript support from browsers to reduce the attack surface?
I don't get how this is even a discussion. Either he is paid by canonical to be a corporate saboteur or he is completely insane.
Furthermore, if these architectures are removed from further debian updates now, is there any indication that, once there's a rust toolchain supporting them, getting them back into modern debian wouldn't be a bureaucratic nightmare?
The GCCRS project can't even build libcore right now, let alone libstd. In addition, it is currently targeting Rust 1.50's feature set, with some additions that the Linux kernel needs. I don't see it being a useful general purpose compiler for years.
What's more likely is that rustc_codegen_gcc, which I believe can currently build libcore and libstd, will be stabilised first.
These architectures aren't being removed from Debian proper now, they already were removed more than a decade ago. This does not change anything about their status nor their ability to get back into Debian proper, which had already practically vanished.
i.e. they are only still around because they haven't caused any major issues and someone bothered to fix them up from time to time in their own free time
so yes, you probably won't get them back in once they are out, as long as no company shoulders the (work-time) bill for it (and by that I mean long-term maintenance more than the cost of getting them in)
but for the same reason they have little to no relevance when it comes to any future changes which might happen to get them kicked out (as long as no company steps up and shoulders the (work-time) bill for keeping them maintained)
Note that I'm not saying Debian should, I'm saying it is reasonable that they would. I am not a Debian maintainer and so I should not have an opinion on what tools they use, only that adding Rust isn't unreasonable. It may be reasonable to take away a different tool to get Rust in - again this is something I should not have an opinion on but Debian maintainers should.
https://github.com/rust-embedded/cortex-m
Even the embedded world is slowly changing.
People selling slop does not imply much about anything other than the people making the slop
I think the linked requirement, the hype you see, and rust's own material is misleading: It's not a memory-safety one-trick lang; it's a nice overall lang and tool set.
Lack of drivers is prohibitive if you are a small/medium team or are using a lot of complicated peripherals or SoCs. Compare to C, where any MCU or embedded SoC or moderately complex peripheral normally comes with C driver code.
In practice I end up rewriting drivers. Which sounds daunting, but oftentimes it's much easier than folks think, and the resulting code is usually a quarter the size of the original C code or smaller. If you only implement what you need, sometimes drivers can be less than 100 lines of Rust.
One problem: it's tedious going from the pointer-level API bindgen gives you to a high-level Rust API that has references, arrays etc., in that you have to write some boilerplate for each bit of functionality you want. Not a big deal for a specific application, but not ideal if making a general library. And C libs tend to be sloppy with integer types, which works but is not really idiomatic for Rust. Maybe that could be automated with codegen or proc macros?
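A sketch of the boilerplate being described; `checksum` is a hypothetical C function, and the raw binding approximates what bindgen would emit for it:

```
use std::os::raw::c_uint;

// Roughly what bindgen emits for:
//   unsigned checksum(const unsigned char *data, size_t len);
extern "C" {
    fn checksum(data: *const u8, len: usize) -> c_uint;
}

/// The hand-written layer: slices in, plain integers out,
/// with the pointer/length bookkeeping done once, here.
pub fn checksum_of(data: &[u8]) -> u32 {
    // SAFETY: the pointer and length come from a live, valid slice.
    unsafe { checksum(data.as_ptr(), data.len()) }
}
```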
I believe the ESP-IDF rust lib is mostly FFI (?); maybe that's a good example. We've been re-inventing the wheel re STM-32 and Nordic support.
Zig is an example of excelling at C interop--not Rust.
And Cargo is an impediment in the embedded ecosystem rather than a bonus.
Part of why we're getting Rewrite-it-in-Rust everywhere is precisely because the C interop is sufficiently weak that you can't do things easily in a piecemeal fashion.
And let's not talk about Rust compile times, looking at Rust code in a debugger, and just how bad Rust code is in debug mode ...
This is entirely the wrong lens. This is someone who wants to use Rust for a particular purpose, not some sort of publicity stunt.
> I know nobody that programs or even thinks about Rust. I’m from the embedded world and there C is still king.
Now’s a good time to look outside of your bubble instead of pretending that your bubble is the world.
> as long as the real money is made in c it is not ready
Arguably, the real money is made in JavaScript and Python for the last decade. Embedded roles generally have fewer postings with lower pay than webdev. Until C catches back up, is it also not ready?
Telling people they need to take their ball and go home if they're incapable or unable to maintain an entire compiler back-end seems like a, shall we say, 'interesting' lens for a major distro such as Debian.
Just to parse some files?
I think it isn’t reasonable to infer that nobody uses something because you don’t know anybody who uses it in your niche. I know lots of embedded programmers who use Rust.
That's you. At companies like Microsoft and Google, plenty of people think about and discuss Rust, with some products/features already using Rust.
Most people nowadays who criticize Rust do so on a cultural basis of "there are people who want this so and it changes things therefore it is bad". But never on the merits.
Rust is a good language that contains in its design some of the lessons the best C programmers have internalized. If you are a stellar C programmer you will manually enforce a lot of the same rules that Rust enforces automatically. That doesn't mean Rust is a cage. You can always opt for unsafe if you feel like it.
But I know if my life depended on it I would rather write that program in Rust than in C, especially if it involves concurrency or multiprocessing.
Practically, on embedded the issue is that most existing libraries are written in C or C++. That can be a reason not to choose it in daily life, but it is not a rational reason for which a programming language sucks. Every programming language once had only one user. Every programming language once had no dependencies written in it. Rust is excellent at letting you combine it with other languages. The tooling is good. The compiler error messages made other languages realize how shitty their errors were.
Even if nobody programmed in Rust, the good bits of that language lift the quality in the other languages.
On top of that, part of the team didn't have fun writing code in Rust.
We trashed the whole tool, which was a massive loss of time for the project.
Once you take out cargo, Rust's development environment becomes quite poor.
I dislike the tone of the evangelism and the anti-C attitude, but I'm not anti-Rust. I purchased a computer with an oversized amount of RAM in part so I could experiment with Rust. But determining how to write, edit and compile small programs, from the ground up, without cargo appears exceedingly difficult, and feels like going against the tide.
It stands to reason that the embedded programmer commenting was unable to determine how to avoid using cargo and pulling in unnecessary dependencies. Otherwise he would not have encountered this problem
e.g. Chrome & Fuchsia both build included Rust bits using their existing build system.
Bazel and Buck2 both work well with it, relatively.
One can also just be really disciplined with Cargo and not add superfluous deps and be careful about the ones you do include to monitor their transitive dependencies.
IMHO this is more about crates.io than Cargo, and is the biggest weakness of the language community. A bulk of developers unfortunately I think come from an NPM-using background and so aren't philosophically ... attuned... to see the problem here.
(I similarly have yet to see a single convincing argument to try to fight past the awkward, verbose and frustrating language that is rust).
Secondly the argument that because you don't use it in your area no one should use it in OS development is nonsensical.
This is your bias alone. I know tons of people and companies that do. Rust most likely runs on your device.
EC2 (lots of embedded work on servers), IAM, DynamoDB, and parts of S3 all heavily use Rust for quite a few years now already.
We can move really fast with Rust as compared to C, while still saving loads of compute and memory compared to other languages. The biggest issue we've hit is the binary size which matters in embedded world.
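(For what it's worth, the usual levers for shrinking Rust binaries are release-profile settings in Cargo.toml; a generic sketch, not what any particular team necessarily uses:)

```
[profile.release]
opt-level = "z"   # optimize for size rather than speed
lto = true        # cross-crate inlining plus dead-code elimination
codegen-units = 1 # better optimization at the cost of build time
panic = "abort"   # drop the unwinding machinery
strip = true      # strip symbols from the final binary
```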
Linux has added support for Rust now. I don't think Rust's future supremacy over C is doubtful at this point.
AWS might honestly be the biggest on Rust out of all the FAANGs based on what I've heard too. We employ loads of Rust core developers (incl Niko, who is a Sr PE here) and have great internal Rust support at this point :). People still use the JVM where performance doesn't matter, but anywhere performance matters, I don't see anyone being okayed to use C over Rust internally at this point.
But for end users on Debian trying to compile Rust stuff is a nightmare. They make breaking changes in the compiler (rustc) every 3 months. This is not a joke or exaggeration. It's entirely inappropriate to use such a rapidly changing language in anything that matters, because users on a non-rolling distro, LIKE DEBIAN, will NOT be able to compile software written for its constantly moving bleeding edge.
This is an anti-user move to ease developer experience. Very par for the course for modern software.
First, Debian is not a distro where users have to compile their software. The packages contain binaries, the compilation is already done. The instability of Rust would not affect users in any way.
And second, as a developer, I never had a more unpleasant language to work with than Rust. The borrow checker back then was abysmal. Rust is not about developer happiness - Ruby is - but its memory safety makes it a useful option in specific situation. But you can be sure that many developers will avoid it like a plague - and together with the breakage and long compile times that's probably why moves like the one dictated here are so controversial.
Sure it would. Suppose a rust-based package has a security bug. Upstream has fixed it, but that fix depends on some new rust language feature that the frozen version of rust in Debian doesn't have yet.
The rustc version will be fixed for compatibility at every release, and all Rust dependencies must be packaged for apt.
In the debian context, the burden imposed by rust churn and "cargo hell" falls on debian package maintainers.
If 32-bit x86 support can be dropped for pragmatic reasons, so can these architectures. If people really, really want to preserve these architectures as ongoing platforms for the future, they need to step up and create a backend for the Rust toolchain that supports them.
All (current) languages eventually have a compiler/runtime that is memory unsafe. This is basically fine because it's a tiny amount of surface area (relative to the amount of code that uses it) and it exists in a way that the input to is relatively benign so there's enough eyes/time/... to find bugs.
There's also nothing stopping you from re-implementing python/ruby/... in a safer way once that becomes the low hanging fruit to improve computer reliability.
How many type confusion 0 days and memory safety issues have we had in dynamic language engines again? I've really lost count.
My impression is that for the trusted code untrusted input case it hasn't been that many, but I could be wrong.
A lot of the C code used in python is calling out to old, battle tested and niche libraries so it is unlikely that someone is going to replace those any time soon but Rust is definitely increasing as time goes on for greenfield work.
I have been seeing hatred towards Rust on this forum for a long time. Initially it didn't make any kind of sense. Only after actually trying to learn it did I understand the backlash.
It actually is so difficult that most people might never be able to become proficient in it, even if they tried, especially coming from the world of memory-managed languages. This creates pushback against any and every use or promotion of Rust. The unspoken fear seems to be that they will be left behind if it takes off.
I completed my battles with Rust. I don't even use it anymore (because of lack of opportunities). But I love Rust. It is here to stay and expand. Thanks to the LLMs and the demand for verifiability.
For instance,
    struct Feet(i32);
    struct Meters(i32);

    fn hover(altitude: Meters) {
        println!("At {} meters", altitude.0);
    }

    fn main() {
        let altitude1 = Meters(16);
        hover(altitude1);
        let altitude2 = Feet(16);
        hover(altitude2);
    }
This fails at build time with:

    12 |     hover(altitude2);
       |     ----- ^^^^^^^^^ expected `Meters`, found `Feet`
       |     |
       |     arguments to this function are incorrect
Guaranteeing that I’ve never mixed units means I don’t have to worry about parking my spacecraft at 1/3 the expected altitude. Now I can concentrate on the rest of the logic. The language has my back on the types so I never have to waste brain cycles on the bookkeeping parts.

That’s one example. It’s not unique to Rust by a long shot. But it’s still a vast improvement over C, where that same signed 32-bit data type is the number of eggs in a basket, the offset of bytes into a struct, the index of an array, a UTF-8 code point, or whatever else.
This really shows up at refactoring time. Move some Rust code around and it’ll loudly let you know exactly what you need to fix before it’s ready. C? Not so much.
> An investigation attributed the failure to a measurement mismatch between two measurement systems: SI units (metric) by NASA and US customary units by spacecraft builder Lockheed Martin.[3]
But also think of how many libc functions take multiple ints or multiple chars in various orders. You can get carried away with typing, i.e. by having a separate type for everything*. Still, imagine you’re writing, say, a hypothetical IDE device driver and had separate types for BlockNumber and ByteInBlock so that it’s impossible to transpose read(byte_offset,block) instead of read(block,byte_offset), even if those are really the same kind of numbers.
That kind of thing makes a gazillion bugs just vanish into the ether.
There appears to be some tension between different conveniences you might afford yourself. If you have read(offset: offsetTypeForRead, address: addressTypeForRead), you can catch when someone accidentally passes an address where the offset should be and an offset where the address should be.
Or, you can say "hey, I'm always adding the offset to the address; it doesn't matter which one gets passed first" and relieve the programmer of needing to know the order in which two semantically distinct variables get passed to `read`.
But if you do provide that convenience -- and it's not unintuitive at all; there really is only one valid interpretation of a combination of an address and an offset, regardless of the order you mention them in -- you lose some of the safety that you wanted from the types. If your variables are declared correctly, everything is fine. If there's a mistake in declaring them, you'll wave through incorrect calls to `read` that would have been caught before.
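One way to encode that convenience without giving up the checking, as a rough Rust sketch (the Address/Offset names are made up):

```
use std::ops::Add;

#[derive(Clone, Copy, Debug, PartialEq)]
struct Address(u64);

#[derive(Clone, Copy, Debug, PartialEq)]
struct Offset(u64);

// Address + Offset has exactly one meaning...
impl Add<Offset> for Address {
    type Output = Address;
    fn add(self, rhs: Offset) -> Address {
        Address(self.0 + rhs.0)
    }
}

// ...so Offset + Address is allowed to mean the same thing.
impl Add<Address> for Offset {
    type Output = Address;
    fn add(self, rhs: Address) -> Address {
        rhs + self
    }
}

fn main() {
    let base = Address(0x1000);
    let off = Offset(0x10);
    assert_eq!(base + off, off + base); // argument order no longer matters
    // Address(1) + Address(2); // still rejected: adding two addresses is meaningless
}
```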
    fn sub(a: LeftOp, b: RightOp)

but that seems redundant. There are still plenty of other cases where I could see your idea being useful. Like I always forget whether (in Python) it’s json.dump(file, data) or dump(data, file). Ultimately, should it matter? I’m passing a file handle and an object, and it’s unambiguous how those two args relate to the task at hand.

> ... ground controllers ignored a string of indications that something was seriously wrong with the craft's trajectory, over a period of weeks if not months. But managers demanded that worriers and doubters "prove something was wrong," even though classic and fundamental principles of mission safety should have demanded that they themselves, in the presence of significant doubts, properly "prove all is right" with the flight
Dropping units on the NASA side also was problematic but really culture was the cause of the actual crash.
[0] https://spectrum.ieee.org/why-the-mars-probe-went-off-course
/s
In C++ you can even add overloaded operators to make math on such structs ergonomic.
Compilers know of the idiom, and will optimize the struct away.
IIUC, Rust would NOT let you do a type-checked m/s * s => m, so using the type system for these kinds of games is silly and dangerous (I presume you would have to do the dumb thing and type-convert to the same type, e.g.

    (m) (speed * ((m/s) seconds))

to do the multiplication, which means you're inserting unscientific and reader-confusing type conversions all over the place).

```
#include <stdio.h>

typedef struct { int value; } Feet;
typedef struct { int value; } Meters;

void hover(Meters altitude) {
    printf("At %i meters\n", altitude.value);
}

int main() {
    Meters altitude1 = {.value = 16};
    hover(altitude1);
    Feet altitude2 = {.value = 16};
    hover(altitude2);
}
```

```
error: passing 'Feet' to parameter of incompatible type 'Meters'
   20 |     hover(altitude2);
```
Coming from a dynamically typed language (Python, etc.), this might seem like a revelation, but it's old news since the dawn of programming computers. A C language server will pick this up before compile time, just like `rust-analyzer` does: `argument of type "Feet" is incompatible with parameter of type "Meters"`.
Did you not know this? I feel like a lot of people on message boards criticizing C don't know that this would fail to compile and the IDE would tell you in advance...
Time and time again, theoretically worse solutions that are easily accessible win
I think this is more true of C than it is of Rust if the bar is "code of sufficient quality to be included in debian"
It might take some people months rather than days, but I think that is a desirable outcome.
Important low level software should be written by competent developers willing to invest the effort.
If people from that world complain about Rust, I surely wouldn't want them around a C codebase.
There's nothing wrong about memory-managed languages, if you don't need to care about memory. But being unfamiliar with memory and complaining about the thing that help you avoid shooting your foot isn't something that inspires trust.
The hardship associated with learning rust isn't going to go away if they do C instead. What's going to happen instead is that bugged code will be written, and they will learn to associate the hardship with the underlying problem: managing memory.
That could also be applied to C and C++ …
Separation of concerns solves this, because the compiler's implementation language has minimal impact on the trustworthiness of the code the Rust compiler generates. Indeed, one would expect that all the ways the LLVM compiler fails are ways any Rust implementation would fail too: by generating the wrong code, which is rarely if ever due to memory safety or thread safety issues. There may be other reasons to write the compiler backend in Rust, but I wouldn’t put the trust of compiled Rust code anywhere near the top of the reasons to do that.
Can you provide some evidence to support this? There’s a large body of evidence to the contrary, e.g. from Chrome[1].
> But we have tools to prevent that. The new security issues are supply chain attacks.
Speaking as a “supply chain security” person, this doesn’t really hold water. Supply chain attacks include the risk of memory unsafety lurking in complex dependency trees; it’s not an either-or.
[1]: https://www.chromium.org/Home/chromium-security/memory-safet...
Does it audit third-party code for you?
The average C project has at most a handful of other C dependencies. The average Rust, Go or NodeJS project? A couple hundred.
Ironically, because dependency management is so easy in modern languages, people started adding a lot of dependencies everywhere. Need a leftpad? Just add one line in some yaml file or an "Alt-Enter" in an IDE. Done.
In C? That is a lot more work. If you do that, you do it for advanced stuff you absolutely need for your project. Because it is not easy. In all likelihood you write that stuff yourself.
I think the problem started with the idea of language-level package managers that are just GitHub collections instead of curated distribution-level package managers. So my response to "C has no good package manager" is: it should not have a package manager, and Cargo or npm or the countless Python managers should all not exist either.
I don't know about that. Look at the code for the COSMIC desktop environment's clock widget (the cosmic-applet-time directory under <https://github.com/pop-os/cosmic-applets>), for example. It's pretty much unreadable compared to a C code base of similar complexity (GNU coreutils, for example: <https://savannah.gnu.org/projects/coreutils/>).
If I wanted to tweak the Rust project, I’d feel pretty confident I was calling the right things with the right params.
Java can potentially have the same problem. But because everyone uses an IDE and because it's rarely really an issue, everyone will simply import `Baz` rather than worry about the Foo::Baz and Bat::Baz collision. It does happen in Java code, but I can't stress enough how infrequently it's actually a problem.
In Java, I rarely pay attention to the `import` section (and neither do most devs at my company, as far as I know).
You can look up `using namespace std;` in google and you'll find a lot of articles saying it's a bad practice in C++. Everyone recommends writing the full `std::cout` rather than `cout`.
I do think it’s down to personal preference. With the fully qualified names, I can look at the screen and follow the flow without having to mouse over the various names in play. For that matter, I could print it out if I wanted to and still have all the information.
I don’t think you’re objectively wrong. It’s more that we have different approaches to managing the complexity when it gets hairy.
Most of the code in that module is dedicated to the gui maintenance. The parts that do deal with time are perfectly legible.
as in that "isn't the style of code you are used too"
I don't think "how well people not familiar with you language can read it" is a relevant metric for most languages.
Also, IMHO, while C feels readable, it isn't when it matters, because it very often just doesn't include information you need when reading. Like, looking at a function header doesn't tell you if a ptr is nullable, or if a mut ptr is a changeable input value or instead an out ptr that is supposed to point to uninitialized memory, or, if there is an error, how that affects the validity of any mutable ptrs passed in, to name just some examples (let's not even get started on preprocessor macros pretending to be C functions). In conclusion, while C seems nice to read, it is IMHO often a painful experience to "properly" read it, e.g. in the context of a code review.
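To make that concrete, a sketch with a made-up `parse` function; the C declaration in the comment leaves all those questions open, while the Rust signature answers them in the types:

```
// The C header might read: `int parse(const char *buf, config_t *out);`
// Nullable buf? Is *out written on failure? Nothing in the header says.

struct Config {
    verbose: bool,
}

// Here the signature itself answers: input is a non-null slice, the
// output only exists on success, and failure carries a message.
fn parse(buf: &[u8]) -> Result<Config, String> {
    if buf.is_empty() {
        return Err("empty input".to_string());
    }
    Ok(Config { verbose: buf[0] == b'v' })
}

fn main() {
    assert!(parse(b"").is_err());
    assert!(parse(b"v").unwrap().verbose);
}
```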
As a side note: the seemingly verbose syntax of e.g. `chrono::DateTime` comes from there being two DateTime types in use in the module, one from the internationalization library (icu) and one from a generic time library (chrono). Same for Sender, etc. That isn't a super common issue, but it happens sometimes.
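(A minimal illustration of that kind of collision, using two std `Result` types in place of the chrono/icu pair; you either qualify the path or rename on import:)

```
// Two unrelated types named `Result` coexist via renamed imports.
use std::fmt::Result as FmtResult;
use std::io::Result as IoResult;

fn fmt_ok() -> FmtResult {
    Ok(())
}

fn io_ok() -> IoResult<()> {
    Ok(())
}

fn main() {
    fmt_ok().unwrap();
    io_ok().unwrap();
}
```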
I disagree. Both seem perfectly readable, assuming you know their preferred coding styles. As a non-C programmer, I absolutely despise running into #ifndef SOME_OBSCURE_NAME and `while (n) { if (g) {` but C (and in the latter case Go) programmers seem to love that style.
Comparing a bunch of small, barely integrated command line programs to a UI + calendar widget doesn't seem "of similar complexity" to me. Looking at a C clock widget (https://gitlab.freedesktop.org/xorg/app/xclock/-/blob/master...) the difference seems pretty minimal to me. Of course, the XClock code doesn't deal with calendars, so you have to imagine the extra UI code for that too.
I beg to differ.
The easiest way to see this is in US locales, which use 12-hour clocks in GNU 'date' but not other implementations:
    $ date -d '13:00'
    Sat Nov 1 01:00:00 PM PDT 2025
    $ uu_date -d '13:00'
    Sat Nov 1 13:00:00 2025

I added a test case for that recently, since it is a nice usability feature [1].

[1] https://github.com/coreutils/coreutils/commit/1066d442c2c023...
There are other languages that are considered acceptable, even desirable, languages to write applications in (e.g., Java, PHP, Go), but Rust is really the first language to compete sufficiently close to C's competence for people to contemplate adding it to the base-system-languages list. I'd say only Go has ever come close to approaching that threshold, but I've never seen it contemplated for something like systemd.
Interestingly, I wonder if the debates over the addition of C++, Python, and Perl to the base system language set were this acrimonious.
I think any projects that are run by people who see themselves as "X people" (like Python people, Perl people) always have a bit of an "ick" reaction to new languages being added to projects they might see as part of a language's community.
So say you're a C++ developer, contributed to APT over the years, see all of it linked to the C++ community which you are part of too, and someone wants to start migrating parts of it to Rust/$NewLang. I think it might sometimes affect more for these people than just the code, might even be "attacking" (strong word perhaps) their sense of identity, for better or worse.
If APT were a hardcore C++ project surely we'd have like adopted namespaces everywhere by now.
> indeed, there's quite a few commenters here who I think would be surprised to learn that not only is C++ on this list, but that it's been on it for at least 25 years
... isn't so surprising.

I’ve noticed a lot of that in base OS systems
Its a curiosity more than anything though
Debian has ongoing efforts to make many shell scripts (like postinst Scripts in packages etc.) non-bash-specific.
A minimal Debian installation doesn't contain bash, but rather dash, which doesn't support bash extensions.
But even so - what price correct & secure software? We all lost a tonne of performance overnight when we applied the first Meltdown and Spectre workarounds. This doesn't seem much different.
- It's not an option for Debian core infrastructure until it supports at least the same platforms Debian does (arm, riscv, etc.), and it currently only supports x86_64.
- It doesn't turn C into a modern language, and since it looks like there's active development here, getting the productivity benefits of moving away from C is likely still worth it.
> Critical infrastructure still written in C - particularly code that parses data from untrusted sources - is technical debt that is only going to get worse over time.
But hasn't all that foundational code been stable and wrung out already over the last 30+ years? The .tar and .ar file formats are both from the 70s; what new benefits will users or developers gain from that thoroughly battle-tested code being thrown out and rewritten in a new language with a whole new set of compatibility issues and bugs?
Additionally, the fact that this comes across as so abrasive and off-putting is on brand for online Rust evangelicalism.
Seeing this tone-deaf message from an Ubuntu employee would be funny if I didn’t actually use Ubuntu. Looks like I have to correct that…
In all seriousness though, let me assure you that I plan to take a very considerate approach to Rust in APT. A significant benefit of doing Rust in APT rather than rewriting APT from scratch in Rust means that we can avoid redoing all our past mistakes because we can look at our own code and translate it directly.
https://github.com/keepassxreboot/keepassxc/issues/10725#iss...
No: a little less than 5 years ago there was CVE-2020-27350, a memory safety bug in the tar/ar implementations.
Not necessarily. The "HTTP signature verification code" sounds like it's invoking cryptography, and the sense I've had from watching the people who maintain cryptographic libraries is that the "foundational code" is the sort of stuff you should run away screaming from. In general, it seems to me to be the cryptography folks who have beat the drum hardest for moving to Rust.
As for other kind of parsing code, the various archive file formats aren't exactly evolving, so there's little reason to update them. On the other hand, this is exactly the kind of space where there's critical infrastructure that has probably had very little investment in adversarial testing either in the past or present, and so it's not clear that their age has actually led to security-critical bugs being shaken out. Much as how OpenSSL had a trivially-exploitable, high criticality exploit for two years before anybody noticed.
(New cryptographic software can also be developed by all sorts of people. In this case I'm not familiar, but we do know that GnuPG worked for the highest profile case imaginable.)
Ironically, it was the urge not to roll your own cryptography that got people caught in GPG-related security vulnerabilities.
You don't want the core cryptography implemented in Rust for Rust's sake when there's a formally verified Assembler version next to it. Formally verified _always_ beats anything else.
The core cryptographic algorithms, IMHO, should be written in a dedicated language for writing cryptographic algorithms so that they can get formally-verified constant-time assembly out of it without having to complain to us compiler writers that we keep figuring out how to deobfuscate their branches.
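For illustration, the idiom at issue looks roughly like this sketch; the second function is the classic branch-free comparison, and the complaint above is that an optimizer which recognizes the pattern can quietly reintroduce the early exit:

```
// Early-exit comparison: timing reveals where the first mismatch is,
// so this must never be used to compare secrets.
fn eq_branchy(a: &[u8; 32], b: &[u8; 32]) -> bool {
    a == b
}

// Classic constant-time idiom: accumulate all differences, branch once.
fn eq_constant_time(a: &[u8; 32], b: &[u8; 32]) -> bool {
    let mut diff = 0u8;
    for (x, y) in a.iter().zip(b.iter()) {
        diff |= x ^ y;
    }
    diff == 0
}

fn main() {
    let a = [7u8; 32];
    let b = [7u8; 32];
    assert!(eq_branchy(&a, &b));
    assert!(eq_constant_time(&a, &b));
}
```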
In contrast, a Rust implementation can be compiled for many architectures easily, and is intrinsically safer than a C version.
Plus, cryptography and PKI are constantly evolving, so they can't benefit from the decades-old trusted implementations.
Formally verified in an obscure language where it's difficult to find maintainers does not beat something written in a more "popular" language, even if it hasn't been formally verified (yet?).
And these days I would (unfortunately) consider assembly as an "obscure language".
(At any rate, I assume Rust versions of cryptographic primitives will still have some inline assembly to optimize for different platforms, or, at the very least, make use of compile intrinsics, which are safer than assembly, but still not fully safe.)
Take BLAKE3 as an example. There's asm for the critical bits, but the structural parts that are going to be read most often are written in rust like the reference impl.
It seems that for reasons I don't understand this idea isn't popular and people really like hand rolling assembly.
https://github.com/PLSysSec/FaCT
They struggle to guarantee constant time for subroutines within a non-constant time application, which is how most people want to use cryptography.
After all, the library wasn't designed around safety; we assumed the .debs you pass to it are trusted in some way, because you publish them to your repository or you are about to install them so they have root maintainer scripts anyway.
But as stuff like hosting sites and PPAs came up, we have operators publishing debs for untrusted users, and hence suddenly there was a security boundary of sorts and these bugs became problematic.
Of course memory safety here is only one concern, if you have say one process publishing repos for multiple users, panics can also cause a denial of service, but it's a step forward from potential code execution exploits.
I anticipate the rewrites to be 1 to 1 as close as possible to avoid introducing bugs, but then adding actual unit tests to them.
No. Rust is not magic; it just forces a discipline in which certain safety checks can be made automatically (or are obviated entirely). In other languages like C, the programmer needs to perform those checks, and it's technical debt if the C code is not written carefully and reviewed for such issues. If coding is careful and the code is reviewed, there is no technical debt, or perhaps I should say no more than in the unsafe parts of a Rust codebase or the standard libraries. And the safety of critical infra code written in C gets _better_ over time, as such technical debt is repaid.
> Rust is explicitly designed to be what you'd get if you were to re-create C knowing what we know now about language design and code safety.
That's not true. First, it's not a well-defined statement, since "what we know now" about language design is, as it has always been, a matter of debate and a variety of opinions. But even regardless of that - C was a language with certain design choices and aesthetics. Rust does not at _all_ share those choices - even if you tack on "and it must be safe". For example: Rust is much richer language - in syntax, primitive types, and standard library - than C was intended to be.
How many decades have we tried this? How many more to see that it just hasn't panned out like you describe?
According to what?
> Rust is explicitly designed
There is no standard. It's accidentally designed.
> knowing what we know now about language design and code safety.
You've solved one class of bugs outside of "unsafe {}". The rest are still present.
Are you really claiming that you can't design a language without an official standard? Not to mention that C itself was designed long before its first ISO standard. Finally, the idea that a standards committee is a precondition for good language design is rather bold, I have to say. The phrase "design by committee" isn't typically used as a compliment...
> You've solved one class of bugs outside of "unsafe {}".
It's "only" the single most important class of bugs for system safety.
This kind of deflection and denialism isn't helping. And I'm saying this as someone who really likes C++.
Because that saves a lot of headaches down the line.
What I don't get is the burning need for Rust developers to insult others. Kind of the same vibes that we get from systemd folks and LP. Does it mean they have psychological issues and deep down in their heart they know they need to compensate?
I remember C vs Pascal flame back in the day but that wasn't serious. Like, at all. C/C++ developers today don't have any need to prove anything to anyone. It would be weird for a C developer to walk around and insult Rust devs, but the opposite is prevalent somehow.
By Sequoia, are they talking about replacing GnuPG with https://sequoia-pgp.org/ for signature verification?
I really hope they don't replace the audited and battle-tested GnuPG parts with some new-fangled project like that just because it is written in "memory-safe" rust.
Meanwhile, GnuPG is well regarded for its code maturity. But it is a C codebase with nearly no tests, no CI pipeline(!!), an architecture that is basically a statemachine with side effects, and over 200 flags. In my experience, only people who haven't experienced the codebase speak positively of it.
It exits 0 when the verification failed, it exits 1 when it passed, and you have to ignore it all and parse the output of the status fd to find the truth.
It provides options to enforce various algorithmic constraints but they only work in some modes and are silently ignored in others.
Does Sequoia-PGP have similar credentials and who funds it?
What would really be scary would be a distro that won't even boot unless a variety of LLM's are installed.
Boo!
No changes required. Bringing up the fil-C toolchain on weird ports is probably less work than bringing up the Rust toolchain
It also doesn't help you to attract new contributors. With the changes we made over in Ubuntu to switch to rust-coreutils and sudo-rs, we have seen an incredible uptake in community contributions amongst other things, and it's very interesting to me to try to push APT more into the community space.
At this time, most of the work on APT is spent by me staying awake late, or during weekends and my 2 week Christmas break, the second largest chunk is the work I do during working hours but that's less cool and exciting stuff :D
Adding Rust into APT is one aspect; the other, possibly even more pressing need is rewriting all the APT documentation.
Currently the APT manual pages are split into apt-get and apt-cache and so on, with a summary in apt(8); we should split them across apt install(8), apt upgrade(8) and so on. At the same time, DocBook XML is not very attractive to contributors, and switching to reStructuredText with Sphinx hopefully attracts more people to contribute to it.
That's easily fixable.
> It also doesn't help you to attract new contributors.
I don't understand this point.
as easily as fixing Rust to work on the remaining 4 architectures?
> > It also doesn't help you to attract new contributors.
>
> I don't understand this point.
C++ doesn't attract a lot of developers, Rust attracts many more. I want more community, particularly _young_ community. I don't wanna work on this alone all the time :D
And this argument about "young" contributors is the same nonsense that came from your senior management. But you're independent.
Aren't the experienced engineers supposed to be leading the next generation? If you really want to get the young folks on board, drop Ubuntu and call it Gyatt. Instead of LTS, call it Rizz. Just think of all the young who will want to work on Skibidi 26.04!
Rust attracts hype and hype artists. Ask me how I know. Do you want drive-by people or do you want long-term community members? There are many young folk interested in learning C and looking for adequate mentorship along with a project to work on. Wouldn't that be a better use of energy? Have you even put out any outreach to attract others to these projects where you say you're alone?
You are making a mistake and falling on the sword for your bosses at the same time. Tough days are here but maybe hold on for better employment than this.
Easier, because you won't have to port Fil-C to all of the architectures in order to use it on amd64.
> C++ doesn't attract a lot of developers, Rust attracts many more.
C is #2 on TIOBE.
C++ is #3 on TIOBE.
Rust is #16 on TIOBE.
So I don't know what you're talking about
Sorry to double-reply, but this is actually a super important point in favor of Fil-C.
If you adopted Fil-C for apt, then you could adopt it optionally - only on ports that had a Fil-C compiler. Your apt code would work just as well in Fil-C as in Yolo-C. It's not hard to do that. I think about half the software I "ported" to Fil-C worked out of the box, and in those cases where I had to make changes, they're the sort of changes you could upstream and maintain the software for both Fil-C and Yolo-C.
So, with Fil-C, there would be no need to ruffle feathers by telling port maintainers to support a new toolchain!
Honestly, I am not even opposed to Rust. It has cool ideas. I do think it should care a lot more about being portable and properly defined and should have done so a lot earlier and I do deeply disagree with the opinion of some core team members that specification is meaningless.
C obviously always was a questionable choice for a tool like apt but Rust seems even worse to me. Apt has absolutely no need to be written in a low level language. At least you could argue that C was chosen because it’s portable but I don’t see what Rust has going for it.
0: https://rustfoundation.org/media/ferrous-systems-donates-fer...
Do you think that it was made up from whole cloth in the abstract machine and implemented later? No, it was based on the available implementations of its time.
On top of that, languages like Python do not have a specification and yet have multiple implementations.
Rust also has multiple compilers (rustc, mrustc, and gccrs) though only one is production ready at this time.
There is other work on specifying Rust (e.g. the Unsafe Code Guidelines Working Group), but nothing approaching a real spec for the whole language. Honestly, it is probably impossible at this point; Rust has many layers of implementation-defined hidden complexities.
But even if we accept that, it doesn’t seem like a good comparative argument: anybody who has written a nontrivial amount of C or C++ has dealt with compiler-defined behavior or compiler language extensions. These would suggest that the C and C++ standards are “performative” in the same sense, but repeated claims about the virtues of standardization don’t seem compatible with accepting that.
> The FLS is not intended to be used as the normative specification of the Rust language
The actual informal semantics in the standard and its successors is written in an axiomatic (as opposed to operational or denotational) style, and is subject to the usual problem of axiomatic semantics: one rule you forgot to read can completely change the meaning of the other rules you did read. There are a number of areas known to be ill-specified in the standard, the worst probably being the implications of the typed memory model. There have since been formalized semantics of C, which are generally narrower than the informal version in the standard and make some additional assumptions.
C++ tried to follow the same model, but C++ is orders of magnitude more complex than C, and thus its standard is overall less well specified than the C standard (e.g. there is still no normative list of all the undefined behavior in C++). It is likely practically impossible to write a formal specification for C++. Still, essentially all of the work on memory models for low-level programming languages originates in the context of C++ (and was then ported back to C and Rust).
Also, the C++ ordering model is defective in the sense that while it offers the orders we actually use, it also offers one (memory_order_consume) that nobody knows how to implement, so it's basically just wishful thinking. For years now the C++ standard has labelled this order "temporarily discouraged" as experts tried to repair the definition, and C++26 is slated to just deprecate it instead. Rust doesn't copy that defect.
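To make that last point concrete, here's a minimal Rust sketch (mine, not from any of the projects discussed): the standard library's Ordering enum simply has no consume variant, so code that would have reached for memory_order_consume in C++ is written with Acquire instead.

    use std::sync::atomic::{AtomicUsize, Ordering};

    // Rust only exposes the orderings compilers know how to implement:
    // Relaxed, Release, Acquire, AcqRel, SeqCst. There is no Consume.
    static READY: AtomicUsize = AtomicUsize::new(0);

    fn main() {
        READY.store(1, Ordering::Release);
        // A consume-style dependent load must be written as Acquire.
        let v = READY.load(Ordering::Acquire);
        println!("{v}");
    }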
The war is over. ARM and x86 won.
(Plus, architecture quantity isn’t exactly the thing that matters. Quality is what matters, and Rust’s decision to conservatively stabilize on the subset of LLVM backends they can reliably test on seems very reasonable to me.)
(We detached this subthread from https://news.ycombinator.com/item?id=45782109.)
I am asking if the former option is a practical one.
For other architectures currently unsupported by Rust, I doubt it'll happen. The CPU architectures themselves are long dead and often only used for industrial applications, so the probability of hobbyists getting their hands on them is pretty slim.
People still using these old architectures for anything but enthusiast hacking will probably not be using Debian Trixie, and if they do, they can probably find a workaround. It's not like the .deb format itself is changing, so old versions of apt and dpkg will keep working for quite a while.
I don't know if the Rust compiler produces bigger binaries, but for a single program it won't make a big difference.
What's the long-term play for Canonical here?
The obvious potential motivations are things like making a more reliable product, or making their employees more productive by giving them access to modern tools... I guess I could imagine preparing for some sort of compliance/legal/regulatory battle where it's important to move towards memory-safe tooling, but even there I'd imagine that Microsoft is better placed to make that claim, and any move on Canonical's part would be defensive.
Presumably it's rewriting critical parsing code in APT to a memory-safe language.
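To illustrate what that buys (a hedged sketch, not APT's actual code): parsing a control-file-style "Key: value" line in Rust involves no manual buffer arithmetic, so the classic out-of-bounds parsing bugs can't happen by construction.

    // Hypothetical example; APT's real parser is far more involved.
    fn parse_field(line: &str) -> Option<(&str, &str)> {
        // split_once never reads past the end of the string; a
        // malformed line yields None rather than undefined behavior.
        let (key, value) = line.split_once(':')?;
        Some((key.trim(), value.trim()))
    }

    fn main() {
        assert_eq!(parse_field("Package: apt"), Some(("Package", "apt")));
        assert_eq!(parse_field("garbage"), None);
    }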
Open source fundamentally is a do-ocracy (it's in literally all of the licenses). Those who do, decide; and more and more often those who do are just one or two people for a tool used by millions.
Or I guess if you interpret this at a societal scale: we've collectively used C in production a lot, and look at all the security problems. Judgment completed. Quality is low.
> Rust is a security nightmare. We'd need to add over 130 packages to main for sequoia, and then we'd need to rebuild them all each time one of them needs a security update.
What has changed? Why is 130 packages for a crypto application acceptable?
> I find this particular wording rather unpleasant and very unusual to what I'm used to from Debian in the past. I have to admit that I'm a bit disappointed that such a confrontational approach has been chosen.
Ref: https://lists.debian.org/debian-devel/2025/10/msg00286.html
Loved this statement on the state of modern software using the backbone of C (in Linux and elsewhere).
This doesn't seem like a noteworthy change to the degree to which GNU/Linux is an accurate name... though there are lots of things I'd put more importance on than GNU in describing debian (systemd, for instance).
Edit: Looks like Perl 1.0 was under the following non-commercial license, so definitely not always GPL, though that now leaves the question of licensing when Debian adopted it, if you really care.
> You may copy the perl kit in whole or in part as long as you don't try to make money off it, or pretend that you wrote it.
https://github.com/AnaTofuZ/Perl-1.0/blob/master/README.orig
But, there are now a lot more replacements for GNU's contributions under non-copyleft licenses, for sure.
Whether the rewrite should be adopted to replace the original is certainly a big discussion. But simply writing a replacement isn’t really worth complaining about.
uutils/coreutils is MIT-licensed and primarily hosted on GitHub (with issues and PRs there) whereas GNU coreutils is GPL-licensed and hosted on gnu.org (with mailing lists).
EDIT: I'm not expressing a personal opinion, just stating how things are. The license change may indeed be of interest to some companies.
The GPL protects the freedom of the users while MIT-licensed software can be easily rug-pulled or be co-opted by the big tech monopolists.
Using GitHub is unacceptable as it is banning many countries from using it. You are excluding devs around the world from contributing. Plus it is owned by Microsoft.
So we replaced a strong copyleft license and a solid decentralized workflow with a centralized repo that depends on the whims of Microsoft and the US government and that is somehow a good thing?
There is also another crowd that completely aligns with US foreign policy and has the same animosity towards those countries' citizens (I've seen a considerable number of examples of this).
As for the license part, I really don't get the argument: how can a coreutils rewrite get rug-pulled? This is not a hosted service where a MinIO-like situation [1] [2] can happen, and the original utils will always be there if something like that were to occur.
[1] http://news.ycombinator.com/item?id=45665452 [2] https://news.ycombinator.com/item?id=44136108
* It's becoming increasingly difficult to find new contributors who want to work with very old code bases in languages like C or C++. Some open source projects have said they rewrote to Rust just to attract new devs.
* Reliability can be proven through years in use but security is less of a direct correlation. Reliability is a statistical distribution centered around the 'happy path' of expected use and the more times your software is used the more robust it will become or just be proven to be. But security issues are almost by definition the edgiest edge cases and aren't pruned by normal use but by direct attacks and pen testing. It's much harder to say that old software has been attacked in every possible way than that it's been used in every possible way. The consequences of CVEs may also be much higher than edge case reliability bugs, making the justification for proactive security hardening much stronger.
On your second point: I wonder how the aviation, space, and car industries do it. They rely heavily on tested/proven concepts. What do they do when introducing a new type of material to replace another one, or when a complete assembly workflow gets updated?
Because of backwards compatibility. You don’t rewrite Linux from scratch to fix old mistakes, that’s making a new system altogether. And I’m pretty sure there are some people doing just that. But still, there’s value in rewriting the things we have now in a future-proof language, so we have a better but working system until the new one is ready.
The world isn't black or white. Some people write Rust programs with the intent to be drop-in compatible programs of some other program. (And, by the way, that "some other program" might itself be a rewrite of an even older program.)
Yet others, such as myself, write Rust programs that may be similar to older programs (or not at all), but definitely not drop-in compatible programs. For example, ripgrep, xsv, fd, bat, hyperfine and more.
I don't know why you insist on a world in which Rust programs are only drop-in compatible rewrites. Embrace the grey and nuanced complexity of the real world.
There is a ton of new stuff getting written in Rust. But we don't have threads like this on HN when someone announces a new piece of infra written in Rust, only when there's a full or partial rewrite.
Re automotive and other legacy industries, there's heavy process around both safety and security. Performing HARAs and TARAs, assigning threat or safety levels to specific components and functions, deep system analysis, adding redundancy for safety, coding standards like MISRA, etc. You don't get a lot of assurances for "free" based on time-proven code. But in defense there's already a massive push towards memory safe languages to reduce the attack surface.
I struggle to believe that this is really about a call to improve quality when there seem to be some other huge juicy targets.
https://github.com/keepassxreboot/keepassxc/issues/10725#iss...
> Be careful. Rust does not support some platforms well.[0] Anything
> that is not Tier 1 is not guaranteed to actually work. And
> architectures like m68k and powerpc are Tier 3.
>
> [0] <https://doc.rust-lang.org/beta/rustc/platform-support.html>.
[ The rustc book > Platform Support: https://doc.rust-lang.org/beta/rustc/platform-support.html ][ The rustc book > Target Tier Policy: https://doc.rust-lang.org/beta/rustc/target-tier-policy.html... ]
Thank you for your message.
> Rust is already a hard requirement on all Debian release architectures and ports except for alpha, hppa, m68k, and sh4 (which do not provide sqv).
Create a plan to add support for {alpha, hppa, m68k, sh4} targets to the Rust compiler.
2.5 Pro: "Rust Compiler Target Porting Plan" https://gemini.google.com/share/b36065507d9d :
> [ rustc_codegen_gcc, libcore atomics for each target (m68k does not have support for 64-bit atomics and will need patching to libgcc helper functions), ..., libc, liballoc and libstd (fix std::thread, std::fs, std::net, std::sync), and then compiletest will find thousands of bugs ]
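For a sense of what the 64-bit-atomics gap means at the source level, here's a hedged sketch using cfg(target_has_atomic = "64"), which is the real cfg libcore uses to gate atomic widths; the fallback arm below is purely illustrative.

    #[cfg(target_has_atomic = "64")]
    fn bump() -> u64 {
        use std::sync::atomic::{AtomicU64, Ordering};
        static C: AtomicU64 = AtomicU64::new(0);
        C.fetch_add(1, Ordering::Relaxed)
    }

    #[cfg(not(target_has_atomic = "64"))]
    fn bump() -> u64 {
        // Illustrative fallback only: a lock in place of native
        // atomics; real libcore support on a target like m68k would
        // go through compiler-provided helper libcalls instead.
        use std::sync::{Mutex, OnceLock};
        static C: OnceLock<Mutex<u64>> = OnceLock::new();
        let mut c = C.get_or_init(|| Mutex::new(0)).lock().unwrap();
        *c += 1;
        *c - 1
    }

    fn main() {
        println!("{}", bump());
    }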
So, CI build hours on those ISAs - emulated at first, then on the actual hardware?
"Google porting all internal workloads to ARM, with help from GenAI" (2025) https://news.ycombinator.com/item?id=45691519
"AI-Driven Software Porting to RISC-V" (2025) https://news.ycombinator.com/item?id=45315314
"The Unreasonable Effectiveness of Fuzzing for Porting Programs" (2025) https://news.ycombinator.com/item?id=44311241 :
> A simple strategy of having LLMs write fuzz tests and build up a port in topological order seems effective at automating porting from C to Rust.
Yes there are absolutely some obnoxious "you should rewrite this in Rust" folks out there, but this is not a case of that.
CartwheelLinux•15h ago
Much of the language used seems to stem from nauseating interactions that have occurred in the kernel world around Rust usage.
I'm not a big fan of Rust for reasons that were not brought up during the kernel discussions, but I'm also not an opponent of moving forward. I don't quite understand the pushback against memory-safe languages and the defensiveness against adopting modern tooling/languages.
lelanthran•15h ago
If you could separate the language from the acolytes it would have seen much faster adoption.
kaoD•14h ago
Rust haters seem strangely obsessed.
lelanthran•14h ago
Well, this is a great example. People complaining about the community are labeled as people complaining about the language.
Do you not see the problem here?
lelanthran•12h ago
Because it literally says "Rust haters"; not "Rust community haters".
Are you saying that when someone refers to "Rust", they mean the community and not the language?
lelanthran•8h ago
Maybe. What does that have to do with the Rust community having such a poor reputation compared to other communities?
testdelacc1•14h ago
That’s an interesting thought. It would run counter to everything we know about human nature, but interesting nevertheless.
Rust is already pretty successful adoption wise. It’s powering significant parts of the internet, it’s been introduced in 3 major operating systems (Windows, Linux, Android), many successful companies in a variety of domains have written their entire tech stack in it. Adoption as measured by crates.io downloads has doubled every year for the last 10 years.
Now I’m imagining how much more widely Rust would be used if they had adopted your visionary approach of never saying anything positive about it.
lelanthran•14h ago
No, it's the people who have given rise to the multiple Rust memes over the years.
I'm battling to think of any other about-to-go-mainstream language that had the reputation of a hostile community. Scala? Kotlin? Swift? Zig? None of those languages have built such poor reputations for their communities.
After all, for quite a few years every thread on forums that mentioned C or C++ was derailed by Rust proponents. I didn't see C++ users jumping into Rust threads posting attacks, but there are many examples of Rust users jumping into C++ or C threads, posting attacks.
> That’s an interesting thought. It would run counter to everything we know about human nature, but interesting nevertheless.
Well, the fact that Rust is an outlier in this sample should tell you everything you need to know; other up-and-coming languages have not, in the past, gotten such a reputation.
testdelacc1•14h ago
Because you’re young or you weren't around in 2010 when Go was gaining adoption. Same shit back then. People said “I like the language, it’s quite useful” followed by tirades from people who thought it was the end of human civilisation. It had exactly the reputation you speak of. (“DAE generics???”)
Eventually the haters moved on to hating something else. That’s what the Rust haters will do as well. When Zig reaches 1.0 and gains more adoption, the haters will be out in full force.
lelanthran•12h ago
I've been working as a programmer since the mid-90s.
>> I'm battling to think of any other about-to-go-mainstream language that had the reputation of a hostile community.
> People said “I like the language, it’s quite useful” followed by tirades from people who thought it was the end of human civilisation.
And? That's not the same as having a hostile community. I never saw Go proponents enter C# or Java discussions to attack the programmers using C# or Java, like I constantly saw with Rust proponents entering C or C++ discussions and calling the developers dinosaurs, incompetent, etc.
testdelacc1•11h ago
Hostile according to who? According to the haters, maybe. I’m sure the Go community was called “hostile” by haters back in the day.
Look at the drama created by Linux maintainers who were being insanely hostile, coming up with spurious objections, being absolute asshats - to the point where even Linus said enough was enough. The Rust for Linux members conducted themselves with dignity throughout. The Linux subsystem maintainers acted like kindergarteners.
But of course, haters will read the same emails and confirmation bias will tell them they’re right and Rust is the problem.
Keep hating.
lelanthran•8h ago
I was there, and no it wasn't. The Go community didn't jump into every programming discussion throwing around accusations of dinosaur, insecurity, etc.
noisem4ker•12h ago
There absolutely are, and have been. You could say it's a reaction. I don't want to argue about who started it.
I agree with you that if the Rust community has gained such a peculiar reputation, it's also due to valid reasons.
tempest_•8h ago
I have rarely seen an argument that pushes back against Rust with actual alternative solutions to the problems the rust proponents are trying to solve. It is mostly a bunch of old people letting the perfect be the enemy of the good.
timeon•11h ago
> I didn't see C++ users jumping into Rust threads posting attacks, but there are many examples of Rust users jumping into C++ or C threads, posting attacks.
I've already seen this with Zig. And even outside language communities. Look at this whole thread. Look into the mirror. Regularly, when Rust is mentioned on HN, the anti-Rust cult comes to complain that there is Rust.
Even if someone just posts "I have made this with Rust", this cult comes and complains "why do you need to mention Rust?!" Look at yourself. Who hurt you?
lelanthran•8h ago
Pointing out that the Rust community has gained such a poor reputation while other communities have not requires "looking into the mirror"?
nicoburns•5h ago
Good news: you can. And that's why it has had fast adoption.
(those advocating for Rust in "meme-like" ways are not generally the same people actually developing the Rust compiler or the core parts of its ecosystem)
Mond_•15h ago
Really? As opposed to e.g. C or C++ (as the most important languages which Rust is competing with)? Sure, taste plays into everything, but I think a lot of people work with Rust since it's genuinely a better tool.
I hear you on free software being controlled by corporate interests, but that's imo a separate discussion from how good Rust is as a language.
noosphr•15h ago
Of course, most people aren't smart enough for the language, so they have to use inferior ALGOL-family languages like Rust.
noosphr•14h ago
This is someone who says things like
>It's important for the project as whole to be able to move forward and rely on modern tools and technologies and not be held back by trying to shoehorn modern software on retro computing devices.
While on company time.
tclancy•10h ago
Yes well, glad to hear there’s no one bullying people there!
lsaferite•9h ago
Elitism is its own form of bullying and needs to be treated as such.
I don't particularly like large swaths of humanity, but I also try hard not to be elitist towards them either. I'm not always successful, but I make a strong effort as my family raised me to be respectful to everyone, even if you don't personally like them.
dvtkrlbs•1h ago
[1] https://github.com/johnperry-math/AoC2023/blob/master/More_D...
einpoklum•14h ago
You see, GP did not speak in relative terms, but in absolute ones: they believe Rust has problems. They did not suggest that problems with programming languages are basically all fungible, that we should sum up all problems, compare different languages, and see which ones come out on top.
mirashii•15h ago
Nobody is being forced out of the community; you can fork and not adopt the changes if you want. That's the real point of free software: you have the freedom to make that choice. The point of free software was never that the direction of the software should be free from corporate control; the maintainers of a project have always had the authority to make decisions about their own project, whether individual or corporate or a mix.
Klonoar•6h ago
You describe it that way, but that's not how the world generally works in practice. You do things based on what the majority decides.
throwaway7356•4h ago
False claims don't really make the claims about the evils of Rust more believable.
kelnos•33m ago
Requiring full consensus for decisions is a great way to make no decisions.
throwingrocks•14h ago
This hasn’t changed.
crote•14h ago
Well, what's the alternative? The memory safety problem is real, I don't think there is any doubt about that.
C/C++ is a dead end: the community has thoroughly rejected technical solutions like the Circle compiler, and "profiles" are nothing more than a mirage. They are yet again trying to make a magical compiler which rejects all the bad code and accepts all the good code without making any code changes, which of course isn't going to happen.
Garbage collection is a huge dealbreaker for the people still on C/C++. This immediately rules out the vast majority of memory-safe languages. What is left is pretty much only Zig and Rust. Both have their pros and cons, but Rust seems to be more mature and has better community adoption.
The way I see it, the pro-memory-safety crowd is saying "There's a giant hole in our ship, let's use Rust to patch it", and the anti-Rust crowd yells back "I don't like the color of it, we shouldn't repair the hole until someone invents the perfect solution". Meanwhile, the ship is sinking. Do we let the few vocal Rust haters sink the ship, or do we tell them to shut up or show up with a better alternative?
zozbot234•14h ago
> Garbage collection is a huge dealbreaker for the people still on C/C++.
The problem is not so much GC itself, but more like pervasive garbage collection as the only memory management strategy throughout the program. Tracing GC is a legit memory management strategy for some programs or parts of a program.
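To put a hedged example behind that: in Rust you can confine automatic memory management (here reference counting rather than a tracing GC, but the locality argument is the same) to the one shared structure that needs it, while the rest of the program keeps plain ownership.

    use std::rc::Rc;

    // Shared, automatically-managed part: several handles to one value.
    fn shared_part() -> (Rc<str>, Rc<str>) {
        let config: Rc<str> = Rc::from("verbose=1");
        (Rc::clone(&config), config) // freed when the last handle drops
    }

    // The rest of the program uses plain ownership with no runtime cost.
    fn owned_part(data: Vec<u8>) -> usize {
        data.len()
    }

    fn main() {
        let (a, b) = shared_part();
        println!("{a} {b} {}", owned_part(vec![1, 2, 3]));
    }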
pron•9h ago
The reason memory safety is interesting in the first place (for practical, not theoretical reasons) is that it is a common cause of security vulnerabilities. But spatial memory safety is a bigger problem than temporal memory safety, and Zig does offer spatial memory safety. So if Rust's memory safety is interesting, then so is the memory safety Zig offers.
I'm a rabid software correctness advocate, and I think people should acknowledge that correctness and safety (and the reasons behind them) are much more complex than the binary question of which behaviours are soundly disallowed by a language (ATS advocates would say that, from their vantage point, Rust is just about as unsafe as C, and so is completely uninteresting from that perspective).
The complexity doesn't end with spatial vs temporal safety. For example, code review has been found to be one of the most effective correctness measures, so if a language made code reviews easier, it would be very interesting from a correctness/security perspective.
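To make the spatial/temporal distinction concrete, a minimal Rust sketch: the spatial violation is caught by a runtime bounds check (which Zig's safe build modes also insert), while the temporal violation is rejected at compile time.

    fn main() {
        let xs = [1, 2, 3];

        // Spatial safety: indexing is bounds-checked, so an
        // out-of-bounds read panics instead of touching adjacent
        // memory. Zig's safe build modes insert the same kind of check.
        // let oob = xs[7]; // panics (rustc even rejects this constant
        //                  // index at compile time)

        // Temporal safety: use-after-free is a compile-time error.
        let s = String::from("hello");
        drop(s);
        // println!("{s}"); // error[E0382]: borrow of moved value

        println!("{}", xs[0]);
    }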
the_duke•5h ago
The whole Rust ecosystem is heavily biased towards prioritising memory safety and "safe by construction".
This is evident in the standard library, in how crates approach API design, what the compilation defaults are, ...
In 6+ years of using Rust the only time I had to deal with segfaults was when working on low level wrappers around C code or JIT compilation.
Zig has some very interesting features, but the way they approach language and API design leaves a lot of surface area that makes mistakes easy.
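One small, hedged example of the "safe by construction" bias mentioned above, using only the standard library: validity is checked once at the type boundary, so the rest of the code can't even express the invalid state.

    use std::num::NonZeroU32;

    // NonZeroU32 makes "zero" unrepresentable, so a divisor validated
    // at construction never needs re-checking downstream.
    fn per_item_cost(total: u32, items: NonZeroU32) -> u32 {
        total / items.get() // division by zero is impossible here
    }

    fn main() {
        match NonZeroU32::new(4) {
            Some(items) => println!("{}", per_item_cost(100, items)),
            None => eprintln!("item count must be non-zero"),
        }
    }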
davemp•9h ago
I've written a good chunk of low-level/bare-metal Rust: unsafe was everywhere and extremely unergonomic. The safety guarantees of Rust are also much weaker in such situations, which is why I find Zig very interesting.
No OOB access, no wacky type coercion, no null pointers: that solves such a huge portion of my issues with C. All I have to do is prove my code doesn't have UAF (or not, if the program isn't critical) and I'm basically on par with Rust with much less complexity.
davemp•7h ago
I didn't actually mind Rust when I was able to write safe userland code, but for embedded projects I've had a much better time with Zig.
cardanome•2h ago
No it is not. We have a lot of amazing and rock solid software written in C and C++. Stuff mostly works great.
Sure, things could be better, but there is no reason why we need to act right now. This is a long-term decision that doesn't need to be rushed.
> What is left is pretty much only Zig and Rust.
We had Ada long before Rust and it is a pretty amazing language. Turns out security isn't that important for many people and C++ is good enough for many projects apparently.
There is also D, Nim, Odin and so on.
> Garbage collection is a huge dealbreaker
It isn't. We had Lisp Machines in the 80s and automatic garbage collection has vastly improved these days. So I wouldn't rule those out either.
In short, no, the ship is not sinking. There are many options to improve things. The problem is that once you depend on Rust it will be hard to remove, so it is better to think things through rather than rushing to adopt it.
mrkeen•13h ago
This assumes there wasn't agreement.
And if so, what would 'eventually adopted by the majority' mean? Is this announcement not that?
tialaramex•12h ago
I haven't seen this from Rust. Obviously lots of us think that Rust is the way forward for us but I think the problem you're talking about is that nobody offered any alternatives you liked better and that's not on Rust.
If Bob is ordering pizza for everybody who wants one, it is not the case that "Pizza is necessarily the way forward", and it's not Bob's fault that you can't have sliders, I think if you want sliders you're going to need to order them yourself and "Pizza is the way forward" is merely the default when you don't and people are hungry.
Dave Abrahams' Hylo is an example of somebody offering to order sushi in this analogy. It's not yet clear whether Dave knows a sushi place that delivers here, or how much the sushi would cost, but that's what having another way forward could look like.
In C++ they've got profiles, which is, generously, "Concepts of a plan" for a way forward and in C... I mean, it's not your focus, but nobody is looking at this right? Maybe Fil-C is your future? I note that Fil-C doesn't work on these obsolete targets either.
bitwize•7h ago
I'll wait.
tcfhgj•15h ago
Apparently, Rust is part of the "woke agenda"
alt187•14h ago
Personally, I'm simply bothered by the fact that (one of?) the most famous figure of Rust on Linux and Rust Forever consumes and advocates for pornography that's illegal in my country, without being held accountable by the community.
From what I could piece together, the only group who ever cried wolf about this is a forum full of contemptuous little angry men who spend weeks researching people they hate on the internet. No one seems to want to touch the subject for fear of being associated with them.
I'll give it to you, this is not a great time.
JuniperMesos•13h ago
I'm pretty suspicious of demands for communities to hold people accountable, especially when the community in question is a loose group of people who mostly communicate online and are united by their shared use of a specific programming technology; and who probably disagree on all sorts of other issues, including contentious ones.
egorfine•9h ago
If some form of speech is illegal in your country, it does not automatically mean it should be illegal for the whole world, or that it is wrong, or that the worldwide community should adhere to standards specific to your country. Even if that country is the USA.
In other words, nobody should give a flying f about open source developers' porn preferences.
rs186•9h ago
Your abhorrent personal opinion of another individual has no place in a technical discussion.
4bpp•9h ago
If you opt into something with as high a barrier to entry and necessary time commitment as a programming language, you naturally also opt into the existing community around that language, because that will be where the potential contributors, people to help you solve issues, and people you have to talk to if you need the language or ecosystem to move in some direction will hail from. In turn, the community will naturally get to impose its own values and aesthetic preferences onto you, whether by proactively using the position of relative power they have over you, or simply by osmosis. As it happens, the community surrounding Rust does largely consist of American progressives, which should not be surprising - after all, the language was created by an American company whose staff famously threatened mutiny when its own CEO turned out to offend progressive sensibilities.
As such, it is natural that bringing Rust into your project would over time result in it becoming more "woke", just like using Ruby would make it more likely that you attract Japanese contributors, or targeting Baikal CPUs would result in you getting pulled into the Russian orbit. The "woke" side themselves recognises this effect quite well, which is why they were so disturbed when Framework pushed Omarchy as a Linux distribution.
Of course, one needs to ask whether it is fair to insinuate premeditation by calling a mere expected effect an "agenda". Considering the endlessly navel-gazing nature of the culture wars, I would find it surprising if there weren't at least some people out there who make the same observation as above, and do think along the lines that driving Rust adoption is [also] a good thing because of it. Thus, Rust adoption does become, in a sense, part of the "woke agenda", just as Rust rejection becomes, perhaps even more clearly so, part of the "chud agenda".
4bpp•2h ago
The general temperature of politics in FOSS, I think, is not obviously lower than before: just in terms of things that made it onto HN, in the past month or so alone we have seen the aforementioned kerfuffle about dhh (the leader? founder? of Ruby on Rails), his projects and their detractors, and the wrestling over control between NixOS's board and its community moderators who were known for prosecuting political purges and wanted to assert formal authority over the former.
JuniperMesos•2h ago
I think this analysis is basically accurate - there's no conspiracy or even deliberate agenda going on, it's just that the community surrounding Rust happens to have (at the moment, anyway) a relatively high number of American progressives, many of whom are openly interested in imposing American progressive ideological norms in spaces they care about (which is basically what we mean by the term "woke").
I think Rust is a good software tool and I would like to see it be as widely adopted and politically-neutral as C is, and used in all sorts of projects run by all sorts of people with all sorts of other agendas, political or otherwise. Consequently, I would like to see people and projects who do not agree with American progressive norms adopt the language and become active users of it, which will help dilute the amount of Rust users who are progressives. I myself am not an American political progressive and I have lots of issues with the stated politics of many well-known Rust developers.
hulitu•14h ago
As far as I read on HN, the only memory-safe language discussed on HN is Rust, and mostly with childish pro arguments.
kaoD•9h ago
EDIT: from a brief search: it doesn't.
modulared•2h ago
For a language to be memory safe it means there must be no way to mishandle a function or use some object wrong that would result in an "unsafe" operation (for Rust, that means undefined behavior).
That is to say, the default is safe and you are given an escape hatch, while in something like C/C++ the default is unsafe.
I'd also like to add that program correctness is a separate concept from language safety and code safety, since you could be using an unsafe language, writing unsafe UB-laden code, and still have a correct binary.
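A minimal sketch of that default-plus-escape-hatch split, using only the standard library (nothing project-specific):

    fn main() {
        let xs = [10, 20, 30];

        // Safe default: get() returns an Option instead of risking an
        // out-of-bounds read.
        assert_eq!(xs.get(5), None);

        // Escape hatch: inside `unsafe`, get_unchecked skips the bounds
        // check; the programmer takes on the proof that 1 is in range.
        let second = unsafe { *xs.get_unchecked(1) };
        assert_eq!(second, 20);
    }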