Boy, do I have bad news for you!
However, 3-months-ago me was clearly an idiot, judging from the utter crap he wrote. Me in 3 months is also an idiot; he won't get the greatness of these genius hacks I'm making.
The words of every C programmer who created a CVE.
Also, https://github.com/ghostty-org/ghostty/issues?q=segfault
All jokes aside, it doesn’t actually take much discipline to write a small utility that stays memory safe. If you keep allocations simple, check your returns, and clean up properly, you can avoid most pitfalls. The real challenge shows up when the code grows, when inputs are hostile, or when the software has to run for years under every possible edge case. That’s where “just be careful” stops working, and why tools, fuzzing, and safer languages exist.
- Every C programmer I've talked to
No it's not. If it were that easy, C wouldn't have this many memory-related issues...
Avoiding all memory management mistakes is not easy, and the chance of disaster grows exponentially as the codebase gets bigger.
C and Zig aren't the same. I would wager that syntax differences between languages can help you see things in one language that are much harder to see in another. I'm not saying that Zig or C is good or bad for this, or that one is better than the other at making memory problems visible to the eye; I'm just saying I would bet there's some syntax that could make memory usage much clearer to the developer, instead of requiring that the developer keep track of these things in their head.
Even a scheme where you manually annotate each function, so that some metaprogram running at compile time can check that nothing is out of place, could help detect memory leaks, I would think. Or something; that's just an idea. There's a whole world of metaprogramming possibilities here that Zig allows and C simply doesn't. I think there's a lot of room for tooling like this to detect problems without forcing you to contort yourself into strange shapes just to make the compiler happy.
Probably both. They're words of hubris.
C and Zig give the appearance of practicality because they let you take shortcuts under the assumption that you know what you're doing, whereas Rust does not; it forces you to confront the edge cases of ownership, provenance, lifetimes, and even some aspects of concurrency right away, and won't compile until you've handled them all.
And it's VERY frustrating when you're first starting because it can feel so needlessly bureaucratic.
But then after a while it clicks: ownership is HARD. Lifetimes are HARD. And suddenly, when going back to C and friends, you find yourself thinking about these things at the design phase rather than at the debugging phase, and you write better, safer code because of it.
And then when you go back to Rust again, you breathe a sigh of relief because you know that these insidious things are impossible to screw up.
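To make that concrete, here's a toy sketch (names made up) of the kind of thing C lets you discover at debug time but Rust forces you to settle up front:

```rust
fn main() {
    let note = String::from("hello");
    let owner = note; // ownership of the String moves to `owner`

    // println!("{note}"); // error[E0382]: borrow of moved value: `note`
    // The equivalent double-use in C compiles fine and you find out at
    // runtime (or never); here the compiler makes you pick an owner first.

    println!("{owner}"); // the single owner is free to use it
}
```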
I feel like I'm most interested in Nim, given how easy it was to pick up and how interoperable it is with C. It has a garbage collector, and you can swap it out, which seems great for someone like me who doesn't want to worry about manual memory management right now; if it becomes a bottleneck later, I can at least fix it without worrying too much.
Out of all of them, from what little I know and my very superficial knowledge, Odin seems the most appealing to me. Its primary use case, from what I know, is game development, and I feel like that could easily pivot into native desktop application development. I was tempted to make a couple of those in Odin in the past but never found the time.
Nim I like the concept and the idea of, but the Python-like syntax just irks me. Haha, I can't seem to get into languages where indentation replaces brackets.
But the GC part of it is pretty neat. Have you checked out Go yet?
I think people prefer what's familiar to them, and Swift definitely looks closer to existing C++ to me. I believe it now has multiple people from the C++ WG working on it as well, supposedly after getting fed up with the lack of language progress on C++.
The most recent versions gained a lot in the way of cross-platform availability, but the lack of a native UI framework and its association with Apple seem to put off a lot of people from even trying it.
I wish it was a lot more popular outside of the Apple ecosystem.
https://docs.swift.org/swift-book/documentation/the-swift-pr...
Edits mine.
I like to keep the spacetime topologies complete.
Constant = time atom of value.
Register = time sequence of values.
Stack = time hierarchy of values.
Heap = time graph of values.
Seasoned Rust coders don’t spend time fighting the borrow checker - their code is already written in a way that just works. Once you’ve been using Rust for a while, you don’t have to “restructure” your code to please the borrow checker, because you’ve already thought about “oh, these two variables need to be mutated concurrently, so I’ll store them separately”.
The “object soup” is a particular approach that won’t work well in Rust, but it’s not a fundamentally easier approach than the alternatives, outside of familiarity.
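As a small illustration of that "store them separately" habit, a sketch with invented types: keeping the two independently mutated pieces as distinct fields means the borrows stay disjoint and the checker has nothing to complain about.

```rust
struct Editor {
    text: String,          // mutated on every edit
    undo_log: Vec<String>, // mutated at the same time
}

impl Editor {
    fn edit(&mut self, new_text: &str) {
        // Borrowing two fields of `self` at once is fine because the
        // compiler can prove the borrows are disjoint. Tangle `text` and
        // `undo_log` into one shared structure and this becomes a fight;
        // stored separately, it just works.
        self.undo_log.push(self.text.clone());
        self.text = new_text.to_string();
    }
}

fn main() {
    let mut ed = Editor { text: "v1".into(), undo_log: Vec::new() };
    ed.edit("v2");
    assert_eq!(ed.undo_log, ["v1"]);
}
```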
My experience is that what makes your statement true is that _seasoned_ Rust developers just sprinkle `Arc` all over the place, thus effectively switching to automatic garbage collection. Because 1) statically checked memory management is too restrictive for most kinds of non-trivial data structures, and 2) the lifetime hoops you have to jump through to please the static checker whenever you do anything non-trivial are just beyond human comprehension.
That doesn’t mean there aren’t other legitimate use cases, but “all the time” is not representative of the code I read or write, personally.
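For reference, the pattern the parent describes looks roughly like this (a toy sketch, not from any real codebase):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared mutable state behind reference counting plus a lock:
    // the ownership questions disappear, at the cost of runtime
    // bookkeeping; effectively opt-in garbage collection.
    let notes = Arc::new(Mutex::new(Vec::<String>::new()));

    let handles: Vec<_> = (0..4)
        .map(|i| {
            let notes = Arc::clone(&notes); // every consumer gets a handle
            thread::spawn(move || notes.lock().unwrap().push(format!("note {i}")))
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
    assert_eq!(notes.lock().unwrap().len(), 4);
}
```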
No, this couldn't be further from the truth.
If you use Rust for web server backend code then yes, you see `Arc`s everywhere. Otherwise their use is pretty rare, even in large projects. Rust is somewhat unique in that regard, because most Rust code being written is not web backend code.
No true scotsman would ever be confused by the borrow checker.
I've seen plenty of Rust projects, open source and otherwise, that utilise `Arc` heavily or use `clone` and/or `copy` all over the place.
My beef is sometimes with the way traits are implemented, or with how AWS implemented Errors for their library, which is just pure madness.
I have some issues with Zig's design, especially around the lack of explicit interface/trait, but I agree with the post that it is a more practical language, just because of how much simpler its adoption is.
Yes, they know when to give up.
I like the fact that "fighting the borrow checker" is an idea from the period when the borrow checker only understood purely lexical lifetimes, so you had to fight to explain why the thing you wrote, which was obviously correct, was in fact correct.
That was already ancient history by the time I learned Rust in 2021. But this idea that Rust means "fighting the borrow checker" took off anyway, even though the actual thing it was about had been solved.
Now, for many people it really is a significant adjustment to learn Rust if your background is exclusively, say, Python, or C, or JavaScript. For me it came very naturally, and most people will not have that experience. But even if you're a C programmer who has never had most of this [gestures expansively] before, you likely are not often "fighting the borrow checker". That diagnostic saying you can't make a pointer via a spurious mutable reference? Not the borrow checker. The warning about failing to use the result of a function? Not the borrow checker.
Now, "In Rust I had to read all the diagnostics to make my software compile" does sound less heroic than "battling with the borrow checker" but if that's really the situation maybe we need to come up with a braver way to express this.
Zig and Rust have a somewhat thin middle area in the Venn diagram.
As for the ads: even though it's my site, I'd urge you to turn on an adblocker, Pi-hole, or anything like that; I won't mind.
I have ads on there, yes, but since I primarily write tech articles for a target audience of tech people, you can imagine most readers already run some sort of adblocker, whether in the browser, on the network, or otherwise.
So my grand total monthly income from ads basically covers hosting costs and so on.
> This means that basically the borrow checker can only catch issues at comptime, but it will not fix the underlying issue, which is developers misunderstanding memory lifetimes or overcomplicating ownership. The compiler can only enforce the rules you're trying to follow; it can't teach you good patterns, and it won't save you from bad design choices.
In the short time that I wrote Rust, it never occurred to me that my lifetime annotations were incorrect. They felt like a bit of a chore, but I thought they said what I meant. I'm sure there's a lot of getting used to it, like static types, and it becomes second nature at some point. Regardless, code that doesn't use unsafe can't have two threads concurrently writing the same memory.
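For what it's worth, a typical annotation of that sort looks like this (a toy example, not from the parent's code):

```rust
// The annotation states the contract: the returned slice borrows from
// `haystack`, not from `needle`. It is a chore to write, but it says
// exactly what is meant, and the compiler holds every caller to it.
fn find_line<'a>(haystack: &'a str, needle: &str) -> Option<&'a str> {
    haystack.lines().find(|line| line.contains(needle))
}

fn main() {
    let text = String::from("alpha\nbeta\ngamma");
    assert_eq!(find_line(&text, "bet"), Some("beta"));
}
```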
The full title is "Why Zig Feels More Practical Than Rust for Real-World CLI Tools". I don't see why CLI tools are special in any respect. The article does make some good points, but it doesn't invalidate the strength of Rust in preventing CVEs, IMO. Rust or Zig may feel a certain way to certain people; time and data will tell.
Personally, there isn't much I do that needs the full speed of C/C++, Zig, or Rust, so there are plenty of GC languages to choose from. And when I do contribute to other projects, I don't get to choose the language, and would be happy to use Rust, Zig, or C/C++.
Because they don't grow large or need a multi-person team. CLI tools tend to be one & done. In other words, it's saying "Zig, like C, doesn't scale well. Use something else for larger, longer lived codebases."
This really comes across in the article's push that Zig treats you like an adult while Rust is a babysitter. This is not unlike the sentiment for Java back in the day. But the reality is that most codebases don't need to be clever and they do need a babysitter.
> Developers are not Idiots
I'm often distracted and AIs are idiots, so a stricter language can keep both me and AIs from doing extra dumb stuff.
> Rust’s borrow checker is a pretty powerful tool that helps ensure memory safety during compile time. It enforces a set of rules that govern how references to data can be used, preventing common memory safety errors such as null pointer dereferencing, dangling pointers and so on. However, you may have noticed the words "compile time" in the previous sentence. Now, if you have any experience with systems programming you will know that compile time and runtime are two very different things. Basically, compile time is when your code is being translated into machine code that the computer can understand, while runtime is when the program is actually running and executing its instructions. The borrow checker operates during compile time, which means that it can only catch memory safety issues that can be determined statically, before the program is actually run.
>
> This means that basically the borrow checker can only catch issues at comptime, but it will not fix the underlying issue, which is developers misunderstanding memory lifetimes or overcomplicating ownership. The compiler can only enforce the rules you're trying to follow; it can't teach you good patterns, and it won't save you from bad design choices.
This appears to be claiming that Rust's borrow checker is only useful for preventing a subset of memory safety errors, those which can be statically analysed. Implying the existence of a non-trivial quantity of memory safety errors that slip through the net.
> The borrow checker blocks you the moment you try to add a new note while also holding references to the existing ones. Mutability and borrowing collide, lifetimes show up, and suddenly you’re restructuring your code around the compiler instead of the actual problem.
Whereas this is only A Thing because Rust enforces rules so that memory safety errors can be statically analysed and therefore the first problem isn't really a problem. (Of course you can still have memory safety problems if you try hard enough, especially if you start using `unsafe`, but it does go out of its way to "save you from bad design choices" within that context.)
If you don't want that feature, then it's not a benefit. But if you do, it is. The downside is that there will be a proportion of all possible solutions that are almost certainly safe, but will be rejected by the compiler because it can't be 100% sure that it is safe.
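Concretely, the scenario the article quotes boils down to something like this (a reconstruction, not the article's code):

```rust
fn main() {
    let mut notes = vec![String::from("first")];

    let first = &notes[0]; // immutable borrow starts here
    // notes.push(String::from("second"));
    // ^ error[E0502]: cannot borrow `notes` as mutable because it is
    //   also borrowed as immutable. The push could reallocate the Vec
    //   and leave `first` dangling, which is the bug being prevented.
    println!("{first}");

    notes.push(String::from("second")); // fine once the borrow has ended
}
```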
The thing I wish we would remember, as developers, is that not all programs need to be so "safe". They really, truly don't. We all grew up loving lots of unsafe software. Star Fox 64, MS Paint, FruityLoops... the sad truth is that developers are so job-pilled and pager-traumatized that they don't even remember why they got in the game.
I remember reading somewhere that Andrew Kelley wrote Zig because he didn't have a good language to write a DAW in, and I think it's so well suited to stuff like that! Make cool creative software you like in Zig, and people that get hella mad about memory bugs can stay mad.
Meanwhile, everyone knows that memory bugs made super mario world better, not worse.
I am fine with ignoring the problems that rust solves, but not because I'm smart and disciplined. It just fits my use-case of making fast _non-critical_ software. I don't think we should rewrite security and networking stacks in it.
I don't think you need the ritual and complexity that Rust brings for small and simple scripts and CLI utilities...
`self.last.as_ref().unwrap().borrow().next.as_ref().unwrap().clone()`
I know it can be improved but that's what I think of
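For context, chains like that fall out of the usual `Rc<RefCell<...>>` doubly linked list shape, roughly like this (a sketch, names invented):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

struct Node {
    value: i32,
    next: Option<Rc<RefCell<Node>>>,
    prev: Option<Weak<RefCell<Node>>>, // Weak to avoid a reference cycle
}

struct List {
    first: Option<Rc<RefCell<Node>>>,
    last: Option<Rc<RefCell<Node>>>,
}

impl List {
    // Every hop needs an unwrap (is the Option filled?), a borrow
    // (RefCell's runtime check), and a clone (bump the refcount),
    // which is how lines like the one above come about.
    fn second(&self) -> Option<Rc<RefCell<Node>>> {
        self.first.as_ref().unwrap().borrow().next.clone()
    }
}

fn main() {
    let a = Rc::new(RefCell::new(Node { value: 1, next: None, prev: None }));
    let b = Rc::new(RefCell::new(Node { value: 2, next: None, prev: None }));
    a.borrow_mut().next = Some(Rc::clone(&b));
    b.borrow_mut().prev = Some(Rc::downgrade(&a));
    let list = List { first: Some(a), last: Some(b) };
    assert_eq!(list.second().unwrap().borrow().value, 2);
}
```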
Yes, safety isn't correctness but if you can't even get safety then how are you supposed to get correctness?
For small apps Zig probably is more practical than Rust. Just like hiring an architect and structural engineers for a fence in your back yard is less practical than winging it.
I once joined a company with a large C/C++ codebase. There I worked with some genuinely expert developers: people who were undeniably smart and deeply experienced. I'm not exaggerating; I mean it.
But when I enabled the compiler warnings they had disabled (which annoyed them) and ran a static analyzer over the codebase for the first time, hundreds of classic C bugs popped up: memory leaks, potential heap corruptions, out-of-bounds array accesses, you name it.
And yet, these same people pushed back when I introduced things like libfmt to replace printf, or suggested unique_ptr and vector instead of new and malloc.
I kept hearing:
"People just need to be disciplined allocations. std::unique_ptr has bad performance" "My implementation is more optimized than some std algorithm." "This printf is more readable than that libfmt stuff." etc.
The fact is, developers, and probably especially the smart ones, need to be prevented from making avoidable mistakes. You're building software that processes medical data. Or steers a car. Your promise to "pay attention" and "be careful" cannot be the safeguard against catastrophe.