Honestly I think that's probably the correct way to write high reliability code.
Do you have any evidence for "probably"?
See https://www.safetyresearch.net/toyota-unintended-acceleratio...
"I know for a fact that Italian cooks generate spaghetti, and the deceased's last meal contained spaghetti, therefore an Italian chef must have poisoned him"
With a Simulink model it is impossible to accidentally type `i > 0` when you meant `i >= 0`, for example. Any human who tells you they have never made that mistake is a liar.
Unless there was a second uncommanded acceleration problem with Toyotas, my understanding is that it was caused by poor mechanical design of the accelerator pedal that caused it to get stuck on floor mats.
In any case, when we're talking about safety critical control systems like avionics, it's better to abstract away the actual act of typing code into an editor, because it eliminates a potential source of errors. You verify the model at a higher level, and the code is produced in a deterministic manner.
The Simulink Coder tool is a piece of software. It is designed and implemented by humans. It will have bugs.
Autogenerated code is different from human written code. It hits soft spots in the C/C++ compilers.
For example, autogenerated code can have really huge switch statements. You know, larger than the 15-bit branch offset the compiler implementer thought was big enough to handle any switch statement any sane human would ever write? So now the switch jumps backwards instead when trying to get to the correct case statement.
I'm not saying that Simulink Coder + a C/C++ compiler is bad. It might be better than the "manual coding" options available. But it's not 100% bug free either.
At least before we had zero-cost exceptions. These days, I suspect the HFT crowd is back to counting microseconds or milliseconds as trades are being done smarter, not faster.
Actual code I have seen with my own eyes. (Not in F-35 code)
It's a way to avoid removing an unused parameter from a method. Unused parameters are disallowed, but this is fine?
I am sceptical that these coding standards make for good code!
(void) a;
Every C programmer beyond weaning knows that. (void) a;
I'm sure there are commonly-implemented compiler extensions, but this is the normal/native way and should always work. https://godbolt.org/z/zYdc9ej88
clang gets this right.
_ = a;
And you would encounter it quite often because unused variable is a compilation error: https://github.com/ziglang/zig/issues/335
It's extremely annoying until it's suddenly very useful and has prevented you doing something unintended.
Isn't it just bad design that makes both experimenting harder and for unused variables to stay in the code in the final version?
Notably, this document is from 2005. That's after C++ was standardized but before the committee's second bite of that particular cherry, and twenty years before its author, Bjarne Stroustrup, suddenly decided that dialects (now named "profiles") are the magic ingredient to fix the festering problems with the language, after years of insisting that C++ dialects are a terrible idea and would never be endorsed by the language committee.
While Laurie's video is fun, I too am sceptical about the value of style guides, which is what these are. "TABS shall be avoided" or "Letters in function names shall be lowercase" isn't because somebody's aeroplane fell out of the sky - it's due to using a style Bjarne doesn't like.
It would enable preventing a lot of footguns, at the very least, like vector<bool>.
Wait, you're one of those Rust evangelists, right? Have you been paid like fasterthanlime?
The main issue is mission assurance. Using the stack or the heap means your variables aren't always at the same memory address. This can be bad if a particular memory cell has failed. If every variable has a fixed address, and one of those addresses goes bad, a patch can be loaded to move that address and the mission can continue.
A good example of what I'm talking about is a program that I was peripherally involved with about 15 years ago. The lead wanted to abstract the mundane details from the users (on the ground), so they would just "register intent" with the spacecraft, and it would figure out how to do what was wanted. The lead also wanted to eliminate features such as "memory dump", which is critical to the anomaly resolution process. If I had been on that team, I would have raised hell, but I wasn't, and at the time, I needed that team lead as an ally.
I mean, even when I have the codebase readily accessible and testable right in front of me, I never trust the tests to be enough. I often spot forgotten edge cases and bugs of various sorts in C/embedded projects BECAUSE I run the program, can debug, and can spot memory issues and a whole lot of other things, for which you NEED to gather the most information you can in order to find solutions.
What leads to better code in terms of understandability and preventing errors:
exceptions (what almost every language does) or error codes (like Golang)?
Are there folks here who choose to use error codes and forgo exceptions completely?
In C++, which supports both, exceptions are commonly disabled at compile-time for systems code. This is pretty idiomatic, I've never worked on a C++ code base that used exceptions. On the other hand, high-level non-systems C++ code may use exceptions.
I actually think Ada would be an easier sell today than it was back then. It seems to me that the software field overall has become more open to a wider variety of languages and concepts, and knowing Ada wouldn't be perceived as widely as career pigeonholing today. Plus, Ada is having a bit of a resurgence with stuff like NVIDIA picking SPARK.
I haven’t heard anything particularly bad about the software effort, other than the difficulties they had making the VR/AR helmet work (the component never made it to production afaik).
https://www.nwfdailynews.com/story/news/local/2021/08/02/f-3...
The electrical system performs poorly under short circuit conditions.
https://breakingdefense.com/2024/10/marine-corps-reveals-wha...
They haven't even finished delivering and now have to overhaul the entire fleet due to overheating.
https://nationalsecurityjournal.org/the-f-35-fighters-2-big-...
This program was a complete and total boondoggle. It was entirely the wrong thing to build in peace time. It was a moonshot for no reason other than to mollify bored generals and greedy congresspeople.
From a european perspective, I can tell you that the mood has shifted 180 degrees from "buy American fighters to solidify our ties with the US" to "can't rely on the US for anything which we'll need when the war comes".
Definitely not a failure.
The evidence for this claim was found in testing for the F-35, where it was dogfighting an older F-16. The results of the test were that the F-35 won almost every scenario except one, where a lightly loaded F-16 was teleported directly behind an F-35 weighed down by heavy missiles and won the fight. This one loss has spawned hundreds of articles about how the F-35 is junk that can't dogfight.
In the end, the F-35 has a lot of fancy features that are not optional for modern operations. The jet has now found enough buyers across the West for economies of scale to kick in, and the cost is about $80 million each, which is cheaper than retrofitting stealth and sensors onto other airframes like you get with the F-15EX.
Ok, joking aside: If it is considered a failure, what 100B+ military programme has not been considered a failure?
In my totally unqualified opinion, the best cost performance fighter jet in the world is the Saab JAS 39 Gripen. It is very cheap to buy and operate, and has pretty good capabilities. It's a good option for militaries that don't have the infinite money glitch.
That is of course not to say that exceptions and error codes are the same.
That explains all the delays on the F-35...
I actually do this as well, but in addition I log out a message like, "value was neither found nor not found. This should never happen."
This is incredibly useful for debugging. When code is running at scale, nonzero probability events happen all the time, and being able to immediately understand what happened - even if I don't understand why - has been very valuable to me.
mwkaufma•5h ago
- no exceptions
- no recursion
- no malloc()/free() in the inner-loop
jandrewrogers•5h ago
DashAnimal•4h ago
jandrewrogers•4h ago
WD-42•4h ago
nicoburns•4h ago
Gupie•4h ago
tialaramex•4h ago
My guess is that you're assuming all user defined types, and maybe even all non-trivial built-in types too, are boxed, meaning they're allocated on the heap when we create them.
That's not the case in C++ (the language in question here) and it's rarely the case in other modern languages because it has terrible performance qualities.
jjmarr•4h ago
nmhancoc•4h ago
And if you’re using pooling I think RAII gets significantly trickier to do.
theICEBeardk•4h ago
DashAnimal•4h ago
jandrewrogers•3h ago
C++ is designed to make this pretty easy.
astrobe_•4h ago
Cyan488•4h ago
wiseowise•4h ago
bluGill•4h ago
canyp•4h ago
gmueckl•4h ago
elteto•4h ago
You can compile with exceptions enabled, use the STL, but strictly enforce no allocations after initialization. It depends on how strict is the spec you are trying to hit.
vodou•4h ago
theICEBeardk•4h ago
Espressosaurus•3h ago
Provocative talk though, it upends one of the pillars of deeply embedded programming, at least from a size perspective.
vodou•3h ago
- C++ Exceptions Reduce Firmware Code Size, ACCU [1]
- C++ Exceptions for Smaller Firmware, CppCon [2]
[1]: https://www.youtube.com/watch?v=BGmzMuSDt-Y
[2]: https://www.youtube.com/watch?v=bY2FlayomlE
elteto•3h ago
So, what exact parts of the STL do you use in your code base? Must be mostly compile-time stuff (types, type traits, etc.).
alchemio•2h ago
theICEBeardk•4h ago
elteto•3h ago
No algorithms or containers, which to me is probably 90% of what is most heavily used of the STL.
Taniwha•4h ago
AnimalMuppet•3h ago
thefourthchime•4h ago
It is "C++", but we also follow the same standards. Static memory allocation, no exceptions, no recursion. We don't use templates. We barely use inheritance. It's more like C with classes.
EliRivers•3h ago
The C++ was atrocious. Home-made reference counting that was thread-dangerous, but depending on what kind of object the multi-multi-multi diamond inheritance would use, sometimes it would increment, sometimes it wouldn't. Entire objects made out of weird inheritance chains. Even the naming system was crazy; "pencilFactory" wasn't a factory for making pencils, it was anything that was made by the factory for pencils. Inheritance rather than composition was very clearly the model; if some other object had a function you needed, you would inherit from that also. Which led to some objects inheriting from the same class a half-dozen times in all.
The multi-inheritance system given weird control by objects on creation defining what kind of objects (from the set of all kinds that they actually were) they could be cast to via a special function, but any time someone wanted one that wasn't on that list they'd just cast to it using C++ anyway. You had to cast, because the functions were all deliberately private - to force you to cast. But not how C++ would expect you to cast, oh no!
Crazy, home made containers that were like Win32 opaque objects; you'd just get a void pointer to the object you wanted, and to get the next one pass that void pointer back in. Obviously trying to copy MS COM with IUnknown and other such home made QueryInterface nonsense, in effect creating their own inheritance system on top of C++.
What I really learned is that it's possible to create systems that maintain years of uptime and keep their frame accuracy even with the most atrocious, utterly insane architecture decisions that make it so clear the original architect was thinking in C the whole time and using C++ to build his own terrible implementation of C++, and THAT'S what he wrote it all in.
Gosh, this was a fun walk down memory lane.
webdevver•2h ago
It was painful for me to accept that the most elite programmers I have ever encountered were the ones working in high-frequency trading, finance, and the mass production of 'slop' (adtech, etc.).
I still ache to work in embedded fields, in an 8 kB constrained environment, writing perfectly correct code without a wasted cycle, but I know from (others') experience that embedded software tends to have the worst software developers and software development practices of them all.
uecker•38m ago
tialaramex•4h ago
The idea of `become` is to signal "I believe this can be tail recursive" and then the compiler is either going to agree and deliver the optimized machine code, or disagree and your program won't compile, so in neither case have you introduced a stack overflow.
Rust's Drop mechanism throws a small spanner into this: in principle, if every function foo makes a Goose and then in most cases calls foo again, we shouldn't Drop each Goose until the functions return, which is too late; the drop, not the call, would now be in tail position. So the `become` feature, as I understand it, will spot this and Drop that Goose early (or refuse to compile) to support the optimization.
tgv•4h ago
But ... that rewrite can increase the cyclomatic complexity of the code on which they have some hard limits, so perhaps that's why it isn't allowed? And the stack overflow, of course.
AnimalMuppet•3h ago
zozbot234•2h ago
tialaramex•1h ago
Because Rust is allowed (at this sort of distance in time) to reserve new keywords via editions, it's not a problem to invent more, so I generally do prefer new keywords over re-using existing words but I'm sure I'd be interested in reading the pros and cons.
zozbot234•52m ago
krashidov•3h ago
I feel like that's the way to go since you don't obscure control flow. I have also been considering adding assertions like TigerBeetle does.
https://github.com/tigerbeetle/tigerbeetle/blob/main/docs/TI...
mwkaufma•3h ago
tonfa•1h ago
fweimer•58m ago
Some large commercial software systems use C++ exceptions, though.
Until recently, pretty much all implementations seemed to have a global mutex on the throw path. With higher and higher core counts, the affordable throw rate in a process was getting surprisingly low. But the lock is gone in GCC/libstdc++ with glibc. Hopefully the other implementations follow, so that we don't end up with yet another error handling scheme for C++.
msla•3h ago
> no recursion
Does this actually mean no recursion, or does it just mean to limit stack use? Processing a tree, for example, is recursive even if you use an array instead of the stack to keep track of your progress. The real trick is limiting memory consumption, which requires limiting input size.
mwkaufma•3h ago
mwkaufma•2h ago
drnick1•1h ago
pton_xd•3h ago
petermcneeley•1h ago
mwkaufma•1h ago
petermcneeley•34m ago
mwkaufma•29m ago
petermcneeley•3m ago