
Why I Always End Up Going Back to C

https://deplet.ing/why-i-always-end-up-using-c/
72•indy•1w ago

Comments

theamk•1w ago
C sounds nice if your task is simple enough, or at least if you can decompose this to a series of loosely-connected simple-enough tasks.

But sometimes there is inherent complexity in what you are trying to implement, and then C becomes way, way more complex than C++. You build stuff, and there are so many manual steps, and none of them can be missed, or things will subtly break.

A good example is "gstreamer", the multimedia streaming framework. It is implemented in pure C. It needs to use basic data structures, so it uses GLib. It also needs to support a runtime-defined connection graph, so it is built on top of GObject.

Yes, build times are amazingly fast. But you pay for it - look at something simple, like the display_name property[0]. There is the display_name member, and the PROP_DISPLAY_NAME enum, and a switch case in the setter (don't forget to free the previous value!), and a switch case in the getter, and it's manually installed in class_init, and you need to manually free it in dispose (except they forgot this).

So many places just for a single property, something that would have been 1 or 2 lines in a well-structured C++ app. And unlike C++, you cannot make simple rules like "never use char* except for 3rd party libs" - it's all those multi-point checklists which are not even written down.
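
For a concrete picture, here is a minimal sketch of that ceremony, assuming plain GLib/GObject (the type and helper names are illustrative, not the actual GStreamer source):

  #include <glib-object.h>

  typedef struct { GObject parent; char *display_name; } MySrc;   /* 1. the member */
  typedef struct { GObjectClass parent_class; } MySrcClass;

  enum { PROP_0, PROP_DISPLAY_NAME };                             /* 2. the enum */

  G_DEFINE_TYPE(MySrc, my_src, G_TYPE_OBJECT)

  static void my_src_init(MySrc *self) { self->display_name = NULL; }

  static void my_src_set_property(GObject *obj, guint id,
                                  const GValue *v, GParamSpec *spec) {
      MySrc *self = (MySrc *)obj;
      if (id == PROP_DISPLAY_NAME) {                              /* 3. setter case */
          g_free(self->display_name);                             /* free old value! */
          self->display_name = g_value_dup_string(v);
      }
  }

  static void my_src_get_property(GObject *obj, guint id,
                                  GValue *v, GParamSpec *spec) {
      MySrc *self = (MySrc *)obj;
      if (id == PROP_DISPLAY_NAME)                                /* 4. getter case */
          g_value_set_string(v, self->display_name);
  }

  static void my_src_class_init(MySrcClass *klass) {
      GObjectClass *g = G_OBJECT_CLASS(klass);
      g->set_property = my_src_set_property;
      g->get_property = my_src_get_property;
      g_object_class_install_property(g, PROP_DISPLAY_NAME,      /* 5. manual install */
          g_param_spec_string("display-name", "Display name",
                              "X display to use", NULL, G_PARAM_READWRITE));
  }
  /* 6. ...and display_name must still be g_free()d in dispose/finalize. */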

[0] https://github.com/GStreamer/gst-plugins-good/blob/master/sy...

acuozzo•1w ago
> You build stuff, and there is so many manual steps

"The real goal isn’t to write C once for a one-off project. It’s to write it for decades. To build up a personal ecosystem of practices, libraries, conventions, and tooling that compound over time."

rramadass•1w ago
Right.

The article is well written and the author has touched upon all the points which make C still attractive today.

theamk•1w ago
You mean, you are not worried about the high complexity of a codebase because you work with it every day for decades, so you know all this complexity by heart?

This basically requires one to be working solo, neither receiving nor sharing the source code with others, treating third-party libraries as black boxes.

I guess this can work for some people, but I don't think it would work for everyone.

acuozzo•1w ago
> so you know all this complexity by heart?

No. It's that you've built up a personal database of libraries, best-practices, idioms, et al. over decades.

When you move on to a new project, this personal database comes with you. You don't need to wonder if version X of Y framework or library has suddenly changed and then spend a ton of time studying its differences.

Of course, the response to this is: "You can do this in any language!"

And you'd be right, but after 20 years straight of working in C alongside teams working in Java, Perl, Python, Scheme, OCaml, and more, I've only ever seen experienced C programmers hold on to this kind of digital scrapbook.

theamk•1w ago
I don't see how this can work.

You have your personal string library... and you move to a new project, and it has its own string library (that's pretty much a given, because the C stdlib sucks). So what next? Do you rewrite the entire project into _your_ string library, and enjoy the familiar environment, until the next "experienced C programmer" comes along? Or do you give up on your own string library and start learning whatever the project uses?

And this applies to basically everything. The "personal database" becomes pretty useless the moment the second person with such a database joins the project.

(This is a big part of Python's popularity, IMHO. Maybe "str" and "logger" are not the best string and logger classes in the world, but they are good enough and in the stdlib, so you never have to learn new ones when you start a new project)

flohofwoe•1w ago
It's not the "string library" that's important, but standardized interface types - so that different libraries can pass strings to each other while still being able to select the best string library that matches the project's requirements.

In C this standardized string interface type happens to be the pointer to a zero-terminated bag of bytes - not exactly perfect in hindsight, but as long as everybody agrees to that standard, the actual code working on strings can be replaced with another implementation just fine.

E.g. a minimal stdlib should mostly be concerned with standardized interface types, not with the implementation behind those types.
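
A toy illustration of that point, with made-up function names: two "libraries" that know nothing about each other still compose, because both agree on the char*-to-NUL-terminated-bytes interface type.

  #include <stddef.h>

  /* "library A": a copy routine, knows nothing about library B */
  static char *a_copy(char *dst, const char *src) {
      char *d = dst;
      while ((*d++ = *src++) != '\0')
          ;
      return dst;
  }

  /* "library B": a length routine, knows nothing about library A */
  static size_t b_length(const char *s) {
      size_t n = 0;
      while (s[n] != '\0')
          n++;
      return n;
  }

  /* any caller can mix them freely over the shared interface type:
     char buf[16]; b_length(a_copy(buf, "hello"));                 */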

theamk•1w ago
Are you saying that if you join an existing project which uses acuozzo_strzcpy, and you need to do some string copying, instead of using the same functions that everyone already uses, you'll bring your own library and start using flohofwoe_strjcpy in all code _you_ write (assuming both of those work on char* types, that is)?

This.. does not seem like a very good idea if you want your contributions to be received well.

flohofwoe•6d ago
I mean, it depends? The fact that it's possible doesn't mean it's a good idea, but at least it's possible. Maybe flohofwoe_strjcpy has a slight performance advantage in an extremely esoteric edge case, in an extremely hot loop, that wasn't considered by acuozzo_strzcpy.

SleepyMyroslav•6d ago
(Not the GP) I think you can see how poorly the string abstraction argument looks in the context of a team-based project. Instead of dismissing it completely I would like to provide an example of a context where C is perfectly fine now.

Consider a data compression library like Oodle. Even with closed source and use of dangerous things like multiple threads, it is a perfectly reasonable deal if the game project's budget has money to be spent on performance.

The thing is, if a game project has money, it is not likely to be interested in game engines or core middleware libraries (like physics, sound or occlusion culling) written in C. Because after buying a license your team is still expected to work in that code, even if official support is very active.

Disclaimer, I work in gamedev and never needed to touch C code.

flohofwoe•1w ago
> neither receiving not sharing the source code with others, treating third-party libraries as blackboxes.

Tbh, this is an intriguing idea. Determine the size of a library (or module in a bigger system) by what one programmer can manage to build and maintain. Let the public interface be the defining feature of a library, not its implementation.

flohofwoe•1w ago
> It needs to use basic data structures, so it uses GLib. It also need to support runtime-defined connection graph, so it is built on top of GObject.

That's running into the age-old trap of trying to shoehorn an OOP system into C, just don't do that ;) E.g. don't design your systems around the OOP paradigm in the first place.

flomo•1w ago
Unfortunately your insightful comment is 30 years too late. You'll have to find a time-machine and go back to the 1990s and tell GNU/GTK/Gnome/etc that they are doing it wrong.

theamk•1w ago
Good luck making any sort of UI without OOP-like methods. The moment you have grouped state (say "button_enabled", "button_shown", and "button_checked") and corresponding methods, you get something OOP-like.

The only way to work around it is immediate-mode UI, but this requires a fairly powerful CPU, so it's only feasible on modern machines. Certainly not something people would have wanted about 30 years ago; they still cared about performance back then.

wasmperson•1w ago
Immediate mode UI doesn't require a powerful CPU and was invented in 2002, so about 24 years ago. I think the belief that it necessarily sacrifices performance is somewhat of a misconception.

Compare a hypothetical "Immediate Mode" counter:

  void render_and_handle_button_click(struct ctx *ctx){
      draw_text(ctx, "Count: %d", ctx->count);
      if(draw_button(ctx, "Increment")){
          ctx->count++;
      }
  }

To the equivalent "Retained" counter:

  void render(struct ctx *ctx){
     set_textbox_text(ctx->textbox, "Count: %d", ctx->count);
  }

  void handle_button_click(void *ctx_in){
     struct ctx *ctx = ctx_in;
     ctx->count++;
     render(ctx);
  }

  void init(struct ctx *ctx){
      ctx->textbox = create_textbox(ctx);
      ctx->button = create_button(ctx);
      set_button_text(ctx->button, "Increment");
      set_button_click_handler(ctx->button, ctx, handle_button_click);
      render(ctx);
  }

The only difference I see here is whether the stateful "cache" of UI components (ctx->textbox and ctx->button) is library-side or application-side.

theamk•1w ago
You are looking at it from the user side, while all the interesting parts are on the implementation side:

- If you have partial redraw request, can you quickly locate _only_ the controls which are covered and only redraw those?

- If you are clicking on something, can you quickly locate which component will receive the click?

- If you are moving mouse, or dragging a selection, can you quickly determine if any components should change the state? Can you avoid running the full event loop on every mouse move event?

- If your application state has updated, do you need to force redraw or not? (Many immediate mode UIs fail badly here, never going to idle even if nothing is happening)

This is all trivial in the old-style UI - efficient redraw / mouse mapping is table stakes in the older GUIs. While all immediate mode can do is keep running the redraw loop _every_ _single_ _time_ something as trivial as a mouse move happens, just in case it changes the highlighted item or something.

(Unless the "immediate mode UI" is just a thin veneer, and the library is using good old OOP-based GUI components under the hood... but in that case, it's still faster to cut out the middleman and control the components yourself)

And yes, back when I was taking a "game development" class in college, around that time, I used immediate mode UI for menus. This only makes sense - games run in the foreground _anyway_, and they basically consume 100% of the CPU anyway. But for regular apps? Please don't.

Example: I just opened https://www.egui.rs/#demo in a background tab... The browser's task manager shows this tab never goes idle, consuming between 0.1% and 3%. Considering I often have 40+ tabs open, this can add up to a significant share of the CPU.

Immediate mode GUIs, unless in an app which already uses 100% CPU, like a game or video player, will always be less efficient than classical event-driven ones.

wasmperson•1w ago
You appear to be making assumptions about immediate mode UI limitations based on some implementations you've worked with and not based on what's actually dictated by the interface itself. You touch on this somewhat by claiming that it's possible to be fast as long as the UI is merely a "thin veneer" over something more stateful, but that isn't a distinction I care about.

I'm not a good advocate for IMGUI; there are resources available online which explain it better than I can. I'm just trying to point out that the claim that immediate mode GUIs are some sort of CPU hog isn't really true. That's what I meant by "doesn't necessarily sacrifice performance," not that there is literally zero overhead (although I wouldn't be surprised if that were the case!).

> ...The browser's task manager shows this tab never goes idle...

As far as I can tell, the demo you linked appears to be bugged. You can see from poking around in a profiler (and from the frame timer under the "backend" popout) that the UI code does in fact go idle, but the requestAnimationFrame loop is never disabled for some reason. Regardless, even if this particular GUI library has problems going idle, that's not an inherent problem with the paradigm. I get the impression you understand this already, so I'm not sure why you've brought it up.

kvemkon•6d ago
> Many immediate mode UIs fail badly here, never going to idle even if nothing is happening

ImGui's author deliberately doesn't fix this, because it is one of the main issues keeping ImGui from being widely adopted on the desktop - fixing it could attract too many users at once while lacking the capacity to support them all.

https://github.com/ocornut/imgui/issues/7892#issuecomment-22...

pjmlp•1w ago
If only C solutions took advantage of abstract data types, as advocated by modular design approaches before OOP took off - but no, it is all reaching out to field data directly with macros, and clever pointer tricks that fall down.

There are several books on the matter that obviously very few read.

Here is one paper on the subject from 1985, "Modular programming in C: an approach and an example":

https://dl.acm.org/doi/10.1145/382284.382285

rramadass•1w ago
> If only C solutions took advantage of abstract data types, as advocated by modular design approaches

People have been writing C code with ADTs and "Modules" from the very beginning.

Two excellent examples which come to mind are Andrew Tanenbaum's Minix book Operating Systems Design and Implementation and David Hanson's C Interfaces and Implementations: Techniques for Creating Reusable Software.

And of course the Linux Kernel is full of great modular C techniques which one can study.

pjmlp•1w ago
Unfortunately I have seen plenty of counterexamples since 1991.

Starting with RatC from "Book on C", 1988 edition, over to Turbo C 2.0 in 1991, all the way to modern times.

That is just not what most C codebases look like.

rramadass•1w ago
Nope, you are just generalizing your opinion, which is not quite true. My (and my colleagues') experience studying/programming C/C++ since the beginning of the 90's has been pretty good.

When the PC explosion happened, a lot of programmers without any CS background started with C programming, and hence of course there is a lot of code (usually not long-lasting) which does not adhere to software engineering principles. But quite a lot more C code was written in a pretty good style, which was what one picked up at work if not already exposed to it during studies.

I still remember the books from late-80's/early-90's on the PC side, by authors like Al Stevens (utils/guis/apps using Turbo C) who wrote for Dr. Dobb's Journal. On the Unix side, of course you had Richard Stevens, P.J.Plauger, Thomas Plum etc. They all taught good C programming principles which are still relevant and practiced today.

pjmlp•6d ago
Each one is their own anecdote.

I also have all those books and magazines; a pity most coders of the code I have seen in my lifetime don't.

The regular developers, those that don't give a shit that online forums other than Stack Overflow exist, and go home to do non-computer-related stuff after work.

rramadass•6d ago
As I said, you cannot generalize from your experiences alone.

You have to look at the programming community as a whole and industry practices developed and adopted over time in the real world.

There is enough data here to show that C does not deserve the negativity that I often see here on HN.

theamk•1w ago
You know what gstreamer does, right? It's a dynamic multimedia framework - you give it a pipeline defined by a string, like:

  ximagesrc display_name=:1 ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=127.0.0.1 port=5000

and it automatically loads .so files, creates all those components and connects them to each other. Super handy for all sorts of fun audio/video processing.

So all that C ceremony is required because the user _should_ be able to say "ximagesrc display_name=:1", and possibly dynamically change this attribute to something else via a script command (because a lot of the time gstreamer is embedded in other apps).

So if you know how to achieve the same without trying to "shoehorn an OOP system into C", do let me know. But I bet whatever solution you come up with would be very close to what GStreamer ended up doing, if not even more complex.

(Unless what you are trying to say is: "If problem's most-efficient representation is OOP-like, don't use it with C because C is for simpler problems only. Use complex languages for complex tasks." If that's the case, I fully agree)

comex•1w ago
This kind of automatic property serialization/deserialization, however, has traditionally been a sore spot for C++ as well.

You can do it, but you will have to either repeat yourself at least a little, use some very ugly macros, or use a code generator.

And many of those ugly macro tricks work in C as well. So do code generators.

That said, as C++ has added features, this type of metaprogramming has gotten easier and easier, and more and more distinct from C. This culminates in C++26 reflection which will finally make it possible to just define a struct and then automatically generate serialization/deserialization for it, without hacks. Once reflection is implemented and widely adopted, then I will agree with you that this should be 1 or 2 lines in a well-structured C++ app.
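
For reference, a minimal sketch of the kind of macro trick being alluded to (the classic X-macro, with an illustrative field list), which indeed works in C as well as C++:

  #include <stdio.h>

  /* declare the field list once... */
  #define FIELDS        \
      X(int,    width)  \
      X(int,    height) \
      X(double, scale)

  /* ...expand it into a struct definition... */
  typedef struct {
  #define X(type, name) type name;
      FIELDS
  #undef X
  } Config;

  /* ...and again into a "serializer", without repeating the fields */
  static void config_print(const Config *c) {
  #define X(type, name) printf(#name " = %g\n", (double)c->name);
      FIELDS
  #undef X
  }

Adding a field to FIELDS updates both the struct and the serializer in one place - exactly the repetition the GObject boilerplate above can't avoid.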

flohofwoe•1w ago
> and it automatically loads .so files, creates all those components and connects them to each other. Super handy for all sorts of fun audio/video processing.

I created a quite similar OOP system for C around 1995 (as I guess did most programmers at that time who were fascinated by Objective-C): classes were implemented in DLLs and loaded on demand, classes were objects themselves, the tree of class objects could be inspected (e.g. runtime type information and reflection), and the whole system was serializable - this was for a PC game (https://en.wikipedia.org/wiki/Urban_Assault).

It looked like a neat thing at the time, but nothing a couple of structs with function pointers or a switch-case message dispatcher wouldn't be able to do just as well, definitely not something that should be the base for more than one product, and most definitely nothing that should have survived the OOP hype of the 90's ;)
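
For anyone wondering what "a couple of structs with function pointers" looks like in practice, a minimal sketch (names are illustrative):

  typedef struct Object Object;
  struct Object {
      void (*update)(Object *self, float dt);   /* per-"class" behavior slot */
      void (*destroy)(Object *self);
      Object *next;                             /* intrusive list of siblings */
  };

  /* dynamic dispatch, by hand: walk the list and call through the pointers */
  static void scene_update(Object *head, float dt) {
      for (Object *o = head; o != NULL; o = o->next)
          o->update(o, dt);
  }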

1718627440•1w ago
People point to GObject, say it's complicated, and compare it to C++ classes. But the equivalent of C++ classes in C would also be far simpler. GObject is so complicated because it is essentially runtime creation and modification of classes (not objects, classes). Doing that in C++ would also be some work and look ridiculously complicated.

kianN•1w ago
> Code gets simpler because it has to, and architecture becomes explicit.

> The real goal isn’t to write C once for a one-off project. It’s to write it for decades. To build up a personal ecosystem of practices, libraries, conventions, and tooling that compound over time. Each project gets easier not because I've memorized more tricks, but because I’ve invested in myself and my tools.

I deeply appreciate this in the C code bases I work in (scientific computing, small team)

rramadass•1w ago
Agreed.

I generally try to use C++ as a "better C" until the design complexity makes me model higher-level abstractions "the C++ way". All abstractions have a cognitive cost, and C keeps them simple and explicit.

1718627440•1w ago
Personally, I tried that, but it already breaks down for me once I try to separate allocation from initialization, so I am back to C really quickly. And then I want to take the address of temporaries or create types in function declarations, and C++ just declares that to be not allowed.

jezze•1w ago
First off, I want to congratulate you on reaching this milestone. I think this is the state where the most seasoned programmers end up. They know how to write code that works and they don't need a language to "help" or "guide" them.

Enjoy!

imtringued•1w ago
If software development taught me anything, it is that everything that can go wrong will go wrong; the impossible will happen. As a result I prefer having fewer things that can go wrong in the first place.

Since I acknowledge my own fallibility and the remote possibilities of bad things happening, I have come to prefer reliability above everything else. I don't want a bucket that leaks from a thousand holes. I want the leaks to be visible and in places I am aware of, where I can find and fix them easily. I am unable to write C code to that standard in an economical fashion, which is why I avoid C as much as possible.

jezze•1w ago
This is, perhaps surprisingly, what I consider the strength of C. It doesn't hide the issues behind some language abstraction; you are in full control of what the machine does. The bug is right there in front of you, if you are able to spot it (given it's not hiding away in some 3rd-party library, of course). That takes many years of practice, but once you have your own best practices nailed down, this doesn't happen as often as you might expect.

Also, code doesn't need to be bulletproof. When you design your program you also design a scope, saying this program will only work given these conditions. Programs that misbehave outside of your scope are actually totally fine.

pjmlp•1w ago
How is one in full control of SIMD and CPU/OS scheduling in NUMA architectures in C?

rramadass•1w ago
Linux has libnuma (https://man7.org/linux/man-pages/man3/numa.3.html) while Windows has its own NUMA api (https://learn.microsoft.com/en-us/windows/win32/procthread/n...)

For CPU/OS scheduling, use the pthreads/OpenMP APIs to set processor affinity for threads.

For SIMD, use compiler intrinsics.
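
A minimal sketch of both of those, assuming Linux with GNU extensions and an x86 target (notably not ISO C, as the reply below points out):

  #define _GNU_SOURCE
  #include <pthread.h>
  #include <sched.h>
  #include <immintrin.h>

  /* pin the calling thread to CPU 0 via processor affinity */
  static void pin_to_cpu0(void) {
      cpu_set_t set;
      CPU_ZERO(&set);
      CPU_SET(0, &set);
      pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
  }

  /* add four floats at once with an SSE compiler intrinsic */
  static void add4(const float *a, const float *b, float *out) {
      __m128 va = _mm_loadu_ps(a);              /* unaligned 128-bit loads */
      __m128 vb = _mm_loadu_ps(b);
      _mm_storeu_ps(out, _mm_add_ps(va, vb));   /* one instruction, 4 adds */
  }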

pjmlp•1w ago
None of that is written in pure C, as per the ISO C standard.

Rather, they rely on a mix of C compiler language extensions and helper functions written in inline or external assembly, which any compiled language also has available when going outside the standard.

theamk•1w ago
I think you are being nitpicky here.

When most people say "I write in C", they don't mean the abstract ISO C standard, with the possibility of CHAR_BIT=9. They mean "C for my machine" - so C with compiler extensions, assumptions about the memory model, and yes, occasional inline assembly.

pjmlp•6d ago
I am, because people keep making C into something special that it isn't.

Other languages share the same features.

rramadass•1w ago
That is not an argument. ANSI/ISO C standardizes the hardware-independent parts of the language, but at some point you have to meet the hardware. The concept of an "implementation platform" (i.e. cpu arch + OS + ABI) is well known for all language runtimes.

All apps using the above-mentioned are written in standard ANSI/ISO C. The implementations themselves are "system level" code and hence have language/HW/OS-specific extensions, which is standard practice when interfacing with low-level code.

> any compiled language also has available

In theory yes, but in practice never with the ease or flexibility with which you can use C for the job. This is what people mean when they say "C is close to the metal" or "C is a high-level assembly language".

pjmlp•6d ago
It is, because C is nothing special; those features are available in other languages.

Proven before C was even a dream at AT&T, by all the other OS vendors outside Bell Labs using other systems languages.

Then people get to argue that C can do X - yeah, provided it is the Compiler XYZ C dialect.

rramadass•6d ago
Not quite.

C took off because system programmers could not do with other languages what they wanted, with the ease and flexibility that C offered.

Having a feature in a language is not the same as how easy it is to span hardware, OS and application in the same language and runtime.

pjmlp•6d ago
Not really.

C took off because it was free, shipped alongside with an operating system that initially was available for a symbolic price, as AT&T was forbidden to take advantage of UNIX.

Had UNIX been a commercial operating system, with additional licenses for the C compiler, like every other operating systems outside Bell Labs, we would not be even talking about C in 2026.

rramadass•5d ago
Being easily affordable/available in those times was the initial "hook" but C's subsequent and sustained success was due to a happy confluence of various design decisions.

Not too high-level, not too low-level, easy access to memory/ISA, a simple abstract machine, being imperative/procedural, spanning bare-metal/OS/app, adoption by the free software movement producing free compilers/tools, becoming the de-facto industry-standard ABI etc. - all were crucial in its rise to power.

Note that its main competitor at that time, Pascal, lost out in spite of being simpler, having clean high-level features, being promoted by academia, being safety-focused etc.

As Dennis Ritchie himself said in "The Development of the C Language" (https://www.nokia.com/bell-labs/about/dennis-m-ritchie/chist...);

C is quirky, flawed, and an enormous success. While accidents of history surely helped, it evidently satisfied a need for a system implementation language efficient enough to displace assembly language, yet sufficiently abstract and fluent to describe algorithms and interactions in a wide variety of environments.

bigstrat2003•1w ago
Empirically speaking, programmers as a whole are quite bad at avoiding such bugs. Humans are fallible, which is why I personally think it's good to have tools to catch when we make mistakes. One man's "this takes control away from the programmer" is another man's "friend that looks at my work to make sure it makes sense".

pjmlp•1w ago
> The language shows you the machine, a machine which is not forgiving to mistakes.

Assembly does that, C not really, it is a myth that it does.

jezze•1w ago
True, it doesn't give you the bare machine. What it gives you is the thinnest of machine abstractions, with the possibility of linking to your own assembly if you have the need for it.

pjmlp•1w ago
Yet another myth; plenty of languages since JOVIAL in 1958 offer similar capabilities.

BoredomIsFun•1w ago
Ok, if you insist on an ultra-precise description - "C is the lowest-level language among those widely used".

pjmlp•6d ago
Not even that, because C compilers nowadays are written in C++.

BoredomIsFun•6d ago
It is unrelated to the point.

jezze•1w ago
I am curious, what was it I said that you consider to be a myth? If I have some misunderstanding I would like to know. I looked at JOVIAL on Wikipedia quickly, but I can't see exactly how it would be thinner than C, or whether its compiler would output something vastly different from a C compiler's. Or did you mean it's as thin as C but it came out earlier?
pjmlp•6d ago
Both; the properties that the UNIX crowd assigns to C aren't unique.

Most think that way because they never learnt anything other than C and C++.

jezze•6d ago
I see, you thought I meant that C was the only language with this property. No, there are plenty of others; I was fully aware of that. I, on the other hand, thought you meant that JOVIAL was in some way even thinner or more tuned to the underlying architecture than C.

hgs3•1w ago
I'm being pedantic, but on modern hardware, the ISA is an abstraction over microarchitecture and microcode. It's no longer a 1-to-1 representation of hardware execution. But, as programmers, it's as low as we can go, so the distinction is academic.

pjmlp•6d ago
Still one layer below C, and with plenty of features not available in C source code.

rramadass•6d ago
Compiler intrinsics do give you C/C++ API access to the relevant ISA subsets as platform-specific extensions.

ivanjermakov•1w ago
> In C, you can see what the machine is doing. Allocations don’t hide behind constructors, and destructors don’t quietly run during stack unwinding. You can profile at the machine-code level without feeling like you’re peeling an onion, appropriately shedding tears the whole time.

This is why explicit control flow is an important design goal for a systems programming language. It is basically 2/3 of the core design principles in Zig.

pjmlp•1w ago
Like setjmp()/longjmp() and signal(), very explicit. /s

rramadass•1w ago
The control flow is explicit; there is no language "magic" here. Non-local gotos in the former case and asynchronous callbacks from the OS in the latter case are pretty well known.

pjmlp•6d ago
Except for knowing where the jump lands, very explicit.

rramadass•6d ago
These are low-level APIs, and hence if you know the caveats to follow, using them correctly is not difficult; e.g. keep the context of the function calling setjmp active, don't use jmp_bufs allocated on a stack, etc.

Not knowing how to do something is the fault of the programmer and not the language/tool.
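
For readers who haven't used them, a minimal sketch of the setjmp/longjmp idiom under those caveats (the frame that called setjmp is still active when longjmp fires):

  #include <setjmp.h>
  #include <stdio.h>

  static jmp_buf on_error;            /* not on a short-lived stack frame */

  static void parse(int bad) {
      if (bad)
          longjmp(on_error, 1);       /* non-local goto back into main */
      puts("parsed ok");
  }

  int main(void) {
      if (setjmp(on_error) != 0) {    /* returns nonzero when longjmp lands */
          puts("recovered from parse error");
          return 1;
      }
      parse(1);                       /* main's frame is still active here */
      return 0;
  }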

nacozarina•1w ago
I did a lot of c++ in the mid-90s, often on teams with experienced C programmers new to C++.

They had little appetite for C++, it was 90% mgmt saying ‘use the shiny new thing we read about’. I was the FNG who ‘helped’ them get thru it by showing them the tools & lingo that would satisfy mgmt.

OOP is non-scientific and the snake-oil hype made it cancerous. C++ has ballooned into an absurd caricature. It obfuscates business logic with crypto-like strength; it doesn't clarify anything. I feel like a war criminal. Replacing C++ is one thing, but the OOP rot is a far deeper infection to rid the world of.

I later spent years doing my Typhoid Mary bit in the Java community before abandoning the heresy. Repent and sin no more, that’s all one can do.

rramadass•1w ago
> OOP is non-scientific and the snake-oil hype ... ridding the world of the OOP rot is a far deeper infection.

You are spewing nonsense.

Read Bertrand Meyer's Object-Oriented Software Construction, Barbara Liskov's Program Development in Java: Abstraction, Specification, and Object-Oriented Design and Brad Cox's Object-Oriented Programming: An Evolutionary Approach for edification on OOD/OOP.

andrekandre•6d ago

  > Brad Cox's Object-Oriented Programming: An Evolutionary Approach
i liked this one and got some good insights from it, though it was so old it was hard to get through...

the snake-oil aspect though, i think is true to a large extent:

oop became a huge hype and a marketing term, and things like c++ and java oop are so far away from the original ideas of the original 'oop' of smalltalk and we have been suffering from really bad/low quality abstractions (javas infamous FactoryFactory pattern, subclass everything etc) for a long time...

rramadass•5d ago
> c++ and java oop are so far away from the original ideas of the original 'oop' of smalltalk

This is the fundamental misunderstanding; there is no "original OOP" but different "strains of OOP" viz. the Simula67 vs. Smalltalk models.

C++ followed the Simula approach (i.e. a static object model) while Java is a hybrid mixing both Simula and Smalltalk approaches (i.e. a dynamic object model but with static typing). You have to look at the entirety of the OOD/OOP domain to understand how modern languages have evolved OOP support.

See also OO History: Simula and Smalltalk - https://www.cs.cmu.edu/~charlie/courses/15-214/2014-fall/sli...

rramadass•1w ago
Good article. The author has touched upon all the points that make C still attractive today.

A few more points;

C allows you to program everything from dinky little MCUs all the way up to honking big servers and everything in-between. It also allows you to span all levels of programming, from bare-metal and system-level (OS/system utilities etc.) to any type of application.

There has also been a lot of work done, and available, on formal verification of C programs, e.g. Frama-C, CBMC etc.

Finally, today all LLM agents are well trained on the massive publicly available C codebases, making their output far more reliable.

PS: See also Fluent C: Principles, Practices, and Patterns by Christopher Preschern for further study.

iainctduncan•1w ago
I wound up going back to C in a big way about five years ago when I embarked on Scheme for Max, an extension to Max/MSP that lets you use s7 Scheme in the Max environment. Max has a C SDK, and s7 is written in 100% ANSI C. (Max also has a C++ SDK, but it is far less comprehensive, with far fewer examples and docs.)

I was, coming from being mostly a high-level language coder, surprised at how much I like working in this combo.

Low level stuff -> raw C. High level stuff -> Scheme, but written such that I can drop into C or move functions into C very easily. (The s7 FFI is dead simple).

It's just really nice in ways that are hard to articulate. They are both so minimal that I know what's going on all the time. I now use the combo in other places too (e.g. WASM). It really forces one to think about architecture in what I think is a good way. YMMV of course!
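
For a flavor of how small that FFI surface is, a minimal sketch based on s7's documented C API (details may vary by version):

  #include <stdbool.h>
  #include "s7.h"

  /* a C function exposed to Scheme as (add1 x) */
  static s7_pointer add1(s7_scheme *sc, s7_pointer args) {
      return s7_make_integer(sc, s7_integer(s7_car(args)) + 1);
  }

  int main(void) {
      s7_scheme *sc = s7_init();
      s7_define_function(sc, "add1", add1, 1, 0, false, "(add1 x) adds one");
      s7_eval_c_string(sc, "(display (add1 41))");   /* prints 42 */
      return 0;
  }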

rramadass•6d ago
Nice.

Reminds me of optimizations done in the early days of Erlang and BEAM using C for performance reasons - https://www.erlang.org/blog/beam-compiler-history/

Fwirt•1w ago
Returning to C after a decade away, I think the bottom line is that the reason C has stuck around for so long is straight-up path dependence. C is the foundation of Unix, and Unix-like operating systems took over the world, and C is a decent enough systems programming language that people started using it to write other OSes as well.

It bothers me that there’s some kind of mysticism around C. I keep seeing weird superstitions like, “modern processors are designed to run C code fast”. Or that there’s some inherent property of C that makes it “closer to the hardware”. The reality is just that C has been around for so long that there are millions of lines of optimizations handcrafted into all the compilers, and people continue to improve the compilers because there are billions of lines of C that will benefit from it.

FORTRAN is also crazy fast, but people don't worship it the same way. SBCL and even some BASIC compilers approach the speed of C. And C is a high-level language, despite what many people who have never touched assembler may assert.

C is not a bad language, and once you get your head around it you can write anything in C, but it’s absolutely full of landmines (sorry, “undefined behaviors”).

The author makes some really great points about the standard library. A lot of C's pain points - memory management, string handling, etc. - stem from quirks in the standard library. And yet it's possible to completely ignore the standard library, especially if you're on an embedded system or bare metal. Personally I feel that a large standard library is a liability, and a much stronger property of a language is that the base language is small enough to keep the entirety of it in your head, and still have the ability to implement any algorithm you need in a minimal number of lines without resorting to mental gymnastics, and still be able to read the code afterwards. I think this is why Lisp is so revered. I feel like Lua also falls into this bucket.

We need to stop starting flame wars about which language is best, cargo culting the newest shiny language, and instead just learn the strengths and weaknesses of our tools and pick the right tool for the job. You can machine almost anything on a lathe, but sometimes a drill press or a mill are much better suited for the task.

UltraSane•1w ago
C's memory model made (some) sense when computers were very slow and mostly isolated, but it is a complete disaster for code connected to the internet.

zzo38computer•1w ago
I program in C, and many of the reasons they mention here are things I like about C programming. I use C89 (and sometimes C99), although I do use some of the GNU extensions (which, as far as I know, both GCC and Clang support).

UltraSane•1w ago
C's memory model requires shared invisible invariants that can't be encoded in the type system or in signatures. This makes C code incredibly fragile.
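
One small illustration of such an invariant, with a made-up API: nothing in this signature says that buf must hold at least len valid bytes, that the returned pointer aliases buf, or who frees what - yet every caller must know all three.

  #include <stddef.h>

  /* the contract lives only in docs/comments, not in the types:
     - buf must point to at least len valid bytes
     - the returned pointer aliases buf (do not free it)         */
  char *trim_front(char *buf, size_t len);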

wasmperson•1w ago
I don't think most of the recent essays we're seeing defending C are coming from experienced C veterans, but instead from much younger programmers who were introduced to "The C Way" of doing things by Handmade Hero.

It's surprising the number of people for whom that series appears to have completely rewritten their understanding of programming. It's almost like when someone reads Karl Marx or Richard Dawkins for the first time and finds themselves questioning everything they thought they knew about the world. It's such an outsized impact for such a seemingly straightforward tutorial series.

estimator7292•1w ago
I'm perfectly happy writing C With Classes. There isn't a problem in my domain that can't be solved with C, and frothing at the mouth about "safety" simply isn't relevant. I program machines and I need a language that doesn't do everything possible to hide that fact.

C is a fine language. Sure, it's got sharp edges to poke your eyes out and big gears that will rip your arm off, but guess what? So does the machine. Programming a machine is an inherently unsafe activity and you have to act like a responsible adult, not some cargo-culting lunatic who wants to purge the world of all code written before 2010.

I'm going back to statically allocating my 2KB of SRAM now. Humbug, etc