>The idea is to provide an unmoderated side channel for random contributors to work on a project, with a similar rationale to e.g. Wikipedia's - that given enough interested people, the quality will grow rapidly and occasional "vandalism" will get fixed quickly. Of course this may not work nearly as well for software, but here we are, to give it a try.
A small and minimalistic C compiler is actually a very important foundational project for the software world IMNSHO.
I'm definitely reminded of: https://xkcd.com/2347/
Actually, geoblocks can be confounding of course. After Brexit I've personally thought of blocking UK phone numbers from calling me though... So it could just as well be intentional.
When I taught programming (I started teaching 22 years ago), the course still had students either use GCC with their university shell accounts or, if they were Windows people, Borland C++, which we could provide under some kind of fair use arrangement IIANM, and which worked within a command shell on Windows.
I used it just the other day to do some tests. No dependencies, no fiddling around with libwhatever-1.0.dll or stuff like that when on Windows, and so on.
Perhaps, or maybe they just got tired of students coming in and claiming that their program worked perfectly on such-and-such compiler.[1] It looks like tcc would run on most systems from the time of its introduction, and perhaps some that are a great deal older. When I took a few computer science courses, they were much more restrictive. All code had to be compiled with a particular compiler on their computers, and tested on their computers. They said it was to prevent cheating but, given how trivial it would have been to cheat with their setup, I suspect it had more to do with shutting down arguments with students who came in to argue over grades.
[1] I was a TA in the physical sciences for a few years. Some students would try to argue anything for a grade, and would persist if you let them.
https://arstechnica.com/ai/2026/02/sixteen-claude-ai-agents-...
> The $20,000 experiment compiled a Linux kernel but needed deep human management.
Except it was written in a completely different language (Rust), which likely would have necessitated a completely different architecture, and nobody has established any relationship, algorithmic or otherwise, between that compiler and TCC. Additionally, Anthropic's compiler supports x86_64 (partially), ARM, and RISC-V, whereas TCC supports x86, x86_64, and ARM. And TCC is only known to be able to boot a modified version of the Linux 2.4 kernel[1], not an unmodified Linux 6.9.
Additionally, it is extremely unlikely for a model to be able to regurgitate this many tokens of something, especially translated into another language, especially without being prompted with the starting set of tokens in order to specifically direct it to do that regurgitation.
So, whatever you want to say about the general idea that all model output is plagiarism of patterns it's already seen: it seems pretty clear to me that this does not fit the hyperbolic description put forward in the parent comments.
/* add a file (either a C file, dll, an object, a library or an ld script). Return -1 if error. */
int tcc_add_file(TCCState *s, const char *filename);
/* compile a string containing a C source. Return non zero if error. */
int tcc_compile_string(TCCState *s, const char *buf);

https://guix.gnu.org/manual/1.5.0/en/html_node/Full_002dSour...
But that would require terminally online frogs acting in their collective interests, not isolating at home hoping the heat never reaches them.
The authors say, basically, that there's a risk of prosecution in the UK that would financially devastate anyone that works on the project, and that the act of determining how to comply with UK laws is itself an extremely resource-intensive legal task that they can't or won't do. In other words, they're geoblocking the UK not out of activism but out of pragmatic self-preservation.
That's not in any way mutually exclusive with collective action.
...also, couldn't deciding to geoblock the UK be a form of collective action? If that's what you originally meant, I sincerely apologize for reading it backwards.
Tiny C and Small C are names I seem to recall, but it's very fuzzy - not sure if they were compilers, may have been interpreters...
You are using the compiler to compile itself.
"TCC is its own test set." Absolutely brilliant.
[1] https://en.wikipedia.org/wiki/Compilers:_Principles,_Techniq...
It’s focused on theory and very heavy on parsing. All of that is fine, but not especially useful for the hobbyist.
- Writing a Compiler is Surprisingly Easy - https://news.ycombinator.com/item?id=38182461
- Write your own retro compiler - https://news.ycombinator.com/item?id=38591662
- Compiling a Lisp - https://news.ycombinator.com/item?id=39216904
- Writing a C Compiler - https://news.ycombinator.com/item?id=41227716
- Compilers: Incrementally and Extensibly - https://news.ycombinator.com/item?id=43593088
- Working through 'Writing a C Compiler' - https://news.ycombinator.com/item?id=44541565
- Build a Compiler in Five Projects - https://news.ycombinator.com/item?id=46031220
Personally I found Crafting Interpreters to be a terrific introduction to the key concepts: https://craftinginterpreters.com
(But I'm just a kibitzer, I've never written anything more serious than a DSL in Perl.)
#!/usr/bin/env -S nim r --cc:tcc -d:useMalloc --verbosity:0 --hints:off --tlsEmulation:on --passL:-lm
echo "Hello from Nim via TCC!"
Here's a comparison (bash script at [1]) of a minimal binary compiled this way with different interpreters. The first line is the noise floor. Measured by tim[2], written by @cb321.

1.151 +- 0.028 ms (AlreadySubtracted)Overhead
1.219 +- 0.037 ms bash -c exit
2.498 +- 0.040 ms fish --no-config --private -c exit
1.682 +- 0.058 ms perl -e 'exit 0'
1.621 +- 0.043 ms gawk 'BEGIN{exit 0}'
15.8 +- 2.2 ms python3 -c 'exit(0)'
20.0 +- 5.7 ms node -e 'process.exit(0)'
-2.384 +- 0.041 ms tcc -run x.c
153.2 +- 4.6 ms nim r --cc:tcc x.nim
164.5 +- 1.2 ms nim r --cc:tcc -d:release x.nim
Measured on a laptop without any care taken to clean the environment, except setting the performance governor. Even with `-d:release`, compiling the Nim code is comparable. The fact that the tcc compilation cycle measures negative here is a nice punchline.
[1]: https://gist.github.com/ZoomRmc/58743a34d3bb222aa5ec02a5e2b6...
(Since a lot of the stuff you'd use this for doesn't change often, and the cached build makes "nim r" run very quickly afterwards, I don't think this invalidates the "point", but still.)
There's also the Nim interpreter built into the compiler, "NimScript", which can be invoked like:
#!/usr/bin/env -S nim e --hints:off
echo "Hello from Nim!"
The cool thing is that, without --forceBuild, Nim + TCC (as a linker) has a faster startup time than NimScript. But if you include compile time, NimScript wins.
rustyhancock•4h ago
Sad but not surprised to see it's no longer maintained (8 years ago!).
Even in the era of terabyte NVMe drives my eyes water when I install MSVC (and that's usually just for the linker!)
antirez•4h ago
pkal•4h ago
kristianp•4h ago
https://lists.nongnu.org/archive/html/tinycc-devel/2026-02/t...
shakna•3h ago
Debian, Fedora, Arch and others pull their package from the mob repo. They're pretty good at pulling in CVE fixes almost immediately.
Thomas Preud'homme is the new maintainer lead, though the code follows a mob approach.