What a flex of patience!
Ah, like Stephen Bourne
Do you think the article meant to say it was more likely that the code wasn't inspired by APL?
I do agree that Whitney was inspired to some extent by APL conventions (not exclusively; he was quite a Lisp fan and that's the source of his indentation style when he writes multi-line functions, e.g. in [0]). The original comment was not just a summary of this claim but more like an elaboration, and began with the much stronger statement "The way to understand Arthur Whitney's C code is to first learn APL", which I moderately disagree with.
[0] https://aplwiki.com/wiki/List_of_open-source_array_languages
That's backing for a claim.
Also, I haven't once written APL. I think this might've been borderline trolling, just because of how little investment I have in the topic in reality. Sorry.
The C pre-processor is probably one of the most abused pieces of the C toolchain and I've had to clean up more than once after a 'clever' programmer left the premises and their colleagues had no idea of what they were looking at. Just don't. Keep it simple, and comment your intent, not what the code does. Use descriptive names. Avoid globally scoped data and functions with side effects.
That doesn't look smart and it won't make you look smart, but it is smart because the stuff you build will be reliable, predictable and maintainable.
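To make the cleanup pain concrete, here's a toy sketch of my own (not from any codebase I mentioned): a function-like macro that silently evaluates its argument twice, next to the boring function that doesn't.

```
#include <stdio.h>

/* The 'clever' version: looks like a function call, but isn't one. */
#define MAX(a,b) ((a) > (b) ? (a) : (b))

/* The boring version: a real function with evaluate-once semantics. */
static int max_int(int a, int b) { return a > b ? a : b; }

int main(void) {
    int i = 5, j = 5;

    int m = MAX(i++, 0);     /* i++ runs twice inside the expansion */
    printf("macro:    %d (i is now %d)\n", m, i);  /* prints 6, i is 7 */

    int f = max_int(j++, 0); /* j++ runs exactly once */
    printf("function: %d (j is now %d)\n", f, j);  /* prints 5, j is 6 */
    return 0;
}
```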
It’s probably more accessible than APL since its symbols can be found on conventional keyboards.
Ok, so this article is tongue in cheek. Good to know that up front.
This was a very fun read that I'm fairly convinced I will have to come back to.
And in every field they work well for the average case, but are rarely the best fit for that specific scenario. And in some rare scenarios, doing the opposite is the solution that best fits the individual/team/project.
The interesting takeaway here is that crowd wisdom should be given weight, and is probably the right default when we want to turn our brains off. But if you turn your brain on, you will unavoidably see the many cracks those solutions bring for your specific problem.
Just because you succeed at one says nothing about other practical and important metrics.
The proper way to read it is to understand the problem and its pros and cons.
Without speculating too much, the situation likely was: there's only one guy who can really deliver this, because of his knowledge, CV and experience, and we need it.
And at that point your choice is having a solution or not.
But even if we grant that only one person could deliver a solution, it wouldn’t change the fact that you’re giving up on certain things to get it.
Example: https://www.tuhs.org/cgi-bin/utree.pl?file=V7/usr/src/cmd/sh...
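For those who haven't seen it: Bourne wrapped C in a header of ALGOL-style macros so the shell source reads like ALGOL 68. A rough sketch in that spirit (paraphrased from memory; the actual macro set is in mac.h at the link):

```
/* ALGOL-flavoured control flow on top of C, in the spirit of Bourne's mac.h */
#define IF      if(
#define THEN    ){
#define ELSE    } else {
#define FI      ;}
#define WHILE   while(
#define DO      ){
#define OD      ;}

/* Expands to plain C: while( n > 1 ){ if( ... ){ ... } else { ... ;} ... ;} */
static int collatz_steps(int n) {
    int steps = 0;
    WHILE n > 1 DO
        IF n % 2 == 0 THEN
            n = n / 2;
        ELSE
            n = 3 * n + 1;
        FI
        steps = steps + 1;
    OD
    return steps;
}
```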
- you have no interest in maintaining your code
- your code will never be maintained by someone else
- you know your C preprocessor better than you know your C compiler
- your favorite language isn't available for this particular target
- you don't mind object level debugging
- your idea of a fun time is to spend a few hours per day memorizing code
- you really are smarter than everybody else
I believe “oo” is probably an infinity or error condition of some sort, but I'm not 100% sure. I didn't see the author discuss it, since they said it's not used. It was probably used during development as a debug printout.
no.
```
#define _(e...) ({e;})
#define x(a,e...) _(s x=a;e)
#define $(a,b) if(a)b;else
#define i(n,e) {int $n=n;int i=0;for(;i<$n;++i){e;}}
```
> These are all pretty straightforward, with one subtle caveat I only realized from the annotated code. They're all macros to make common operations more compact: wrapping an expression in a block, defining a variable x and using it, conditional statements, and running an expression n times.
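To make that concrete, here's a small self-contained sketch of how the four macros compose (my own example, not from the interpreter; it assumes GCC extensions, namely statement expressions and `$` in identifiers, and stands in a plain `long` for the interpreter's `s` type):

```
#include <stdio.h>

typedef long s;  /* stand-in; the real source defines its own scalar type */

#define _(e...) ({e;})                               /* block as expression (GCC statement expression) */
#define x(a,e...) _(s x=a;e)                         /* bind a to a local named x, then evaluate e */
#define $(a,b) if(a)b;else                           /* conditional; the next statement is the else */
#define i(n,e) {int $n=n;int i=0;for(;i<$n;++i){e;}} /* evaluate e n times, with i as the loop index */

int main(void) {
    /* For each i in 0..2: bind x=i*i, print x if x>1, otherwise print "small". */
    i(3, x(i*i, $(x>1, printf("%ld\n", x)) printf("small\n")))
    return 0;
}
```

Run it and you get "small", "small", "4": the trailing `else` in `$` is what lets the second printf act as the else branch.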
This is war crime territory
Some of these are wrong to

#define $(a,b) if(a)b;else

due to not having brackets. So it's just extremely lazy to.
> Some of these are wrong to[o] <- that needs an extra 'o'
> due to not having brackets. <- that one is fine
> So it's just extremely lazy to[o]. <- that needs an extra 'o' too
'to' comes in two versions, 'too' and 'to', which have different meanings.
The whole point of the piece seems completely lost on some readers. Yes, we all know that #define $(a,b) if(a)b;else is questionable. I don't need a crash course on C macros in the comments, thank you. The author already acknowledges that Whitney's style is controversial. Do we really need to keep rehashing that point in every comment, or can we finally focus on how all this unconventional code fits together beautifully to form a working interpreter?
'Hey, look at this interesting way of using the CPP to create a DSL'
I'm fine with that. But this is precisely what aspiring C programmers should avoid at all costs. It's not controversial. It's bad.
This is an enduring great & terrible thing about sites like HN and reddit: As people become more senior & experienced, junior engineers come in to fill the ranks. You and I don't need a crash course on C macros in the comments. But I promise you, a lot of people here have no idea why #define $(a,b) if(a)b;else is a weird C macro.
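For those readers, a quick illustration of my own (not from the article) of why it's weird: the macro body has no braces, and its trailing `else` silently captures whatever statement happens to come next (this compiles with GCC, which allows `$` in identifiers by default).

```
#include <stdio.h>

#define $(a,b) if(a)b;else

int main(void) {
    int err = 1;

    /* Reads like a complete one-armed conditional, but it isn't:
       the expansion ends in a dangling `else`... */
    $(err, printf("handling error\n"))
    printf("cleanup\n");  /* ...which captures this line: it runs only when !err */

    /* Expansion: if(err)printf("handling error\n");else printf("cleanup\n"); */
    return 0;
}
```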
How much should HN cater to junior engineers?
The assumption that HN should cater to junior engineers is curious. It implies a purpose the site has never claimed to have.
The company was originally a bunch of Access/VB6 programmers.
Then they wrote their VB code in PHP.
And then they wrote their PHP code in Python. It was disgusting.
https://github.com/KxSystems/javakdb/blob/8a263abee29de582cd...
There's a decent chance your broker (or their dealers) are using stuff built on this.
I wholeheartedly concur with popular opinion. It's like writing a program in obfuscated code.
Hmmm... his way of basically making C work like APL made me wonder: Is there a programming language out there that defines its own syntax in some sort of header and then uses that syntax for the actual code?
I’ve never seen code written like this in real-world projects — except maybe for things like the "business card ray tracer". When I checked out Arthur Whitney’s Wikipedia page I noticed he also made the J programming language (which is open source), and the code there has that same super-dense style: https://github.com/jsoftware/jsource/blob/master/jsrc/j.c
Lucky you. I've seen far worse (at least this is somewhat consistent). But this isn't C anymore, it is a new language built on top of C and then a program written in that language. C is merely the first stage compilation target.
The C preprocessor gives you enough power to shoot yourself in the foot, repeatedly, with anything from small caliber handguns to nuclear weapons. You may well end up losing control over your project entirely.
One nice example: glusterfs. There are a couple of macros in use there that, when they work, are magic. But when they don't, you lose days, sometimes weeks. This is not the way to solve coding problems; you only appear smart as long as you remember what you've built. Your other self, three years down the road, is going to want to kill the present one, and the same goes for your colleagues a few weeks from now.
yes! like any craft, this works only if you keep practising it.
various implementations of k, written in this style (with iterative improvements), have been in constant development for decades getting very good use out of these macros.
I'm seeing this on multiple fronts, and it's quickly becoming an unsustainable situation in some areas. I expect I'm not alone in this regard.
And the C pre-processor has figured prominently in more than one such case in my career. And it was precisely in the kind of way that is described in TFA.
For something to be doable it needs to make economic sense as well and that's the problem with nightmare trickery like this. Initially it seems like a shortcut, but in the long run the price tag keeps going up.
Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?
"Which line was that again? Oh... "
Picks up the phone, dials.
"Honey, I won't be home in time for dinner."
At first, I thought it looked like line noise. $var on the left of the = sign? Constructs like $_ and @_? More obscure constructs were worse.
But I had to keep going, and then one day something happened. It was like one of those 3D stereograms where your eyes have to cross or uncross. The line noise became idioms and I just started becoming fluent in perl.
I liked some of it too - stuff like "unless foo" being a more readable/human way of saying "if not foo".
perl became beautiful to me - it was the language I thought in, and at the highest level. I could take an idea in my mind and express it in perl.
But I had some limits. I would restrain myself from putting entire loops or nested expressions on one line just to "save space".
I used regular expressions, but sometimes would match multiple times instead of all in one giant unreadable "efficient" expression.
and then, I looked at other people's perl. GAH! I guess other people can "express themselves in perl", but rarely was it beautiful or kind; it was statistically worse and closer to vomit.
I like python now. more sanity, (somewhat) more likely that different people will solve a problem with similar and/or readable code.
by the way, very powerful article (even if I intensely dislike the code)
When I see stuff like this, personally, I don't try to understand it, as code like this emerges from basically three motivations:
- The other person wanted to write in some other more (functional|object oriented|stack) language but couldn't, so they did this.
- The person couldn't be bothered to learn idioms for the target language and didn't care about others being able to read the program.
- The person intentionally wanted to obfuscate the program.
And none of these are good reasons to write code in a particular way. Code is about communication. Code like this is the equivalent to saying "I know the grammatical convention in English is subject-verb-object but I feel like speaking verb-object-subject and other people will have to just deal with it"—which, obviously, is a horrible way to communicate if you actually want to share ideas/get your point across.
That all said, the desire to have logic expressed more compactly and declaratively definitely resonates. Unfortunately, C-style verbosity and impurity remain dominant.
I've got too much other stuff to do than decode the Voynich manuscript...