IMO it was a super cool idea for the more technical content that's common in STEM fields.
Here’s an example from my old chemistry notes:
https://colbyn.github.io/old-school-chem-notes/dev/chemistry...
But it goes much deeper than that. A professor of mine once explained that many great discoveries are paired with new notation. That new notation signifies "here's a new way to think about this problem," and many unsolved problems today will eventually give way to powerful notation.
The DSL/language-driven approach first creates a notation fitting the problem space directly, then worries about implementing the notation. It's truly empowering. But this is the Lisp way. The APL (or Clojure) way is about making your base types truly useful: 100 functions on 1 data structure instead of 10 on 10. So instead of creating a DSL in APL, you design and lay out your data very carefully, and then everything just falls into place, a bit backwards from the first impression.
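To make that concrete, here is a toy sketch (the sales/region data is invented): with flat arrays and the key operator, grouped aggregation needs no dedicated API at all:

sales←12 5 7 3 9 ⍝ amounts
region←1 2 1 2 1 ⍝ group key for each amount
region{+/⍵}⌸sales ⍝ per-region sums: 28 8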
Reminds me of Richard Feynman. He started inventing his own math notation as a teenager while learning trigonometry. He didn’t like how sine and cosine were written, so he made up his own symbols to simplify the formulas and reduce clutter. Just to make it all more intuitive for him.
And he never stopped. Later, he invented entirely new ways to think about physics tied to how he expressed himself, like Feynman diagrams (https://en.wikipedia.org/wiki/Feynman_diagram) and slash notation (https://en.wikipedia.org/wiki/Feynman_slash_notation).
The paper doesn't really explore this concept well, IMHO. However, after a lot of time reading and writing APL applications, I have found that it points at a way of managing complexity radically different from abstraction.
We're inundated with abstraction barriers: APIs, libraries, modules, packages, interfaces, you name it. Consequences of this approach are almost cliché at this point: dizzyingly high abstraction towers, developers as just API-gluers, disconnect from underlying hardware, challenges reasoning about performance, etc.
APL makes it really convenient to take a different tack. Instead of designing abstractions, we can carefully design our data to be easily operated on with simple expressions. Where you would normally see a library function or DSL term, this approach just uses primitives directly:
For example, we can create a hash map of vector values and interned keys with something like:
str←(⊂'') 'rubber' 'baby' 'buggy' 'bumpers' ⍝ string table
k←4 1 2 2 4 3 4 3 4 4 ⍝ keys
v←0.26 0.87 0.34 0.69 0.72 0.81 0.056 0.047 0.075 0.49 ⍝ values
Standard operations are then immediately accessible:
k v⍪←↓⍉↑(2 0.33)(2 0.01)(3 0.92) ⍝ insert values
k{str[⍺] ⍵}⌸v ⍝ pretty print
k v⌿⍨←⊂k≠str⍳⊂'buggy' ⍝ deletion
What I find really nice about this approach is that each expression is no longer a black box, making it really natural to customize expressions for specific needs. For example, insertion in a hashmap would normally need code for potentially adding a new key, but above we're making use of a common invariant: we only ever append values to existing keys.

If this were a library API, there would either be an unused code path here, lots of variants on the insertion function, or some sophisticated type inference to do dead code elimination. Those approaches end up leaking non-domain concerns into our codebase. But by subordinating detail instead of hiding it, we give ourselves access to as much domain-specific detail as necessary, while letting the non-relevant detail sit silently in the background until needed.
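For contrast, a general insert that does handle new keys is still only a few lines. This is a hypothetical sketch reusing str, k, and v from above; the key 'wug' and value 0.5 are made-up inputs:

key val←'wug' 0.5 ⍝ made-up entry to insert
i←str⍳⊂key ⍝ lookup; yields (≢str)+1 when the key is absent
str,←(i>≢str)⍴⊂key ⍝ intern the key only if it's new
k,←i ⋄ v,←val ⍝ append the key/value pair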
Of course, doing things like this in APL ends up demanding a lot of familiarity with the APL expressions, but honestly, I don't think that ends up being much more work than deeply learning the Python ecosystem or anything equivalent. In practice, the individual APL symbols really do fade into the background and you start seeing semantically meaningful phrases instead, similar to how we read English words and phrases atomically and not one letter at a time.
This is infeasible in most languages, but if your language is concise and expressive enough, it becomes possible again to a large degree.
I always think about how Arthur Whitney just really hates scrolling. Let alone 20 open files and chains of "jump to definition". When the whole program fits on one page, all that vanishes. You navigate with eye movements.
I like your funny words. No, really, I should spend some time learning APL.
But your idea deeply resonates with my struggle over the last few weeks.
I have a legacy Python codebase with too much coupling, and every prior attempt to "improve things" ended up adding more abstraction over a plainly wrong data model.
You can't infer, reading the code linearly, what methods mutate their input objects. Some do, some don't. Sometimes the same input argument is returned even without mutation.
I would prefer some magic string that could be analyzed and understood over this sea of indirection, with factories returning different calculators that in some instances don't even share the same interface.
Sorry for the rant.
https://www.arraycast.com/episodes/episode92-iverson
It's quite interesting, and arguably more approachable than the Turing lecture.
In 1979, APL wasn't as weird and fringe as it is today. Programming languages weren't the global mass phenomena they are now; pretty much all of them were weird and fringe. C was rather fresh at the time, and if one squints a bit, APL can look like an abstraction that isn't very far from dense C, one that lets you program a computer without implementing pointer juggling over arrays yourself.
Many high schools were teaching mathematics with APL! There are quite a few textbooks for learning math with APL [1] or J [2] syntax. Iverson originally wrote APL as a superior syntax for math; the programming implementation came a few years later.
[1] https://alexalejandre.com/about/#apl
[2] https://code.jsoftware.com/wiki/Books#Math_for_the_Layman
fc417fc802•3h ago
For some reason the reality is unintuitive to me: the other tools would have taken me far longer. All the stuff that feels difficult, like it's just eating up time, is actually me being forced to work out the problem specification in a more condensed manner.
I think it's like climbing a steeper but much shorter path. It feels like more work but it's actually less. (The point of my rambling here is that I probably ought to learn APL and use it instead.)
Qem•3h ago
https://analyzethedatanotthedrivel.org/2018/03/31/numpy-anot...
skruger•2h ago
https://xpqz.github.io/learnapl
(disclosure: author)
xelxebar•1h ago
Very well put!
Your experience aligns with mine as well. In APL, the sheer austerity of architecture means we can't spend time on boilerplate and are forced to immediately confront core domain concerns.
Working that way has gotten me to see code as a direct extension of business, organizational, and market issues. I feel like this has made me much more valuable at work.