If you want to get started on PLT, Harper's PFPL is pretty accessible (https://www.cs.cmu.edu/~rwh/pfpl/). Even Martin-Löf's article on intuitionistic type theory (the one that introduced dependent types) is fairly readable for a PL geek.
I'm unfamiliar with Barendregt's article, but it sounds too mathematical by comparison. I.e., going by the title I'd classify it as mathematical logic rather than PL. Remember, there were no computers when Alonzo Church invented the lambda calculus in the 1930s.
However, providing people with a software engineering background an easier on-ramp to understanding PLT would be nice, wouldn't it?
This means that we users, and specifically in my case compiler developers, rarely get to use the fruits of such theory and research, making tools a bit janky in practice.
Pre-dates usable functional programming languages.
> immutability
Good point. Even if it doesn't have much to do with functional programming, it did originate from FP languages.
> closures, generators, monads
These are really concepts needed to make up for the limitations of a pure functional layer of abstraction in an imperative world.
Yet they have turned out to be much more widely useful: closures as callbacks, generators as the basis for coroutines, and monads for error handling, not to mention things like C#'s LINQ.
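To make that concrete, here's a rough C# sketch (all names invented for the example) of those same ideas in an everyday imperative codebase: a closure captured as a callback, a generator written with yield return, and a LINQ pipeline gluing them together.

    // Sketch only: a closure as a callback, a generator, and a LINQ pipeline.
    using System;
    using System.Collections.Generic;
    using System.Linq;

    class EverydayFpDemo
    {
        // A generator: lazily yields values; C#'s building block for
        // iterator-style coroutines.
        static IEnumerable<int> Squares(int count)
        {
            for (int i = 1; i <= count; i++)
                yield return i * i;
        }

        static void Main()
        {
            int threshold = 10;

            // A closure used as a callback: the lambda captures 'threshold'.
            Func<int, bool> isBig = n => n > threshold;

            // A LINQ pipeline: higher-order functions chained over the generator.
            var bigSquares = Squares(6).Where(isBig).Select(n => $"{n} is big");

            foreach (var line in bigSquares)
                Console.WriteLine(line); // 16, 25 and 36 pass the filter
        }
    }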
The counterparts of these concepts in the everyday programming world are fairly common sense design patterns that have been in use for fifty years.
The names and the beautiful theory for reasoning about them certainly come from FP languages, but that's a side effect of all PL theory being approached as FP theory; they are not fundamental to the 'functional' aspect of functional programming.
The research community has failed the "ordinary" interested programmer wanting to learn PLT by not producing an overview/top-down book which brings together the various strands of needed knowledge into a coherent whole. Pierce/Harper/etc's books are simply overwhelming for a beginner without the requisite background knowledge.
PLT approaches the idea of computation from a formal-system (i.e. language) point of view, and that is what needs to be communicated, with the necessary logical/mathematical knowledge presented inline. I have found the following useful in my study of PLT:
Formal Syntax and Semantics of Programming Languages: A Laboratory Based Approach by Ken Slonneger and Barry Kurtz - https://homepage.divms.uiowa.edu/~slonnegr/ and https://homepage.divms.uiowa.edu/~slonnegr/plf/Book/
Understanding Formal Methods by Jean-Francois Monin.
The problem is the focus on lambda calculus imo. I get that it's nice for mathematics and proofs, but for (almost all) practical applications it's useless.
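To be fair, you can encode it directly in a mainstream language, which shows both sides of the argument: everything-is-a-function is a lovely foundation, and also obviously not how you'd want to compute. A toy C# sketch (names invented) using Church numerals:

    // Church numerals: numbers encoded as "apply f n times", written with
    // C# delegates. Elegant as a foundation, useless as an arithmetic library.
    using System;

    class ChurchDemo
    {
        // A Church numeral takes a step function and returns its n-fold composition.
        delegate Func<int, int> Church(Func<int, int> f);

        static void Main()
        {
            Church zero = f => x => x;                              // apply f zero times
            Func<Church, Church> succ = n => f => x => f(n(f)(x)); // one more application

            Church three = succ(succ(succ(zero)));
            // "Decode" by counting applications of +1 starting from 0.
            Console.WriteLine(three(x => x + 1)(0));                // prints 3
        }
    }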
Unfortunately a lot of CS theory really is of limited application. The stuff that does apply then takes a very long time to make it into implementations, and then the implementation has to climb the adoption curve, facing entrenched users who don't like change at every step of the way. See the C++ -> Rust transition, for example.
Personally, I like the way C# has gradually introduced functional-flavored material into an OO, imperative language.
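For anyone who hasn't followed that evolution, here's a rough sketch (types invented for the example) of some of the FP-flavored bits in modern C#: immutable records, non-destructive 'with' updates, and pattern matching.

    // Sketch only: records, 'with' expressions, and switch-expression patterns.
    using System;

    // Positional record: value semantics, init-only properties.
    record Order(string Customer, decimal Total);

    class CsharpFpFeatures
    {
        static string Describe(Order o) => o switch
        {
            { Total: > 100m } => "big spender",
            { Total: > 0m }   => "regular",
            _                 => "empty order",
        };

        static void Main()
        {
            var order = new Order("Ada", 120m);

            // 'with' creates a modified copy instead of mutating in place.
            var refunded = order with { Total = 0m };

            Console.WriteLine(Describe(order));    // big spender
            Console.WriteLine(Describe(refunded)); // empty order
        }
    }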
Who's preventing this now? Take a look at things like Software Foundations and other works which are built on automated or interactive proof systems.
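For anyone who hasn't seen that style: you define a toy language inside the proof assistant and machine-check facts about it. A tiny sketch in Lean 4 syntax (Software Foundations itself uses Coq; everything below is invented for illustration):

    -- A tiny boolean expression language, an evaluator,
    -- and a machine-checked fact about it.
    inductive BExpr where
      | tt  : BExpr
      | ff  : BExpr
      | not : BExpr → BExpr

    def beval : BExpr → Bool
      | .tt    => true
      | .ff    => false
      | .not b => !(beval b)

    -- Double negation never changes the meaning of an expression.
    theorem beval_not_not (b : BExpr) : beval (.not (.not b)) = beval b := by
      cases h : beval b <;> simp [beval, h]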
Another peeve is when authors present algorithm pseudo-code in terms of mathematical symbols, where it isn't at all obvious how to compute the value of the symbols in practice. You've shown that there is an algorithm, but not in a useful way.
I mean, I see the value of mathematical notation; I have a math degree. But still....
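As one concrete example of the gap (everything below is invented for illustration): the typing rule for function application is usually written as an inference rule, and the executable reading is a recursive checker where each premise becomes a recursive call and the context Γ becomes a dictionary. A C# sketch:

    // The rule on paper:
    //
    //     Γ ⊢ f : A → B        Γ ⊢ x : A
    //     -------------------------------
    //              Γ ⊢ f x : B
    //
    // and one way to actually compute with it.
    using System;
    using System.Collections.Generic;

    abstract record Ty;
    record BaseTy(string Name) : Ty;
    record ArrowTy(Ty From, Ty To) : Ty;

    abstract record Term;
    record Var(string Name) : Term;
    record App(Term Fn, Term Arg) : Term;

    static class TypeChecker
    {
        public static Ty Infer(Term term, Dictionary<string, Ty> ctx)
        {
            switch (term)
            {
                case Var v:
                    return ctx[v.Name];              // Γ ⊢ x : Γ(x)
                case App app:
                    var fnTy  = Infer(app.Fn, ctx);  // premise: Γ ⊢ f : A → B
                    var argTy = Infer(app.Arg, ctx); // premise: Γ ⊢ x : A
                    if (fnTy is ArrowTy arrow && arrow.From.Equals(argTy))
                        return arrow.To;             // conclusion: Γ ⊢ f x : B
                    throw new Exception("ill-typed application");
                default:
                    throw new Exception("unhandled term form");
            }
        }
    }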
Building a language means accepting a massive up-front investment to build something that hopefully improves the user experience in a meaningful way. Or you can build one to learn about the inherent beauty of computation and formal languages, but I find that a little less compelling.