E.g. node says "oh no, you need a library to write tests!" and now that means that you have to have a testing framework built into your runtime. And of course it's just another library, really, that competes with the original one, but this one is blessed with standards so it has a monopolistic advantage that will deter further innovation.
It's called convenience. You're free to create a testing library so good that it'll knock socks off the stdlib one.
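Python is an existence proof for the "convenience" side of this argument: it has shipped a test runner in the stdlib (`unittest`, since Python 2.1) for decades, and third-party frameworks like pytest still out-compete it on ergonomics. A minimal sketch:

```python
# Python's stdlib test runner coexisting with third-party ones:
# unittest is built in, yet pytest thrives alongside it.
import unittest

class TestArithmetic(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(2 + 2, 4)

# run the suite programmatically instead of unittest.main(),
# so this also works when imported rather than executed directly
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestArithmetic)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Whether the blessed stdlib runner deterred or stimulated that competition is exactly the question being argued here.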
Have we given up on having a market economy? You can't have a fair game when one team playing is also the referee. Yet everywhere I look that's the way of it. Apple runs the App Store so that it can also ensure that every successful app is theirs. The supermarket stocks the shelves, but also uses its position of power to kill successful independent brands. Node takes libraries which have taken years of work and shits out builtins with at most a week invested.
I think it's a national embarrassment: anticompetition to the point of communism held up as if it were the genuine spirit of the American Dream.
> I think it's a national embarrassment: anticompetition to the point of communism held up as if it were the genuine spirit of the American Dream.
You think including a testing library in a language implementation or standard tooling is tantamount to communism?
With that said, the divergence in comments on this very insightful and well written article will soon provide an unusually clear means of determining who is commenting on the title and who is commenting on the article.
I feel similarly to the article author. It is trivially true that we could express anything in all the general-purpose languages - that's what general purpose even means - but I find that for computer languages the weak Sapir-Whorf hypothesis checks out pretty well. The language changes how you think about the problem.
I follow new language developments with keen interest, but few of them will ever reach the level of maturity to be considered serious candidates for adoption. It's also risky to adopt a language that you cannot easily hire developers for, for example.
Libraries are great, but there is only so much they can address, and that depends on the language, too, as the article correctly points out. And there are two kinds of libraries: tool libraries and frameworks. Someone once said it nicely: "Frameworks are like Hollywood - 'You don't call us, we call you!'". Frameworks often require you to submit to their control flow rather than the other way round; that's why I prefer tool libraries.
What comes close is:
#! /usr/bin/env elixir
Mix.install([:jason])

defmodule JsonPrettyPrinter do
  def get_stdin_data do
    :stdio
    |> IO.read(:all)
    |> Jason.decode()
    |> case do
      {:ok, json_data} -> json_data
      _ -> raise "Invalid JSON payload was provided"
    end
  end

  # pretty-print by re-encoding with Jason
  def pretty_print_json(data), do: Jason.encode!(data, pretty: true)
end

JsonPrettyPrinter.get_stdin_data()
|> JsonPrettyPrinter.pretty_print_json()
|> IO.puts()

https://taonexus.com/publicfiles/jan2026/171toy-browser.py.t...
it doesn't look like it would be easily derived from Chromium or Firefox, because this code is Python and those don't use Python this way.
By the way is there any feature you'd like to see added to the toy browser? The goal is that one day it's a replacement for Chrome, Firefox, etc. It's being built by ChatGPT and Claude at the moment. Let me know if there are any feature ideas you have that would be cool to add.
Great questions. 1. Yes, for the moment. Like the title of this article suggests - we're using a library! :)
It's great to iterate in Python, which has a large ecosystem of libraries. Believe it or not, there is a chance that in the future it will be able to translate the language into a different one (for example, C++) while using C++ bindings for the same GUI libraries. This would speed up its actions by 40x. However, not all of the libraries used have C++ bindings, so it could be harder than it looks.
2. Here's the current version of the source code:
https://taonexus.com/publicfiles/jan2026/171toy-browser.py.t...
you can have a quick read through. Originally it was using tkinter for the GUI toolkit. I believe it is still using tkinter, but the AI might be leaning on some other library. As you read it, is it using anything but tkinter for the GUI toolkit?
These libraries are doing a lot of heavy lifting, but I think it is still ending up drawing in tkinter (not handing off rendering to any other library.)
Python lets you dynamically import from anywhere. The syntax is a bit funky, but that's what LLMs are for.
With Deno you can just import by relative file path and it just works like you'd expect and the tools support it. I wish more languages worked like that.
https://docs.python.org/3/reference/import.html#relativeimpo...
You'd use:
import ...foo.bar

Everyone wants that to just mean "import relative to this file", but it doesn't.
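The workaround Python does support is loading a module from an explicit file path via `importlib` - clunkier than Deno's relative imports, but it works. A sketch (the throwaway module here is just for demonstration):

```python
# Loading a module from an explicit file path via the stdlib,
# sidestepping the package-relative-import rules entirely.
import importlib.util
import os
import tempfile

def import_from_path(name, path):
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# demo: write a throwaway module to disk and import it by path
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("def answer():\n    return 42\n")
mod = import_from_path("scratch", f.name)
print(mod.answer())  # 42
os.unlink(f.name)
```

In real code you'd pass a path relative to `__file__` instead of a temp file.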
Sure, for "applications", the ecosystem can be frustrating at times, but I don't think that's what we're talking about here.
Also I think a Python script is reasonable if you use a type-checker with full type annotations, although they are not a silver bullet. For most scripts I use fish, which is my preferred interactive shell too.
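To illustrate the type-checked-script point (function name here is made up): with full annotations, a checker catches misuse before the script ever runs.

```python
# a minimal fully-annotated script; a checker like `mypy --strict`
# or pyright flags the commented-out call as an error before runtime
def format_mb(size_bytes: int) -> str:
    return f"{size_bytes / 1e6:.6f} MB"

print(format_mb(1234))   # 0.001234 MB
# format_mb("1234")      # checker: incompatible type "str"; expected "int"
```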
[1]: https://hackage.haskell.org/package/shh
[2]: https://docs.haskellstack.org/en/v3.9.1/topics/scripts/
[3]: https://wiki.nixos.org/wiki/Nix-shell_shebang. On a side note, if you were to use nix's shebang for haskell scripts with dependencies, you should be using https://github.com/tomberek/- instead of impure inputs, because it allows for cached evaluation. I personally cloned the repo to my personal gitlab account, since it's small and should never change
I’m taking the GP seriously instead of dismissing it. Raku looks like more fun than nushell tbh.
print 42 + 99; # 141
print &print.file ; # ...src/core.c/io_operators.rakumod
print &infix:<+>.file; # ...src/core.c/Numeric.rakumod
print ?CORE::<&print>; # True
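For what it's worth, those four lines are introspecting where core routines are defined. Python's stdlib can answer the same kind of question - a rough analogue, not a translation:

```python
# asking a stdlib function where its source lives, roughly what the
# Raku &print.file / &infix:<+>.file examples above are doing
import inspect
import json

print(inspect.getsourcefile(json.dumps))  # .../json/__init__.py
print("dumps" in dir(json))               # True
```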
I barely understood these four example lines.

Contrived example:
ls | where type == 'file' | sort-by size | take 4 | each {|f| {n: $f.name, s: ($f.size | format filesize MB) }} | to json
outputs {
"n": "clippy.toml",
"s": "0.000001 MB"
},
{
"n": "README.md",
"s": "0.000009 MB"
},
{
"n": "rustfmt.toml",
"s": "0.000052 MB"
},
{
"n": "typos.toml",
"s": "0.00009 MB"
}

gci -file | sort-object size | select name, size -first 4 | % { $_.size /= 1MB; $_ } | ConvertTo-Json

The last command is properly cased, because I pressed tab (it auto-completes and fixes the case). The other commands I typed without tab completion. You can write however you want; PS is not case-sensitive.
I agree that bash sucks, but I really have no motivation to learn something like nushell. I can get by with bash for simpler things, and when I get frustrated with bash, I switch to python, which is default-available everywhere I personally need it to be.
Back to text, though... I'm honestly not sure objects are strictly better than dumb text. Objects means higher cognitive overhead; text is... well, text. You can see it right there in front of you, count lines and characters, see its delimiters and structure, and come up with code to manipulate it. And, again, if I need objects, I have python.
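For comparison, here's roughly the nushell pipeline from upthread done in plain Python - matching the shape of the output, not byte-for-byte:

```python
# smallest four regular files in the cwd, as pretty-printed JSON -
# approximately what the nushell one-liner upthread produces
import json
from pathlib import Path

files = sorted(
    (p for p in Path(".").iterdir() if p.is_file()),
    key=lambda p: p.stat().st_size,
)[:4]
print(json.dumps(
    [{"n": p.name, "s": f"{p.stat().st_size / 1e6:.6f} MB"} for p in files],
    indent=2,
))
```

More verbose than nushell, but no new shell to learn.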
About objects vs. text: I'm convinced that objects are vastly superior. There was a comment about this here with good arguments: https://news.ycombinator.com/item?id=45907248
Unfortunately, I don't think Nushell brings much benefit for folks who already know Bash enough to change directories and launch executables and who already know Python enough to use more complicated data structures/control flow/IDE features
I'm still rooting for Nushell as I think its a really cool idea.
ls -l --sort=size | head -n 5 | tail -n 4 | awk '{print $5 " = " $9}' | numfmt --to iec | jq --raw-input --null-input 'inputs | gsub("\r$"; "") | split(" = "; "") | select(length == 2) | {"s": (.[0]), "n": .[1]}'

I think it doesn't even work correctly. ls lists files and directories and then picks the first 4 (it should only select files).
And this also uses awk and jq, which are not just simple "one purpose" tools, but pretty much complete programming languages. jq is not even part of most standard installations, it has to be installed first.
find -maxdepth 1 -type f -printf '%s %f\n' | sort -n | head -n 5
For the latter part, I'd tend to think that if you're going to use awk and jq, you might as well use Ruby.

ruby -rjson -nae 'puts(JSON.pretty_generate({n: $F[1], s: "%.5f MB" % ($F[0].to_i / 10e6) }))'
("-nae" effectively takes an expression on the command line (-e), wraps it in "while gets; ... end" (-n), and adds the equivalent of "$F = $_.split" before the first line of your expression (-a).)

It's still ugly, so no competition for nushell still.
I'd be inclined to drop a little wrapper in my bin with a few lines of helpers (see my other comment) and do it all in Ruby if I wanted to get closer without having to change shells...
https://lucasoshiro.github.io/posts-en/2024-06-17-ruby-shell...
In a way that exactly illustrates the GGP's point: why learn a new language (nushell's) when you can learn awk or jq, which are arguably more generally- and widely-applicable than nushell. Or if awk and jq are too esoteric, you could even pipe the output of `find` into the python or ruby interpreters (one of which you may already know, and are much more generally applicable than nushell, awk, or jq), with a short in-line script on the command line.
That is backwards. I know I said "complete programming languages", but to be fair, awk only shines when it comes to "records processing", jq only shines for JSON processing. nushell is more like a general scripting language — much more flexible.
E = Struct.new(:name, :size, :type)
def ls = Dir.children('.').map{ s=File::Stat.new(_1); E.new(_1, s.size, s.file? ? 'file' : 'dir') }
This becomes valid Ruby: ls.find_all{_1.type == 'file'}.sort_by(&:size).take(4).map{ {n: _1.name, s: _1.size } }.each { puts JSON.pretty_generate(_1) }
(drops your size formatting, so not strictly equivalent)

Which isn't meant to "compete" - nushell looks nice - but to show that the lower-threshold option for those of us who don't want to switch shells is to throw together a few helpers in a language... (you can get much closer to your example with another helper or two and a few more "evil" abuses of Ruby's darker corners, but I'm not sure it'd be worth it; I might add a wrapper for the above to my bin/ though)
Discussion: https://news.ycombinator.com/item?id=46431028
Though, I don't think it has the capability for single-file scripts to declare 3rd-party dependencies to be automatically installed.
The best option I've found for this use case (ad-hoc scripting with third party dependencies) is Deno.
I'm hoping Rust will get there in the end too.
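Python, for what it's worth, grew exactly this recently: PEP 723 inline script metadata lets a single file declare its own third-party dependencies, and runners like `uv run` or `pipx run` install them on the fly. A sketch (the `rich` dependency is just an example):

```python
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "rich",    # example third-party dependency
# ]
# ///
# Run as `uv run script.py`: the runner reads the comment block above
# and executes the script in an environment with `rich` installed.
try:
    from rich import print as rprint  # available under `uv run`
except ImportError:                   # plain `python script.py` fallback
    rprint = print

rprint("deps declared inline, resolved by the runner")
```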
LLMs are eval(). Skills are programs. YAML is the motherboard.
@unkulunkulu nails it -- "library as the final language", languages all the way down. Exactly. Skills ARE languages. They teach the interpreter what to understand. When the interpreter understands intent, the distinction dissolves.
@conartist6: "DSL is fuzzy... languages and libraries don't have to be opposing" -- yes. Traditional DSL: parse -> AST -> evaluate. LLM "DSL": read intent -> understand -> act. All one step. You can code-switch mid-sentence and it doesn't care.
The problem with opinionated frameworks like ROR and their BDFLs like DHH is that one opinion is the WRONG number!
The key insight nobody's mentioned: SPEED OF LIGHT vs CARRIER PIGEON.
Carrier pigeon: call LLM, get response, parse it, call LLM again, repeat. Slow. Noisy. Every round-trip destroys precision through tokenization.
Speed of light: ONE call. I ran 33 turns of Stoner Fluxx -- 10 characters, many opinions, game state, hands, rules, dialogue, jokes -- in a single LLM invocation. The LLM simulates internally at the speed of thought. No serialization overhead. No context-destroying round trips.
@jakkos, @PaulHoule: nushell and Python are fine. But you're still writing syntax for a parser. What if you wrote intent for an understander?
Bash is a tragedy -- quoting footguns, jq gymnastics, write-only syntax. Our pattern: write intent in YAML, let the LLM "uplift" to clean Python when you need real code.
Postel's Law as type system: liberal in what you accept. Semantic understanding catches nonsense because it knows what you MEANT, not just what you TYPED.
Proof and philosophy: https://github.com/SimHacker/moollm/blob/main/designs/stanza...
So do you disagree with any of my points, or my direct replies to other people's points, or is that all you can think of to say, instead of engaging?
Do you prefer to use bash directly? Why? If not, then what is your alternative?
What do you think of Anthropic Skills? Have you used or made any yourself, or can you suggest any improvements? I've created 50+ skills, and I've suggested, implemented, and tested seven architectural extensions -- do you have any criticism of those?
https://github.com/SimHacker/moollm/tree/main/skills
Obviously you use llms yourself, so you're not a complete luddite, and you must have some deeper more substantial understanding and criticism than those two words from your own experience.
How do your own ideas that you blogged about in "My LLM System Prompt" compare to my ideas and experience, in your own "professional, no bullshit, scientific" opinion?
https://mahesh-hegde.github.io/posts/llm_system_prompt/
Your entire blog post on LLM prompts is "I don't like verbiage" in five sentences. Ironic, then, that your entire contribution here is two empty words. I made specific technical points, replied to real people, linked proof. 'Slop' is the new 'TL;DR' -- a confession of laziness dressed as critique. Calling substance slop while contributing nothing? That's actual slop.
#!/usr/bin/dotnet run
https://devblogs.microsoft.com/dotnet/announcing-dotnet-run-...
https://andrewlock.net/exploring-dotnet-10-preview-features-...
Ruby got a hype phase with regards to rails. It then dropped. A lot.
TIOBE, while it is in general crap, is somewhat accurate when you plot things over time:
https://www.tiobe.com/tiobe-index/ruby/
So, ruby peaked with rails between 2006 to 2009 or so, give or take. Then the decline phase set in, and now it is unfortunately also crawling behind perl into extinction. This is very unfortunate - I still use ruby almost daily as the ultimate glue language. But this can not be denied now that ruby is following the extinction path perl already had going some years before.
I was using ruby before rails was created, and ruby covers all my web needs. I had a web framework in PHP, used it for about three years, then ported it to ruby and expanded it massively over the last 20 years or so (well, almost 20 years). I retired from rubygems.org when RubyCentral got crazy in 2024 (and even crazier in 2025 with the mass purge of developers).

So, one difference here is that the friend he talks about is using a specific framework. He probably no longer uses ruby nor rails. I use ruby because the language is very well designed and covers most of my use cases; the rest I may sprinkle down with java. So whether rails exists or not makes zero difference to me. Actually, without rails it would be better, because people using ruby would be using it because of ... ruby. Even if there are then fewer users, I still think this is better than keeping those who will jump ship anyway because they only use ruby due to rails.

These guys are not in the same boat. They have use cases for getting work done via rails - designing websites, infrastructure related to websites, user interaction and so forth. But they don't really use ruby as such. Their use case is quite limited. I think this is one of the biggest problems here. It in part explains why ruby dropped down a lot (there are many reasons for this; python being so successful is in my opinion the biggest one, but the other smaller reasons also add up - including the laughable joke that is documentation in the whole ruby ecosystem. That's inexcusable - note, I am not saying documentation must be perfect, but please look at opal, ruby-wasm or rack - the documentation there is virtually nonexistent.)
> The vast majority of programmers are non-experts, like himself
No, I think he is an expert - just in a specific niche and field. Not all experts know everything equally well.
> Subtle language features like first-class functions, and object systems, are lost on them because they don't really use them anyway.
I don't think this is true. Some language features are very useful. Ruby's blocks for instance. They are probably one of the top three win-win features ruby offers.
> Computer scientists should really be spending their time developing new libraries rather than inventing new programming languages.
I also disagree here. I would, however, say that new languages should be well designed. Many new languages suck. Old languages also suck. Designing a great language is very hard. If it is just a toy or research language then this is fine, but once a language is meant to be "real", it really needs to have compelling use cases and be great in many areas, including documentation.
> These features are simply not available in all other languages. Java's meta-programming features, for example, are just not powerful enough to implement a system like ActiveRecords. Rails is only possible because of Ruby.
That's also incorrect. You can create any DSL as you like in Java too. Ruby just makes this a lot easier out of the box. Plus, you can also have great websites without rails.
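To make the metaprogramming point concrete in a neutral language, here's a toy, hypothetical sketch of the ActiveRecord idea - accessors generated from a declared schema at class-creation time - in Python, whose metaprogramming sits somewhere between Java's and Ruby's (all names invented for illustration):

```python
# A toy ActiveRecord-style sketch: attribute accessors are generated
# from a declared schema when the class is created, the way Rails
# derives them from table columns.
class Model:
    columns: tuple = ()

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        for col in cls.columns:
            # default-arg trick pins the current column name
            def getter(self, _c=col):
                return self._data.get(_c)
            setattr(cls, col, property(getter))

    def __init__(self, **data):
        self._data = data

class User(Model):
    columns = ("name", "email")

u = User(name="Ada", email="ada@example.com")
print(u.name)   # Ada
print(u.email)  # ada@example.com
```

Doing the same in Java means annotation processing or runtime reflection plus code generation; in Ruby it's a few lines of `define_method`. The gradient is real even if every point on it is reachable.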
> Ruby on Rails was designed to make it possible to build websites without understanding type theory, or memory management, or object-oriented design patterns.
Ok so ... why would this not be possible in Java? Why would he have to write Java code for a library to be used in this regard?
> Ruby on Rails provides a concise way for expressing: do this when the button is clicked
But you have the same in many other languages and frameworks too. I mean this is how PHP was started initially.
> The "do this" part is implemented in Ruby as a first-class function. How would it be implemented in languages like Java which don't support them?
Write a solid DSL.
> The programming language directly shapes the design of its libraries.
If this were true, why would GTK have glib+vala? I mean, they could just rely on C directly, right?
Besides, ruby is just a wrapper over C really.
> The more powerful the language, the easier the libraries are to use.
That part is true. A better designed language makes for better libraries or a chance to have better libraries. I noticed this when I compared my PHP code to my ruby code. I am not a good programmer, but my ruby code is much better on every level than the equivalent PHP code. Fewer lines too. While this also has to do with experience, at the end of the day PHP is simply a much worse language than Ruby is. At some point I decided I don't want to invest into languages that suck when I could be using better languages instead. That is also why I stopped writing shell scripts - it is just a waste of time having them.
Rails is also, by the way, fairly well documented. So I am not saying all in ruby has horrible documentation of course.
I'm pretty sure that can be denied.
Rails and Ruby (both separately and as a unit) is still absolutely huge. It's launching massive new releases regularly and still underpins a healthy chunk of the top websites on the planet (Shopify, GitHub, Airbnb, Twitch, Hulu, Kickstarter, Zendesk, Basecamp, Crunchbase, Dribbble etc etc) and is still taught to new developers as well.
int number;

… or you choose Pascal style:

number : Integer;

my $a of Int = 42;
say $a; # 42

or

my $a of Int = "foo";
# Type check failed in assignment to $a; expected Int but got Str ("foo")
They talk about the programmer who neither knows nor cares about the language stuff. So what is Spring lacking from that perspective?
"XML is like violence. If it doesn't solve your problem, you're not using enough of it."
But aren't Rails, Laravel and Django a bit similar? At least for the people not directly involved in coding.
It could if you added message passing to C.
Which we have, and DHH admits that Rails took directly from that. Rails is (loosely) a rewrite of what was originally written in C [with extensions].
Are you talking about WebObjects, which was written in Objective-C? (1) If so your comment is somewhat tangential to the truth.
I’m hoping more recent developments, like WASM or Graal, provide a route for more flexibility when selecting languages. It’s nice to see Rust slowly become a serious choice for web development. Most of the time JS is fine, but it’s good to have the option to pull out a stricter low-level language when needed.
Although I agree the usual lens, optics, machines, pipes and other higher-kinded libs are completely unnecessary - solving problems you do not want to have, with dire performance implications - they are at least correct and allow you to throw code at problems quickly, even though that code sucks in all ways except correctness.
Pipes also don’t necessarily have “dire performance implications”, but it depends a lot on the implementation. Haskell libraries don’t always emphasize real world performance as a top criterion. E.g. see https://github.com/composewell/streaming-benchmarks for some truly wild variations in performance across libraries (disclaimer: I haven’t investigated or verified those numbers.)
Also see, for instance, Java. There's Java, the language that keeps improving, and then the Spring ecosystem, which is what 95% of programmers end up having to use professionally, with its heavy "magic" component. Writing services avoiding Spring is going against the grain. It might as well be part of the language as far as professional Java use is concerned.
Communities matter more than the language features, and Java is all Spring, and now Scala is really a choice of Zio and Cats
It's not the whole community, not by a long shot. Don't judge Scala by the Scala subreddit.
Most new things you'll see written about Scala are about solving difficult problems with types, because those problems are inexhaustible and some people enjoy them, for one reason or another. Honestly I think this shows how easy and ergonomic everything else is with Scala, that the difficulties people write about are all about deep type magic and how to handle errors in monadic code. You can always avoid that stuff when it isn't worth it to you.
The type poindexters will tell you that you're giving up all the benefit of Scala the moment you accept any impurity or step back from the challenge of solving everything with types, and you might as well write Java instead, but they're just being jerks and gatekeepers. Scala is a wonderful language, and Scala-as-a-better-Java is a huge step up from Java for writing simple and readable code. It lets you enjoy the lowest hanging fruit of functional programming, the cases where simple functional code outshines OO-heavy imperative code, in a way that Java doesn't and probably never will.
I have a problem.
Right, I'll design a DSL.
Hmm. Now I have two problems.
I'm sure there are good use cases for it - one impressive example at the time was using functional programming to create Hadoop map/reduce jobs: a one-liner in Scala was five different files/classes in Java. But for most programming tasks it's overkill.
You can write boring code in Scala, but in my (limited) experience, Scala developers don't want to write boring code. They picked Scala not because it was the best tool for the job, but because they were bored and wanted to flex their skills. Disregarding the other 95% of programmers that would have to work with it.
(And since these were consultants, they left within a year to become CTOs and the like and ten years on the companies they sold Scala to are still dealing with the fallout)
Interesting observation.
So basically Scala is to the JVM what Perl is to scripting?
Scala was designed from the beginning to support classical Java-style OOP code and also Haskell-like functional code. These are 2 very different styles in one language. And then Scala supports defining DSLs which give you even more flexibility.
> They picked Scala not because it was the best tool for the job, but because they were bored and wanted to flex their skills.
Guilty as charged!
> Disregarding the other 95% of programmers that would have to work with it.
No. Your coworkers end up being the other 5% of programmers that have the same taste as you. Interviewers ask about monads and lenses. It's fine, as long as everyone is on the same page. Which... they kind of have to be.
That is AFAIK the "curse of lisp": because it is so easy (and needed and encouraged) to write DSLs, any ecosystem grows many languages in a hurry, so suddenly that elegant, minimalistic, beautiful, pure language becomes 1000 beautiful clean languages. Now you have to learn them all...
I used Scala a few times when it was semi popular, just seemed like Java but with lots of redundant features added. Not sure what the aim was.
And the most important thing Java was always missing until recently, virtual threads, were lacking in Scala too.
(And I'd disagree that virtual threads were all that important compared to language features.)
Records/sealed interfaces (ADTs) are quite clean.
Text Blocks are better in Java IMO. The margin junk in Scala is silly.
Java may not be the pinnacle of programming languages, but since Java 8, pretty much every feature it's added has been absolutely excellently done.
Blub is a great language!
But ultimately using Scala at the place I worked at the time was a failure. A couple of my co-workers had introduced it, and I joined the bandwagon at some point, but it just didn't work out.
Many Java developers inside the company didn't want to learn, and it was really hard to hire good Scala developers. The ones who did learn (myself included) wrote terrible Scala for a least the first 6 months, and that technical debt lingered for a long time. When other people outside the team (who didn't know Scala) needed to make changes to our code, they had a lot of trouble figuring things out, and even when they could, the code they wrote was -- quite understandably -- bad, creating extra work for us to review it and get it into shape.
I also feel like Scala suffers from similar complexity/ways-to-do-things problems as C++. I often hear people say things like "C++ can be a safe, consistent language if you just use a subset of it!", and then of course everyone has a subtly (or not-so-subtly) different subset that they consider as The One True Subset. With Scala, you can write some very complex, type-heavy code that's incredibly hard to read if you are not well-versed in type/category/etc. theory (especially if you are using a library like cats or scalaz). Sure, you could perhaps try to come up with some rules around what things are acceptable and what aren't, but I think in this case it's a hard thing to specify, and different people will naturally disagree on what should be allowed.
I really wanted Scala to succeed at our company, but I think that's hard to do. I feel like the ideal case is a small company with just a few tens of developers, all whom were hired specifically for their Scala expertise, with a product/business that is going to keep the number-of-developers requirement roughly static. But that's probably very rare.
Or, as Paul Graham put it in his 1993 book On Lisp: "a bottom-up style in which a program is written as a series of layers, each one acting as a sort of programming language for the one above"
https://paulgraham.com/progbot.html
https://www.paulgraham.com/onlisptext.html
Here is a talk that explains the concept in Clojure, titled Bottom Up vs Top Down Design in Clojure:
https://www.contalks.com/talks/1692/bottom-up-vs-top-down-de...
Java was the language where "write libraries instead" happened, and it became an absolute burden. So many ugly libraries, frameworks and patterns built to overcome the limitations of a simple language.
Scala unified the tried-and-tested design patterns and library features used in the Java ecosystem into the core of its language, and we're better off for it.
In Java we needed Spring (urghh) for dependency injection. In Scala we have the "given" keyword.
In Java we needed Guava to do anything interesting with functional programming. FP features were slowly added to the Java core, but the power and expressivity of Java FP is woeful compared what's available at the core of Scala and its collections libraries.
In Java we needed Lombok and builder patterns. In Scala we have case classes, named and default parameters and immutability by default.
In the Java ecosystem, optionality comes through a mixture of nulls (yuck) and the crude and inconsistently-used "Optional". In Scala, Option is in the core, and composes naturally.
In Java, checked exceptions infect method signatures. In Scala we have Try, Either and Validated. Errors are values. It's so much more composable.
There's so much more - but hopefully I've made the point that there's a legitimate benefit in taking the best from a mature ecosystem and simple language like Java and creating a new, more elegant and complete language like Scala.
So you don't actually disagree with the article.
It helps to actually read it. The title is in quotes because the point of the article is to refute it.
All programming languages are equivalent, meaning their level of expressiveness is the same; that's not an opinion, it's a fact. Each language comes with its runtime and its peculiarities, but in principle you can always reproduce any feature of another language's runtime in any language, though probably not with the same performance and efficiency as if that feature were native to the runtime itself.

So there are no "more powerful languages", just runtimes that hide away some stuff considered stable enough that it becomes a kind of primitive for the programmer. We may have different opinions on what elegant code is, but personally I'd like to avoid code that directly (i.e. without any kind of abstraction) relies on runtime features, and instead express my intention clearly in code - though I recognize the productivity gains.
However, not all languages are Turing complete. See, for example, Charity: https://github.com/dobesv/charity
Furthermore, Turing completeness says nothing about expressiveness or capability. Imagine a language that has no IO. Such a language would be able to compute any function any other language can, but not do anything viewable by the rest of the world. So obviously not equivalent.
And w.r.t. expressiveness, there is some academic research into how to quantify that: https://www2.ccs.neu.edu/racket/pubs/scp91-felleisen.pdf
I hope I've cleared my standpoint.
(My comment is slightly off-topic to the article but on-topic to the title.)
References were Racket with the Racklog library¹. There's also Datalog² and MiniKanren, picat, flix. There were tons of good comments there which you should check out, but PySwip seemed like "the right thing" when I was looking at it: https://github.com/yuce/pyswip/
...documentation is extremely sparse, and assumes you already know prolog, but here's a slightly better example of kindof the utility of it:
https://eugeneasahara.com/2024/08/12/playing-with-prolog-pro...
...ie:
from pyswip import Prolog  # pip install pyswip; needs SWI-Prolog installed
prolog = Prolog()
# ya don't really care how this works
prolog.consult("diabetes_risk.pl")
# ...but you can query into it!
query = "at_risk_for_diabetes(Person)"
results = list(prolog.query(query))
...the point being there's sometimes some sort of "logic calculation that you wish could be some sort of regex", and I always think of prolog as "regexes for logic".

One time I wished I could use prolog was trying to figure the best match between video file, format, bitrate, browser, playback plugin... or if you've seen https://pcpartpicker.com/list/ ...being able to "just" encode all the constraints, and say something like:
valid_config = consult("rules.pl")
+ consult("parts_data.pl")
+ python.choice_so_far(...)
rules.pl: only_one_cpu, total_watts < power_supply(watts)
parts_data.pl: cpu_xyz: ...; power_supply_abc: watts=1000
choices: cpu(xyz), power_supply(abc), ...
...this is a terribly syntactically incorrect example, but you could imagine that this would be horrific code to maintain in Python (and sqrt(horrific) to maintain in Prolog), but _that's_ the benefit! You can take a well-defined portion and kind of sqrt(...) the maintenance cost, at the expense of 1.5x'ing the number of programming languages you expect people to know.

I haven't looked into the implementation. But taking a brief glance now, it looks interesting. They appear to be translating Prolog to Java via a WAM representation[3]. The compiler (prolog-cafe) is written in Prolog and bootstrapped into Java via SWI-Prolog.
I don't know why compilation is necessary, it seems like an interpreter would be fast enough for that use case, but I'd love to take it apart and see how it works.
[1]: https://www.gerritcodereview.com/ [2]: https://gerrit-documentation.storage.googleapis.com/Document... [3]: https://gerrit.googlesource.com/prolog-cafe/+/refs/heads/mas...
Sure you can implement OOP as a library in pretty much any language, but you’ll probably sacrifice ergonomics, performance and/or safety I guess.
You might be able to hack on some of the datatype semantics into JS prototype-based inheritance (I'd rather start with TypeScript at that point, but then we're back at the "why isn't it a library" debate) to keep those ontologies from being semantically separate, but that's an uphill battle with some of JS's implicit value conversions.
I consider Logic Programming languages to be the go-to counterargument to TFA but yeah, anything with lazy eval and a mature type system are strong counterexamples too.
Sometimes, having a language with a distinct syntax is nicer.
For example, Prolog isn't a general-purpose functional or imperative language: you can assert, retract and query facts in the automatically managed database, risking only incorrect formulas, inefficiencies and non-monotonicity accidents, but you cannot express functions, types, loops, etc., which could harbor far more general bugs.
I am so glad LLMs eliminate all of that and just call functions in the right order.
When LLMs do something, it's always because everybody was already doing it.
The noise you see online about it exists exactly because most people don't understand and can't use DI.
There is nothing magical about topological sort and calling constructors in the right order, which is all DI is.
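To make that concrete, here's a toy sketch (hypothetical classes, not any real framework) of what "topological sort plus calling constructors in the right order" looks like: a recursive resolver that walks constructor parameters depth-first, which is the topological sort in disguise.

```python
import inspect


def build(cls, registry, cache=None):
    """Construct `cls`, recursively building its constructor dependencies.

    `registry` maps parameter names to classes; `cache` makes each
    dependency a shared singleton within one resolution.
    """
    cache = {} if cache is None else cache
    if cls in cache:
        return cache[cls]
    params = list(inspect.signature(cls.__init__).parameters)[1:]  # skip self
    deps = [build(registry[name], registry, cache) for name in params]
    cache[cls] = cls(*deps)
    return cache[cls]


# Hypothetical application classes wired purely by constructor signature.
class Db:
    def __init__(self):
        self.dsn = "sqlite://"


class Repo:
    def __init__(self, db):
        self.db = db


class Service:
    def __init__(self, repo):
        self.repo = repo


registry = {"db": Db, "repo": Repo}
svc = build(Service, registry)
```

A real container adds scopes, interfaces, and configuration on top, but the core resolution step is no more than this.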
I dislike it a lot, it is exactly like any other construct that allows you to throw code at anything in a way that sucks (Haskell lens, optics, monad transformers).
It allows people to granularize the whole codebase to the point where you can't do much about it. Most people should just stick with functions; no one can build function call stacks 100 levels deep without it being cumbersome, but DI makes it a breeze.
Then I got into Python and people were building useful server APIs in a day.
Both have their place, but I think the problem with the first route is that EVERYTHING ends up with Spring or CDI and complexity overload even if only 1 thing will ever be "implemented".
I was talking to Bob Harper about this specific issue (context was why macro systems are important to me) and his answer was “you can just write a separate programming language”. Which I get.
But all of this is just to say that doing relational-programming-as-a-library has a ton of issues unless your language supports certain things.
(Select the "Using Datalog..." example in the code sample dropdown)
The Rust code looks completely "procedural"... it's like building a DOM document using `node.addElement(...)` instead of, say, writing HTML. People universally prefer the declarative alternative given the choice.
The nice thing about Prolog is that you write logical rules, and they can be used in whatever order and direction is needed. By direction, I mean that if you define "a grandparent is the parent of a parent", you can use that rule not just to check whether one person is the grandparent of another (or to find all grandparents), but also to conclude that if you know someone is a grandparent, and they are the parent of someone, then that someone is a parent themselves. It can also do recursion: if you define an ancestor as a parent or the parent of an ancestor, it will recurse all the way up the family tree. Neat.
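To see what the grandparent and ancestor rules compute, here's a toy forward-chaining sketch in plain Python (made-up family facts, no Prolog engine). It only runs the rules "forward" from facts to conclusions; a real Prolog engine can also run them in the other directions described above.

```python
# parent(X, Y) facts: X is a parent of Y. Hypothetical data.
parents = {("alice", "bob"), ("bob", "carol"), ("carol", "dave")}


def grandparents(parents):
    # grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    return {(x, z)
            for (x, y) in parents
            for (y2, z) in parents if y == y2}


def ancestors(parents):
    # ancestor(X, Y) :- parent(X, Y).
    # ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
    closure = set(parents)
    while True:  # iterate the recursive rule to a fixpoint
        new = {(x, z)
               for (x, y) in parents
               for (y2, z) in closure if y == y2}
        if new <= closure:
            return closure
        closure |= new


assert ("alice", "carol") in grandparents(parents)
assert ("alice", "dave") in ancestors(parents)
```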
You could write some kind of runtime that takes C code and brute-forces its way from outputs to inputs, except that regular imperative code allows all kinds of things that make this impossible (e.g. side effects). So then you'd be limited to some subset, essentially ending up with a domain-specific language again, albeit with the same syntax as your regular code rather than those silly :- symbols (although Lisp looks much sillier than Prolog IMHO).
What the article is getting at is that if you use some features specific to a language, it's hard to embed your code as a library in another language. But is it? I mean, DLLs don't need to be written in the same language, there's stuff like JNI, and famously there's stuff like pytorch and tensorflow that runs CUDA code from python.
Not necessarily.
This generalizes!
Prolog: declare relations. Engine figures out how to satisfy them. Bidirectional -- same rules answer "is X a grandparent?" and "find all grandparents."
LLMs do something similar but fuzzier. Declare intent. Model figures out how to satisfy it. No parse -> AST -> evaluate. Just: understand, act.
@tannhaeuser is right that Prolog's power comes from what the engine does -- variables that "range over potential values," WAM optimization, automatic pruning. You can't get that from a library bolted onto an imperative language. The execution model is different.
Same argument applies to LLMs. You can't library your way into semantic understanding. The model IS the execution model. Skills aren't code the LLM runs -- they're context that shapes how it thinks.
Prolog showed that declarative beats imperative for problems where you can formalize the rules. LLMs extend that to problems where you can't.
I've been playing with and testing this: Directories of YAML files as a world model -- The Sims meets TinyMUD -- with the LLM as the inference engine. Seven architectural extensions to Anthropic Skills. 50+ skills. 33 turns of a card game, 10 characters, one LLM call. No round trips. It just works.
https://github.com/SimHacker/moollm/blob/main/designs/stanza...
The integration of a Prolog backend into a mainstream stack is typically achieved via Prolog code generation (including code generation via LLMs), or by running Prolog as a "service", considering Prolog also has excellent support for parsing DSLs or requests/responses of any type; you can implement a JSON parser in a single line of code, actually.
As they say, if Prolog fits your application, it fits really well: planning, constraint solving, theorem proving, verification/combinatoric test case enumeration, pricing models, legal/strategic case differentiation, complex configuration, and the like, the latter merely leveraging the modularity of logic clauses to compose complex programs from independent units.
So I don't know how much you've worked hands on with Prolog, but I think you actually managed to pick about one of the worst rather than best examples ;)
Seems more like an interesting research project than something I'd ever deploy in an application serving millions of users
You mean like the kinds of problems digital computing was originally invented to solve?
You know that still exists, right? There are many people using computers to advance the state of Mathematics & related subjects.
Now try to produce a library that adds compile-time features: static types, lifetimes, the notion of const and constexpr, etc. You can, of course, write external tools like mypy, or use some limited mechanism like Java annotations. But you have a really hard time implementing that in an ergonomic way (unless your language is its own metalanguage, like Lisp or Forth, and even then).
Creating a library that alters the way the runtime works, e.g. adding async, is not entirely impossible, but usually involves some surgery (see Python Twisted, or various C async libs) that results in a number of surprising footguns to avoid.
Frankly, even adding something by altering a language, but not reworking it enough to make the new feature cohesive, results in footguns that the source language did not have. See C#'s LINQ and exceptions.
Here is a sample of how to read a file.
https://lpn.swi-prolog.org/lpnpage.php?pagetype=html&pageid=...
How to make syscalls,
I don't have real work I need Prolog for, but I find it an interesting subject. My personal learning goal, the point where I can say I know Prolog reasonably well, is when I can get it to solve an MIT puzzle I found, a sort of variant of Sudoku. I found a clever Prolog solver for Sudoku that I thought could teach me more in this domain, but it was almost too clever: super optimized for Sudoku (it exploited geometric features to build its relationships), and I was still left with no idea how to build the more generic relationships I need for my puzzle (a specific example: if Sudoku cells were not in a grid, how could they be specified?). In fact I can find very little information on how to specify moderately complex, ad hoc relationships. One constraint that particularly flummoxed me was that some rules (but you don't know which) are wrong.
All the other books that I looked at were pretty awful, including the usual recommendations.
If you want to learn LP concepts in general, Tarski's World is a great resource as well.
But I have heard repeatedly that the good thing about Prolog is the compiler, which takes information and queries that would be awfully inefficient and converts them into something that actually works. So I'm not sure... of course, you can turn virtually any language into a kind of library with some API that basically accepts source code... but I'm pretty sure that's not what you meant.
false.
https://www.j-paine.org/dobbs/prolog_lightbulb.html
I always wanted to write a compiler whose front-end consumes Prolog and back-end emits PostScript, and call it "PrologueToPostscript".
prologue: a separate introductory section of a literary, dramatic, or musical work.
postscript: an additional remark at the end of a letter, after the signature and introduced by ‘PS’.
We already have it. It's an obscure little language called C++. Those interested in those kinds of extensions to a language should look into Herb Sutter's experiments with cppfront: https://hsutter.github.io/cppfront/welcome/overview/
Lisp is what you are after if you want to include some object system as a library, or a new type of switch statement as a library or a new kind of if statement as a library.
C++ can do none of that.
OK, if you squint enough that by "block of code" you mean closure, or function object, then I can write that in C++. I can make the if statement a free-standing function (that is, not a member of a class), and add it to any library I wish.
Now, you can say that it's going to be tedious to use that, because you have to set up three closures every time you want to call this "super if". And you'd be right, but that's a different argument.
My point was that you can often get the effect you want with no new syntax. (Cue 10,000 replies that state "but you can't get this effect without new syntax!" Perhaps not. Many of those tend to be rather contrived, though. I'm more sympathetic to the argument that new syntax would make something less clumsy. If it's something you need to do a lot, that matters.)
https://learn.microsoft.com/en-us/cpp/mfc/tn038-mfc-ole-iunk...
But we aren't squinting here; those closures can't perform a `return` from the function where your `new-if` is being used, the way a proper `if` can, and you can't goto, break, or continue.
It's just a function taking functions, with all the restrictions that that entails.
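A minimal sketch of this "if as a library function" idea, in Python for brevity (the names are made up): the branches become thunks, and the restriction discussed above falls out immediately, since a `return` inside a lambda returns from the lambda, not from the caller, and `break`/`continue` aren't even legal there.

```python
def new_if(cond, then_fn, else_fn):
    """An `if` statement rebuilt as a plain function taking closures."""
    return then_fn() if cond else else_fn()


# Works fine for pure value selection...
result = new_if(2 + 2 == 4, lambda: "yes", lambda: "no")

# ...but there is no way for the branch bodies to transfer control in
# the caller: no early return from the enclosing function, no goto,
# no break/continue out of a surrounding loop.
```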
Is it the best way? Probably not, but we seldom get to choose which mainstream languages win out on the field.
years ago a senior developer close to me said "when screening interviews, if i see rails i throw the resume in the trash"
so ironic how trivial/stupid these language-based judgements are
Not as easy to find in my vicinity, at least good ones, which is of course true for any language and profession in general.
I have RoR on my resume and am very fond of it.
What was the senior's stack?
But shouldn't the check just be that the candidate has used more than one different stack? It's pretty hard for anyone with real experience to stick to one, and even if they do, that's not a good sign either. Or are you saying those bootcamp people end up learning another stack but still not being very good?
If you had another filtering mechanism, perhaps you could do that. But what other arbitrary, legally acceptable, filter are you going to use to further narrow the search? Can't realistically throw out all the resumes with female-sounding names, for example. What is going to keep you out of trouble is quite limited.
Why not throw out all the "Rails" resumes? If you had all the time in the world you would interview every last person, of course, but in the real world, with real world constraints, you have to pick a few to interview and live with your choice.
To use the internet's favourite analogy: It's like buying a car. Most people would never find it reasonable to test-drive every single one of them. It is just too time consuming to do that. So, instead, one normally looks at signals to try and distill the choice down to a few cars to test drive. You very well might miss out on what is actually your perfect car by doing that, but if you find one that is good enough, who cares?
On the other, you only have so much time in the day. It'd take me 3-6 months to give phone screens to every resume that comes in the door for any one engineering role, 8x that for a full 4-hour interview. I have to filter through them somehow if it's my job to hire several people in a month.
You'll obviously start with things that are less controversial: Half of resumes are bot-spam in obvious ways [0]. Half of the remainder can easily be tossed in the circular filing bin by not having anything at all in their resume even remotely related to the core job functions [1].
You're still left with a lot of resumes, more than you're able to phone screen. What do you choose to screen on?
- "Good" schools? I personally see far too much variance in performance to want to use this as a filter, not to mention that you'd be competing even more than normal on salary with FAANG.
- Good grades? This is a better indicator IME for early-career roles, but it's still a fairly weak signal, and you also punish people who had to take time off as a caretaker or who started before they were mature enough or whatever.
- Highest degree attained? I don't know what selection bias causes this since I know a ton of extremely capable PhDs, but if anything I'd just use this to filter out PhDs at the resume screening stage given how many perform poorly in the interviews and then at work if we choose to hire them.
- Gender? Age? ... I know this happens, but please stop.
If there's a strong GitHub profile or something then you can easily pass a person forward to a screen, but it's not fair to just toss the rest of the resumes. They have a list of jobs, skills, and accomplishments, and it's your job to use those as best as possible to figure out if they're likely to come out on top after a round of interviews.
I don't have any comment on rails in particular, but for a low-level ML role there are absolutely skills I don't want to see emphasized too heavily -- not because they're bad, but because there exists some large class of people who have learned those skills and nothing else, and they dominate the candidate pool. I used to give those resumes a chance, and I can't accept 100:1 odds anymore on the phone screen turning into a full interview and hopefully an offer. It's not fair to the candidates, and I don't have time for it either.
And that's ... bad, right? I have some things I do to make it better in some ways (worse in others, but on average trying to save people time and not reject too many qualified candidates): pass resumes on to a (brief) written screen instead of outright rejecting them if I think they might have a chance, always give people a phone screen if they write back that I've made a mistake, revisit those filtering rules I've built up from time to time and offer phone screens anyway, etc. Hiring still sucks on both sides of the fence though.
[0] One of my favorites is when their "experience" includes things like how they've apparently done some hyper-specific task they copy-pasted from the job description (which exists not as a skills requirement but as a description of what their future day-to-day looks like), they did it before we pioneered whatever the tech in question was, they did it at several FAANG companies, and using languages and tools those companies don't use and which didn't exist during their FAANG tenure. Maybe they just used an LLM incorrectly to touch up their resume, but when the only evidence I should interview you is a pack of bald-faced lies I'm not going to give the benefit of the doubt.
[1] And I'm not even talking about requiring specific languages or frameworks, or even having interacted with a database for a database-adjacent role. Those sorts of restrictions can often be too overbearing. Just the basics of "I need you to do complicated math and program some things that won't wake me up at night" and resumes that come in without anything suggesting they've ever done either at any level of proficiency (or even a forward or a cover letter stating why their resume appears bare-bones and they deserve a shot anyway).
Why yes, it can and has been done: https://www.dreamsongs.com/Files/ECOOP.pdf
I think about programming/design as languages/translation in a lot of ways: it's languages all the way down.
People, on the other hand, work with ideas, metaphors, expressions of intent, etc. If a language/library makes the communication of those things easier/better/faster; if it can be "written down" clearly, and "read" clearly by a person, then does it really matter into which taxonomic category it fits? We pick horses for courses. That seems about right.
If Rails works for you, is complementary with what you want to achieve, is an accelerator, and is generally well-understood by the people with whom you work, then use it. Alternatively, if the answer to all the previous is Stanza then go with that. There's less "right" and "wrong" in those decisions than there is "advance", or "struggle". It sounds trite. But, use what works. If something doesn't work make something that does, iff that's the most efficient approach.
I’ve been experimenting with a small defeasible-logic core (argumentation semantics + priorities + conflict detection) that stays stable, while most of the real meaning lives in extension points. https://github.com/canto-lang/canto-lang
It's true, you couldn't really do Express in Java, at least not back then.
But Java's problem is not the mechanics; it's that the community doesn't want nice things.
Anyway, libraries like this were only really feasible after Java 8 because of the reliance on lambdas. Having to instantiate anonymous nested classes for every "function" was a total pain before that.
yet at the same time, Clojurists love the idea of doing lots of DSL libraries...
Now that we can develop script style with Java 21, I'd like to see something like Grails that worked with Java only...that would be a fun way to develop web apps. I liked Grails, just not Groovy.
Also, often the language doesn't live in isolation from its implementation (compiler or interpreter). While theory looks at languages via their semantics, in practice, as the OP notes, it is about the quality of the implementation and what can be reasonably done with the language.
A recent [1] case is Julia. I think it has hit a kind of sweet spot for language design where new performant code tends to get written in Julia rather than in some other language and bound to it. At its core, it is a simple "call functions, passing data in and getting results out" kind of language, but what the functions ("methods") mean and how the compiler does just-ahead-of-time compilation with deep type specialized code means you can write high level code that optimizes very well. Those mechanics operate under the hood though, which makes for a pleasant programming experience ... and there are loads of cutting edge packages being written in Julia. It is less interesting to look at Julia as "just the language".
[1] recent in programming languages is perhaps anything <= 15 years? .. because it takes time to discover a language's potential.
Wasn't that the point of the article? That you need both?
> At this point, the right question to ask would be, well can you write a static-typing library for Scheme that then automatically checks your code for type errors? And the current answer, for now and for the foreseeable future, is no. No mainstream language today allows you to write a library to extend its type system.
The author seems to provide a counter example themselves(?):
> Racket and Shen provide mechanisms for extending their type systems...
I wonder if this is as clear-cut as the author is making it out to be. Coalton, which is effectively a library (and language) for Common Lisp, seems like it basically does this. Maybe that's not exactly what the author is referring to, because it is essentially a new language on top of Lisp using its meta-programming facilities, as opposed to merely extending the type system. Still, it can be used as a library right alongside Lisp code, so I think it's in the same spirit of the first question of writing a "static-typing library that automatically checks your code" in a dynamic language.
Standard scheme may or may not be able to do this, but most Scheme implementations have unhygienic macros like CL's too, so I'd assume something similar would be possible. The fact that that these tend to be extensions from implementation designers might align with the article's point though. Also somewhat to the author's point, Coalton does rely strongly on CL's underlying type system, for which there's no real equivalent in Scheme. It also relies on implementation-specific optimizations alongside that.
For what it's worth, you can write (and indeed people have written) object systems in Scheme, despite the language not having one, though they tend not to be performant, which is likely another point toward using/writing a different language. CL also tends to allow fairly deep extension of its object system through the Meta-Object Protocol.
I guess my point is that in my (probably biased) opinion, Lisps, or other languages with very strong meta-programming facilities, are pretty close to the language longed for in "Perhaps one day we'll have such a language." They aren't a silver bullet, of course. CL has no easy way to write performant coroutines/continuations, for example, even given all its extensibility. Scheme has no real type system, etc. etc.
I don't think any of this invalidates the article's points, I'm just not sure I agree with the absolutes.
There is, however, something wrong with releasing your new language! In most cases you should show it off to close friends, or your professor, and then burn it and all the source. Sure, your language might be better than the current one, but it won't be enough better to be worth it (even if the incumbent is the notoriously bad C++, where it is easy to be better, you still won't be enough better to be worth it).
If you want a better language there are two good options: switch to a different one that already exists; or make your language better. There are lots of great choices for languages out there if you want to switch. If the language you are thinking of doesn't have an active community of people working on making it better, it probably isn't a good choice.
Whichever language you choose though, libraries are the hard part. There are a lot of bad libraries, we need someone to write a better one - but only if one doesn't exist! For the great libraries out there, most need someone to contribute.
A large part of libraries is the consistent interface. Often there is a great libFoo and libBar, but their APIs are not consistent, and so we need a libBarWithFooStyleInterface and/or libFooWithBarStyleInterface. Better yet, we need everyone to come together and agree on what the interface should be and then make both use that new standard; cleaning the Augean stables with a toothbrush seems like an easier task. Of course, in the real world there are hundreds of libraries, each with a great interface that is not consistent with the others.
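The libBarWithFooStyleInterface idea is just the adapter pattern. A hypothetical sketch (libFoo, libBar, and their methods are all made up for illustration): wrap one library so client code written against the other's conventions can use either.

```python
class LibFoo:
    """Stands in for a real libFoo with a `fetch`-style interface."""
    def fetch(self, key):
        return f"foo:{key}"


class LibBar:
    """Same job as libFoo, but with an inconsistent interface."""
    def get_value_for(self, k):
        return f"bar:{k}"


class BarWithFooStyleInterface:
    """Adapter: exposes libBar through libFoo's interface."""
    def __init__(self, bar):
        self._bar = bar

    def fetch(self, key):
        return self._bar.get_value_for(key)


def client(lib):
    # Client code written once, against the Foo style only.
    return lib.fetch("x")
```

The maintenance cost the comment describes is that every pair of popular-but-inconsistent libraries invites another one of these wrappers.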
Nah, don't do that. You will enjoy looking back at it in your old age. I wish I had all my old code now.
Although that in itself might be a hint to change language and write your library there, instead of inventing a new one.
The work matters too, of course. Frequently used functions should be categorically bagged into libs, and if a lib is used often enough, some of its functionality should be baked into the language. If languages require something often enough, the layer above should provide it, and if things are frequently needed close to the metal, they should be baked into hardware: possibly as a co-processor, and then as part of the CPU. We should even have a similar process moving things in the other direction. I can't wait for the day when floating-point arithmetic becomes a library, complete with a community of people who still think it's wonderful: like a football channel, to cleanly contain that kind of undesirable material. Some libraries/modules/frameworks should also be replaced by competent developers. We don't want a leftpad community.
Close! The purpose of a general-purpose programming language is to enable the creation of powerful and easy-to-use languages, but often just libraries.
- and runs 50 times faster than C, C++, Rust and Zig
- comes with a standard library covering 50000 use cases
- has direct integrations with drivers to every major database, crm, analytics provider, key value store, queue systems
Many DSLs can be bolted onto an existing language with support for compiler extensions. This approach offers more flexibility, but often leads to fragmentation and poor interoperability in the language ecosystem.
There is third approach, established by a group in Minnesota [1], which is to design languages and tools which are modular and extensible from the get-go, so that extensions are more interoperable. They do research on how to make this work using attribute grammars.
If the host language has a sufficiently expressive type system, you can often get away with writing a fluent API [2] or type safe embedded DSL. But designing languages and type systems with good support for meta-programming is also an active area of research. [3, 4]
If none of these options work, the last resort is to start from tabula rasa and write your own parser, compiler, and developer tools. This offers the most flexibility, but requires an enormous amount of engineering, and generally is not recommended in 2026.
[2]: https://arxiv.org/pdf/2211.01473
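A fluent API of the kind mentioned above can be sketched in a few lines; here's a toy query builder in Python (entirely hypothetical, not any real library). Each method returns the builder itself, so chained calls read like a small embedded DSL.

```python
class Query:
    """Toy fluent query builder: methods return self so calls chain."""

    def __init__(self, table):
        self._table = table
        self._wheres = []
        self._limit = None

    def where(self, cond):
        self._wheres.append(cond)
        return self

    def limit(self, n):
        self._limit = n
        return self

    def sql(self):
        s = f"SELECT * FROM {self._table}"
        if self._wheres:
            s += " WHERE " + " AND ".join(self._wheres)
        if self._limit is not None:
            s += f" LIMIT {self._limit}"
        return s


q = Query("users").where("age > 21").limit(10).sql()
```

With a sufficiently expressive host type system, each chained method can also refine the builder's type, so that, say, calling `sql()` before naming a table is a compile-time error rather than a runtime one.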
MLIR [1] has entered the chat :P
I know I know MLIR is an IR and not a programming language, but MLIR does give me the feeling of "an IR to rule them all" (as long as you're ok with SSA representation), and the IR itself is quite high-level to the point it almost feels like an actual programming language, e.g. you can write a MLIR program that compiles to C using the EmitC dialect that feels a lot like writing C.
You mean like this?
https://github.com/williamcotton/webpipe
https://github.com/williamcotton/webpipe-lsp
https://github.com/williamcotton/webpipe-js
It's less effort if you, well, you know where I'm going with this...
There was a time in my life when I designed languages and wrote compilers. One type of language I've always thought could be made more approachable to non-technical users is an outline-like DSL with English-like syntax: the shape of the outline would be very much fixed and on a guardrail, and couldn't express arbitrary instructions like normal programming languages, but an escape hatch (to a more expressive language) could be provided for advanced users. An area where this DSL could be used would be common portal admin app generation and workflow automation.
That said, with the advent of AI assistants, I’m not sure if there is still room for my DSL idea.
It looks like somebody listened. We now have 3 GTK libraries in a system, a lot of graphics libraries (Cairo, etc.) and 3D libraries (Mesa, Vulkan). It is a mess.
the key is that not all worlds enable the same kinds of libraries.
At the extremes, we call these paradigms. Functional languages, Object Oriented languages, Array languages, etc. But every language exists to make some subset of "shapes" of solutions easier to read and write. Elixir encourages you to describe your programs as a distributed system of programs each running a series of transformations to data, thanks to its pipeline system, and to push your branching to function-call level with its pattern-matching multiple dispatch.
Java encourages you to write your application as a collection of things, that know stuff and do stuff, including telling other things to do stuff based on things they know.
C and Go encourage you to write your programs as an ordered series of atomic tasks for the computer to complete.
SQL has you describe what you want your output to look like.
Etc, etc. There are inherent trade offs between languages, because what you make inelegant to express or even inexpressible carries value, too.
The structure of a language matters to the ease and feel of its use, despite even being logically identical. One parallel would be the syntactic benefits of something like Hintikka’s independence-friendly logic vs first order logic, even if they are equivalent.
The sentiment shared is that we should sacrifice benefits to the next generation to make our own lives easier. This is a common sentiment, but a sad one. The goal should be a natural language based programming language, that everyone can use, alongside a technical programming language that makes unambiguous the interface between the language and the machine.
Everyone seems to endorse their happy medium, and those languages are also perfectly fine.
But, if you squint, great API design is a bit like embedded domain specific language design as well.
I think there's room for both.
There are lots of libraries already. Instead of rewriting them in every language, why not make them available to every language?
Yes, I know it would be difficult and in some cases impossible.