
Make.ts

https://matklad.github.io/2026/01/27/make-ts.html
248•ingve•1w ago

Comments

jauntywundrkind•1w ago
Zx is great. Really easy scripting!

This article used Dax instead, which also looks fine! https://github.com/dsherret/dax

pzmarzly•1w ago
There is also Bun's built-in shell library, which I liked. https://bun.com/docs/runtime/shell
Imustaskforhelp•1w ago
Agreed, I was looking for this comment. Bun shell is amazing, although I sometimes (not always) had trouble getting LLMs to write it, but overall Bun shell is really cool.

One of my projects actually uses Bun shell to call a Rust binary in a website itself, and I really liked this use case.

IshKebab•1w ago
This is one of Deno's killer use cases IMO. 100x better than shell scripting and like 5x better than Python scripting. Python should be good for this sort of thing, but it isn't.

Historically we had to use pip, which was super janky. uv solves most of pip's issues, but you still have to deal with venvs, and one issue it doesn't solve is that you can't import by relative file path, which is something you always end up wanting for ad-hoc scripting. You can use relative package paths, but that's totally different.

wiseowise•1w ago
> 5x better than Python scripting

I’m not sure about that. All those ‘await’s, parentheses really kill my mojo. Why do you find it better than Python?

IshKebab•1w ago
> Why do you find it better than Python?

I said already - the main reason is you can import files by relative file path.

You can get close to the Deno UX with uv and a script like this:

  #!/usr/bin/env -S uv run --script
  #
  # /// script
  # requires-python = ">=3.12"
  # dependencies = ["httpx"]
  # ///
  import httpx
  print(httpx.get("https://example.com"))
But you still have to deal with the venv e.g. for IDE support, linting and so on. It's just more janky than Deno.

I wish someone would make a nice modern scripting language with arbitrary precision integers, static types, file path imports, third party dependencies in single files, etc. Deno is the closest thing I've found but in spite of how good Typescript is there are still a ton of Javascript warts you can't get away from (`var`, `==`, the number format, the prototype system, janky map/reduce design, etc.)

fainpul•1w ago
PowerShell is pretty good for shell scripting.

  iwr https://example.com
You also have arbitrary precision integers and all the other stuff from .NET

  $b = [BigInt]::Parse('10000000000000000000000000000000000000000000000000')
IshKebab•1w ago
Powershell has god awful syntax though. There's no way I'd want to do anything remotely significant with it.
jcgl•1w ago
PowerShell’s syntax is just fine. Very few special characters, minimal escaping, easy to read. If you understand PowerShell semantics, the syntax comes quite naturally.
tracker1•1w ago
You should be able to get better shebang detection for Python, similar to the one I requested for Deno/TS ... I no longer need the .ts extension for my local use.

https://github.com/microsoft/vscode/issues/287819

tracker1•1w ago
For me, it's mostly that I'm more comfortable with the TS/JS ecosystem... Deno is really nice in that you can import module references directly... I guess you can do something similar with uv installed, with dependencies in comments at the top of the file (as in another comment here).

For me, though, TS is something I'm generally using anyway for web projects, etc. I'm comfortable with it, and there are modules for almost everything under the sun... except a working MS-SQL client for Deno.

PurpleRamen•1w ago
> you can't do imports by relative file path

Just add the targeted path to sys.path, or write your own import handler; importlib might help there. But true, out of the box, imports in Python 3 are a bit wacky for more flexible usage.

IshKebab•1w ago
Both of those are horrible and break all tooling. Deno's imports work properly.
PurpleRamen•1w ago
> Both of those are horrible and break all tooling.

No, they don't. Tooling is fine with those things.

cdaringe•6d ago
Sure is. I built rad in Deno a few years ago. https://cdaringe.github.io/rad/

I may be the author and only user, but I still use it frequently, and it was worth the effort just for my own projects.

doanbactam•1w ago
Does it track file hashes or just timestamps? This looks like a solid middle ground between npm scripts and a full-blown CI system. I've always hated the tab syntax in GNU Make, so a typed alternative is appealing.
hdjrudni•1w ago
I don't think you understand what he's proposing here. This isn't really a replacement for Make at all. This is just using Deno to run random script files.
forty•1w ago
There are two things in the article: having a kind of Make alternative to "save your command history" (basically avoiding retyping large commands), and how they use TS to write shell scripts.
throwaway290•1w ago
> I have definitely crossed the line where writing a script makes sense

...and that was also the one concrete example where it makes sense to have an extra dependency and abstraction layer on top of a shell script :)

Say you know TS: even if you walk back to where $ is defined, can you tell immediately why $`ls {dir}` gets executed and not just logged?

supernes•1w ago
You can make it more explicit by renaming the import to something like "shell_exec". Tagged templates are already pretty common in TS projects for things like gql or sql queries.
throwaway290•1w ago
A tagged template does not cause execution of the given string. A tagged template is just a function, and in this case it's simply a proxy for console.log(), which also doesn't cause execution of the given string.

So how does it get executed?

Unless it was just an example and you are supposed to swap in $ from some third-party library... which is another dependency in addition to Deno... and which can be shai-huluded anytime, or you may be offline and unable to install it when you run the script?
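For what it's worth, a tag can execute its string because the tag function itself decides what to do with the parts it receives. A minimal Node sketch of a dax-style `$` (an illustration, not dax's actual implementation, which is async and does proper shell-quoting):

```typescript
import { execSync } from "node:child_process";

// A tagged template is just a function call: the tag receives the raw
// string parts plus the interpolated values, and may do anything with
// them -- including spawning a process.
function $(parts: TemplateStringsArray, ...values: unknown[]): string {
  // Stitch parts and values back into one command string.
  const cmd = parts.reduce(
    (acc, part, i) => acc + part + (i < values.length ? String(values[i]) : ""),
    "",
  );
  // Run it in a shell and return stdout.
  return execSync(cmd, { encoding: "utf8" });
}

const dir = ".";
console.log($`ls ${dir}`); // actually runs `ls .`, it doesn't just log the string
```

So whether $`ls {dir}` executes or merely logs depends entirely on which `$` is in scope; nothing in the template syntax itself implies execution.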

supernes•1w ago
Yes, it's another dependency (dax). The example with console.log is just that, an example. Standard dependency management practices apply, e.g. pinning a version/commit hash.
throwaway290•1w ago
That explains it:) Maybe the original article deserves a clarification
tracker1•1w ago
I've either imported or created a sql template function that does exactly that... it takes the parameters, forms a parameterized query against the database, and returns the results. It's easy enough to add TypeScript types that should match your expected results (though not enforced/checked); still helpful.
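Such a tag can be sketched in a few lines. This is a hypothetical illustration (the $1/$2 placeholder style and the Query shape are made up, and the actual driver call is omitted):

```typescript
// A `sql` tag that converts interpolated values into numbered
// placeholders instead of concatenating them into the query text,
// so the result can be handed to any driver taking (text, params).
interface Query {
  text: string;
  params: unknown[];
}

function sql(parts: TemplateStringsArray, ...values: unknown[]): Query {
  const text = parts.reduce(
    (acc, part, i) => acc + part + (i < values.length ? `$${i + 1}` : ""),
    "",
  );
  return { text, params: values };
}

const id = 42;
const q = sql`SELECT * FROM users WHERE id = ${id}`;
// q.text   === "SELECT * FROM users WHERE id = $1"
// q.params === [42]
```

The key point is that the interpolated values never touch the query string, which is what makes the tag safe against injection by construction.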
throwaway290•6d ago
I know. My point was that the original text of the article gave no explanation next to that example of how $ executes the given string. So either there is some magic or the example was wrong. The author added an explanation after my comment.
tracker1•4d ago
I understood it pretty clearly with the import statement.
pzmarzly•1w ago
This is the way. Shell makes for a terrible scripting language, one I start regretting choosing usually around the time I have to introduce the first `if` into my "simple" scripts, or have to do some more complex string manipulation.

At least nowadays LLMs can rewrite Bash to JS/Python/Ruby pretty quickly.

frizlab•1w ago
I use swift! I even (re-)wrote swift-sh[0] to make it possible to import external modules in a script (à la uv).

[0] https://github.com/xcode-actions/swift-sh

kh_hk•1w ago
Well, at least I will be able to run my bash scripts in 5 years
gf000•1w ago
For some value of "run": I'm hella sure it has quite a few serious bugs no matter what, starting with escaping, or a folder being empty (or having files) unlike when the script was written, causing it to break in a completely unintelligible way.
kh_hk•1w ago
I guess we have wildly different expectations of what a language is responsible for and what it is not.
pzmarzly•1w ago
Fair. My bash scripts only broke 3 times over the years:

- when ls started quoting filenames with spaces (add -N)

- when perl stopped being installed by default in CentOS and AlmaLinux (had to add dnf install -y perl)

- when egrep alias disappeared (use grep -E)

meindnoch•1w ago
>- when ls started quoting filenames with spaces (add -N)

Your fault: http://mywiki.wooledge.org/ParsingLs

oblio•1w ago
Kinda tells you everything you need to know about the design of the system when using it the default way is utterly unsafe.
greener_grass•1w ago
Bash is not a great cross-platform choice. Too many subtle differences.

The best way is a scripting language with locked-down dependency spec inside the script. Weirdly .NET is leading the way here.

Imustaskforhelp•1w ago
Python with uv seems decent here too.
kh_hk•1w ago
Python EOLs releases after 5 years. I guess old versions are readily available for downloading and running with uv, but at that point you are on your own.

bash is glue and for me, glue code must survive the passage of time. The moment you use a high-level language for glue code it stops being glue code.

oguz-ismail2•1w ago
>Too many subtle differences.

Such as?

hiccuphippo•1w ago
This entire list: https://www.shellcheck.net/wiki/
oguz-ismail2•1w ago
How is any of that a subtle difference between platforms?
greener_grass•1w ago
The tools you will call from your bash script differ in subtle ways between Linux, macOS, MinGW.

One good example is `uuidgen`

oguz-ismail2•1w ago
>uuidgen

That's neither a standard CLI utility nor a bash builtin.

greener_grass•1w ago
Technically maybe, I don't know. But in practice, your bash will use tools like this and break if they are different / missing on a future build host.

If using a programming language with locked-down package dependencies, then all you need is the compiler/interpreter and your script will work.

goalieca•1w ago
Stick to posix shell and it will run anywhere and on anything no matter how old.
tracker1•1w ago
Hard disagree... I find Deno shebangs and fixed-version dependencies to be REALLY reliable... I mean, Deno 3 may come along and some internals may break, but that should have really limited side effects.

Aside: I am somewhat disappointed that the @std folks don't (re)implement some of the bits that are part of Deno or Node compatibility in a consistent way, as it would/could/should be more stable over time.

I like Deno/TS slightly more because my package/library and version can be referenced directly in the script I'm executing, not in a separate .csproj file.

g947o•1w ago
I don't know Ruby, but chances are that your Python/JavaScript scripts are going to run in 5 years as well, if you stick to standard library.
ChrisGreenHeur•1w ago
and then your mamba changes
nilamo•1w ago
What does that even mean?
g_delgado14•1w ago
no one knows what it means, but it's provocative!!
ChrisGreenHeur•1w ago
https://github.com/mamba-org/mamba
dmix•1w ago
Just don't use any NPM libraries (if possible) and you'll be fine. I personally wouldn't use typescript for this sort of thing.
sroussey•1w ago
Why not? You can have bun or even node these days run it directly.
dmix•1w ago
I've been using Node for a decade now, and I've had to update npm libraries a number of times as Node itself upgraded. I have a feeling it will get a lot more stable with ESM and the maturity of the language, but if you're writing something you need to run 5-10 years from now, I wouldn't touch a library unless it's simple and has few dependencies of its own.
skybrian•1w ago
Deno has used ESM from the beginning and it’s required on jsr.io. I agree about avoiding dependencies, but maybe it’s okay if they’re locked to a specific version.
norir•1w ago
I consider LuaJIT a much better choice than bash if both maintainability and long-term stability are valued. It compiles from source in about 5 seconds on a seven-year-old laptop and only uses C99, which I expect to last basically indefinitely.
sureglymop•1w ago
Agreed. The shell is great for chaining together atomic operations on plaintext. That is to say, it is great for one liners doing that. The main reason probably isn't how it all operates on plain text but how easy it makes it to start processes, do process substitution, redirections, etc.

As soon as you have state accumulating somewhere, branching or loops it becomes chaotic too quickly.

wmwragg•1w ago
I generally use AWK as my scripting language, or often just write the whole thing directly in AWK. It doesn't change, is always installed on all POSIX platforms, easily interfaces with the command line, and is an easy to learn small language.
camilomatajira•1w ago
Could you please provide examples of how to do it? Especially given that the operating system calls don't return the output of the command? Thx
lelanthran•1w ago
> This is the way. Shell makes for a terrible scripting language, that I start regretting choosing usually around the time I have to introduce the first `if` into my "simple" scripts, or have to do some more complex string manipulation.

I suppose it can be nice if you are already in a JS environment, but wouldn't the author's need be met by just putting their shell commands into a .sh file? This way is more than a little over-engineered with little benefit in return for that extra engineering.

The reasons (provided by the author) for creating a Make.ts file are completely met by popping your commands into a .sh file.

With the added advantage that I don't need to care about what else needs to be installed on the build system when I check out a project.

I just don't see the advantages.

dsherret•1w ago
The benefit is you can easily scale the complexity of the file. An .sh file is great for simple commands, but with a .ts file with Deno you can pull in a complex dependency with one line and write logic more succinctly.
lelanthran•1w ago
> The benefit is you can easily scale the complexity of the file. An .sh file is great for simple commands, but with a .ts file with Deno you can pull in a complex dependency with one line and write logic more succinctly.

The use-case, as per the author's stated requirements, was to do away with pressing up arrow or searching history.

Exactly what benefit does Make.ts provide over Make.sh in this use-case? I mean, I didn't choose the use-case, the author did, and according to the use-case he chose, this is horribly over-engineered, horribly inefficient, much more fragile, etc.

tracker1•1w ago
The differences between environments can vary a lot... many shell scripts rely on certain external programs being available and consistent... this is less true across Windows and Mac, where things can vary a lot.

I've found that Deno with TS specifically lets me be much more consistent working on projects with workers across Windows, Mac and Linux/WSL.

amterp•1w ago
This is exactly the frustration that led me to write Rad [0] (the README leads with an example). I've been working on it for over a year, and the goal is basically to offer a programming language specifically for writing CLIs. It aims for declarative args (no Bash opts-parsing each time), automatic --help generation, and friendly (Python-like) syntax, and it's perfect for dev build scripts. I'll typically have something like this:

    #!/usr/bin/env rad
    ---
    Dev automation script.
    ---

    args:
        build   b bool    # Build the project
        test    t bool    # Run tests
        lint    l bool    # Run linter
        run     r bool    # Start dev server
        release R bool    # Release mode
        filter  f str?    # Test filter pattern

        filter requires test

    if build:
        mode = release ? "--release" : ""
        print("Building ({release ? 'release' : 'debug'})...")
        $`cargo build {mode}`

    if lint:
        print("Linting...")
        $`cargo clippy -- -D warnings`

    if test:
        f = filter ? "-- {filter}" : ""
        print("Running tests{filter ? ' (filter: {filter})' : ''}...")
        $`cargo test {f}`

    if run:
        bin = release ? "target/release/server" : "target/debug/server"
        $`./{bin}`


Usage: `./dev -b` (build), `./dev -blt -f "test_auth"` (build, lint, test auth), `./dev -r` (just run).

Actively being developed!

[0] https://github.com/amterp/rad

oguz-ismail2•1w ago
Does this spawn a new shell for every instance of $`...`?
amterp•1w ago
Yep each one is a fresh session. Are you asking because you'd like a persistent one?
pxc•1w ago
I've been working a lot in fairly complex shell scripts lately (though none too long: not much over 1000 lines). Some of them are little programs that run locally, and others drive a composable cloud-init module for Terraform that lets users configure various features of EC2 hosts on multiple Linux distributions without writing any shell scripts themselves or relying on any configuration management framework beyond cloud-init itself. With the right tooling, it's not as bad as you'd think.

For both scripts, everything interesting is installed via Nix, so there's little reliance on special-casing various distros' built-in package managers.

In both cases, all scripts have to pass ShellCheck to "build". They can't be deployed or committed with obvious parse errors or ambiguities around quoting or typos in variable names.

In the case of the scripts that are tools for developers, the Bash interpreter, coreutils, and all external commands are provided by Nix, which hardcodes their full paths into the scripts. The scripts don't care if you're on Linux or macOS; they don't even care what's on your PATH (or if it's empty). They embrace "modern" Bash features and use whatever CLI tools provide the most readable interface.

Is it my favorite language? No. But it often has the best ROI, and portability and most gotchas are solved pretty well if you know what tools to use, especially if your scripts are simple.

pjmlp•1w ago
A lesson I already learned during the '90s, switching to Perl instead; somehow people keep writing pieces of wonder in plain shell scripts.
forty•1w ago
In the web/js/ts ecosystem, most people use npm scripts in package.json, rather than a custom make.ts. Scripts you launch from there can be in any language, so nothing prevents you from using TS shell scripts if that's your thing.

Another quite standard way of saving your command history in a file, one I have seen used in all ecosystems, is called "make", which even saves you a few characters when you have to type it. At the very least, people don't have to discover your custom system, autocomplete works out of the box, etc.

soulofmischief•1w ago
My monorepos have become increasingly multilingual over the years, often due to dependencies, and it's not uncommon to find a Makefile, Cargo.toml, package.json, deno.json, venv + requirements.txt, etc. all living in the same root.

Coming from a web background, my usual move is to put all scripts in the package.json, if present. I'd use make for everything, but it's overkill for a lot of stuff and is non-standard in a lot of the domains I work in.

embedding-shape•1w ago
> My monorepos have become increasingly multilingual over the years, often due to dependencies, and it's not uncommon to find a Makefile, Cargo.toml, package.json, deno.json, venv + requirements.txt, etc. all living in the same root.

Same!

My usual move used to be putting everything in a Makefile, but after getting traumatized time and again by ever-growing complexity, I've started to embrace Just (https://github.com/casey/just), which is basically just a simpler Make. I tend to work across teams a lot, and make/just seems easier for people to spot at a glance than scripts inside a package.json, which mostly frontend/JavaScript/TypeScript people know to look at.

But in the end I think it matters less what specifically you use, as long as you have one entrypoint that collects everything. It could be a Makefile, Justfile, or package.json, as long as everything gets under the same thing. Could be a .sh for all I care :)

oulipo2•1w ago
Mise is also very nice (for dependencies and for scripts) https://mise.jdx.dev/
tracker1•1w ago
I've just started to assume I'm in an environment where shebangs work, and I put my scripts for repeated things under ./run/* ... generally bash if it's simple, TS/Deno if it's more complex. Deno has been a joy for shell scripting.
soulofmischief•1w ago
Yeah, I don't go out of my way to accommodate Windows developers. I wouldn't go out of my way to hire them, either. Modern Windows is a corporate surveillance platform.

Deno is great, too. I use Bun where I can but Deno really removes a ton of friction.

tracker1•1w ago
Even on Windows: I happen to be working in a locked-down environment without the benefit of even WSL/Docker (pushing for it actively)... Even then, the Git tooling that installs includes bash (and other msys-built nix tools). I've also got a bit in my ~/bin directory for shared usage as well, where available for Windows.

So the same stuff still works even there. Even if still using C#, I'd rather not be working on Windows at this point; it's just so entrenched in a lot of work/business/govt environments in the Phoenix area.

Aside: I haven't really used Bun at all; I've been exceedingly happy with Deno from pretty early on. The only thing I sorely miss is an MS-SQL adapter that works with it. Again, not my favorite by a long shot at this point.

Cthulhu_•1w ago
The main downside to putting scripts into package.json (or NX's project.json) is that you have to wrap it in JSON. Which is fine for simple commands, but when you start adding stuff like quotes or multi-command commands it starts to get a bit busy.

I quite like make or just as a task runner, since the syntax / indentation / etc overhead is a lot lower. I haven't yet tried to introduce it in any JS based projects though, because it adds yet another tool.

forty•1w ago
I put any sufficiently complex command in scripts/<command>.sh and keep the package.json as light as possible.

One very big upside of package.json for me is that we use pnpm, which has a very sophisticated way of targeting packages with --filter (like "run tests from packages that have modifications compared to master, and all their transitive dependents", which is often exactly what you want to do).

c-hendricks•1w ago
A pet peeve of mine is JS monorepo tools that only run package.json scripts.

Like yeah it's totally reasonable that they go that route, but please just let me pass a command that can be executed without having to wrap it in a package.json script

forty•1w ago
I don't know for others but pnpm has `pnpm exec` which allows running arbitrary commands on some or all of your packages
klibertp•1w ago
Make is a very good choice for storing common maintenance commands for a project. We use it at work for this. It started when we migrated to Docker more than a decade ago - before docker-compose was a thing, building and running a set of containers required quite a bit of shell scripting, and we decided to use Make for that. Make is ubiquitous, cross-platform, the targets are essentially snippets of shell with some additional features/syntax added on top, there's a dependency system (you can naturally express things like "if you want to run X, you need to build Z and Y first, then X, then you can run it"), it allows for easy parameterization (`make <target> ARG=val`), plus it's actually Turing-complete language with first-class lambdas and capacity for self-modifying code[1]. And when some rule becomes too complex, it's trivial to dump it into `scripts/something.sh` and have Make call it. Rewriting the script in another language also works, and Make still provides dependencies between targets.

TL;DR: Make is a very nice tool for gathering the "auxiliary" scripts needed for a project in a language-agnostic manner. It's better than setup.py and package.json precisely because it provides a single interface for projects of both kinds.

[1] Which is worth knowing so you can avoid both features like the plague.

WorldMaker•1w ago
Deno has a similar tool to npm scripts called "tasks" in deno.json. It even has a nice mini-advantage in that it encourages including a one-line description, which shows up in the `deno task` list of all configured tasks and in various IDE integrations.

Most Deno tasks though, more so than a lot of npm scripts in my experience, tend to just be `deno run …` commands (the shebang line in the article) to a script in a directory like `_scripts/` rather than written as CLI commands.
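For illustration, a deno.json along those lines might look like this (the task names and script paths are made up; the object form with `description` assumes a reasonably recent Deno):

```json
{
  "tasks": {
    "dev": {
      "description": "Start the dev server with file watching",
      "command": "deno run --watch --allow-net _scripts/dev.ts"
    },
    "test": "deno test --allow-read"
  }
}
```

Running `deno task` with no arguments then lists each task alongside its description.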

tracker1•1w ago
I've started naming my scripts directory "run/" instead of "_scripts/" because it's easier to type, favoring that over the discoverability of _scripts at the top of my editor's file sidebar. Also, VS Code will now look at the shebang for ts-node or deno and load a file as .ts without a file extension (yay), so I can drop the .ts now.

So I'll generally just reference ./run/dbup, etc., where dbup will start the db via docker-compose.dev.yaml, then wait for the db to be ready, then run/up the grate task via compose as well and check/wait for that to succeed or fail. I usually have other dependencies load after the db is ready (redis, mailhog, etc.) ...

Definitely been favoring Deno for a while now... really easy to use with a shebang and direct module references over many/most other options that may require separate install steps. My ~/bin/ is also full of them.

epaga•1w ago
It's almost depressing to me how much this post feels like a breath of fresh air if for nothing else than because it's clearly hand-written, not ghost-written by LLM.

No repetitive short sentences, no "Not X, just Y." patterns, and lots of opinionated statements, written confidently in the first person.

Please more of this.

hsbauauvhabzb•1w ago
It’s also relatively short and concise :)
nxobject•1w ago
I used to think that “omit needless words” was a bit too strict to be meaningful… and then I read AI slop.
raincole•1w ago
Completely off-topic, but I recently had my "AI-depression moment" when I found out top domain writer.com is owned by an AI company now.
embedding-shape•1w ago
> Please more of this.

Same, I'm caring less about "Yeah, I've learned something new" and more about "Yeah, this sounds like I'm reading the thoughts of a human, how refreshing", which is a sad state of affairs.

I've adopted my own writing style because of this too: I used to be very careful about spelling and grammar, very nitpicky, but I've now stopped doing that, because people started calling my perfectly spelled responses LLM-generated...

BrandoElFollito•1w ago
I have this when I use an em-dash (--), which I do automatically.

This is annoying, to say the least, just because there is no "made with love by ChatGPT" stamp on LLM-produced stuff (which is far from being bad, BTW)

lbeckman314•1w ago
If God didn't want me to use the em dash, why did he enshrine it in nature? In the horizon line—the lightning-harrowed bough—the canyon's pink striation—the pupil of the goat

    — @ctrlcreep
https://twitter.com/ctrlcreep/status/1808321708627317061

https://nitter.net/ctrlcreep/status/1808321708627317061

nxobject•1w ago
He showed that it wasn’t only easy to not sound like AI—but that it was imperative for culture to flourish. Whether composing long hand or typing with a mechanical keyboard in Vim, he took back online discourse, one blogpost at a time. /s
worldsayshi•1w ago
It sounds like at least some of the problems pointed at would be mitigated by using fzf. At least it has greatly improved my terminal ux.
flohofwoe•1w ago
Heh, I went down that same rabbit hole recently, but in addition to 'shell scripting tasks' I also describe a whole C/C++ build in Deno-flavoured TS instead of wrestling with cmake syntax: https://github.com/floooh/fibs - and, while at it, also allow integrating build jobs written in TypeScript into the C/C++ build.

...this is the same sort of 'works for me' philosophy as in Matklad's post though; it's so heavily opinionated and personalized that I don't expect other people to pick it up, but it makes my day-to-day work a lot easier (especially since I switch between macOS, Linux and Windows multiple times on a typical day).

I'm not sure if Bun can do it too, but the one great thing about Deno is that it can directly import without requiring a 'manifest file' (e.g. package.json or deno.json), e.g. you can do something like this right in the code:

    import { Bla } from 'jsr:@floooh/bla@^1';
This is just perfect for this type of command line tools.
arnorhs•1w ago
I mostly have my scripts in package.json "scripts" section - but sometimes the scripts invoked will actually be .ts files, sometimes just bash if that makes more sense.

Though, I generally run these scripts using bun (and the corresponding `$` in bun) - basically the same thing, but I just prefer bun over deno

drcongo•1w ago
I use mise for this as it then also gives you a handy `mise tasks` command so you can see what commands are available and what they do. Mise has been a real gamechanger for my ailing memory.
chanux•1w ago
Any good write up about this you can recommend please? I have been struggling to get on mise tasks train.
drcongo•1w ago
Sorry, missed this post. I don't have any write ups to recommend I'm afraid, for me it was a lot of trial and error, but what really made the whole thing click for me was setting up a linux box and not being able to remember all the mad incantations and flags for everything I need to do on the semi-regular. So I started just putting them in a user-global mise.toml as tasks with nice descriptions to help me remember what they do, and gradually, over time, I'd think "it would be really helpful if this task also did x", so I'd add that. They're basically superpowered aliases with a vastly better user experience.

Then I realised how powerful it was that I could create tasks with dependencies (ie: when a task requires the user to have jq installed, you can add that to the mise.toml) which makes the tasks beautifully shareable across a team. The only tool they need to have installed is mise, and mise handles everything else for them.

chanux•3d ago
Superb! Thanks a lot for the reply.
theanonymousone•1w ago
I already do it, but not in TS. There is a scripting language that is as available in most/all (non-Windows) systems as Bash: Python.

Edit: zero-dependency Python.

verdverm•1w ago
All well and good until you need a dependency; then you need to do all the same project setup as normal.

Stopped using python for scripting for this reason

theanonymousone•1w ago
I don't think you need any dependencies to match Bash scripting in capability.
hsbauauvhabzb•1w ago
You can even wrap shell / system commands in python and capture the output, so it’s basically a superset!
kh_hk•1w ago
You can also inline python inside shell scripts, does that make them equal sets? :)

    life() {
      python3 << EOF
    print(42)
    EOF
    }
verdverm•1w ago
YAML support is required, and that needs a third-party package, because Python has no YAML option in the stdlib.
GeneralMaximus•1w ago
If you use `uv`, you can declare your dependencies at the top of a script: https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...

I've started using Python for many more tasks after I discovered this feature. I'm primarily a JS/TS developer, but the ability to write a "standalone" script that can pull in third-party dependencies without affecting your current project is a massive productivity boost.
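For reference, the declaration uv reads is PEP 723 inline script metadata: a TOML fragment inside a comment block at the top of the file (the dependency named here is purely illustrative):

```
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
```

`uv run script.py` then resolves and installs those dependencies into an ephemeral environment before executing the script.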

verdverm•1w ago
Then I have to install uv globally

That convenience you think boosts productivity is a short-term thing; using comments for dependency management is an anti-pattern, imo.

ctenb•1w ago
"Just" is made exactly for this, and it is amazing. You write a justfile that is somewhat similar to a makefile but without the pain points, and it provides a CLI interface of the commands you want to run.
nilamo•1w ago
I was a Just enjoyer for quite a while, until I tried mise. Mise does all the same things as just, but also has source/output tracking to avoid rerunning build jobs (like make), and also bundles runtimes like asdf. It's become my all-in-one task runner of choice.
netghost•1w ago
I think the "make" in the title is a bit misleading; the author is actually just advocating for keeping a consistent file that you use for ad-hoc scripting and testing in your application.

The thrust of the article could be summarized as: if you type more than one command into the shell, make a script.
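That summary can be sketched concretely. A minimal, runtime-agnostic approximation of such a file (the post itself uses Deno + dax; the task names and command strings here are placeholders, not from the article):

```typescript
// A dispatch table of named tasks; in real use you'd invoke it as
// `node make.ts <task>` and dispatch on process.argv.
import { execSync } from "node:child_process";

// Run a shell command, streaming its output to the terminal.
const run = (cmd: string) => execSync(cmd, { stdio: "inherit" });

const tasks: Record<string, () => void> = {
  hello: () => console.log("hello"),  // stand-in for a real command
  build: () => run("echo building"),  // e.g. run("cargo build --release")
};

function dispatch(name: string): void {
  const task = tasks[name];
  if (!task) {
    throw new Error(`unknown task ${name}; known: ${Object.keys(tasks).join(", ")}`);
  }
  task();
}

dispatch("hello"); // in real use: dispatch(process.argv[2] ?? "hello")
```

Each ad-hoc command you'd otherwise retype just becomes one more entry in the table.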

data-ottawa•1w ago
You can reference other justfiles as modules too, so in a mono repo you can do `just foo-app test`.

If you combine that with relative working folders it’s very easy to manage large projects.

And you can get shell completion, which is extra nice.

mcapodici•1w ago
If you want it to be an alternative to shell history then ~/make.ts is better, since that'll be the same wherever you are.
matklad•1w ago
Thanks, I haven't considered this! My history is usually naturally project-scoped, but I bet I'll find ~/make.ts useful now that I have it!
nextaccountic•1w ago
Using backtick template literals to interpolate command arguments is very clever! What's missing is a discussion of how you do quoting (for example, how to ls a directory with spaces in its name).

Anyway, what kills this for me is the need to add await before every command.

syhol•1w ago
My gut reaction is to rush to the comments to shill my favourite task runner ( mise tasks[1], now with shell aliases[2]!) but pushing past that, the core idea of writing scripts in a file rather than a shell prompt is a great nugget of wisdom. But I disagree with this bit:

"I want to be clear here, I am not advocating writing “proper” scripts, just capturing your interactive, ad-hoc command to a persistent file."

What's the difference? Why not version control it, share it with colleagues. Imagine writing a unit test to test a new feature then deleting it when done, what a waste. Ok it's not exactly the same because you aren't using these scripts to catch regressions, but all of that useful learning and context can be reused.

I don't think the language you use for scripting is too important as long as the runtime is pinned and easily available on all engineers machines, perhaps using a toolchain manager like... mise[3].

[1] https://mise.jdx.dev/tasks/ [2] https://mise.jdx.dev/shell-aliases.html [3] https://mise.jdx.dev/dev-tools/

stevage•1w ago
I don't understand this bit either, unless "proper" means Bash. Because no one should ever write Bash under any circumstances.
jasonlotito•1w ago
> What's the difference? Why not version control it,

Because I'm hardcoding directory paths.

Because I'm assuming things are set up a particular way: the way they are on my machine.

Because this is hardcoded to a particular workflow that I'm using here and now, and that's it.

Because I do not want to be responsible for it after no longer needing it.

Because I don't want to justify it.

Because I'm hard-coding things that shouldn't be checked in.

Because I don't want to be responsible for establishing the way we do things based on this script.

syhol•1w ago
Do these scripts need to be productionised? I prefer working in an environment where efficient sharing of knowledge and solutions is encouraged, rather than framed as a burden of responsibility.

Given the choice between starting with an almost-working script or starting from scratch, I’ll take the former, it might save a few hours.

My colleagues and I don’t do this 100% of the time, but I never regret it and always appreciate it when others do.

WorldMaker•1w ago
Yeah, some of it can be solved as a simple naming-convention thing. `_scripts/*.ts` for scripts that are "reproducible" and/or production-ready, and `_scripts/scratch/*.ts` or `_scripts/${username}/*.ts` for scripts that are piecemeal, work-in-progress, user-specific, or otherwise "throwaway". Or a graduation process, where things in `_scripts/` are considered "throwaway" until added to and documented in a larger production task runner, such as the "tasks" section of a deno.json file. (They graduate from being shebang-run to `deno task taskname`-run, and get basic documentation in the `deno task` list and various IDE integrations of such.)

The major thing to be concerned about there is leaking things like hard-coded secrets and that's where something like .env files can come in handy and knowing your tools to make use of them. Deno (as the running example) makes using .env files easy enough by adding the `--env` flag to your `deno run` shebang/task-line and then using `Deno.env` like any other environment variable. (Then don't forget to .gitignore your .env files.)

tcoff91•1w ago
How am I the first person to mention fzf?

Just integrate fzf into your shell and use ctrl-r to instantly summon a fuzzy shell history search and re-execute any command from your history!

I cannot imagine going back to using a terminal without this.

I still write plenty of scripts if I need to repeat multi command processes but for one liners just use fzf to reexecute it.

Also in a shared project you can ignore script files with .git/info/exclude instead of .gitignore so you don’t have to check in your personal exclusion patterns to the main branch.

Seriously people if you use a terminal you need the following tools to dominate the shell:

ripgrep, zoxide, fzf, fd

spiffytech•1w ago
I can't believe how long I was sleeping on fd and zoxide. zoxide is now one of my top commands, and fd feels like when I switched to ripgrep. So fast and easy there's no reason not to run it.
tcoff91•1w ago
Zoxide is incredible! Going from cd to zoxide is like going from walking to driving an F1 car around the directory tree.

I made a function called y that is like the z function but is git worktree / jj workspace aware. So useful!

stevage•1w ago
I use ZX for this - it's basically JS/TS with some extra stuff that makes it good for shell scripts.

I don't understand why you wouldn't want your scripts in your Git - but I guess OP's context is different from mine.

facundo_olano•1w ago
A Makefile is good for this, and why not check it into git?
vrnvu•1w ago
Made me think. Every time I see a "Postman collection" or similar artifact, my heart skips a beat. Use curl. Run it interactively in the terminal. When it works, move it into a shell script where you can simply check the status code. Voilà, magic: you've got yourself a simple but valuable integration test.

Instead of juggling dashboards and collections of requests, or relying on your shell history as Matklad mentions, you have it in a file that you can commit and plug into CI. Win-win.

At some point, that testing shell script can be integrated into your codebase using your working language and build tooling.

easton•1w ago
I run into that too. Someone sends me a Postman collection, and I sit there fiddling with the UI five or ten times instead of just putting it into a loop in a real program. Then I realize how much time I've spent fiddling and pull it into a program, then spend some time copying the auth or whatever over, then realize I should've been doing real work.

People like Postman because it's easy to share credentials and config, and easy(ish) to give to less technical people, but the cliff for pulling that stuff into code is often annoying.

"Postman but actually it's a jupyter-style notebook with your credentials" would be cool, although I don't know exactly what that would look like.

WorldMaker•1w ago
I think the biggest hurdle with curl is its syntax. The original HTTPie CLI [1] has a really great syntax that closer resembles something like making a "Postman collection". About the only thing I'm missing these days in httpie that my Postman (and Insomnia) preferring colleagues have is a good plugin for OAuth2/OIDC auth flows.

[1] https://httpie.io/cli

tracker1•1w ago
I can just as easily do this in a TS file with Deno and fetch() ... not only that, but it's 1:1 to what I can now put into a browser and work with.

Beyond this, I can (re)use client libraries to work with examples, create one-off utility scripts, etc.
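A sketch of that curl-to-integration-test idea with plain fetch(); the injectable fetcher parameter and any endpoint URL are illustrative, not from a particular library:

```typescript
// Assert that an endpoint returns the expected status code.
// The fetcher parameter lets tests substitute a stub for global fetch.
type Fetcher = (url: string) => Promise<{ status: number }>;

async function expectStatus(
  url: string,
  expected = 200,
  fetcher: Fetcher = fetch,
): Promise<void> {
  const res = await fetcher(url);
  if (res.status !== expected) {
    throw new Error(`${url}: got ${res.status}, expected ${expected}`);
  }
}

// Usage in a script: await expectStatus("https://example.com/health");
```

A handful of such calls at the bottom of a TS file gives you the "commit it and plug it into CI" property the parent comments describe.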

vrnvu•1w ago
I really liked the example in OP. I will give Deno and Dax a shot.
cassepipe•1w ago
> There are many benefits relative to Up Up Up workflow

With your shell's vi mode, it's even better L -> k k k

Or search them with /

And if you are proficient with vim, you can edit your previous one-line really fast

(Remap/swap CapsLock with Escape system-wide. It's just a GUI setting on Linux and macOS, and a registry key away on Windows.)

blintz•1w ago
I have tried many times to do this, but lack even the minor discipline required. I inevitably make changes to the commands I want to run at the command line, rather than in the script, and then later forget to edit them in the script.

Instead, I now swear by atuin.sh, which just remembers every command I've typed. It's sort of bad, since I never actually get nice scripts, just really long commands, but it gets you 50% of the way there with 0 effort. When leaving my last job, I even donated my (very long) atuin history to my successor, which I suspect was more useful than any document I wrote.

My only hot tip: atuin overrides the up-arrow by default, which is really annoying, so do `atuin init zsh --disable-up-arrow` to make it only run on Ctrl-R.

tpoacher•1w ago
I don't know if the author is already aware of this or not, but there's actually already a readline / bash two-keystroke shortcut, ^X^E, for doing just that: drop into a terminal editor, with the current state of your command becoming the first line in the buffer, which then executes the command upon exit from the editor. (obviously, while you're in the editor, you can save to a file first instead of just exiting directly, if you think the current command has re-use value)
tasuki•1w ago
In zsh with vim mode, the shortcut to open current command in editor is `v`.
hannasm•1w ago
This article does a good job of calling attention to the pattern.

If you work in PowerShell you can start out in the terminal, then when you've got whatever you need working you can grab the history (Get-History) and write it to a file, which I've always referred to as a `sample`. Then, when it becomes important enough that other people ask me about it regularly, I refactor the `sample` into a true production-grade `script`. It often doesn't start out with a clear direction, and creating a separate file is just unnecessary ceremony when you can tinker and export later, once the `up-enter` pattern actually appears.

pimlottc•1w ago
I would perhaps call this “build.ts” instead. Calling it “make.ts” implies Makefile-like behavior, e.g. multiple targets, dependencies, conditional builds, etc.
kalterdev•1w ago
People are beginning to realize that the prompt is a bottleneck and a design problem, and that the traditional workarounds (history, aliases, complex autocomplete) don't work. But running scripts in the traditional way, whole file, top to bottom, is inconvenient, too.

What about this instead: select any number of lines, in any file, and pass it through to the shell. You get convenience of text editing, file management, and shell’s straightforwardness.

(This approach was tried and cemented in Acme, a text editor from Bell Labs.)

tracker1•1w ago
I tend to keep a run/ directory in my projects (it used to be _scripts to make it easier to find, but now I favor the write-friendly ./run/dev, ./run/dbup, etc.).

Some of the scripts are bash, but many are TypeScript via Deno... it's great that you can reference your dependency modules directly as well as not needing a separate install step like node. Most of my shell scripting is now in Deno.

In fact, VS Code just added shebang detection for TS files without the .ts extension, so I don't even need that little extra to edit properly anymore. TypeScript works great as a shell scripting language.