> cat clone with syntax highlighting and git integration
doesn't make any sense because cat is not really meant for viewing files. You should be comparing your tool with the more/less/most family of tools, some of which can already do syntax highlighting or even more complex transforms.
Actual LOL. Indeed. I was working for a large corporation at one point and a development team was explaining their product. I asked what its differentiators were versus our competitors. The team replied that ours was written in Go. #facepalm
An example from my personal experience: I used to think that oxipng was just a faster optipng. I took a closer look recently and saw that it is more than that.
See: https://op111.net/posts/2025/09/png-compression-oxipng-optip...
Learn the classic tools, learn them well, and your life will be much easier.
Many folks nowadays don't get how lucky they are, not having to do UNIX development on a time-sharing system, although cloud systems kind of replicate the experience.
Only to feel totally handicapped when logging into a busybox environment.
I'm glad I learned how to use vi, grep, sed..
My only change to an environment is the keyboard layout. I learned Colemak when I was young. Still enjoying it every day.
Agreed, but that doesn't stop you from using/learning alternatives. Just use your preferred option, based on what's available. I realise this could be too much to apply to something like a programming language (despite this, many of us know more than one) or a graphics application, but for something like a pager, it should be trivial to switch back and forth.
Awk and sed.
I like the idea of new tools though. But knowing the building blocks is useful. The "Unix Power Tools" book was useful to get me up to speed. There are so many of these useful mini tools.
Miller is one I’ve made use of (it also was available for my distro)
Not everybody is a sysadmin manually logging into lots of independent, heterogeneous servers throughout the day.
Same goes for a bunch of other tools that have "modern" alternatives but the "classic" ones are already installed/available on most default distribution setups.
apt-get/pacman/dnf/brew install <everything that you need>
You'll need to install those and other tools (your favorite browser, your favorite text editor, etc.) anyway if you're changing your OS.
> or SSH anywhere
When you connect through SSH you don't have a GUI, but that's not a reason to avoid GUI tools elsewhere, for example.
> even use a mix of these on my personal computer and the traditional ones elsewhere
I can't see the problem, really. I use some of those tools and they are convenient, but it's not as if I can't work without them. For example, bat: it doesn't replace cat, it only outputs data with syntax highlighting. It makes my life easier, but if I don't have it, that's fine.
The point is that sometimes you're SSHing to a lightweight headless server or something and you can't (or can't easily) install software.
But yes, in the eventual case that I don't have Nix I can very much use the classic tools. It is not a binary choice, you can have both.
Some of them are smart, but sometimes I want dumb. For example, ripgrep respects .gitignore, and often I don't want that. Though in this case there is an option to turn it off (-uuu). That's a common theme with these tools: they try to be smart by default and you need an option to make them dumb.
So no, these tools are not "objectively superior"; they are generally more advanced, but that is not always what you need. They complement classic tools, but in no way replace them.
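To illustrate the -uuu point above with a concrete pair of commands (one -u stops rg from respecting .gitignore, -uu additionally searches hidden files, -uuu additionally searches binary files):
rg TODO
rg -uuu TODO
The first skips anything your ignore files exclude; the second behaves much more like a plain recursive grep.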
I genuinely don't know what is going on here.
> I genuinely don't know what is going on here.
I basically live in my music library. However, every single pop artist offers songs that I don't like, that are not in my library, and that have mysteriously sold many millions of copies.
I genuinely don't know what is going on here.
Joking aside, have you ever tried to use some of these tools? I used to not understand why people were using vim until I really tried.
No.
> I used to not understand why people were using vim until I really tried.
There's your problem. I respectfully suggest installing Emacs.
I use the command line a lot too and this is one of my most common commands, and I don't know of an elegant way to do it with the builtin Unix tools.
(And I have basically the same question for finding files matching a regex or glob [ignoring the stuff I obviously don't want], e.g., `fd '.foo.*'`.)
Hits in hidden files are not really a pain point for me:
find . -type f -name "*.foo" | grep -v '/\.' | xargs grep bar
(This one I could do from muscle memory.) If traversing those hidden files/directories were expensive, I'd tell `find` itself to exclude them. This also lets me switch `xargs` for `find`'s own `-exec` functionality:
find . -type f -not -path '*/\.*' -name "*.foo" -exec grep bar {} +
(I had to look that one up.)
- bat is a useless cat. Cat concatenates files. ANSI colour breaks that.
- alias ls='ls -Fh', problem solved. Now you have * for executables, / for directories, and so on.
- ncdu is fine, perfect for what it does.
- iomenu is much faster than fzf and it works almost the same.
- jq is fine; it's a good example of a new Unix tool.
- micro is far slower than even vim.
- instead of nnn, sff https://github.com/sylphenix/sff with soap(1) (an xdg-open replacement) from https://2f30.org creates a mega-fast environment. Add MuPDF and sxiv, and nnn and friends will look really slow compared to these.
Yes, you need to set config.h under both sff and soap, but they will run much, much faster than any Rust tool on legacy machines.
It's useless as a cat replacement, I agree. The article really shouldn't call it that, although the program's GitHub page does self-describe it as "a cat clone". It's more of a syntax highlighter combined with a git diff viewer (I do have an issue with that; it should be two separate programs, not one).
I can't see bat as a "useless cat" or a replacement for cat, except for reading source code in the terminal. It's more like a less with syntax highlighting or a read-only vim.
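Worth noting for the cat comparison: bat only adds colours and decorations when writing to a terminal; when its output is piped it falls back to plain output. With a hypothetical file name,
bat src/main.rs
pages the file with highlighting, while
bat src/main.rs | grep foo
behaves essentially like cat.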
btop has been pretty good for watching a machine to get an overview of everything going on; the latest version has cleaned up how the lazy CPU process listing works.
zoxide is good for cding around the system to the same places. It remembers directories so you avoid typing full paths.
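For anyone who hasn't tried zoxide, rough usage looks like this (the hook line depends on your shell; this sketch assumes bash, and the directory name is just a placeholder):
eval "$(zoxide init bash)"   # in ~/.bashrc, installs the shell hook
z myproject    # jump to the highest-ranked directory matching "myproject"
zi myproject   # pick interactively among matching directories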
That’s because they’re not GNU coreutils, they’re BSD coreutils, which are spartan by design. (FWIW, this is one of my theories for why Linux/GNU dominated BSD: the default user experience of the former is just so much richer, even though the system architecture of the latter is arguably superior.)
I know I have hyperfine, fd, and eza on my Windows 11, and maybe some more I cannot remember right now.
They are super easy to install too, using winget.
`fd`: first, I find that the argument semantics are way better than `find`'s, but that is more a bonus than a real killer feature. It also being much, much faster than `find` on most setups is something I would consider a valuable feature. But the killer feature for me is the `-x` argument. It allows calling another command on each individual search result, which `find` can also do with `xargs` and co. But `fd` provides a very nice placeholder syntax[0], which removes the need to mess with `basename` and co. to parse the filename and build a new one, and it executes in parallel. For example, it makes converting a batch of images a fast and readable one-liner: `fd -e jpg -x cjxl {} {.}.jxl`
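For comparison, a rough classic equivalent of that one-liner (a sketch: it runs sequentially, and the ${1%.jpg} expansion is exactly the kind of filename wrangling being avoided):
find . -name '*.jpg' -exec sh -c 'cjxl "$1" "${1%.jpg}.jxl"' _ {} \;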
`rg` a.k.a. `ripgrep`: honestly, it is just about the speed. It is so much faster than `grep` when searching through a directory that it opens up a lot of possibilities. Like, searching for `isLoading` on my frontend (~3444 files) is instant with rg (less than 0.10s) but takes a few minutes with grep.
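If you want to check that kind of speed difference on your own tree, hyperfine (mentioned elsewhere in the thread) is an easy way, e.g.
hyperfine 'rg isLoading' 'grep -r isLoading .'
Note the two commands aren't strictly equivalent (rg skips ignored files by default), and add --ignore-failure if either one can exit non-zero.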
But there is one other thing that I really like with `ripgrep` and that I think should be a feature of any "modern" CLI tool: it can format its output as JSON. Not that I am a big fan of JSON, but at least it is a well-defined exchange format. "Classic" CLI tools just output in a "human-readable" format which might happen to be "machine-readable" if you mess with `awk` and `sed` enough, but that makes piping and scripting just that much more annoying and error-prone. Being able to output JSON, `jq` it, and feed it to the next tool is so much better and feels like the missing link of the terminal.
The big advantage of the CLI is that it is composable and scriptable by default. But it is missing a common exchange format to pass data, and this is what you have to wrangle with a lot of the time when scripting. Having JSON, never mind all the gripes I have with the format, really joins everything together.
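A small sketch of that rg-to-jq workflow (rg's --json mode emits one JSON object per line, and match records carry the file path under .data.path.text):
rg --json isLoading | jq -r 'select(.type == "match") | .data.path.text' | sort -u
This yields a deduplicated list of files containing the pattern, ready to feed into the next tool.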
Also, an honorable mention for `zellij`, which I find to be a much saner alternative to `tmux` UX-wise, and for the `helix` text editor, which for me is neovim but with, again, a better UX (especially for beginners) and a lot more batteries-included features, while remaining faster (in my experience) than nvim with the matching plugins for feature parity.
EDIT: I would also add difftastic ( https://github.com/Wilfred/difftastic ), which is a syntax-aware diff tool. I don't use it much, but it does make some diffs so, so much easier to read.
[0] https://github.com/sharkdp/fd?tab=readme-ov-file#placeholder...
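On the difftastic mention above: the binary is called difft, and (as a sketch, going from its README) it can be tried as git's external diff or run directly on two files:
GIT_EXTERNAL_DIFF=difft git diff
difft old.rs new.rs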
Then I tried them and it was such a night and day performance difference that they're now immediate installs on any new system I use.
That was inherited from find, which has "-exec". It even uses the same placeholder, {}, though I'm not sure about {.}.
Got featured here on HN a few weeks ago.
I have a very limited set of additional tools I tend to install on systems, and they are in my default ansible-config, so will end up on systems quickly, but I try to keep this list short and sweet.
95% of the systems I manage are debian or ubuntu, so they will use mostly the same baseline, and I then add stuff like ack, etckeeper, vim, pv, dstat.
As a greybeard linux admin, I agree with you though. This is why, when someone tells me they are learning linux, the first thing I tell them is to just type "info" into the terminal and read the whole thing; that will put them ahead of 90% of admins. What I don't say is why: because knowing what well-documented, built-in tooling is available that you can modularly script around is basically the linux philosophy in practice.
Of course, we remember the days when systems only had vi and not even nano was a default, but since these days we do idempotent ci/cd configs, adding a tui editor of choice should be trivial.
I just enjoy seeing others incrementally improve on our collective tool chest. Even if the new tool isn’t of use to me, I appreciate the work that went into it. They’re wonderful tools in their own right. Often adding a few modern touches to make a great tool just a little bit better.
Thank you to those who have put in so much effort. You’re making the community objectively better.
> exa: modern replacement for ls/tree, not maintained
"not maintained" doesn't smell "modern" to me...
eza: https://github.com/eza-community/eza
Yeeeah, nope.