Does anyone actually enjoy this uhm style of writing?
So, to start: someone wants me to install Postman/similar and pay real money to share and make a request? Absolutely not. I can read the spec from Swagger, or whatever, too... and write down what was useful [for others]. We all have cURL or some version of Python.
Surely a few lines of text that are worth planning to save, and paying for [at least twice: you to research them, and them to store them], are worth putting into source control. That's free, and it even pays dividends. How? Automation that works faster than a human pushing a button. Or creates more buttons!
> The tools you need are simple. They're fast. They're reliable. They've been battle-tested by millions of people for years. Just fucking use them.
Do you ask people "Do you actually enjoy talking like that?" every time you hear a curse word?
Sigh. No moral panic involved and I don’t care if people swear. I asked about the style for a reason.
It’s a bit like if someone makes technical posts written in archaic English or in pirate speak. They’re free to do so of course but it’s still a weird choice given context
You define all your requests in a plaintext format and can inject variables etc... plus the name is kinda funny.
It doesn't need to render a fucking Chromium instance to make a web request. It doesn't depend on a service to run. It doesn't require an "Enterprise" subscription for basic features.
So I'd say it meets all of the criteria except being on your machine already. The 'egregious' things are charging to share what would fit very well in SCM (preventing real automation)... and breaking due to being online-first/only. It makes sense to require the endpoint I'm talking to, but why would Postman need AWS/us-east-1 [0] for a completely unrelated API? Joyful rent-seeking.
cURL, your suggestion (hurl), or HTTPie all make far more sense. Store what they need in the place where you already store stuff. Profit, for free: never go down. Automate/take people out of the button-pushing pattern for a gold star.
https://github.com/pashky/restclient.el
I also like httpie but they seem to have gone commercial.
While I like curl, this is highly subjective: some people just prefer a GUI that can guide you and/or be explored visually.
This whole piece also reads like someone who is quite angry at people preferring a different workflow than theirs. Some aspects, like shell history, are also not the magic bullet proposed here, as it doesn't, e.g., cover the actual responses.
Curl's ability to do almost everything is a minor curse here too as it means that any documentation (man pages, options help) is very large.
Not to detract from your point; just to say that the author is probably not as angry as this could make them seem.
This guy would say "just use bash" and ignore the average user experience.
_plz() {
curl [rest of common args]
}
Then: _plz GET endpoint
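Fleshed out, a sketch of that wrapper might look like the following. BASE_URL, API_TOKEN, and the default headers here are illustrative assumptions, not anything from the thread:

```shell
# Hypothetical wrapper around curl's common arguments.
# BASE_URL and API_TOKEN are assumed environment variables.
_plz() {
  method="$1" endpoint="$2"
  shift 2
  curl -sS \
    -X "$method" \
    -H "Authorization: Bearer ${API_TOKEN:-}" \
    -H 'Content-Type: application/json' \
    "${BASE_URL:-http://localhost:8080}/$endpoint" \
    "$@"  # extra flags (e.g. -d '{"k":"v"}') pass straight through
}
```

So `_plz GET users/42` does a plain fetch, and `_plz POST users -d '{"name":"Ada"}'` layers a body on top of the shared boilerplate.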
One moment you have a properly quoted JSON string, the next moment you have a list of arguments, oops you need to escape the value for this program to interpret it right, but another program needs it to be double-escaped, is that \, \\, or \\\\? What subset of shell scripting are we doing? Fish? modern Linux bash, macOS-compatible bash? The lowest common denominator? My head is spinning already!
If I want to script something I'm writing Python these days. I've lost too much sleep over all the "interesting" WTF situations you get yourself into with shell scripting. I've never used Hurl but it's been on my radar and I think that's probably the sweet spot for tasks like this.
I guess parsing command-line outputs would be annoying. Would be a worthwhile library to write.
It’s a meme that originated with https://thebestmotherfucking.website/ or one like it.
People needing/using curl would have been a very distinct subset of users.
My response to this article is the same one I have to anyone out here screaming "it's so easy to just do it my way!": if it's so easy, then do it for me!
The ffmpeg fans are the loudest screechers I've found, in this area. They'll point to the trainwreck of a UX that is Handbrake as an example of GUI for terminal commands. And, look - command line utilities are great and Handbrake is a super good product that functions well and does more than I'd ever want it to. But neither of those things are the same thing as having good UX.
If it's so easy to compile a bunch of shell scripts and store them in a directory in a git repo, then package together a bunch of them that every dev would need, sprinkle in a few with placeholders that most devs would need (with some project-specific input), and then serve them to me in a composable GUI that, in real time, builds the command to be run in my shell. Let me watch the clicks edit the command, and then let me watch the "submit" execute the command. There's no surer way for me to learn exactly what commands I need than to see them all the time. And if I have to learn them (so that I can use them) BEFORE I've seen them - in context - a few times at least, then I'm going to have a much harder time remembering. UX, when done right, helps the user.
Put simply: if I can do everything I would be able to do with postman using curl, then I should also be able to wrap curl in a thin DearImgui window that is reactive to user input. And if it's as easy as the author says, their time would have probably been better spent just making the GUI wrapper app and presenting it as a way to get better with curl, rather than writing an edgelordy article about it.
Also they’re more stable than anything else. You can coast for decades on a script.
But text is very versatile. Adding another layer on top is losing that versatility. And while graphics is nice, symbolic manipulation is on a whole other level.
So for a closed, and I guess small, domain, you can have a GUI for intuitiveness. But if you want expressivity, you need symbols and formalism.
But there’s one thing that still beats graphics in terms of intuitiveness: tactility. I’d bet that it’s way faster for a person to learn a physical car dashboard than a touchscreen one.
To me it seems like the complexity is just irreducible. There's so many formats, so many bits and pieces that can go in a video stream, they're not very visualizable, and they have surprising edge case interactions. Not to mention there's a lot of "normally the program figures this out for you, but there's an option to override it if broken" knobs and dials.
Now, all of that aside, I do like Handbrake and I do think it offers a ton of functionality with so little friction that it's one of my very favorite and most-used apps. No login, no project setup, no x, no y, no z. Just a thin wrapper around a badass command line utility, with tons of options for users to override, and sensible defaults. There's a lot to love about Handbrake!
But "my grandma can use it", or "a plumber can use it", or "a person who doesn't understand the technicals and just needs to do one stupid thing that the app can definitely do, can use it" are signs of good UX. You wouldn't say any of those things about Handbrake.
Although, it is correct for the article's mention of "Send POST requests"... just that typically people don't send POST requests out of the blue with no data.
-X POST is not wrong, it's just superfluous when using other flags like -d where the method can be inferred.
POST requests are often sent with no data (anything that is not idempotent should, unless there's another verb that could fit better).
The author delves a bit more into the issue.
> One of most obvious problems is that if you also tell curl to follow HTTP redirects (using -L or --location), the -X option will also be used on the redirected-to requests which may not at all be what the server asks for and the user expected. Dropping the -X will make curl adhere to what the server asks for. And if you want to alter what method to use in a redirect, curl already have dedicated options for that named --post301, --post302 and --post303!
Per the man page (`man 1 curl`),
> The method string you set with -X, --request will be used for all requests, which if you for example use -L, --location may cause unintended side-effects when curl does not change request method according to the HTTP 30x response codes - and similar.
`-d` and `--data` will appropriately change the headers of their requests. Funnily, `--post301` and `--post302`, which have a similar effect to `-X POST`, are RFC 7231 compliant; browsers just don't do that. [2][3] This is so ubiquitous that the status codes 307 and 308 were added to support the original behavior of repeating the request verbatim at the target address. Compare the following:
> nc -l -p 8080 -q 1 <<< $'HTTP/1.1 301 Moved Permanently\nLocation: http://localhost:8081\n\n' & nc -l -p 8081 -q 1 <<< $'HTTP/1.1 200 OK\nContent-Length: 0\n\n' & curl -L --data test localhost:8080; wait
> nc -l -p 8080 -q 1 <<< $'HTTP/1.1 301 Moved Permanently\nLocation: http://localhost:8081\n\n' & nc -l -p 8081 -q 1 <<< $'HTTP/1.1 200 OK\nContent-Length: 0\n\n' & curl -X POST -L --data test localhost:8080; wait
> nc -l -p 8080 -q 1 <<< $'HTTP/1.1 308 Permanent Redirect\nLocation: http://localhost:8081\n\n' & nc -l -p 8081 -q 1 <<< $'HTTP/1.1 200 OK\nContent-Length: 0\n\n' & curl -L --data test localhost:8080; wait
What happens here:
1. In the 301 case with just `--data`, the request turns into a GET request when sent to the redirect.
2. In the 301 case with `-X POST`, the request stays a `POST` request, but doesn't send any data to the redirect.
3. Finally, in the case where the server returns a 308, we see the POST request is kept and the data is resent.
To further expand slightly on a different thing that might surprise some people, the data options will automatically set the content type by adding the header `Content-Type: application/x-www-form-urlencoded`, as if sending form data from a browser. This behavior can be overridden with a manual `-H`, `--header` argument (e.g., `-H 'Content-Type: application/json'`).
Edit: cube00 pointed out that newer versions of curl than mine have `--json` which will do that automatically. [4]
[0]: https://daniel.haxx.se/blog/2015/09/11/unnecessary-use-of-cu...
[1]: https://www.rfc-editor.org/rfc/rfc7231
[2]: https://evertpot.com/http/301-moved-permanently
(Clarified regarding certificate)
For quick and easy http requests, httpie has been fantastic.
Does anyone have tips for how to make it more useful? Maybe I could grep better for options? For example in the link, the author lists out common curl commands like making a POST request or adding a header. If you tried to look through the manpage for this, this would take a long time.
There's another utility called tldr that does a better job of this by providing common usage examples that almost always instantly give me what I need, but it's not nearly as comprehensive as man.
function cheat() { curl cht.sh/"$1"; }
Then in terminal you can use the following to see the examples: $ cheat curl
You can search a man page by pressing the '/' key, typing in what you want, and pressing 'enter'. 'n' jumps to the next instance of your search string; 'N' jumps to the previous instance.
For example, I just was digging into BSD_auth and authenticate, and I don't know much about how auth works generally. I found it pretty tough to grok from the man page. I love the idea of learning everything from directly within the system and man pages, but I might just not be smart enough for that.
The issue with "survivor" software is that its UX cannot be refined due to legacy support. That's what's great about curl: libcurl and the CLI front-end are separate tools, allowing for alternative modern front-ends.
With curl I end up finding the command becomes hard to read, even taking advantage of backslashes. With Postman, it tidily hides the token out of the way on a separate tab and gets out of my way.
A while ago I was working on a DSL to solve this exact issue (env switching, http requests + chained requests e.g. to an auth server to retrieve a token) - but I haven't had the time recently, and I moved jobs to a GraphQL shop, so it feels a bit more pointless now :D
That was exactly what I needed this morning.
Way back when Postman was but a mere Chrome plugin, I spent a lot longer than I'd have liked fighting with a request that should have been logging GET requests but wasn't. Imagine my surprise when I found that it was following Chrome's caching rules and not actually making my requests despite me intentionally firing off those requests. If only I had just used cURL...
While curl is fine, most of the time I use the REST Client extension in VS Code. While VS Code is an Electron monstrosity, assuming you already have it, that extension is less than 3MB.
Even the full-feature GUI extensions like Thunder Client are scarcely bigger.
Hate VS Code, and never let your hands touch anything other than vim or emacs? Fine, there's a number of extensions that run in the browser that do the same thing.
$ jo details[author]=Dickens details[age]=dead books="$(jo -a 'Oliver Twist' 'Great Expectations')"
{"details":{"author":"Dickens","age":"dead"},"books":["Oliver Twist","Great Expectations"]}
I don't think I've ever successfully used curl in my life though. Every time there's confusion about parameters. It's always been faster to just write a quick Python script that uses requests.
Plus the author can be a bit special. One of the most overrated pieces of software on the planet
GUIs are (usually) slow and break the flow. They cater to the bottom of the pile of users, to the detriment of the top.
> Q: But Postman has testing and automation!
> A: So does cURL in a shell script with || and && and actual programming languages. You want assertions? Pipe to grep or write a 3-line Python script. Done.
IMO if you're reaching for an "actual programming language" it's probably time to put curl down and switch to libcurl or whatever native equivalent is in that language.
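For what the quoted claim is describing, a shell-level assertion can be as small as this sketch; the /health endpoint and the expected body are invented for illustration:

```shell
# Minimal smoke test built from curl, grep, && and ||.
# The endpoint path and expected JSON body are hypothetical.
check_health() {
  base="$1"
  curl -fsS "$base/health" | grep -q '"status":"ok"' \
    && echo "PASS: $base" \
    || { echo "FAIL: $base" >&2; return 1; }
}
```

Because it reports success through its exit status, it composes with `&&`/`||` in larger scripts or CI jobs exactly as the quote suggests.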
- If you are writing a script that is more than 100 lines long, or that uses non-straightforward control flow logic, you should rewrite it in a more structured language now. Bear in mind that scripts grow. Rewrite your script early to avoid a more time-consuming rewrite at a later date.
- When assessing the complexity of your code (e.g. to decide whether to switch languages) consider whether the code is easily maintainable by people other than its author.
When we questioned the Google engineer assigned to support us, he snickered and said "you can trust it".
deafpolygon•8h ago
Xenoamorphous•8h ago
CaptainOfCoit•8h ago
Not sure how some developers could be so allergic to the terminal, don't you already spend a lot of time there?
troupo•8h ago
Xenoamorphous•8h ago
Who says I'm allergic to the terminal? I already stated that I use curl.
I could also ask why are some developers so allergic to any kind of UI. And they're very vocal about it. Just use whatever you want.
CaptainOfCoit•7h ago
Preferring "a couple of clicks" vs "run one command" seems to indicate so; otherwise I'm not sure why someone would prefer the former instead of the latter.
Xenoamorphous•7h ago
Actually I don't even create those collections, we have OpenAPI/Swagger docs for all of our APIs and I just import them with a couple of clicks (which I'm sure there's a way to do with curl).
For the odd requests, and sharing requests with others? I use curl, no problem. I actually think I know it pretty well and very rarely need to look up any docs for it.
CaptainOfCoit•6h ago
No, I don't (what a shitty strawman), I create abstractions then, like any other project. Surely you don't have hundreds of completely original and bespoke requests? Previously I've handled thousands of requests by having a .csv to load from.
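As a sketch of that CSV-driven approach, something like the following would do; the method,path,body column layout is my assumption, not the commenter's actual format:

```shell
# Drive curl from a CSV of requests. Columns assumed: method,path,body.
# Because body is the last field read, commas inside it are preserved.
run_requests() {
  base="$1" csvfile="$2"
  while IFS=, read -r method path body; do
    [ -z "$method" ] && continue        # skip blank lines
    curl -sS -X "$method" \
      ${body:+-d "$body"} \
      "$base/$path"
  done < "$csvfile"
}
```

`${body:+-d "$body"}` only emits the `-d` flag when a body is present, so GET rows and POST rows can live in the same file.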
Xenoamorphous•4h ago
Absolutely I do. It’s not like a few hundred endpoints is out of the ordinary in any mid sized company.
I can go and edit any of the requests with autoformatting, highlighting and whatnot.
As I say, you can keep your csv and abstractions, not trying to convince you that you should switch from whatever works for you.
skydhash•8h ago
The thing with simple tools is that bootstrapping is easy and versatile.
qsort•8h ago
CaptainOfCoit•8h ago
andoando•7h ago
Just save your requests in separate script and organize them.
And now you can run them from anywhere, including other scripts
cluckindan•8h ago
I use curl liberally and also tend to create scripts around it to perform common tasks, but I still get why someone would prefer a GUI.
pjc50•8h ago
People absolutely will pay for software rather than reading or thinking, if it makes doing the work easier. You may have heard of this thing called chatgpt.
(not being a web developer, I've only lightly used Postman, and it is definitely handy for things like authentication. Especially once you touch OAuth. But I uninstalled it once they went unnecessarily cloud)
eloisius•8h ago
All that said, I wouldn’t touch Postman. Last time I needed something to fit this bill I looked around to find the open source equivalent and found Bruno.
hi41•8h ago
Is there a way to send the json request that one sends in Postman but in curl while also using the jks file?
Similarly, we use SoapUI to send XML requests. Is there a way to send those XML requests using curl while also using the jks file?
Greatly appreciate your help.
cluckindan•8h ago
holletron•8h ago
blueflow•8h ago
This is why people do not want to bother with the docs.
troupo•8h ago
Then it turned into a monstrosity.
antisthenes•8h ago
Plain text file?
skydhash•8h ago
Works everywhere.
It could be a script or a markdown file with code blocks. I believe there are wrappers with more codified formats, like .http.
andoando•7h ago
users/create.sh, users/delete.sh, etc
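One of those per-endpoint scripts might contain something like the following; BASE_URL and the payload shape are hypothetical:

```shell
# A sketch of the body of users/create.sh.
# BASE_URL and the JSON field are assumptions for illustration.
create_user() {
  name="${1:?usage: users/create.sh <name>}"
  curl -sS \
    -H 'Content-Type: application/json' \
    -d "{\"name\": \"$name\"}" \
    "${BASE_URL:-http://localhost:8080}/users"
}
# The actual script file would end with: create_user "$@"
```

One script per request keeps each one readable and lets other scripts call them like any other command.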
deafpolygon•7h ago
ebiester•7h ago
nucleardog•8h ago
The page even sort of acknowledges this... saying you manage your environments with environment variables. It doesn't mention how to extract data from the response, just jq for syntax highlighting. No explanation of combining these two into any sort of cohesive setup for managing data through flows of multiple requests. No mention anywhere on the page of working with an OpenAPI spec... many of the tools provide easy ways to import requests instead of manually reentering/rebuilding something that's already in a computer-readable format.
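The missing glue here (extracting a value from one response to feed the next request) usually ends up as something like this; the endpoints and JSON shapes are invented, and jq does the extraction:

```shell
# Sketch of chaining two requests: log in, pull a token out of the
# response with jq, then use it on the next call.
# The /login and /items endpoints and the .token field are made up.
get_token() {
  curl -sS -d 'user=me&pass=secret' "$1/login" | jq -r '.token'
}
list_items() {
  base="$1"
  token=$(get_token "$base") || return 1
  curl -sS -H "Authorization: Bearer $token" "$base/items"
}
```

This works, but every new flow adds another pair of functions like these, which is exactly the "bash spaghetti" maintenance cost described below.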
So the tl;dr here is "use cURL, and then rebuild the rest of the functionality in bash scripts you idiot".
I went down this path of my own accord when Insomnia was no longer an option. I very quickly found myself spending more time managing bash spaghetti than actually using tools to accomplish my goals.
That's why I use a full-blown dedicated API client instead of an HTTP client. (Not Postman though. Never Postman.)