I'm an avid terminal user who sees value in prompts that execute like, and have the UX of, native CLI programs: --help, argument parsing, stdin/stdout, and composability via pipes.
So I built a tool (not vibe-coded; developed over 4+ months) where you write a .prompt file containing a Handlebars-style template, enable it with `promptctl enable`, and it becomes a command you can run in your terminal. For example:
cat compose.yml | askdocker "add a load balancer"
ai-analyze-logs --container nginx
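
To give a sense of what a .prompt file could contain, here is a rough sketch. The field names and schema below are illustrative guesses to show the idea, not the tool's actual format:

```
# askdocker.prompt — illustrative sketch, not the real schema
name: askdocker
model: claude-sonnet
system: |
  You are a Docker expert. Modify the given compose file as requested.
template: |
  Compose file:
  {{stdin}}

  Request: {{args}}
```

The point is that stdin and the command's arguments become template variables, so the resulting command pipes and composes like any other CLI tool.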
It supports multiple providers (Anthropic, OpenAI, Ollama, OpenRouter, Google), load balancing across them, response caching, and custom model "variants" with different system prompts. The feature I'm most excited about is:
promptctl ssh user@host
This makes all your local prompt commands available on the remote machine, but execution happens locally: the prompts are forwarded over the SSH connection, so the remote server never needs API keys, internet access, or any installation. I wrote this in Rust, with 300+ commits so far. Would love feedback, especially on the SSH workflow.
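
The forwarding idea can be sketched in a simplified, self-contained model (this is not the actual implementation or wire format): the remote side only packages up the command name, arguments, and stdin; the local side, which holds the templates and API keys, renders the prompt and talks to the provider.

```rust
use std::collections::HashMap;

// What the remote shim would send back over the SSH channel: just the
// invoked command, its arguments, and whatever arrived on stdin.
// (Illustrative types; the real wire format is not shown here.)
struct PromptRequest {
    command: String,
    args: Vec<String>,
    stdin: String,
}

// The local side owns the templates and credentials. The provider call
// is stubbed out so this sketch stays self-contained.
struct LocalExecutor {
    templates: HashMap<String, String>, // command name -> template
}

impl LocalExecutor {
    fn handle(&self, req: &PromptRequest) -> Option<String> {
        let template = self.templates.get(&req.command)?;
        // Naive placeholder substitution standing in for Handlebars rendering.
        let rendered = template
            .replace("{{stdin}}", &req.stdin)
            .replace("{{args}}", &req.args.join(" "));
        // A real implementation would send `rendered` to an LLM provider;
        // here we just return the rendered prompt.
        Some(rendered)
    }
}

fn main() {
    let mut templates = HashMap::new();
    templates.insert(
        "askdocker".to_string(),
        "Compose file:\n{{stdin}}\nRequest: {{args}}".to_string(),
    );
    let executor = LocalExecutor { templates };

    // Roughly what `cat compose.yml | askdocker "add a load balancer"` on
    // the remote host would turn into:
    let req = PromptRequest {
        command: "askdocker".to_string(),
        args: vec!["add a load balancer".to_string()],
        stdin: "services:\n  web:\n    image: nginx".to_string(),
    };
    let out = executor.handle(&req).expect("unknown command");
    println!("{out}");
}
```

Because the remote end never sees a template, a key, or a provider, the same commands work on air-gapped or locked-down hosts.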