Most "vibecoding" tools (LLM-assisted coding tools) are built into an IDE like VSCode. Earlier this year, a Reddit post inspired me to implement a vibecoding tool with an IRC-style terminal interface.
Highlights:
* Terminal-only interface (no IDE required)
* The agent independently navigates the file system, edits files, views git diffs, and runs unit tests until satisfied with its changes.
* Model-agnostic: it can use any LLM; just put an API key in a .env file (see the sketch below)
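For illustration, the configuration might look something like this, assuming the tool follows the common convention of provider-named environment variables (the exact variable names depend on which LLM backend you point it at; these are hypothetical placeholders, not the tool's documented settings):

    # .env -- hypothetical example; set the key for whichever provider you use
    OPENAI_API_KEY=sk-...
    # or, for a different backend:
    ANTHROPIC_API_KEY=sk-ant-...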
The behavior of LLM coding agents can be flaky, but I was able to use the tool successfully. I even used it to extend itself by asking the AI to design and implement new tools for it to use.
I haven’t worked on it for a few months (enjoying the summer in Mexico), but I’d love feedback, ideas, or collaborators.