I like Vim. I like LLMs. I like to work offline. I don't like to pay LLM providers for every line of code I write, and I like to have total control of the context. So, I wrote a plugin for Vim that lets me use local LLMs (a quantized DeepSeek Coder is surprisingly useful and fast), have total control of the context, and integrate with my normal vimming flow. This one doesn't do tab completion or FIM (fill-in-the-middle; see
https://github.com/ggml-org/llama.vim if you want a good local solution for that). Instead, it's centered on building context through conversation and instruction, having the model answer questions and rewrite whole blocks or files of code. Simple keybindings and auto-copying to the numbered registers make it easy to integrate with macros and (at least my) normal editing flow.
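To sketch what "integrates with macros via the numbered registers" means in practice: once a model response lands in a numbered register, plain built-in Vim commands can consume it. The specific register and the mapping below are assumptions for illustration, not the plugin's actual interface:

```vim
" Assumption: the plugin has copied the model's latest rewrite
" into register 1, the way deletes and changes normally shift
" through registers "1 through "9 (:help quote_number).

"1p          " paste the most recent model output below the cursor
:put 1       " same thing as an Ex command, usable in scripts

" Record a macro that pastes the output and reindents it, then
" replay it anywhere with @a (or 5@a over five locations):
qa"1p`[v`]=q
```

Because numbered registers are ordinary Vim state, no plugin-specific commands are needed on the consuming side; anything that works with `"1`, `:put`, or `@a` composes with the model's output for free.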