Under the hood it runs llama.cpp (via koboldcpp) backends and integrates easily with popular modern frontends like Open WebUI, SillyTavern, ComfyUI, StableUI (built-in) and KoboldAI Lite (built-in).
Why did I create this? I wanted an all-in-one solution for simple local text and image generation with LLMs. I got fed up with having to manage multiple tools for the various LLM backends and frontends. In addition, as a Linux Wayland user I needed something that would work and look great on my system.
throwaway81998•10h ago
Just installed LM Studio on a new machine today (2025 Asus ROG Flow Z13, 96GB VRAM, running Linux). Haven't had the time to test it out yet.
Is there a reason for me to choose Gerbil instead? Or something else entirely?
A4ET8a8uTh0_v2•10h ago
<< Is there a reason for me to choose Gerbil instead? Or something else entirely?
My initial reaction is positive, because it seems to integrate everything without sacrificing the ability to customize things further if need be. That said, I haven't tested it yet, but now I will.
lone-cloud•7h ago
Now as for your question: I started out with LM Studio too, but the problem is that you'll need to juggle multiple apps if you want to do text gen or image gen, or if you want to use a custom frontend. As an example, my favorite text gen frontend is Open WebUI, which Gerbil can automatically set up for you (as long as you have Python's uv pre-installed). Gerbil will let you run text, image and video gen, as well as set up (and keep updated) any of the frontends I listed in my original post. I could be wrong, but I'm not sure LM Studio can legally integrate GPL-licensed software the way Gerbil can, since it's a closed-source app.
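To give a rough idea of what that automation saves you, the manual equivalent of launching Open WebUI through uv looks something like this (a sketch from memory of the Open WebUI docs; the exact Python pin and package spec may differ):

    # run Open WebUI in an isolated environment managed by uv
    uvx --python 3.11 open-webui@latest serve
    # by default the UI should come up at http://localhost:8080

Gerbil just handles that setup for you and keeps it updated.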
throwaway81998•6h ago