Either LFM2.5-1.6B-4bit or Qwen3.5-2B-8bit or Qwen3.5-4B-4bit
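For scale, a rough back-of-the-envelope on what those quantized models occupy in memory. This is weights only (no KV cache or runtime overhead), and the parameter counts and bit widths are simply read off the model names above; everything else is my own assumption:

```python
def quantized_weight_size_gb(params_billion: float, bits_per_weight: int) -> float:
    """Rough weight-only footprint in GB: params * bits / 8, ignoring overhead."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# Sizes inferred purely from the model names in the comment above.
for name, params_b, bits in [
    ("LFM2.5-1.6B-4bit", 1.6, 4),
    ("Qwen3.5-2B-8bit", 2.0, 8),
    ("Qwen3.5-4B-4bit", 4.0, 4),
]:
    print(f"{name}: ~{quantized_weight_size_gb(params_b, bits):.1f} GB of weights")
```

All three land around 1-2 GB of weights, which is why they are plausible candidates for a phone or a modest laptop.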
Though I don't see any references to Gemma at all in the open-source code...
It requires a Firefox add-on to act as a bridge: https://addons.mozilla.org/en-US/firefox/addon/ai-s-that-hel...
There is honestly not much to test just yet, but feel free to check it out and share feedback on the idea: https://codeberg.org/Helpalot/ais-that-helpalot
The core works: I was able to have it produce a simple summary of CMS content. So next is making it do something useful, and making it clear how other plugins could use it.
Also: "Your AI agent can now create, edit, and manage content on WordPress.com" https://wordpress.com/blog/2026/03/20/ai-agent-manage-conten...
I'm talking about connecting Ollama to your wordpress.
Not via MCP or something that's complicated for a relatively normal user. But thanks for the link.
If the new WordPress feature would allow connecting to Ollama, then there would be no need for my plugin anymore. But I don't see that in the current documentation.
So for now, I see my solution being superior for anyone who doesn't have a paid subscription but does have a decent laptop, and who would like to use an LLM 'for free' (apart from power usage) with 100% privacy on their website.
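For a rough idea of what "Ollama talking to your own website" amounts to: Ollama exposes a local REST API, so a bridge only needs to POST the post content and read back the completion. A minimal sketch of that shape, assuming Ollama's standard local endpoint (`http://localhost:11434/api/generate`); the helper names, model name, and prompt here are my own illustration, not the plugin's actual code:

```python
import json
import urllib.request

# Ollama's default local endpoint; nothing leaves your machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_summary_request(post_content: str, model: str = "llama3.2") -> dict:
    """Assemble a non-streaming generate request asking for a short summary."""
    return {
        "model": model,
        "prompt": f"Summarize this CMS post in two sentences:\n\n{post_content}",
        "stream": False,
    }

def summarize(post_content: str) -> str:
    """POST the request to the local Ollama server and return its text response."""
    payload = json.dumps(build_summary_request(post_content)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama instance with the model pulled.
    print(summarize("My first post about home-grown tomatoes..."))
```

The privacy claim follows directly from the shape of this: the only network hop is to localhost.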
For when WordPress doesn't have enough exploits and bugs as it is. Also, why bother with WordPress in the first place if you're already having an LLM spit out content for you?
You can check the code for exploits yourself. And other than that it's just your LLM talking to your own website.
> Also why bother with wordpress in the first place
Weird question, but sure: I use WordPress because I have a website that I want to run with a simple CMS that can also run my custom WordPress plugins.
https://github.com/Arthur-Ficial/apfel
Apple AI on the command line.
Then moved to PocketPal for local LLMs.
Come onnnnnn. I would rather read a one-line "Check out our offline LLM" than a whole press release of slop.
This looks very neat. I'm not familiar with the nitty-gritty of AI, so I really don't understand how it can reply so quickly running on an iPhone 16. But I'm not even going to bother searching for details, because I don't want to read slop.
Going to give this a try...
Does this seem sound?
This seems self-contradictory.