Either LFM2.5-1.6B-4bit or Qwen3.5-2B-8bit or Qwen3.5-4B-4bit
A very limited set of use cases, perhaps. As a generalized chat assistant? I'm not sure you'd get anything of value out of them, but I'm happy to be proven otherwise. I already have all of those locally, without fine-tuning; what use case could I try right now where any of those are "very effective"?
Claude Code is a Desktop app as well.
Though, I don't see any references to Gemma at all in the open source code...
It requires a Firefox add-on to act as a bridge: https://addons.mozilla.org/en-US/firefox/addon/ai-s-that-hel...
There is honestly not much to test just yet, but feel free to check it out and give feedback on the idea: https://codeberg.org/Helpalot/ais-that-helpalot
The core works: I was able to have it generate a simple summary of CMS content. Next up is making it do something genuinely useful, and making it clear how other plugins can use it.
Also: "Your AI agent can now create, edit, and manage content on WordPress.com" https://wordpress.com/blog/2026/03/20/ai-agent-manage-conten...
I'm talking about connecting Ollama to your WordPress site.
Not via MCP or anything else that's complicated for a relatively normal user. But thanks for the link.
If the new WordPress feature allowed connecting to Ollama, there would be no need for my plugin anymore. But I don't see that in the current documentation.
So for now, I see my solution as the better option for anyone who doesn't have a paid subscription but does have a decent laptop, and who would like to use an LLM 'for free' (apart from power usage), with 100% privacy, on their own website.
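To make the idea concrete, here's a minimal sketch of what such a bridge could look like: a script that asks a local Ollama instance to summarize CMS content. The Ollama endpoint (`/api/generate` on port 11434) and its request fields are standard; the model name and the sample content are placeholders, and this is not the plugin's actual code.

```python
"""Sketch: summarize CMS content with a local Ollama instance.

Nothing here leaves the machine, which is the whole privacy argument.
"""
import json
import urllib.request

# Ollama's default local endpoint for one-shot completions.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_summary_request(model: str, content: str) -> bytes:
    """Build the JSON body Ollama expects for a non-streaming completion."""
    return json.dumps({
        "model": model,
        "prompt": f"Summarize the following CMS content:\n\n{content}",
        "stream": False,  # return a single JSON object, not a stream
    }).encode("utf-8")


def summarize(model: str, content: str) -> str:
    """Send the prompt to the local Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_summary_request(model, content),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires a running Ollama instance with the named model pulled.
    print(summarize("llama3.2", "Example post body to summarize."))
```

A plugin would then feed the returned summary back into the site (e.g. via the WordPress REST API), but the round trip to the model is this simple.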
As if WordPress doesn't have enough exploits and bugs as it is. Also, why bother with WordPress in the first place if you're already having an LLM spit out content for you?
You can check the code for exploits yourself. And other than that it's just your LLM talking to your own website.
> Also why bother with wordpress in the first place
Weird question, but sure: I use WordPress because I have a website that I want to run on a simple CMS that can also run my custom WordPress plugins.
https://github.com/Arthur-Ficial/apfel
Apple AI on the command line
Then I moved to PocketPal for local LLMs.
Come onnnnnn. I would rather read a one-line "Check out our offline LLM" than a whole press release of slop.
This looks very neat. I'm not familiar with the nitty-gritty of AI, so I really don't understand how it can reply so quickly running on an iPhone 16. But I'm not even going to bother searching for details, because I don't want to read slop.
Have a comparison chart against Ollama, LM Studio, LocalAI, Exo, Jan, GPT4All, PocketPal, etc.
moqster•1h ago
Going to give this a try...
_factor•1h ago
Does this seem sound?
stonogo•1h ago
this seems self-contradictory
https://en.wikipedia.org/wiki/Comparison_of_OTP_applications
So you look down and you see a tortoise. It's crawling towards you.