Are they seeing a worthwhile niche of tinkerers (or businesses?) who want to run local LLMs at middling performance but still need a full set of GPIOs in a small package? Maybe. Or maybe this is just Raspberry Pi jumping on the bandwagon.
I don't blame them for looking to expand into new segments; the business needs to survive. But these efforts just look a bit aimless to me. I "blame" them for not having another "Raspberry Pi moment".
But now, if I want a low-power Linux PC replacement with display output, for the price of the latest RPi 5 I can buy a ~2018 laptop on the used market with a 15W quad-core CPU, 8GB RAM, a 256GB NVMe SSD and a 1080p IPS display, which is orders of magnitude more capable. And if I want a battery-powered embedded ARM device for GPIO over WiFi, I can get an ESP32 clone that's orders of magnitude cheaper.
Now the RPi at sticker price is only good for commercial users, since it's still cheaper than dedicated industrial embedded boards; I think that's the new market the RPi company caters to. I haven't seen an embedded product company that doesn't incorporate RPis into its products now, or at least into its lab/dev/testing stage. So if you can sell your entire production stock to industrial users who will pay top dollar, why bother making less money selling to consumers? Just thank them for all the fish. Jensen Huang would agree.
And the thin clients, when they are for sale, tend to have their SSDs ripped out by IT for data security, so then it's a hassle to go out and buy an extra SSD, compared to just buying a used laptop that already comes with a display, keyboard, etc.
But it won't be as reliable; mostly it's the motherboards that won't last long.
The ticking-timebomb lemons with reliability or design issues will just die in the first 2-4 years like clockwork, but the units that have already survived 6+ years without any faults will most likely stay reliable from then on.
I'm also currently building a small device with a 5" touchscreen that can control a MIDI FX pedal of mine. It's just so easy to find images, code and documentation on how to use the GPIO pins.
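For reference, the GPIO side is only a few lines. A rough sketch of the kind of thing I mean, using gpiozero and mido (the pin number and CC number are just placeholders for whatever your wiring and pedal expect):

    # Footswitch on a GPIO pin sends a MIDI CC to the FX pedal.
    # Assumes: pip install gpiozero mido python-rtmidi
    from signal import pause

    import mido
    from gpiozero import Button

    switch = Button(17)          # placeholder: footswitch wired between GPIO17 and GND
    port = mido.open_output()    # first available MIDI output, e.g. a USB-MIDI interface

    def toggle_fx():
        # CC 80 / value 127 is an arbitrary example; use whatever your pedal maps
        port.send(mido.Message('control_change', control=80, value=127))

    switch.when_pressed = toggle_fx
    pause()                      # keep the script alive, waiting for presses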
Might be niche, but that is just what the Pi excels at. It's a board for tinkerers, and it works.
I'm in the market to replace my aging Intel NUCs, but RPi is still cheaper.
Awful how? An SBC can take advantage of a lot of software written since the dawn of x86.
Almost nothing useful runs in 8GB.
This is the problem with this generation of "external AI boards" floating around. 8, 16, even 24GB is not really enough to run much that's useful, and even then (i.e. offloading to disk) they're impractically slow.
Forget running a serious foundation model, or any kind of realtime thing.
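The arithmetic is unforgiving. A back-of-envelope sketch, counting weights only at common quantization sizes and ignoring KV cache, activations and runtime overhead (which only make it worse):

    # Weight memory for a quantized model: params * bits / 8 bytes.
    def weight_gb(params_billion, bits_per_weight):
        return params_billion * 1e9 * bits_per_weight / 8 / 1e9

    for params in (3, 8, 13, 70):
        print(f"{params}B: {weight_gb(params, 4):.1f} GB at 4-bit, "
              f"{weight_gb(params, 8):.1f} GB at 8-bit")
    # 8B is already ~4 GB of weights alone at 4-bit; 70B doesn't fit in
    # 24 GB even at 4-bit, before you account for any context.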
The blunt reality is that the fast, high-memory GPU systems you actually need to self-host are really, really expensive.
These devices are more optics and dreams ("it'd be great if…") than practical hacker toys.
I wouldn't dare suggest that. But they need to target a "minimum viable audience"; otherwise they'll just Rube-Goldberg themselves into irrelevance. This HAT is a convoluted way to change the parameters of an existing compromise and turn it into a different but equally difficult compromise: worse performance, better efficiency, added cost.
> the idea of miniaturising
If you aren't ditching the laptop, you aren't miniaturizing; you're just splitting things into discrete specialized components.
Case closed. And that's extremely slow to begin with; the Pi 5 only exposes what, a single PCIe lane? Laughable performance for a purpose-built ASIC that costs more than the Pi itself.
> In my testing, Hailo's hailo-rpi5-examples were not yet updated for this new HAT, and even if I specified the Hailo 10H manually, model files would not load
Laughable levels of support too.
As another data point, I recently managed to get the 8L working natively on Ubuntu 24 with ROS, but only after significant shenanigans involving recompiling the kernel module and building their library for Python 3.12, which Hailo for some reason does not provide outside 3.11. They only support Pi OS (like anyone would use that in prod), and even that is very spotty. Like, why would you not target the most popular robotics distro for an AI accelerator? Who else is going to buy these things, exactly?
I was able to run speech-to-text on my old Pixel 4, but it's a bit flaky (the background process loses the audio device occasionally). I just want to listen for a wake word, send everything after it to a remote LLM, and get back text that I run through TTS.
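The shape of the pipeline is simple enough. A sketch of what I mean, with the wake-word and speech-to-text pieces left as hypothetical stubs (fill in whatever engine you use) and the LLM assumed to sit behind an OpenAI-compatible endpoint (URL is a placeholder):

    import requests
    import pyttsx3

    LLM_URL = "http://homeserver.local:8000/v1/chat/completions"   # placeholder

    def detect_wake_word():          # hypothetical: block until the wake word fires
        ...

    def record_utterance_text():     # hypothetical: capture mic audio, return transcribed text
        ...

    tts = pyttsx3.init()
    while True:
        detect_wake_word()
        prompt = record_utterance_text()
        resp = requests.post(LLM_URL, json={
            "model": "local",
            "messages": [{"role": "user", "content": prompt}],
        }, timeout=60).json()
        reply = resp["choices"][0]["message"]["content"]
        tts.say(reply)
        tts.runAndWait()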
I was only using it for local Home Assistant tasks, didn't try anything further like retrieving sports scores, managing TODO lists, or anything like that.
TinyML is a book that goes through the process of building a wake word model for such constrained environments.
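For a sense of scale, the keyword-spotting models in that space really are tiny. A sketch along the lines of TensorFlow's micro_speech "tiny_conv" example (not the book's exact code), which ends up at roughly ten thousand parameters before int8 quantization:

    import tensorflow as tf

    # ~1 second of audio as 49 frames x 40 MFCC features, one conv layer, softmax.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(49, 40, 1)),
        tf.keras.layers.Conv2D(8, (10, 8), strides=(2, 2), activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(4, activation="softmax"),   # e.g. wake word / unknown / silence / noise
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()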
I fail to see the use-case on a Pi. For learning you can have access to much better hardware for cheaper. Perhaps you can use it as a slow and expensive embedding machine, but why?
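For concreteness, the embedding case in its simplest form would look something like this, plain CPU sentence-transformers with a common small model, no accelerator involved (whether the HAT's toolchain can speed this up is a separate question):

    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")    # small model, 384-dim embeddings
    vectors = model.encode(["turn off the kitchen lights",
                            "what's the weather tomorrow"])
    print(vectors.shape)    # (2, 384)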
... why though? CV in software is good enough for this application and we've already been doing it forever (see also: Everseen). Now we're just wasting silicon.
1. Can I run a local LLM that allows me to control Home Assistant with natural language? Some basic stuff like timers, to-do/shopping lists, etc. would be nice.
2. Can I run object/person detection on local video streams?
I want some AI stuff, but I want it local.
Looks like the answer for this one is: Meh. It can do point 2, but it's not the best option.
2. Has been possible in realtime since the first camera was released, and has most likely improved since then. I did this years ago on the Pi Zero and it was surprisingly good.
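For point 2, the bar really is low. A minimal sketch of the classic CPU-only approach, OpenCV's stock HOG + SVM people detector over a video stream (the RTSP URL is a placeholder):

    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture("rtsp://camera.local/stream")    # or 0 for a local camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (640, 360))                # smaller frame = faster detection
        boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
        for (x, y, w, h) in boxes:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # act on `boxes` here: notify Home Assistant, save a snapshot, etc.

A small modern detector does better, but even this classic approach runs on very modest hardware.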
8GB RAM for AI on a Pi sounds underwhelming even from the headline