Is there a future where we can expect people to just buy "AI" from Best Buy, like a TV set? It'll probably come with some model preloaded - cheaper if it's open-source, premium pricing for frontier lab models. The hardware is basically enough GPUs for local inference.
Take it home, plug it into your home network, and you can open a chat instance by pointing any local device at the box's IP. You can give it internet access if you want. Maybe it can also receive OTA updates.
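To make the idea concrete, here's a minimal sketch of the appliance model: a box on the LAN serves an HTTP chat endpoint, and any local device talks to it by IP. Everything here is hypothetical - the `/chat` route, the JSON shape, and the canned echo reply standing in for actual local inference. (The demo binds to localhost; a real box would bind its LAN address.)

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen


class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body sent by the client device.
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        # Stand-in for local inference: just echo the prompt back.
        payload = json.dumps({"reply": f"echo: {body['prompt']}"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep the demo quiet
        pass


# The "appliance": serve on a background thread (port 0 = pick a free port).
server = HTTPServer(("127.0.0.1", 0), ChatHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any device on the network hits the box's IP - no cloud involved.
req = Request(
    f"http://127.0.0.1:{server.server_port}/chat",
    data=json.dumps({"prompt": "hello"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as resp:
    answer = json.loads(resp.read())
print(answer["reply"])  # → echo: hello
server.shutdown()
```

The nice property of this shape is that the client side is just HTTP - any phone, laptop, or TV on the network can be a frontend without installing anything vendor-specific.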
Curious how others think about this - does local-first AI feel plausible? What are the economic and social challenges?