What if there were a pocket-sized device that hosted large open-source LLMs and that you could connect to offline? Wouldn't that be helpful?
The benefits:
- You can run large open-source LLMs without consuming your PC's or smartphone's compute
- You can protect your privacy, since your data never leaves the device
- You can use a high-performance LLM entirely offline