This is not a cloud chatbot or a hosted-LLM wrapper. The focus is on building an assistant that lives with the user — understands personal context, reasons over local data, and can act autonomously without exporting private information off the device.
I’m interested in connecting with senior engineers who enjoy building AI systems in Rust, think in terms of orchestration and state, and care deeply about correctness, performance, and privacy.
San Francisco Bay Area preferred, as I believe early in-person collaboration matters.
Posting anonymously for privacy reasons — happy to share more in 1:1 conversations.
— Founder, SF Bay Area
Contact: localai-founder@proton.me
repelsteeltje•1h ago
So something like Triton server backends, but using Rust rather than Python or C++?
cajazzer•51m ago
I’m less interested in building a generic inference backend and more focused on how the system behaves over time — things like agent lifecycles, state, memory, and coordination. Inference is part of it, but not the whole thing.
Correctness, performance, and privacy matter because the system is long-lived and stateful, not just because it’s pushing tokens around.
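To make that concrete, here's a toy sketch of the shape I mean (the names Agent, Phase, and step are made up for illustration, not anything from an actual codebase): a long-lived agent that carries bounded memory and a lifecycle phase across turns, where model inference would be just one call inside the step loop rather than the whole system.

    // Toy sketch only: a long-lived, stateful agent loop in Rust.
    use std::collections::VecDeque;

    // Lifecycle phase the agent moves through on each turn.
    #[derive(Debug)]
    enum Phase {
        Idle,
        Planning,
        Acting,
    }

    struct Agent {
        phase: Phase,
        memory: VecDeque<String>, // rolling memory of recent observations
    }

    impl Agent {
        fn new() -> Self {
            Agent { phase: Phase::Idle, memory: VecDeque::new() }
        }

        // One lifecycle step: record the observation, then advance the phase.
        // An inference call would live inside this loop, not replace it.
        fn step(&mut self, observation: &str) {
            self.memory.push_back(observation.to_string());
            if self.memory.len() > 8 {
                self.memory.pop_front(); // bounded memory; state outlives any single call
            }
            self.phase = match self.phase {
                Phase::Idle => Phase::Planning,
                Phase::Planning => Phase::Acting,
                Phase::Acting => Phase::Idle,
            };
        }
    }

    fn main() {
        let mut agent = Agent::new();
        for obs in ["calendar updated", "new email", "user asked a question"] {
            agent.step(obs);
            println!("{:?} after '{}', memory = {}", agent.phase, obs, agent.memory.len());
        }
    }

The point of the sketch is the ownership of state over time: correctness and privacy questions show up in how that memory is retained, bounded, and kept on-device, not just in the inference call itself.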
repelsteeltje•38m ago