This project is about closing that gap. Three MCP servers give the agent direct access to real hardware interfaces: a debug probe (flash firmware, halt CPU, read registers and memory), a serial console (boot logs, CLI commands), and BLE.
Structured tools, not shell commands, so the agent can reason about hardware state the same way it reasons about code.
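To make "structured tools" concrete, here is a minimal sketch of the idea: instead of returning raw probe output as text, a tool returns a typed result the agent can inspect field by field. The tool name, fields, and input shape below are hypothetical, not the project's actual schema.

```python
from dataclasses import dataclass

@dataclass
class RegisterDump:
    """Structured result a debug-probe tool might return (hypothetical schema)."""
    halted: bool
    pc: int
    registers: dict[str, int]

def read_core_registers(raw: dict) -> RegisterDump:
    # Parse the probe's output into a typed value the agent can reason over,
    # rather than handing it an unstructured wall of shell text.
    return RegisterDump(
        halted=raw["halted"],
        pc=raw["regs"]["pc"],
        registers=raw["regs"],
    )

dump = read_core_registers({"halted": True, "regs": {"pc": 0x0800_1234, "r0": 0}})
print(hex(dump.pc))  # → 0x8001234
```

With a result like this, the agent can branch on `dump.halted` or compare `dump.pc` against the firmware's memory map, the same way it would reason about a value in code.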
The latest demo: deploying a TFLite Micro keyword spotting model on an nRF52840 from a single terminal session. The agent flashed the firmware, debugged a hard fault, switched to CMSIS-NN-optimized kernels, and right-sized the tensor arena.
End result: 98 ms end-to-end latency and 94.6% accuracy on real recordings from the Google Speech Commands dataset.
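"Right-sizing the tensor arena" generally means finding the smallest static buffer for which TFLite Micro's tensor allocation still succeeds. The sketch below shows one way to frame that search; `try_allocate` is a hypothetical stand-in for building with a given arena size and checking that allocation succeeds on target, and is not the project's actual workflow. (In practice a single measurement via the interpreter's reported arena usage after a successful allocation can replace the search.)

```python
def right_size_arena(try_allocate, hi: int, lo: int = 0) -> int:
    """Binary-search the smallest arena size (in bytes) for which
    `try_allocate(size)` returns True. `try_allocate` stands in for
    rebuilding with that arena size and checking that tensor
    allocation succeeds on the device (hypothetical helper)."""
    while lo < hi:
        mid = (lo + hi) // 2
        if try_allocate(mid):
            hi = mid      # succeeded: a smaller arena might still fit
        else:
            lo = mid + 1  # failed: need more than mid bytes
    return hi

# Stand-in example where the model happens to need 41_000 bytes:
needed = 41_000
print(right_size_arena(lambda n: n >= needed, hi=128 * 1024))  # → 41000
```

The binary search only needs the allocation check to be monotone: if a given size fits, every larger size fits too, which holds for a fixed model and build.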
This is part of a broader series on giving AI agents direct access to hardware: https://es617.github.io/let-the-ai-out/