What bugged (lol) me is that I was spending too much time exporting data and manually correlating things. Earlier this year I tried feeding 5 years of data into a custom GPT, and the responses were actually (sort of) useful when it didn't choke.
So I built this as an MCP server. I had no idea how MCP servers worked, so this was a good way to learn. It fetches fresh data on demand rather than choking on a static dump, and the statistical analysis (correlation, outlier detection, trends) happens server-side before the LLM even sees it. When I ask "what predicts my best sleep?", it actually computes the answer instead of pattern-matching on vibes.
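To give a sense of what "computes the answer" means, here's a minimal sketch of the server-side correlation step. The record shape, field names, and function names are made up for illustration, not the repo's actual API:

```typescript
// Sketch only: given daily records with a hypothetical shape like
// { date, deepSleepMinutes, activeMinutes }, compute a Pearson correlation
// server-side so the LLM receives a summary number instead of raw rows.

interface DailyMetrics {
  date: string;
  deepSleepMinutes: number;
  activeMinutes: number;
}

function pearson(xs: number[], ys: number[]): number {
  const n = Math.min(xs.length, ys.length);
  if (n < 2) return NaN;
  const mean = (v: number[]) => v.reduce((a, b) => a + b, 0) / v.length;
  const mx = mean(xs.slice(0, n));
  const my = mean(ys.slice(0, n));
  let num = 0, dx2 = 0, dy2 = 0;
  for (let i = 0; i < n; i++) {
    const dx = xs[i] - mx;
    const dy = ys[i] - my;
    num += dx * dy;
    dx2 += dx * dx;
    dy2 += dy * dy;
  }
  return num / Math.sqrt(dx2 * dy2);
}

// A tool handler would fetch fresh records for the requested window,
// run the math, and hand back only the summary text:
function correlateSleepWithActivity(days: DailyMetrics[]): string {
  const r = pearson(
    days.map((d) => d.activeMinutes),
    days.map((d) => d.deepSleepMinutes),
  );
  return `Pearson r between active minutes and deep sleep: ${r.toFixed(2)} over ${days.length} days`;
}
```

The point is that the model only ever sees the final sentence, not five years of rows.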
Works with Claude Desktop. Developed with Claude (see CLAUDE.MD). TypeScript, ~600 tests, MIT licensed.