Receives OTLP, writes Parquet, exposes DuckDB SQL via CLI. No dashboards, no query builders, no alerting engine. An LLM agent shells out to ducktel query, gets JSON back, and does the reasoning itself. The thesis: everything observability platforms built above ingest+store was scaffolding for human cognition. When the consumer is an LLM, the scaffolding is just overhead.
Comments
guerython•2h ago
love the thesis. we built something similar that emits ingest events with request_id/tool_label/timestamp metadata, writes the stream to parquet, and lets the agent run a short summary query before choosing whether to surface an anomaly. keeping the rows small and tagged lets the agent reason about drift without a dashboard, and if someone wants to double-check you can replay the same query result into a human-focused view.