I’ve seen many benchmarks on OLAP performance, but I wanted to better understand the practical impact for myself, especially for LLM applications. This is my first attempt at building a benchmarking tool to explore that.
It runs some simple analytical queries against ClickHouse, Postgres, and Postgres with indexes. To make the results more tangible than just a chart of timings, I added a "latency simulator" that visualizes how the query delay would actually feel in a chat UI.
With a 10M-row dataset: ClickHouse queries are sub-second, while Postgres takes multiple seconds.
This is definitely a learning project for me, not a comprehensive benchmark. The data is synthetic and the setup is simple. The main goal was to create a visual demonstration of how backend latency translates to user-perceived latency. Feedback and suggestions are very welcome.
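The latency-simulator idea can be sketched roughly like this: time an arbitrary query callable, then map the elapsed time onto the classic UX response-time thresholds (about 0.1 s feels instant, about 1 s keeps the user's flow, beyond about 10 s attention is lost). The function and bucket names below are illustrative assumptions, not the actual tool's API.

```python
import time

# Classic response-time thresholds: under ~0.1s feels instant,
# under ~1s keeps the user's flow, beyond ~10s attention is lost.
def perceived_latency(seconds: float) -> str:
    if seconds < 0.1:
        return "instant"
    if seconds < 1.0:
        return "noticeable"
    if seconds < 10.0:
        return "sluggish"
    return "abandoned"

def time_query(run_query) -> tuple[float, str]:
    """Time a query callable and classify how the delay would feel in a chat UI."""
    start = time.perf_counter()
    run_query()
    elapsed = time.perf_counter() - start
    return elapsed, perceived_latency(elapsed)
```

In use you would pass the real query, e.g. `time_query(lambda: cursor.execute(sql))` with whatever driver cursor you have on hand; the point is only that the same elapsed number gets translated into a felt category rather than a bar on a chart.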
chutes•2h ago
this is a neat project. most intriguing to me was that OLAP performance was worse for an OLTP-style query.
This really highlights the place for _both_ OLTP and OLAP DBs.
OLTP: when you need to select one
OLAP: when you need to select the world
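A toy way to see the "select one" vs "select the world" split: a row store with a primary-key index answers a point lookup with a single probe, while an aggregate has to touch every value no matter what, which is where a columnar engine's contiguous, cache-friendly scans pay off. This is a pure-Python sketch of the two access patterns, not either database's actual storage layout:

```python
# Toy illustration: point lookup vs full-column aggregate.
# Row store + index: one hash probe answers "select one".
rows = [{"id": i, "amount": i % 100} for i in range(1_000_000)]
pk_index = {row["id"]: row for row in rows}   # stand-in for a B-tree / PK index

def point_lookup(user_id: int) -> dict:
    return pk_index[user_id]                  # O(1): OLTP's sweet spot

# Column store: an aggregate must read every value regardless;
# storing the column contiguously makes that scan fast.
amount_column = [row["amount"] for row in rows]

def aggregate_sum() -> int:
    return sum(amount_column)                 # O(n): OLAP's sweet spot
```

No index helps `aggregate_sum` skip work, which is why adding Postgres indexes speeds up the lookup-shaped queries far more than the analytical ones.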
oatsandsugar•2h ago
Also for metadata queries, OLTP was faster. But these were the difference between the blink of an eye and two blinks of an eye.