I've been using Claude Code and Codex pretty heavily for the past few months. My local session logs have grown to about 2GB each. The existing tool (ccusage) takes 17–20s to show a usage report, so I basically never check my usage.
Decided to rewrite it from scratch in Rust with parallel parsing and incremental caching. Here are the benchmarks against ccusage v18.0.8:
Claude (1,521 JSONL files, 2.2 GB): 0.08s vs 17.15s — 214x faster
Codex (91 JSONL files, 1.7 GB): 0.15s vs 20.76s — 138x faster
The feature I'm most proud of is `tu img` — it generates shareable image cards of your daily or weekly usage trend, great for flexing your token burn rate.
Also has a live TUI monitor (`tu live`), a GUI dashboard (`tu gui`), and standard CLI table output. Works with Claude Code, Codex, and Antigravity. Everything runs locally — no data leaves your machine.
Coming next: detailed per-session token reports combined with cost breakdown, and tools for analyzing and optimizing your token usage patterns. If that sounds useful, a star on GitHub helps me keep iterating: https://github.com/hanbu97/tokenusage
stratos123•1h ago
That's interesting. For 2GB of logs, 17s is somewhat slow but 0.08s seems straight up impossible; what makes ccusage slow and your approach fast?
hanbu97•1h ago
Good question – the 0.08s is warm cache. First run parses everything in parallel and caches by file mtime (~0.73s vs 17s, about 23x). After that it just loads cached counts, so you only pay for new/changed files (~0.08s vs 17s, 214x). ccusage re-parses from scratch every time with no caching, hence the gap.
hanbu97•1h ago
Install: `cargo install tokenusage --bin tu` | `pip install tokenusage` | `npm i -g tokenusage`