"RAM footprint: ~8MB on an empty session, ~12MB when working"
I like this, Claude Code is using multiple gigabytes, which is really annoying on lowend laptops
tecoholic•27m ago
Yes. Just this fact is going to make a lot of people try it out.
marknutter•24m ago
Isn't that because of the context window size?
SatvikBeri•18m ago
The context window has nothing to do with local RAM usage, and even if it did, a million tokens of context is maybe 5 MB.
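The back-of-envelope arithmetic behind that figure (assuming roughly 5 bytes of UTF-8 text per token on average, which is an assumption, not a tokenizer fact):

```rust
// Rough check of the "a million tokens is maybe 5 MB" claim.
// The KV cache lives on the inference server, so the client at most
// holds the raw context text.
fn main() {
    let tokens: u64 = 1_000_000;
    let bytes_per_token: u64 = 5; // rough average; varies by tokenizer and language
    let megabytes = (tokens * bytes_per_token) / 1_000_000;
    println!("{megabytes} MB"); // 5 MB
}
```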
gidellav•10m ago
Hi, I'm the developer of zerostack!
No, the memory footprint is not because of the context window size: in my benchmarks, loading a 128k context only moved it from 8MB (without any chat/context loaded) to 11MB.
The reasons for zerostack's small memory footprint are:
- Rust, and not JS/Python, so no interpreters/VMs on top
- Load-as-needed, so we only allocate things like LLM connectors when needed
- `smallvec` used for most of the array usage of the tool (up to N items are stored in stack)
- `compactstring` used for most of the string usage of the tool (up to N chars are stored in stack)
- `opt-level=z` to force LLVM to optimize for binary size and not for performance (even though we still beat opencode in both TTFT and tool-use time)
- heavy usage of [LTO](https://en.wikipedia.org/wiki/Interprocedural_optimization#W...)
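For anyone wanting to try the same build flags, they live in Cargo's release profile. A sketch of what such a profile might look like (hypothetical settings, not zerostack's actual Cargo.toml):

```toml
# Hypothetical Cargo.toml release profile optimizing for size.
[profile.release]
opt-level = "z"   # tell LLVM to optimize for binary size, not speed
lto = "fat"       # whole-program link-time optimization
codegen-units = 1 # a single codegen unit gives LTO more room to optimize
strip = true      # drop debug symbols from the shipped binary
```

`lto = "fat"` and `codegen-units = 1` trade longer compile times for a smaller, tighter binary.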
The context window is not on your system. It's on the server with the model. There may be some form of local prompt caching, but you're not holding the context locally unless you're also hosting the model locally.
hparadiz•28m ago
This is what I've been waiting for: a low-level language. Please, no more scripting-language TUIs!
iknowstuff•18m ago
Isn’t codex in rust?
schaefer•14m ago
There has been no reason to wait...
Codex is written in rust.
--
So is deepseek-tui.
hparadiz•11m ago
Forgot to add an open source qualifier. I use codex lol
andxor•8m ago
Codex is also open source.
sergiotapia•20m ago
Given agent harnesses affect so much of the performance of models, it would be great to see some kind of benchmark on how this tool performs compared to claude/codex/opencode/pi etc.
hiAndrewQuinn•15m ago
The codebase was small enough that I handed it over to DeepSeek v4 Flash in Pi to skim through for any risky business, and I didn't find anything concerning. Nice work.
gidellav•7m ago
Thanks! Funny enough, a good chunk of the coding was done by DeepSeek v4 Flash, while I hand-wrote a couple of pieces of the TUI logic (DeepSeek kept failing on certain cursor-moving logic), and I fully managed the memory optimization process (as you can read in another comment I left, it was both a set of compiler optimizations and the use of certain Rust crates that provide more efficient data structures).
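To illustrate the data-structure idea: crates like `smallvec` and `compact_str` apply the small-buffer optimization, keeping values up to a fixed size inline on the stack and only heap-allocating beyond it. A minimal hand-rolled sketch of the same technique using only the standard library (`SmallBuf` is a hypothetical name, not those crates' actual API):

```rust
// Small-buffer optimization sketch: values of at most N bytes live
// inline in the enum itself; larger values spill to a heap Vec.
enum SmallBuf<const N: usize> {
    Inline { len: usize, data: [u8; N] },
    Heap(Vec<u8>),
}

impl<const N: usize> SmallBuf<N> {
    fn from_slice(s: &[u8]) -> Self {
        if s.len() <= N {
            let mut data = [0u8; N];
            data[..s.len()].copy_from_slice(s);
            SmallBuf::Inline { len: s.len(), data } // no allocation
        } else {
            SmallBuf::Heap(s.to_vec()) // spills to the heap
        }
    }

    fn is_inline(&self) -> bool {
        matches!(self, SmallBuf::Inline { .. })
    }
}

fn main() {
    let short = SmallBuf::<24>::from_slice(b"hi");
    let long = SmallBuf::<24>::from_slice(&[0u8; 100]);
    println!("{} {}", short.is_inline(), long.is_inline()); // true false
}
```

The real crates refine this considerably (niche-packing the discriminant, growable inline storage), but the principle is the same: most short strings and small arrays never touch the allocator.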
kadoban•2m ago
> I handed it over to DeepSeek v4 Flash in Pi to skim through for any risky business
Doesn't prompt injection make that a rather flimsy investigation?
SwellJoe•6m ago