The Codex backend is good quality, the frontend is average, but most importantly it is too slow. I wonder if OpenAI will improve it.
Comments
esafak•3mo ago
Sonnet and Gemini are good and fast. Can't speak for Grok.
muzani•3mo ago
Grok Turbo is fast.
moomoo11•3mo ago
It seems to work with fewer issues than CC Opus.
I don’t mind if it takes longer as long as the answer is correct more often.
You can always be doing other work while one chat is running.
naiv•3mo ago
the new 0.47 has better performance now imho
i_have_an_idea•3mo ago
this seems like a crazy idea, as the cli client has nothing to do with how many tokens per second the api streams
naiv•3mo ago
The API is never the bottleneck; it's how fast the CLI provides context. Just by using ripgrep instead of grep it will be faster, and on top of that it can run code searches concurrently rather than sequentially, etc.
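Roughly what I mean by concurrent search, as a sketch only (the queries, and the idea of shelling out to `rg` once per query, are my own assumptions for illustration, not how Codex actually does it):

```rust
// Illustrative only: run several ripgrep queries in parallel instead of one
// after another. The queries and paths here are made up, not Codex's code.
use std::process::Command;
use std::thread;

fn main() {
    let queries = ["fn main", "TODO", "unwrap()"];

    // One thread per query; each shells out to `rg`.
    let handles: Vec<_> = queries
        .iter()
        .map(|q| {
            let q = q.to_string();
            thread::spawn(move || {
                let out = Command::new("rg")
                    .args(["--line-number", q.as_str(), "."])
                    .output()
                    .expect("failed to run rg");
                (q, String::from_utf8_lossy(&out.stdout).lines().count())
            })
        })
        .collect();

    // Total wall time is roughly the slowest single search,
    // not the sum of all of them.
    for h in handles {
        let (q, hits) = h.join().unwrap();
        println!("{q}: {hits} matching lines");
    }
}
```

The point is the same either way: the wall-clock cost of gathering context drops to about the slowest single search instead of the sum of all of them.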