Hilarious take.
Usage is in the eye of the user, I see.
Waiting for Show HN: AbrasionNext, the framework evolution for frontend devs, with SaaS cloud deployment.
It's already HERE: https://loco.rs/
I'm writing a production-level app with it right now.
Trying to bring that with Slint: https://slint.rs
This is not happening. The new folks have moved to SPA/RSC and a RoR type framework doesn't make much sense in that context.
It's a CLI tool that makes API calls. I'd bet my bottom dollar that the performance difference between API-wrapping CLI tools in something like Ruby/Python vs Rust/C++ is negligible in perceived experience.
If they wanted people to not have a dependency on Node pre-installed, they could have shipped Single Executable Applications [0] or used a similar tool for producing binaries.
Or used Deno/Bun's native binary packaging.
[0] - https://nodejs.org/api/single-executable-applications.html
It's often parallel processing of I/O (network, filesystem) and computational tasks like testing and compiling code.
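In Python terms, that mix might look like the sketch below; `fetch` and `compile_unit` are hypothetical stand-ins for real network and compute steps.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Placeholder for a blocking network call (HTTP, filesystem, etc.).
    return f"<data from {url}>"

def compile_unit(name):
    # Placeholder for compute work like compiling or running a test unit.
    return f"{name}.o"

def run_parallel(urls, units):
    # Threads overlap blocking I/O nicely; for truly CPU-bound work you'd
    # reach for ProcessPoolExecutor (or a compiled language) to sidestep
    # the GIL.
    with ThreadPoolExecutor(max_workers=8) as pool:
        io_results = list(pool.map(fetch, urls))
        cpu_results = list(pool.map(compile_unit, units))
    return io_results, cpu_results
```

This is only a shape-of-the-problem sketch, not how any particular tool does it.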
I did this for several projects; it works great, with much lower costs and compute/memory usage.
I suspect it's not much, because I never see any stats published by any of these companies.
> Aider writes most of its own code, usually about 70-80% of the new code in each release. These statistics are based on the git commit history of the aider repo.
Interesting model: every accepted prompt response gets a git commit, without human modifications.
If codex was half as good as they say it is in the presentation video, surely they could’ve sent a request to the one in chatgpt from their phone while waiting for the bus, and it would’ve addressed your comments…
And now everyone is rewriting everything in Go/Rust left and right.
Especially interesting for software that spends 99.9% of its time waiting for inference to come back to you. Sure, it makes sense to rewrite something that relies heavily on the CPU, or where you want an easier time dealing with concurrency, but I feel like that's not what makes Codex take a long time to finish a prompt.
With that said, I also rewrote my local LLM agent software in Rust, as it's easier to deal with concurrency compared to my initial Python prototype. That it compiles into a neat binary is an additional benefit, but I could just as easily have gone with Go instead of Rust.
In a different domain, I've seen a CLI tool that requests an OAuth token in Python be rewritten in Rust with a huge performance boost. The Rust version had requested a token and presented it back to the app in a few milliseconds, but it took Python about five seconds just to load the modules the OAuth vendor recommends.
That’s a huge performance boost, never mind how much simpler it is to distribute a compiled binary.
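A rough way to see that load cost is to time a cold import in a fresh interpreter; this is a sketch, with `json` standing in for a heavy vendor SDK, and real numbers will vary a lot by machine and module.

```python
import subprocess
import sys
import time

def cold_import_seconds(module: str) -> float:
    # A fresh interpreter gives a cold measurement: roughly what a
    # one-shot CLI invocation pays before doing any useful work.
    start = time.perf_counter()
    subprocess.run([sys.executable, "-c", f"import {module}"], check=True)
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"import json: {cold_import_seconds('json'):.3f}s")
```

(`python -X importtime` gives a per-module breakdown if you want to find the worst offenders.)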
I agree it’s hell. But I’ve not found many comprehensive packaging solutions that aren’t gnarly in some way.
IMHO the Python Packaging community have done an excellent job of producing tools to make packaging easy for folks, especially if you’re using GitHub actions. Check out: https://github.com/pypa/cibuildwheel
PyPA has an extensive list of GitHub Actions for various use cases.
I think most of us end up in the “pure hell” because we read the docs on how to build a package instead of using the tools the experts created to hide the chaos. A bit like building a deb by hand is a lot harder than using the tools which do it for you.
In my opinion, bundling the application payload would be sufficient for interpreted languages like Python and JavaScript.
Even if a GC'ed language like Go is very fast at allocating/deallocating memory, Rust often has no need to allocate/deallocate that memory in the first place. The programmer gives the compiler the tools to optimize memory management, and machines are better than humans at optimizing memory. (Some kinds of optimizations, anyway.)
Module import cost is enormous, and while you can do lots of cute tricks to defer it from startup time in a long-running app because Python is highly dynamic, for one-time CLI operations that don’t run a daemon or something there’s just nothing you can do.
I really enjoy Python as a language and an ecosystem, and feel it very much has its place…which is absolutely not anywhere that performance matters.
EDIT: and there’s a viable alternative. Python is the ML language.
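For long-running apps, the deferral trick mentioned above can look like this sketch (`csv` stands in for a genuinely heavy dependency); the catch is exactly as described: a one-shot command that actually hits the heavy path still pays the full import cost.

```python
def export_report(rows, path):
    # Deferred import: code paths that never export never pay for it.
    import csv
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

def quick_help():
    # This path runs without touching the heavy import at all.
    return "usage: tool export <path>"
```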
"The existing code base makes certain assumptions -- specifically, it assumes that there is automatic garbage collection -- and that pretty much limited our choices."
This is the next trend.
An announcement of Codex CLI being rewritten in C++ would be met with confusion and derision.
Why would you say this for Rust in particular?
The comment you linked is talking about runtime errors in an unspecified application.
When I have a smallish application, with tests, written in one language, letting an LLM convert those files into another language is the single task I'm most comfortable handing over almost entirely. Especially when I review the tests and all tests in the new language are passing.
Note that most of these rewrites wouldn't be needed if the JIT language were Java, C#, Common Lisp, Dart, Scheme, or Racket.
All of those also have AOT compilers and JIT cache tooling.
If this catches on, and more tools get the "chatgpt, translate this into rust, make it go brrr" treatment, hopefully someone puts in the time & money to take Tauri the extra 10-20% needed to make it a universal Electron replacement. Tauri is great, but still has some pain points here and there.
I'd argue Rust and Go are even easier to work with than Python/JS/TS. The package management is better, and static linked native binaries eliminate so many deployment headaches.
Go used to try and let you do it, but has walked back those implementations after all the bugs they've caused, in my understanding.
Sometimes people really need to follow Yoda's and Mr. Miyagi's advice, instead of jumping right into it.
Go and Rust prove you can get most of the benefit of C/C++ without paying that complexity cost.
Seems like confirmation bias.
Likewise, you can make a single file distribution of a TypeScript program just fine. (Bun has built in support even.) But people don't think of it as a "thing" in that ecosystem. It's just not the culture. TypeScript means npm or Electron. That's the equivalence embedded in the industry hive mind.
To be clear, I'm not decrying this equivalence. Simplification is good. We use language as a shorthand for a bundle of choices not necessarily tied to language itself. You can compile Python with Nuitka or even interpret C. But why would you spend time choosing a point on every dimension of technical variation independently when you could just pick a known good "preset" called a language?
The most important and most overlooked part of this language choice bundle is developer culture. Sure, in principle, language choice should be orthogonal to mindset, areas of interest, and kinds of aptitude. But let's be real: it isn't. All communities of human beings evolve shared practices, rituals, shibboleths, and priesthoods. Developer communities are no exception.
When you choose, say, Rust, you're not just choosing a language. You're choosing that collection of beliefs and practices common among people who like to use that language. Rust people, for example, care very much about, say, performance and security. TypeScript people might care more about development speed. Your C people are going to care more about ABI stability than either.
Even controlling for talent level, you get different emphases. The Codex people are making a wire format for customizing the agent. C people would probably approach the problem by making a plugin system. TypeScript people would say to just talk to them and they'll ship what you need faster than you can write your extension.
Sometimes you even see multiple clusters of practitioners. Game development and HFT might use the same C++ syntax, but I'd argue they're a world apart and less similar to each other than, say, Java and C# developers are.
That's why language debates get so heated: they're not just expressing a preference. They're going to war for their tribe. Also, nothing pisses a language community off more than someone from a different community appropriating their sacred syntax and defiling it by using it wrong.
Codex isn't so much swapping out syntax as making a bet that Rust cultural practices outcompete TypeScript ones in this niche. I'm excited to see the outcome of this natural experiment.
Unfortunately, that is a utopia that will never be realized.
People will keep learning programming languages based on hearsay, whatever books they find somewhere, influencers and what not.
We are in the middle of a transition in programming paradigms.
Let the AI coding models flamewars start.
Claude Code tends to write meaningless tests just to get them to pass—like checking if 1 + 1 = 2—and somehow considers that a job well done.
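As a contrast sketch, with a hypothetical `slugify` as the code under test:

```python
def slugify(title: str) -> str:
    # Hypothetical code under test: lowercase and join words with hyphens.
    return "-".join(title.lower().split())

def test_vacuous():
    # Passes no matter what slugify does -- a "job well done" in name only.
    assert 1 + 1 == 2

def test_meaningful():
    # Actually exercises the behavior, so it fails if slugify regresses.
    assert slugify("Hello World") == "hello-world"
    assert slugify("  spaced   out ") == "spaced-out"
```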
> Optimized Performance — no runtime garbage collection, resulting in lower memory consumption
Introducing the list (efficiency resonates with me as a more specific form of performance):
> Our goal is to make the software pieces as efficient as possible and there were a few areas we wanted to improve:
the others ("zero dependencies") are not actually related to efficiency
Efficiency is the top-level goal, and that equates directly to performance in most computing tasks: more efficiency means being able to let other work happen. There are times where single-threaded outright speed is better, but usually in computing we try as hard as possible to get parallelism or concurrency into our approaches, so that efficiency translates directly into overall performance.
Overall performance as a bullet seems clear enough. Yes it's occluded by a mention of GC, but I don't think the team is stupid enough to think GC is the only performance factor win they might get here, even if they don't list other factors.
Even a pretty modest bit of generosity makes me think they're doing what was asked for here. Performance is very explicitly present and, to me, quite clearly a core objective.
If it's not possible today, what are the challenges and where does a human need to step in and correct the model?
This is interesting, because the current Codex software supports third-party model providers. This makes sense for OpenAI Codex, because it is the underdog compared to Claude Code, but perhaps they have changed their stance on this.
[edit] Seems that this take is incorrect; the source is in the tree.
Reviewing the source for this tree, looks like it's been in public development for a fair amount of time, with many PRs.
I would bet it took more wall-clock time to type out that comment than it would have for any number of AI agents to snap the required equivalent of `if not re.match(...): continue` into place
// TODO: Verify server name: require `^[a-zA-Z0-9_-]+$`?
There may be several elements of server-name verification to perform. That regex does not cover the complete range of possibilities for a server name.
The author apparently decided to punt on the correctness of this low-risk test -- pending additional thought, research, consultation, or simply higher prioritization.
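For what it's worth, the check that TODO contemplates might look like this Python sketch of the proposed pattern; note that `re.fullmatch` is slightly stricter than `re.match` with `$`, because `$` still matches just before a trailing newline.

```python
import re

def is_valid_server_name(name: str) -> bool:
    # fullmatch requires the whole string to match, so "abc\n" is
    # rejected here even though re.match(r"^[a-zA-Z0-9_-]+$", "abc\n")
    # would accept it.
    return bool(re.fullmatch(r"[a-zA-Z0-9_-]+", name))
```

Whether this pattern is actually the right set of allowed characters is exactly the open question the TODO leaves unresolved.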
This needs admin permissions, which means a ticket with IT and a good chance it'll be rejected since it's scary as it'll open up the door to many admin level installs of software that IT has no control over.
Installing node under WSL is a better approach anyway, but that'll make it harder for enterprise customers still.
https://nodejs.org/en/download
I never used nvm.
If someone doesn't get this, it is a skill issue.
Yes, if I spent more time learning these things, it would become simple but that seemed like a massive waste of time.
Architecting the original 100kloc program well requires skill, but that effort is heavily front loaded.
It's a way to close off codex. There's no point in making a closed source codex if it's in typescript. But there is if it's in rust.
This is just another move to make OpenAI less open.
lioeters•1d ago
laurent_du•1d ago
yahoozoo•1d ago
littlestymaar•1d ago
tux3•1d ago
The neat thing for me is just not needing to setup a Node environment. You can copy the native binary and it should run as-is.
Wowfunhappy•1d ago
satvikpendem•1d ago
qsort•1d ago
The big one is not having node as a dependency. Performance, extensibility, safety, yeah, don't really warrant a rewrite.
koakuma-chan•1d ago
wrsh07•1d ago
Also, ideally your lightweight client logic can run on a small device/server with bounded memory usage. If OpenAI spins up a server for each codex query, the size of that server matters (at scale/cost) so shaving off mb of overhead is worthwhile.
jacob019•1d ago
mrweasel•1d ago
There shouldn't be a reason why you couldn't and it would give you performance and zero dependency install.
wiseowise•1d ago
Astral folks are taking notes. (I wouldn't be surprised if they already have a super secret branch where they rewrite Python and make it 100x faster, but without AI bullshit like Mojo).
littlestymaar•1d ago
Edit: ah, I see, I read “LLM” instead of LLVM at first! It's only after I posted my question that I realized my mistake.
I'm not sure it makes sense to compile JavaScript natively: due to the very dynamic nature of the language, you'd end up with a very slow implementation. (JIT compilers make assumptions to optimize the code and fall back to the slow baseline when those assumptions are broken, but you can't do that with AOT.)
mrweasel•1d ago
That's a good point, maybe TypeScript would be a better candidate.
crabmusket•1d ago
For what it would take to compile TS to native code, check out AssemblyScript.