Dependencies introduce unnecessary LOC and features that, more and more, are just written by LLMs themselves. It is easier to just write the necessary functionality directly. Whether that is more maintainable is a bit YMMV at this stage, but I would wager it is improving.
Esp. with Go's quick compile time, I can see myself using it more and more even in my one-off scripts that would have used Python/Bash otherwise. Plus, I get a binary that I can port to other systems w/o problem.
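For anyone who hasn't leaned on that part of the toolchain yet, the portability mostly comes down to Go's built-in cross-compilation; a rough sketch (the binary names and targets are just placeholders):

    # Build the same one-off script for a few targets from one machine.
    # CGO_ENABLED=0 keeps the binary statically linked for pure-Go code.
    GOOS=linux   GOARCH=amd64 CGO_ENABLED=0 go build -o myscript-linux-amd64 .
    GOOS=darwin  GOARCH=arm64 CGO_ENABLED=0 go build -o myscript-darwin-arm64 .
    GOOS=windows GOARCH=amd64 CGO_ENABLED=0 go build -o myscript-windows-amd64.exe .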
Compiled is back?
I'm sure it will eventually be true, but this seems very unlikely right now. I wish it were true, because we're in a time where generic software developers are still paid well, so doing nothing all day, with this salary, would be very welcome!
So cross-platform vibe-coded malware is the future then?
Then it became a cat and mouse game with obfuscators and deobfuscators.
John Hammond has a *BRILLIANT* video on this topic. 100% recommended.
Honestly, going off John Hammond, I feel like Nim or V-lang is where vibe-coded malware will probably come from. Nim has been used for hacking so much that IIRC Windows actually blocked the Nim compiler itself as malware!
Nim's biggest issue is that hackers don't know it, but if LLMs fix that, Nim becomes a really lucrative language for hackers, and John Hammond described Nim's hacking libraries as still being very decent.
Frameworks might go the way of the dinosaur. If an LLM can manage a lot of complex code without human-serving abstractions, why even use something like React?
The Go standard library is a particularly good fit for building network services and web proxies, which fits this project perfectly.
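As a rough illustration of how little the standard library needs for that kind of thing, here is a minimal reverse proxy sketch; the upstream address and port are just placeholders:

    package main

    import (
        "log"
        "net/http"
        "net/http/httputil"
        "net/url"
    )

    func main() {
        // Hypothetical backend to forward traffic to; swap in your own.
        target, err := url.Parse("http://localhost:9000")
        if err != nil {
            log.Fatal(err)
        }
        // httputil gives you a working reverse proxy in one call.
        proxy := httputil.NewSingleHostReverseProxy(target)
        log.Fatal(http.ListenAndServe(":8080", proxy))
    }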
Golang's libraries are phenomenal, and porting over to multiple servers is pretty easy; it's really portable.
I actually find Golang good for CLI projects, web projects, and just about everything.
Usually the only time I still use Python (uvx) or vibe code with it is when I am either manipulating images or PDFs or building a really minimalist tkinter UI in Python/uv.
Although I did try converting the Python to Golang code, which ended up using Fyne for the GUI projects and was surprisingly robust, I might still use Python in some niche use cases.
Check out my other comment in here for a vibe-coded project written in a single prompt on the web when Gemini 3 Pro launched (I hope it's not promotion, because it's open source with zero telemetry that I didn't even ask to be added, haha!)
Golang is love. Golang is life.
Same boat! In fact I used to (still do) dislike Go's syntax and error handling (the same 4 lines repeated every time you call a function), but given that LLMs can write the code and do the cross-model review for me, I literally don't even see the Go source code, which is nice because I'd hate it if I did (my dislike of Go's syntax + all the AI slop in the code would drive me nuts).
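For anyone who hasn't written Go, the "same 4 lines" look roughly like this (loadConfig and the file name are made up purely for illustration):

    package main

    import (
        "fmt"
        "os"
    )

    // Every fallible call tends to expand into the same check-wrap-return shape.
    func loadConfig(path string) ([]byte, error) {
        data, err := os.ReadFile(path)
        if err != nil {
            return nil, fmt.Errorf("reading config: %w", err)
        }
        return data, nil
    }

    func main() {
        if _, err := loadConfig("config.json"); err != nil {
            fmt.Println(err)
        }
    }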
But at the end of the day, Go has good scaffolding, the best tooling (maybe on par with Rust's, definitely better than Python even with uv), and tons of training data for LLMs. It's also a rather simple language, unlike Swift (which I wish was simpler because it's a really nice language otherwise).
It turns out that verbosity isn't really a problem when LLMs are the ones writing the code based on higher-level markdown specs (describing logic, architecture, algorithms, concurrency, etc.), and Go's extreme simplicity, small range of language constructs, and explicitness (especially in error handling and control flow) make it much easier to quickly and accurately review agent code.
It also means that Go's incredible (IMO) runtime, toolchain, and standard library are no longer marred by the boilerplate either, and I can begin to really appreciate their brilliance. It has me really reconsidering a lot of what I believed about language design.
Interestingly, since we are talking about Go specifically, I never found that I was spending too much time typing... types. Obviously more than with a Python script, but never at a level where I would consider it a problem. And now that newer Python projects use type annotations, the difference has gotten smaller.
Just FWIW, you don't actually have to put type annotations in your own code in order to use annotated libraries.
I mean, people mention Rust and everything, and how AI can write proper Rust code with the linter and whatnot, but trust me, AI can write some pretty good Golang code.
That said, I don't want everyone to suddenly start writing Golang code with AI, because I have been doing it for over a year and it's something I vibe with; it's my personal style. I would lose some points of uniqueness if everyone started doing the same, haha!
Man, my love for Golang runs deep. It's simple, cross-platform (usually), and compiles super fast. I "vibe code" but have faith that I can always take the code back over.
(Self-promotion? Sorry about that, but I created a single-main.go Golang project with a pomodoro timer over WebSockets using gorilla (single dep): https://spocklet-pomodo.hf.space/)
So shhh, let's keep it a secret between us, shall we? ;)
(Oh yeah! I recently created a WHMCS alternative written in Golang that hooks up to any podman/gvisor instance to build your own mini VPS with my own tmate server. Lots of glue code, but it actually generated it on the first try! It's surprisingly good; I will try to release it as open source, and I'm thinking of charging just once if people want everything set up or something custom.
One minor nitpick, though: the complexity rises manyfold between a single-file project and anything that requires a database in Golang, at least in my experience, but Golang's pretty simple and I just LOVE it.)
Also, AI's pretty good at niche languages too: I tried to vibe code an fzf alternative, porting it from Golang to V-lang, and I found the results really promising!
I wonder when they'll start offering virtual, persistent dev environments...
A lot of companies have been wanting to move in this direction. Instead of maintaining a fleet of machines, you just get a bunch of thin clients and pay Microsoft or whoever to host the actual workloads. They already do this 'kiosk' style stuff for a lot of front-line staff.
Honestly, not having my own local hardware for development sounds like a living hell, but seems like the way we are going.
Honestly, I feel like it really bores (or overwhelms?) me, because now it's: okay, I'll do this, then that, then that, and drastically expand the scope of the project. But that comes with its own fatigue and the limits of free tokens or context with exe.dev, so I end up publishing it on a git provider, running it through gitingest, pasting it into Gemini in the web browser to ask for updates (it has a 1 million token context), and then pasting that into Opencode with an OpenRouter Devstral key.
I used this workflow to drastically improve the UI of a project, but aside from some tinkering, I felt like the "fun" of the project definitely got reduced.
It was always fun for me to use LLMs while I was in the loop (I didn't use agents, just a copy-paste workflow from the web), but now agents have kind of replicated that too and have gotten (I must admit) pretty good at it.
I don't know man, any thoughts on how to make such things fun again? When LLMs first came out, or even before using agents like this, just creating single scripts was fun, but creating whole projects with a huge scope feels very fun-sucking IMO.
You can start a session there and chat with it to get a bunch of work done, then come back to that session a day later and the virtual filesystem is in the same state as when you left it.
I haven't figured out if this has a time limit on it - it's possible they're doing something clever with object storage such that the cost of persisting those environments is really low, see also Fly's Sprites.dev: https://fly.io/blog/design-and-implementation/
It appears to have 4GB of RAM and 56 (!?) CPU cores https://chatgpt.com/share/6977e1f8-0f94-8006-9973-e9fab6d244...
If people are getting this for free, or even as part of a ChatGPT offering, considering it gets subsidized too, then low-end providers with their $7/year deals are a little threatened if ChatGPT provides 56 cores for free. It doesn't seem right to provide so many cores for... free??
Are you running this in your free account, as you mention in the blog post, Simon, or in your paid account?
I used a free account to check if the feature was available there and it tried to get me to upgrade two prompts in (just enough for me to confirm the container worked and could install packages).
I'm not sure when these new features landed because they're not listed anywhere in the official ChatGPT release notes, but I checked it with a free account and it's available there as well.