If you're interested, here's the keynote recording: https://www.youtube.com/live/U-fMsbY-kHY?t=3400s
container-use combines both forms of isolation, containers and git worktrees, in a seamless system that agents can use to get work done.
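A minimal sketch of the git-worktree half of that isolation, using only stock git (the container half, and container-use's actual CLI, are elided here; paths and branch names are illustrative):

```shell
# Each agent gets its own worktree on its own branch, so parallel
# edits never collide in a shared checkout. The container layer would
# then mount one worktree per agent as its workspace.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "agent@example.com"   # placeholder identity
git config user.name "agent"
git commit -q --allow-empty -m "init"

# One isolated checkout per agent:
git worktree add "$repo/agent-1" -b agent-1-work
git worktree add "$repo/agent-2" -b agent-2-work

git worktree list   # main checkout plus the two agent worktrees
```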
On one hand, I have been curious about getting multiple agents to work on the same branch, but realized I can just wait till they do that natively.
More so, all this feels like a dead end. I think OpenAI and GitHub are right to push toward remote development, so these local setups don't matter. E.g., mark up a PR or branch on GitHub, come back as necessary, and do it all from my phone. If I want an IDE, it can be remote SSH.
nsonha•14h ago
It is not. Name one piece of software that has an LLM generating code on the fly to call APIs. Why do people have this delusion?
TeMPOraL•8h ago
If you tell a model it can use some syntax, e.g. `:: foo(arg1, arg2) ::`, to cause the runtime to call an API, and then, based on the context of the conversation, the model outputs `:: get_current_weather("Poland/Warsaw") ::`, that is "generating code on the fly to call APIs". How `:: get_current_weather("Poland/Warsaw") ::` gets turned into a bunch of cURL invocations against e.g. the OpenWeather API is an implementation detail of the runtime.
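The runtime half of that scheme can be sketched in a few lines. Everything here is illustrative, not any particular framework's API: the `::` syntax comes from the comment above, and the handler returns canned text instead of hitting a real weather service.

```python
import re

# Hypothetical registry mapping tool names the model may emit to
# callables; a real runtime would make HTTP calls here.
HANDLERS = {
    "get_current_weather": lambda location: f"(weather for {location} here)",
}

# Matches the ad-hoc call syntax described above: :: name("arg", ...) ::
CALL_RE = re.compile(r'::\s*(\w+)\((.*?)\)\s*::')

def run_tool_calls(model_output: str) -> str:
    """Replace each tool-call marker the model emitted with the result
    of invoking the corresponding handler."""
    def dispatch(match: re.Match) -> str:
        name, raw_args = match.group(1), match.group(2)
        # Naive argument parsing: split on commas, strip quotes.
        args = [a.strip().strip('"') for a in raw_args.split(",")] if raw_args else []
        return str(HANDLERS[name](*args))
    return CALL_RE.sub(dispatch, model_output)

print(run_tool_calls('Weather: :: get_current_weather("Poland/Warsaw") ::'))
# prints: Weather: (weather for Poland/Warsaw here)
```

A real runtime would also have to handle unknown tool names, malformed arguments, and feeding the handler's result back into the model's context, but the shape is the same: scan the output for the agreed-upon syntax and dispatch.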
nsonha•4h ago
Surprisingly many people say this. I usually ask them whether they have seen a non-toy product that works like that, because as far as I know, everything is tool calling.