Image generation is still a joke.
And if you ever want to snap yourself out of the AI hype bubble, ask any model to generate a simple electronic schematic.
What amazes me is that people still hold this bias that somehow LLMs can't do complex. I mean, languages are pretty complex. Understanding intent and responding accordingly in most languages, in most slangs / idioms / frfr nocaps, is quite complex in and of itself. And we can all agree that LLMs today get intent. If you ask for a poem you get a poem, if you ask for a piece of code you get a piece of code. It might not be the best poem, or the best code, but intent -> output is pretty much solved.
Then it amuses me that somehow devs think their "language" is complex. Again, we built these meta-languages precisely to simplify things. Vocabularies are limited, the set of things we can call into is limited, and everything is somewhat standardised and heavily documented along the way. To think this is somehow more complicated than free-form language is funny to me.
Then there's the tooling. It has finally caught up with the best you could do ~1 year ago with libraries and lots of glue. "Agentic" now is much easier. Working in an existing codebase is easier. Agents can now search (semantic, AST, etc), can read files, write in specific files, diff, use tools, and so on. Every layer helps.
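To make the tooling point concrete, here's a minimal sketch of the kind of tool loop agents run on: a registry of file/search tools that a model would invoke by name. All names here (`read_file`, `write_file`, `search`, `run_agent`) are illustrative, not any specific framework's API, and the "agent" is a hard-coded call sequence standing in for a model's decisions.

```python
import pathlib

# Hypothetical tool registry -- illustrative names, not a real framework's API.
def read_file(path: str) -> str:
    """Return the full text of a file."""
    return pathlib.Path(path).read_text()

def write_file(path: str, content: str) -> str:
    """Write content to a file and report what happened."""
    pathlib.Path(path).write_text(content)
    return f"wrote {len(content)} bytes to {path}"

def search(root: str, needle: str) -> list[str]:
    """Plain substring search over .py files, a stand-in for semantic/AST search."""
    return [str(p) for p in sorted(pathlib.Path(root).rglob("*.py"))
            if needle in p.read_text()]

TOOLS = {"read_file": read_file, "write_file": write_file, "search": search}

def run_agent(tool_calls):
    """Dispatch a scripted sequence of (tool_name, kwargs) calls.
    In a real agent the model emits these; here they're hard-coded."""
    return [TOOLS[name](**kwargs) for name, kwargs in tool_calls]
```

The real versions layer a model on top to choose the calls, plus diffing and sandboxing, but the dispatch skeleton is roughly this simple, which is part of why "agentic" got so much easier.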
I'm happy to see people changing their minds. I've talked to a lot of skeptics over the past year. I've done the "here, let me show you" with a couple of friends. They all eventually changed their minds. I think a lot of people have visceral reactions to any hype. And they overcompensate with denial. I've heard it before, I'll hear it again...
incomingpain•3h ago
I built the new one with Django. Prompting ChatGPT to give me the general code and then cleaning it up for my needs was sooooo good. I got it completed in a few days, which is insane.
It's absolutely insane if you don't use AI to code.