They’ll kill (figuratively) almost all of us. Maybe they already have, and people just haven’t realized it. LLMs are «zombifying» us.
Sorry to depress your Sunday. Thoughts?
jb_briant•6m ago
I have several agents running in parallel; they require input every 1 to 5 minutes.
I'm switching context all the time, and I have to hold a larger context in my brain's RAM instead of staying focused on a single topic.
I'm not writing code anymore for web development.
On the other hand, when doing game dev, with all the spatial geometry involved, LLMs are useless 97% of the time.