Like any tech business/product, LLMs rely on constant presence. Posting low-value "Look Ma What I Told the LLM to Do!" articles keeps them in the spotlight. Ignorance is bliss, however.
It's like Poochie on The Simpsons (might be too old a reference): everyone hated him, and Homer suggests that whenever Poochie's not around, everyone should be asking "Where's Poochie?" I hope these useless features suffer the same fate (Poochie dies).
I can filter out the dumb "Llama 3.4 changes everything" posts, but the dumb ads for ostensibly productivity-boosting features (that everyone knows aren't) are destroying my productivity.
See also: Microsoft Clippy
AI is still in its early stages and is already disrupting jobs and work. Many enterprises that were laggards in technology adoption are now embracing AI use cases. Every week breaks an assumption made the week before, and it's tiring to keep up. Perhaps there is some hype around agents and the extent to which they can deliver, but the big bet is that upcoming models will outperform their predecessors.
From a B2B perspective, agents are already making a difference in enterprises, so there's substance behind the hype.
Still, instead of getting tired of its constant presence in the public spotlight, I think it makes more sense to upgrade your inner filters, just as we unconsciously tune out other noise, because this is much more than hype. As Karpathy said, it will eat through both traditional software and neural networks. Imagine if, when he introduced the term Software 2.0 [1], someone had kept doing everything in C++ because they didn't want to "buy into the hype."
On a side note: I really like Karpathy's evolving-software concept, especially since these categories weren't just invented now to explain genAI's place. Rather, genAI has been designated Software 3.0, while his distinction between Software 1.0 and Software 2.0 was already drawn back in 2017.
1. Training costs continue to escalate much faster than the models improve.
2. The current paradigm of trained-ahead-of-time token completion turns out not to be a pathway to "AGI", or even to anything much more than what we have now. As this becomes clear, investment dries up.
3. Without investor support, users are forced to pay full price for LLM queries. A lot of things that are super useful while this stuff is subsidized up the wazoo aren't worth it at full price, and when forced to choose between giving up LLMs or ponying up more cash, people realize they can get by without them.
4. Everyone who was hyping LLMs moves on to the next shiny thing, and all money for "AI", even for useful stuff like old-fashioned machine learning, dries up. We end up somewhat backwards from where we started, except GPUs are cheap again, assuming the crash doesn't take down NVIDIA.
And also in the sense that it's hard to tell what we, as average people, should do to prepare for it.