Well, looks like they sorted em out!
Boy, if they only knew that 10 years later you don't even need to write tests anymore. It must feel like a sci-fi timeline if you warped one of these blog authors into our future.
Also, a look at how our expectations / goalposts are moving. In 2010, one of the first "presentations" given at DeepMind by Hassabis had a few slides on AGI (from the movie/documentary "The Thinking Game"):
Quote from Shane Legg: "Our mission was to build an AGI - an artificial general intelligence - and so that means that we need a system which is general: it doesn't learn to do one specific thing. That's a really key part of human intelligence, learning to do many, many things."
Quote from Hassabis: "So, what is our mission? We summarise it as <Build the world's first general learning machine>. So we always stress the words general and learning here - those are the key things."
And the key slide (the one that I think cements the difference between what AGI stood for then vs. now):
AI - one task vs. AGI - many tasks, at human-level intelligence.
----
I'm pretty sure that if we go by that definition, we're already there. I wish I had a magic time-traveling machine to put Legg and Hassabis in front of gemini2.5/o3/whatever top model today, trained on "next token prediction" and performing at so many different levels - gold at the IMO, gold at the IOI, playing chess, writing code, debugging code, "solving" NLP, etc. I'm curious if they'd think the same.
But the slow ramp-up - seeing small models get bigger, getting to play with GPT-2, then GPT-3, then ChatGPT - has, I think, changed our expectations and our views on what truly counts as AGI. And there's a bit of that famous quote in this: "AI is whatever hasn't been done yet"...
hinkley•1h ago
But if you perfected it, then it would also be the thing that actually kills software development. Because if I told you your whole job is now writing tests, you’d find another job.
nemomarx•52m ago