I think the current mentality of "Make every process in life as easy and time-efficient as possible" is the problem.
AI is just a tool. What someone does with it is up to them. The prevailing desire to do as little as possible, however, means people will abuse AI to separate themselves further from the work that enables them.
As technology progresses, people are less connected to the how and why of life. This leads to people not understanding how to do basic things. Nobody can do anything on their own and they have to pay money to someone for really basic stuff. People can hardly go grocery shopping anymore as it takes too much time. Peak capitalism?
Really, just watch Idiocracy. AI isn't the problem; people's desire to do as little as possible is the problem.
In practice, they just spend all that saved time scrolling tiktok.
YOUR job doesn’t pay you to be curious.
Well, you could say mine doesn’t either, literally, but the only reason I am in this role, and the driving force behind my major accomplishments in the last 10 years, has been my curiosity. It led me to do things nobody in my area had the (ability|foolishness) to do, and then it led me to develop enough improvements that things work really well now.
This is part of what good teaching is about! The most brilliant, engaged students will watch a lecture and think "wow, nice, I understand it now!" and as soon as they try to do the homework they realize there are all kinds of subtleties they didn't consider. That's why pedagogically well-crafted assignments are so important: they force students to really learn and guide them along the way.
But of course, all this is difficult and time-consuming, while having a "conversation" with a language model is quick and easy. It will even write you flowery compliments about how smart you are every time you ask a follow-up question!
Sort of the Feynman method but with an LLM rubber duck.
There are few things more annoying than a human that thinks it has the most accurate and up-to-date AI-level knowledge about everything.
Ten bucks says there will be a law enforcing exponential backoff, so that after a few questions you need to get good before the LLM delays things by an hour.
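Purely to illustrate the kind of backoff I mean, here's a toy sketch (Python; the ask_llm callable and the one-hour cap are made up, not any real provider's API):

    import time

    def rate_limited_answer(ask_llm, question, questions_so_far, base_delay=1.0):
        # Each additional question doubles the wait, capped at one hour.
        delay = min(base_delay * 2 ** questions_so_far, 3600)
        time.sleep(delay)
        return ask_llm(question)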
And please excuse my language. I probably watch George Carlin videos a bit too much.
> For example, a 2018 analysis by researchers at Northwestern University and the University of Oregon found that average IQ scores in the U.S. began declining slightly after 1995, particularly in younger generations. This reversal mirrors findings in several European countries, including Norway, Denmark, and the UK.
My favorite. Love the guy. Too bad he is dead.
"We may soon look back on this era of TikTok, Love Island and Zack Polanski as an age of dignity and restraint."
What are 20 PRs per day worth?
Engineers will literally burn the world if it means looking good for their employers.
The same thing is happening/will happen with AI. If you don't go through the hard brain work of thinking things up for yourself, especially writing, your writing skills will deteriorate. We'll see that on a giant scale as more and more kids lean on ChatGPT to "check their homework".
AI is too late to the party. Mission already accomplished.
As usual, it comes down to parenting. Bad parents will blame AI for their kids being stupid, just as they blame TikTok or whatever today.
If that sounded silly, it was exactly what they said would happen when that came to pass (I grew up in the last generation where they weren't allowed, and I know lots of folks younger than me that I think are smarter than I am).
If we are looking just at IQ scores, then there are literally billions of factors involved. It could be chemistry, nuclear radiation, malnutrition, stress, etc.
Most of that stuff is totally unpredictable, and we can only tell after the fact.
Alea jacta est. There's nothing we can do about it. The candy ain't going back into the piñata. We'll just have to see what happens.
AI is a superfast internet search.
Imagine if you had that growing up. Instant access to any information with a professorial level of teaching and you could ask any question to clear up any confusion?
Our kids are going to be smarter than we could even imagine, because they get instant access to any information they can imagine, taught by a perfect tutor.
But now they move even heavier stuff around with machines.
I think something similar might happen to our brains. Maybe we won't be able to work ourselves through every detail of a mathematical proof, of a software program, or a treatise on philosophy. But we'll be able to accomplish intellectual work that only really smart people could accomplish. I think this is what counts: outcome.
<< Fundamental skills like mental arithmetic, memorising text, or reading a map could soon be obsolete as cognitive offloading becomes a normal way of working.
Calculators, books, GPS -- all three have been trotted out each time, and some ( what passed for books in ancient days ) were decried by otherwise smart people who simply could not fathom a different way of solving an issue. Worse, they offered no reason for:
1. Why do I need to calculate everything in my head?
2. Why do I need to memorize every passage?
3. Why do I need to remember every step?
So kids, who saw an improvement, simply ignored the old men.. and good thing too. Otherwise, I might not even have been able to read Beowulf ( literally ).
<< it’s also the desire among people in positions of authority and influence
Is it? Recent news suggested that execs of various tech corps limit their kids' passive screen time ( so no doom scrolling, no social media ).
<< able to retain concentration so that we can learn and distinguish between what is real and what is AI slop
True, but in a sense that has always been true. If so, what is the real reason for this 'collection of words'?
<< The danger here is the separation of process from “product”. In the eyes of the utilitarian tech-evangelist, the essay is simply a product, a sequence of words to be generated as quickly as possible.
And here is the issue. The author is concerned that their words are no longer going to be special; note, not completely unlike certain monks upon learning about the printing press. How quaint.
<< But the process of writing is itself constitutive of understanding. Writing is thinking. It is the act of retrieving knowledge, wrestling with syntax, and organising logic that forges understanding.
Have you read some of the articles out there ( including this one )? There is no wrestling there. There might ( I am being charitable ) be some thinking, but if there is logic OR understanding, it does not go beyond what is required to serve the writer's owner. That is all there is to it.
<< When AI produces the final text, the student is the ventriloquist’s dummy, mouthing words that originated elsewhere.
Well, I will be darned. This individual is just taking the words out of my mouth, because I was about to say all those talking ( sorry, writing ) heads are just parroting one another, with the origin of the sound ( sorry again, word ) clearly not coming from them..
<< They possess the answer but lack the understanding of how it was derived
So.. we ban encyclopedias?
<< We are also witnessing a kind of cognitive laziness which some of our institutions are actively encouraging.
I can give him that. It does take effort not to rely on it.
<< It requires the uncomfortable sensation of not knowing
But... but.. the author knows.. he just told us all what to think...
<< float on a sea of algorithmic slop they have neither the will nor the wit to navigate.
And this is different from now how, exactly? Scale? Kids who want to read will read. Kids who want to learn will learn.
***
Honestly, I am hard-pressed not to say this article is slop. Not even proper AI slop like we would expect today ( edit: because at least that is entertaining ). This is lazy human slop. High and mighty, but based on 'old man yells at cloud' vibes.
spwa4•2h ago
Oh, and there's the extreme brain drain the West imposed on everyone else, from South Africa to China: it left those countries with no available "brains", let's say, while in the rich countries the only brains available aren't invested in making Westerners smart. Add to that a disdain, among the existing population, for professions that require brains.
NeutralCrane•1h ago
This is a huge assumption and not one I'm sure holds up. In my experience, gaining competence is often more a matter of hands-on experimentation and experience, and the thousands of pages of reference material are there to get you to the point where you can start getting hands-on experience, and to debug your experiments when they don't work. If AI can meaningfully cut back on that by more efficiently getting people to the experimentation stage, it absolutely will be more effective. And so far, in my limited experience, it seems extremely promising.
FrankyHollywood•1h ago
I think this is mostly about learning to think and develop grit.
As a kid, when I wanted to play a game I had to learn DOS commands, know how to troubleshoot a non-functioning Sound Blaster, etc. Sometimes it took me days to fix.
Doing this develops understanding of a domain, problem-solving skills and grit.
My kid just opens steam and everything works. Any question he has he asks AI. I am really curious what effect this will have on this generation. It is tempting to quickly say "they will be brain dead zombies" but that seems too simplistic.
In 20 years we'll know!
spwa4•1h ago
(... and then presumably he applies what the AI tells him, occasionally asking why)
Frankly, this is a much better and targeted way to learn. If this is what happens, great!
I mean, I'd give him an intro on how to pirate games, because:
1) it's a technical challenge with a built-in reward
2) AIs (especially Gemini, but increasingly ChatGPT too) refuse to help with it
So a truly excellent pursuit for learning!
But I do feel it's very different from what happens with smartphones, and that is desperately bad.
Fire-Dragon-DoL•54m ago
That sounds like a dedicated teacher though, not that bad?
Like, asking questions and learning how and what questions to ask is an amazing skill.