It's adjusted not just to give answers but (perhaps frustratingly for the student) to force them to iterate through something to get an answer.
Like anything, it's likely also jailbreakable, but as we've learned with all software, the defaults matter.
IMHO, feeling frustration is the whole point -- it's how our brains rewire themselves, it's how we know we are learning, and it's how we build up true grit to solve harder problems.
As we want to "feel the burn" in the gym, we want to "feel the frustration" when learning.
Just like we see posts here about how AI (at the very least, AI on its own) is ineffective at coding a product, these students eventually learn what the Wharton study showed: that AI is not effective at getting them the grade they want.
I know I'm lazy. I try shortcuts like AI, copying Wikipedia before that, hoping that just punching numbers into a TI-86 would solve my problems for me. They simply don't.
I am curious to dig into "Generative AI Can Harm Learning"[0], referenced in the article. I think the summary in the article skips over some of the subtleties in the abstract, though.
0: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4895486
> Our results suggest that students attempt to use GPT-4 as a "crutch" during practice problem sessions, and when successful, perform worse on their own. Thus, to maintain long-term productivity, we must be cautious when deploying generative AI to ensure humans continue to learn critical skills.
I think the caution is about using AI to short-circuit the real learning, even when the AI is in a tutor mode, which avoids building up true grit.
Ultimately, in writing this article, my hope was that a student would read it and get angry, angry that overuse of AI - using it as a crutch - is actually having a negative impact on their learning, and that they would resolve to use it only for more efficiency and effectiveness, not as a substitute for true learning.
I was thinking of Richard Feynman’s approach to learning when writing this article. He was a genius, so I didn't want the analogy to come across as unrelatable. However, he really enjoyed understanding first principles, and that enjoyment gave him such a solid foundation. He put in the necessary hours to learn, and what a remarkable life he enjoyed because of it.
I also agree with the title and its implications.
But hype is hype, and humans like to ride the hype.
ffdixon1•8mo ago
Quick dopamine hits. Immediate satisfaction. Long-term learning deficits.
How to break this cycle? I wrote this article to try to answer this question.
dtagames•8mo ago
I had the epiphany that all of the "AI's problems" were problems with my code or my understanding. This is my article[0] on that.
[0] https://levelup.gitconnected.com/mission-impossible-managing...
hackyhacky•8mo ago
Here's my experience as a professional educator: AI tools are used not as shortcuts in the learning process, but as a way to avoid the learning process entirely. The analogy is therefore not to junk food, but to a GLP-1 drug, insofar as it's something you take instead of food.
Students can easily use AI tools to write a programming project or an essay. It's basically impossible to detect. And they can pass classes and graduate without ever having had to attempt to learn any of the material. AI is already as capable as a university student.
The only solution is also hundreds of years old: in-person, proctored exams. On paper. And moreover, a willingness to fail those students who don't keep up their end of the bargain.
hackyhacky•8mo ago
I agree: they're great, if you have that luxury. But they don't scale.
danpalmer•8mo ago
In education today there's a lot of focus on knowledge and testing, and therefore it's fairly natural for AI to be used to just answer questions instead of as a learning aid. If we had a focus more on understanding, I'd hope that use of AI would be more exploratory, with more back and forth to help students learn in a way that works for them. After all, if LLMs are basically just text calculators, every student having a concept explained to them in exactly the language they need would be pretty amazing.
ffdixon1•8mo ago
For learning, I think having an Oreo cookie (using AI) is OK once in a while, especially if you're hitting a wall and can't get through, but it's, I think, a very steep and slippery slope that leads to avoiding the learning process altogether.
I remember as a co-op student spending three days solving a particularly subtle bug in a C-based word processor. My grit was rewarded. On day three, I vividly remember staring at the code and the solution just popped into my head. That was one of the most formative experiences of my early years as a developer, and the feeling of elation never left me. I worry that AI will take away these moments, especially early in one's career.
Our brains have not changed in hundreds of years, and I agree that the in-person experience is actually the best. Humans learn best from humans. I'm trying to learn French, and Duo has been sad for a few weeks due to my absence, but it's not having the same effect on me as it would if a human French teacher were sad with me.
Regarding failing students, I personally had to take summer school twice and still ended up failing grade 12 and repeating the entire school year. Why? I was too focused on computers and nothing else. In retrospect, taking summer school and repeating grade 12 actually helped me catch up at a time when the stakes were low. If I hadn't, I would have definitely failed later in life when the costs were higher.