The purpose of higher education should be to learn things that will be useful to you (most likely in a career). However, the current purpose is to gain a piece of paper which will mean your job application doesn't get immediately thrown out.
People being willing to spend so much time and money on university only to deliberately avoid learning or thinking by using AI to cheat on everything suggests that the system itself is broken.
These students don't actually want to be in university but feel they have to in order to have a chance at success in the current job market. We are in a prisoner's dilemma where everyone is getting degrees just to be a more appealing applicant than the next person. You might have authored a very impressive open-source library but still not get the junior software dev job because HR never gave your CV to the hiring manager, since you don't have a STEM degree and 50 other applicants did.
However, I don't really know how universities will evolve from this or what this new system will be. It seems hard to motivate a bunch of 18 year olds to actually want to learn stuff without dangling a piece of paper and exams at the end. Maybe that's just a symptom of all of the levels of education that come before university also dangling paper and exams. There were certainly parts of my degree I would have, at the time, liked to have skipped with AI, but now (older and wiser) I'm very glad I couldn't.
Education is not just "buying" a certification to open doors. This part I'm happy to get rid of.
> But those students aren't going to be using AI to skip all the learning.

The article and just about everyone in higher education right now are saying that a large number of students are doing exactly that. So there must be a large number of students who are primarily motivated by the piece of paper (and the job opportunity it provides).
That doesn't mean that they must be completely disinterested in their subject. They might have some lectures they really like and where they do the coursework properly. However, the epidemic of AI cheating speaks to the inefficiency created by the need for the piece of paper. If someone is essentially skipping 80% of the learning with AI then the job market requiring you to have a piece of paper is causing someone to waste 80% of their time and money. They would be better served by a short course teaching them only that 20% of skills they actually want.
The social side of things isn't something I was really addressing in this context. To me, that's a bonus of university. Given the cost, it doesn't seem worth going to university primarily for a social experience (unless you live somewhere where it's free). I also really hope that AI isn't affecting these social aspects.
That doesn't feel right. I thought several groups opposed the popularization of writing throughout history. Wasn't Socrates against writing because it would degrade your memory? Wasn't the church against the printing press because it allowed people to read in silence?
Sorry for going off-topic.
Again I think this is a pretty narrow theory that Hershock gets some good mileage out of for what he's looking at but isn't a great fit for understanding this issue. The extremely naive "tools are technologies we have already accepted the changes from" has about as much explanatory power here. But also again I'm not a philosopher or a big Hershock proponent so maybe I've misread him.
Technology is neutral; it's always been neutral and it always will be neutral. I quote Bertrand Russell on this almost every day:
“As long as war exists all new technology will be utilized for war”
You can abstract this away from “war” into anything that’s undesirable in society.
What people are dealing with now is the newest transformational technology, and they can watch how utilizing it inside the current structural and economic regime of the world accelerates the already embedded destructive nature of the structure and economic system we built.
I’m simply waiting for people to finally realize that, instead of blaming it on “AI” just like they’ve always blamed it on social media, TV, radio, electricity etc…
It's literally the oldest trope with respect to technology and humanity: some people will always blame the technology when in fact it's not the technology…it's the society that's the problem.
Society needs to look inward at how it victimizes itself through structural corrosion, not look for some outside person who is victimizing them
I agree with a lot of what you say here but not this. People choose what to make easy and what to make more difficult with technology all the time. This does not make something neutral. Obviously something as simple as a hammer is more neutral but this doesn't extend to software systems.
Right. People choose.
More specifically people with power direct what technologies get funded. How society chooses who is in power is the primary problem.
Because Peter Hershock is a real guy and talks about AI all the time. I don't know if he ever said those exact words, but it's reasonable to think he did.
This is such a disingenuous take on the article, there's nothing naive or simplistic about it, it's literally full of critical thought linking to more critical thought of other academic observers to what's happening at the educational level. The context in your reply implies you read at most the first 10% of the article.
The article flagged numerous issues with LLM application in the educational setting, including:

1) Critical thinking skills, brain connectivity and memory recall are falling as usage rises; students are turning into operators and are not getting the cognitive development they would through self-learning.

2) Employment pressures have turned universities into credentialing institutions vs learning institutions, and LLMs have accelerated these pressures significantly.

3) Cognitive development is being sacrificed, with long-term implications for students.

4) School admins are pushing LLM programs without consultation, as experiments instead of in partnership with faculty. Private-industry-style disruption.
The article does not oppose LLMs as learning assistants; it opposes them as the central tool for cognitive development, which is the opposite of what they accomplish. The author argues universities should be primarily for cognitive development.
> Successful students will grow and flourish with these developments, and institutions of higher learning ought to as well.
Might as well work at OpenAI marketing with bold statements like that.
The core premise is cognitive development of students is being impaired with long term implications for society without any care or thought by university admins and corporate operators.
It's disturbing when people comment on things they don't bother reading, literally aligning with the point the article is arguing, that critical thinking is decaying.
No one is misrepresenting your argument, it's well understood and being argued that it is false.
> students intellectual development is going to be impaired by AI because they can't be trusted to use it critically.
This debate is going nowhere so I'll end here. Your core premise is on trust and student autonomy, which is nonsense and not what the article tackles.
It argues LLMs literally don't facilitate cognitive brain development and can actually impair it, irrespective of how they are used, so it's malpractice for university admins to adopt them as a learning tool in a setting where the primary goal should be cognitive development.
Students are free to do as they please; it's their brain, money and life. Though I've never heard anyone argue they were at their wisest in their teens and twenties as a student, so the argument that students should be left unguided is also nonsense.
I’m not advocating for completely restricting access to AI for certain age groups. I’m pointing out that historically we have restricted prolonged interactions with certain stimuli that have shown to be damaging to cognitive development, and that we should make the same considerations here.
I think it’s hard to deny that younger generations have been negatively affected by the proliferation of social media engineered around fundamentally predatory algorithms. As have the older generations.
Seriously, you’re arguing with people who have severe mental illness. One loon downthread genuinely thinks this will transform these students into “geniuses”
https://www.rxjourney.net/how-artificial-intelligence-ai-is-...
Of course, they're also cheating themselves out of an education, but few students have that Big Picture at their age.
Also women's and gender studies degrees were already a scam unless you have a trust fund.
It does seem like in-person pen-and-paper exams would hold the line pretty firmly with respect to competence. It's a simple solution and I haven't heard any good arguments against it.
> This isn’t innovation—it’s institutional auto-cannibalism. The new mission statement? Optimization.
Pen-and-paper exams only. No take-home essays or assignments. Assessments done in person under supervision. No devices in class. Heavily reduced remote learning or online coursework. Coursework redesigned so that any out-of-class work is explicitly AI-collaborative. Frequent low-stakes in-class writing to verify student voice and baseline ability. And when resources permit have oral exams and presentations as a means of assessment.
We did this for decades when tuition was a fraction of today's cost. Any argument that we can't return to basics is bollocks.
If you're trying to hawk education for $$$$$$, probably need to offer some actual human instruction, not Zoom and Discord sessions that anyone could run from their bedroom.
If they can't, then the rot and capture really is as bad as this makes out, and to update Will Hunting: the kids might as well save $150k and get their learning for $20/month on ChatGPT.
No, they really weren't. These were state schools in 1970s eastern Europe. No tuition, and neither parent was from a privileged background.
That didn't universally equate to privilege in a class or wealth sense for a number of countries.
eg: https://www.whitlam.org/whitlam-legacy-education
was the system I was educated under, when I took orals it was a result of being a scruffy kid that wore no shoes but passed general high school and math talent exams better than all but three others my age in the state.
( For interest, the three that ranked higher than myself that year in Tertiary admissions exams were all educated in expensive private schools in the capital city - I got by on School of the Air, a bunch of books and a few years at a smallish remote high school in far north W.Australia
* https://www.aades.edu.au/members/wa
1970's ham radio running off truck batteries - pre internet for that area, although we did experiment with text over phone line and packet radio.
* https://en.wikipedia.org/wiki/Prestel
)
In an effort to standardize European systems many courses are trying to get rid of them because foreign students are particularly weak in an oral defense.
Turns out we were right for once :D
Is something like that so hard to do ?
Catholic education and theology college?
* https://en.wikipedia.org/wiki/University_of_Paris#Origins
* https://en.wikipedia.org/wiki/University_of_Oxford#Founding
Bindoon: * https://kelsolawyers.com/au/paedophile_offenders/brother-kea...
Castledare: * https://www.bbc.com/news/uk-39131761
Clontarf: * https://www.findandconnect.gov.au/entity/clontarf/
St Augustine’s / Marcellin College : * https://kelsolawyers.com/au/paedophile_offenders/father-terr...
Not just Catholic schools, of course - but they certainly swept the boards and came first by a good margin: https://www.childabuseroyalcommission.gov.au/case-studies
They were very good at keeping it under wraps for decades though.
it’s not gonna be any different this time
Here is a UK .gov study of 21 schools, colleges, academies, universities and technical colleges who have adopted AI.
https://www.gov.uk/government/publications/ai-in-schools-and...
> The majority of education providers have yet to adopt AI. Further research with those yet to adopt AI, and/or who are not considering using it, would help us understand better the barriers to more widespread use of AI across different types and phases of education.

> For example, the assistant headteacher of one school said the top categories their teachers used in their AI tool were ‘help me write’, ‘slideshow’, ‘model a text’, ‘adapt a text’, ‘lesson plan’ and ‘resource generation’.
> As the AI champion told us: “If you put junk in, you’ll get junk out.”
and if you put gold in, you'll still get junk out