https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...
Where the actual news is:
> To this end, the trustees have delegated authority to the provost, working with deans of all academic colleges, to develop and to review and update continuously, discipline-specific criteria and proficiency standards for a new campuswide “artificial intelligence working competency” graduation requirement for all Purdue main campus students, starting with new beginners in fall 2026.
So the Purdue trustees have "delegated authority" to people at the University to make a new graduation requirement for 2026.
Who knows what will be in the final.
After more than a trillion dollars spent, LLMs can replace: (a) a new secretary with one week of experience, (b) a junior programmer who just learned that they can install programs on a desktop computer, and (c) James Patterson.
That's the bright future that Purdue is preparing its students for.
Yes, AIs will be a huge thing...eventually...but LLMs are not AI, and they never will be.
Then future generations who like old school systems hacking will be able to pair program with Justine AI.
AI/ML isn't going to completely reshape the world, but basic prompt engineering, validating against hallucinations, and knowing the difference between ChatGPT and GPT-4o are valuable skills for people who do not have a software background.
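To make "validate against hallucinations" concrete, the cheapest version of that skill is just checking the model's output against a source. A toy Python sketch (the lexical-overlap heuristic and the 0.5 threshold are arbitrary choices for illustration, not any real tool):

    import re

    def grounding_check(source: str, answer: str, threshold: float = 0.5):
        """Flag answer sentences with little word overlap against the source."""
        source_words = set(re.findall(r"[a-z']+", source.lower()))
        flagged = []
        for sentence in re.split(r"(?<=[.!?])\s+", answer.strip()):
            words = set(re.findall(r"[a-z']+", sentence.lower()))
            if not words:
                continue
            overlap = len(words & source_words) / len(words)
            if overlap < threshold:  # mostly words the source never used
                flagged.append((round(overlap, 2), sentence))
        return flagged

    source = "Purdue will add an AI working competency requirement in fall 2026."
    answer = ("Purdue adds an AI competency requirement in fall 2026. "
              "It also guarantees every graduate a six-figure salary.")
    for score, sentence in grounding_check(source, answer):
        print(f"possible hallucination ({score}): {sentence}")

Real verification is far harder than word overlap, but the habit of checking output against a source is the part that's actually teachable.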
Gaining any kind of knowledge is a net win.
However, there's no reason to think any particular trick will still be relevant even a year from now. As LLMs get better, why wouldn't we just have them auto-rewrite prompts using the appropriate prompt-engineering tricks themselves?
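And that loop is already trivial to wire up. A minimal sketch, assuming the openai Python package (v1.x) and an OPENAI_API_KEY in the environment; the rewriting instructions are just an illustration of "appropriate tricks":

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def ask(prompt: str, model: str = "gpt-4o") -> str:
        # Pass 1: have the model rewrite the prompt using standard
        # prompt-engineering advice (explicit task, format, constraints).
        rewritten = client.chat.completions.create(
            model=model,
            messages=[
                {"role": "system",
                 "content": "Rewrite the user's prompt to be clearer: state "
                            "the task, the desired output format, and any "
                            "constraints. Return only the rewritten prompt."},
                {"role": "user", "content": prompt},
            ],
        ).choices[0].message.content
        # Pass 2: answer the rewritten prompt.
        return client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": rewritten}],
        ).choices[0].message.content

If the rewriting pass improves along with the models, the tricks live inside the loop instead of in anyone's head, which is the point.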
And I just know this is going to turn into a (pearl-clutching) AI Ethics course...
But I like to think that actually learning the history was important and it certainly was a diversion from math/chemistry/physics. I liked Shakespeare, so reading the plays was also worthwhile and discussing them in class was fun. Yeah, I was bored to tears in medieval history, so AI could have helped there.
If you're going to try to fake being able to write, better to try to dupe any other professor than a professor of English. (source: raised by English majors)
Why do you think it wouldn't do the same for other fields? The purpose of writing essays in school is never to have the finished product; it's to learn and analyze the topic of the essay and/or to go through the process of writing and editing it.
Part of this is very reasonable; AI is upending how students learn (or cheat), so adding a requirement that teaches them to use it in a way that improves learning rather than just enhancing cheating makes sense. The problem with the broad, top-down approach is that it looks like what happens in Corporate America when there's a CEO edict that "we need a ____ strategy," and every department pivots its projects to include that, whether or not it makes sense.
"all as informed by evolving workforce and employer needs"
“At the same time, it’s absolutely imperative that a requirement like this is well informed by continual input from industry partners and employers more broadly."
Purdue is engaging in the oldest profession in the world. And the students pay for this BS.
This is not remotely the kind of thing that a school should be making a requirement at this time. The technology is changing way too fast to be sure that even basic, fundamental skills related to it will remain relevant for as long as 4-5 years.
For the same reason that elementary schools don't allow calculators in math exams.
You first need to understand how to do the thing yourself.
andy99•2h ago
I thought Purdue was a good school; these kinds of gimmicks are usually the province of low-tier universities trying to get attention.
turtleyacht•2h ago
Professors can tailor lectures to narrower topics or to advanced, current, or more specialized subjects. There may be less need for a series of beginning or introductory courses--it's assumed learners will avail themselves of AI for the basics.
Pessimistically, AI literacy contributes to further erosion of critical thinking, lazy auto-grading, and inability to construct book-length arguments.
basch•6m ago
it's not unrealistic to be selecting for people with strong language skills and the ability to break tasks into discrete components and assemble them into a process. or the skill of being able to define what they do not know.
a lot of what makes a person good with an llm makes them also good at general problem solving.