Here is a wiki article with all common tell-tales of AI writing: https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing
There are no clear signs, at least not from anyone who cares to hide them.
It's how it was with the internet. I grew up in the 90s, and teachers didn't know how to deal with the fact that we no longer had to go through multiple books in the library to find the information we needed. We barely even needed to write it down.
Now nobody expects students not to use the internet. Same here: teachers must accept that AI can and will write papers, answer questions, and do homework. How you test students must be reinvented.
I'm sorry but, lmao. You cannot be serious.
> attribute AI
Oh no!
> still probably just results in kids handwriting AI generated slop
Not if they're doing it in person. And at least they then need to look at it.
At some point they will have to make a profit, and that will shape AI. Either through higher prices or through ads, and both will change how AI is used.
What I don’t get is why they wouldn’t act like an editor and add their own voice to the writing. The heavy lifting is done; now you just have to polish it by hand. Is that too hard to do?
The goals of academic assessment need to change. What are they assessing, and why? Knowledge retention? Knowledge correlation, or knowledge expression skills? None of these is going to be useful or required from humans, just as school kids are now allowed to use calculators in the exam hall.
The academic industry needs to redefine its purpose: identify the human abilities that will be needed in a future filled with AI and devices, then teach and assess those.
Memorization has a place, and is a requirement for having a large enough knowledge base that you can start synthesizing from different sources and determining when one source is saying something that is contradicted by what should be common knowledge.
Unless your vision of the future is the humans in WALL-E sitting in chairs while watching screens without ever producing anything, you should care about education.
I've been wondering lately if one of the good things to come out of heavy LLM use will be a return to mostly in-person interactions once nothing that happens online is trustworthy anymore.
I would like to hire students who actually have skills and know their material. Or even better, if AI is actually the amazing learning tool many claim then it should enhance their learning and as a result help them succeed in tests without any AI assistance. If they can't, then clearly AI was a detriment to them and their learning and they lack the ability to think critically about their own abilities.
If everyone is supposed to use AI anyway, why should I ever prefer a candidate who is not able to do anything without AI assistance over someone who can? And if you hold the actual opinion that proper ai-independent knowledge is not required, then why should I hire a student at all instead of buying software solutions from AI companies?
ashleyn•1h ago
* grep to remove em dashes and emojis
* re-run through another LLM with a prompt to remove excessive sycophancy and invalid URL citations
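The first step above can be sketched in a few lines. This is a toy pass, assuming a plain-text input; the emoji ranges below are a rough subset, not an exhaustive list, and the second (LLM rewrite) step is not shown.

```python
import re

# Rough subset of emoji code-point ranges; a real filter would cover more.
EMOJI = re.compile(
    "[\U0001F300-\U0001FAFF\U00002600-\U000027BF\U0001F1E6-\U0001F1FF]"
)

def scrub(text: str) -> str:
    """Replace em dashes with plain hyphens and drop emoji."""
    text = text.replace("\u2014", "-")  # em dash -> hyphen
    return EMOJI.sub("", text)
```

Calling `scrub("a\u2014b")` yields `"a-b"`; emoji characters in the covered ranges are simply deleted.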
dbg31415•1h ago
Ha. Every time an AI passionately agrees with me after I’ve given it criticism, I’m 10x more skeptical of the quality of the work.
the_fall•1h ago
The "humanizer" filters will typically just use an LLM prompted to rewrite the text in another voice (which can be as simple as "you're a person in <profession X> from <region Y> who prefers to write tersely"), or specifically flag the problematic word sequences and ask an LLM to rephrase.
They most certainly don't improve the "correctness" and don't verify references, though.
emmp•1h ago
What AI detectors have largely done is try to formalize that intuition. They work pretty well against simple adversaries (basically, the laziest students), but a more sophisticated user will make first, second, and third passes to change the voice.
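"Formalizing the intuition" can be illustrated with a toy scorer that counts surface tell-tales. The phrase list and threshold here are invented for demonstration; real detectors rely on statistical or model-based features, not word lists, which is exactly why multiple rewrite passes defeat them.

```python
# Hypothetical tell-tale markers (including the em dash); purely illustrative.
TELLTALES = ["delve", "tapestry", "it's important to note", "\u2014"]

def telltale_score(text: str) -> int:
    """Count occurrences of the tell-tale markers in the text."""
    lower = text.lower()
    return sum(lower.count(t) for t in TELLTALES)

def looks_ai_generated(text: str, threshold: int = 3) -> bool:
    """Flag text whose marker count meets an arbitrary threshold."""
    return telltale_score(text) >= threshold
```

A "humanizer" pass that rephrases the flagged word sequences drives the score back under the threshold, which is why such detectors only catch the lazy case.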