Look how much power lies in the hands of the people who sit between petroleum in the ground and its combustion. It's a whole waterfall, and the majority of the "wealth" in society seems to consist of people spinning their wheels while siphoning from it. And now they're terrified it'll go away.
The AI "gold rush" really has this feeling. "How can I get my finger in the pie somewhere here?"
"All that is solid melts into air"
Given the performance of open weight models to date it looks as though that might prove fairly difficult in the medium to long term.
Only to have a machine ingest, compress, and reiterate your work indefinitely without attribution.
What consideration do you choose to afford to those feelings?
> Only to have a machine ingest, compress, and reiterate your work indefinitely without attribution.
Further facilitating millions, or even billions, of other people to discover new ideas and create new things. It's not hard to see the benefit of that. I get that the purpose of IP laws is psychological, rather than moral. A culture where people feel as though they can personally benefit from their work is going to have higher technological and scientific output, which is certainly good, even if the means of producing that good are sort of artificial and selfish.
It's not hard to imagine, or maybe dream of, a world where the motivation for research and development is not just personal gain. But we have to work with the world we have, not the world we want, don't we...
Nobody will starve themselves, even if doing so will feed hundreds of others.
Neither. They are purely economic. You even acknowledge this when you call out personal benefit.
The stated intent is to facilitate creators realizing economic benefits from time spent creating. The reality is that large corporations end up rent seeking using our shared cultural artifacts. Both impacts are economic in nature.
The economic benefit is derived from a psychological effect: the expectation of personal gain.
The economy as a whole benefits from technological progress. The technological progress is fueled by each individual's expectation of personal gain. The personal gain is created via IP law.
There's a psychological component regarding trust. Either that your employer would never try to cheat you or alternatively that your employer is the sort that might try to cheat you but won't thanks to our society's various systems. But the showing up to work itself is a simple exchange of time and skill for money.
I don't think you can say that AI-written text can be reliably detected. Turnitin is only ~90% effective: https://teaching.temple.edu/sites/teaching/files/media/docum...
I mean, the solution is just in-class-only essays, right? Or to stop with the weird obsession with testing and just focus on actually teaching.
Them failing exams because they don't do the work is on them.
The linked article breaks it down. The measured false positive rate is essentially 0 in this small study.
I know that's what they wrote, but I heavily disagree. It got 28/30 (93%) correct, but out of the two it got "wrong":
- one was just straight up not rated because the file format was odd or something
- the other got rated as 11% AI-written, which imo is very low. I think teachers would consider this as "human-written", as when I was being evaluated with Turnitin that percentage of "plagiarism" detected would have simply been ignored.
The linked article analyzes their data in more detail. In particular, the measured false positive rate is essentially 0 in this small study.
The false positive rate is 0. The tool *never* says human writing is AI.
I’m surprised to see these comments in conjunction; 90% is pretty good, and much higher than I expected. I wonder what the breakdown of false positives/false negatives is.
Edit: from the linked paper
> Of the 90 samples in which AI was used, it correctly identified 77 of them as having >1% AI generated text, an 86% success rate. The fact that the tool is more accurate in identifying human-generated text than AI-generated text is by design. The company realized that users would be unwilling to use a tool that produced significant numbers of false positives, so they “tuned” the tool to give human writers the benefit of the doubt.
This all seems exceptionally reasonable. Of the samples with AI, they correctly identify 86%. Of the samples without AI, they correctly identify a higher proportion, because of the nature of their service. This implies that if they _wanted_ to make a more balanced AI detection tool, they could get that 86% somewhat higher.
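To make that tuning trade-off concrete, here is a minimal sketch in Python with made-up score distributions (the numbers are assumptions for illustration, not Turnitin's actual scores): lowering the decision threshold flags more AI-assisted text (higher sensitivity) at the cost of flagging more human text (lower specificity).

```python
# Toy threshold sweep over hypothetical detector scores, purely illustrative.
import random

random.seed(0)
human_scores = [random.gauss(0.10, 0.08) for _ in range(1000)]  # hypothetical scores for human essays
ai_scores = [random.gauss(0.60, 0.20) for _ in range(1000)]     # hypothetical scores for AI-assisted essays

for threshold in (0.5, 0.3, 0.1):
    sensitivity = sum(s >= threshold for s in ai_scores) / len(ai_scores)       # AI essays flagged
    specificity = sum(s < threshold for s in human_scores) / len(human_scores)  # human essays cleared
    print(f"threshold {threshold:.1f}: sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
```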
Edit: thanks for doing so
What standard of proof is appropriate to expel someone from college? After they've taken on, say, $40,000 of debt to attend?
Assuming you had a class of 100 students, "90% effective" would mean expelling 10 students wrongly - personally I'd expect a higher standard of proof.
Anyone who gives 10 seconds of thought to how this could help realizes that at 90% it’s a helpful first pass. Motivated students who really want to hide can probably squeak past more often than you’d like. And you know there will be false positives, so you do something like:

* review those more carefully, or send them to a TA if you have one
* keep track of patterns of positives from each student over time
* explain to the student that it got flagged, say it’s likely a false positive, and have them talk over the paper in person
I’m sure decent educators can figure out how to use a tool like that. The bad ones are going to cause stochastic headaches for their students regardless.
Tests can be wrong in two different ways: false positives and false negatives.
The 90% figure (which people keep rounding up from 86% for some reason, so I'll use that number from now on) is the sensitivity, or the ability to avoid false negatives. If there are 100 cheaters, the test will catch 86 of them, and 14 will get away with it.
The test's false positive rate, how often it says "AI" when there isn't any AI, is 0%; equivalently, the test's "specificity" is 100%.
> Turnitin correctly identified 28 of 30 samples in this category, or 93%. One sample was rated incorrectly as 11% AI-generated[8], and another sample was not able to be rated.
The worst that would have happened according to this test is that one student out of 30 would be suspected of AI generating a single sentence of their paper. None of the human authored essays were flagged as likely AI generated.
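A minimal sketch of those two error rates in Python, using the study's approximate counts (90 AI-assisted samples with 77 flagged, 30 fully human samples with none flagged as mostly AI; treating the one oddly-rated human sample as "not flagged" is an assumption):

```python
# Confusion-matrix counts taken (approximately) from the linked study.
true_positives = 77    # AI-assisted essays correctly flagged
false_negatives = 13   # AI-assisted essays the tool missed
true_negatives = 30    # human essays correctly left alone
false_positives = 0    # human essays wrongly flagged as mostly AI

sensitivity = true_positives / (true_positives + false_negatives)           # ~0.86: share of cheaters caught
specificity = true_negatives / (true_negatives + false_positives)           # 1.00: share of honest essays cleared
false_positive_rate = false_positives / (false_positives + true_negatives)  # 0.00

print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, "
      f"false positive rate {false_positive_rate:.0%}")
```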
There are people whose style is closer to AI's; that doesn't mean they used AI. And sometimes AI outputs text that looks like something a human would write.
There is also the mixed case: if I write two pages and use two sentences from AI (because I was tired and couldn't find the right sentence), I may be flagged for using AI. Even worse, if I ask AI for advice and then rewrite it myself, what should the output be? I can make a case that either answer (AI-written or not AI-written) would be wrong.
None of these tools are binary. They give a percentage score, a confidence score, or both.
If you include one AI sentence in a 100-sentence essay, your essay will be flagged as 1% AI and nobody will bat an eye.
Of course the sample size is fairly small, I would want a larger scale study to see if the false positive rate is actually 5%, or 1%, 0.1%, 0.000001%, etc.
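For a rough sense of what 0 flags out of 30 human samples can and can't rule out, here's a minimal sketch (Python, assuming independent samples; the "rule of three" gives roughly 3/n as the 95% upper bound):

```python
# Upper bound on the false positive rate consistent with observing zero
# flags among n human-written samples: the largest p with (1 - p)**n >= 1 - confidence.
def fpr_upper_bound(n_human_samples: int, confidence: float = 0.95) -> float:
    return 1 - (1 - confidence) ** (1 / n_human_samples)

print(f"{fpr_upper_bound(30):.1%}")    # ~9.5%: 0/30 can't distinguish 0.1% from 5%
print(f"{fpr_upper_bound(3000):.2%}")  # ~0.10%: a much larger study could
```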
Then if there's a high probability, I look through the references in the paper. Do they say what the student attributes to them?
Finally, if I still think it's AI-generated, I have the student in and ask questions about the paper. "You said this here in this paragraph -- what do you mean by that?"
AI detectors are a first-pass, but I think a human really needs to be in the loop to evaluate whether it's cheating, or just using something to clean up grammar and spelling.
In an educational context, the only purpose of the writing has traditionally been learning, and the purpose of turning it in has been to prove that the learning took place. Both of those are out the window now. Classroom discussion and oral presentations might be the only place you can still prove learning took place. Until everybody gets hidden AI-powered earpieces of course.
No it isn't. Stop.
The cynical part of me says that the people who share this link with that summary are the cheaters trying to avoid getting caught, on the basis of the fact that they are patently abusing the numbers presumably because they didn't pay attention in math class.
The tests are 90% SENSITIVE. That means that of 100 AI cheaters, 10 won't be caught.
The paper you linked says the tests are 100% SPECIFIC. That means they will *never* flag a human-written paper as mostly AI.
On some level, human output in an academic setting is expected to be formulaic in exactly the way AI-generated text is.
Which often could lead to false positives.
- Exact reuse of a long-ish word sequence without credit -> not cool.
- Complete/partial reinterpretation of an already existing story in different words -> it's fine.
- Traced/almost identical image/drawing/painting (with the intent to fool someone) -> not cool.
- Visual imitation in style/content but different approach or usage -> it's fine.
I think people are too attached to the novelty of something. Sure, if I write a bunch of words and you repeat them as yours, that's not cool. But if something I make inspires someone and they take it, reframe it, rephrase it, or whatever, go ahead.
People adore Star Wars, which is an absolute one-to-one of the hero's journey, and it still has value. Most modern fantasy is basically fanfic of Middle-earth, and it's still good that it exists.
Imagine someone just spamming sequences of notes at random for their whole life; does that mean they own anything else made afterwards, plus 70/80/90... years?
If I restate something using completely my own words, I'm still supposed to cite the source where I got the idea.
If something is completely my own invention, and I didn't use any sources to create it, then that's original and I don't need to credit anyone else. But that's very rare.
Certainly there are styles and broad arcs that many creations follow that are not directly attributable to a specific source.
In the end hard lines are very hard.
Producing something entirely novel in an act of pure creativity is essentially a tall tale - like Newton and the Apple - possibly some truth to it, but definitely mythologized.
Yeah, but the Hero's Journey is not a literal story; it's more of a framework, described in a book called "The Hero With a Thousand Faces", for what makes a story interesting and how various original stories like myths, folklore, etc. (and the Bible) have always followed the same pattern.
The author dissected that pattern, and it has since been followed by many writers and creators as what is considered a good model of a story. Screenwriting classes literally teach it, along with other frameworks like the three-act structure.
And if you really look into it, almost all good stories follow that pattern to some extent, but it is the implementation that makes each story special.
It's a bit like saying "People adore [x] webapp, which is an absolute one-to-one of React; it still has value," but the two are fundamentally different things.
Non transformative use -> Not cool.
Transformative -> it's fine
Original work attempting to deceive or confuse by passing off its origin as being by another -> not cool
Original work emulating the style of another without attempting to imply involvement of the other -> it's fine.
Of Mans First Disobedience, and the Fruit Of that Forbidden Tree, whose mortal taste…
Ummm, excuse me. This is literally the garden of Eden. In fact this idiot plagiarizes the name too. He actually calls this Eden. wtf. Fake as fuck. And people call this copy-paste artist who cites literally zero of his sources a “poet”.
-- me
I don't think so.
geist67•1h ago
Every generation throughout time has had the right to recreate the legacy of human thought through the filter of their own times.
“Cultural appropriation” and other knock-off terms are objectively a part of every creative and functional cycle.
Give credit where credit is due, yet once let into the world a thought becomes a part of such wilds.
asmor•1h ago
Not to mention that when it comes to art, I'd rather consume something that someone deemed important and interesting enough to dedicate skill and time to.
pixl97•46m ago
The problem is that a system of strong copyright laws isn't going to fix this, and from everything we've seen it's making it worse.
derektank•31m ago
Demands? Almost everything we do? I only spend 40-50 hours a week max doing labor that anybody would reasonably describe as being commercially exploited. No one’s broken down my door demanding I start making money on the visual novel I’m drafting in Ren’Py on the weekends, nor have I been castigated by my peers for throwing a party without charging an entrance fee.
coldtea•1h ago
And everything wasn't "content", nor did they have massive numbers of influencers and public content creators, nor was there a push even for laymen to churn out heaps of text every day or to project an image to the whole world.
And until recently if you got caught plagiarizing you were shamed or even fired from journalism. Now it's just business as usual...
pessimizer•1h ago
You'd think it was more complicated than that if the people who were doing a caricature of you had enslaved and murdered your family, and lived in the house your family built while you lived on the street.
It doesn't matter, because culture works how it works (and is often used as a political tool), and somehow world culture ends up being people pretending to be Americans pretending to be the descendants of American slaves. But it's undeniably ugly.
MetaWhirledPeas•1h ago
It's an understandable position for these reasons:
- We like art and we want to show our support and appreciation for art
- The most straightforward way to show support and appreciation for art is to give the artist money
- Much of the art we appreciate was only possible due to the promise of monetary gain on the part of the artist
But there are some old, unavoidable questions:
- At what point does the pursuit of monetary gain begin to diminish one's own artistic expression?
- At what point does the pursuit of monetary gain begin to diminish other people's artistic expression?
As you point out, there is no art without appropriation and re-creation.
And now there are some new, unavoidable facts:
- Appropriation is becoming easier
- Attribution is therefore becoming more difficult
- Compensation is therefore becoming more difficult
- Rewinding the clock is impossible
The only way out of this would be for humanity to collectively take a puritanical stance on art, where any form of appropriation is demonized. I think this would make art suck.
mistrial9•57m ago
My deck BBQ caught on fire: a problem... versus the 35,000 hectares next to my house being on fire with 20-meter-tall flames.
Is "appropriation" now "easier"? For whom? At what scale to deliver, and at what scale to ingest?
bryanrasmussen•22m ago
but it is quite notorious that people don't actually like acting on that point, especially, I just have to point it out, here on HN. So...
At what point does the inability to achieve monetary gain begin to diminish artistic expression?
mvdtnz•1h ago
Cultural appropriation was a term popularised in the heady days of woke excesses when white liberals were desperate to find reasons to be mad at one another for perceived impurity. It's a ludicrous concept from top to bottom.
Intellectual property laws, in my opinion, have a place in our society.