Seems like a missed bit of PR for their community
This rush to use LLMs in everything could be undone by a single court decision, and the sensible thing is to isolate yourself from it as much as you can.
Really interesting insight
Right... there has been ample code and visual art around to copy for decades, people have copied it, they've gotten away with it, and nothing bad has happened. So where are the "millions of coming copyright claims" now?
I don't think what you're talking about has anything to do with killing OpenAI; there's no single court decision that bears on any of this.
Some genres of music make heavy use of 'samples' - tiny snippets of other recordings, often sub-5-seconds. Always a tiny fraction of the original piece, always chopped up, distorted and rearranged.
And yet sampling isn't fair use - the artists have to license every single sample individually. People who release successful records with unlicensed samples can get sued, and end up having to pay out for the samples that contributed to their successful record.
On the other hand, if an artist likes a drum break but instead of sampling it they pay another drummer to re-create it as closely as possible - that's 100% legal, no more copyright issue.
Hypothetically, one could imagine a world where the same logic applies to generative AI - that art generated by an AI trained on Studio Ghibli art is a derivative work the same way a song with unlicensed drum samples is.
I think it's extremely unlikely the US will go in that direction, simply because the likes of nvidia have so much money. But I can see why a cautious organisation might want to wait and see.
> in Its Content or Designs
Personally: I'm a developer, so my situation is different. But right now I use AI code completion and Claude Code. I think I'd be fine without Claude Code, since it hasn't "clicked" for me yet; I find it motivating, particularly for new features and boilerplate, but often (even with the boilerplate) I have to rewrite a lot of what it generates. Losing code completion would be harder, but maybe if the work was interesting and non-boilerplate enough I'd manage.
I've heard Claude Code has improved a lot very recently, so I would feel completely left behind without it, except that I can use it in my spare time on personal projects. But if it keeps improving and/or ends up "clicking", then I may feel like I'm spinning my wheels at work.
I don't do much with crypto/NFTs/AI, because I don't find any of it useful yet. But I get so much "with us or against us" heat for not being zealously against the idea of them. It was NFTs, NFTs, NFTs at the table for months until it became AI, AI, AI. My preference is to talk about something else while playing board games.
One thing I've found when talking to non-technical board gamers about AI is that while they’re 100% against using AI to generate art or game design, when you ask them about using AI tools to build software or websites the response is almost always something like "Programmers are expensive, I can't afford that. If I can use AI to cut programmers out of the process I'm going to do it."
A minority are conflicted about this position.
When I talk to technical people at game nights we almost never talk about tech. The one time our programmers all played RoboRally the night kind of died because it felt too close to work for a Saturday night.
If GW was going to use AI they would probably start with sprue layouts. Maybe the AI could number the bits in a sane way? I would be for that.
> Games Workshop elects not to experience multi-year headache. Will use AI when profitable.
Indeed, companies will always start using something if it makes financial sense for them.
> One thing I've found when talking to non-technical board gamers about AI is that while they're 100% against using AI to generate art or game design, when you ask them about using AI tools to build software or websites the response is almost always something like "Programmers are expensive, I can't afford that. If I can use AI to cut programmers out of the process I'm going to do it."
This is because they don't view programming as a "creative" form of labor. I think this is an incorrect view, but this knowledge is at least useful in weighting their opinions.
The most interesting observation is that regardless of how "anti-AI" most people seem to be, it isn't that deep of an opinion. Their stated preference is they don't want any AI anywhere, but their revealed preference is they'll continue to spend money as long as the product is good. Most products produced with AI, however, are still crap.
I agree that this is often the case. I still see Games Workshop as an exception. They could have moved plastic production to a cheaper region (e.g. China), but they haven't done so. Financials are obviously important to them, but they're being very careful and thoughtful about their actions. This AI ban is just another showcase of that.
How can you go and generalize about these people, calling them idiots (that's what "isn't that deep of an opinion" amounts to, even if you don't say it), and then breathlessly engage in the exact same rhetoric?
I actively don’t use AI because the results are unreliable or ugly. I’m just not against AI in principle. It’s funny that my position is considered contemptible by people who regularly use AI but are hardliners against it on moral grounds.
Remember when everything wasn’t a religious war? Actually, I don’t. It was always like this and it’s always going to be like this. Just one forever crusade after another.
Most people didn't choose to be part of your moon shot death cult. Only the people at the tippy top of the pyramid get golden parachutes off Musk's exploding rocket.
They never changed their position, corpos shouldn't get any money! That's always been the position. They are inherently unethical meat grinders.
You do not mention the perception of asymmetric legal and market power. Many people think that file sharing Disney movies is ok, but Google scraping the art of independent artists to create AI is not ok. That is not the same dynamic at all as not caring about copyright, and then suddenly caring about copyright.
Three things:
1. People simply don't respect programming as a creative, human endeavour. Replacing devs with AI is viewed in the same way as replacing assembly line workers with robots.
2. Somewhat informed people might know that for coding tasks, LLMs are broadly trained on code that was publicly shared to help people out on Reddit or SO or as part of open-source projects (the nuance of, say, the GPL will be lost). Whereas the art they were trained on is, broadly speaking, copyrighted.
3. And, related to two: people feel great sympathy for artists, since artists generally struggle quite a bit to make a living. Whereas engineers have solid, high paying white collar jobs; thus, they're not considered entitled to any kind of sympathy or support.
Art has its relatively popular Creative Commons stuff, and some of those licenses follow a similar "do whatever" vibe, but I far more frequently see "attribution required", which generally requires attribution at the point of use, i.e. immediately alongside the art piece. And if it's something where someone saw your work once and made something different separately, the license generally does not apply. LLMs have no way to do that kind of attribution, though, and hammer out stuff that looks eerily familiar but isn't pixel-precise to the original, so it feels like (and probably is) an unapproved use of their work.
The code equivalent of this is usually "if you have source releases, include it there" or a very few have the equivalent of "please shove a mention somewhere deep in a settings screen that nobody will tap on". Using that code for training is I think relatively justifiable. The licenses matter (and have clearly been broadly ignored, which should not be allowed) but if it wasn't prohibited, it's generally allowed, and if you didn't want that you would need to choose a restrictive license or not publish it.
Plus, like, artists generally are their style, in practical terms. So copying their style is effectively impersonation. Coders on the other hand often intentionally lean heavily on style erasing tools like auto-formatters and common design patterns and whatnot, so their code blends cleanly in more places rather than sounding like exclusively "them".
---
I'm generally okay with permissive open source licensed code being ingested and spat back out in a useful product. That's kinda the point of those licenses. If it requires attribution, it gets murky and probably leans towards "no" - it's clearly not a black-box re-implementation, the LLMs are observing the internals and sometimes regurgitate it verbatim and that is generally not approved when humans do it.
Do I think the biggest LLM companies are staying within the more-obviously-acceptable licenses? Hell no. So I avoid them.
Do I trust any LLM business to actually stick to those licenses? ... probably not right now. But one could exist. Hopefully it'd still have learned enough to be useful.
So what? The code is offered under specific licensing terms. Not adhering to those terms is just as wrong as training on a paid product.
It is about scarcity: art is a passion, so there is a perpetual oversupply of talented game designers, visual graphic artists, sculptors, manga artists, music composers, guitarists, etc. You can hire one, and you can usually hire talent for cheap because... there is a lot of talent.
Programmers are (or were?) expensive because, at least in recent times, talented ones are rare enough.
When the tech world realized their neato new invention inadvertently dropped a giant portion of the world's working artists into the toilet, they smashed that flusher before they could even say "oops." Endless justification, people saying artists were untalented and greedy and deserved to be financially ruined, with a heaping helping of "well, just 'pivot'."
And I did-- into manufacturing because I didn't see much of a future for tech industry careers. I'm lucky-- I came from a working class background so getting into a trade wasn't a total culture and environment shock. I think what this technology is going to do to job markets is a tragedy, but after all the shit I took as a working artist during this transition, I'm going to have to say "well, just pivot!" Better get in shape and toughen up-- your years of office work mean absolutely nothing when you've got to physically do something for a living. Most of the soft, maladroit, arrogant tech workers get absolutely spanked in that environment.
Because it's not? Programmers' ethos is having low attachment to code. People work on code together, often with complete strangers, see it modified, sliced, merged and whatever. If you rename a variable in software or refactor a module, it's still the same software.
Meanwhile for art authorship, authenticity and detail are of utter importance.
The overwhelmingly vast majority of the code you're talking about (basically, anything that doesn't explicitly disavow its copyright by being placed in the public domain, and there's some legal debate if that is even something that you can do proactively) is just as copyright protected as the art is.
Open Source does not mean copyright free.
"Free Software" certainly doesn't mean copyright free (the GPL only has any meaning at all because of copyright law).
Public Domain in the US is the only factor that truly matters on the Internet today, but people who care do both: release into the Public Domain and provide a 0-style license (e.g. CC0 or 0BSD).
Very reminiscent of the "software factory" bullshit peddled by grifters 15 or 20 years ago.
And I think, frankly, a lot of agile practice as I've seen it in industry doesn't respect software development as a creative endeavour either.
But fundamentally I, like a lot of programmers/developers/engineers, got into software because I wanted to make things, and I suspect the way I use AI to help with software development reflects that (tight leash, creative decision-making sits with me, not the machine, etc.).
Shelling out to support artists is seen as virtuous, and AI is seen as the opposite of that - not merely replacing artists but stealing from them. There's also a general perception that every cost-saving measure is killing the quality of the product.
So you've got a PR double-whammy there.
They will definitely start using AI when their competitors do, to the point that those competitors gain a substantial competitive advantage. Then, at least in a free market, their only choices are to use AI or cease to exist. At that point, it is more survival bias (companies that used AI survived) than profit motive (companies used AI to make more money).
That is a false dichotomy. Eschewing AI may actually provide a competitive advantage in some markets. In other words, the third choice is to pivot and differentiate.
Their worlds are their monopolies. Worlds that now have multiple decades' worth of lore investment (almost 50 years now, I think).
The fact that someone else can make cheaper little plastic models doesn't affect GW in the slightest. Nor does someone pumping out AI slop stories.
The Horus Heresy book series is up to something like 64 books now. And that's just a spin-off, set 10,000 years before the era in which 40k itself takes place.
With so much lore they need complicated archiving and tracking to keep on top of it all (I happen to know their chief archivist).
You can't replace that. I say all this just to try to explain how off the mark you are on what the actual value of the company is.
I live in Nottingham, where GW is based. Another of my friends happens to have a company on an industrial estate alongside three or so other tabletop gaming companies, all ex-GW staff.
You could probably fit all their buildings in the pub that GW has on its colossal factory site.
You used to know people who worked at Boots, which was the big Nottingham employer. Nowadays, I know more people who work at GW.
Plenty of people use proxies, too. There are places that do monthly packs of new STLs that could make up an entire faction army, and there have long been places selling "definitely not Space Marines and Sisters of Battle" minis too.
They don't have a threat of anyone overtaking them at current, but AI making alternatives in this vein even cheaper could eat away at portions of their bottom line.
I would describe them as anti-corporate IP/copyright cartel. They understand things like automobiles and personal computers require organized heavy lifting but laying claim to own our culture and entertainment, our emotional identity is a joke.
Just rich people controlling agency, indoctrinating kids with capitalist essentialism; by chance we were born before you and survived this long so neener neener! We own everything!
Such an unserious country.
NFTs/Crypto are just ways to do crimes/speculate/evade regulations. They aren't useful outside of "financial engineering." You were right to dismiss them.
LLMs are extremely useful for real world use cases. There are a lot of practical and ethical concerns with their use: energy usage, who owns them, who profits from them, slop generation, trust erosion... I mean, a lot. And there are indeed hucksters selling AI snake oil everywhere, which may be what tripped off your BS meter.
But fundamentally, LLMs are very useful, and comparing them to NFT/Crypto does a disservice to the utility of the tech.
Without disagreeing with your overall point in 99% of cases, we did actually have a good use for pinning things in the Bitcoin blockchain when I worked at Keybase. If you're trying to do peer-to-peer security, and you want to prove not only that the evil server hasn't forged anything (which you do with signatures) but also that it hasn't deleted anything legitimate, "throw a hash in the blockchain" really is the Right Way to solve that problem.
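The idea, in a minimal sketch (this is not Keybase's actual protocol; the entry format, genesis value, and SHA-256 choice are illustrative assumptions), is an append-only hash chain whose head gets pinned on-chain. Once a head is pinned, the server can only extend the log, never rewrite or truncate it:

```python
import hashlib

GENESIS = b"\x00" * 32  # illustrative starting head

def chain_head(entries, head=GENESIS):
    # Fold each log entry into a running SHA-256 hash;
    # the final head commits to the entire ordered history.
    for entry in entries:
        head = hashlib.sha256(head + entry).digest()
    return head

def server_extended_honestly(pinned_head, newer_entries, claimed_head):
    # Given a head previously pinned in the blockchain, the server must
    # exhibit entries extending it to the current head. If it silently
    # deleted anything from before the pin, no such extension exists.
    return chain_head(newer_entries, head=pinned_head) == claimed_head

# Usage: pin the head after three entries, then later verify growth.
log = [b"alice adds key", b"bob adds key", b"alice revokes key"]
pinned = chain_head(log)                      # hash published on-chain
log += [b"carol adds key"]
ok = server_extended_honestly(pinned, log[3:], chain_head(log))
```

Signatures prove each entry is authentic; the pinned head is what proves none were dropped.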
and it only requires the same electricity as a medium sized country to do it
continuously, forever
I had a conversation with an artist friend some time back. He uses Squarespace for his portfolio website. He was a few drinks in, and ranting about how even if it's primarily artists using these tools professionally at the moment, it'll still lead to a consolidation of jobs, how it's built on the skillset and learning of a broader community than those that will profit, etc. How the little guy was going to get screwed over and over by this sort of thing.
I started out doing webdesign work before I moved more to the operations and infrastructure management side of things, but I saw the writing on the wall with CMS systems, WYSIWYG editors, etc. At the time building anything decent still took someone doing design and dev work, but I knew that they would get better over time, and figured I should make the change.
So I asked him about this. I spoke about how yeah, the people behind Squarespace had the expertise - just like the artists using AI now - but every website built with it or similar is a job that formerly would have gone to the same sort of little guy he was talking about. How it's a consolidation of all the learnings and practices built out by that larger community, where the financial benefits are heavily consolidated. I told him it doesn't much matter to the end web designer whether or not the job got eliminated by non-AI automation and software or an LLM, the work is still gone and the career less and less viable.
I've had similar conversations with artists before. They invariably maintain that it's different, somehow. I don't relish jobs disappearing, but it's nothing new. Someday, maybe enough careers will vanish that we'll need to figure out some sort of system that doesn't involve nearly every adult human working.
And this is not complicated at all. It's the quality of output.
Users appreciate vibecoded apps, but developers are universally unenthusiastic about vibecoded pull requests. Lots of those same devs use AI for "menial" tasks and business emails. And this is NOT a double standard: people are clearly OK with generative AI output existing as long as it isn't exposed to unsuspecting human eyes, and it's not OK when it is, because what AIs generate hasn't yet cleared some minimum quality threshold. Maybe SAN values.
(also: IIUC, cults and ponzi scheme recruitment are endemic in tabletop game communities. so board game producers distancing from anything hot in those circles, even if it were slightly irrational to do so, also makes sense.)
Your job is to create IP. As per the US Copyright Office, AI output cannot be copyrighted, so it is not anyone's IP, not yours, not your employer's.
That's not "anti-AI", that's AI and copyright reality. Game Workshop runs their business on IP, suddenly creating work that anyone can copy, sell or reproduce because it isn't anyone's IP is antithetical to the purpose of a for-profit company.
We'll have to see how this plays out. Games Workshop is (supposedly) notoriously litigious, and they've gone after artists who get too close to their art style. AI models are trained on that, so this is going to be an interesting thing to monitor.
I wanted to adapt my bike balancer to an unusual sized wheel and simply measured the diameter of the rod and the outer diameter of the hole where the axle slides into, asked an LLM to produce an adapter, converted to an STL and hit print and got a fully functional 3D printed part. Felt like living in the future, maybe one step away from owning a Star Trek Replicator.
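The geometry for a part like that is simple enough to sketch by hand. Here's a minimal, hypothetical version of what such a generated script might look like (the dimensions, file name, and function are made up for illustration; a real part needs print tolerances added): it writes an ASCII STL of a sleeve adapter, i.e. a tube whose bore fits the rod and whose outside fills the hole.

```python
import math

def write_sleeve_stl(path, rod_d, hole_d, length, segments=64):
    """Write an ASCII STL tube: inner diameter = rod, outer = hole."""
    ri, ro = rod_d / 2.0, hole_d / 2.0
    if ro <= ri:
        raise ValueError("hole diameter must exceed rod diameter")

    def ring(r, z):  # circle of points at height z
        return [(r * math.cos(2 * math.pi * k / segments),
                 r * math.sin(2 * math.pi * k / segments), z)
                for k in range(segments)]

    bi, bo = ring(ri, 0.0), ring(ro, 0.0)        # bottom inner/outer
    ti, to = ring(ri, length), ring(ro, length)  # top inner/outer

    tris = []
    for k in range(segments):
        j = (k + 1) % segments
        tris += [(bo[k], bo[j], to[j]), (bo[k], to[j], to[k])]  # outer wall
        tris += [(bi[k], ti[j], bi[j]), (bi[k], ti[k], ti[j])]  # inner wall
        tris += [(to[k], to[j], ti[j]), (to[k], ti[j], ti[k])]  # top ring
        tris += [(bo[k], bi[j], bo[j]), (bo[k], bi[k], bi[j])]  # bottom ring

    with open(path, "w") as f:
        f.write("solid adapter\n")
        for a, b, c in tris:
            # Normals left as 0 0 0; slicers recompute from vertex order.
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for v in (a, b, c):
                f.write("      vertex %.6f %.6f %.6f\n" % v)
            f.write("    endloop\n  endfacet\n")
        f.write("endsolid adapter\n")

# e.g. a 12 mm rod in a 20 mm bore, 25 mm long (hypothetical numbers)
write_sleeve_stl("adapter.stl", rod_d=12.0, hole_d=20.0, length=25.0)
```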
People are starting to notice and care about these things.
Maybe I’m just not cynical enough about the “average” non-HN population but I think there are quite a few people who care.
Lots of people from all walks of life play board games. There are a lot of people who refuse to buy games made with AI generated assets. They go as far as making forums and tracking these things so that other folks can avoid them.
AI generated anything is seen as cheap. It is cheap. It generates “similar” reproductions from its training set. It’s called, “slop,” for a reason: low effort, low quality.
There have been quality issues in some of GW’s recent product lines, but for the most part they still have fans because the bar is already high for what they make.
Cutting costs to make an extra buck by making the product crappier would be a kick to the knee. Fans already pay a premium for their products.
Good on them for not going down that road.
Personally I would never pay for tabletop miniatures or lore books generated by AI. It's the same core problem as publishing regurgitated LLM crap in a book or using ChatGPT to write an email - I'm not going to spend my precious time reading something that the author didn't spend time to write.
I am perfectly capable of asking a model to generate a miniature, or a story, or a dumb comment for reddit. I have no desire to pay a premium for someone else to do it and get no value from the generated content - the only original part is the prompt so if you try to sell AI generated "content" you might as well just sell the prompt instead.
astrange•49m ago
This anti-AI argument doesn't make sense; it's like saying it's impossible to reinvent multiplication after reading a times table. You can create new things via generalization or in-context learning (references).
In practice many image generation models aren't that powerful, but Gemini's is.
If someone created one that output multi-layer images/PSDs, which is certainly doable, it could be much more usable.
yndoendo•12m ago
Using Visual Studio, all the AI code generation applies Microsoft's syntax style and not my syntax style. The generated code might be correct, but the layout / art / syntax is completely off. And this is with a solution of a little under one million lines of code, at the moment, for the AI to work from.
Art is not static. The artist has a flow: they may have an idea, but the piece changes form with each stroke, even removing strokes that don't fit. To me, AI-generated content lacks the artist's emotion.
furyofantares•33m ago
And it's not that limiting. You aren't stuck with anything you start with. You can keep painting.
nomel•7m ago
AI, in modern tools, is not just "draw the scene so I can trace it".
[1] https://www.adobe.com/ai/overview/features.html
Hamuko•29m ago
https://youtu.be/E3Yo7PULlPs?t=668
layer8•32m ago
The other problem is that AI-generated material does itself not enjoy copyright protection.