Would this be legal?
For example, if a SaaS corporation wanted to modify and sell a service built on some AGPL project, could it use AI to entirely rewrite that project, effectively detaching it from its original creator and ownership?
No.
The whole project (and, some would argue, an LLM trained on the AGPL code that is also running on the backend) would have to be open sourced as well.
Using an LLM to strip the licence and generate a derived project from the original AGPL code is not a 'clean room implementation'; it is equivalent to rewriting the original author's existing code.
https://fingfx.thomsonreuters.com/gfx/legaldocs/jnvwbgqlzpw/...
To me this makes the clean-room distinction very hard to assert; what am I missing?
A clean room requires that the person writing the implementation have no special knowledge of the original implementation.
https://en.wikipedia.org/wiki/Clean-room_design
https://en.wikipedia.org/wiki/Chinese_wall
Nordstrom Consulting v. M&S Technologies, which is possibly the most relevant case, describes a process for developing in a clean-room environment, and from what I understand it focuses on isolating engineering teams and resources (except where required for interoperability). I did not find any mention of vetting the engineers for prior access to the copyrighted material, but if I have missed that, please let me know.
I also wanted to say that I am not asking this because I am planning to start an unethical license-laundering business; I am only trying to understand what it means to treat LLMs as legally equivalent to human workers.
If you ask it "give me a library the same as X" by name and it complies, the result will surely be based on the library's code and may even contain the actual code of the library. Don't do this.
If you feed it the library's code piece by piece and ask for something "equivalent", that's even worse: explicitly derivative.
If you write your own documentation of how the library works without mentioning it by name, and it's not a very special-purpose library, and the LLM writes to your new spec... then you'll probably spend a lot to get a worse version. IANAL.
Of course, the big AI companies blithely ignore moral and legal issues.