Should there be a clause for AI?
Richard Fontana
fontana at sharpeleven.org
Sun Jul 13 21:13:53 UTC 2025
> Disclosing modifications and releasing them under the same license applies only when *conveying* the software to others, not just when *using*. Perhaps this was understood, perhaps not, but this is a widespread misunderstanding, so please be careful about that. Anyone can use copyleft software and even modify it *without* any copyleft requirements being triggered. Modifications can be made and kept private. They do not need to be disclosed to anyone if the software is used only privately. And they only need to be disclosed to those who receive the software when sharing.
That assumes copyleft conforms to what I was calling "software freedom
as the FLOSS community understood it in 2012" or however I put it.
There have been a number of licenses, perhaps we'd consider them
pseudo or quasi-FLOSS, that attempt to break with this principle that
copyleft requirements are triggered only by "conveying", and indeed
this is also an argument that anti-AGPL people will use to say that
AGPL is not a free software license. (And it's also part of the reason
why in 1991 AGPL probably would have been seen as "non-free" if the
vocabulary had existed adequately then.)
The reason I mention this:
> The dilemma is about maintaining practical software freedom. There's no point in developing copyleft-next if it does nothing to actually support software freedom in practice.
>
> Let's imagine that we succeed at blocking legal AI training on copyleft-next code.
I don't see how you can block legal AI training on copyleft-next code
while adhering to 2012 software freedom, or (I'd contend) 2025
software freedom. Separately, the direction courts seem to be going in
the US would probably make licenses prohibiting AI training pointless.
(Apologies if I'm misunderstanding what you mean by "succeed at
blocking legal AI training".)
> Are we banking on the idea that AI-generated code will remain more buggy or otherwise unreliable than human written copyleft code?
I think we can assume this (in general) for the shorter term, though I
don't know how short that is.
> We are concerned that a free society needs to not have a few companies or governments have exclusive AI control, and so we think copyright-licensing is a means to legally compel AI weights and so on to be released to the public? This scenario is not about excluding copyleft-next from training but getting AI's to be more free. But in practice, powerful companies that want exclusive control would likely exclude copyleft-next code if they felt it would compel them to be more free with their AIs than they want otherwise, right?
It sounds like you're envisioning that a "copyleft-next for AI" could
devise some sort of "clever hack" or jujitsu move or whatever to make
the models themselves free, or more free. This is probably worth
discussing, but it's probably not going to take the form of a license
that says "if you want to train your model with this copyleft-next
code, you have to do these things to make your resulting model
copyleft".
As for the powerful companies training the kinds of models we're
talking about, barring some unexpected direction in the law favoring
copyright holders of training data (which I'd note that I personally
am not especially sympathetic to), I think they just aren't going to
care much as they don't care right now that they are training using
copies of proprietary (and GPL, etc.) works. Or more precisely, they
do care, but they believe they are in the right.
> Note that without any AI clauses, I *think* copyleft would still apply to the use of AI to do simple code modifications. So, imagine someone uses an AI to add a minor feature to a copyleft-next program, and they publish their update. This should be no different than if a human programmer had made the updates, right? And no extra clause is needed for this case.
It is potentially different for the reason I think I stated in this
thread, that in some jurisdictions the emerging principle is that
seemingly original works of generative AI models are not copyrightable
because they are not human authored. So you can imagine a situation
where the same minor feature is added to a copyleft-next program, but
if a human does it without using an AI tool, it's copyrightable and
therefore copyleft applies to the modifications, while if it was
entirely AI generated copyleft might not apply to those modifications.
I'm not sure this has that much practical significance though (at
least not until the tools we're talking about get much better). We
have minor features added to copyleft programs today that are under
permissive licenses or are essentially outside of the scope of
copyright entirely.
Richard