Should there be a clause for AI?

Aaron Wolf wolftune at riseup.net
Sun Jul 13 21:30:07 UTC 2025


I agree with the bulk of these points. https://ai-2027.com/ is a good 
reference for the premise that AI could reach levels categorically 
different from today's within what is, on a human time scale, the 
short term.

Anyway, on the last point about contributions to copyleft-next 
projects: *some* legal entity would be publishing updated versions of 
a project. It's possible for an AI to anonymously put out an updated 
version of a program without anyone finding out where it came from, 
but that is comparable to a human doing so anonymously. As long as 
some human or corporation, some recognized legal entity, is the 
*publisher* of the software, that legal entity would be bound to 
follow the copyleft-next license with respect to *its* access to the 
copyrighted code. It shouldn't matter that the *updates* themselves 
are not precisely copyrightable…

So, this brings up a key point for the actual license. We might be 
unable to specify that updates made by AI shall use copyleft-next as 
their license if those updates are not themselves copyrightable. 
However, I don't see why copyleft-next couldn't still require that 
these uncopyrightable updates be released in source form as part of 
complying with copyleft-next's terms for including the *original* code 
in the updated program.

In other words, we should be able to say that a legal entity violates 
the copyleft-next license by publishing an adaptation without making 
the source code available, even in cases where it is impossible to 
apply a copyright license to the code updates. The legal entity should 
still be required to pass on all the source code that remains under 
copyleft-next, and to make available alongside it all the source code 
that is public domain but part of the update.

Aaron

On 7/13/25 2:13, Richard Fontana wrote:
>> Disclosing modifications and releasing them under the same license applies only when *conveying* the software to others, not just when *using*. Perhaps this was understood, perhaps not, but this is a widespread misunderstanding, so please be careful about that. Anyone can use copyleft software and even modify it *without* any copyleft requirements being triggered. Modifications can be made and kept private. They do not need to be disclosed to anyone if the software is used only privately. And they only need to be disclosed to those who receive the software when sharing.
> That assumes copyleft conforms to what I was calling "software freedom
> as the FLOSS community understood it in 2012" or however I put it.
> There have been a number of licenses, perhaps we'd consider them
> pseudo or quasi-FLOSS, that attempt to break with this principle that
> copyleft requirements are triggered only by "conveying", and indeed
> this is also an argument that anti-AGPL people will use to say that
> AGPL is not a free software license. (And it's also part of the reason
> why in 1991 AGPL probably would have been seen as "non-free" if the
> vocabulary had existed adequately then.)
>
> The reason I mention this:
>
>> The dilemma is about maintaining practical software freedom. There's no point in developing copyleft-next if it does nothing to actually support software freedom in practice.
>>
>> Let's imagine that we succeed at blocking legal AI training on copyleft-next code.
> I don't see how you can block legal AI training on copyleft-next code
> while adhering to 2012 software freedom, or (I'd contend) 2025
> software freedom. Separately, the direction courts seem to be going in
> the US would probably make licenses prohibiting AI training pointless.
>
> (Apologies if I'm misunderstanding what you mean by "succeed at
> blocking legal AI training".)
>
>> Are we banking on the idea that AI-generated code will remain more buggy or otherwise unreliable than human-written copyleft code?
>
> I think we can assume this (in general) for the shorter term, though I
> don't know how short that is.
>
>> We are concerned that a free society must not allow a few companies or governments to hold exclusive control of AI, and so we think copyright licensing is a means to legally compel AI weights and so on to be released to the public? This scenario is not about excluding copyleft-next from training but about getting AIs to be more free. But in practice, powerful companies that want exclusive control would likely exclude copyleft-next code if they felt it would compel them to be more free with their AIs than they want otherwise, right?
> It sounds like you're envisioning that a "copyleft-next for AI" could
> devise some sort of "clever hack" or jujitsu move to make the
> models themselves free, or more free. This is probably worth
> discussing, but it's probably not going to take the form of a license
> that says "if you want to train your model with this copyleft-next
> code, you have to do these things to make your resulting model
> copyleft".
>
> As for the powerful companies training the kinds of models we're
> talking about, barring some unexpected direction in the law favoring
> copyright holders of training data (which I'd note that I personally
> am not especially sympathetic to), I think they just aren't going to
> care much as they don't care right now that they are training using
> copies of proprietary (and GPL, etc.) works. Or more precisely, they
> do care, but they believe they are in the right.
>
>> Note that without any AI clauses, I *think* copyleft would still apply to the use of AI to do simple code modifications. So, imagine someone uses an AI to add a minor feature to a copyleft-next program, and they publish their update. This should be no different than if a human programmer had made the updates, right? And no extra clause is needed for this case.
> It is potentially different for the reason I think I stated in this
> thread, that in some jurisdictions the emerging principle is that
> seemingly original works of generative AI models are not copyrightable
> because they are not human authored. So you can imagine a situation
> where the same minor feature is added to a copyleft-next program, but
> if a human does it without using an AI tool, it's copyrightable and
> therefore copyleft applies to the modifications, while if it was
> entirely AI generated copyleft might not apply to those modifications.
> I'm not sure this has that much practical significance though (at
> least not until the tools we're talking about get much better). We
> have minor features added to copyleft programs today that are under
> permissive licenses or are essentially outside of the scope of
> copyright entirely.
>
> Richard

