Should there be a clause for AI?

Ben Cotton bcotton at funnelfiasco.com
Fri Jul 11 14:31:51 UTC 2025


On Fri, Jul 11, 2025 at 4:01 AM Vasileios Valatsos <me at aethrvmn.gr> wrote:
>
> I apologize in advance for opening what is, in my regard, a giant can of
> worms.

It's a conversation that we can't avoid. Credit to you for standing up. :-)

> I wonder if (a) this would make sense in copyleft-next, (b) if it even
> belongs in the conversation, (c) if there is a better way to tackle this.

Would this be field-of-use discrimination? I think it would, and it
would therefore violate FSF Freedom 0 and OSD criterion 6. That would
render copyleft-next neither Open Source nor Free in the
capital-letter sense of those terms.

More theoretically, I'm a big fan of "don't make rules you can't/won't
enforce." If you release something under the TPL and I decide to use
it to train an AI model, how will you know? bkuhn will correct me if
I'm wrong, but that would seem to be one challenge in GPL enforcement
in the traditional software arena. At least with traditional software,
there's typically some external evidence that a project was used in
violation of its license. I'm not sure there's a good way to detect it
in an LLM unless I make my training data public (which I would not if
I were intending to violate the license).

I'm not suggesting we stick our heads in the sand, but large language
models are a complex problem from a variety of standpoints: legal (I
say as not-a-lawyer), ethical, practical, and so on. I don't see a
clear way forward for addressing them in a license like copyleft-next
at this point.

--
Ben Cotton (he/him)
TZ=America/Indiana/Indianapolis

