this post was submitted on 10 Jan 2025

If there were an AGPLv3 AI, I would definitely try it. Would there be a reason not to use it? I wish there were a GPLv3 AI to install; I would not use a GPLv2 AI.

As a real-world example of the difference between GPLv2 and GPLv3 (in the code, not the license text), look at the Linux kernel vs. Linux-libre. For the limited hardware that a Linux-libre distribution supports, it's a much smoother and cleaner system to run.

An AGPLv3 AI couldn't hide any nefarious or biased programming, so I see it truly working as a servant while retaining zero user information for data collection. Look at open hardware like RISC-V and how far it has come once companies saw they were free to engineer their own versions of a RISC-V CPU. That took RISC-V from worthless to the very start of closing the gap with ARM. If there were an AGPLv3 AI that anyone could build their own version of, with everybody publishing their server-side source code, would there be a downside to using it?

top 2 comments
[–] TootSweet@lemmy.world 8 points 7 hours ago (last edited 7 hours ago)

The GPL family of licenses was designed to cover code specifically. AI engines are code and are covered in most jurisdictions by copyright. (Disclaimer: I know a lot less about international intellectual property law than about U.S. intellectual property law, but I'm pretty confident what I'll say here is at least true of the U.S.)

But you don't really have a functional generative AI system without weights, and it's not clear that weights are covered by any particular branch of intellectual property in any particular jurisdiction. (And even if they are, it's not clear that the legal entity that trained the engine owns those rights, rather than the rights holders of the materials used as training data.) And it's the weights that would account for any biases or purposefully nefarious output.

Nothing that isn't covered by intellectual property can meaningfully be said to be "licensed", really, under the AGPLv3 or any other license. To speak of something not covered by any intellectual (or non-intellectual, I suppose) property as "licensed" is just kind of nonsensical.
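
To make the engine/weights split concrete, here's a minimal sketch in Python. The file name, tensor names, and shapes are all made up; the point is that the engine half is ordinary code a license can cover, while everything it loads is a blob of numbers.

```python
# A minimal sketch of the engine/weights split, assuming a hypothetical
# "model.npz" file with made-up tensor names and shapes. Every line of
# this engine is ordinary, inspectable code that copyright (and hence
# the AGPLv3) clearly covers; what it loads is just opaque numbers.
import numpy as np

def load_weights(path: str) -> dict[str, np.ndarray]:
    # The weights file is data, not code: reading this dict tells you
    # nothing about how the model was trained or what it learned.
    return dict(np.load(path))

def forward(weights: dict[str, np.ndarray], token_ids: list[int]) -> np.ndarray:
    # A toy one-layer "language model": embed, hidden layer, project to logits.
    x = weights["embedding"][token_ids]             # (seq, d_model)
    x = np.tanh(x @ weights["w1"] + weights["b1"])  # hidden activations
    return x @ weights["w_out"]                     # (seq, vocab) logits

logits = forward(load_weights("model.npz"), [1, 5, 42])
```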

Like, since Einstein's General Relativity isn't covered by any intellectual property, it's not possible for General Relativity to be "licensed". Similarly, unless some law is passed making LLM weights covered by, say, copyright law, one can't speak of those weights being "licensed".

By the way, there are several high-profile cases of companies like Meta releasing LLMs that you can run locally and calling them "Open Source" when there's nothing "Open Source" about them. As in, they don't distribute the source code of LLaMa at all. That's exactly the opposite of "Open Source", and the weights aren't code, so they can't really be said to be "Open Source" either.

Now, all that said, I don't think there's actually any inherent benefit to LLMs, AGPLv3 or otherwise, so I don't have any interest even in AGPLv3 engines. But I'm all for more software being licensed AGPLv3. I just don't think AGPLv3 is a concept that applies to any portion of LLMs aside from the engine.

[–] technohacker@programming.dev 5 points 7 hours ago (last edited 7 hours ago)

I would actually draw a parallel to the device driver / firmware blob split that's common with hardware support in Linux. While the code needed to run inference with a model is straightforward, and several open source implementations already exist, the model itself is a bunch of tensors whose behaviour we have no visibility into. Bias is less a problem of the inference code and more an issue with the data it was trained on.
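
As a rough sketch of that parallel (hypothetical file name; plain numpy standing in for a real inference stack): you can hash the blob and enumerate its tensors, much like checksumming a firmware blob, but that's about the limit of what inspection tells you.

```python
# A sketch of how little "inspection" a weights blob allows; "model.npz"
# is a hypothetical file. Like a firmware blob, you can verify WHICH blob
# you have and list what's inside it, but the learned behaviour lives in
# the values themselves, which have no human-readable structure.
import hashlib
import numpy as np

with open("model.npz", "rb") as f:
    # Verify the blob's identity (useful for reproducibility, not audit).
    print("sha256:", hashlib.sha256(f.read()).hexdigest())

for name, tensor in np.load("model.npz").items():
    # Enumerate names, shapes, and dtypes: roughly all that inspection
    # of the tensors can tell you about the model's behaviour.
    print(f"{name}: shape={tensor.shape} dtype={tensor.dtype}")
```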