this post was submitted on 09 Apr 2024
61 points (90.7% liked)

Futurology

[–] Not_mikey@slrpnk.net 6 points 6 months ago* (last edited 6 months ago)

> Authorities should also “urgently” consider outlawing the publication of the “weights,” or inner workings, of powerful AI models, for example under open-source licenses, with violations possibly punishable by jail time, the report says.

Fuck that, so only huge corporations can have access to it? You won't even be able to have startups challenge the behemoths, because this would shut down the open scientific papers and published weights that anyone needs to get started.
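To be concrete about what a ban on publishing weights would cut off, here's a rough sketch (assuming the Hugging Face transformers library and an openly licensed checkpoint like Mistral 7B; swap in any published model) of how a small team builds on open weights today:

```python
# Rough sketch: pulling openly published weights and running them locally.
# This is the starting point for most small developers and researchers;
# a ban on publishing weights would make the download itself illegal.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # example open-weight, Apache-2.0 licensed checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Open model weights let small developers"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Fine-tuning on top of a downloaded checkpoint like this is how a startup or lab gets going without a nine-figure training budget; none of it works if publishing the weights carries jail time.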

If you want to make the case that this technology is an existential threat on par with nukes, and that any proliferation is dangerous, then treat it like nukes: nationalize it and make it so only the government can produce it. At least the government is nominally accountable to the people, instead of a bunch of companies who will happily destroy the world if it makes them an extra buck.

We're probably nowhere near that threat, though, so something like this would only serve to widen the gap between the current batch of huge AI companies and smaller-scale developers and enthusiasts.