this post was submitted on 29 Aug 2024
126 points (96.3% liked)

Technology
[–] conciselyverbose@sh.itjust.works 59 points 2 months ago (3 children)

But in a separate Fortune editorial from earlier this month, Stanford computer science professor and AI expert Fei-Fei Li argued that the "well-meaning" legislation will "have significant unintended consequences, not just for California but for the entire country."

The bill's imposition of liability for the original developer of any modified model will "force developers to pull back and act defensively," Li argued. This will limit the open-source sharing of AI weights and models, which will have a significant impact on academic research, she wrote.

Holy shit this is a fucking terrible idea.

[–] Zorsith@lemmy.blahaj.zone 11 points 2 months ago (2 children)

I read that as "incentivizing keeping AI in labs and out of the hands of people who shouldn't be using it".

That said, you'd think they'd have learned by now from piracy: once it's out there, it's out there. You can't put it back in the jar.

[–] conciselyverbose@sh.itjust.works 30 points 2 months ago (1 children)

They should be doing the exact opposite and making it incredibly difficult not to open source it. Major platforms open sourcing much of their systems is basically the only good part of the AI space.

[–] Monstrosity@lemm.ee 10 points 2 months ago (1 children)

Also, they used our general knowledge and culture to train the damn things. They should be open sourced for that reason alone. LLMs should be seen and treated like libraries, as collections of our common intellect, accessible by everyone.

[–] theneverfox@pawb.social 4 points 2 months ago

Damn straight. I don't fear AI, I fear an even more uneven playing field.

[–] LainTrain@lemmy.dbzer0.com 14 points 2 months ago (1 children)

Not open-sourcing it is a terrible idea; it just creates more black boxes and gives corporations a further upper hand.

[–] JustAnotherKay@lemmy.world -4 points 2 months ago (1 children)

Yeah what do I care if Jimmy down the street enjoys using his Ollama chatbot? I'm too busy worrying about Terminator panning out

[–] LainTrain@lemmy.dbzer0.com 1 points 2 months ago (1 children)

Exactly, so you agree that this bill is shit?

[–] JustAnotherKay@lemmy.world 2 points 2 months ago

Yes, but apparently that didn't come across according to the votes lol

[–] AbouBenAdhem@lemmy.world 2 points 2 months ago

I haven’t yet read Li’s editorial, but I’m generally more inclined to trust her take on these issues than Hinton and Bengio’s.

[–] ZILtoid1991@lemmy.world 2 points 2 months ago

Same energy as PirateSoftware's "If AAA companies can't kill games due to always online DRM then small indie devs have to support their games forever, thus bankrupting them" argument.