this post was submitted on 17 May 2024
496 points (94.8% liked)

Technology

you are viewing a single comment's thread
[–] noodlejetski@lemm.ee 52 points 6 months ago (2 children)

Their real power is their ability to understand language and context.

...they do exactly none of that.

[–] breakingcups@lemmy.world 23 points 6 months ago (1 children)

No, but they approximate it, which is fine for most of the use cases the person you're responding to described.

[–] FarceOfWill@infosec.pub 19 points 6 months ago (1 children)

They're really, really bad at context. The main failure case isn't making things up; it's that text or images in one part of the result don't fit with text or images in another part, because they can't maintain context even across their own replies.

See images with three hands, bow strings that mysteriously vanish, etc.

[–] FierySpectre@lemmy.world -1 points 6 months ago

New models are really good at context. The amount of input they can accept has exploded (fairly) recently, so you can give them whole datasets or books as context and ask questions about them.
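
The "whole book as context" pattern above can be sketched in a few lines. This is a minimal illustration, not any particular model's API: `MAX_CONTEXT_TOKENS` and `build_prompt` are hypothetical names, and the whitespace split is a crude stand-in for a real tokenizer.

```python
# Sketch of long-context prompting: stuff an entire document into the
# prompt, truncating to a hypothetical model context limit.
# NOTE: whitespace word count is a rough approximation of tokens.

MAX_CONTEXT_TOKENS = 128_000  # assumed limit for a long-context model


def build_prompt(document: str, question: str,
                 max_tokens: int = MAX_CONTEXT_TOKENS) -> str:
    words = document.split()
    # Reserve room for the question plus framing instructions.
    reserve = len(question.split()) + 50
    if len(words) + reserve > max_tokens:
        # Naive strategy: keep the beginning, drop the tail.
        words = words[: max_tokens - reserve]
    return (
        "Answer the question using only the document below.\n\n"
        "--- DOCUMENT ---\n" + " ".join(words) + "\n--- END ---\n\n"
        "Question: " + question
    )


prompt = build_prompt("some long book text " * 10, "Who is the protagonist?")
```

The resulting string would then be sent as a single message to whatever long-context model is in use; real pipelines use the model's actual tokenizer and smarter truncation (e.g. keeping both the start and end of the document).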

[–] Lmaydev@programming.dev 0 points 6 months ago

They do it much better than anything you can currently hard-code.