this post was submitted on 05 Sep 2024
47 points (94.3% liked)

Technology
[–] conciselyverbose@sh.itjust.works 1 points 1 month ago (1 children)

No, they weren't. There was never any theoretical possibility that an LLM would resemble understanding in any way.

[–] ContrarianTrail@lemm.ee 0 points 1 month ago* (last edited 1 month ago) (1 children)

That's why they simulate it. Just like I said.

Look, there's no point going any further with this. You just keep making baseless claims without any explanation or even an attempt to convince me otherwise. When called out, you ignore it and move on. I'm not interested in discussions where people just talk past each other while disregarding everything said in the previous messages. Take care now.

They don't simulate anything.

LLMs are objectively bullshit. You're the one who went way down the thread trying to act like the exact correct word wasn't fair, and I responded to the only part of any of your posts that wasn't outright word salad nonsense.