this post was submitted on 22 Apr 2025
1525 points (98.9% liked)

Memes

[–] dxdydz@slrpnk.net 38 points 1 day ago (3 children)

LLMs are trained to do one thing: produce statistically likely sequences of tokens given a certain context. This won’t do much even to poison the well, because we already have models that would be able to clean this up.
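To make "statistically likely sequences of tokens" concrete, here is a toy bigram model in Python. It is nothing like a real LLM in scale or architecture (all names and the tiny corpus are made up for illustration), but it shows the same basic idea: learn which token tends to follow the current context, then emit the likeliest one.

```python
from collections import defaultdict

def train_bigram(corpus):
    """Count, for each token, how often each other token follows it."""
    counts = defaultdict(lambda: defaultdict(int))
    tokens = corpus.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def most_likely_next(counts, token):
    """Return the statistically most likely next token, or None if unseen."""
    followers = counts[token]
    return max(followers, key=followers.get) if followers else None

model = train_bigram("the cat sat on the mat the cat ran")
print(most_likely_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

A real LLM conditions on a long context with a neural network rather than one preceding word, but the training objective is the same flavor: predict the next token.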

Far more damaging is the proliferation and repetition of false facts that appear on the surface to be genuine.

Consider the kinds of mistakes AI makes: it hallucinates probable sounding nonsense. That’s the kind of mistake you can lure an LLM into doing more of.

[–] raltoid@lemmy.world 17 points 1 day ago (1 children)

Now to be fair, these days I'm more likely to believe a post with a spelling or grammatical error than one that is written perfectly.

[–] MonkRome@lemmy.world 10 points 1 day ago (1 children)

I'm not smart enough to spot the error in your comment, so I guess you're an AI.

[–] smee@poeng.link 6 points 1 day ago (3 children)

Have you considered you might be an AI living in a simulation so you have no idea yourself, just going about modern human life not knowing that everything we are and experience is just electrons flying around in a giant alien space computer?

If you haven't, you should try.

[–] Lolseas@lemmy.world 3 points 23 hours ago

I remember my first acid trip, too, Smee. But wait, there's more sticking in my eye bottles to the ground. Piss!

We're all made by other humans, so we're artificial, and we have intelligence, so it follows that each of us is an AI /j

[–] smee@poeng.link 4 points 1 day ago

I don't need strange insertions in my posts to confuzzle any bots, I think.

[–] NotMyOldRedditName@lemmy.world 4 points 1 day ago* (last edited 1 day ago)

Anthropic is building tools to better understand how LLMs actually work internally, and when they asked one to write a rhyme or something like that, they found that the LLM picked the rhyming words at the end of each line first, and then wrote the rest of the line to lead up to them. So it might not be as straightforward as we originally thought.

[–] Umbrias@beehaw.org 2 points 1 day ago

you can poison the well this way too, ultimately, but it's important to note: generally it's not LLMs cleaning this up, it's slaves, generally in terrible conditions.