
cross-posted from: https://lemmy.sdf.org/post/41849856

If an LLM can't be trusted with a fast food order, I can't imagine what it is reliable enough for. I really was expecting this to be the easy use case for these things.

It sounds like most orders still worked, so I guess we'll see if other chains come to the same conclusion.

webghost0101@sopuli.xyz 1 points 1 day ago

I think it's one of the systems that thinking emerges from, though not the only one.

I do regularly feel like I have an LLM-analogue component within my consciousness. As someone with AuDHD, I will sometimes say the exact opposite word from the meaning I intend.

I am also known to use certain sentences wrong because I apparently misunderstood their meaning. It's autocopying: things I heard others say in a similar-seeming context, which my brain believes I can repeat to stall socially and buy more time to think about what I really want to say.