this post was submitted on 25 Jul 2025
398 points (98.3% liked)

all 37 comments
[–] saltesc@lemmy.world 55 points 1 month ago* (last edited 1 month ago) (1 children)

At my work, it's become common for people to say "AI level" when giving a confidence score. Without saying anything else, everyone seems to perfectly understand the situation, even if hearing it for the first time.

Keep in mind, we have our own in-house models that are bloody fantastic, used for different sciences and research. We'd never talk ill of those, but it's not the first thing that comes to mind when people hear "AI" these days.

[–] AnarchistArtificer@lemmy.world 7 points 1 month ago (3 children)

"Keep in mind, we have our own in-house models that are bloody fantastic, used for different sciences and research."

I'm a scientist who has become super interested in this stuff in recent years, and I have adopted the habit of calling the legit stuff "machine learning", reserving "AI" for the hype machine bullshit

[–] YiddishMcSquidish@lemmy.today 5 points 1 month ago

This hits hard. I was in college when I first learned a machine had solved the double pendulum problem. The problem is that we have no idea how the resulting equation works, though. I remember thinking of all the stuff machine learning could solve. Then they overhyped these LLMs that are good at ::checks my notes:: chatting with you.....

[–] saltesc@lemmy.world 1 points 1 month ago

It's certainly becoming easier, but I don't like it.

We have a cool AI that's a big problem solver, but its outputs are complex. We've attached a GPT onto it purely to act as a kind of translator or summariser, to save time trying to understand what the AI's done and why. It's great. But we definitely don't see the GPT as offering any sort of intelligence; it's just a reference-based algorithmic protocol bolted onto an actual AI. Protocols are, after all, a set of rules or processes to follow. The GPT isn't offering any logic, reasoning, planning, etc., which are still the conditions of intelligence in computer science. But it certainly can give off the impression of intelligence, as it's literally designed to impersonate it.
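A rough sketch of that kind of setup (all names hypothetical, not the poster's real systems): a domain solver emits structured output, and the language layer only rephrases it for humans.

```python
# Hypothetical sketch: a solver produces structured output, and a
# "translator" layer is bolted on purely to phrase it for humans.
# Both functions are stand-ins, not real APIs.

def run_solver(problem):
    # stand-in for the in-house model's complex structured output
    return {"solution": [3.2, 1.7], "residual": 1e-6, "iterations": 42}

def summarise(result):
    # stand-in for the bolted-on GPT: here just a template, to stress
    # that this layer adds phrasing, not reasoning
    return (f"Converged in {result['iterations']} iterations "
            f"(residual {result['residual']:.0e}); solution: {result['solution']}")

print(summarise(run_solver("demo")))
```

The point of the split is that all of the actual problem solving happens in the first stage; the second stage could be deleted without changing any answer, only its readability.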

[–] markovs_gun@lemmy.world 1 points 1 month ago

Lmao, I'm doing the exact same thing. I'm a ChemE and I've been doing a lot of work on AI-based process controls, and I've coached members of my team to use "ML" and "machine learning" to refer to these systems, because things like ChatGPT, which most people see as toys, are all that people think of when they hear "AI." The other day someone asked me, "So have you gotten ChatGPT running the plant yet?" I laughed, said no, and explained the difference between what we're doing and the AI you see in the news. I've even had to include slides in just about every presentation I've done on this to say "no, we are not just asking ChatGPT how to run the process," because that's the first thing that comes to mind, and it scares them because ChatGPT is famously prone to making shit up.

[–] ExcessShiv@lemmy.dbzer0.com 25 points 1 month ago (3 children)

It's not wrong though... There's one r and one rr in strawberry.

[–] Laser@feddit.org 14 points 1 month ago (1 children)

Wrong! There's no r in strawberry, only an str and an rr.

[–] Haaveilija@lemmy.world 2 points 1 month ago
str(awberry)
[–] lugal@lemmy.dbzer0.com 12 points 1 month ago (2 children)

Found the Spanish speaker (they count rr as a separate letter)

[–] lemmyknow@lemmy.today 4 points 1 month ago (1 children)

Don't think those are separate letters, just pronounced differently. I mean, rr is just 2 r's, not a new letter. And this isn't an ß-type case either. Phonetically different, yes. Different letters? Creo que no (I think not). Could be wrong, though. Hispanohablantes de Lemmy, corríjanme (Spanish speakers of Lemmy, correct me)

[–] exasperation@lemmy.dbzer0.com 2 points 1 month ago (2 children)

In Spanish, up until 1994, "ll" and "ch" were considered distinct letters from the component parts. But "rr" has never been considered distinct from "r," even though it is pronounced differently, in large part because no words start with "rr" and any word that starts with "r" is pronounced with the rolling R sound.

[–] lugal@lemmy.dbzer0.com 1 points 1 month ago* (last edited 1 month ago)

Thanks, I learned Spanish at school in the late aughts and I guess I confused it. My teacher was quite old, so she wasn't up to date, I guess.

[–] lemmyknow@lemmy.today 1 points 1 month ago (1 children)

Aren't all R's rolling, though? Some longer and some shorter. I.e. rr and r. Guess I get it, though. Words starting with a single 'r' are pronounced like 'rr'. Interesting on the 'll' and 'ch' bits, too. Wasn't aware ¡Gracias, RAE!

[–] lugal@lemmy.dbzer0.com 1 points 1 month ago (1 children)

No, single r is tapped not rolled. It's similar but still a different sound

[–] lemmyknow@lemmy.today 2 points 1 month ago (1 children)

Oh, okay. Sounds similar. Sounds like rolling, but just once. Rolling if you turn off the looping option. I am not a linguist, though, so I don't know the intricacies of sounds

[–] lugal@lemmy.dbzer0.com 1 points 1 month ago

Yes, I think that sums up the difference quite well. If you want to dive into it: this is the single r and this is rr

[–] ExcessShiv@lemmy.dbzer0.com 4 points 1 month ago (1 children)

Nope, I can order beer in spanish (no more than 10 at a time) and that's about it.

[–] lugal@lemmy.dbzer0.com 2 points 1 month ago (1 children)

Is the limit because you only know numbers up to 10, because after that you're drunk, or a little bit of both?

[–] ExcessShiv@lemmy.dbzer0.com 2 points 1 month ago (1 children)

I only know numbers up to 10

[–] lugal@lemmy.dbzer0.com 3 points 1 month ago (1 children)

11 is "once" and 12 is "doce". Now you can order a dozen

[–] Septimaeus@infosec.pub 4 points 1 month ago

Classic enabler

[–] Valmond@lemmy.world 4 points 1 month ago

It didn't say one and only one eh! One r, then one r again!

[–] sykaster@feddit.nl 6 points 1 month ago (4 children)

I asked this question to a variety of LLMs and never had it go wrong once. Is this very old?

[–] gigachad@sh.itjust.works 44 points 1 month ago* (last edited 1 month ago) (2 children)

They fixed it in the meantime:

if "strawberry" in token_list:
    return {"r": 3}
[–] towerful@programming.dev 8 points 1 month ago

Now you can ask for the number of occurrences of the letter c in the word occurrence.
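For reference, plain string methods settle both counts deterministically; this is ordinary code, not anything a model does internally:

```python
# Deterministic letter counts, in contrast to the joke lookup table above
for word, letter in [("strawberry", "r"), ("occurrence", "c")]:
    print(f"{word!r} has {word.count(letter)} {letter}'s")
# 'strawberry' has 3 r's
# 'occurrence' has 3 c's
```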

[–] RidderSport@feddit.org 0 points 1 month ago (2 children)

You're shitting me, right? They did not just use an entry-grade Java command to rectify an issue that an LLM should figure out by learning, right?

[–] boonhet@sopuli.xyz 14 points 1 month ago (1 children)

Well, firstly, it's Python; secondly, it's not a command; and thirdly, it's a joke. However, they have manually patched some outputs for sure, probably by adding to the setup/initialization prompt.

[–] RidderSport@feddit.org 1 points 1 month ago (1 children)

Java is the only code I have any (tiny) knowledge of, which is why the line reminded me of that.

[–] boonhet@sopuli.xyz 3 points 1 month ago* (last edited 1 month ago) (1 children)

Ah, but in Java, unless they've changed things lately, you have the curly brace syntax of most C-like languages (and no `in` operator, so you'd call a method instead):

if (tokenList.contains("strawberry")) {
    return something;
}

Python is one of the very few languages where you use colons and whitespace to denote blocks of code
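For contrast, a minimal Python sketch of the same check (names hypothetical), where a colon and indentation replace the braces:

```python
def count_rs(token_list):
    # The colon plus the indented lines form the block; no braces needed
    if "strawberry" in token_list:
        return {"r": 3}
    return {}

print(count_rs(["strawberry"]))  # {'r': 3}
```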

[–] RidderSport@feddit.org 1 points 1 month ago

See, you're definitely better versed; it's been a decade for me ^^

Would it also shock you if water was wet, fire was hot, and fascists were projecting?

[–] BootLoop@sh.itjust.works 15 points 1 month ago* (last edited 1 month ago) (2 children)

Try "Jerry strawberry". ChatGPT couldn't give me the right number of r's a month ago. I think "strawberry" by itself was either manually fixed or trained in from feedback.

[–] sykaster@feddit.nl 5 points 1 month ago

You're right, ChatGPT got it wrong; Claude got it right.

[–] Zexks@lemmy.world 1 points 1 month ago

Works for me

5 — “jerry” has 2 r’s, “strawberry” has 3.

[–] ignotum@lemmy.world 10 points 1 month ago

Smaller models still struggle with it, and the large models did too, like a year ago.

It has to do with the fact that the model doesn't "read" individual letters but groups of letters (tokens), so counting letters is less straightforward.
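A toy sketch of why (hypothetical vocabulary, greedy longest match; real BPE tokenizers are more involved, but the effect is the same):

```python
# Toy subword tokenizer: the model receives multi-letter chunks,
# never individual characters. VOCAB is made up for illustration.
VOCAB = ["straw", "berry", "st", "raw", "ber", "ry"]

def tokenize(word):
    """Greedily match the longest known piece at each position."""
    tokens = []
    i = 0
    while i < len(word):
        for piece in sorted(VOCAB, key=len, reverse=True):
            if word.startswith(piece, i):
                tokens.append(piece)
                i += len(piece)
                break
        else:
            # unknown character falls back to a single-letter token
            tokens.append(word[i])
            i += 1
    return tokens

print(tokenize("strawberry"))  # ['straw', 'berry']
```

So the model sees two opaque token IDs rather than ten letters; to answer "how many r's" it has to have effectively memorised each token's spelling, since it can't inspect characters directly.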

[–] psx_crab@lemmy.zip 1 points 1 month ago

Seeing how it starts with an apology, it must've been told it was wrong about the amount. Basically, it was bullied into saying this.