this post was submitted on 07 Mar 2024
346 points (93.7% liked)

Showerthoughts


A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. A showerthought should offer a unique perspective on an ordinary part of life.

Rules

  1. All posts must be showerthoughts
  2. The entire showerthought must be in the title
  3. Avoid politics
    • 3.1) NEW RULE as of 5 Nov 2024, trying it out
    • 3.2) Political posts often end up being circle jerks (not offering a unique perspective) or inflammatory (too much work for mods).
    • 3.3) Try c/politicaldiscussion, volunteer as a mod here, or start your own community.
  4. Posts must be original/unique
  5. Adhere to Lemmy's Code of Conduct


it will loose its ability to differentiate between there and their and its and it’s.

top 50 comments
[–] spittingimage@lemmy.world 157 points 8 months ago (2 children)
[–] public_image_ltd@lemmy.world 133 points 8 months ago (4 children)

must of made a mistake their

[–] person@lemm.ee 64 points 8 months ago (2 children)
[–] public_image_ltd@lemmy.world 55 points 8 months ago (1 children)
[–] NegativeLookBehind@lemmy.world 47 points 8 months ago (3 children)
[–] RobotToaster@mander.xyz 29 points 8 months ago (2 children)
[–] foggy@lemmy.world 11 points 8 months ago (2 children)
[–] dream_weasel@sh.itjust.works 5 points 8 months ago

I also choose this guy's dead wife.

[–] Smc87@lemmy.sdf.org 9 points 8 months ago (1 children)

I need to of a word with you

[–] GregorGizeh@lemmy.zip 7 points 8 months ago (3 children)

This one must be the worst. "Could care less" being a close second

[–] Rentlar@lemmy.ca 12 points 8 months ago (1 children)

OP hasn't payed enough attention in English class.

[–] zkfcfbzr@lemmy.world 5 points 8 months ago (1 children)
[–] Ghostalmedia@lemmy.world 104 points 8 months ago (1 children)

Now when you submit text to ChatGPT, it responds with “this.”

[–] Steve@startrek.website 47 points 8 months ago (1 children)
[–] FartsWithAnAccent@lemmy.world 32 points 8 months ago (1 children)
[–] inlandempire@jlai.lu 35 points 8 months ago (1 children)

As a language model, I laughed at this way harder than I should have

[–] summerof69@lemm.ee 8 points 8 months ago

NTA, that was funny.

[–] BoxerDevil@lemmy.world 42 points 8 months ago (1 children)

And it will get LOSE and LOOSE mixed up like you did

[–] circuitfarmer@lemmy.world 31 points 8 months ago

I'm waiting for it to start using units of banana for all quantities of things

[–] raunz@mander.xyz 24 points 8 months ago (3 children)

ChatGPT trained on Reddit posts -> ChatGPT goes temporarily “insane”

Coincidence? I don't think so.

[–] public_image_ltd@lemmy.world 11 points 8 months ago (2 children)

This is exactly what I was thinking.

And maybe more people did what I did: not deleting my accounts, but replacing all my posts with content created by a bullshit generator. It made the texts look normal, but everything was completely senseless.

Back in June-July, I used a screen-tapping tool plus Boost to go through and overwrite every comment I could edit with generic filler text, waited something like two weeks in the hope that all of their servers would update to the new text, then used the same setup to delete each comment and post, and finally the account itself. It's about all I could think to do.
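
For anyone who'd rather script it than tap through an app, here's a rough sketch of the same two-pass idea using PRAW. Purely illustrative: it assumes script-type API credentials, and the API only exposes roughly your most recent comments.

```python
import praw

# Script-type OAuth credentials created at https://www.reddit.com/prefs/apps
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
    user_agent="comment-scrubber/0.1 (personal cleanup script)",
)

FILLER = "This comment has been replaced with generic filler text."

me = reddit.user.me()

# Pass 1: overwrite each comment so cached/mirrored copies eventually pick up the filler.
for comment in me.comments.new(limit=None):
    comment.edit(body=FILLER)

# Pass 2 (run a couple of weeks later): delete the overwritten comments.
# for comment in me.comments.new(limit=None):
#     comment.delete()
```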

[–] FiskFisk33@startrek.website 6 points 8 months ago (1 children)

They have always trained on Reddit data; GPT-2 was, at least. I'm unsure about GPT-1.

[–] Infynis@midwest.social 18 points 8 months ago (1 children)

ChatGPT also chooses that guy's dead wife

[–] Chainweasel@lemmy.world 7 points 8 months ago

The Narwhal Bacons at Midnight.

[–] londos@lemmy.world 18 points 8 months ago (3 children)

It also won't be able to differentiate between a jackdaw and a crow.

[–] starman2112@sh.itjust.works 15 points 8 months ago

On the contrary, it'll become excessively perfectionist about it. You can't even say "could have" without someone coming in and saying "THANK YOU FOR NOT SAYING OF".

[–] Daxtron2@startrek.website 14 points 8 months ago

It already was; the only difference is that now Reddit is getting paid for it.

[–] thantik@lemmy.world 13 points 8 months ago (1 children)

It was already trained on Reddit posts. It's just that now they're paying for it.

[–] bitchkat@lemmy.world 13 points 8 months ago

It's going to be a poop-knife-wielding guy with two broken arms out to get those jackdaws.

[–] outerspace@lemmy.zip 13 points 8 months ago (3 children)
[–] Norgur@fedia.io 13 points 8 months ago (3 children)

From now on, when you say something like "I think I can give my hoodie to my girlfriend", it will answer "and my axe".

[–] PurpleSheeple@lemmy.world 12 points 8 months ago (1 children)

And between were, we’re and where.

[–] db2@lemmy.world 8 points 8 months ago

Insure and ensure.

[–] driving_crooner@lemmy.eco.br 12 points 8 months ago (3 children)

ChatGPT was already trained on Reddit data. Check this video to see how one Reddit username caused bugs in it: https://youtu.be/WO2X3oZEJOA?si=maWhUpJRf0ZSF_1T
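
The bugs in question are the so-called "glitch tokens": a few prolific Reddit usernames ended up as single, barely-trained entries in the GPT-2/GPT-3 vocabulary. A quick way to poke at that with tiktoken; " SolidGoldMagikarp" is the commonly cited example and is only assumed here to be the kind of name the video covers.

```python
import tiktoken

# The BPE vocabulary used by GPT-2 and the early GPT-3 models
enc = tiktoken.get_encoding("r50k_base")

for text in [" SolidGoldMagikarp", " OrdinaryRedditUser"]:
    ids = enc.encode(text)
    # A glitch name collapses into a single, rarely seen token id,
    # while a normal name splits into several common sub-word pieces.
    print(f"{text!r}: {len(ids)} token(s) -> {ids}")
```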

[–] YoorWeb@lemmy.world 10 points 8 months ago (2 children)

It will also reply "Yes." to questions like "Is it A or B?"

[–] Witchfire@lemmy.world 9 points 8 months ago (1 children)

Don't forget the bullshit that is "would of"

[–] shalafi@lemmy.world 8 points 8 months ago (1 children)

"What is a giraffe?"

ChatGPT: "geraffes are so dumb."

[–] AbouBenAdhem@lemmy.world 5 points 8 months ago (1 children)

“I have not been trained to answer questions about stupid long horses.”

[–] JackLSauce@lemmy.world 8 points 8 months ago

"Can't even breath"

[–] kescusay@lemmy.world 8 points 8 months ago

Your right.

[–] SoyTDI@lemmy.world 7 points 8 months ago

And then and than.

[–] AnAustralianPhotographer@lemmy.world 6 points 8 months ago (1 children)

And when it learns something new, the response will be "Holy Hell".

[–] mannonym@lemmy.world 6 points 8 months ago (1 children)

Sure, it might have some effect, but a big part of ChatGPT besides the "raw" training data is RLHF, reinforcement learning from human feedback. Realistically, the bigger problem is training on AI-generated content that may have correct spelling but hardly makes sense.
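
For context, the reward-modelling half of RLHF boils down to a pairwise preference loss over human-ranked responses. A minimal illustrative sketch in PyTorch (not OpenAI's actual code; the scores below are made up):

```python
import torch
import torch.nn.functional as F

def reward_model_loss(r_chosen: torch.Tensor, r_rejected: torch.Tensor) -> torch.Tensor:
    """Bradley-Terry style pairwise loss used to train RLHF reward models:
    push the score of the human-preferred response above the rejected one."""
    return -F.logsigmoid(r_chosen - r_rejected).mean()

# Toy scores a hypothetical reward model gave three (chosen, rejected) response pairs.
chosen = torch.tensor([2.1, 0.3, 1.7])
rejected = torch.tensor([1.9, -0.5, 2.0])
print(reward_model_loss(chosen, rejected))  # the loss shrinks as chosen scores pull ahead
```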

[–] Feathercrown@lemmy.world 6 points 8 months ago

Is it a showerthought if it's actually just incorrect?

[–] wargreymon2023@sopuli.xyz 5 points 8 months ago

The same goes for Gemini; Google bought access to Reddit's API.
