this post was submitted on 19 Sep 2025
390 points (99.0% liked)

Not The Onion

[–] tal@olio.cafe 31 points 2 days ago (3 children)

The very first artificial general intelligence humanity created was born with an extensive understanding of breast jiggle.

[–] Remember_the_tooth@lemmy.world 15 points 2 days ago (1 children)

Is no one else worried about creating a horny AI that modeled its concept of consent from porn?

[–] tal@olio.cafe 12 points 2 days ago* (last edited 2 days ago)

I think that the broader concept of instilling desired ethics into an AI is part of the friendly AI problem, which is very real and serious, and possibly not reasonably solvable. So while I don't really think that Cortana 2045 running around raping humans or something like that is very high on my likely risk list, I think that the broader problem that contains that particular issue probably is something that we'll need to deal with.

[–] Rai@lemmy.dbzer0.com 4 points 2 days ago (1 children)

LLMs still struggle with chastity cages; we’ll be okay

[–] tal@olio.cafe 2 points 2 days ago* (last edited 2 days ago)

In the broad sense that LLMs' understanding of spatial relationships and objects is just limited in general, sure; that's the nature of the system.

If you mean that models simply don't have a training corpus that incorporates adequate erotic literature, I suppose that it depends on what one is up to and the bar one has. No generative AI in 2025 is going to match a human author.

If you're running locally, where many people use a relatively short context on systems with limited VRAM, I'd suggest a long context length for generating erotic literature involving bondage implements like chastity cages. Otherwise, once the "on/off" status of the implement passes out of the context window, the LLM loses track of that state and can generate text incompatible with it. If you can't afford the VRAM for that, you might look into altering the story so that a character using such an item never changes its state over the lifetime of the story, if that works for you. Or, whenever the item's status changes, manually update it at the appropriate points in the system prompt/character info/world info/lorebook/whatever your frontend calls its mechanism for injecting static text into the context at each prompt.
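To make that last option concrete, here's a minimal sketch of what I mean by keeping an item's state outside the chat history and re-injecting it on every prompt. Everything here is illustrative: the names and the buildPrompt hook are stand-ins, not any particular frontend's API.

```typescript
type ItemStatus = "locked" | "unlocked";

// The source of truth lives here, not in the chat history.
const worldState: Record<string, ItemStatus> = {
  chastityCage: "locked",
};

// Call this whenever the story changes an item's status
// (manually, or via a frontend hook).
function setItemStatus(item: string, status: ItemStatus): void {
  worldState[item] = status;
}

// Rendered fresh on every request, so the current state is always inside
// the context window no matter how long the story gets.
function renderWorldInfo(state: Record<string, ItemStatus>): string {
  const lines = Object.entries(state).map(
    ([item, status]) => `- ${item}: currently ${status}`,
  );
  return ["[World state]", ...lines].join("\n");
}

function buildPrompt(
  systemPrompt: string,
  recentHistory: string[],
  userTurn: string,
): string {
  return [systemPrompt, renderWorldInfo(worldState), ...recentHistory, userTurn].join("\n\n");
}

// Usage: flip the state when the story does, then rebuild the prompt.
setItemStatus("chastityCage", "unlocked");
console.log(buildPrompt("You are the narrator.", ["...earlier turns..."], "Continue the scene."));
```

The point is just that the state record, not the story text, is what the frontend trusts, so the status can never scroll out of the window.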

My own feeling is that, relative to current systems, there's probably room for considerably more sophisticated frontend handling of objects: storing their state and injecting it efficiently into the system prompt. The text of a story is not an efficient representation of world state. Maybe use an LLM itself to summarize world state and inject that summary into the context. Or, for games written specifically to run atop an LLM, have some sort of Javascript module that runs in a sandbox on each prompt and response, updates its world state, and dynamically generates text to insert into the context.
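As a toy sketch of that kind of per-turn module (the event patterns and field names are made up for illustration, and a real frontend would run this sandboxed with something sturdier than regexes):

```typescript
interface WorldState {
  turn: number;
  flags: Record<string, boolean>;
}

const state: WorldState = { turn: 0, flags: { cageLocked: true } };

// Runs after every model response: detect state-changing events and update
// the structured state object.
function onModelResponse(responseText: string): void {
  state.turn += 1;
  if (/unlock(s|ed)? the cage/i.test(responseText)) state.flags.cageLocked = false;
  else if (/lock(s|ed)? the cage/i.test(responseText)) state.flags.cageLocked = true;
}

// Runs before every prompt: the returned text gets inserted into the context.
// A compact summary like this is a far cheaper representation of world state
// than carrying the whole story text.
function contextInjection(): string {
  return [
    `[Tracked state, turn ${state.turn}]`,
    `- cage: ${state.flags.cageLocked ? "locked" : "unlocked"}`,
  ].join("\n");
}

// Example turn cycle.
onModelResponse("With a small key, she unlocked the cage at last.");
console.log(contextInjection()); // "- cage: unlocked"
```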

I expect that game developers will sort a lot of this out and develop conventions, and my guess is that the LLM itself probably isn't the limiting factor on this today, but rather how well we generate context text for it.

[–] ICastFist@programming.dev 1 points 2 days ago (1 children)

This is important research; one cannot correctly infer the jiggle movement and bounce without an ample and wide sample size!

[–] tal@olio.cafe 1 points 2 days ago

Even if they were wearing a mask, new, more-capable biometric analysis could often identify humans.