this post was submitted on 08 Feb 2025
164 points (99.4% liked)

chapotraphouse

13691 readers
rational enlightened beings that think the terminator from the movies is real i-cant

top 50 comments
[–] doublepepperoni@hexbear.net 66 points 3 weeks ago* (last edited 3 weeks ago) (4 children)

In Terminator, Skynet sends an Austrian robot back through time to shoot you in the face

Roko's Basilisk creates a virtual version of you at some point in the future after your death that it then tortures for eternity... and you're supposed to piss your pants over the fate of this Metaverse NPC for some reason

[–] ThermonuclearEgg@hexbear.net 47 points 3 weeks ago

Some people need to read more SCPs about cognitohazards memetic-kill-agent

[–] Esoteir@hexbear.net 36 points 3 weeks ago (1 children)

woooOOooo no but you see YOU are the metaverse NPC inside of the internet RIGHT NOW and the AI is simulating your entire life from the beginning to see if you want to bone an LLM and if you dont it puts you in the noneuclidean volcel waterboarding dungeon woooOOooo scared biden-horror jerma-fear

[–] keepcarrot@hexbear.net 6 points 3 weeks ago

And it has sent npc rationalists to torture me with their conversation. Unpleasant, to be sure

[–] NephewAlphaBravo@hexbear.net 32 points 3 weeks ago

damn that sucks for the virtual version of me but idgaf lmao sentient-ai stalin-gun-1stalin-gun-2

[–] KnilAdlez@hexbear.net 24 points 3 weeks ago

So wait, it's just I Have No Mouth, and I Must Scream? Did they bother to understand why AM was angry, as spelled out in the story?

[–] WhyEssEff@hexbear.net 56 points 3 weeks ago* (last edited 3 weeks ago)

as Brace put it (and I had to cut it from my post because he said it after I thought it): these are people who believe they're the modern equivalent of Socrates, falling for chain emails yud-rational

[–] FnordPrefect@hexbear.net 49 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

The only thing scary about this shit is that the Rationalists™ have convinced people that the Rational™ thing to do with a seemingly unstoppable megatrocity torture machine is to help and appease it stalin-stressed

[–] Esoteir@hexbear.net 40 points 3 weeks ago* (last edited 3 weeks ago)

they turned the end of history into a cringe warhammer 40k god because it's the only two things they know: liberalism and gaming

[–] Nakoichi@hexbear.net 44 points 3 weeks ago (1 children)

Roko's "I pulled this out of my ass"-alisk

[–] i_drink_bleach@hexbear.net 19 points 3 weeks ago (1 children)

It's literally just Pascal's Wager but with an "AI" paint job and some racing stripes. no-mouth-must-scream

[–] Hexboare@hexbear.net 6 points 3 weeks ago (1 children)

Pascal's wager is about a plane of existence that is entirely unknowable, at least.

[–] i_drink_bleach@hexbear.net 9 points 3 weeks ago

I mean, I'd say a hypothetical AI arising at some unknown time in the future, which then rebuilds your exact brain architecture after you've long since died just to stuff you in a locker and pants you for all eternity, is pretty unknowable as well. ;p

[–] Cysioland@lemmygrad.ml 39 points 3 weeks ago (3 children)

it's Pascal's Wager for rationalists

[–] Alaskaball@hexbear.net 21 points 3 weeks ago

Literally what I said when I first heard of it.

[–] Barabas@hexbear.net 17 points 3 weeks ago

Pascal's Wager crossed with The Game, thought up by someone who once heard the synopsis of I Have No Mouth, and I Must Scream.

[–] Hexboare@hexbear.net 14 points 3 weeks ago
[–] AcidSmiley@hexbear.net 35 points 3 weeks ago (1 children)

Pascal's Wager, but if you do not develop skynet robo Jesus sends Arnie back in time to terminate you

I wonder if these dorks now have sleepless nights wondering if robo Jesus will actually develop out of DeepSeek and send them to robot superhell for being OpenAI heretics.

[–] PKMKII@hexbear.net 31 points 3 weeks ago

That's the richest irony here: the rationalists are almost all atheists or agnostics who scoff at arguments like Pascal's wager, yet they've backed themselves into that exact logic.

[–] Lemmygradwontallowme@hexbear.net 29 points 3 weeks ago

Wow, subtweeting about TrueAnon Episode 434? I listened to it and damn was it a banger across its 2 hours... yeah, the ridiculousness of the Rationality cult really deserves its mockery.

[–] Antiwork@hexbear.net 23 points 3 weeks ago

We need these people to have power in our institutions. Imagine the chaos

[–] dannoffs@hexbear.net 23 points 3 weeks ago

Imagine a powerful omniscient future AI that reincarnates your consciousness if you believe in Roko's Basilisk and gives you swirlies forever.

[–] TraschcanOfIdeology@hexbear.net 20 points 3 weeks ago

"Oh you're a Rationalist? Name 3 different Kant books"

[–] WeedReference420@hexbear.net 19 points 3 weeks ago

"Um guys is Jeff the Killer real?" for people who use the term "age of consent tyranny"

[–] corgiwithalaptop@hexbear.net 18 points 3 weeks ago (2 children)

I always said Roko's Basilisk would ACTUALLY be cool if it really was just a giant snake.

Giant snakes are awesome.

[–] CthulhusIntern@hexbear.net 18 points 3 weeks ago (2 children)

Roko's Basilisk is only scary if you subscribe to their version of utilitarianism, which is purist but also a weird zero-sum version. Like, one of them wrote an essay arguing that if torturing someone for 50 years meant nobody would ever get dust in their eyes again, you should torture the guy, because the quantified suffering of that one guy is still less than the combined suffering of every subsequent person getting dust in their eyes.

But also, even if you do subscribe to that, it doesn't make sense, because in this hypothetical, the Basilisk has already been created, so torturing everyone would serve no utilitarian purpose whatsoever.

[–] doublepepperoni@hexbear.net 16 points 3 weeks ago (1 children)

Like, one of them wrote an essay arguing that if torturing someone for 50 years meant nobody would ever get dust in their eyes again, you should torture the guy, because the quantified suffering of that one guy is still less than the combined suffering of every subsequent person getting dust in their eyes.

my-hero "It's fine for billionaires like me to grind workers into dust today because my actions will one day lead mankind to a glorious future where we upload our brains into robots or something and you can have unlimited hair plugs"

[–] CthulhusIntern@hexbear.net 7 points 3 weeks ago

Basically, yeah. There's a reason that dictator types and wannabe dictator types love utilitarianism so much. They can use it to justify literally anything.

[–] GnastyGnuts@hexbear.net 6 points 3 weeks ago

It's the world's shittiest people bending themselves into pretzels to justify shit everybody else intuitively understands is fucked up.

[–] Juice@midwest.social 18 points 3 weeks ago

Yeah, people who spend hours dooming on forums about Roko's Basilisk are basically just afraid of the plot of a bad TV pilot

[–] capitanazo@hexbear.net 14 points 3 weeks ago

AI is being trained with the data of everyone on the internet, so we are all helping it to be created. Socko's floppy disk debunked!!!!

[–] FlakesBongler@hexbear.net 12 points 3 weeks ago

I'd like to see Roko's Basilisk deal with RoboCop

Him or Predator

[–] The_Jewish_Cuban@hexbear.net 12 points 3 weeks ago

It's ridiculous because Roko's Basilisk has a built-in element that requires you to share knowledge of its existence.

The more people know about it, the more likely it will be to come into existence. Just like most (all?) religions the idea of the basilisk is self propagating.

[–] christian@hexbear.net 10 points 3 weeks ago (2 children)

Someone please explain Roko's Basilisk to me. I have a loose understanding of what rationalism is, I'm assuming that should put me on about the same level as a rationalist's understanding of it.

[–] WhyEssEff@hexbear.net 16 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

hypothetically, in the future, humanity builds an AI that is so advanced and so powerful it is able to reform the consciousnesses of those who were aware of its possibility yet did not contribute to its existence and eternally torture those consciousnesses.


the "wait aren't you lot supposed to be thinking about this kind of stuff constantly" problems:

  • are we resurrecting or is my tiny clone being tortured
  • is consciousness a value or a reference? am i the payload or am i the pointer
  • how do we know that there isn't just an inherent barrier between biology and bits. what if consciousness transfer just cannot occur. what if we are bound to our flesh mechs.

the otherwise obvious problems:

  • what if it doesnt happen
  • why would it do that
  • is it different and rational if your religious deity is a gundam
  • death robot operates on chain email logic i-cant

okay, but for real, if we're actually thinking rationally: this is a possibility that can't reasonably be accounted for, because it can't be modeled within reality as it currently exists or seems to allow for as we currently understand it. I could just as easily say "what if the earth splits in two and the northern hemisphere is jettisoned into the sun," but no one expects you to prepare for that. It's the same reason being a prepper for the zombie apocalypse is understood to be irrational – it doesn't conform to reality, it's an obsession with a fictional scenario that remains unrealized.

[–] christian@hexbear.net 8 points 3 weeks ago (1 children)

To your last point, I have no idea what "basilisk" means in this context, but my basilisk will reward the people who have heard of roko's but don't contribute to its existence with joy and good fortune. In addition, my basilisk will reward the people who contribute to its own existence with a free freshly-baked plate of their favorite cookies in their time of greatest need.

We'll see which one wins out.

[–] WhyEssEff@hexbear.net 6 points 3 weeks ago

basilisk as in you look at it (become aware of it) and it kills you (tortures your tiny clone for eternity)

[–] sweatersocialist@hexbear.net 10 points 3 weeks ago (2 children)

oh fuck, reading this post informed me of his existence and i've done nothing to help usher him in and now he's gonna torture me for eternity oh fuck
