this post was submitted on 08 May 2025
78 points (98.8% liked)

Futurology

2614 readers

founded 2 years ago
top 13 comments
[–] ShellMonkey@lemmy.socdojo.com 24 points 3 weeks ago (1 children)

So a pretty solid representation of most corp execs then.

[–] sunzu2@thebrainbin.org 3 points 3 weeks ago

It appears to be human nature; "leadership" just gets away with it because that's how a power structure works.

Little people get fucked, real people make money no matter what.

[–] eestileib@lemmy.blahaj.zone 12 points 3 weeks ago

Amoral bullshit machine? No wonder senior management types love these things.

[–] TankieTanuki@hexbear.net 10 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

It passed the test! It's ready to replace real executives.

[–] Jabril@hexbear.net 7 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

They really are just like humans!

[–] barrbaric@hexbear.net 7 points 3 weeks ago

Whoa, hey, let's not go too far here. I dispute classifying finance officers as "human".

[–] qx128@lemmy.world 10 points 3 weeks ago

Unequivocal proof that it’s trained on human behavior

[–] AceFuzzLord@lemm.ee 7 points 3 weeks ago

Congratulations! We can now start replacing finance officers with AI!

/s

[–] ech@lemm.ee 6 points 3 weeks ago (1 children)

Assigning a lot of morality and intention to word-generating algorithms. "Oh no! The thing made to generate text from a prompt generated text from a prompt!"

[–] merc@sh.itjust.works 1 points 1 day ago

Especially because the data it has been trained on is probably not typical for a CFO in the real world. In the real world it's a lot of boring day-to-day stuff that isn't worth writing up. The stuff worth writing up is exciting thrillers where a CFO steals money, or news reports where a CFO embezzles.

Imagine prompting it with "It was a dark and stormy night" and expecting it to complete that with "so David got out his umbrella, went home, ate dinner in front of the TV, then went to bed." It's probably what most Davids would actually do, but it's not what the training data is going to associate with dark and stormy nights.

[–] homesweethomeMrL@lemmy.world 3 points 3 weeks ago

I mean - yeah? What complete dipshit wouldn't expect it to do that?

[–] Valmond@lemmy.world 2 points 3 weeks ago

-"You're hired!"

[–] N0t_5ure@lemmy.world 1 points 3 weeks ago

I read that they trained it on a Trump dataset though.