this post was submitted to Technology on 07 May 2025 · 97 points (85.9% liked)
Those were not his words. They were someone else's words spoken by a very realistic puppet they made of him after he died.
That's weird at best, and does not belong in a court.
No doubt it's weird, but it was also a genuine attempt by a sister to speak for her beloved brother. I think it's beautiful and a perfect example of the importance of keeping an open mind, especially regarding things that make us uncomfortable.
So we agree on one point: weirdness.
It’s still got no business in a courtroom.
Why not? It wasn't used to influence the trial in any way; it was just part of the victim impact statements after the verdict was rendered.
Because a judge allowing anyone to represent their views in court as though those views belong to someone else is a textbook "bad idea." It is a misrepresentation of the truth.
So it would've been equally bad if instead of a video, she'd just read a statement she'd written in his voice? Something along the lines of:
Not at all, because it would have been her making claims about what she believes her brother would have said, and not a simulacrum of her brother speaking her words with his voice.
But that's what she did. She was upfront about the fact that it was an AI video reciting a script that she'd written.
You can say that all you want, but when your brain is presented with a video of a person, speaking in that person's voice, you're going to take what's being said as coming from the person in the video.
True, many people would have that problem, which is why the context in which the video was shown was acceptable; it was after the verdict had been given.
Such a thing should not impact sentencing, either. The judge allowed it, the judge was swayed by it, and it impacted sentencing. That is wrong.