this post was submitted on 03 Apr 2024
138 points (96.0% liked)

World News

38831 readers
2501 users here now

A community for discussing events around the World

Rules:

We ask that the users report any comment or post that violate the rules, to use critical thinking when reading, posting or commenting. Users that post off-topic spam, advocate violence, have multiple comments or posts removed, weaponize reports or violate the code of conduct will be banned.

All posts and comments will be reviewed on a case-by-case basis. This means that some content that violates the rules may be allowed, while other content that does not violate the rules may be removed. The moderators retain the right to remove any content and ban users.


Lemmy World Partners

News !news@lemmy.world

Politics !politics@lemmy.world

World Politics !globalpolitics@lemmy.world


Recommendations

For Firefox users, there is media bias / propaganda / fact check plugin.

https://addons.mozilla.org/en-US/firefox/addon/media-bias-fact-check/

founded 1 year ago
MODERATORS
[–] mwguy@infosec.pub 2 points 6 months ago (3 children)
[–] wahming@monyet.cc 3 points 6 months ago (1 children)
[–] mwguy@infosec.pub 1 points 6 months ago

I mean, it probably has a neural network component.

[–] intrepid@lemmy.ca 3 points 6 months ago (1 children)

Doesn't mean that it won't hallucinate. Or whatever you call an AI making up crap.

[–] mwguy@infosec.pub 0 points 6 months ago (1 children)

LLMs hallucinate all the time; the hallucination is the feature. Depending on how you design the neural network, you can get an AI that doesn't hallucinate. LLMs have to hallucinate, because they're mimicking human speech patterns and predicting one of many possible responses.

A model that tries to predict the locations of people likely wouldn't work like that.
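
As a rough illustration (a toy sketch, not any real model's API; the distribution and numbers below are invented): an LLM samples from a probability distribution over next tokens, which is where the "one of many possible responses" comes from, while a location predictor can just return its single best estimate.

```python
import random

# Toy next-token distribution after a prompt like "The capital of France is".
# (Invented numbers; a real LLM assigns probabilities over ~100k tokens at
# every generation step.)
next_token_probs = {
    "Paris": 0.90,
    "Lyon": 0.06,
    "beautiful": 0.04,
}

# Generation samples one of many plausible continuations. That sampling is
# the "hallucination" mechanism: occasionally a fluent-but-wrong token wins.
tokens, weights = zip(*next_token_probs.items())
print(random.choices(tokens, weights=weights, k=1)[0])

# A model predicting someone's location doesn't need to sample; it can just
# return its single highest-scoring estimate (argmax), so there's no built-in
# "make something up" step.
print(max(next_token_probs, key=next_token_probs.get))  # always "Paris"
```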

[–] laughterlaughter@lemmy.world 2 points 6 months ago (1 children)

Other AI systems can have hallucinations too.

[–] mwguy@infosec.pub 1 points 6 months ago

The primary feature of LLMs is the hallucination.