this post was submitted on 13 Jul 2025
647 points (97.0% liked)

Comic Strips


Comic Strips is a community for those who love comic stories.

[–] DarkSirrush@lemmy.ca 70 points 1 week ago (2 children)

iirc the reason it still isn't used is that, even though it was trained by highly skilled professionals, it had some pretty bad racial and gender biases, and was only really accurate for white, male patients.

Plus the publicly released results were fairly cherry-picked for their quality.

[–] yes_this_time@lemmy.world 34 points 1 week ago* (last edited 1 week ago)

Medical sciences in general have terrible gender and racial biases. My basic understanding is that it has gotten better in the past 10 years or so, but past scientific literature is littered with inaccuracies that we are still going along with. I'm thinking of drugs specifically, but I suspect it generalizes.

[–] Ephera@lemmy.ml 19 points 1 week ago (1 children)

Yeah, there were also several stories where the AI just detected that all the pictures of the illness had, e.g., a ruler in them, whereas the control pictures did not. It's easy to produce impressive results when your methodology sucks. And unfortunately, those results get reported on before peer review is in and before others have attempted to reproduce them.
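The ruler problem described above is usually called shortcut learning. A minimal toy sketch (all data and the "classifier" here are made up for illustration, not from any real study): if every diseased photo in training happens to contain a ruler, a model can score perfectly by keying on that artifact, then fail completely on diseased patients photographed without one.

```python
import random

random.seed(0)

# Toy "images": lists of pixel intensities. In the diseased training
# photos a ruler was present, modeled here as pixel 0 set to 1.0 --
# a spurious cue that correlates with the label but isn't pathology.
def make_image(diseased, ruler_present):
    pixels = [random.random() for _ in range(64)]  # random.random() < 1.0
    if ruler_present:
        pixels[0] = 1.0  # the "ruler" artifact
    return pixels, diseased

train = [make_image(True, True) for _ in range(100)] + \
        [make_image(False, False) for _ in range(100)]

# A "model" that latched onto the artifact instead of the disease.
def shortcut_classifier(pixels):
    return pixels[0] == 1.0

train_acc = sum(shortcut_classifier(px) == y for px, y in train) / len(train)

# Held-out data: diseased patients photographed without a ruler.
test = [make_image(True, False) for _ in range(100)]
test_acc = sum(shortcut_classifier(px) == y for px, y in test) / len(test)

print(train_acc)  # 1.0 on the biased training set
print(test_acc)   # 0.0 once the spurious cue is gone
```

This is why results on the original dataset can look flawless while the model has learned nothing about the actual illness.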

[–] DarkSirrush@lemmy.ca 8 points 1 week ago

That reminds me: pretty sure in at least one of these AI medical tests, the model was reading metadata on the input image that included the diagnosis.
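A minimal sketch of that kind of metadata leak (the record layout and field names here are invented for illustration, not from any real system): if the image file's header carries the diagnosis, a "model" with access to it can echo the answer back without learning anything, so the metadata has to be stripped before inference.

```python
# Hypothetical image record where the file header leaks the label.
record = {
    "pixels": [0.2, 0.5, 0.9],       # what the model should see
    "metadata": {
        "scanner": "XR-200",         # made-up field names for illustration
        "diagnosis": "malignant",    # the label, leaked into the input
    },
}

def leaky_predict(rec):
    # A model with metadata access can just parrot the answer.
    return rec["metadata"].get("diagnosis", "unknown")

def sanitize(rec):
    # Keep only pixel data before the model sees the record.
    return {"pixels": rec["pixels"], "metadata": {}}

print(leaky_predict(record))            # "malignant" -- no learning required
print(leaky_predict(sanitize(record)))  # "unknown"
```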