Palantir's panel at a recent military conference, where executives joked and congratulated themselves on the work their AI tools are doing in Gaza, played like a scene of human ghouls from the darkest of horror movies.
Estimates vary as to how many of the 30,000-40,000 dead in Gaza are military combatants, but the figures seem to average around 20%. That is a terrible record of failure for an AI tool that touts its precision.
Why does the US government want to reward and endorse this tech? Why aren't people more alarmed? By any measure, Palantir's demonstrated track record is surely one of failure. The Israel-Hamas war is the first time the world has seen AI deployed at scale in warfare. It's a grim indication for the future.