
Last year, two Waymo robotaxis in Phoenix "made contact" with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles' software. A "recall" in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn't pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn't elaborate on what it meant by saying that its robotaxis "made contact" with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren't carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to "persistent orientation mismatch" between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
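
As a rough illustration of what an "orientation mismatch" can do to a motion prediction (a hypothetical sketch, not Waymo's actual pipeline): a predictor that extrapolates a tracked object's path along its body heading gets a backwards-towed pickup exactly wrong, because the truck's body points opposite to its direction of travel.

```python
# Hypothetical sketch: why a persistent orientation mismatch can break a
# motion prediction that trusts a tracked vehicle's body heading.
import math

def predict_position(x, y, heading_rad, speed_mps, horizon_s):
    """Naively extrapolate along the perceived heading of the tracked object."""
    return (x + speed_mps * horizon_s * math.cos(heading_rad),
            y + speed_mps * horizon_s * math.sin(heading_rad))

# A pickup towed backwards: it is actually moving east (+x) with the tow
# truck, but its body is oriented west (heading = pi), so the predicted
# trajectory points the wrong way and the planner expects the lane to clear.
perceived_body_heading = math.pi   # backwards-facing body (west)
actual_travel_heading = 0.0        # real direction of travel (east)

print(predict_position(0, 0, perceived_body_heading, 5.0, 3.0))  # about (-15, 0)
print(predict_position(0, 0, actual_travel_heading, 5.0, 3.0))   # (15, 0)
```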

[–] Chozo@kbin.social 27 points 9 months ago* (last edited 9 months ago) (1 children)

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it.

Having worked at Waymo for a year troubleshooting daily builds of the software, I'd say this sounds like they may be trying to test riskier, "human" behaviors. Normally, a car won't accelerate at all if its lidar detects an object in front of it, no matter what it thinks the object is or what direction it's moving in. So the fact that this failsafe was somehow overridden makes me think they're trying to add more "What would a human driver do in this situation?" options to the car's decision-making process. I'm guessing somebody added something along the lines of "assume the object will have started moving by the time you're closer to that position" and forgot to set a backup safety mechanism for the event that the object doesn't start moving.
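
To make the guess concrete, here's a minimal sketch of the failure mode being speculated about; the names, structure, and checks are invented for illustration and are not Waymo's actual decision logic:

```python
# Hypothetical sketch of a prediction-based override missing its backup check.

def may_accelerate(obstacle_detected: bool,
                   predicted_to_clear: bool,
                   obstacle_still_present: bool) -> bool:
    if not obstacle_detected:
        return True

    # Speculated "human-like" override: trust the prediction that the obstacle
    # will have moved out of the way by the time the car reaches it...
    if predicted_to_clear:
        # ...the forgotten backup safety check would be something like:
        # if obstacle_still_present: return False
        return True

    # Conservative failsafe: never accelerate toward a detected obstacle.
    return False

# The towed pickup never cleared the lane, yet the override proceeds anyway:
print(may_accelerate(obstacle_detected=True,
                     predicted_to_clear=True,
                     obstacle_still_present=True))  # True -> "made contact"
```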

I'm pretty sure the dev team also has safety checklists that they go through before pushing out any build, to make sure that every failsafe is accounted for, so that's a pretty major fuckup to have slipped through the cracks (if my theory is even close to accurate). But luckily, a very easily-fixed fuckup. They're lucky this situation was just "comically stupid" instead of "harrowing tragedy".

[–] GiveMemes@jlai.lu 1 points 9 months ago (2 children)

Get your beta tests off my tax dollar funded roads pls. Feel free to beta test on a closed track.

[–] Chozo@kbin.social 22 points 9 months ago (1 children)

They've already been testing on private tracks for years. There comes a point where, eventually, something new is used for the first time on a public road. Regardless, even with idiotic crashes like this one, they're still safer than human drivers.

I say my tax dollar funded DMV should put forth a significantly more stringent driving test and auto-revoke the licenses of anybody who doesn't pass, before I'd want SDCs off the roads. Inattentive drivers are one of the most lethal things in the world, and we all just kinda shrug our shoulders and ignore that problem, but then we somehow take issue when a literal supercomputer on wheels with an audited safety history far exceeding any human driver has two hiccups over the course of hundreds of millions of driven miles. It's just a weird outlook, imo.

[–] fiercekitten@lemm.ee -1 points 9 months ago (1 children)

People have been hit and killed by autonomous vehicles on public streets due to bad practices and bad software. Those cases aren’t hiccups, those are deaths that shouldn’t have happened and shouldn’t have been able to happen. If a company can’t develop its product and make it safe without killing people first, then it shouldn’t get to make the product.

[–] Chozo@kbin.social 2 points 9 months ago (1 children)

People have been hit and killed by human drivers at much, much higher rates than by SDCs. Those aren't hiccups, and those are deaths that shouldn't have happened, as well. The miles-driven-per-collision ratios for humans and SDCs aren't even comparable. Human drivers are an order of magnitude more dangerous, and there's an order of magnitude more human drivers than SDCs in the cities where these fleets are deployed.

By your logic, you should agree that we should be revoking licenses and removing human drivers from the equation, because people are far more dangerous than SDCs are. If we can't drive safely without killing people, then we shouldn't be licensing people to drive, right?

[–] fiercekitten@lemm.ee 0 points 9 months ago (2 children)

I’m all for making the roads safer, but these companies should never have the right to test their products in a way that gets people killed, period. That didn’t happen in this article, but it has happened, and that’s not okay.

[–] Chozo@kbin.social 4 points 9 months ago* (last edited 9 months ago)

People shouldn't drive in a way that gets people killed. Where's the outrage for the problem that we've already had for over a century and done nothing to fix?

A solution is appearing, and you're rejecting it.

[–] ShepherdPie@midwest.social 1 points 9 months ago

Who's been killed by autonomous vehicles?

[–] DoomBot5@lemmy.world 7 points 9 months ago

Full releases have plenty of bugs.