
Last year, two Waymo robotaxis in Phoenix "made contact" with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles' software. A "recall" in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn't pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn't elaborate on what it meant by saying that its robotaxis "made contact" with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren't carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to "persistent orientation mismatch" between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
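
Waymo hasn't published the fix itself, but a detector for that failure mode might look something like the following sketch, which compares an object's body heading against its direction of travel over consecutive frames (all names and thresholds here are hypothetical, not Waymo's actual code):

```python
import math

def is_persistently_mismatched(headings_deg, velocities, threshold_deg=120, min_frames=10):
    """Flag a tracked object whose body orientation disagrees with its
    direction of travel for many consecutive frames, e.g. a pickup
    being towed backwards. Hypothetical logic, not Waymo's actual fix."""
    run = 0
    for heading, (vx, vy) in zip(headings_deg, velocities):
        if math.hypot(vx, vy) < 0.5:          # skip near-stationary frames
            continue
        travel = math.degrees(math.atan2(vy, vx))
        diff = abs((heading - travel + 180) % 360 - 180)   # wrap to [0, 180]
        run = run + 1 if diff > threshold_deg else 0
        if run >= min_frames:
            return True    # predict motion from velocity, not body pose
    return False
```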

[–] bstix@feddit.dk 13 points 9 months ago (1 children)

Unexpected or not, it should do its best to stop or avoid the obstacle, not drive into it.

An autonomous vehicle shouldn't ever be able to actively drive forward into anything. Basic collision detection ought to brake the car here. If something is in the position the car wants to drive to, it simply shouldn't drive there. There's no reason to blame the obstacle for being towed incorrectly.
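
Something like this toy sketch is all I mean: a dumb final gate that refuses to move into space a sensor currently reports as occupied, whatever the predictor says (grid cells and names are made up):

```python
def safe_to_advance(planned_cells, occupied_cells):
    """Refuse to move into any cell a sensor currently reports occupied,
    regardless of what the trajectory predictor expects. Toy sketch."""
    return not (set(planned_cells) & set(occupied_cells))

# The planner only executes a step that passes the gate:
# drive() if safe_to_advance(next_cells, sensed_cells) else brake()
```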

[–] NotMyOldRedditName@lemmy.world 7 points 9 months ago* (last edited 9 months ago) (1 children)

In this case, it thought the vehicle had a different trajectory because of how it was improperly towed.

The car probably thought it wasn't going to hit it until it was too late and the trajectory calculation proved incorrect.

Every vehicle on the road is a few moments away from crashing if we calculate that incorrectly. It doesn't matter if it knows it's there.

[–] bstix@feddit.dk 2 points 9 months ago (1 children)

Same thing applies to a human driver. Most accidents happen because the driver makes a wrong assumption. The key to safe driving is not getting into situations where driving is based on assumptions.

Trajectory calculation is definitely an assumption and shouldn't be allowed to override whatever sensor is checking for obstructions ahead of the car.

[–] NotMyOldRedditName@lemmy.world 2 points 9 months ago (1 children)

The car can't move without trajectory calculations though.

If the car ahead of you pulls forward when the light goes green, your car can start moving forward as well, keeping in mind the lead car's trajectory and speed.

If the rule were just "don't hit an object in your path," the car wouldn't move forward until the lead was halfway down the block.

The car knew the truck was there in this case; it wasn't a failure to detect. Due to a programming failure, it thought it was safe to move because it predicted the truck wouldn't be there.

If you're following a vehicle at a proper distance and it slams the brakes, you should be able to stop in time, as you've calculated its trajectory and a safe speed behind it. But if that same vehicle slams on the brakes and goes into reverse, well... good luck.
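
As a sketch, the usual following check needs both trajectories (the braking rate and latency are made-up round numbers):

```python
def can_stop_in_time(gap_m, v_follow, v_lead, a_brake=7.0, t_react=0.5):
    """Worst case: the lead slams its brakes. We're safe if our stopping
    distance fits inside the current gap plus the lead's own stopping
    distance. ~7 m/s^2 braking and 0.5 s latency are assumed numbers."""
    d_follow = v_follow * t_react + v_follow**2 / (2 * a_brake)
    d_lead = v_lead**2 / (2 * a_brake)   # the lead can't stop instantly either
    return gap_m + d_lead >= d_follow

# A lead that shifts into reverse breaks the model entirely:
# the gap then closes from both ends, hence "good luck".
```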

It's all assumptions, even assuming the detection is accurate in the first place.

[–] bstix@feddit.dk 1 points 9 months ago (1 children)

> If you're following a vehicle at a proper distance and it slams the brakes, you should be able to stop in time, as you've calculated its trajectory and a safe speed behind it.

You don't need to calculate their trajectory. It's enough to know your own.

If a heavy box falls off a truck and stops dead in front of you, you need to be able to stop. That box has no trajectory, so it's an error to include other vehicles' trajectories in the safe-distance calculation.

Traffic can move through an intersection closely by calculating a safe distance, which may be smaller than the legal definition, but still large enough to stop for anything suddenly appearing on the road. The only thing needed is that the distance is calculated based on your own speed and a visually confirmed position of other things. It can absolutely be done regardless of the speed or direction of other vehicles.
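
As a sketch, the rule I mean needs nothing but your own speed (the braking figures are illustrative):

```python
def min_safe_gap(v_self, a_brake=7.0, t_react=0.5):
    """Size the gap from your own speed alone, so you can stop even for
    an object doing 0 mph, like a dropped box. 7 m/s^2 braking and 0.5 s
    reaction are illustrative numbers, not from any standard."""
    return v_self * t_react + v_self**2 / (2 * a_brake)

# At 60 mph (26.8 m/s): 0.5*26.8 + 26.8**2/14 ≈ 64.7 m of clear road needed.
```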

Anyway, a backwards-facing truck is a weird thing to misinterpret. Trucks sometimes face backwards for whatever reason.

It would be interesting to know how the self-driving car would react to a ghost driver (a wrong-way driver).

[–] NotMyOldRedditName@lemmy.world 1 points 9 months ago* (last edited 9 months ago) (1 children)

> You don't need to calculate their trajectory. It's enough to know your own.

This doesn't make sense. It's why I was saying the car wouldn't move at a stop light when it goes green until the lead car was halfway down the street.

If the car is 2.5 seconds ahead of me at 60 mph on the highway, it's only 2.5 seconds ahead if it keeps doing 60 mph. If it's doing 0 mph, then I'm going to crash into it.

It needs to know how fast and what direction the obstacle is going, and how to calculate the rate of acceleration/deceleration and extrapolate from there.
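
As a sketch, that extrapolation is just kinematics (a toy constant-acceleration version; real stacks predict whole distributions of trajectories):

```python
def extrapolate(pos, vel, acc, dt):
    """Constant-acceleration guess at where an obstacle will be dt
    seconds from now: p + v*dt + 0.5*a*dt^2, per axis. Sketch only."""
    return tuple(p + v * dt + 0.5 * a * dt * dt
                 for p, v, a in zip(pos, vel, acc))
```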

[–] bstix@feddit.dk 0 points 9 months ago (1 children)

2.5 seconds at 60 mph is more than enough to come to a full stop. If the car in front of you dropped an anvil (traveling at 0 mph) on the road, you could stop before crashing into the anvil. You do not need to drive into the other car's trajectory path.

[–] NotMyOldRedditName@lemmy.world 1 points 9 months ago* (last edited 9 months ago) (1 children)

You can't be driving behind that vehicle at 60 mph with a 2.5 s gap WITHOUT knowing its trajectory.

You keep saying it doesn't need to know the trajectory of all the objects around it, but that's not true.

[–] bstix@feddit.dk 1 points 9 months ago (1 children)

Yes you can. It is a stopping distance. 2.5 seconds at 60 mph is 220 feet. A car can brake from 60 to 0 in less than 220 feet. It will take longer than 2.5 seconds to do so, but it won't hit the object that was originally 2.5 seconds ahead.
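
The arithmetic, for anyone checking (the braking rate is an assumed round figure):

```python
# 60 mph in feet per second, then the 2.5 s gap and braking distance:
v = 60 * 5280 / 3600        # 88.0 ft/s
gap = v * 2.5               # 220.0 ft
a = 29                      # ~0.9 g braking, in ft/s^2 (assumed figure)
stop = v**2 / (2 * a)       # ~133.5 ft, comfortably inside 220 ft
print(gap, round(stop, 1))  # 220.0 133.5
```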

[–] NotMyOldRedditName@lemmy.world 2 points 9 months ago (1 children)

Maybe driving straight behind isn't as good an example, although even there the car is calculating the likelihood of the lead continuing to go straight.

An oncoming car, drifting out of the lane towards your lane.

It's not going to hit you until it's in your lane, but its trajectory is carrying it into your path.

If you don't consider where it's going and how fast it's going, you won't know if it's going to enter your lane before you pass it.

If you're only trying to avoid hitting objects, and it's not in your path until the last quarter second, you won't take appropriate action, because you don't know it's coming at you.

All these measurements are taken as time between you and them, and the car uses that info to calculate the trajectories.
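
Roughly, the check for the drifting oncoming car compares two times (a one-axis sketch with made-up parameter names):

```python
def enters_my_lane_before_we_pass(lat_clearance, drift_rate, closing_speed, dist):
    """Compare the time the oncoming car needs to drift across the
    remaining lateral clearance against the time until we pass each
    other. One-axis sketch; units are metres and metres per second."""
    if drift_rate <= 0:                  # holding its lane or drifting away
        return False
    return lat_clearance / drift_rate < dist / closing_speed
```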

[–] bstix@feddit.dk 1 points 9 months ago

Yes, I know, and it should. What I am saying is that the trajectory calculations should never be allowed to override the basic collision calculations, as they did in this case.

It does not matter if the towed truck appeared to have a different trajectory than it actually had, because it was very obviously within collision range.

Do you have a reverse sensor in your car that beeps when you're close to stuff?

It was the self-driving car that drove into the tow truck. All its sensors must've been beeping, and it still decided to keep driving.
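
Spelled out, the priority I'm arguing for is simple (the threshold is illustrative, not from any real stack):

```python
def arbitrate(predicted_clear, proximity_m, hard_stop_m=0.5):
    """Prediction may add caution but may never veto a raw proximity
    alarm: anything inside the hard-stop radius forces a brake.
    The 0.5 m threshold is illustrative only."""
    if proximity_m < hard_stop_m:
        return "BRAKE"                   # sensor override, no exceptions
    return "PROCEED" if predicted_clear else "BRAKE"
```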