I see nobody talking about this, but aren’t the Waymos trained on data that most likely didn’t include these robots yet?
It seems like this is just something that’ll probably happen a lot whenever something new is introduced on the roads.
Expect reCaptcha to ask you to identify food delivery robots soon.
Wouldn’t be surprised tbh.
You can say that about nearly 100% of humans as well.
So I’m finding it hard to find a release date for humans, but I’m fairly sure they predated the invention of self-driving cars.
For example I seem to remember being alive in the 1990s
Inattentional blindness is a bitch.
Well, they should try to avoid any object in the road, to be honest. Imagine a new toy comes out that a child rides on. Sorry we killed that child, we didn’t train it on that new toy.
It’s just an unacceptable answer, to be honest.
That is also very true.
It’s not a Tesla so I’m sure they are investigating the cause.
deleted by creator
I don’t know about the equipment of Waymo cars, but I would be surprised if they didn’t have LIDAR or some other form of distance-based environment detection.
And that should be sufficient to implement basic obstacle detection. You don’t need to use machine learning if you can use sensors telling you that “something is too close”.
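To make that concrete, here’s a minimal sketch of the “something is too close” idea: a purely distance-based check on lidar returns, with no machine learning involved. The thresholds, the scan structure, and the function names are all made up for illustration and have nothing to do with Waymo’s actual stack.

```python
# Toy distance-based obstacle check: react to the closest lidar return,
# regardless of what the object is. Everything here is hypothetical.

from dataclasses import dataclass

@dataclass
class LidarScan:
    ranges_m: list[float]   # distance to the nearest return, per beam

STOP_DISTANCE_M = 2.0       # hypothetical "too close" threshold
SLOW_DISTANCE_M = 6.0       # hypothetical "start slowing down" threshold

def react_to_scan(scan: LidarScan) -> str:
    """Pick a braking action based only on the closest lidar return."""
    closest = min(scan.ranges_m)
    if closest < STOP_DISTANCE_M:
        return "emergency_brake"
    if closest < SLOW_DISTANCE_M:
        return "slow_down"
    return "continue"

# An object 1.5 m ahead triggers a hard brake, even if no classifier
# has ever seen that type of object before.
print(react_to_scan(LidarScan(ranges_m=[25.0, 12.3, 1.5, 8.0])))  # emergency_brake
```

The point of the sketch is just that proximity alone can gate braking, whatever the classifier thinks the object is.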
The article title is misleading as usual.
The car collided after hitting the brakes, and it seems there wasn’t any real damage. It seems the system is designed to only lessen the impact when it detects the obstacle as non-human. If it had recognized the robot as a human, it would probably have acted differently.
Better to hit the object and lessen the impact than to fully brake/avoid and risk worse.
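Reading those two comments together, the speculated behaviour is roughly a policy where the classification decides between full avoidance and impact mitigation. Here’s a toy sketch of that kind of policy; the labels, actions, thresholds, and function names are entirely hypothetical and not based on anything Waymo has published.

```python
# Toy decision policy as speculated above: full braking/avoidance for
# anything classified as human, impact mitigation otherwise.
# All labels, actions, and thresholds are hypothetical.

def plan_response(label: str, confidence: float, distance_m: float) -> str:
    """Pick a manoeuvre based on the perceived obstacle class and distance."""
    if label == "human" or confidence < 0.5:
        # Treat uncertain detections as if they could be a person.
        return "full_brake_and_avoid"
    if distance_m < 3.0:
        # Too late to stop cleanly: brake hard to lessen the impact.
        return "brake_to_mitigate_impact"
    return "brake_to_stop"

# A confidently classified delivery robot close ahead gets impact mitigation,
# while the same object classified as a human gets full avoidance.
print(plan_response("delivery_robot", 0.9, 2.0))  # brake_to_mitigate_impact
print(plan_response("human", 0.9, 2.0))           # full_brake_and_avoid
```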