In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also hitting a mannequin of a child. The Tesla was also fooled by simulated rain and fog.
What would definitely help the discussion is if Mark Rober the scientist had left a fucking crumb of scientific approach in his video. He didn’t really explain how he was testing it, he just slammed the car into things for views. That, plus a collaboration with a company that makes lidar, left the video open to every possible criticism, and it’s a shame.
Discovery Channel level of dumbed-down „science”.
Okay, but what would you like him to elaborate on, other than showing you that the Tesla is fooled by a Road Runner-style mural, fog, and dense rain?
How much more info than just “the car didn’t stop” (where the other car did stop) do you need to be convinced this is a problem?
Did he enable the Autopilot? When? What were his inputs to the car? Is it FSD? Which car is it?
You can make any car hit a wall, that’s the obvious part. But since he’s claiming (truthfully, I have no doubt) that the car hit it on its own, I’d like to know what made it do so.
He said after the first of the five tests that every Tesla test has Autopilot on, because some features are only enabled then.
You didn’t watch the video, did you? He addresses that after the first test and says all further tests will be done with self-driving on.
I have no doubt the car will crash.
But I do feel there is something strange about the car disengaging the Autopilot (cruise control) just before the crash. How can the car know it’s crashing while simultaneously not knowing it’s crashing?
I drive a Model 3 myself, and there is so much bad shit about the Autopilot and rain sensors. But I have never experienced, or heard of anyone else experiencing, a false positive where the car disengages the Autopilot under any conditions the way shown in the video, with no sound or visual cue. Considering how bad the sensors on the car are, it’s strange that they’re state of the art every time an accident happens. There is dissonance between the claims.
Mark shouldn’t have made so many cuts in the upload. He locks the car at 39 mph in the video, but it crashes at 42 mph. He should have kept it clean and honest.
I want to see more of these experiments in the future. But Mark’s video is pretty much a commercial for the lidar manufacturer. And commercials shouldn’t be trusted.
This. If the video presented more facts and wasn’t paid for by the competition, it would be trustworthy. Otherwise it’s just clickbait (very effective, judging by the fact that we’re discussing it).
Actually, his methodology was very clearly explained. Did you watch the whole video? He might have gushed a bit less about LiDAR, but otoh laymen don’t know about it, so it stands to reason he had to explain the basics in detail.
Found the Tesla owner!
😋
So Tesla owners have a monopoly on caring about the process of an experiment?
A logical conclusion from that is that anyone who isn’t a Tesla owner is incapable of critical thought?
How is this a win?
What did you not like about his process?
I fucking hate Tesla and Elon Musk. I also fucking hate people calling unverifiable shit science.
You’re upset that made-up people in your head called this video a research project or something? Because the closest thing I could find to what you’re complaining about is his YouTube channel’s description, where it says “friend of science”.
He never claimed to be a scientist, doesn’t claim to be doing scientific research. In his own words, he’s just doing some tests on his own car. That’s it.
Well, it was published, up to you to do a peer review I guess!
Also, this doesn’t need to be science; it blatantly shows that things do in fact not function as intended.
Just FYI, they used AEB in one car and cruise control in the other. Far from an even comparison. I think it was a fail from the start, considering they couldn’t even get AEB to fire on the Tesla when driving without cruise control. Insane.
Where is a robust description of the experiment? Or am I supposed to look frame by frame at the screen in the car to deduce the testing conditions?
All he had to do was tell us clearly what was enabled on each car and what his inputs were. That would settle all the Tesla fanbois’ comments about him cheating. Maybe he didn’t, for „engagement”.
He made an elaborate test track specifically to make interesting observations.
He set up dozens of cameras to record interesting observations from multiple angles.
He collected footage of interesting phenomena he observed as they were happening in his elaborate test environment.
He then cut the footage up so much it’s impossible for us to say exactly what really happened.
If he went to all this trouble and then made claims based on his experiment, would it really hurt the video to explain the testing process a little bit more?