Tesla under investigation by California attorney general over Autopilot safety, marketing::The California attorney general is investigating Tesla over the electric car company’s driver assistance technology, CNBC has learned.
Tesla Autopilot is not safe; it's killing people: https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
Just look at this insane pileup and tell me that you want to drive this crap: https://www.businessinsider.com/tesla-stops-tunnel-pileup-accidents-driver-says-fsd-enabled-video-2023-1?op=1
About the pileup case, don't get me wrong: if the Tesla hadn't stopped, the accident wouldn't have happened. That much I agree with, and it's terrible; the feature should be disabled until these issues are solved.
But in the end, most of the pileup was caused by people being people. It could have been a normal car, or even a Tesla that stopped for a good reason, and the pileup would have happened anyway. The first cars stopped fine, but then other cars that didn't keep a safe distance, were going too fast, or were distracted started crashing.
You can see the pileup in the video. The Tesla phantom-brakes for no reason, and the cars driving behind slam into it.
They wouldn't have slammed into it if they'd kept a safe distance, as @[email protected] wrote. I'm in no way defending Tesla's "Autopilot"; it should be banned until it passes a very difficult test proving true self-driving capability and multiple layers of fail-safes (which it can't right now). But examples where an Autopilot Tesla did something stupid and other people then made human errors are disingenuous: if somebody drops their cigarette, brakes unexpectedly, and the cars behind don't keep their distance and slam into them, the cause of the accident is not the cigarette but the unsafe following distance.
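The following-distance argument above can be sketched with some back-of-the-envelope arithmetic. This is a minimal sketch assuming a constant-deceleration model; the braking deceleration (~0.7 g) and reaction time (1.5 s) are illustrative assumptions, not figures from the thread:

```python
def stopping_distance(speed_ms: float, reaction_s: float = 1.5,
                      decel_ms2: float = 6.9) -> float:
    """Total distance to stop: distance rolled during the driver's
    reaction time plus braking distance v^2 / (2a)."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

# At 100 km/h (~27.8 m/s) a trailing driver needs roughly 100 m of
# clear road to stop under these assumptions. If the gap to the car
# ahead is shorter than that and the lead car stops dead (phantom
# braking or otherwise), a rear-end collision follows.
speed = 100 / 3.6  # 100 km/h in m/s
print(round(stopping_distance(speed), 1))
```

The point being: whether the lead car stopped for a cigarette or a software glitch doesn't change the arithmetic; the gap the trailing driver left does.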
Drivers aren't safe either, and they're killing people. The FUD around this hyper-focuses scrutiny on Tesla. The real shame here is leaving it up to Tesla at all: governments should be throwing huge piles of money at R&D, best-practice standards, and breaking down barriers to testing and deployment. Thousands of lives are being lost because we are not moving fast enough on safe self-driving tech. We have the ability to do this, and it's a shame it's going so slowly at the cost of driver and pedestrian lives.
TÜV says no.
The technology definitely still needs work as it is, but I trust Autopilot more than I trust the average commuter I have to deal with. And I deal with a lot.
Agreed. Anyone who claims FSD is a worse driver than a human is just objectively wrong and doesn't know what they're talking about. Is it flawless? No. Is it better than the best human driver? Probably not. Is it miles ahead of the average driver? Without question.
Autopilot on freeways? Definitely better than the average driver. FSD on freeways? Same thing. I rely on those constantly, and also get frustrated when people complain about AP being unsafe.
FSD on streets? Definitely still worse than the average driver, at least in places that don't have perfectly laid-out street grids and properly painted lane lines, which is what I deal with. I can't make it through a drive on streets without disengaging multiple times.
The problem is these three different things get lumped together in conversations/articles all the time.
Yeah, the article seems to use Autopilot and FSD interchangeably, but I was talking about FSD, and I expect everyone else talking about "autopilot" to mean it as well, because the Autopilot feature in itself is nothing special.
I'm basing my opinion on the videos I've seen from AI DRIVR on YouTube, who really seems to put it in challenging spots. But yeah, I don't have any first-hand experience myself, so I can't argue against that.