TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:
- The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
- This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
- The crashes are overwhelmingly Teslas rear-ending motorcyclists.
Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.
Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.
Accurate.
Each fatality I found where a Tesla killed a motorcyclist involved a cascade of three failures.
Taking out the driver will make this already-unacceptably-lethal system even more lethal.
… Also accurate.
God, it really is a nut punch. The system detects the crash is imminent.
Rather than automatically trying to evade… the self-driving tech turns off. I assume it's to reduce liability or make the stats look better. God.
Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.
Hopefully they wised up by now and record these stats properly…?
NHTSA collects data if self-driving tech was active within 30 seconds of the impact.
The companies themselves do all sorts of wildcat shit with their numbers. Tesla's claimed safety factor right now is 8x human: driving with FSD is supposedly eight times safer than the average human driver, or so they say on their earnings calls. Of course, that isn't supported by any data I've seen; they haven't published data that makes the claim externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its roughly 12x-safer-than-human system).
WITH a supervising human.
Once it reaches a certain quality, it should be safer with a human properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast, vast majority of crashes come from inattentive drivers, which is obviously a problem, and they need to keep improving the attentiveness monitoring. But a supervised system should still beat an unassisted human, because it can also catch things the human would otherwise miss.
Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.
Fascinating! I didn’t know all this. Thanks
Any time :)
If they ever fixed it, I’m sure Musk fired whoever is keeping score now. He’s going to launch the robotaxi stuff soon and it’s going to kill a bunch of people.
Even when it is just milliseconds before the crash, the computer turns itself off.
Later, Tesla brags that Autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.
There are at least two steps before those three:
-1. Society has been built around the needs of the auto industry, locking people into car dependency
That’s a good thing, because the alternative would be flipping the notion of property rights on its head. Making the owner not responsible for his property would be used to justify stripping him of his right to modify it.
You’re absolutely right about point -1 though.
You two don’t seem to strongly disagree. The driver is liable but should then sue the builder/seller for “self-driving” fraud.
Maybe, if that two-step determination of liability is really what the parent commenter had in mind.
I’m not so sure he’d agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and the software running on the car in general) be Free Software, squarely and completely within the control of the vehicle owner.
I would assume everyone here would agree with that 😘
I mean, maybe, but previously when I’ve said that it’s typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it’s somehow suddenly too dangerous to allow owners to control their property just because software is involved.
Lemmy is super pro FOSS.