More or less, Tesla’s Autopilot is not as safe as Tesla would have you believe.
Still almost exactly half the crash rate of human-only drivers. Therefore, we should ban human-only driving.
I don’t think this is a fair comparison, since an Autopilot crash is a two-stage failure: both the Autopilot and the driver failed to avoid the crash. The statistics do not include the incidents where Autopilot would have crashed but the human took control and prevented it. If all instances of human intervention were included, I doubt Autopilot would be ahead.
If all instances of human intervention were included, I doubt Autopilot would be ahead.
Why would you interpret non-crashes due to human intervention as crashes? If you’re doing that for autopilot non-crashes you’ve gotta be consistent and also do that for non-autopilot non-crashes, which is basically…all of them.
If a human crashes and their action/vehicle is responsible for the crash, the crash should be attributed to the human (excepting mechanical failure, etc). I believe that if an advanced safety system, such as automatic braking, prevents a crash that otherwise would have occurred, the prevented crash should also be included in the human tally. Likewise, if Autopilot would have crashed if not for the intervention of the driver, the prevented crash should be attributable to Autopilot.
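To make that attribution idea concrete, here’s a minimal sketch with completely made-up numbers (the function and figures are just for illustration, not real Tesla or NHTSA data): once you charge intervention-saves against the side that would have crashed, the ordering can flip.

```python
# Sketch of the attribution rule above, with hypothetical numbers.
# A "prevented" crash is charged to whichever party would have crashed
# without the other's intervention.

def crash_rate(actual_crashes, prevented_crashes, miles):
    """Attributable crashes per million miles, counting crashes that were
    only avoided because the other party intervened."""
    return (actual_crashes + prevented_crashes) / miles * 1_000_000

# Hypothetical figures for illustration only.
human_only = crash_rate(actual_crashes=200, prevented_crashes=50, miles=100_000_000)
autopilot = crash_rate(actual_crashes=100, prevented_crashes=180, miles=100_000_000)

print(f"human-only: {human_only:.2f} per million miles")  # 2.50
print(f"autopilot:  {autopilot:.2f} per million miles")   # 2.80
```

With these made-up inputs, Autopilot has half the actual crashes but comes out worse once driver take-overs are counted against it, which is exactly the point being argued.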
As has often been studied, the major problem for autonomous systems is that until they are better than humans WITHOUT human intervention, the result can be worse than either alone. People are much less likely to pay full attention, and their reaction times suffer, when the autonomous system is in full control the majority of the time.
You’re missing the point: with a human driver there is accountability. If I, as a human, cause an accident, I have either criminal or civil liability. With Autopilot, the question of “who is at fault” gets murky. Then you have the fact that Tesla is not obligated to report these crashes, and the failures of automated driving are very different from human errors.
I don’t think anyone is suggesting that we ban autonomous driving. But it needs better oversight and accountability.
In these cases the human is still accountable. Do you think that if a Tesla plowed into a kindergarten while using Autopilot, the driver would avoid punishment? The driver is using a feature of the car. It tells you to stay alert and be prepared to take over on short notice. The ones crashing are the idiots who sit in the back seat, go to sleep, or play on their phones while Autopilot is on. The only self-driving right now where I would be in favour of punishing the company if something went wrong is those robotaxis where you are purely a passenger.
Sit behind the wheel and you are responsible for what happens.
I’m all for more accountability, but it’s still better than human driving. Cutting human car deaths in half in exchange for murky accountability is clearly a worthwhile trade.
My main issue with Tesla’s Autopilot is its branding and the way they advertise it.
Almost every non-tech person I talk to about it thinks it is a 100% hands-off robot driver, and that is a very, VERY dangerous idea.
It’s a very good system, and it is improving with every update, but it is far from the idea that many people have in their heads.
The videos you see of people sleeping on Autopilot are worrying. Do Teslas not have driver-attention monitoring? If I look away from the road for 5 seconds in my Mazda, it lets me know very loudly that it wants me to pay attention; if I were to fall asleep, it would do its best to wake me up. When I use its very simple and limited self-driving function, I can’t take my hands off the wheel for more than about 10 seconds before it alerts me.
They mention the crash rate being lower than a human’s, but what is the actual crash rate?
Irrelevant, whatever it is. Considering that a human is still ultimately responsible when they’re behind the wheel, whether or not “Autopilot” is running, it’s the human who should be credited with the lower crash rate.
Otherwise you risk incidents like this one, where the human intervenes in a near miss and actively stops the car from causing a severe accident, being counted as “pro-Autopilot” when it was the human who actually stopped the event from occurring.
No, the crash rate is definitely relevant, but I do get what you’re saying about the safety being human-caused.
While this is undeniably tragic in every instance, I can’t help but point out that the title had my sleep-deprived brain thinking, “how the hell did the car crash that many times and keep on driving?”
I’m so relieved I wasn’t the only fool, thank you lol
They’re running a beta test on the general public, only the thing they’re testing is a two-ton ball of metal and explosive material regularly traveling at 45 mph (70 km/h). They even have the gall to charge for the ability to beta test it. I really hope this gets regulated at some point; otherwise this is just the beginning.
Maybe a controversial opinion, but I’m glad they are charging for it. I wish there were a better way to vet who gets to be beta testers, but at least by charging money, they are ensuring only people who care about the technology get to use it.
Maybe I’m jaded, but it seems like drivers, in general, have gotten worse post-pandemic, and I wouldn’t trust 90% of them with autonomous driving features in the state it’s in.
I’m shocked it isn’t already regulated. I get that it’s a developing technology, but cars can be murderous.
Where I am, SAE Level 3 is effectively banned, in that you need authorization to test it on public roads, but SAE Level 2 is allowed. There are also SAE Level 5 vehicles in operation today; they’re just on private roads/property, and nearly all of them are regulated, just under workplace safety laws instead of driving laws.