
Tesla just announced 500 million miles driven by FSD [1]. Per the video, were it fully autonomous, they could have a 95% CI on "safer than human" at only 275 million miles [2] (a rough sketch of that math follows the links), but obviously having human supervision ought to remove many of the worst incidents from the dataset. Does anyone know if they publish disengagement data?

[1] https://digitalassets.tesla.com/tesla-contents/image/upload/...

[2] https://youtu.be/yaYER2M8dcs?t=477
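For reference, a back-of-the-envelope version of how a figure in that range can fall out of a zero-event bound. The human baseline rate (~1.1 fatalities per 100 million miles) and the use of a one-sided Poisson bound ("rule of three") are my assumptions, not necessarily the video's actual method:

    # Sketch: miles of fatality-free driving needed before a one-sided 95%
    # Poisson bound puts the fleet's fatality rate below a human baseline.
    # The baseline of ~1.1 deaths per 100M miles is an assumption.
    import math

    human_rate = 1.1e-8  # assumed human fatalities per mile

    # With zero events over N miles, the one-sided 95% upper bound on the
    # rate is -ln(0.05) / N, i.e. roughly 3 / N (the "rule of three").
    required_miles = -math.log(0.05) / human_rate

    print(f"{required_miles / 1e6:.0f} million miles")  # ~272 million, close to the quoted 275M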



This just shows how statistics can mislead. I own a Tesla with FSD and it's extremely unsafe for city driving. Just to quantify: I'd say at its absolute best, about 1 in 8 left turns results in a dangerous error that requires me to retake control of the car. There is no way it comes close to the safety of a human driver.


I only spent 3/4 of my post adding caveats, geez. Thanks for the firsthand intuition, though.


The caveats are missing the point that FSD is very obviously less safe than a human driver, unless you constrain the data to long stretches of interstate during the day, with nice weather, clearly marked lane lines, and minimal construction. Even then, my "intuition" tells me human drivers are probably still safer, and under typical driving conditions they very obviously are (at least compared with Tesla FSD; I don't know about Waymo).


The reason why I spent 3/4 of my post on caveats was because I didn't want people to read my post as claiming that FSD was safe, and instead focus on my real point that the unthinkable numbers from the video aren't actually unthinkable anymore because Tesla has a massive fleet. You're right, though, I could have spent 5/6 of my post on caveats instead. I apologize for my indiscretion.


> my real point that the unthinkable numbers from the video aren't actually unthinkable anymore because Tesla has a massive fleet

Yes, I'm addressing that point directly, specifically the fact that this "unthinkable number" is misleading regardless of the number's magnitude.


FSD's imperfections and the need for supervision do not invalidate the size of Tesla's fleet and its consequent ability to collect training data and statistics (eventually, deaths-per-mile statistics). The low-fleet-size assumption in the presentation is simply toast.

If I had claimed that the 500 million number indicated a certain level of deaths-per-mile safety, that would be invalid -- but I spent 3/4 of my post emphasizing that it did not, even though you keep pretending otherwise.


You could start by comparing highway driving, where I think Tesla actually is quite good.


Tesla's mileage numbers are meaningless because the human has to take over frequently. They claim credit for miles driven, but don't disclose disengagements or near misses.

California companies with real self-driving have to count their disengagements and report all accidents, however minor, to the DMV. You can read the disengagement reports online.


Do you trust claims and data from Tesla?


Do you think they lied about miles driven in the investor presentation?

Nah, that would be illegal. Their statement leaves plenty of room for dirty laundry, though. I'm sure they won't disclose disengagement data unless forced, but they have plenty of legal battles that might force them to disclose. That's why I'm asking around. I'd love to rummage through it. Or, better, to read an article from someone else who has spent the time.


> Nah, that would be illegal.

Musk has violated many rules regarding investors.


Note that it would need to drive those 275 million miles without incident to be safer than a human.

Which for Tesla's FSD is obviously not the case.

https://www.motortrend.com/news/tesla-fsd-autopilot-crashes-...


Your video and my response were talking about fatal crashes. Humans don't go 100 million miles between crashes.

Has FSD had a fatality? Autopilot (the lane-follower) has had a few, but I don't think I've heard about one on FSD, and if their presentations on occupancy networks are to be believed there is a pretty big distinction between the two.


Isn't "FSD" the thing they're no longer allowed to call self driving because it keeps killing cyclists? Google suggests lots of Tesla+cyclist+dead but with Tesla claiming it's all fine and not their fault, which isn't immediately persuasive.


> Google suggests lots of Tesla+cyclist+dead but with Tesla claiming it's all fine and not their fault, which isn't immediately persuasive.

With human drivers at the wheel -- are we blaming Tesla for those too?

You do you, but I'm here to learn about FSD. It looks like there was a public incident where FSD lunged at a cyclist. See, that's what I'm interested in, and that's why I asked if anyone knew about disengagement stats.


It appears that the clever trick is to have the automated system make choices that would be commercially unfortunate - such as killing the cyclist - but to hand control back to the human driver just before the event occurs. Thus Tesla are not at fault. I feel ok with blaming Tesla for that, yeah.


Is that real? I've heard it widely repeated but the NHTSA definitions very strongly suggest that this loophole doesn't actually exist:

> https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_Da...

The Reporting Entity’s report of the highest-level driving automation system engaged at any time during the period 30 seconds immediately prior to the commencement of the crash through the conclusion of the crash. Possible values: ADAS, ADS, “Unknown, see Narrative.”


"It appears" according to what?

Stuff people made up is a bad reason to blame a company.


From here[1]:

> The new data set stems from a federal order last summer requiring automakers to report crashes involving driver assistance to assess whether the technology presented safety risks. Tesla‘s vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

[1] https://www.washingtonpost.com/technology/2022/06/15/tesla-a...


You would also need to cite them actually using that as a way to avoid fault.

Especially since the first sentence you quoted strongly suggests those crashes do get counted.


Yeah, their very faux self driving package.



