Overall, yikes.
Oh good, Tesla vehicles apparently struggle with the task of "Hey, there's a car there" in degraded conditions.
Probably don't need to worry about that while driving though.
>Tesla also described internal data and labeling limitations that prevented a uniform identification and analysis of crash events with the subject system engaged. ODI believes this limitation could have led to under-reporting of subject crashes over portions of the defined time-period.
I thought Tesla was a "Software" company!
This report is insanely vague though. It's very preliminary, opened yesterday.
Yeah I think posting this here is premature without any details.
Maybe I'm misremembering things, but I feel like 4-5 years ago we didn't have these clickbait headlines that fed political discourse. It feels like reddit culture has permeated this place for a while.
Anytime one of Elon Musk's companies has a misstep, the headline violently shoots to the top of the front page.
This is an expected and understood result given the hardware and software involved.
You will not get past these issues without a RADICAL improvement in camera technology paired with specialized, dedicated processing hardware matched against several (and I mean several several) "common" environment profiles.
FSD is a scam. It's not safe. It is not technically sound.
The fact that there aren't many more accidents with the system is a byproduct of consistent, well-thought-out road standards, car standards, other safety systems present on cars, and driver education.
Eh, while I agree with you on the permeation of reddit culture on this board, this post is in no way clickbait or political in nature.
In fact, the title of this post is literally copy and pasted from the problem description.
It is a shameful engineering design to leave out LIDAR and it has cost human lives.
Let's hope Musk does not leave out something important for the moon landing. His proposal for it is absolutely ridiculous, it looks like a children's book fantasy and many smaller top-heavy craft have already toppled on the moon!
One giant leap?
The gravity is weaker so just jump down /s
If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human? Especially because a human has a window to override FSD, but FSD doesn't really get a chance to override a human, except in limited scenarios like automatic emergency braking. And it gets more people using it by providing FSD at a lower cost?
I'm not sure that's the case anymore. Each Tesla model has gotten more spartan over the years, and the interiors have never been all that "premium" compared with other manufacturers. They should still offer the most comprehensive safety features, but whether or not that's because of "luxury", I'm not sure.
Animats•49m ago
Does it not detect them at all, or fail to deal with detected sensor degradation adequately? Does "Full Self Driving (assisted)" slow down under conditions of poor visibility?
Does Tesla even look for the road surface? One big advantage of those up-top LIDAR units is that you have a good scan of the pavement ahead. If you're not sensing flat pavement ahead, don't go there. That's basic. Vision-only systems, going back to Mobileye, have been overly dependent on looking for known kinds of obstacles. Original Mobileye could only detect car rear ends.
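The "don't go where you can't sense flat pavement" idea is simple enough to sketch. This is a hedged illustration, not Tesla's or any vendor's actual logic; the function name and the 5 cm tolerance are made up for the example, and real systems fit a ground plane over a full point cloud rather than a handful of heights.

```python
# Toy version of a LIDAR "flat pavement ahead" check: given ground-return
# heights (meters, relative to the road plane) sampled ahead of the car,
# flag anything deviating too far from the median road height.
def pavement_looks_flat(heights_m, tolerance_m=0.05):
    median = sorted(heights_m)[len(heights_m) // 2]
    return all(abs(h - median) <= tolerance_m for h in heights_m)

print(pavement_looks_flat([0.00, 0.01, -0.02, 0.01]))  # True: flat road
print(pavement_looks_flat([0.00, 0.01, 0.30, 0.02]))   # False: obstacle or dropoff
```

The point is that this check needs no object classifier at all: a mattress, a crate, or a washed-out shoulder all fail it the same way.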
altairprime•33m ago
Under conditions of poor camera visibility?
Human drivers can see under conditions that cameras cannot, and people will otherwise misinterpret "visibility" unpredictably (and with personal bias) as referring to human sight, camera processing, or lidar processing.
XorNot•19m ago
They're just not that good - nowhere near human vision performance. And a human in a car has a surprisingly good view of the road and a very fast pan tilt system to look around.
Teslas do not actually have 360-degree full binocular vision coverage, nor can a camera lean left or right to improve an ambiguous sensor picture.
So while I fully believe that vision-only self driving could work, within the constraints of automobile platforms, and particularly the Tesla and its current camera deployment, it is not remotely close enough to human visual fidelity for that to be a valid argument.
buildbot•15m ago
Humans are hard to compete with! I'd want LIDAR & RADAR just to give me an edge.
giantrobot•16m ago
And this is an amazingly stupid statement. Humans drive with most of their senses, not just vision. In fact our proprioception plays an important role in driving.
Even Tesla's use of cameras is poor because they're monocular and fixed in place. Most humans have binocular vision and those visual sensors have multiple degrees of freedom and the ability to adjust focus.
Even if you wanted to only use vision for navigation it's irresponsible to not use binocular configurations that get more reliable depth sensing.
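The depth advantage of a binocular configuration comes straight from the stereo geometry: depth is focal length times baseline over disparity. A minimal sketch, with illustrative numbers that are not any real car's camera specs:

```python
# Stereo depth from disparity: Z = f * B / d, where f is focal length in
# pixels, B is the baseline between the two cameras in meters, and d is
# the pixel disparity of the same point between the two images.
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or matching failed")
    return focal_px * baseline_m / disparity_px

# Example: 1000 px focal length, 10 cm baseline, 5 px disparity -> 20 m.
print(stereo_depth_m(1000.0, 0.10, 5.0))  # 20.0
```

A monocular camera has no disparity to measure, which is why it must fall back on learned priors or motion cues to estimate depth.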
UltraSane•13m ago
At this point I truly don't understand why anyone cares what that liar says.
maxerickson•8m ago
It might be dishonest (if he doesn't believe it is possible), but I don't think he's saying that the current systems have reached the mark.
renlo•4m ago
I also own a Tesla, and there is no indication shown to the user that FSD's vision is degraded. They need to add this in.
For example, numerous times I have been driving my Tesla with FSD activated with ostensibly a clean and clear windshield when suddenly the car will do the "clean the windshield in front of the camera routine" without any indication that the car's camera is degraded. If people haven't seen this "clean the windshield routine", the wiper fluid is dispensed and the wiper will vigorously wipe in front of the camera only -- the rest of the windshield only gets a cursory wipe.
This indicates to me that the camera has poor visibility and that I, as the driver, am neither informed nor aware of it, which is concerning.

I am often curious whether there is a thin occluding film on the windshield inside the camera box, or something else degrading FSD's vision, but they do not give you the ability to view the camera feed, nor do they notify you that vision is degraded.

I suspect a thin occluding film may be in the camera box because my normal windshield, outside the camera box, started to show a thin chemical film after a couple of months, which apparently (according to a Google search) happens when a new car off-gasses, depositing a thin film of chemical byproduct on the windshield. This is my first new car, so I've no idea if that's normal.
Sohcahtoa82•30m ago
But it's that 0.1% of situations where the results will be catastrophic. Sure, you can detect vehicles, traffic cones, bikes (both bicycles and motorcycles), people, mopeds, traffic lights, lane markings, everything you'd expect on a road.
But what about the mattress that fell out of someone's truck? If the car doesn't know what a mattress is and what it looks like, it can't really adequately determine its size based on the monocular vision that Tesla has. Sure, maybe it could use motion vectors between video frames to make a guess, but I'm not convinced that's going to work well, especially relative to LIDAR.
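The underlying problem is the monocular scale ambiguity: from a single camera, apparent size alone cannot distinguish a large far object from a small near one, so the system has to *recognize* the object to guess its real size. A toy illustration (the numbers and function name are made up, assuming a simple pinhole projection):

```python
# Pinhole projection: apparent size in pixels = f * real_size / distance.
# Two very different objects can project to the exact same image size.
def image_size_px(real_size_m: float, distance_m: float, focal_px: float = 1000.0) -> float:
    return focal_px * real_size_m / distance_m

# A 2 m mattress at 40 m and a 0.5 m box at 10 m both span 50 px in one frame:
print(image_size_px(2.0, 40.0))  # 50.0
print(image_size_px(0.5, 10.0))  # 50.0
```

Motion between frames can partially break the ambiguity, but LIDAR measures range directly and never faces it.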
Steering back to the subject at hand...
> "In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred."
I don't think I've ever had my Tesla disable Autopilot based on road conditions, though maybe it's because when conditions are bad, I've just taken manual control preemptively. I've let it go through construction areas where cones are guiding traffic outside the painted lines, and surprisingly, it's handled it fine, though I've only done this at low speeds (~20 mph).
Camera visibility is another story. In heavy rain at night, I've had it not allow me to enable AP, though I've never had it disable AP and tell me to take control. However, it HAS limited the cruise speed based on visibility.
All this to say...
...anybody buying Tesla's FSD is being swindled, as far as I'm concerned. "FSD (Supervised)" is a scam. If you have to supervise it, it's not self-driving. It's just a party trick that you have to watch to make sure nothing goes wrong.
cornell532•6m ago
FSD is a better driver than me 99% of the time.
hn_acc1•4m ago
99.9% of driving in sea-level, non-rainy, near-the-coast California/Austin weather, maybe. I would guess it's a no-go in inland foggy conditions in CA, for example, or freezing rain in TX.
In terms of ALL conditions? 60-70% maybe.