When (not if) visuals are obscured, there is no backup. The vehicle is driving blind.
This is a fundamental engineering failure: a lack of redundancy in a critical safety system. In my opinion it can't and won't be fully overcome without additional feedback.
Obviously the most responsible choice is not to drive while sleep-deprived, or to pull over to a safe location and rest if one is too drowsy to operate a vehicle.
That said, if someone refuses to do that, for whatever reason, I have a hard time believing the upcoming FSD 14 stack will be more dangerous than driving drunk, or than driving while so drowsy as to be just as unsafe as driving drunk.
The older pre-NN stacks definitely weren't "there", but it takes either meaningful ignorance or motivated partisanship to argue that FSD 13 isn't at least approaching the threshold of being a better driver than the worst drivers on the road, if not definitively better.
Speaking of ignorance and motivated partisanship, how do you explain the fact that Tesla has filed a motion in federal court to keep automated crash data away from the public?
If FSD is as good as you say, I would expect them to jump at the chance to show everyone the proof.
https://www.reuters.com/legal/government/musks-tesla-seeks-g...