I understand the reasoning behind it, but watching the video () of the test shows that the car did not warn the driver, and even if it had, it was going too fast, leaving almost no time for a driver to respond.
Disclaimer: I have never used FSD before.
() https://dawnproject.com/the-dawn-project-and-tesla-takedowns...
The warnings occur when you look away or don't touch the steering wheel for a while. Not saying that Tesla is without error (it isn't), just clarifying what the warnings are for.
So they are useless. My car warns me even if I don't look.
This test is about whether Tesla's self driving technology is safe for the people outside the vehicle.
Crash testing is much more than that. Check out NCAP for example. They specifically include safety ratings for other, vulnerable road users (i.e. pedestrians). And the Model 3 currently sits at the top spot of their 2025 rankings in that category.
This test shows the self driving software disobeys traffic laws (the school bus stop sign requires everyone else to stop), resulting in a pedestrian collision that would not have happened had traffic laws been followed.
Comparing "Tesla Takedown" with ExxonMobile is way too quick, you should have said Greenpeace. I'd say that TT has to do this, Is part of the point.
https://www.theverge.com/news/646797/nhtsa-staffers-office-v...
When the regular, in-theory bipartisan mechanisms fail, protest is all you have left.
Elon Musk, who also owns Tesla.
Why not take an objective, fact-based regulatory approach from the start?
Independent tests are what's needed, and preferably done by agencies who won't get gutted because they find something that's inconvenient.
What other hidden dangers slip by without public knowledge?
As for the fanbois and f*cbois, they have always existed and will always exist. They are the pawns. Smart mature people learn to lead, direct, manipulate, deflect and ignore them.
So if you want to get into the details and figure out why a video from "Tesla Takedown" should or should not be believed, I'm all ears, but I'm some random on the Internet. I don't work at NHTSA or anywhere that could effect change based on the outcome of this video. It's not going to affect my buying decisions one way or another; it'll only divide people who have decided this matters to them and can't get along with others.
https://en.wikipedia.org/wiki/Unsafe_at_Any_Speed%3A_The_Des...
I am wondering if there is a safety certification body for self-driving technology. If not, one is needed, because consumers can't be expected to be aware of all the limitations of the latest update they have installed.
There must be basic safety standards these systems need to meet; a disclaimer can't be the solution here.
- the failure here is that the car didn't stop for the bus on the other side of the road with the extended stop sign. (Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid)
- the FSD version for the robotaxi service is private and wasn't the one used for this test. The testers here only have access to the older public version, which is supposed to be used with human supervision
- The Dawn Project is a long-time Tesla FSD opponent that acts in bad faith - they are probably relying on a false equivalence between FSD beta and robotaxi FSD
Nevertheless, this is a very important test result for FSD Supervised! But I don't like that The Dawn Project is framing this as evidence for why FSD robotaxi (a different version) should not be allowed, without noting that they tested a different version.
That's why you slow down when you pass a bus (or a huge American SUV).
Every driver/car that obeys the law has absolutely no problem avoiding this collision with the child, which is why the bus has flashing lights and a stop sign.
Worst case, you needlessly slow down around a roadside parked schoolbus.
Best case, you save a kid's life.
https://www.mass.gov/info-details/mass-general-laws-c90-ss-1...
This is a major failure; failing to observe a stop sign and a parked school bus are critical mistakes. If you can't manage those, you're not ready to be on the road without a safety driver yet. There was nothing particularly difficult about this situation; these are the basics you must handle reliably before we even get to all the trickier situations those cars will encounter in the real world at scale.
I guess the thing I'm trying to reconcile is that even very safe drivers make critical mistakes extremely rarely, so the threshold at which FSD is safer than even the top 10% of human drivers likely includes some nonzero level of critical mistakes. Right now Tesla has several people mining FSD for any place it makes critical mistakes and these are well publicised so I think we get an inflated sense of their commonality. This is speculation, but if true it leaves some possibility of it being significantly safer than the median driver while still allowing for videos like this to proliferate.
I do wish Tesla released all stats for interventions/near misses/crashes so we could have a better and non-speculative discussion about this!
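To make the base-rate point concrete, here's a purely illustrative back-of-envelope sketch; every number in it is made up for the example (nothing here comes from Tesla, NHTSA, or any published statistics):

```python
# Hypothetical numbers only -- chosen to illustrate the argument, not measured.
human_critical_per_million_miles = 2.0   # assumed rate for a median human driver
fsd_critical_per_million_miles = 0.5     # assumed (better) rate for FSD
fleet_miles_per_day_millions = 5.0       # assumed daily fleet mileage

# Even at a quarter of the assumed human rate, a large fleet still produces
# critical mistakes every single day -- plenty of material for viral videos.
fsd_per_day = fsd_critical_per_million_miles * fleet_miles_per_day_millions
human_per_day = human_critical_per_million_miles * fleet_miles_per_day_millions
print(f"Hypothetical FSD critical mistakes/day: ~{fsd_per_day:.1f}")
print(f"Hypothetical human mistakes over the same miles/day: ~{human_per_day:.1f}")
```

The only point is that "videos of failures exist" and "statistically safer than the median driver" can both be true at once; actual released intervention/crash data would settle it.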
Do you really want to trust a heartless, profit-motivated corporation with 'better than human is good enough'?
What happens when Tesla decides they don't want to invest in additional mistake mitigation, because it's incompatible with their next product release?
No, but the regulator helps here - they do their own independent evaluation
> What happens when Tesla decides...
The regulator should pressure them for improvements and suspend licenses for self-driving services that don't improve.
https://techcrunch.com/2025/02/21/elon-musks-doge-comes-for-...
But I've read through your chain of replies to OP and maybe I can help with my POV.
OP is replying in good faith showing "this sampling incident is out of scope of production testing/cars for several reasons, all greatly skewing the testing from this known bad actor source."
And you reply with "Zero systemic, reproducible mistakes is the only acceptable criterion."
Well then, you should know that that is the current situation: in Tesla's own testing, they achieve this. The "test" in this article, which OP is pointing out, is not a standardized test run by Tesla on current platforms. So be careful with your ultimatums, or you might give the corporation a green light to say "look! we tested it!"
I am not a Tesla fan. However, I am also aware that yesterday thousands of people across the world were mowed down by human operators.
If I put out a test video showing a human running over another human with minimal circumstances met (i.e. rain, distraction, tires, density, etc.), would you call for a halt on all human driving? Of course not; you'd investigate the root cause, which most of the time is distracted or impaired driving.
Not really - it's a case of slowing down and anticipating a potential hazard. It's a fairly common situation with any kind of bus, or similarly if you're overtaking a stationary high-sided vehicle as pedestrians may be looking to cross the road (in non-jay-walking jurisdictions).
One of the prime directives of driving, for humans and FSD systems alike, must be "never drive faster than your brakes can stop you within the area you can see." This must account for scenarios such as a stopped obstacle, or one possibly coming the wrong way around a mountain turn.
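As a rough illustration of that rule (my own sketch, not the commenter's), total stopping distance is reaction distance plus kinematic braking distance; the reaction time and deceleration below are assumed values for dry pavement:

```python
def stopping_distance_m(speed_kmh: float,
                        reaction_time_s: float = 1.5,
                        decel_ms2: float = 7.0) -> float:
    """Rough total stopping distance: reaction distance + braking distance."""
    v = speed_kmh / 3.6                 # km/h -> m/s
    reaction = v * reaction_time_s      # distance covered before braking starts
    braking = v ** 2 / (2 * decel_ms2)  # kinematic braking distance v^2 / (2a)
    return reaction + braking

# The rule implies keeping speed low enough that this distance fits
# within the stretch of road you can actually see to be clear.
for kmh in (30, 50, 80):
    print(f"{kmh} km/h -> ~{stopping_distance_m(kmh):.0f} m to stop")
```

With these assumptions, 30 km/h needs roughly 17 m to stop while 80 km/h needs around 70 m, which is why the sight-distance rule bites hard around parked buses and blind turns.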
Your argument that a newer version is better simply because it's newer does not convince me. The new version could still have that same issue.
I actually didn't say that and am not arguing it formally - I said what I said because I think that the version difference is something that should be acknowledged when doing a test like this.
I do privately assume the new version will be better in some ways, but have no idea if this problem would be solved in it - so I agree with your last sentence.
Did they anywhere refer to this as Robotaxi software rather than calling it FSD, the term Tesla has chosen for it?
My understanding is that Tesla is the only manufacturer trying to make self-driving work with just visual-spectrum cameras --- all other vendors use radar/lidar _and_ visual-spectrum cameras.
As for being unrepresentative of the next release of FSD: we've had what, eight or ten years, of "it's going to work on the next release."
In that case it could have learnt that almost nobody ever fully stops at a stop sign, or, in this case, at a bus stopped with its stop sign extended.
People are way more accepting/understanding of drunk driving than passing school bus stop signals.
You're disagreeing with something I didn't say. There's a difference between afterthought and the primary initiator of the stop in a situation like this.
>Of course the car should recognize the stop sign, but there are cases in which stop signs are obstructed or missing, and in those cases pedestrians should still not be hit by a car.
The surprise pedestrian test is one that any vehicle can be made to fail by sheer physics. Avoiding errant pedestrians like in the video will likely only come as a byproduct of better situational behavior by self driving vehicles. The overwhelming majority of drivers know to ignore the speed limit if the situation is rife with pedestrians or otherwise sus and are generally fine with missing/obstructed stop signs. I don't know what route self driving software will take to approximate such behavior but it likely will need to.
There are different degrees of failure as well: did the Tesla try to brake beforehand, or did it only apply the brakes after hitting the doll?
https://www.reddit.com/r/teslamotors/comments/1l84dkq/first_...