Does Tesla offer massively lower insurance premiums for drivers that do the majority of their driving with FSD?
State Farm quoted me a rate of $240/mo for switching insurance to the Tesla (up from $130/mo). This is in California, Bay Area.
On a whim, I fired up the Tesla app and requested a quote for insurance through them.
They quoted me $134/mo.
I know, anecdata, size = 1. But I was surprised how low it was. I sent the coverage information to State Farm, to see if they would match it, but they just shrugged and said no, we can't match that.
[1] https://www.reddit.com/r/Insurance/comments/1kji0bm/just_fil...
As of the last year or so, I don't even have to touch the steering wheel anymore!
Is that legal where you live?
I find it very similar to operating an airplane with a reliable autopilot. The GFC-700 is super good at what it does. But it is still on me to monitor what it is doing, while at the same time significantly reducing my workload.
Like, it's OK to shout and scream about LIDAR and supervision and disengagement and all. But... it still drives itself! Really well.
But when you incorporate that tech into a fleet doing 100k residential miles a week with no supervisor, you're mowing down 12 kids a month.
There are no real numbers, because there are no real self-driving Teslas.
Nonetheless, our cars drive us around anyway. Neither they nor we actually care about hypothetical steamrollered kids.
Most people using FSD don't come close to the mileage needed to get a real idea of its safety level. If a Tesla robotaxi kills a kid, Tesla is done, and there won't be any coming back.
So Tesla actually needs millions of miles without critical intervention before they can confidently let these things out on the streets en masse.
A whole Tesla fanboy meetup collectively will not have enough FSD miles to see something like that, but a robotaxi fleet will encounter it within a year.
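To put rough numbers on the "millions of miles" point, there's a standard statistical shortcut: with zero critical events observed over m miles, you can claim the true event rate is below r with ~95% confidence once m exceeds about 3/r (the "rule of three"). A sketch with illustrative thresholds of my own choosing, not Tesla data:

```python
import math

def miles_for_confidence(target_rate_per_mile: float, confidence: float = 0.95) -> float:
    # Zero critical events over m miles supports the claim that the true
    # Poisson rate is below r with confidence c when exp(-r * m) <= 1 - c,
    # i.e. m >= -ln(1 - c) / r  (the "rule of three" when c = 0.95).
    return -math.log(1.0 - confidence) / target_rate_per_mile

# To claim fewer than 1 critical event per 100k miles with 95% confidence:
print(round(miles_for_confidence(1 / 100_000)))      # ~300k clean miles needed
# Per 10M miles (closer to serious-crash territory):
print(round(miles_for_confidence(1 / 10_000_000)))   # ~30M clean miles needed
```

Which is why a handful of individual owners can't settle the question, but a fleet accumulating 100k miles a week racks up statistically meaningful exposure within a year or two.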
What's the usefulness of this if you still have to pay attention at all times?
Nobody can be told what FSD is. You have to see for yourself.
> If FSD (Supervised) was active at any point within five seconds leading up to a collision event, Tesla considers the collision to have occurred with FSD (Supervised) engaged for purposes of calculating collision rates for the Vehicle Safety Report.
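As quoted, the attribution rule is simple enough to state in code. This is a sketch of my reading of the policy, with invented field names, not anything from Tesla's actual pipeline:

```python
from dataclasses import dataclass
from typing import Optional

ATTRIBUTION_WINDOW_S = 5.0  # the five-second window from the quoted policy

@dataclass
class Collision:
    time_s: float                       # timestamp of the impact
    fsd_last_active_s: Optional[float]  # last moment FSD was active, None if never

def counts_as_fsd_collision(c: Collision) -> bool:
    # "Active at any point within five seconds leading up to a collision"
    if c.fsd_last_active_s is None:
        return False
    return c.time_s - c.fsd_last_active_s <= ATTRIBUTION_WINDOW_S

# Disengaging 3 s before impact still counts toward FSD's collision rate:
print(counts_as_fsd_collision(Collision(time_s=100.0, fsd_last_active_s=97.0)))  # True
# Disengaging 10 s before impact does not:
print(counts_as_fsd_collision(Collision(time_s=100.0, fsd_last_active_s=90.0)))  # False
```

Note that under this reading, a disengagement seconds before a crash still gets attributed to FSD rather than to the driver.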
How is this not way more controversial than having to pay extra to activate features like heated steering wheels in other brands?
Ford and Chevy otoh are doing exactly the self-sabotaging greedy bullshit you expect. Chevy already told previous-gen Super Cruise owners that they are no longer getting updates or more mapped areas, and Ford just segregated their line into before and after 2025: 2025 cars get the latest BlueCruise version, earlier cars don't have the hardware (but you still need to pay annually for that deprecated older version!)
Ford also has never extended the mapped area in 4 years, and releases minor updates maybe once a year, after which you wait another 8 months to get the OTA.
All while having the nerve to charge $500/yr.
I think you're being obtuse, but to be clear, many car manufacturers offer trims that don't include features that would qualify as making the car 'safer': blind-spot detection, backup cameras (I think these are legally required now, but they were a premium feature for over a decade), parking assistance, crash detection, etc.
I have a Tesla and use FSD every day, and while it is a safety feature, it is _the_ pinnacle 'luxury' feature that a car can have today and they honestly do not charge enough for it.
Most of the things you mentioned aren't software-locked behind a paywall, hopefully; you don't swipe your credit card and get those features added via OTA in minutes. If your car doesn't have back-seat airbags, it's hopefully not because you haven't paid for the back-seat-airbag in-app purchase.
I didn't realize how much they market it as a safety feature.
I get that FSD (maybe) has/requires better hardware than my car. But what I hate about autopilot is all around basic driving:
* Lane centering. It's extremely aggressive about lane centering. If you're in the right lane and an onramp joins from the right, the car aggressively drifts to the right as soon as it perceives that the lane has widened.
* Throttle/brake behavior. It waits too long to brake (despite my car having radar, which can supposedly "see" more than one car ahead), and when it does apply the brakes it doesn't do so smoothly. It tips in somewhat aggressively, and you can feel the discrete steps as the brake force changes. Ditto for acceleration when the traffic in front of me moves.
There's no reason to think any of this has anything to do with compute power; it all seems to be programming decisions made for whatever reasons, so I can't see why FSD would be different.
And yet, if FSD drives like this, I don't get how anyone can think it's good. On the other hand, I've also heard people say they think Autopilot is good, which it clearly isn't, so it makes me judge their driving skill rather than the different models. But perhaps there's some matrix of hardware revisions and software/decision models out there that I'm unaware of, which would explain the differences in driving behavior, if they exist?
So I suppose they had a parallel development process for FSD while building autopilot features?
But since there is a hardware support overlap, it seems like at some point you'd migrate autopilot cars to the FSD software stack, with limitations added in via feature flags.
> But since there is a hardware support overlap, it seems like at some point you'd migrate autopilot cars to the FSD software stack, with limitations added in via feature flags.
This is exactly what a lot of people speculate will happen soon.
It's possible v14 is better. v12 was certainly better than v11 in all those regards. But there are still issues with the car making dumb lane choices in v12.
Isn't there a trial you can try?
There have been rumors of a v14 lite coming out, because Tesla REALLY doesn't want to deal with the fact that they promised the 2018s could be fully autonomous.
Just yesterday I got an ad in the app "refer a friend and try FSD (supervised), and get $15!"
So I opened up the app just now and I suppose I got my answer that proves my initial premise was incorrect -- I need a hardware upgrade for FSD. Womp womp.
(That said, it still seems like the lane centering and brake/throttle behavior should have been easily fixed without a FSD hardware upgrade).
Tesla is really trying to engineer goodwill, and unsurprisingly only using its own data, which papers over things like "driver intervention prevented FSD from crashing in this instance" or "FSD disengaged 2s before the crash, therefore driver error".
Rather than playing statistics games with self-reported, dressed-up supervised-driving data to try to trick people into rolling the mortality dice with a robotaxi, just let the cars drive around empty. But he can't do that, because these cars are not FSD.
Even worse, the government could easily mandate LIDAR for autonomous cars, and that would basically kill Tesla overnight.
I intervene on FSD all the time, but only due to differences of opinion, like I want to take a different route, or I think it's being too cautious, or I want to change lanes earlier / later than it decided to. In a year of use I can't remember a time when I intervened to prevent an accident.
So if you just looked at number of interventions in my data, it doesn't really tell you anything except that I'm too much of a control freak to just let it do its thing...
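Which is the whole problem with raw intervention counts: without a cause label, they can't distinguish safety-critical takeovers from control-freak preferences. A toy sketch with an invented log and made-up category names:

```python
from collections import Counter

# Hypothetical disengagement log for one driver over a year; categories
# are invented for illustration, not a real Tesla data field.
log = [
    "route_preference", "too_cautious", "lane_change_timing",
    "route_preference", "too_cautious", "route_preference",
]
counts = Counter(log)
total = sum(counts.values())
safety_critical = counts.get("collision_avoidance", 0)

print(f"{total} interventions, {safety_critical} safety-critical")
# A raw count of 6 looks bad; 0 safety-critical tells the real story.
```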
The dude has got six weeks to get those supervisors out of the car, and he isn't even giving the most critical data around them. Not a good sign.
Elon will probably find a near-closed loop around an empty development in Austin, put two robotaxis there without a driver, and let people do a loop. Tesla contractors will walk the area and stop people from crossing when a robotaxi is near. Then he will proclaim there are driverless taxis in Austin and that they will expand to the rest of the US by mid-2026, and then talk about how Optimus robots in 2027 are actually the real thing to focus on.
* Compared to the estimated U.S. average.
They have a huge store of data on accidents in Teslas per mile driven. Why don't they compare against their own actual accident data? Well, they would, but it's probably worse with FSD.
If they want to put themselves in a peer group that only has driver-assistance systems, then the comparison should be to other similarly priced new vehicles, not the U.S. average of 12-year-old Corollas.
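The baseline choice does a lot of work in a claim like that. Here's a sketch with numbers invented purely for illustration (not real Tesla or NHTSA data), showing how the same collision rate looks very different against the whole aging fleet versus comparable new cars:

```python
# Invented numbers, purely to show why the baseline matters.
# Units: collisions per million miles.
rates = {
    "fsd_fleet": 0.15,         # hypothetical FSD collision rate
    "us_average": 1.00,        # hypothetical all-vehicle average, old cars included
    "new_premium_cars": 0.40,  # hypothetical similarly priced new vehicles
}

print(round(rates["us_average"] / rates["fsd_fleet"], 1))        # 6.7x "safer" vs whole fleet
print(round(rates["new_premium_cars"] / rates["fsd_fleet"], 1))  # only 2.7x vs peer vehicles
```

Same product, same data; the headline multiplier is mostly a function of which denominator you pick.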
Anyway what is most amusing about this promotional material is that from the very first frame it inadvertently highlights how much worse it is than Waymo. The "Avoiding T-bones" scenario only seems like the car came out of nowhere because Tesla's camera system is so limited. Waymo would have seen it coming a mile away.
Do I need to have my hands on the steering wheel or is the car actually self driving? Can my Tesla drop me off where I need to be and go park itself? And then can I summon it when I'm done? Can I turn my Tesla into a taxi that picks up other people and earns me money? Is it even possible for a Tesla to drive itself unsupervised?
All of these have been promised over and over again for the past decade, and at this point it is impossible for anyone to set the right expectations for themselves.
> Do I need to have my hands on the steering wheel or is the car actually self driving?
You don't need to have your hands on the steering wheel, but you do need to be paying attention to the road.
> Can my Tesla drop me off where I need to be and go park itself?
Not yet.
> And then can I summon it when I'm done?
Yes.
> Can I turn my Tesla into a taxi that picks up other people and earns me money?
Not yet.
> Is it even possible for a Tesla to drive itself unsupervised?
They can (e.g., https://youtu.be/BO1XXRwp3mc), but it's not enabled for consumers yet.
The Cybertruck was a huge boondoggle, and only doctors and true believers bought them, but over time I've stopped seeing them in the parking lot when I see the doctors in the building. Hmmm.
Also Elon "skipping around like a dipshit" (Tim Walz), and now claiming he's worth a trillion dollars.
By his own admission his identity is intertwined with Tesla's.
He poisoned the brand for me and many others.
1: https://www.cbsnews.com/news/tesla-autopilot-staged-engineer...
2: https://www.cnn.com/2025/08/21/business/tesla-nhtsa-self-dri...
In the page:
"If FSD (Supervised) was active at any point within five seconds leading up to a collision event, Tesla considers the collision to have occurred with FSD (Supervised) engaged for purposes of calculating collision rates for the Vehicle Safety Report."
They are pretty open about how the stats are reported.
You really can't trust almost anything Musk says, he's proven this time and again, and Tesla will reflect that in its culture.
The driver can at any time. And they do if it seems like it is going to do something stupid - which is getting rarer and rarer as time goes on. As a Level 2 system, the driver is always supposed to supervise the operation and stay alert.
Musk has proven time and again that things his critics say are impossible/unrealistic end up being achieved, just late. See anything from reusing rocket stages to the goals from Tesla's 2018 Compensation Plan[1] ("If Mr. Musk were somehow to increase the value of Tesla to $650 billion — a figure many experts would contend is laughably impossible and would make Tesla one of the five largest companies in the United States ...")
Or the Arianespace guy saying SpaceX is "selling a dream". To quote: "I think a $5 million launch or a $15 million launch is a bit of a dream. Personally, I think reusability is a dream."
[1] https://www.nytimes.com/2018/01/23/business/dealbook/tesla-e...
[2] https://arstechnica.com/space/2024/06/some-european-launch-o...
Nevertheless, I can certainly believe the latest version of FSD is seven times less likely to get into an accident than the average driver. I experience it daily.