It is interesting to compare the commentary here vs. on Tesla fan pages, where they say the biggest problem with FSD is that there are so few interventions now that they can't improve the data set much from real world data. It's just too good. The only thing stopping a nationwide Robotaxi rollout is those fuddy duddy regulators who refuse to sign off on the clearly superior technology because they're in Google's pocket. It's not hard to see how Tesla is a meme stock at this point.
The theoretical future of Tesla owners being able to put their cars into robotaxi mode when they're not using them does not exist yet, and may never.
If it's a winning strategy and Tesla can make money from it, Tesla will keep all the profits for themselves.
Leasing cars to Uber drivers has taken off in the last decade. No reason for Tesla to sit on a quickly depreciating asset like a car if they can double dip (initial purchase and Taxi profit share) without having to pay for electricity, depreciation, or maintenance.
My experience with FSD is that while it feels “magic” at times, it’s like a teenage driver that you have to babysit constantly. It’s genuinely impressive how well it works given the really limited hardware, but if you use it routinely you know it will make at least one weird/dangerous choice on every trip.
Generally, I really don’t trust it in most situations except properly delineated highways, but even then it can be a crapshoot. If you’ve experienced FSD then get in a Waymo, they are night and day different—a lot more predictable, and able to navigate uncertainty compared with what Tesla has built. It’s likely down to a combination of both software and their insistence that radar doesn’t matter, but it clearly does.
I would never get in a Tesla that purports to drive itself, there’s no way it’s safe or worth the risk. I won’t even use it with my family in the car.
I know a handful of others who own Teslas and feel the same, despite what the fans spout online. I generally like my Model Y, but I definitely do not trust FSD—I find it hard to believe that it’s even being taken seriously in the media. Not a great endorsement if even your own customers don’t trust it after using it.
Yeah, it would be hilarious if it weren't so horrifying. I remember watching a level crossing be represented as a weird traffic light that would go from red to off to red erratically, with a similarly erratic convoy of trucks representing the train.
Mind you I remember people claiming FSD was "nearly done" because they'd "tackled all the hard problems, and were now in clean up", and how as a result that meant they could let their FSD take itself through a roundabout, not just straightlining it through. Never underestimate the power of denial.
Even if it's hypothetically 99% as good as Waymo at the moment, 99% is not good "enough" when it comes to something as critical as driving.
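A quick back-of-envelope (all numbers hypothetical, purely to illustrate why "99%" compounds badly): if each mile independently has a 99% chance of needing no intervention, long trips still almost always need one.

```python
# Hypothetical numbers: per-mile reliability compounds over a trip.
p_per_mile = 0.99     # assumed chance of an intervention-free mile
trip_miles = 100      # assumed trip length

p_clean_trip = p_per_mile ** trip_miles
print(f"{p_clean_trip:.1%} chance of an intervention-free {trip_miles}-mile trip")
# prints: 36.6% chance of an intervention-free 100-mile trip
```

So a system that is "99% good" per mile still produces an intervention on most long trips, which is why the gap to a system at, say, 99.999% per mile is enormous in practice.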
To complete the analogy, Tesla is invested in vision-only technologies, while its competitors are making gains with Lidar and other tech that Tesla refuses to acknowledge. It's very reminiscent of Roomba in the mid 2010s.
The Matic is a cool little robot though.
Perhaps more like plankton.
> The [...] company warned in its earnings results [on 12 March 2025] that there’s doubt about whether it can continue as a going concern.
Meanwhile Roomba seems to have done...pretty much nothing? Reminds me of the death of Skype when everyone transitioned to literally everything else while they floundered around.
As a counter anecdote, I do use FSD with my family in the car. I have also used it on snowy roads and logging roads, and it does quite well. Not unsupervised well, but better than I expected given that I'm running FSD on a nearly six-year-old car. The number of trips around town that have been totally interventionless has definitely been going up lately, and usually interventions have been because I wanted to be more aggressive, not because the car was making a major error or even being rough.
I have no motivation to be positive; I own no Tesla stock or position and just like it because it's the best car for me currently. I cannot emphasize enough just how different my lived experience has been from how you describe it.
It definitely has come a good way since I first got my car, but it's still _unpredictable_ and even seems to progress, then randomly regress, between releases. The big one is just navigating unpredictable environments, which is where Waymo is clearly far, far ahead.
In the real world, I think their approach has clearly hit a ceiling, and I definitely feel a lot safer sitting in a Waymo than a Tesla. I'm not sure the gap is going to narrow unless something drastic changes.
The left turn fuckup is really bad though, as is the instance of the robotaxi dropping someone off in the middle of an intersection after they hit the "drop off earlier" button: https://sh.reddit.com/r/SelfDrivingCars/comments/1liku3o/tes...
And guess who's at fault when you're caught?
Edit: No, crossing double-lines to avoid obstructions isn't breaking the law. People in the Comma Discord bring up these weird edge cases that are outlined in law, too. No, you're never forced to break the law when driving.
You can downvote me all you want. You're wrong. The police would find you wrong, a traffic law court would find you wrong. People are animals and don't want to obey the law. Just obey the law. Drive the speed limit, keep right unless passing or turning left.
Legally.
The law where I am is that human traffic directors (police, sheriffs, construction crews, etc) take precedence over what's in the book. If they didn't, nobody could ever get around a fire truck at a fire.
I just ignored him and went around when it seemed safe.
And that's where you get a ticket. Plenty of cops around here sitting at the exit of construction zones waiting for people just like you.
I get that nobody's perfect and you might do it by mistake (going 35 on an unmarked road that was 30 max or something), but the way you write your comment makes it sound like it's something one does in a deliberate way?
Drivers like that shouldn't be on the road.
Um ... kinda...
If it's intentionally programmed to violate speed limits, then sure.
If it's intending to follow speed limits but is failing, then that's terrifying, because where is its wrong information coming from, and how catastrophic could the errors be?
Slowing down surrounding traffic to the speed limit would be an amazing safety multiplier for AVs.
I have driven many hours at posted speed limits on highways in the rightmost lane in the US, and I have yet to hear a single honk. And I have seen cars driving even slower -- absolutely no problem. Anyone who wants to move faster just changes lanes and moves on.
No one seems to notice the speed limit is 50 and not 65, so the majority of cars are going 65 to 75mph. I thought about carrying a sign I could hold up, "Speed limit 50mph!", for when I got screamed at for trying to go the speed limit.
I'm on the "enforce it or repeal it" side. I personally wish they'd put up cameras there and ticket everyone until they slowed down. Or change the signs to 65 if they're not going to enforce it.
The northbound side is also treacherous in that area. Again, the speed limit is 50mph, but the 280 merges on the right and people come off it at 65mph+. That makes it harrowing to get in the right lane for the Cesar Chavez exit from the 101. If you're going the 50mph speed limit, you'll merge into a lane going 65-70. If you go 65-70, you have far less time to get into the right lane in time to make the exit.
I just don't think there's going to be any issue in most geographies if they go on highways/freeways and follow the speed limit (I guarantee that's true in the US; I'd go out on a limb and guess it's true almost everywhere).
edit: grammar
Garbage truck leaving a light slowly, but as fast as it can -> no problem
Tesla M3 leaving a light slowly -> dude deserves every honk he gets.
BMW M3 weaving through traffic cuts people off but is gone as fast as he came -> nobody cares
Prius doing 55 in the right lane -> no problem
Prius doing 55 in the left lane because entitlement -> lots of middle fingers
I have complete confidence waymo would have "done the right thing". The real problem is that there still would have been a million rear endings. Eventually the insurance companies would have sued them because no amount of idiots online screeching about the "rules" actually makes it ok to habitually create the preconditions for an accident. It would have been a big expensive legal fight. The state doesn't want high speed limits to reflect how fast people actually drive. Waymo doesn't wanna be involved in that. The insurance companies don't wanna be involved in that. It's just a no brainer not to given the constraints.
Though I can't help but wonder if insurance rates would've eventually gone down for everyone if the issue got forced and it resulted in some sort of "after the 3rd rear ending they're on you by default" rule or something.
Which also highlights one of the inherent flaws with laws and rules like speed limits. They don't ever actually mean, drive at this velocity. They always mean, drive at a safe speed and here is the number we all collectively agree on, unless we don't.
If we mean that this is dangerous: yes, of course. It's an obvious, somewhat dangerous error (and I say "somewhat" only because I assume everyone currently participating is cognizant of the fact that this is in some kind of testing stage).
But I think the more interesting question is: How quickly can this issue (and others like it) be fixed? If the answer at this point is "we will have to retrain the entire model and just hope for the best" that sounds, like, really bad.
[1] https://www.reuters.com/technology/luminar-says-tesla-is-big... [2] https://www.teslarati.com/tesla-no-longer-needs-lidar-ground...
I'm not saying Tesla FSD is any good but the idea that a robot could never drive without just cameras seems to be false given 1.6 billion human drivers that drive with only eyes.
People have much higher safety standards for self-driving cars than they do for human drivers. Just look at how one fatality led to the total abandonment of both the Cruise and the Uber self-driving program.
Still, even with this high performance human sensor suite, people commonly get into accidents in bad weather.
The Waymo approach of using other sensing modalities in order to compensate for the ways in which the cameras/processing aren't as good as a human makes a lot of sense, and in addition, it gives them the ability to exceed human performance using lidar and radar when cameras are having a hard time.
Once we have mass produced lidars and radars, the cost will come down, and not many people are going to care about an extra $1000-$2000 worth of sensors on a car if it significantly improves the safety.
Waymo has a multi-year head start; that’s going to be hard to overcome without a superior experience from Tesla.
Obviously the "if Tesla can actually do this" is doing a lot of work here! It would not surprise me if they had a crash sooner rather than later and that scuttled the whole thing. But it's also possible that doesn't happen and this works. Or at least I'm not smart enough to know for sure that it won't.
A big cost driver is invisible, at least once Tesla gets its safety monitors out of the vehicle: the ratio of safety monitors to vehicles. You need three shifts of safety monitors, seven days a week. It adds up. Google wouldn't be expanding if it had to hire a building full of safety monitors.
In the short to medium term, safety monitoring will be the limiting factor.
That's explainable by, perhaps, the data training set, but an "Actually Indians" approach seems more likely (and more in line with the bullshit Musk has pulled in the past).
https://www.usnews.com/news/top-news/articles/2025-06-20/exp...
It's a 200 P/E stock and sales are falling, so it won't have earnings to speak of next quarter. High-P/E stocks need growth to justify their multiples. Tesla is not growing.
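To make the multiple concrete, here is a hedged back-of-envelope sketch (illustrative numbers only, not actual Tesla financials): comparing a 200 P/E to a typical mature-company multiple of around 20 shows how much earnings growth is baked into the price.

```python
# Illustrative only: what does a 200 P/E imply vs. a mature-company 20 P/E?
pe_current = 200   # assumed current multiple
pe_mature = 20     # assumed "no-growth" mature multiple

growth_needed = pe_current / pe_mature          # total earnings growth implied
annual_growth = growth_needed ** (1 / 10) - 1   # if spread over 10 years

print(f"earnings must grow roughly {growth_needed:.0f}x")
print(f"that is about {annual_growth:.0%} per year for a decade")
```

Roughly 10x earnings growth, or about 26% compounded annually for ten years, just to grow into a mature multiple at today's price, which is the sense in which a high P/E with falling sales is hard to justify.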
Also if this robotaxi service isn't pulled off the road soon then it will be limited to a very select set of locations. If someone has to sit in these cars to monitor them all the time then Tesla may be losing money on every journey.
This premature move in releasing the robotaxi is certainly stock pumping.
Long term I think they are fine - I think they will have solved FSD and no one will remember Elon's politics.
Look at Waymo or Mercedes if you want to see what working FSD looks like. Tesla isn't even in the same ballpark yet.
I can tell you, though, that Tesla is the only company doing two things:
Making a full self-driving car that doesn't need an expensive and expansive sensor suite, just off-the-shelf cameras and a GPU.
Making full self-driving that can drive anywhere, with no pre-mapping needed.
Waymo probably could do without pre-mapping, Mercedes probably not. Right now, though, Tesla is the only player with a car that you can buy and that will "self drive" anywhere in the country. The other car companies' cars only work on select pre-mapped roads, and IIRC Mercedes only works on highways and over 40mph.
The bet against Tesla is that they will not be able to pull this off. Not that competitors will beat them to the punch. If Tesla did get FSD with cameras on par with Waymo, Waymo would likely be unable to compete.
There's no indication that Tesla has solved the fundamental challenge of self-driving - making the system incredibly reliable and safe and able to handle myriads of rare situations. Tesla is at a point where multiple self-driving companies were years ago. The general experience from this field is that scaling is extremely hard because improving reliability and safety by orders of magnitude is extremely hard.
I think people are too hung up on the sensor suite. Once you solve the fundamental problem, a sensor suite is kind of an implementation detail. Waymo's system is robust to the removal of any sensor type / map. There's a good chance that Waymo in a camera-only mode would perform better than the Tesla robotaxi.
Mapping is not an issue, it's less expensive than people expect and you only need to do it once, so the cost per mile is approximately zero. And in Waymo's case, there's a lot of room for streamlining the process. The more you map, the easier and faster it gets. BTW, MobilEye has a very lightweight and crowdsourced mapping process, they're also camera-centric and have a cheap hardware package.
Waymo's long-term goal is a system that is cheap and works everywhere. They've spent 15 years with a focus on solving the hard problem. Now they seem to be close to the finish line, and they're shifting their focus and money into the easier stuff: optimizing hardware and expansion.
I can get into my car, plug in a destination, and not have to touch the car. Nothing but a Tesla does that right now, unless every video on the subject is lying, regardless of the defined SAE levels.
TSLA is a meme stock at this point to which ordinary expectations do not apply. The price is unmoored from what the company does. _By the book_ I agree, it should be shorted, but I wouldn't.
Eventually they do. At this point the bull thesis is still plausible to a lot of investors but this won't last forever.
And I got roasted. Invest with caution.
It is. Inevitably, when the promised rollout ramp up doesn't happen, there will be another pump. This is not his first rodeo. The only thing that can stop another pump is competitors scaling first, not repeated Robotaxi failures (which I fully expect).
Tesla Robotaxi launch is a dangerous game of smoke and mirrors
How many billion miles more are needed for the computer to not drive over a double yellow line?
Is there any evidence that three trillion miles would work?
yall_got_any_more_of_that_driving_data_dave_chappel_meme.gif
If it turns out that lidar is necessary, there goes the value of the installed base of Tesla vehicles. It would be like starting from scratch. Clinging to vision-only AV is some variant of the sunk cost fallacy.
If you're thinking, surely they must've thought of all that and have a plan, I'd point to the design of Starship and ask: is that ever going to be human rated?
If THIS is the state of Tesla's "full self driving" abilities, I would NEVER ride in one of these. Nor would I let my family ride in one. AND I would carefully steer away from any lane that has one of these vehicles running.
(disclosure: long time AV safety advocate; have tested AV capabilities of dozens and dozens of SAE L2-L4 vehicles in the last ~7 years)
I have my motorcycle license and drove an sv650 when I lived in CA. Normally on a motorcycle the left lane on the highway is the safest because people are merging in and out of the right lane when they enter and exit the highway, so the left lane is usually steady and no turning.
So you drive with traffic in the left lane as much as possible. The problem in CA is that you're driving at times 95mph just to stay with traffic. I had a turn near Ventura on the 101 where I had to lean my bike so far over that it was like I was on a racecourse.
Since moving to Europe it's much more sane. There are speed cameras everywhere, so people don't speed. When you can predict what everyone else is doing on the road, it's so much safer.