The cars themselves are nothing special at best, and given the political shenanigans the CEO is pulling, a lot of people will be turned off from trusting anything associated with him.
Cybertruck build quality was also a red flag, suggesting that the company lacks rigor.
My last experience of FSD was in an Uber a year ago, where the driver proudly wanted to show it to us. It worked for about 60 seconds, until the third turn in stop-and-go traffic, where it nearly launched into the stopped traffic ahead and rear-ended it. Lots of beeping and the human stomping on the brakes.
This was, again, during a "they finally solved it" claim cycle.
You can tell how much people believe in the product by how quickly these threads get flagged to death.
Simply not being any of those things makes you a 10-100x better driver than the average. So buying FSD, or getting into a RoboTaxi that is claimed to be "just as good as the average human," is actually quite bad.
Oh and of course this is all flagged now because Elon is god here.
Good points. I wonder if the most accurate comparison is rideshare drivers? I don't know how their safety record compares overall, but they drive far too dangerously for my taste.
Think about it: if they were that bad, they'd constantly be dealing with the expensive and time-consuming problem of insurance claims and/or paying out of pocket to fix their wrecked cars!
EDIT: lol looks like it got flagged again
Self-driving is a real technical problem in the HN domain that is worthy of discussion. Seeing it censored here the same way it is on X is disheartening, and speaks to the omertà of the entire VC-industrial complex.
Just yesterday in SF I nearly collided on my bike with a human SUV driver (likely an Uber), who pulled out impatiently into the bike lane to pass a Waymo which was stopped at a light waiting for a pedestrian to finish crossing (against the signal).
One tell: the number of FSD guys who will quietly admit, if you ask, that their wife won't let them use it when she is in the car :-). Certainly my wife felt that way about AP/EAP in all its variations.
Women have a higher bar for technology, in that they expect it to actually work, not just be a neat idea.
If I had been a safety driver, I would have intervened ~5 times:
- ~3 (maybe more) when it stupidly turned on its turn signal while stuck in traffic in a middle lane at a light. Changing lanes wouldn't even have been desirable.
- Another time, when it parked itself for pickup behind a car that was ready to leave. It blocked him for really no reason.
- Another time, it was seriously confused by a car backing out of a parking space near its intended dropoff point and just behaved... strangely, with weird unnecessary lurches.
None of these were safety related. TBH, measuring interventions is really hard for this reason. I remember hearing anecdotal reports that Waymo had a decent number of non-safety interventions back when they had safety drivers.
I wouldn't necessarily suggest being an early adopter of a robotaxi with no safety driver either, but I don't think these numbers can be extrapolated to show that with any meaningful confidence. This is especially true given that the initial drives will likely be in geofenced areas.
When lack of integrity is rewarded, society decays. If you want to be part of a culture people would choose to adopt, then lack of integrity must be punished severely. If abuse of trust is rewarded, then society at large will see people start to experience distrust as the default rather than trust.
Honestly, I've never regretted summoning a Waymo for a ride. I was typically a public transit user, and bus/train operators are safe, reliable, and impassive people. I can recount with horror all the bizarre, unbelievable, fucked-up situations that transpired when a human being was summoned with their own damn car (taxi/Lyft/Uber/Veyo). None of that shit goes down when I slide into a Waymo.
Sure, there are technical issues and I have little quibbles with the quality of service provided, but at the end of the day, it gets me from Point A to Point B and it doesn't interpose a creepy 3rd party running the show (unless you consider Alphabet to be the sine qua non of creepy 3rd parties.)
Alphabet has earned my confidence, from their mapping activities and Big Data capability, to all the other infrastructure and logistics necessary to run a project of this scale. Their deployment of Waymo has been simply an evolutionary step in the robot apocalypse, I mean 21st Century urban convenience landscape.
I wouldn't accord any of the same confidence to Tesla or Uber due to the vastly different structure and scope of their business models. I just hope that Waymo is here to stay, because it really is working out well between us.
JumpCrisscross•7h ago
Cities have the bureaucracy to regulate. They also, currently, mostly hate Elon.
Tesla would be better placed trialling in suburbia, where accidents can be more readily blamed on factors out of the company’s control.
steveBK123•3h ago
1) Take on liability (even a limited amount / limited use case / limited region) when FSD is engaged, the way Mercedes already has.
2) Actually launch Robotaxi for real, at any sort of scale, the way Waymo has.
Right now it's the same situation it's been in for years and years: a lot of talk, and FSD cannot fail... only be failed, by the driver.
That is: if it crashes, the driver failed to intervene. But if the driver intervenes and complains about the frequency of interventions, the response is that the driver is probably too conservative and intervenes too often... that the car wouldn't have crashed anyway. Circular logic.