“Thought of the day, and I wish there were a way to get this to legislators:
Come the next Big One earthquake, all of San Francisco’s emergency services will be blocked by Waymos.”
I’m AMAZED they’re not designed to handle this better. This does indeed seem like a massive problem. “Oops, we give up” right when things are at their worst? How is this OK?
I’ve been very impressed by Waymo’s more cautious approach. Perhaps they haven’t fully thought through the ramifications of it though.
Were any emergency vehicles actually blocked?
We have an actual failure here. Step one is identifying actual failures so we can distinguish what really happened from what hypothetically could have.
They need to drive or pull over. Never just stop there in the road and wait.
They're not. But it's also not a disaster. Pretending it is on Twitter is pandering, not policymaking.
> They need to drive or pull over. Never just stop there in the road and wait
Agreed. Waymo has a lesson to learn from this. Sacramento and the NHTSA, similarly, need to draw up emergency minimums for self-driving cars.
There are productive responses to this episode. None of them involve flipping out on X.
Because it’s a power outage. If we had instead learned about this during a real disaster, people could have died, because these things were let on the road without a plan for what they should do in abnormal circumstances.
We’re lucky it’s not a disaster.
This is universally true. The question is how bad could it have been, and in which cases would it have been the worst?
> We’re lucky it’s not a disaster
This is always true. Again, the question is how lucky?
We have an opportunity to count blocked emergency vehicles and calculate a hypothetical body count. That lets us characterize the damage, and it also helps keep the hysteria in check.
Sure, but it would be notable if one had to. If none had to, we have a problem to solve, not a catastrophe.
Makes me think there are likely other obvious use cases they haven’t thought about proactively either.
We have zero evidence a power outage wasn't foreseen. This looks like a more complex multi-system failure.
Once you’re on public roads, you need to ALWAYS fail-safe. And that means not blocking the road/intersections when something unexpected happens.
If you can physically get out of the way, you need to. Period.
Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.")
If Waymo literally didn't foresee a blackout, that's a systemic problem. If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.
I agree with this bit
> If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.
This is what I have a problem with. That’s not an edge case. There will always be a weird thing no one programmed for.
Remember a few years ago when a semi truck overturned somewhere and poured slimy eels all over the highway? No one's ever gonna program for that.
It doesn’t matter. There has to be an absolute minimum fail-safe that can always work if the car is capable of moving safely. The fact that a human driver couldn’t be reached to press a button authorizing it is not acceptable. Not having the human available is a totally foreseeable problem. It’s Google. They know networks fail.
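To make that concrete: the remote-assist call needs a hard timeout, and the timeout branch needs to do something locally. A rough sketch in Python (every vehicle method below is a made-up name for illustration; Waymo's actual stack isn't public):

    import time

    REMOTE_ASSIST_BUDGET_S = 30  # assumed: total time to spend waiting for a human

    def handle_anomaly(vehicle):
        # Ask for remote help, but treat "no human reachable" as a normal,
        # foreseeable outcome rather than an unhandled one.
        deadline = time.monotonic() + REMOTE_ASSIST_BUDGET_S
        while time.monotonic() < deadline:
            reply = vehicle.request_remote_assist(timeout_s=5)  # None if the network is down
            if reply is not None:
                return vehicle.execute(reply)  # a human answered with instructions
        # No operator available: fall through to the minimum local fail-safe
        # instead of stopping in the travel lane to wait indefinitely.
        return vehicle.pull_over_and_park()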
Waymo's problem is obvious in hindsight, and quite embarrassing for them, but it can be solved with software improvements. Tesla's FSD already treats dark traffic lights as stop signs, so I would bet on Waymo fixing this as soon as they can.
But transport modes that depend on infrastructure along the whole route (such as trains and buses powered by overhead lines) are always going to fail in these situations. I think that's acceptable considering how rare these events are.
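For reference, the dark-signal rule is simple to state: a signal with no lit lamps gets handled like an all-way stop. A minimal sketch of that rule (the types and names are invented for illustration, not anyone's actual code):

    from enum import Enum, auto

    class SignalState(Enum):
        GREEN = auto()
        YELLOW = auto()
        RED = auto()
        FLASHING_RED = auto()
        DARK = auto()  # no lamp detected lit, e.g. during a power outage

    def intersection_behavior(signal: SignalState) -> str:
        if signal is SignalState.GREEN:
            return "proceed"
        if signal in (SignalState.DARK, SignalState.FLASHING_RED):
            # An inoperative signal is handled like an all-way stop:
            # stop fully, yield by arrival order, proceed when clear.
            return "treat_as_all_way_stop"
        return "stop_and_wait_for_green"  # RED and YELLOW handling elided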
Perhaps the traffic lights being out is what caused the cars to stop operating autonomously and try to phone home for help, or perhaps losing the connection home is itself enough to trigger a fail-safe shutdown mode?
It reminds me a bit of the recent TeslaBot video, another of their teleoperated stunts, where we see the bot appear to remove a headset with both hands that it wasn't wearing (but that its remote operator was), then fall over backwards, "dead", as the remote operator evidently clocked off his shift or went for a bathroom break.
Things go wrong -> get human help
Human not available -> just block the road???
How is there not a very basic “pull over and wait” final fallback?
I can get stopping in place if the car thinks it hit someone or ran over something. But in a situation like this, where the problem is fully external, it should fall back to “park myself” mode.
Failing everything else, the proper failsafe for any vehicle should be to stop moving and tell the humans inside to evacuate. This is true for autonomous vehicles as well as manned ones: if you can't figure out how to pull over during a disaster, ditching is absolutely a valid move.
But if the cars are actually reaching that point they probably shouldn't be on the road in the first place.
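Spelled out, that's an ordered ladder of degraded modes. A rough sketch (all the vehicle methods are assumed names, for illustration only):

    def minimum_failsafe(vehicle):
        # Degrade in order; the one outcome this should never produce is
        # "parked indefinitely in a live travel lane".
        if vehicle.can_continue_unassisted():
            return vehicle.continue_route()
        if vehicle.can_move_safely():
            return vehicle.pull_over_and_park()  # get out of the travel lane if at all possible
        # Final rung: the car genuinely cannot move. Stop where it is, light
        # up the hazards, and tell the occupants to leave rather than wait.
        vehicle.enable_hazards()
        vehicle.announce("Vehicle disabled. Please exit when it is safe to do so.")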
The not-quite-final fallback should be to pull over.
Sure, I go out and drive around on the roads for no reason all the time. I'll avoid doing that during the crisis.
Some folks enjoy driving.
I don’t think I’ve had a totally inactive light. I have had the power go out but the emergency battery turn it to a blinking red light, and it correctly treats that as a stop sign.
https://www.tesla.com/ownersmanual/modely/en_eu/GUID-A701F7D...
It might work, if other Tesla drivers regularly drive in that area. It also might not, and you should assume that it won't.
Looks like it treats it as a 4-way stop. Is this because Tesla has more training data?
Though the safety driver disengaged twice to let emergency vehicles pass safely.
Its human takes over. FSD is still Level 3.
(Robotaxi, Tesla's Level 4 product, is still in beta. Based on reports, its humans had to intervene.)