Each story is probably a sad one, but hmm, an Instagram post about one of these being published on Hacker News because it involved a Waymo? Wow!
At least the number of articles suggesting that trolley problems are highly relevant to self-driving car implementations has gone down.
But if this is the worst that can be said about Waymo then that gives me a lot of confidence in their general driving abilities.
So it’s safe to go ahead and short it.
Remember, it trades under GOOG.
Perhaps assign a safety driver who puts their own driving license and criminal liability on the line, so the company cannot evade responsibility.
Pushing companies to investigate, take responsibility, and report these accidents is going to improve the overall reliability of the system.
The reality is that if you do not impose strong punishments, these companies won't have the incentive to fix it, or they will push these priorities way down the to-do list.
I think the point is you don't know for certain what you hit if you hit and run. The car should have enough collision detection to know when it's hit something.
That said, this story is sending up red flags with the "allegedly" in the title and lack of evidence beyond hearsay.
This is really only true for Waymo, who appear to be the only folks operating at scale who did the work properly. Robotaxi, Cruise and all the others are in a separate bucket and should be statistically separated.
Very bleak and very tech-bro-coded; no wonder regular people have started seeing us as pariahs. We deserve it.
Waymo? How is this ambiguous? Waymo makes the car, writes the software, and operates the vehicle.
https://waymo.com/blog/2025/05/waymo-making-streets-safer-fo...
Though, Waymo should absolutely be responsible for this and be treated as if it were a human who hit the cat.
Also note that there is an enormous issue of trust and dignity.
By "trust" I mean: We have seen how data and statistics are created. They are useful on average, but trusting them on very important, controversial topics, when they come from the private entity that stands to benefit from them, is an unrealistic ask for many normal humans.
By "dignity" I mean: Normal humans will not stand the indignity of their beloved community members, family, or pets being murdered by a robot designed by a bunch of techies chasing profit in Silicon Valley or wherever. Note that nowhere in that sentence did I say that the techies were negligent - they may have created the most responsible, reliable system possible under current technology. Too bad normal humans have no way of knowing if that's the case. Especially humans who are at all familiar with how all other software works and feels.

It's a similar kind of hateful indignity and disgust to when the culpable party is a drunk driver, though qualitatively different. The nature of the cause of death matters a lot to people. If the robot is statistically safer, but when it kills my family it's because of a bug, people generally won't stand for that.

But of course we don't know why exactly, as observers of an individual accident - maybe the situation was truly unavoidable and a human wouldn't have improved the outcome. The statistics don't matter to us in the moment when the death actually happens. Statistics don't tell us whether specifically our dead loved one would have died at the hands of a human driver - only that the chances are better on average.
Human nature is the hardest thing for engineers to relate to and account for.
Wonder why the title states allegedly but not the article?
Self-driving cars are constantly subject to mini-trolley problems. By training on human data, the robots learn values that are aligned with what humans value. -- Ashok Elluswamy (VP AI/Autopilot at Tesla)
If they were using my data I'd be partly responsible, due to failing to swerve around the last few suicidal prairie dogs I rolled over. I hate when that happens but I don't attempt high speed evasions. But I would if it were something larger, human or not, out of self defense. And it's never happened but I hope I'd stomp and swerve for a toddler. I'm happy with an autopilot learning that rule set, even though I've lost too many cats under tires.

You probably get more honest answers by presenting a trolley problem and then requiring a response within a second. It's a great implicit bias probe.
Otherwise it's a slippery slope of "well but it's generally good"
sema4hacker•3mo ago
eminence32•3mo ago
nomel•3mo ago
potato3732842•3mo ago
So even 0 isn't "safe", lol
rogerrogerr•3mo ago
The interesting point will be when insurance companies reduce your rate if your car doesn’t have a steering wheel (or, equivalently, charge a “driving manually” fee). It might be obscured if car companies take on the risk themselves, but at some point people will start to notice that driving manually costs more.
IAmBroom•3mo ago
Not "better than the best", but "safer than the average driver" - and if you aren't the only one on the road, your safety is a mix of your skill and everyone else's.
asveikau•3mo ago
Still, this cat was on a busy stretch of 16th Street for nearly a decade and was unharmed by human drivers. I think Waymo failed pretty badly here. Some of the dismissive comments I've seen on this topic seem to me like they're making excuses.
Fire-Dragon-DoL•3mo ago
I was less than 18, driving one of those little cars that reaches at most 50 km/h. I slammed the brakes and managed to stop maybe 2 cm from the kitty, which continued out of the street alive.
The scooter rider behind me came up close and complained that I had almost killed him by slamming the brakes. To this day, I still don't know if that was the right call. That guy could have been a dad, and I could have killed a father. Still, I couldn't think of killing a cat either.
jopicornell•3mo ago
Fire-Dragon-DoL•3mo ago
mensetmanusman•3mo ago
Fire-Dragon-DoL•3mo ago
The reason the guy behind me survived is that he did veer when I slammed the brakes.
michael1999•3mo ago
Redster•3mo ago
It sounds more like instead of learning a lesson about following too closely, he decided to turn his mix of anger and fear onto you. Hopefully, with some distance from the situation, he will realize that he should not follow so closely. And hopefully you won't be too affected by the guilt-manipulation. (Obviously, when something behind you cannot slow quickly, it's a good idea to try not to brake too hard, but in theory a scooter should be able to stop very quickly.) With hard calls, you can only do your best with the information you had at the time.