> Very few of Waymo’s most serious crashes were Waymo’s fault
>> I looked at 45 major Waymo crashes—most were human error.
Not sure whether you intentionally submitted a misleading title or whether it was a genuine mistake. Either way, the submitted title grossly changes the meaning.
EDIT: Read the article to see whether the headline was clickbaity. Spoiler - it was.
> At 1:14 AM on May 31st, a Waymo was driving on South Lamar Boulevard in Austin, Texas, when the front left wheel detached. The bottom of the car scraped against the pavement as the car skidded to a stop, and the passenger suffered a minor injury, according to Waymo.
> Among the 45 most serious crashes Waymo experienced in recent months, this was arguably the crash that was most clearly Waymo’s fault. And it was a mechanical failure, not an error by Waymo’s self-driving software.
--
That's the most serious of the crashes. The article is written in a way that makes it sound like the autonomous driving had issues, when it was a mechanical failure. Not saying that's bad, but it isn't what the title of the article was trying to say. It could just as well have been $any_car.
That's not ambiguous at all. If you are unable to stop in time when the car in front of you slams on the brakes, you were following too closely, and that is your fault.
My guess is that the ambiguity is about the trade-off: "even if no one should hit you for slamming on the brakes, is it a good risk to take for the sake of a cat?"
jqpabc123•4mo ago
https://electrek.co/2025/08/04/tesla-withheld-data-lied-misd...