Surely the driver had to know his car did not take over these functions, and if the software was not designed to do this, then it couldn't have malfunctioned.
“Before the crash, McGee had engaged the system that Tesla calls Autopilot, which can steer, brake and accelerate the car on its own.
…
Lawyers for the plaintiffs accuse the company of overpromising what its technology can do in order to sell cars.
…
In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.
‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.)”
https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...
If he knew it wouldn't, then Tesla obviously is innocent and all blame lies with the driver.
We wouldn’t be talking about it.
If the Tesla had just blown a stop sign, there wouldn’t be a suit.
My Subaru has collision avoidance. Subaru markets this, albeit somewhat carefully. If the collision detection-and-avoidance system straight up didn’t work, and I got injured in an accident because of it, I’d hold both the driver and Subaru responsible. Because they sold me a thing that they claimed would do something it didn’t.
Tesla’s collision-avoidance system was marketed as an autopilot that didn’t need human supervision except for legal reasons. It didn’t work. And it failed in a way that cars with collision-avoidance radar probably wouldn’t have.
Outrageous; compare it to how cruise control is marketed on other cars.
In line with the outrageous kill-switch-on-all-new-cars-after-2026 regulation tucked into the infrastructure and jobs bill passed in 2021.
I see there's a newer bill drafted called the No Kill Switches in Cars Act; I hope it gets moving.
Other ADAS systems have much better driver monitoring. My 2016 Honda did. My 2021 Ford does.
The government has forced Tesla to strengthen their monitoring at least twice.
Tesla called it Autopilot. Tesla (seemingly) didn’t care about reckless behavior. Tesla repeatedly made misleading claims about it being far safer than human drivers. Tesla didn’t even keep the data that would be needed to prove that for the first 2 years.
MBCook•6mo ago
Either way, it broke the law and killed someone. Does it matter whether the failure came in 1 part or in 5?
abbotcabbot•6mo ago
If it had 5 errors, 4 of which were not deadly only due to random circumstances, like no vehicle with right of way passing, then that is worse than if it committed one error that was deadly. I might fix the second implementation and scrap the first.