*: I can't work out from the article whether this file was erased, or just unlinked from the filesystem: they quote someone as saying the latter, but it looks like it was actually the former.
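For anyone outside filesystem land, the difference matters for recoverability. A sketch with a hypothetical filename (the two commands are alternatives, not a sequence):

```sh
# Unlinked: the directory entry is removed, but the data blocks stay on
# disk (and are often recoverable with forensic tools) until reused.
rm /tmp/telemetry.tar.gz

# Erased: the file's contents are overwritten in place before the name
# is removed, so there is nothing left to recover.
shred -u /tmp/telemetry.tar.gz
```

(Caveat: on journaling or copy-on-write filesystems even `shred` makes no hard guarantees.)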
If I were implementing such a system (and I have), I could see myself deleting the temporary without much thought. I would still have built a way to recreate the contents of the tarball after the fact (it's been a requirement from legal every time I've scoped such a system). Tesla not only failed to do that, but avoided disclosing that any such file was transferred in the first place so that the plaintiffs wouldn't know to request it.
Given storage is a finite resource, keeping the tar around after it was confirmed in the bucket would be pure waste.
Even in that case though, you would still have a way to produce the data because it would have been specced in the requirements when you were thinking about the broader organizational context.
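For what it's worth, that requirement is cheap to meet. A minimal sketch of such an upload flow (paths and bucket URL are hypothetical): the bucket copy remains the data of record, and an append-only manifest records exactly what was sent and when, so deleting the temp file destroys nothing.

```sh
SRC=/var/log/telemetry                      # hypothetical source directory
MANIFEST=/var/lib/telemetry/manifest.log    # hypothetical append-only audit log
TMPDIR=$(mktemp -d)
TARBALL="$TMPDIR/telemetry.tar.gz"

tar czf "$TARBALL" -C "$SRC" .

# Record enough to answer "what exactly was in that tarball?" later:
# a timestamp, the archive's checksum, and its full file listing.
{
  date -u +%FT%TZ
  sha256sum "$TARBALL"
  tar tzf "$TARBALL"
} >> "$MANIFEST"

# Delete the temp file only after the bucket confirms the upload.
curl -fsS -T "$TARBALL" "https://bucket.example.com/telemetry/" \
  && rm -r "$TMPDIR"
```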
I've made no contention, but if I had, it would be that whoever signed off on this design had better not have a PE license that they would like to keep, and we as an industry would be wise not to keep counting on our grandfather-clause "fun harmless dorks" cultural exemption now that we manufacture machines which obviously kill people. If that by you is conspiracy theory, you're welcome.
ETA: Restate your conspiracy theory in the hypothetical case that they had used `tar | curl` instead of the intermediate archive file. Does it still seem problematic?
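For concreteness, the streaming version would look something like this (endpoint hypothetical). The archive exists only as bytes in the pipe, so there is never a file on disk to delete, or to disclose:

```sh
# tar writes the archive to stdout; curl streams stdin straight to the
# bucket. No intermediate file ever touches the local filesystem.
tar cz -C /var/log/telemetry . \
  | curl -fsS -T - "https://bucket.example.com/telemetry/archive.tar.gz"
```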
I'm not going to argue with someone who throws gratuitous insults. Rejoin me outside the gutter and we'll continue, if you like. But the answer to your question is trivially yes, as might by now have been obvious had you sought conversation rather than - well, whatever I suppose the slander by you was meant to be. One hopes we'll see no more of it.
Send the corporation to jail. That means it cannot conduct business for the same amount of time that we would put a person in jail.
Make negligence unprofitable and the profit-optimizers will take care of the rest. Then 80 years later, when people get too used to "trustworthy" corporations, we can deregulate everything and repeat the cycle.
The MCI Worldcom fraud, which broke shortly after Enron, might also have doomed AA (they were the auditor for both major frauds of 2002). MCI Worldcom filed for bankruptcy before it could be hit with criminal charges, and the SEC ended up operating MCI-W in the bankruptcy: the fines were so large, and senior to all other debts, that the SEC outmuscled all of the other creditors in the bankruptcy filings. That's why MCI-W wasn't hit with criminal charges; it already belonged to the government. There hasn't been much stomach for criminal charges against a corporation ever since.
The fact that the Supreme Court has spent the past few decades making white collar crimes much harder to prosecute (including with Arthur Andersen, where they unanimously reversed the conviction in 2005) is another major factor. The Supreme Court has always been terrible, and gets far more respect than it deserves.
Tesla must pay portion of $329M damages after fatal Autopilot crash, jury says
Buying SPY, my mistake. Being incentivized to put money in my 401k... That is a bit harder to solve.
It just looks stupid to me in a way that makes me more likely to discount your post.
For an article that is supposed to at least smell like journalism, it looks so trashy.
Journalism is a thing of its own; blogs aren't it.
[0] https://www.thedrive.com/news/24025/electreks-editor-in-chie... ("Electrek’s Editor-in-Chief, Publisher Both Scoring $250,000 Tesla Roadsters for Free by Gaming Referral Program": "What happens to objective coverage when a free six-figure car is at play? Nothing good." (2018))
Negative, cheesy, clickbait, rage-inducing, etc. headlines do seem to get more clicks. There is a reason why politicians spend more time trash-talking opponents than talking positively about themselves. Same goes for attack ads.
I have no doubt a majority of people will say they despise these YouTube-thumbnail-style pictures, yet the cold numbers tell the opposite story.
- Honoré de Balzac
"Tesla awards boss Elon Musk $29bn in shares" - https://www.bbc.com/news/articles/cz71vn1v3n4oSo this is also a failure of the investigator.
Autopilot is cruise control. Once you understand this, the claim that Tesla is partially at fault here doesn't match the expectations we apply to other driver-assistance tech. Just because Tesla has the capability to disable it doesn't mean they have to.
This all comes down to an interpretation of marketing speak. If you believe "autopilot" is misleading, you'd agree with the jury here; if you don't, you wouldn't. I'm no lawyer and don't know the full scope of requirements for autopilot-like features, but it seems Tesla is subject to unfair treatment here, given the number of warnings you have to completely ignore and take no responsibility for. I've never seen such clear warnings on any other car with similar capabilities. I can't help but think there's maybe some politically driven bias here, and I say that as a liberal.
Happy to be convinced otherwise. I do drive a Tesla, so there's that.
Lol is this for real? No amount of warnings can waive away their gross negligence. Also, the warnings are clearly completely meaningless because they result in nothing changing if they are ignored.
> Autopilot is cruise control
You're pointing to "warnings" while simultaneously saying this? It seems a bit lacking in self-awareness to think that a warning should carry the day, but that calling cruise control "autopilot" is somehow irrelevant.
> I can't help but think there's maybe some politically driven bias here
Look only to yourself, Tesla driver.
That’s not true
> Do I still need to pay attention while using Autopilot?
> … Before enabling Autopilot, you must agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle.” Once engaged, Autopilot will also deliver an escalating series of visual and audio warnings, reminding you to place your hands on the wheel if insufficient torque is applied. If you repeatedly ignore these warnings, you will be locked out from using Autopilot during that trip.
> If you repeatedly ignore the inattentive driver warnings, Autosteer will be disengaged for that trip. If you receive several ‘Forced Autopilot Disengagements’ (three times for vehicles without a cabin camera and five times for vehicles with a cabin camera), Autosteer and all features that use Autosteer will be temporarily removed for approximately one week.
>> If you repeatedly ignore the inattentive driver warnings, Autosteer will be disengaged for that trip. If you receive several ‘Forced Autopilot Disengagements’ (three times for vehicles without a cabin camera and five times for vehicles with a cabin camera), Autosteer and all features that use Autosteer will be temporarily removed for approximately one week.
There are videos of people on autopilot without their hands on the wheel...
What part of how autopilot is marketed do you find to be gross negligence?
I would ask, what is the existing definition of autopilot as defined by the FAA? Who is responsible when autopilot fails? That's the prior art here.
Additionally, if the NTSB failed to clearly define such definitions and allowances for marketing, is that the fault of Tesla or of the governing body?
I'm pretty neurotic about vehicle safety and I still don't think this clearly points to Tesla as being in the wrong with how they market these features. At best it's subjective.
"Auto Pilot: a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot."
"Cruise Control: an electronic device in a motor vehicle that can be switched on to maintain a selected constant speed without the use of the accelerator."
That is not how it’s marketed at all.
Would Boeing or John Deere be responsible for marketing language, or just the instruction manual? We know the latter is true. Is there any evidence of the former? Intuitively, I would say it's unlikely we'd blame Boeing if a pilot was misled by marketing materials. Maybe that has happened, but I haven't found anything of that sort (please share if you're aware of any).
Actually, the former is true. Courts and juries have repeatedly held that companies can be held responsible for marketing language. They are also responsible for the contents of their instruction manual. If there are inconsistencies with the marketing language it will be held against the company because users aren't expected to be able to reconcile the inconsistencies; that's the company's job. Thus, it's irrelevant that the small print in the instruction manual says something completely different from what all the marketing (and the CEO himself) says.
The "autopilot is limited" argument would have worked 20 years ago. It doesn't today. Modern autopilots are capable of maintaining speed, heading, takeoff, and landing so they're not just pilot assistance. They're literally fully capable of handling the flight from start to finish. Thus, the constant refrain that "autopilot in cars is just like autopilot in planes" actually supports the case against Tesla.
I've owned two Teslas (now a Rivian/Porsche EV owner). Hands down, Tesla has the best cruise control technology on the market. Therein lies the problem. Musk constantly markets this as self driving. It is NOT. Not yet, at least. His mouth is way, way, way ahead of his tech.
Heck, stopping for a red light is a "feature," even though the car is perfectly capable of recognizing one and doing so. This alone should warrant an investigation, and it's a trap that I, as a highly technical user, completely fell for when I first got my model 7 delivered... ran through a red light trying out Autopilot for the first time.
I'm honestly surprised there are not more of these lawsuits. I think there's a misinterpretation of the law by those defending Tesla. The system has a lot of legalese safeguards and warnings. But the MARKETING is off. WAY OFF. And yes, users listen to marketing first.
And that ABSOLUTELY counts in a court of law. You folks would also complain about obtuse EULAs, and while this isn't completely apples to apples, Tesla absolutely engages in dangerous marketing speak around "autopilot," eliciting a level of trust from drivers that isn't there and that they should not be encouraging.
So sorry, this isn't a political thing (and yes, disclaimer: also a liberal).
Signed... a former Tesla owner waiting for "right around the corner" self-driving since 2019...
Are there clear guidelines set for the labeling and marketing of these features? If not, I'm not sure how you can argue that. If it was so clearly wrong, it should have been outlined by regulation, no?
The article says no warnings were issued before the crash.
So which warning did the driver miss?
This is the responsibility of a licensed driver. I don't know how a Mercedes works, but if I crash one because I misused a feature clearly outlined in their user manual, Mercedes is not at fault for my negligence.
> the Center for Science in the Public Interest filed a class-action lawsuit
> The suit alleges that the marketing of the drink as a "healthful alternative" to soda is deceptive and in violation of Food and Drug Administration guidelines.
> Coca-Cola dismissed the allegations as "ridiculous," on the grounds that "no consumer could reasonably be misled into thinking Vitaminwater was a healthy beverage"
One, you don't need a license to buy a non-alcoholic beverage. Two, while the FDA has clear guidelines around marketing and labeling, I'm not aware of any regulatory body having clear guidelines around driver-assistance marketing. If they did, it wouldn't be controversial.
And it didn't warn users about this lack of capabilities until it was forced to do so. Those warnings you're talking about were added after this accident occurred as part of a mandated recall during the Biden administration.
In other words, if you bought the car because you kept hearing the company say "this thing drives itself", you're probably going to believe that over the same company putting a "keep your eyes on the road" popup on the screen.
Of course other companies have warnings that people ignore, but they don't have extremely successful marketing campaigns that encourage people to ignore those warnings. That's the difference here.
My autosteer will gladly drive through red lights, stop signs, etc.
And the fact that we have telemetry at all is pretty amazing. In most car crashes there's zero telemetry. Tesla is the exception, even though they did the wrong thing here.
That is, gamble on GOP alignment leading to regulatory capture such that the bar is lowered enough that they can declare the cars safe.
Even California's system is lax enough that you can drive a Tesla semi through it.
Are you also arguing that Tesla didn’t withhold data, lie, and misdirect the police in this case, as the article claims? Seems to me that Tesla tried to look as guilty as possible here.
I agree with you that it doesn't matter when it comes to covering up/lying about evidence.
They could have been 0.5% at fault. Doesn’t mean that was ok.
For context though, note that this crash occurred because the driver was speeding, using 2019 autopilot (not FSD) on a city street (where it wasn't designed to be used), bending down to pick up a phone he dropped on the floor, and had his foot on the gas overriding the automatic braking: https://electrek.co/2025/08/01/tesla-tsla-is-found-liable-in... The crash itself was certainly not Tesla's fault, so I'm not sure why they were stonewalling. I think there's a good chance this was just plain old incompetence, not malice.
I do like the idea of incentivizing companies to take all reasonable steps to protect people from shooting themselves in the foot, but what counts as "reasonable" is also pretty subjective, and liability for having a different opinion about what's "reasonable" seems to me to be a little capricious.
For example, the system did have a mechanism for reacting to potential collisions. The vehicle operator overrode it by pushing the gas pedal. But the jury still thinks Tesla is to blame because they didn't also program an obnoxious alarm to go off in that situation? I suppose that might have been helpful in this particular situation. But exactly how far should they legally have to go in order to not be liable for someone else's stupidity?
The meme of Hanlon's Razor needs to die. Incompetence from a position of power is malice, period.
A bit more nuanced version is that incompetence from a position of power is a choice.
I guess you could go even more nuanced and say sometimes incompetence from a position of power is a choice, and I would agree with that, but now the statement seems so watered down as to be almost meaningless.
> Update: Tesla’s lawyers sent us the following comment about the verdict:
> Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator – which overrode Autopilot – as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver – from day one – admitted and accepted responsibility.
---
Personally, I don't understand how people can possibly be happy with such verdicts.
In 2025, DJI got rid of their geofences as well, on the grounds that it's the operator's responsibility to control their equipment. IIRC, DJI had the FAA's support in removing the geofencing limitations.
These sorts of verdicts that blame the manufacturer for operator errors are exactly why we can't have nice things.
It's why we get WiFi and 5G radios, and boot loaders, that are binary-locked, with no source code availability, and which cannot be used with BSD or Linux easily, and why it's not possible to override anything anywhere anymore.
Even as a pedestrian, I'm glad that Tesla is fighting the good fight here. Because next thing I know, these courts will cause the phone manufacturers to disable your phone if you're walking next to a highway.
Letting people use autopilot in unsafe conditions is contributory negligence. Given their marketing, that's more than worth 33% of the fault.
That they hid this data tells me everything I need to know about their approach to safety. Although nothing really new considering how publicly deceitful Musk is about his fancy cruise-control.
Cruise had to shut down after less than this but, because Elon has political power over regulation now, a Tesla could drive right through a farmers market and they wouldn't have to pause operations even for an afternoon.
Does he still? I wouldn't be so sure.