The much more likely hypothesis in my view is that he was helping Trump because of personal conviction (only in small parts motivated by naked self-interest).
You should expect rational billionaires to lean politically right out of pure self-interest and distorted perspective alone, because the one universal thing such parties reliably do when in power is cut the tax burden at the top end.
So this isn't so much of an assumption, as taking him at his word.
The problem is not that the republican party used to be a conservative right party.
What I’m saying is this is not a sports competition where Musk is automatically an opponent of the Democratic party because he supported Trump. He supported Trump in order to improve his chances with the legal system because he knew Trump would be willing to be so corrupt.
Another world might be imagined in which the Democratic party was taken over in 2016 but that is not the world we live in.
This is how conservatives keep people saying 'both sides!' even though they manufacture whatever is required to make it look that way.
> U.S. District Judge Beth Bloom, who presided over the case, said in an order that she did not find “sufficient evidence” that Tesla’s failure to initially produce the data was intentional.
He used to be quite charismatic; I believed him up until about 2017 or so. Then I figured he was just a bit greedy and maybe money got to his head, but still a respectable innovator. However, during 2020 or 2021 (I don’t exactly remember) he started to get quite unpleasant and to make obviously short-term decisions, such as relying only on cameras for self driving because of chip shortages but dressing it up as an engineering decision.
What is your actual point? What would he stand in front of a judge for, right now, if Harris had won?
Sending a bunch of script kiddies around and having them cut government funding and gut agencies is not really how you make evidence "vanish". How would that even work?
And, lastly, jumping in front of an audience at every opportunity and running your mouth is the absolute last thing anyone would ever do if the goal was to avoid prosecution. But it is perfectly in line with a person who has a very big ego and wants to achieve political goals.
Humans are simply incapable of paying attention to a task for long periods if it doesn't involve some kind of interactive feedback. You can't ask someone to watch paint dry while simultaneously expecting them to have a < 0.5 s reaction time to a sudden impulse three hours into the drying process.
Oh! And also, moving within the lane is sometimes important for getting a better look at what's up ahead or behind you or expressing car "body language" that allows others to know you're probably going to change lanes soon.
1. AEB brakes violently to a full stop. We experience shock and dismay. What happened? Oh, a kid on a bike I didn't see. I nearly fucked up bad, good job AEB
2. AEB smoothly slows the vehicle to prevent striking the bicycle, we gradually become aware of the bike and believe we had always known it was there and our decision eliminated risk, why even bother with stupid computer systems?
Humans are really bad at accepting that they fucked up. If you give them an opportunity to re-frame their experience as "I'm great, nothing could have gone wrong", that's what they prefer. So, to deliver effective safety improvements, you need to be firm about what happened and why it worked out OK.
The problem here isn't that people think they don't need to pay attention because their car can drive itself, and then crash. The problem is that people who know full well that they need to focus on driving just don't, because fundamentally the human brain isn't any good at paying attention 100% of the time in a situation where you can get away with not paying attention 99.9% of the time, and naming the feature differently just can't solve this.
And as always, let the downvotes rain. If you downvote, it would be nice if you could tell me where I am wrong. It might change my view on things.
All this demonstrates is the term “full self driving” is meaningless.
Tesla has a SAE Level 3 [1] product they’re falsely marketing as Level 5; when this case occurred, they were misrepresenting a Level 2 system as Level 4 or 5.
If you want to see true self driving, take a Waymo. Tesla can’t do that. They’ve been lying that they can. That’s gotten people hurt and killed; Tesla should be liable for tens if not hundreds of billions for it.
It’s meaningless because Tesla redefines it at will. The misrepresentation causes the meaninglessness.
Also, "All this demonstrates is the term “full self driving” is meaningless" proves my point that it is not misleading.
The levels are set at the lowest common denominator. A 1960s hot rod can navigate a straight road with no user input. That doesn’t mean you can trust it to do so.
> Where did Tesla say FSD is SAE Level 5 approved?
They didn’t say that. They said it could do what a Level 5 self-driving car can do.
“In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.
‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.) [1]”
> Tesla is full self driving with Level 2/3 supervision and in my opinion this is not missleading
This is a tautology. You’re defining FSD to mean whatever Tesla FSD can do.
[1] https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...
FSD cannot “do everything a Level 5 system can.” It can’t even match Waymo’s Level 4 capabilities, because it periodically requires human intervention.
But granting your premise, you’d say it’s a Level 2 or 3 system with some advanced capabilities. (Mercedes has a lane-keeping and -switching product. They’re not constantly losing court cases.)
"Self driving" could just as well mean the human driving themselves.
Having an SAE level is clearer.
Blocking a technology is Luddism. Blocking a company is politics.
> Moments later, court records show, the data was just as automatically “unlinked” from the 2019 Tesla Model S at the scene, meaning the local copy was marked for deletion, a standard practice for Teslas in such incidents, according to court testimony.
Wow...just wow.
edit: My point is that it was not one lone actor, who would have made that change.
It’s just worded to make this sound sketchy. I bet ten bucks “unlinked” just refers to the standard POSIX call for deleting a file.
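For what it's worth, "unlink" is just the traditional POSIX name for removing a file's directory entry. A quick illustration (the temp file here is hypothetical, just to show the call is nothing exotic):

```python
import os
import tempfile

# Create a throwaway file, then "unlink" it. This is the same operation
# the POSIX unlink(2) call performs: remove the name from the directory,
# freeing the file's storage once no process holds it open.
fd, path = tempfile.mkstemp()
os.close(fd)
assert os.path.exists(path)

os.unlink(path)  # equivalent to os.remove(path)
assert not os.path.exists(path)
```

So "the data was unlinked" can literally just mean "the file was deleted the normal way".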
Also this is not like some process crash dump where the computer keeps running after one process crashed.
This would be like a plane's black box uploading its data to the manufacturer, then deleting itself after a crash.
Deleting the data on the server is totally sketchy, but that’s not what the quoted section is about.
That Tesla could call it normal to detect a collision and not lock any of the data is just insane.
The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.
Crash data in particular should be considered sacred, especially given the severity in this case. Ideally it should be kept both on the local black box and on the servers. But anything that leads to it being treated as instantly disposable everywhere, or even just claiming it was deleted, can only be malice.
Exactly. The issue is deleting the data on the servers, not a completely mundane upload-then-delete procedure for phoning home. This should have been one sentence, but instead they make it read like a heist.
My money is on nobody having built a tool to look up the data: they have it, they just can't easily find it.
If you built the system to keep such data then you know you should have it. Claiming it was deleted would be an outright lie.
Tesla built such an expensive system to collect data, routinely used it to inform engineers and developers for future product development and AI training, but wasn't able to find data on it when it happened to be incriminating?
I keep seeing this attitude of innocently insisting there must have been no wrongdoing, just understandable mistakes, playing down the situation. From a company with a proven track record of subterfuge and maliciousness, with a leader like Musk.
This data is yours. You were going the speed limit when the accident happened and everyone else claims you were speeding. It would take forever to clear your name or worse you could be convicted if the data was lost.
This is more of "you will own nothing" crap. And mainly so Tesla can cover its ass.
Maybe this is not appropriate for a car, but that doesn’t excuse the ridiculous breathless tone in the quoted text. It’s the worst purple prose making a boring system sound exciting and nefarious. They could have made your point without trying to make the unlink() call sound suspicious.
The rogue engineer defense worked so well for VW and Dieselgate.
The issue of missing crash data was raised repeatedly. Deleting or even just claiming it was deleted can only be a mistake the first time.
So either the problem is Tesla engineers are fucking stupid (doubtful) or this is a poor business/product design.
My money is on the latter.
I worked a year in airbag control, and they recorded a bit of information if the bags were deployed - seatbelt buckle status was one thing. But I was under the impression there was no legal requirement for that. I'm sure it helps in court when someone tries to sue because they were injured by an airbag. The argument becomes not just "the bags comply with the law" but also "you weren't wearing your seatbelt". Regardless, I'm sure Tesla has a lot more data than that and there is likely no legal requirement to keep it - especially if it's been transferred reliably to the server.
The performance ship sailed, like, 15 years ago. We're already storing about 10,000,000× more data than we need. And that's not even an exaggeration.
> Tesla later said in court that it had the data on its own servers all along
Perhaps if there is some sort of crash.
Which of these is evidence of a conspiracy:
tar cf - /data | curl -T - https://example.com/upload

TMPFILE=$(mktemp); tar cf "$TMPFILE" /data; curl --data-binary @"$TMPFILE" https://example.com/upload; rm "$TMPFILE"
The requirements should have been clear that crash data isn't just "implement telemetry upload"; a "collision snapshot" is quite clearly something that could be used as evidence in a potentially serious incident.
Unless your entire engineering process was geared towards collecting as much data that can help you, and as little data as can be used against you, you'd handle this like the crown jewels.
Also, to nit-pick, the article says the automated response "marked" the data for deletion, which means it's not automatically deleted as in your reductive example, which doesn't even verify the upload succeeded (there's no && before that last rm).
Any competent engineer who puts more than 3 seconds of thought into the design of that system would conclude that crash data is critical evidence and as many steps as possible should be taken to ensure it's retained with additional fail safes.
I refuse to believe Tesla's engineers aren't at least competent, so this must have been done intentionally.
Assuming it's not intentionally malicious, this is a really dumb bug that I could have also written. You zip up a bunch of data, and then you realize that if you don't delete things you've uploaded you will fill up all available storage. So what do you do? You auto-delete anything that successfully makes it to the back-end server and mark the bug fixed, not realizing that you overlooked crash data as something you might want to keep.
I could 100% see this being what is happening.
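A minimal sketch of how such a bug could look. All names here are hypothetical; this is not Tesla's actual code, just the shape of the mistake:

```python
import os

def upload(path: str) -> bool:
    """Stand-in for the real telemetry upload; assume True means the
    server confirmed receipt."""
    return True

def upload_and_delete(path: str) -> None:
    # Free local storage once the server confirms receipt.
    # Perfectly sensible for routine telemetry...
    if upload(path):
        os.remove(path)

# ...until the same helper gets reused for a collision snapshot, where
# the local copy was the only evidence under the owner's control.
# The fix is a one-line special case for crash data:
def upload_and_maybe_delete(path: str, is_crash_data: bool) -> None:
    if upload(path) and not is_crash_data:
        os.remove(path)
```

The bug isn't exotic; the failure is in nobody flagging crash snapshots as a category that must survive locally.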
You're implying it's special for crashes, but we don't know that.
Saying "hey, the upload_and_delete function is used in loads of places!" doesn't free you of the responsibility that you used that function in the crash handler.
After it confirmed upload to the server? What if it was a minor collision? The car may be back on the road the same day, or get repaired and on the road next week. How long should it retain data (that is not legally required to be logged) that has already been archived, and how big does the buffer need to be?
If the car requires that a certain amount of storage is always available to write crash data to, then it doesn't matter what's in that particular area of storage. That reserved storage is always going to be unavailable for other general use.
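A sketch of what that reservation amounts to (the threshold is made up; the point is only that general writes must stay above a crash-data floor):

```python
import shutil

# Hypothetical floor kept free for crash snapshots: 512 MiB.
RESERVED_FOR_CRASH_DATA = 512 * 1024 * 1024

def can_write_general_data(mount_point: str, size: int) -> bool:
    # General-purpose writes may not eat into the crash-data reserve,
    # so that reserve is permanently unavailable to everything else,
    # regardless of what currently occupies the rest of the disk.
    free = shutil.disk_usage(mount_point).free
    return free - size > RESERVED_FOR_CRASH_DATA
```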
> someone at Tesla probably took “affirmative action to delete” the copy of the data on the company’s central database, too
Even if Tesla hadn't squandered its EV lead and was instead positioned to be a robotics and AI superpower, is this really the corporate behavior you would want? This is some fucking Aperture Science level corporate malfeasance.
Maybe this thread will be different
If that is actually designed like this, the only reason I could see for it would be so that Tesla has sole access to the data and can decide whether to use it or not. Which really should not work in court, but it seems it has so far.
And of course I'd expect an audit trail for the deletion of crash data on Tesla servers. But who knows whether there actually isn't one, or nobody looked into it at all.
For the first kind of data, deleting it from the car the moment there’s confirmation that it is now stored at Tesla can make perfect sense as a mechanism to prevent the car from running out of storage space.
Of course, if the car crashed, deleting the data isn’t optimal, but the fact that it gets deleted may not be malice.
But in the end we wouldn't be discussing this at all if Tesla had simply handed over the data from their servers. Whether they can't find it, it isn't actually there, or they deliberately removed it affects how I view this process.
Two copies are better than one. If you immediately erase the data, you better be sure the transmitted data is safe and secure. And obviously it wasn't.
Anytime data is recorded legal is immediately asking about retention so they don't end up empty handed in front of a judge.
Every byte that car records and how it is managed will be documented in excruciating detail by legal.
This is not unlike what happens for flight data recorders after a crash. The raw data is not made public right away, if ever.
But the data was mostly unprotected on the devices, or it couldn't have been restored. And Tesla isn't exactly known for respecting the privacy of their customers, they have announced details about accidents publicly before.
And there is the potential conflict of interest, Tesla does have strong incentives to "lose" data that implicates Autopilot or FSD.
All vehicle manufacturers have sole access to data. There isn't a standard for logging data, nor a standard for retrieving it. Some components log data, and only the supplier has the means to read and interpret it.
They tried to recruit me for the UI. If I lived closer, I would have jumped on it. Not only was I bit of a Tesla fanboy at the time, I used to work across the street from their office and really liked that area. (Deer Creek Road in Palo Alto.)
...and potentially death?
Lies about capabilities, timelines, even things as frivolous as being rank one in a video game. He bought Twitter to scale his deception.
And the distinction is what?
I'm not serious of course. There are huge swaths of the public whose eyes would glaze over if you tried to explain it, and that's my point.
Props to greenthehacker. May you sip Starbucks venti-size hot chocolates for many years to come.
So the Tesla detected the vehicle and the pedestrian, and then plans a path through them? Wow! How bad is this software?
Tesla recanted its employee’s testimony “after discovering evidence inconsistent with his stated recollection of events,” it said.
That’s a fancy way to say that he lied
Do we expect them to admit they were outright lying and wrong, considering their leader is a pill-popping, Nazi-salute-making workaholic known to abuse his workers?
But today you just have a private dinner with the president and he'll wave it away.