Why Local-First Apps Haven't Become Popular?

https://marcobambini.substack.com/p/why-local-first-apps-havent-become
50•marcobambini•48m ago•41 comments

Easy Forth

https://skilldrick.github.io/easyforth/
59•pkilgore•2h ago•16 comments

CompileBench: Can AI Compile 22-year-old Code?

https://quesma.com/blog/introducing-compilebench/
17•jakozaur•1h ago•4 comments

Kmart's use of facial recognition to tackle refund fraud unlawful

https://www.oaic.gov.au/news/media-centre/18-kmarts-use-of-facial-recognition-to-tackle-refund-fr...
110•Improvement•3h ago•82 comments

SGI demos from long ago in the browser via WASM

https://github.com/sgi-demos
133•yankcrime•6h ago•28 comments

How I, a beginner developer, read the tutorial you, a developer, wrote for me

https://anniemueller.com/posts/how-i-a-non-developer-read-the-tutorial-you-a-developer-wrote-for-...
574•wonger_•12h ago•273 comments

A Beautiful Maths Game

https://sinerider.com/
25•waonderer•2d ago•7 comments

Beyond the Front Page: A Personal Guide to Hacker News

https://hsu.cy/2025/09/how-to-read-hn/
33•firexcy•4h ago•4 comments

You did this with an AI and you do not understand what you're doing here

https://hackerone.com/reports/3340109
604•redbell•6h ago•284 comments

Biconnected components

https://emi-h.com/articles/bcc.html
25•emih•14h ago•5 comments

M4.6 Earthquake – 2 km ESE of Berkeley, CA

https://earthquake.usgs.gov/earthquakes/eventpage/ew1758534970/executive
104•brian-armstrong•4h ago•49 comments

What happens when coding agents stop feeling like dialup?

https://martinalderson.com/posts/what-happens-when-coding-agents-stop-feeling-like-dialup/
36•martinald•1d ago•23 comments

Privacy and Security Risks in the eSIM Ecosystem [pdf]

https://www.usenix.org/system/files/usenixsecurity25-motallebighomi.pdf
203•walterbell•9h ago•107 comments

DeepSeek-v3.1-Terminus

https://api-docs.deepseek.com/news/news250922
34•meetpateltech•1h ago•11 comments

Sj.h: A tiny little JSON parsing library in ~150 lines of C99

https://github.com/rxi/sj.h
435•simonpure•21h ago•214 comments

Show HN: Software Freelancers Contract Template

https://sopimusgeneraattori.ohjelmistofriikit.fi/?lang=en
85•baobabKoodaa•6h ago•26 comments

Optimized Materials in a Flash

https://newscenter.lbl.gov/2025/09/18/optimized-materials-in-a-flash/
9•gnabgib•3d ago•0 comments

We Politely Insist: Your LLM Must Learn the Persian Art of Taarof

https://arxiv.org/abs/2509.01035
103•chosenbeard•13h ago•45 comments

Metamaterials, AI, and the Road to Invisibility Cloaks

https://open.substack.com/pub/thepotentialsurface/p/metamaterials-ai-and-the-road-to
26•Annabella_W•5h ago•8 comments

Why is Venus hell and Earth an Eden?

https://www.quantamagazine.org/why-is-venus-hell-and-earth-an-eden-20250915/
159•pseudolus•15h ago•247 comments

A Generalized Algebraic Theory of Directed Equality

https://jacobneu.phd/
49•matt_d•3d ago•15 comments

The death rays that guard life

https://worksinprogress.co/issue/the-death-rays-that-guard-life/
24•ortegaygasset•4d ago•12 comments

Download responsibly

https://blog.geofabrik.de/index.php/2025/09/10/download-responsibly/
262•marklit•8h ago•182 comments

Simulating a Machine from the 80s

https://rmazur.io/blog/fahivets.html
58•roman-mazur•3d ago•8 comments

How can I influence others without manipulating them?

https://andiroberts.com/leadership-questions/how-to-influence-others-without-manipulating
165•kiyanwang•15h ago•163 comments

I uncovered an ACPI bug in my Dell Inspiron 5567. It was plaguing me for 8 years

https://triangulatedexistence.mataroa.blog/blog/i-uncovered-an-acpi-bug-in-my-dell-inspiron-5667-...
131•thunderbong•4d ago•16 comments

40k-Year-Old Symbols in Caves Worldwide May Be the Earliest Written Language

https://www.openculture.com/2025/09/40000-year-old-symbols-found-in-caves-worldwide-may-be-the-ea...
171•mdp2021•4d ago•99 comments

What if AMD FX had "real" cores? [video]

https://www.youtube.com/watch?v=Lb4FDtAwnqU
12•zdw•3d ago•4 comments

Lightweight, highly accurate line and paragraph detection

https://arxiv.org/abs/2203.09638
131•colonCapitalDee•16h ago•23 comments

Be careful with Go struct embedding

https://mattjhall.co.uk/posts/be-careful-with-go-struct-embedding.html
116•mattjhall•14h ago•83 comments

Tesla coast-to-coast FSD crashes after 60 miles

https://electrek.co/2025/09/21/tesla-influencers-tried-elon-musk-coast-to-coast-self-driving-crashed-before-60-miles/
215•HarHarVeryFunny•2h ago

Comments

gdulli•1h ago
I thought the title meant the software crashed, but it was the car that crashed: "They didn’t make it out of California without crashing into easily avoidable road debris that badly damaged the Tesla Model Y ... In the video, you can see that the driver doesn’t have his hands on the steering wheel. The passenger spots the debris way ahead of time. There was plenty of time to react, but the driver didn’t get his hands on the steering wheel until the last second."
lazide•1h ago
The issue with ‘driver in the loop only for emergencies’ is that it’s harder for someone to be prepared to react only occasionally (but still promptly and correctly) when outlier behavior happens, not easier.

For automated tools like CNC machines, there is a reason it’s just ‘emergency stop’ for instance, not ‘tap the correct thing we should do instead’.

But doing an e-stop on a road at speed is a very dangerous action as well, which is why that isn’t an option here either.

UncleMeat•1h ago
And you don't even know until it is too late even if you are paying attention. Suppose you see the obstruction well in advance. You expect the car to change lanes. You get past the point where you would have changed lanes but there's still time. You aren't yet aware that the car is missing the obstruction. A moment later you are much closer and make the judgement that the car is making a mistake. You grab the wheel but now it's too late.

The alternative is taking control of the car all the time. Something nobody is going to practically do.

lazide•1h ago
Yeah, in many ways the only way a human can know for sure that the car isn't seeing something is its absence of action, which often comes past the point of no return. That's always hard.
detaro•1h ago
And why many "real" self-driving systems that share the road with other users are often quite speed-limited: it allows them to come to a full stop relatively safely if needed, and gives a human time to take over properly. (E.g. the Mercedes system will stick way below the usual speed limits, which means its main use currently is handling heavy stop-and-go traffic, or the many autonomous "buses" for dense city spaces/campuses/...)

Passively monitoring a situation continuously and reacting quickly when needed is something humans are not good at. Even with airline pilots, where we do invest a lot in training and monitoring, it's a well-understood issue that having to take over in a surprising situation often leads to confusion and time needed to re-orient before they can take effective action. Which for planes is often fine, because you have the time buffer needed for that, but not always.
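
A rough sense of why the speed cap matters: stopping distance grows with the square of speed. A back-of-the-envelope sketch; the 1.5 s reaction time and 7 m/s^2 deceleration are assumed values, not figures from this thread:

    # Stopping distance = reaction distance + braking distance:
    #   d = v * t_react + v^2 / (2 * a)
    def stopping_distance_m(speed_kmh, t_react_s=1.5, decel_ms2=7.0):
        v = speed_kmh / 3.6  # km/h -> m/s
        return v * t_react_s + v ** 2 / (2 * decel_ms2)

    for kmh in (30, 60, 90, 120):
        print(f"{kmh:>3} km/h -> {stopping_distance_m(kmh):5.1f} m to stop")
    # ~18 m at 30 km/h vs ~130 m at 120 km/h: staying slow is what makes
    # "come to a full stop relatively safely" possible at all.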

anonymars•1h ago
There's sort of an analogous conundrum with Airbus-style automation: the system has various levels of automation and protection (e.g. preventing you from stalling the plane)

And then when something goes wrong, you are unceremoniously handed a plane with potentially some or all of those protections no longer active.

As an analogy, imagine your FSD car was trying to slow down for something, but along the way there is some issue with a sensor. So it gives up and hands control back to you while it's in the middle of braking, yet now your ABS is no longer active.

So now the situation is much more sudden than it would have been (if you had been driving the car you would have been aware of it and slowing down for it yourself earlier in the game), you likely weren't paying as much attention in the first place because of the automation, and some of the normal protection isn't working.

So it's almost three levels of adding insult to injury. Potentially related discussion: https://news.ycombinator.com/item?id=43970363

kayodelycaon•1h ago
It’s important to point out that airline pilots are trained to handle sudden emergencies. This has been incredibly successful at scale. But it came at great expense, in both money and lost lives. And it still isn’t perfect.

The level of training required to oversee full automation is non-trivial if you have to do more than press a stop button.

Coffeewine•1h ago
I’ll defend Airbus a little - there are flight laws that more or less provide, at any given moment, as much automation as is possible given the state of the sensors and computers. So it doesn’t just go ‘oops, a sensor failed, now you have direct control of the plane.’

It does have the same problem - if 99.999% of your flight time is spent in normal law, you are not especially ready to operate in one of the alternate laws or, god forbid, direct law. Which is similar to a driver who, accustomed to the system, forgets how to drive.

But I think we have a ways to go before we get there. If the car could detect issues earlier and more gradually notify the driver that they need to take control, most every driver at present retains the knowledge of how to directly operate a car with non-navigational automation (ABS as you mentioned, power steering, etc.)

anonymars•44m ago
Yeah, it's a tricky problem to solve, but other design decisions exacerbate it too, like the lack of visual or tactile feedback in the controls.

I was thinking of, for example XL Airways Germany 888T. I was trying to find it and came across this thread making a similar comparison so I'll link that: https://www.reddit.com/r/AdmiralCloudberg/comments/18ks9nl/p...

I agree that we can make the situation less abrupt for cars in some cases (though people will probably get annoyed by the car bugging them for everything going on)

catapart•1h ago
And if the car was fully self-driving and capable of seeing debris that was clearly seen by two humans (who, Tesla claims, have inferior eyes), it could have taken the opportunity to change lanes and avoid the obstruction.

They want to pretend you'll only need to actually intervene in 'edge case' situations, but here we have an example of perfect conditions requiring intervention. Regardless of the buzzwords they can attach to whatever methods they are using, it doesn't feel like it works.

petesergeant•1h ago
> Tesla’s EV business is in decline and the stock price depends entirely on the self-driving and robot promises

I don't think that's right; I think the stock price entirely depends on people seeing it as a vehicle to invest in Musk. If Musk died tomorrow, but nothing else changed at Tesla, the stock price would crater.

Mistletoe•1h ago
I actually think the stock price would go up. His detour into fascism and megalomania has chased off tons of liberal environmentalists like myself, who are the target audience for electric cars. I cancelled what would have been our replacement Tesla when he was implying on Twitter that there can be good reasons for sneaking into the Speaker of the House’s house and hitting her husband in the head with a hammer.
iwontberude•1h ago
Nah, Morgan Stanley and Wedbush would know the jig is up; the fraud would no longer be maintainable without Elon.
lazide•1h ago
Yup. The effect of the reality distortion field has weakened, but remains plenty strong enough for now.
lazide•1h ago
The issue is that if you look at Tesla as a normal car company (without an iconoclast CEO/personality), then you need to do normal P/E math, which is going to be sub-100 for sure.

Right now it’s easily double to triple that, even with Musk’s behavior.

stephen_g•1h ago
Normal P/E math for a car company would put it more in the 10-20 region - we’re not talking double to triple a sensible valuation, but seriously something like 15 times it…
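
To make the multiple arithmetic concrete, here is a minimal sketch; the earnings-per-share number is a made-up placeholder, not Tesla's actual figure:

    # Implied share price = earnings per share (EPS) * P/E multiple.
    # EPS below is hypothetical, chosen only to show how the multiple scales.
    eps = 2.00  # assumed earnings per share, in dollars

    for pe in (15, 100, 250):
        print(f"P/E {pe:>3} -> implied price ${eps * pe:.2f}")
    # On identical earnings, a carmaker-typical ~15x implies $30 while a
    # 250x multiple implies $500 -- the order-of-magnitude gap described above.
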
Fade_Dance•1h ago
It may do this or that on the announcement, but if growth stops (arguably it already has) and there is no hype for years, it's likely going to grind down to a normal valuation with time.

The passive investing / market cap weighted ETF complex tends to help big valuations stay big, but a company like Tesla still needs that sharp shot in the arm followed by frenzied buying occasionally in order to stay aloft (be it by traders, retail participants, shorts covering, etc).

I suppose they could replace Musk with another hype salesman, but the "hate" that Tesla gets is a big part of these upside shock cycles for the stock, because the ticker is a siren call for short sellers, who are ultimately guaranteed future buyers.

walleeee•1h ago
A thorough analysis of the materials/energy reality we inhabit could lead one to make a similar decision for environmental reasons alone.
jcranmer•29m ago
I think firing Musk would do wonders for Tesla as a going concern but would be a disaster for the stock price.

I suspect a significant proportion of Tesla's stock price comes from people who are using it as a proxy for his other companies that the public can't invest in, primarily xAI (as all AI companies are in a horrific bubble right now) and SpaceX.

Zigurd•1h ago
If it were only that TSLA is a way for retail investors to buy shares of ELON, Tesla wouldn't need to do a robotaxi rollout in Austin to bolster claims that FSD is for real.

Elon can't levitate TSLA and other valuations by himself. There has to be at least the appearance of substance. That appearance is wearing thin. While I'm going to observe the caution that the market can stay irrational longer than I can stay solvent, once reality asserts itself, Elon will be powerless to recreate the illusion.

testing22321•1h ago
The world’s best-selling vehicle means nothing? Extremely safe vehicles mean nothing?

I get that it’s way overhyped, but they have real results that can’t be denied

fabian2k•48m ago
That would justify a valuation in the same range as other car companies. Not the valuation Tesla has right now.
HPsquared•1h ago
It's so clear, dry, perfect lighting, no traffic or anything. That's shocking.
coreyh14444•1h ago
The debris is pretty close to the color of the road. Seems like a good case for radar/lidar ¯\_(ツ)_/¯
lazide•1h ago
Who knew spending more on extra sensors would help avoid issues like this? So weird.
raisedbyninjas•57m ago
"Sensor fusion is too hard" is the new cope.
petermcneeley•1h ago
And yet the humans detected it without lidar?
bananapub•1h ago
they're making fun of Tesla, which stopped putting lidar in their cars during the pandemic when it got expensive, and instead of saying "we can't afford it", claimed it's actually better to not have lidar and just rely on cameras
AlotOfReading•53m ago
Tesla has never had LIDAR on production cars, only mapping/ground truth and test vehicles. It was radar that disappeared during the pandemic.
ekianjo•1h ago
do we have an example of a lidar-equipped car that can avoid that?
jefftk•1h ago
Tesla is essentially the only one that doesn't use lidar. I'd be very surprised if a Waymo had a problem with this debris.
Bengalilol•59m ago
Of course. There are plenty of LiDAR demos out there. For a starter: https://www.youtube.com/watch?v=gylfQ4yb5sI BTW: https://www.reddit.com/r/TeslaFSD/comments/1kwrc7p/23_my_wit...
ACCount37•1h ago
Yeah! Just add more sensors! We're only 992 more sensors away from full self-driving! It totally works that way!

The debris? The very visible piece of debris? The piece of debris that a third party camera inside the car did in fact see? Adding 2 radars and 5 LIDARs would totally solve that!

For fuck's sake, I am tired of this worn out argument. The bottleneck of self-driving isn't sensors. It was never sensors. The bottleneck of self-driving always was, and still is: AI.

Every time a self-driving car crashes due to a self-driving fault, you pull the blackbox, and what do you see? The sensors received all the data they needed to make the right call. The system had enough time to make the right call. The system did not make the right call. The issue is always AI.

AnimalMuppet•31m ago
The problem can be sensors, even for humans. When a human's vision gets bad enough, they lose their license.
Scarblac•1h ago
Maybe it decided it was a harmless plastic bag or something.
lm28469•1h ago
Two more years guys, just give him two more years, a few more billion, a bit more political power and I promise he'll give you your fancy self driving toy. (Repeat from 2012 ad infinitum)
cedws•1h ago
Solar roofs, Dojo, Hyperloop, robotaxi, roadster, semi, mission around the Moon, man on Mars by 2020, it’s all coming guys he promised!
hliyan•1h ago
Boring Company. Whatever happened to those tunnels?
danaris•1h ago
They did their job: they discouraged the construction of high-speed rail.

Musk has, IIRC, actually admitted that this was their purpose.

panick21_•46m ago
No it didn't. The Hyperloop had no impact whatsoever on California High Speed Rail.
behnamoh•1h ago
they were boring, construction continued
arm32•1h ago
Easy problem to solve. We just need to train an AI image classifier with highly annotated images of every single possible combination of road debris in every single country in the entire world. Shouldn't take longer than a month, right team? /s
Zigurd•1h ago
Hey boss, should we set aside the grade crossing barrier recognizer for that?
mhh__•1h ago
If it's not your money, does it matter? I don't think it's fair to say he doesn't deliver anything: e.g. he did cars, rockets, and now AI (the rate at which X is building out training capacity is genuinely astonishing), all at the same time.
_aavaa_•1h ago
Yes it does matter. Money thrown away on his lies is money that isn’t invested in real projects.
bilbo0s•56m ago
I think the point is that if it's his money he's pissing away, then any other projects the money would have been spent on would have been equally dubious in any case. He's not going to become wise all of a sudden simply because he stops spending money on what he's currently spending it on.

Did we give him wayyy too much free money via subsidies? Yes. But that was our mistake. And if we hadn't given it to him, we would have given it to some other scam artists somewhere or other. So even in the case of the counterfactual, we could expect a similar outcome. Just different scumbags.

_aavaa_•15m ago
> And if we hadn't given it to him, we would have given it to some other scam artists somewhere or other.

No we wouldn’t have. Not every dollar we give goes to scam artists. And there are a whole lot of industries and companies far less deceitful.

panick21_•45m ago
Overall SpaceX is incredibly successful and Tesla is still reasonably successful. So overall the money is hardly thrown away.
lm28469•40m ago
Success being defined here as "something that makes numbers go up in some capacity"
nolok•29m ago
From a stockholder's point of view Tesla is a massive success.

From a car design and development point of view, it's a massive case of lost opportunities.

From the point of view of someone interested in self-driving, it's a joke.

Really depends on how you view things; purely in stock-market terms Tesla is doing great.

panick21_•23m ago
Well, it's the same measure of success that we use for every company.
lm28469•41m ago
idk man, I'm not in the matrix deep enough to give a shit about "building out training capacity" of LLMs, I think there are way more important topics like not destroying the fabric of our society and political systems, but idk, I guess I'm just a far left antifa maniac terrorist or something
thefourthchime•1h ago
I use Tesla FSD 99% of the time. I recently drove from Austin to Florida and back.

The only time I had to take over was for road debris on the highway. Off the highway it’s very good about avoiding it. My guess is Tesla has not been focusing on this issue as it's not needed for phase one of robotaxi.

gregoriol•1h ago
The fact that it works well in good conditions doesn't make it good. It's the unexpected situations that determine if someone lives or dies.
thefourthchime•1h ago
It doesn't have to do with the conditions. It has to do with whether or not it's on the highway. The car uses a different stack for highway and off-highway driving, and the highway one has been a stepchild for quite a long time now.
AlotOfReading•59m ago
That used to be true. Autopilot was for highways, FSD was for other roads. FSD can be enabled on both since v12 though and this video is specifically an attempt to use FSD on highways to go cross country.
0xcafecafe•1h ago
Any software that works 90% or even 99% of the time is not safe software, especially when it has to be used to operate machines with deadly power.
ekianjo•1h ago
sounds like you are arguing that humans should not be driving
oblio•1h ago
> The überphilosopher Bertrand Russell presents a particularly toxic variant of my surprise jolt in his illustration of what people in his line of business call the Problem of Induction or Problem of Inductive Knowledge (capitalized for its seriousness)—certainly the mother of all problems in life. How can we logically go from specific instances to reach general conclusions? How do we know what we know? How do we know that what we have observed from given objects and events suffices to enable us to figure out their other properties? There are traps built into any kind of knowledge gained from observation.

> Consider a turkey that is fed every day. Every single feeding will firm up the bird's belief that it is the general rule of life to be fed every day by friendly members of the human race "looking out for its best interests," as a politician would say. On the afternoon of the Wednesday before Thanksgiving, something unexpected will happen to the turkey. It will incur a revision of belief.

The Black Swan: The Impact of the Highly Improbable, Nassim Nicholas Taleb, page 40

brk•42m ago
Your experience supports the other data: it’s not full self-driving.
jve•24m ago
99% reliability is no reliability. So per 100 km I should expect one issue? Like multiple per week? FSD has to be much better than that to be trustworthy.
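
The compounding is what makes "99%" so weak. A minimal sketch, assuming each unit of driving (a trip, or a 100 km block) independently succeeds with probability p; independence is itself a generous assumption:

    # Probability of zero incidents over n units at per-unit success p.
    p = 0.99
    for n in (1, 10, 100, 365):
        print(f"{n:>3} units -> {p ** n:.1%} chance of zero incidents")
    # ~90% over 10 units, ~37% over 100, ~2.6% over 365: "99% reliable"
    # compounds into near-certain failure over any real amount of driving.
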
curiousObject•1h ago
It’s an accident that could also happen to an inattentive human driver. (Ask me how I know :( )

But, if you watch the car’s display panel, it looks as if the car didn’t see anything and just went full speed ahead. That’s not great.

It should have slowed down and alerted the driver that something was there. I didn’t watch the complete video so maybe there’s more.

ForHackernews•1h ago
If you watch the video, the human non-drivers in the car noticed the road debris and could have reacted, but seemed not to because they were waiting for the computer to respond first.
saberdancer•1h ago
They were testing FSD, so I understand why they would do it. It also looked like roadkill, and they didn't expect it to be a hard obstacle.
mynameisvlad•1h ago
> They were testing FSD

So if that was a human and they ran them over it'd be okay because they were testing FSD?

They're putting themselves (fine) and everyone around them (far less fine) in danger with this stunt.

curiousObject•17m ago
You are correct that they were testing the system, so they probably waited too long to react. But Tesla’s car didn’t do well in this incident.

A competent human driver would instinctively slow down, look at the potential obstruction, and think about changing lanes or an emergency stop.

Most probably, the visible object is just some water spilled on the road or that kind of thing. But if it isn’t, then it’s very dangerous.

This car appeared to be blind to any risk. That’s not acceptable.

HankStallone•5m ago
Heh, just yesterday I drove past some roadkill and wondered what a self-driving car would make of that. Then I started wondering how it would handle deer crossing the road 50 yards ahead, or running alongside the vehicle as they sometimes do when they get spooked. Or how it would handle the way people drive down the center of gravel roads until they meet someone, and then move over.

At 56, I don't expect to see it on country roads in my lifetime, but maybe someday they'll get there.

jarym•1h ago
"is a Level 2 driver assistance system that requires constant supervision by a human driver" - the reason for human supervision might have something to do with uncommon situations (debris in road being such a situation).

Elon's estimates have always been off, but it is irresponsible to see an obstacle up ahead and assume the computer will do something about it while the driver and passenger debate what said obstacle is. I am not sure if they were trying to win a Darwin Award, and I say that as no particular fan of Musk!

datadrivenangel•1h ago
Level 2 is very dangerous because it's so good that humans are only needed in emergencies...
blueboo•1h ago
The safer the system, the more catastrophic the failures
exe34•1h ago
I assume you spotted the problem? Or are you saying you missed it?
Fricken•16m ago
He's saying the more a driver trusts a partially automated driving system the less likely they are to be paying attention themselves.
aetherspawn•1h ago
I honestly don’t know if I would have seen and avoided that; it came up really fast. And based on the video, it looked like a cardboard box or something not worth avoiding until it was within 2-3 seconds’ range.
chmod775•1h ago
It may look like that on video, but in fact you can hear the two guys pointing it out and chatting about it a whole 8 seconds before impact.

Also of course you're avoiding an unknown object that large, especially when there's plenty of space to go around it on either side.

If you still don't think you can avoid something like that, please get off the road for everyone's safety.

arm32•1h ago
It's easy to give you credit here—you would have seen it and avoided it. They saw it and had plenty of time to steer to the left, in the open lane, to avoid it.
jansan•1h ago
It was at least six seconds between the co-driver pointing it out and the car hitting the object. The screen shows 77 mph, which means he saw it from approx. 200 m away. Anybody would at least be able to prepare or slow down in such a situation.
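
The ~200 m estimate checks out. A quick unit-conversion sketch; the 6-8 second window comes from the comments above, the rest is plain arithmetic:

    # Distance covered at 77 mph over the reaction window described above.
    # 1 mile = 1609.344 m.
    speed_ms = 77 * 1609.344 / 3600  # ~34.4 m/s
    for t_s in (6, 8):
        print(f"{t_s} s at 77 mph -> {speed_ms * t_s:.0f} m")
    # ~207 m at 6 s, ~275 m at 8 s: ample distance to brake or change lanes.
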
agildehaus•1h ago
A friend of mine who loves Tesla watched this video and said "many humans would have hit that". I feel we'll be hearing a lot of that excuse.
mynameisvlad•1h ago
It's their standard go-to excuse. "I would have done that too" really says more about them as drivers than anything else.
rob74•1h ago
Well, I have to admit that your friend has a point. Humans are bad at reacting quickly and correctly to unexpected situations, and some debris large enough to damage your car showing up from out of nowhere after several hours of boring driving along a largely straight highway with little traffic is definitely one of these situations. But a self-driving system worth its salt should always be alert, scanning the road ahead, able to identify dangerous debris, and react accordingly. So, different pair of shoes...
agildehaus•1h ago
The humans in the vehicle spotted it fine, and we should not tolerate self-driving systems that are only as good as the worst of us.
mynameisvlad•1h ago
> and we should not tolerate self-driving systems that are only as good as the worst of us

The person you replied to didn't do that, though:

> But a self-driving system worth its salt should always be alert, scanning the road ahead, able to identify dangerous debris, and react accordingly. So, different pair of shoes...

amelius•55m ago
They never said they did.
ape4•1h ago
And a self-driving car should have better sensors than a human (like lidar)
root_axis•1h ago
I'm not convinced. The debris is clearly visible to the humans a long way off and the adjacent lane is wide open. Avoiding road debris is extremely common even in more congested and treacherous driving conditions. Certainly it's possible that someone texting on their phone or something might miss it, but under normal circumstances it could have been easily avoided.
wlesieutre•52m ago
Even if there wasn't space to swerve, a human would've hit the brakes and hit it much slower
newyankee•17m ago
I feel Waymo's LIDAR might have worked here
close04•1h ago
> I have to admit that your friend has a point

Do they? "Many humans" would hit that? The humans in the car spotted the debris at least 8s before the impact. I don't think any humans would hit that in broad daylight unless they were asleep, completely drunk, or somehow managed to not look at the road for a full 10s. These are the worst drivers, and there aren't that many because the punishment can go up to criminal charges.

The argument that "a human would have made that mistake" backfires, showing that every Tesla equipped with the "safer than a human driver" FSD is in fact at best at "worst human driver" level. But if we still like the "humans also..." argument, then the FSD should face the same punishment a human would in these situations and have its right to drive any car revoked.

sandworm101•56m ago
Or they would hit it if they were busy fiddling with the car's autodrive system. These humans would have avoided it had they not wasted time speculating about whether the autodrive system would save them. They would have been safer in literally any other car that didn't have an autodrive.
tptacek•58m ago
Humans routinely drive from LA to NYC without wrecking their cars. In fact, that's the normal outcome.
foobarian•44m ago
I've been driving on local and highway roads for 30 years now and I have never come across a piece of debris so large that driving over it would damage my car. Seeing that video, I don't have high confidence that I would have dodged that hazard - maybe 70% sure? The thing is, usually there is plenty of traffic ahead that acts very obviously differently in situations like this, which helps as well.

All that to say that I don't feel this is a fair criticism of the FSD system.

ModernMech•32m ago
> All that to say that I don't feel this is a fair criticism of the FSD system.

Yes it is, because the bar isn't whether a human would detect it, but whether a car with LiDAR would. And without a doubt it would, especially given those conditions: a clear day, a flat surface, and a protruding object are a best case scenario for LiDAR. Tesla's FSD was designed by Musk, who is neither an engineer nor an expert in sensors or robotics, and it therefore fails predictably in ways that other systems designed by competent engineers do not.

quickthrowman•25m ago
You’ve been driving for 30 years and have never seen a semi truck tire in the middle of the road after it ripped off the rim of a truck?
dmix•42m ago
I'd imagine the highway doesn't normally have debris that's nearly the same color as the surface of the highway.
cozzyd•34m ago
The highway often has debris on it at night which can be even harder to see
JohnFen•10m ago
I encounter such things a few times a year. Usually a retread that has come off a truck tire.
JimmaDaRustla•45m ago
Literally the passenger saw it and leaned in; the driver grabbed the steering wheel, it seems, to brace himself. That object on the road was massive, absolutely huge as far as on-road obstacles go. The camera does not do it any justice: it looks like it's 3 feet long, over a foot wide, and about 6 or 7 inches high lying on the road. Unless a human driver really isn't paying attention, they're not hitting that thing.
samrus•15m ago
Bullshit. Some humans might hit that because they weren't paying attention, but most people would see that, slow down and change lanes. This is a relatively common scenario that humans deal with. Even the passenger here saw it in time. The driver was relying on FSD and missed it.

I don't think FSD has the intelligence to navigate this.

underdeserver•1h ago
I wouldn't. And I'm not that great a driver. The Tesla should be significantly better than me.
VBprogrammer•1h ago
I don't love Tesla (though I would like an electric car). I don't think it's unlikely that someone driving could have hit that or caused an even worse accident trying to avoid it.

However, I'm sure a human driver in the loop would have reacted. The driver sitting there watching a machine do its thing 99% of the time and being expected to step in to save that situation, though, is a horrific misreading of human nature.

pmontra•1h ago
Those humans saw the debris. When a human is actively at the wheel, what happens next is that the driver should check all mirrors, decide whether to change lanes or brake, and execute. Anything else could lead to a movie-like multiple-car accident. Hitting the debris is the least dangerous course of action if there are cars all around. That looked like an empty road, but who knows.

By the way, playing a lot of racing videogames is great training for dealing with that sort of stuff, except maybe for getting good at mirrors. I've been in a few dangerous situations, and each was just one more averted crash among ten thousand. No thinking, only reflexes.

lawgimenez•1h ago
Humans would have slowed down at least. After watching it many times: that shadow in front is too large not to warrant concern.
cozzyd•1h ago
Sure and many humans have no business having a driver's license.
delfinom•1h ago
I would counter that my little cheap Civic has hit things like that and hasn't broken a thing. Heh.

The best time was a very badly designed speed bump that I didn't even hit at high speed, but one side was ridiculously inclined vs. the other, so the entire Civic's front end just smashed right into the pavement and dragged for 3 feet. I wouldn't be surprised if I went into a body shop and found the front end tilted upwards by a few centimeters. lol

potato3732842•42m ago
Timestamp 8:00-8:30. Your civic is not hitting that and surviving any better than the Tesla. It just got luckier.
2OEH8eoCRo0•1h ago
The humans in the video spotted it far in advance. I think that any human watching the road could easily avoid that.
tedggh•1h ago
The question is whether avoiding the obstacle or braking was the safest thing to do. I did not watch the entire test, but there are definitely cases where a human will suddenly brake or change lanes and cause a very unsafe situation for other drivers. Not saying that was the case here, but sometimes what a human would do is not a good rule for what the autonomous system should do.
mooxie•32m ago
An enormous part of safe driving is maintaining a mental map of the vehicles around you and what your options are if you need to make sudden changes. If you are not able to react to changing conditions without being unsafe, you are driving unsafely already.
cozzyd•17m ago
This is an important aspect of why "supervised" self driving is much more dangerous than just driving.
sandworm101•1h ago
Yes. Humans would. Which is why the car should be able to handle the impact. My Honda Civic has had worse without issue. The suspension should be beefy enough to absorb the impact with, at worst, a blown tire. That the car has broken suspension says to me that Teslas are still too fragile, built more like performance cars than everyday drivers.
tedggh•34m ago
With millions of Teslas on the road, one would think that if that were true we would have heard something by now. My absolute worst car quality-wise ever was a Honda Accord. And I have owned shitty cars, including a Fiat. My most reliable car was a Honda Civic, before I “upgraded” to a brand new Accord. I abuse my Tesla and so far no issues driving on one of the worst roads in the country. I must hit 100 potholes per month and blew a tire already. It’s not a fun car to drive like a GTI (which I own as well), but it’s definitely a solid car.
sandworm101•13m ago
Cars with "bad" suspension tend to survive potholes. A car with slow-to-move suspension will see the wheel dip less into the hole when traveling at speed. But that is the exact opposite of the behavior you want when dealing with debris, which requires the suspension to move up rather than down. "Good" systems will have different response curves for up than for down. Quasi-luxury cars fake this by having slow suspension in both directions, to give the sense of "floating over potholes".

[Cut, google ai provided wrong numbers]

moi2388•59m ago
A human did hit that...
rkomorn•57m ago
I guess the point was the human was intentionally letting the car do its thing, though.
dmix•41m ago
With their hands on the steering wheel and foot next to a brake and two people looking out the window.
rkomorn•10m ago
Are you saying "they let the car drive on its own but were still paying attention"?

Not sure how to take this reply.

JimmaDaRustla•47m ago
Did your friend make any mention that the passenger saw it hundreds of feet away and even leaned in as they headed directly towards it? The driver also recognized it and grabbed the wheel as if to say "brace for impact!".
dmix•44m ago
They are right.
ModernMech•28m ago
Yes many human drivers would hit it. The bad ones. But we should want driverless cars to be better than bad drivers. Personally, I expect driverless cars to be better than good drivers. And no, good drivers would not hit that thing.
mooxie•39m ago
That's laughable. Any human who couldn't avoid a large, clearly-visible object in the middle of an empty, well-lit road should not be operating a vehicle.

That's not to say that there aren't many drivers who shouldn't be driving, so both can be true at once, but this is certainly not a bar against which to gauge autonomous driving.

sanp•28m ago
Question - isn't P(Hitting | Human Driving) still less than P(Hitting | Tesla FSD) in this particular case [given that if this particular situation comes up - Tesla will fail always whereas some / many humans would not]?
mrtksn•11m ago
Obviously, in this particular case the humans wouldn't have hit that. The people in the video clearly saw the object, but they didn't want to react because that would have ruined their video.

Even if they did not understand what it was: in the real world, when you see something on the road, you slow down or do a maneuver to avoid it, no matter whether it is a harmless piece of cloth or something dangerous like this. People are very good at telling if something is off; you can see it in the video.

jcranmer•2m ago
When the self-driving car killed a pedestrian several years ago, the initial sentiment on this site for the first few hours was essentially "those dastardly pedestrians, darting into traffic at the last second, how are you supposed to avoid them?" It took several hours for enough information to percolate through to make people realize that the pedestrian had been slowly and quite visibly crossing the road, and neither the self-driving car nor the safety driver did a thing to react to it.

Another thing to keep in mind is that video footage is much lower quality than what we can see with our human eyeballs. At no point in the video can I clearly identify what the debris is, but it's evident that the humans in the car can, because they're clearly reacting to it seconds before it's even visible to us in the dash-cam-quality footage. I will freely accept that many drivers are in fact bad drivers, but a carcass (I think?) in a lane, visible from more than 10 seconds away, is something that anyone who can't avoid it needs to have their license revoked over.

general1465•1h ago
Tangentially: if you as a European happen to drive on US highways, you will notice that they are heavily littered with fallen cargo, aluminum ladders, huge amounts of shredded tires and occasionally a trailer without a towing car... It has been so bizarre for me to observe this. Nobody is cleaning that?
comrade1234•1h ago
I just got back from a trip to the USA where I spent about five days driving around Michigan, Illinois, and Wisconsin and the number of shredded truck tires on the highways was flabbergasting.

From what I understand, in the USA when a truck tire wears down they put a new layer of tread on it. But it doesn't seem to work very well as it eventually peels off.

rkomorn•1h ago
Truck tire chunks on freeways was one of the biggest surprises to me.
__alexs•1h ago
Retreading like this is also entirely normal in Europe. I guess they just have more trucks?
disiplus•33m ago
Retreading is legal, but only certified companies with their own E mark can legally do it. Combine that with the more in-depth 6- or 12-month inspections for >3.5t trucks, and it usually means the tires are in better condition.
potato3732842•56m ago
Those are called retreads and they are not uncommon worldwide. If you're seeing anything other than long thin strips of tread on the road, it's not a retread-related failure.

Every now and then the Karens get to screeching about it, it reaches a critical mass, the NHTSA does a study, and they find that most of the debris you're seeing has nothing to do with retreads. Here's the summary of a recent one:

https://www.moderntiredealer.com/industry-news/commercial-bu...

kitd•45m ago
I hit a retread as it became detached from the lorry I was following on the M25, UK. Scary moment, similar to the video in TFA, + an expensive repair job.
justincormack•1h ago
It is clearly designed to test the self driving...
thefourthchime•1h ago
Yes, people call it in and cops clean it up. We just have a lot of roads and people that poorly pack stuff in trucks.
q3k•1h ago
... cops? There's no, like, highway traffic authority with their own maintenance force to take care of this?
jeffbee•1h ago
A ladder in the road is a legitimate emergency. If you call 911 to report a ladder in the road, they will direct the issue to the relevant agency (which will be the state police, in all likelihood, because they will need to close a lane).
bilbo0s•1h ago
I'm sorry. Could you repeat that? I couldn't make out what you said clearly?

All I heard was "taxes".

/s

On a more serious note, in the US we generally go in the direction of fewer services rather than more services. It, of course, leads to massive inefficiencies, like police removing shredded tires. But it's very difficult here, politically speaking, to get Americans to agree to add government services.

zwaps•1h ago
Why are Dutch or Swiss highways so much more pristine than French and German ones?

Have you compared distances?

carlmr•1h ago
German ones are still way way cleaner than US highways.

Germany is also the country with the most traffic, because it's kind of at the center of everything and pays for roads with income tax instead of tolls.

The French charge really high tolls and have very little traffic compared to Germany. They really don't have an excuse.

moi2388•57m ago
Excuses is pretty much all the French have tbh..
simgt•41m ago
> They really don't have an excuse.

Oh but we do. Most of the state-owned motorways were sold off to conglomerates about two decades ago; dividends and exec comp have to come from somewhere.

Xylakant•28m ago
We do have road tolls for trucks in Germany. It's been like that for quite some time.
disiplus•22m ago
In Germany, trucks pay tolls for highways. Also, all gas is taxed, and that goes into the federal budget that is then used to finance highways, so I would say everybody filling up is financing highways.
general1465•1h ago
You don't need to drive far; just get on a US highway and there is dangerous litter every few hundred meters. In extreme cases it goes down to a few dozen meters. Sometimes it was like driving in some Mad Max movie.
AnimalMuppet•54m ago
Litter? Sure. Dangerous litter? Every few hundred meters? No. Not sure where you're driving, but no, that's not in general the way US highways are.

I mean, I have seen some dangerous things. Not at the rate you describe, though. Not even close.

fransje26•19m ago
> Why are Dutch or Swiss highways so much more pristine than French and German ones?

??

They are not..

jeffbee•1h ago
You might have also noticed that the pavement is terrible. Mostly this comes down to the fact that in Europe fuel taxes are 3-4x ours.
rkomorn•1h ago
Not just that but also tolls. There are way more toll roads, at least where I've lived in Europe, compared to where I've lived in the US (Spain being the one very noticeable exception between France and Portugal).
xnx•27m ago
Fuel tax is a big factor, but the US has a lot of road: 3x the amount of paved surface vs. Germany. European winters are also milder than the US's. I'm not sure how many European roads would survive going from -10 to 140 °F like they do in the Midwest.
tedggh•1h ago
It depends where in the US you drive. It’s a big country with independent state governments. It’s like saying “I was driving in Romania and I was shocked by how bad European highways are.” I lived in Texas and the stuff I saw on the highway was crazy: vacuum cleaners, decorated Christmas trees and refrigerators. In most parts of the country, the interstate and highway systems are pretty clean.
rkomorn•1h ago
I lived in two parts of California, and New Jersey for almost three decades and I've traveled to a lot of the US for leisure or work.

I honestly can't recall ever feeling like I was going through a markedly different area in either better or worse directions.

peterfirefly•54m ago
Romania has pretty good highways now.
potato3732842•1h ago
It's been years since I've seen anything you couldn't drive over that wasn't in the travel lane.

Around here anything big enough to matter gets reported, the cops fling it to the side of the road and it gets picked up on a schedule because they always pick things up a few weeks before they mow (presumably because hitting garbage isn't great for the mowers).

duxup•1h ago
It greatly depends on where you are.

But yes there are folks whose job it is to clean that stuff up.

xnx•33m ago
> Nobody is cleaning that?

In the linked video, highway patrol comes out to remove the debris just a few minutes after they hit it. (Highway patrol had been called out prior to them hitting it.)

tedggh•1h ago
I got my first Tesla last year and my first trip with FSD was just after the launch of V13, so I could not compare it to earlier versions. But I was shocked by how good it was. I completed an 800-mile trip with a handful of interventions, most or all of them likely unnecessary. Even with rain the system worked well. I don’t understand how the system was not able to see this obstacle and slow down or change lanes. Did they have traffic tailgating? I can see some edge cases where there’s really no way to avoid something like this safely. In any case it’s pretty unfortunate, and it will make me even more cautious when using FSD.
Lionga•1h ago
Even with traffic tailgating, it would need to brake or move to the right. Your comment sounds like a Tesla ad written by someone from Tesla or a heavy, heavy fanboy.
thefourthchime•1h ago
When they first released V13, the highway stack was still the old C++ code. It wasn't until four or five months later that they switched it to a neural network. It doesn't seem like they've put much focus on it since then.
AlotOfReading•49m ago
Tesla FSD still uses C++. What you're referring to is control policy code that was deleted. FSD also doesn't have a separate "highway stack" and hasn't since v12.
agildehaus•1h ago
It's because anecdotal experience means very little. In ~2021, these vehicles could almost never make it from A to B without a safety-critical intervention.

That they can today doesn't mean they can do the same route 10,000 times without a safety-critical intervention. In fact, public tracking indicates Tesla's numbers are MUCH lower than that.

lapcat•1h ago
> Did they have traffic tailgating?

No. If you continue to watch the video after the collision, it's clear that there was no traffic tailgating. They even slowed down and pulled over to the side of the road. No other cars.

richrichardsson•41m ago
I'm unsure why having traffic tailgating even factors into it. If you have to hit the brakes to avoid a collision in front of you, it's not your responsibility to deal with traffic behind you that wasn't at a safe distance; that's all on them.
kgwgk•16m ago
It’s not like you would expect any inconvenience from being rear-ended or anything so why would you care?
mv4•1h ago
I've been using FSD lately on trips from CT to NYC and back (and the car performed well). My biggest fear has been debris (like shown in the video), or deer.
bambax•1h ago
What is the point of "testing" something that everyone knows doesn't work and only exists because a serial liar said so? If Musk says you will survive a fall from a 90-story high-rise... don't test it.
spacebanana7•1h ago
We should rigorously test the things we’re most skeptical of.

You shouldn’t take pills a stranger gives you at a music festival without checking them for loads of things, for example. Even if you don’t ever intend on consuming them it’s nice to have specific accusations with evidence.

motiw•1h ago
Would lidar detect such debris?
Bengalilol•56m ago
Yes. Everything that is or has a volume gets detected. The problem with the camera approach is that the debris doesn't move, is quite small and is dark. I suspect the car overran it maybe because of a fix/patch for this behavior: https://www.reddit.com/r/TeslaFSD/comments/1kwrc7p/23_my_wit...
SillyUsername•1h ago
A lot of apologists say that "a human would have hit that".

That's kind of irrelevant; this technology is meant to be safer and held to a higher standard.

Comparing to a human is not a valid excuse...

Lionga•1h ago
A human would not have hit that; the two guys saw it coming from a long way off and would have stopped or changed lanes.
keyle•1h ago
Indeed... You can see the driver reaching for the wheel; presumably he saw it coming and would have hit the brakes. He left the car to do its thing, thinking it knows better than him... maybe.
littlecranky67•1h ago
> That's kind of irrelevant, this technology is meant to be safer and held to a higher standard

I don't think that is the case. We will judge FSD on whether it makes more or fewer accidents than humans, not necessarily in the same situations. The computer is allowed to make mistakes that a human wouldn't if, in reverse, the computer makes a lot fewer mistakes in situations where humans would.

Given that >90% of accidents are easily avoidable (speeding, not keeping enough safety distance, drunk/tired driving, distraction due to smartphone usage), I think we will see FSD be safer on average very quickly.
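
A minimal sketch of that averaging argument, with entirely invented rates, just to show how the trade-off nets out:

    # Invented, illustrative numbers only: suppose 90% of human crashes stem
    # from avoidable errors (speeding, drunk driving, distraction) that a
    # computer never commits, but the computer adds new failure modes of its
    # own (e.g. missed debris).
    human_rate = 1.00                    # normalized human crash rate
    fsd_rate = 0.10 * human_rate + 0.15  # residual 10% + assumed new modes
    print(f"human {human_rate:.2f} vs fsd {fsd_rate:.2f} (normalized)")
    # Even with sizeable new failure modes the average can come out ahead,
    # which is the claim above; whether the new modes stay small is exactly
    # what the rest of the thread disputes.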

ACCount37•56m ago
That's the main advantage self-driving has over humans now.

A self-driving car of today still underperforms the top-of-the-line human driver - but it sure outperforms the "0.1% worst case": the dumbest, most inebriated, sleep-deprived and distracted reckless driver that's responsible for the vast majority of severe road accidents.

Statistics show it plain and clear: self-driving cars already get into fewer accidents than humans, and the accidents they get into are much less severe too. Their performance is consistently mediocre. Being unable to drink and drive is a big part of where their safety edge comes from.

Xylakant•26m ago
The statistics on this are much less clear than Tesla would like us to believe. There's a lot of confounding factors, among them the fact that the autonomous driver can decide to hand over things to a human the moment things get hairy. The subsequent crash then gets credited to human error.
ACCount37•10m ago
That's an often-repeated lie.

Tesla's crash reporting rules:

> To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed.

NHSTA's reporting rules are even more conservative:

> Level 2 ADAS: Entities named in the General Order must report a crash if Level 2 ADAS was in use at any time within 30 seconds of the crash and the crash involved a vulnerable road user being struck or resulted in a fatality, an air bag deployment, or any individual being transported to a hospital for medical treatment.

At highway speeds, "30 seconds" is just shy of an eternity.
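
For scale, a sketch of how much road each attribution window covers; the 70 mph highway speed is an assumed, typical figure:

    # Distance traveled during the 5 s (Tesla) and 30 s (NHTSA) crash
    # attribution windows, at an assumed 70 mph.
    speed_ms = 70 * 1609.344 / 3600  # ~31.3 m/s
    for window_s in (5, 30):
        print(f"{window_s:>2} s window -> {speed_ms * window_s:.0f} m of travel")
    # ~156 m for the 5 s rule, ~939 m for the 30 s rule: disengaging just
    # before impact doesn't move a crash out of either window.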

piva00•52m ago
> The computer is allowed to make mistakes that a human wouldn't, if in reverse the computer makes a lot less mistakes in situations where humans would.

This subverts all of the accumulated experience other road users have about what a car will do. Everyone is used to the potential issues caused by humans; on top of that, other road users will now have to learn the quirks of FSD and keep an eye out for abnormalities in behaviour.

That's just unrealistic: not only will people have to deal with what other drivers can throw at them (e.g. veering off lane due to inattention), but they'll also have to be careful around Teslas, which can phantom brake out of nowhere, fail to avoid debris (shooting it off on unpredictable paths), etc.

I don't think we should accept new failure modes on the road for FSD, requiring everyone else to learn them and be on alert; it's just a lot more cognitive load...

davidcbc•14m ago
> I think we will see FSD be safer on average very quickly.

This is what Musk has been claiming for almost a decade at this point and yet here we are

pbasista•1h ago
Anecdotal: I am surprised how the basic Tesla autopilot often cannot even read the speed limit signs correctly. In perfect lighting conditions. It just misses a lot of them. And it does not understand the traffic rules enough to know when the speed limit ends.

I know that the basic autopilot is a completely different system than the so-called FSD.

But equipped with that experience from the basic autopilot, it does not surprise me that large debris on the road was completely missed by the FSD.

tedggh•1h ago
I use autopilot for local driving (city - suburbs) and I pay for FSD when on long road trips (>300 miles). You are correct, they are completely different things, so one doesn’t correlate with the other.
mbreese•59m ago
That they are different things is really disappointing. If you want people to trust the system enough to buy FSD, the autopilot mode should use the same system, with limited functions. There is no reason why the vision/detection systems should be different. Especially if you already have the proper hardware installed…
samrus•13m ago
Completely agree. Like rate-limiting the model the way LLM providers do, where you get like 5 FSD drives per month on the free tier or something.
helge9210•1h ago
I would try to pass it between the wheels and would crash the same.

At least for me there was nothing indicating there is not enough clearance.

kayodelycaon•48m ago
I’d disagree. :/

It’s a huge chunk of metal well over a foot long in the middle of a lane. The center of a lane is usually higher than the sides, and an uneven patch of road can cause a slight bounce before it, further reducing clearance. I’d worry about my RAV4 (8 inches) clearing it safely.

At first, I thought it was possibly a tire tread, which tend to curl up.

gyanchawdhary•1h ago
For those following along and curious about the tone and intensity of the criticism against Musk and Tesla (both here and in general) this blog post is a solid framework for understanding what drives many of the responses https://medium.com/incerto/the-most-intolerant-wins-the-dict...
TheAceOfHearts•1h ago
One of my biggest criticisms of Elon is that he rarely takes accountability for his words and nobody ever holds him accountable by asking directly. This isn't a high bar to clear, but some people continuously excuse him as being "overly-eager" in his predictions. If that was the case, he could still provide reasonable updates when a predicted date is missed along with an explanation, even if it's just: "Hey, it turns out this problem was much more difficult than we initially expected and it'll take longer". A lot of the problems that he's trying to solve are actually quite difficult, so it's understandable that predictions will be imprecise... But when you realize that your predictions are going to be wrong, you should have the basic decency to update people.

When you're wielding immense amounts of money, power, and influence I think it's worth trying to do the bare-minimum to hold people accountable for their words and claims. Otherwise your words are meaningless.

tokioyoyo•58m ago
In America, you fail the second you apologize or take accountability. Ignoring criticism and deflecting all the time gets you further, as it is part of the game. Unfortunately, this is just an accepted social science-y thing at this point. It is very much a cultural thing of the past couple of decades.
amelius•50m ago
That isn't the case in engineering cultures, like Boeing's before it changed into a business culture.
ActionHank•22m ago
Is there any engineering culture left in the US?

I feel like this is the case across the board.

JohnFen•15m ago
Yes, there is. In every place that I've worked, including my current position, acknowledging when you're wrong or have failed at something increases trust in you and your professionalism.

People who always have an excuse, try to shift blame, etc., are assumed to be lacking in competency (let alone ethics and trustworthiness).

iancmceachern•45m ago
Do you have sources?
bayindirh•41m ago
That's an interesting take. What I have heard from a very old friend of my father's is the opposite:

> Knowing when to say thanks and when to say sorry is the key for success.

...and I have used this piece of advice ever since; it has paid me handsomely. Of course, this doesn't allow you to be shameless; on the contrary, it requires you to stick to your values as a prerequisite.

I think what allows Elon to behave like that is that he can retaliate without any repercussions, since he has tons of money and influence in some circles.

2OEH8eoCRo0•22m ago
Machiavellian
organsnyder•22m ago
Then we need to change that. Those with power are best-equipped to effect that change.
JohnFen•14m ago
This is far from universal in the US, but it's certainly true in certain circles.
skeeter2020•2m ago
It's too bad, because "I'm sorry; this is my fault" is the biggest defuser of anger and the best way to appease mad customers. Try it sometime; the other party goes from ready to kill you to apologetic themselves (if you're genuine). Unfortunately, it's seen as a sign of weakness by people like Elon and his cult of imitators, and as an admission of liability by the litigious crowd. If you can be strong, confident, and ready to admit it when you're wrong, you'll not only be successful in confrontational situations but also not a giant dick.
panick21_•51m ago
Interestingly, with SpaceX he is much more willing to change plans; there, he and the company seem to be genuinely searching for the right solution.

For self-driving, he simply decided X is right and talked about exponentials, and no matter how many times it fails, there is no reflection whatsoever.

sureglymop•47m ago
> when you realize that your predictions are going to be wrong, you should have the basic decency to update people

Not to get too political, but the last I've heard of Elon Musk is that he was speaking to and mobilizing right-wing extremists at a big protest in London. I am also pretty sure he has been trying to do similar things in other European nations (for whatever reason).

It seems to me that it is a bit late to plead for "basic decency" at this moment.

But at the same time let me ask you, what has he got to lose? What financial or reputational risk is he taking by not taking any accountability?

samrus•17m ago
Society needs a "no assholes" policy in order to stay high-trust. Elon not being a pariah because of his grifting is a sign the US is becoming a lower and lower trust society. And it's the billionaires making it so.
pavlov•43m ago
He lies relentlessly even to customers who paid for the product.

I know because I’m one of them. FSD paid in full almost seven years ago, still does absolutely nothing in Europe. A five-year-old would do a better job at driving because the Tesla can’t even see speed limit signs correctly.

Tesla takes no responsibility for its misleading marketing and years of lies. Most recently, Musk promised in early 2025 that these old cars would get a hardware update that would finally enable the paid-for FSD (as if…). The company itself pretends to know nothing about this latest promise made by its CEO.

It’s insane that a business with a trillion-dollar market cap operates like this. It seems to be more of a cult than a real company.

bayindirh•38m ago
> because the Tesla can’t even see speed limit signs correctly.

This is sad and atrocious. Not only can a Ford Puma (an econobox compared to a Tesla) read almost all speed limit signs correctly, it can also pull speed limit data from its onboard maps when there are no signs, roughly along the lines of the sketch below. These maps can be updated via WiFi or an onboard modem, too.

Tesla mocked the "big auto industry", but that elephant proved it can run when it needs to.
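
A minimal sketch of that sign-vs-map fallback, assuming nothing about Ford's actual implementation (all names here are hypothetical):

    # Hypothetical speed-limit source selection: prefer a recently read
    # sign, fall back to onboard map data when no sign has been seen.
    from typing import Optional

    def current_limit_kmh(sign_kmh: Optional[int], sign_age_s: float,
                          map_kmh: Optional[int],
                          max_sign_age_s: float = 300.0) -> Optional[int]:
        if sign_kmh is not None and sign_age_s <= max_sign_age_s:
            return sign_kmh    # trust a freshly read sign first
        return map_kmh         # otherwise fall back to (possibly stale) map data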

cozzyd•30m ago
I think Elon would say that this is a regulatory problem because you guys don't have the same signs or distance units as in the US.
samrus•19m ago
So why take money for it? And computer vision is well past the stage where the information can be read from these signs; there's enough training data. So that's not a real hurdle. If different road geometry or traffic customs/rules are the issue, then just admit that FSD can't generalize like a human and is overfit to the US. Why lie and pretend it's almost human-level?
cozzyd•13m ago
I was being a bit facetious (related to Elon's previous bullshit claims that it was only a regulatory issue that everybody's Tesla can't moonlight as a robotaxi by the end of some year in the past... sure, Elon).
thinkingtoilet•21m ago
He's a psychopath, in the sense that he doesn't feel normal emotions like remorse and empathy. He will lie to your face to get you to buy his product, and when it fails to deliver on promises he will lie again.
dweinus•13m ago
My biggest criticism of Elon is that he supports neo-nazis
pirates•59m ago
Props to the article for including a GIF of the crash so we don’t have to give these people any more views on YouTube.
zzzeek•57m ago
I rant about Elon a lot, but can someone just explain to me how this keeps going on? Self-driving is almost completely a solved problem for the likes of Waymo. Why does anyone care what Tesla is failing to do with FSD? Is this all about "how can we invent FSD without lidar"? Why are we bothering? Because Cybertruck owners don't want a dorky box on top of their truck? Does their truck not already look ridiculous?
dmix•25m ago
It's almost entirely a product/economic gamble. Basically:

"How do we get self-driving into millions of regular consumer cars without doubling the price by adding expensive sensors and redesigning the vehicle around huge chunky sensors"

Waymo is focused on taxis for a reason: the hardware is likely going to be unaffordable to anyone except people driving Lambos. But that may also be fine for a big chunk of the public who don't want to own cars (or want to rent out their own as a service).

Some consumer car companies are experimenting with adding smaller lidar sensors, like the recent Volvo EX90, which costs ~$100k. But they aren't as sophisticated as Waymo's.

xnx•41m ago
Guy in the video calls it a "girder", but it is almost certainly a trailer ramp: https://www.proxibid.com/lotinformation/54574457/2006-murray...

That was my hunch, but Google Lens was able to ID it. Possible that Waymo vehicles can do this too, but that must take some serious compute and optimization to do at highway speeds.

xnx•36m ago
Respect to these guys for committing to the bit and letting the Tesla hit it. This is real journalism, in stark contrast to so much of the staged engagement-bait on YouTube.
ThinkBeat•35m ago
It seems well documented that the Tesla system is at Level 2 and requires "hands-on supervision".

Has Elon lied about the capabilities? Yes, on many occasions.

Crashing your car to prove it seems like a waste when the documentation is clear.

""" Tesla Autopilot is an advanced driver-assistance system (ADAS) developed by Tesla, Inc. that provides partial vehicle automation, corresponding to Level 2 automation as defined by SAE International """ https://en.wikipedia.org/wiki/Tesla_Autopilot

"""

Drive Autonomously: No manufacturer has achieved Level 4 or Level 5 autonomy for road use. Tesla has not achieved Level 3 conditional autonomy like Mercedes-Benz’s DrivePilot system. A Tesla cannot drive itself. The driver must remain attentive at all times. Tesla now qualifies Autopilot and Full Self-Driving with a (Supervised) parenthetical on its website. Tesla’s Level 2 system is a hands-on system that requires the driver to regain control immediately. Torque sensors confirm the driver’s hands are on the wheel or yoke. Level 2 Driving on City Streets: Tesla does list Autosteer on City Streets as a feature of Full-Self Driving. But notably, its website provides no further """ https://insideevs.com/news/742295/tesla-autopilot-abilities-...

""" In a statement addressing the US recall, Tesla declared its technology is a ‘Level Two’ semi-autonomous driving system – not the more advanced ‘Level Three’ system which is already being developed and rolled out by rival car-makers. """ https://www.drive.com.au/news/tesla-full-self-driving-level-...

monkeyelite•18m ago
I was hoping to learn more about self-driving here but mostly see Elon gossip.
toss1•12m ago
"One test is worth a thousand opinions."

And here, the too-frequently posted excuse that "oh, many humans would have hit that too" is utter nonsense.

In that situation, with light traffic, clear daylight visibility, and wide shoulders, any human who would have hit that is either highly distracted, incompetent, or drunk.

Both driver and passenger saw the object at least 7-8 seconds ahead of time; at 0:00 the passenger is pointing and they are commenting on the object, at 0:05 the passenger is leaning forward with concern, and the Tesla drives over it at 0:08. The "Full Self Driving" Tesla didn't even sound a warning until a second AFTER it hit the object.
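
A back-of-envelope check of that margin, assuming roughly 65 mph (the video doesn't state the exact speed):

    # Rough margin check; 65 mph is an assumption, not stated in the video.
    MPH_TO_MS = 1609.34 / 3600      # metres per second per mph

    speed = 65 * MPH_TO_MS          # ~29 m/s
    visible = 8 * speed             # object seen ~8 s out: ~232 m
    react = 1.5 * speed             # typical perception-reaction (~1.5 s): ~44 m
    lane_change = 4 * speed         # a leisurely 4 s lane change: ~116 m

    print(f"visible ~{visible:.0f} m ahead; react + move needs ~{react + lane_change:.0f} m")
    # -> visible ~232 m ahead; react + move needs ~160 m (comfortable margin)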

Any alert, half-competent driver would certainly have lifted off the accelerator, started braking, and changed lanes in half that time. They didn't because of the expectation that the Tesla would take some corrective action. Bad assumption.

"My 'Full Self Driving' is as good as a drunk" is not a worthwhile claim.

Worse yet, the entire concept of [it drives and then hands control to the human when it can't handle a situation] is actively dangerous to levels of insanity.

Human perceptual and nervous systems are terrible at tasks requiring vigilance; it is as if our brains evolved for attention to wander. Having a life-critical task that can literally kill you or others ALMOST fully handled autonomously is a situation designed for the human to lose attention and situational awareness. Then demanding that, in a split second, (s)he immediately become fully oriented, think of a reaction plan, and execute it is a recipe for disaster.

In this case, it is even worse. The Tesla itself gave the humans zero warning.

The driver and passenger saw the object well in advance of the Tesla, with 3-4 times the time and distance needed to react effectively. But they assumed nothing was wrong because they expected the Tesla to handle the situation; they were not in a driving mindset, instead waiting to see what the Tesla would do. They were not actively driving the car in the world. Fortunately, the only result was a mangled Tesla, this time.

KevinMS•7m ago
That just seems like bad luck for the experiment. I've never seen anything like that driving on the highway. Was the claim that it could go coast to coast no matter what was thrown at it?