
EU to Launch Age Verification App for Online Use in July

https://www.pymnts.com/mobile-applications/2025/european-union-launch-age-verification-app-online-use-july/
1•nickslaughter02•39s ago•0 comments

Changefly: Rebuilding the Foundation of Privacy and Authentication

https://www.changefly.com/developer
1•davidandgoli4th•1m ago•0 comments

Autopoietic Networks (a few more examples)

https://gbragafibra.github.io/2025/05/27/autopoietic_nets2.html
2•Fibra•5m ago•0 comments

Why Specs Matter

http://www.diveintomark.link/2004/why-specs-matter
1•fanf2•7m ago•0 comments

Show HN: Hackertuah – I made a Hacker News CLI in Rust

https://github.com/program247365/hackertuah
1•program247365•8m ago•0 comments

Dr. Steel – TVD, Vol.2

https://www.youtube.com/watch?v=A7K1ogK6318
1•doener•10m ago•0 comments

Quantum neural networks form Gaussian processes

https://www.nature.com/articles/s41567-025-02883-z
1•bookofjoe•11m ago•0 comments

Show HN: Today, I Built a Daily Task Tracker App Using Python

1•DeepTechTaiye•12m ago•0 comments

Avoiding mistakes with AWS OIDC integration conditions

https://www.wiz.io/blog/avoiding-mistakes-with-aws-oidc-integration-conditions
1•mooreds•13m ago•0 comments

Photos taken inside musical instruments

https://www.dpreview.com/photography/5400934096/probe-lenses-and-focus-stacking-the-secrets-to-incredible-photos-taken-inside-instruments
2•worik•16m ago•0 comments

Interactive Development

https://lambdaisland.com/blog/2025-05-13-on-interactive
1•todsacerdoti•16m ago•0 comments

Twitter Down?

6•nokcha•17m ago•2 comments

Joseph Holsten Paintings

https://holstonart.com/art/paintings
1•mooreds•17m ago•0 comments

Falling Houses Prices Aren't Just a Texas or Florida Story

https://www.bloomberg.com/opinion/articles/2025-05-30/falling-houses-prices-aren-t-just-a-texas-or-florida-story
2•mooreds•20m ago•1 comments

TinyAPL part 1: Introduction and Arrays

https://blog.rubenverg.com/tinyapl_1_arrays
1•ofalkaed•21m ago•0 comments

Controversial 'lost' Jerry Lewis film discovered in Sweden after 53 years

https://www.thenationalnews.com/arts-culture/film-tv/2025/05/29/jerry-lewis-day-the-clown-cried-discovered/
19•danso•21m ago•5 comments

Towards an Even Larger Video Game Dataset: Inverse Dynamics Models for Bootstrap

https://www.openworldlabs.ai/blog/training-idms
2•artninja1988•23m ago•0 comments

Why is quality so rare?

https://linear.app/blog/why-is-quality-so-rare
1•cekrem•27m ago•0 comments

A Nice Round Ball

https://jfmengels.net/a-nice-round-ball/
1•cekrem•28m ago•0 comments

An Interactive Guide to Flexbox in CSS

https://www.joshwcomeau.com/css/interactive-guide-to-flexbox/
2•cekrem•28m ago•0 comments

Moving Managers

https://twitter.com/mitchellh/status/1928539528823976057
3•tosh•30m ago•0 comments

MTA Eyeing Another Batch of 'Open Gangway' Subway Cars

https://www.thecity.nyc/2025/05/15/subway-cars-open-gangway-new-trains-mta/
1•PaulHoule•30m ago•0 comments

Chain-of-Zoom: Extreme Super-Resolution via Scale Autoregression

https://bryanswkim.github.io/chain-of-zoom/
1•lastdong•31m ago•0 comments

It's not THAT hard switching from a furnace to a heat pump

https://www.heatpumped.org/p/it-s-not-that-hard
3•ssuds•32m ago•0 comments

The Gmail app will now create AI summaries

https://arstechnica.com/google/2025/05/the-gmail-app-will-now-create-ai-summaries-whether-you-want-them-or-not/
13•rurp•35m ago•10 comments

Show HN: MCP powered cross platform Superwhisper

https://qspeak.app
4•michalwarda•37m ago•3 comments

PBS sues Trump administration over funding cuts

https://www.axios.com/2025/05/30/pbs-funding-sue-trump
11•coloneltcb•40m ago•0 comments

Texas Holdem with LLMs Game

https://vonhoon.itch.io/texas-holdem-with-llms
1•indigodaddy•41m ago•1 comments

First MCP Server for Eval

https://twitter.com/scorecardai/status/1928482068717060567
1•Rutledge•41m ago•2 comments

Machine Learning Model Helps Identify Patients at Risk of Postpartum Depression

https://www.massgeneralbrigham.org/en/about/newsroom/press-releases/machine-learning-model-helps-identify-patients-at-risk-of-postpartum-depression
1•gmays•41m ago•0 comments

Tesla FSD doesn't stop for school buses with stop signs and red lights

https://bsky.app/profile/realdanodowd.bsky.social/post/3lqafg2zqfk2v
36•alex_young•22h ago

Comments

ultrablack•22h ago
This guy puts out a smear video like this every so often. Previously, they were discovered to be fake. Here, we cannot see whether the car is on FSD.
comrade1234•22h ago
Does it at least stop when you hit a pedestrian? (Serious question - not sarcasm)
macintosh-hd•22h ago
It can see pedestrians (they even have a specific model in the visualization) and will stop for them. If one jumped in front of the car, it would slam on the brakes regardless of whether FSD is enabled. Even if it somehow didn't brake before impact, it would slam on the brakes immediately after.
Veserv•1h ago
Assuming it did not disengage prior to hitting the pedestrian, no.

If the pedestrian is on the ground near the car after the collision, it is very likely it will no longer be able to see the pedestrian and will drive over them:

https://www.youtube.com/watch?v=QrV_whh4Z84&t=384s

Seen from the frontal angle clearly demonstrating it drives over the fallen child:

https://www.youtube.com/watch?v=QrV_whh4Z84&t=410s

cut3•22h ago
I had Tesla FSD and it ran two stop signs while I was trying it, so I canceled my FSD. That can't be normal, right? Everyone else's isn't that terrible, right? Maybe it is, though.
macintosh-hd•21h ago
How long ago was it? I was absolutely terrified when I tried it on someone's car in 2023 and swore the whole thing off, but these days when I drive my brother's car I have it on FSD basically the entire time. It has gotten extremely impressive in the past ~8 months.
macintosh-hd•21h ago
It really sucks that this is such a poisoned well that I'm extremely skeptical about any of these claims at this point. FSD is not perfect, but the last time there was a big scare like this the person was just using cruise control, not FSD.

This specific guy also has basically made his entire modern career out of lying about FSD for clicks, so... In the past he released a video where you could see the screen warning the accelerator was held down so the car would not stop. Having used it myself and seen that warning message, it also makes the standard alert error sound when it appears to draw your attention.

Veserv•17h ago
The "warning message" claim is a lie. The message that appeared on screen was "Supercharging unavailable - Add Payment method" as can be seen in this video [1], "Paid charging unavailable - Check unpaid balance" [2], or similar. That is the warning message avowed Tesla promoters claim indicates the accelerator was pressed.

They made that claim because the videos are usually linked and viewed at low resolutions where it is hard to make out the specific warning message being displayed. As such, the Tesla promoters believed they could make up false claims that suit their narrative with no way for anybody to disprove them. Unfortunately for them, the raw footage is frequently made available in 4k resolution allowing video evidence to disprove their claims of a "warning message".

In no situation where they claim there is an "accelerator pressed" message is the imagined message adequately visible to demonstrate their claim. In contrast, in every situation where there is a high-resolution video, the message is in the category I stated, and they still make demonstrably false claims that it reads "accelerator pressed". Even when the video is only low resolution, there is frequently still adequate display and color resolution to see that the shape of the message is closer to a two-word "Supercharging unavailable" than a five-word "Cruise control will not brake" or a center-screen "Autopilot will not brake".

Seriously, I dare you to present a video clearly demonstrating the claimed "accelerator pressed" message rather than shaky-cam Bigfoot-level video used as "proof". I already presented clear video evidence of my claims.

[1] https://www.youtube.com/watch?v=OA-U848qfNE

[2] https://vimeo.com/932562717/5fb3189771

[3] https://www.reddit.com/r/TeslaLounge/comments/1dpzw2n/why_di...

[4] https://www.notateslaapp.com/news/2614/tesla-software-update...

macintosh-hd•3h ago
So we do agree that at the time this video was recorded, the error message was at the bottom of the screen instead of at its current prominent location that cannot be covered.

(Your citation for the error message being placed in the more prominent location is from June 2024. Mouse over the "1 year ago" to see the date it was posted, June 27th 2024. This is after the video you reference, April 24th 2024.)

Only one error message can be displayed at a time at the bottom of the screen, and considering how frequently this specific individual has been dishonest, I would not put it past him to have removed the payment method to block the error message from being visible.

If your response to this is why does the message get blocked, I ask you what normal user would have paid $6,000 for FSD and not ever planned to use a supercharger? And if they were paying $99/mo for it as a subscription, there would have been a card to pay for superchargers. (Additionally, this error message doesn't appear until after the first time you supercharge, because that one will be allowed to start without payment attached as a courtesy.) This edge case is still one that should be fixed, and so it is good that it was.

I mean, the first video you posted just now doesn't even show the screen at the time of impact! At this point, either the guy is dishonest or stupid. And neither are trustworthy. It would be a better use of your time to defend people making videos to this effect not associated with The Dawn Project. They’re obviously not managing to convince Tesla, nor any relevant safety bodies that something needs to be done.

In any case, we are arguing over footage that is 14 months old. That version of FSD was and is never going to be used by unsupervised autonomous vehicles. I agree that their previous titling of the feature lacking “(Supervised)” was misleading, but it did still require full attention and constant confirmation of hands on the wheel.

Veserv•2h ago
You have presented no evidence of dishonesty to justify your claim that "this specific individual has been dishonest". The only argument you have presented is asserting the person is dishonest, therefore they must have engaged in nefarious schemes to cover their dishonesty so there is no evidence, therefore they are dishonest.

Again, show me a single video where "Cruise control will not brake. Accelerator pedal is pressed" or a similar message is clearly visible at any time. In fact, show me a single video where any claimed manipulation is clearly visible. And no, "You cannot see the accelerator pedal in this video, therefore I imagine their foot must be on it, since I assume that is the only way to get the demonstrated outcomes" is not a valid example.

That argument is doubly wrong because the Tesla promoters made up that claim so often that the Dawn Project improved the camera angles to show the accelerator pedal, so the Tesla promoters could not keep lying about it. And no, you cannot argue that increased transparency, adopted to dismiss fabricated claims about non-visible elements, is retroactive proof of past nefariousness. By that logic, any increase in transparency or data reporting in any field is proof of fraud, which is pants-on-head stupid.

So, show any video which clearly demonstrates any claimed manipulation in the video contents themselves which allows you to be so convinced that dishonesty must have occurred. That is how we got here, right? You saw a video which obviously and directly reveals dishonesty in a way that no reasonable person could ignore?

And besides, we are not arguing about videos over 14 months old. This thread is about a linked video demonstrating that FSD currently runs down child mannequins. There is no warning message on the screen, and the accelerator is clearly visible and not pressed. If you claim the only reason it ran down child mannequins in earlier videos is that the accelerator was pressed, as indicated by that imagined message, this video demonstrates those claims to be objectively false. Unless you are arguing that it used to run down child mannequins only when the accelerator pedal was pressed, but now it has "improved" and will do it unprompted.

macintosh-hd•1h ago
You had a long back and forth with someone else in this post where you defended obvious dishonesty on the part of this group. They presented footage where FSD literally was not engaged in an ad showing its flaws, and it doesn’t matter if the driver signed paperwork or they had other footage where it allegedly was enabled and in total working order. Dan’s credibility was done when he didn’t backtrack and try again right there.

That is brazen dishonesty, and the fact that you will not admit that means you’re a waste of time. If you feel so strongly about it, maybe you should try to replicate the results so it can come from someone who has even a shred of trustworthiness.

0xTJ•21h ago
As some have said about self-driving: if it can't handle edge cases, it's not self-driving.
josephcsible•21h ago
Since it's been proven that Dan O'Dowd's previous videos like this one have been faked (see, e.g., https://electrek.co/2022/08/10/tesla-self-driving-smear-camp...), I won't believe this happens unless someone besides him can demonstrate it.
Veserv•17h ago
Nope. That was a smear campaign. The "reporter" did not even watch the linked raw footage which shows that FSD was, in fact, engaged and that "reporter" was a Tesla promoter who referred $15,000,000 in sales and earned $500,000 from the Tesla referral program [1]. But, hey, it worked. Here you are repeating the smear.

Oh, and by the way, Tesla did not pay out and he has now soured on them and now routinely uses a Dawn Project image demonstrating FSD running down a child in a school crosswalk [2] to headline their stories. I would assume he would not use that image if he thought the Dawn Project was still lying about the whole thing.

And frankly, the logic does not even make sense. Claiming that they faked it by not enabling FSD, pressing the accelerator pedal, or whatever lie the avowed Tesla promoters like spewing only exonerates Tesla if you are claiming that it consistently runs down children only when they do those things.

A video demonstrating that the precondition does not need to be true renders the shrill cries of fakery irrelevant. And the baseless claim that it must have been faked, because it would only do that when manipulated, turns out to be the "assuming the desired answer and working backwards" it always was. The fact that the Tesla promoters keep making up false claims as each one gets demolished, and now have to resort to pointing at earlier videos from before there were enough camera angles to immediately disprove their imagined manipulations, is a testament to how intellectually bankrupt their position is.

This video is just one in a long line of demonstrations that the Tesla promoters are running a clear smear campaign. None of their imagined manipulations have been true. The claims that it must have been faked, since that is the only way it would consistently run down a child, were false, a logical fallacy, and an intentional smear by avowed Tesla promoters. Neat.

[1] https://electrek.co/2019/01/17/tesla-roadster-free-killed-re...

[2] https://electrek.co/2025/05/21/tesla-head-self-driving-admit...

josephcsible•17h ago
> The "reporter" did not even watch the linked raw footage which shows that FSD was, in fact, engaged

It was not engaged. Note this part of the article and the immediately following image:

> Sure enough, The Dawn Project’s own video of the test shows the driver “activating” FSD Beta by pressing on the Autopilot stalk, but we can clearly see that it didn’t activate because the course prediction line stays grey and the Autopilot wheel doesn’t appear on the top left:

And if these problems were real, people other than O'Dowd would be able to replicate them. But in other people's tests, they don't happen.

> if you are claiming that it only consistently runs down children when they do that

Yes, I am claiming that. The only way Teslas consistently run down children is when their built-in assistive and safety systems are shut off, and a human drives fully manually into them.

Veserv•16h ago
Can you at least read to the end of the article you post as evidence before making demonstrably false claims?

"Update: The Dawn Project has since released additional footage that doesn’t appear in its ad where we can see that they were able to activate FSD, however, the footage is inconsistent with the results published about the test and in the ad." (Note: The claim that the written results had errors at the time was true. The errors were fixed in a subsequent revision.)

https://x.com/twn675/status/1557415939569639424

In fact, here is the actual released report: https://dawnproject.com/the-dawn-projects-new-advertising-ca...

With the contemporaneous raw footage here: https://dawnproject.com/wp-content/uploads/2022/08/raw-foota...

Amazing, the claim that it was not engaged is straight up false with clear video evidence that it is false and the Tesla promoter who pretends to be a "reporter" did not feel like investigating the source before making false claims accusing others of making false claims. That should really make you rethink which side is more credible. Hint, it is not the side that does not bother to watch raw footage because they already made up their mind that the company who pays them is right.

Care to retract your false statement now that there is clear video evidence that your statement is false? Or are you going to argue that at 0:40 FSD disengaged two seconds before impact, but with enough remaining velocity to collide if no corrective human action is provided, therefore FSD is not responsible?

josephcsible•16h ago
> however, the footage is inconsistent with the results published about the test and in the ad.

That part seems to be a big deal that you're totally glossing over.

And again, if these are real problems, why have so many other people tried and failed to reproduce them?

Veserv•16h ago
So your argument was:

> It was not engaged.

But since that is now clearly false using the evidence you yourself presented you have transitioned to:

"The raw video footage clearly demonstrates the claim of FSD colliding with the child under the stated conditions, but the written report had incorrectly documented what occurred in the video footage, but which has since been corrected. That disproves the raw video footage of the claim."

Talk about moving the goalposts. Care to retract your false statement that it was not engaged?

josephcsible•16h ago
It's clear from the still image in the article that FSD really was not engaged in the original video. It's not just a mistake in the written report. The video provided later in the update must be from a later run of the test. And in particular, since O'Dowd got caught lying about the original, I no longer trust him, so I don't trust that the second test wasn't tampered with in a less-detectable way (e.g., messing with the car's sensors, or doctoring the new uploaded video). Especially because of the point you keep not addressing: nobody else seems to be able to reproduce this problem.
Veserv•15h ago
Wow, still just making up random nonsense instead of taking responsibility for your false statements.

The video footage in the report was not an "update". That is the original video footage of the tests, released at the same time as the report, which Fred had, but chose not to, access before making false claims; just as you had that opportunity right now. The error in the written report that Fred was referencing is that Table 1 recorded a slightly higher "Speed at Impact" than what actually occurred. The value originally reported was most likely the speed at disengagement, but it has since been rectified to the correct number. In no way, shape, or form does that invalidate the video footage, unless you are a mindless Tesla drone spreading the smear.

The video you claim is the "original" was a TV advertisement, not a test. As you might be aware, advertisements are meant to be evocative, not precisely accurate and usually contain edited video footage. Or do you also believe that Red Bull actually gives you wings? Accuracy is for the tests, raw footage, and report that were released simultaneously and which clearly demonstrate the stated claims.

For that matter, if the video were clearly false and defamatory as you claim, why has Tesla not followed up on their Cease and Desist letter from 2 years ago [1]? Tesla sent a letter claiming the videos were false and defamatory using Fred's article as evidence and would sue to get them taken down. Then Fred posted that he realized FSD was engaged and it has been crickets from Tesla since then. I think you should trust Tesla's lawyers here and stop digging your hole.

I am done here. If you have any self-respect, retract your false statements and rethink your position instead of performing charity for the richest person in the world by smearing whistleblowers.

[1] https://dawnproject.com/dan-odowds-response-to-teslas-cease-...

josephcsible•15h ago
I still ask: why does this problem not happen for anyone but Dan O'Dowd? Plenty of other people have tested it.
bananapub•21h ago
Tesla doesn’t have “full self driving”, why is the US media obligated to pretend it does?
emmelaich•21h ago
I'd like to see the same test with other self-drive cars, e.g. Waymo.
throwaway48476•20h ago
The FSD approach misunderstands the problem. Let's say hypothetically you could use cameras and ML to create an agent as good as a human driver. But is that good enough? No, no one will accept that standard. The expectation is that self driving is near perfect. To achieve that is going to take more than just cameras.
bjrosen•7h ago
It's happened to me in exactly that way: there was a stopped school bus, and the car showed no signs of stopping before I intervened. The other FSD failure is not understanding "no right turn on red." It stops for the red light, waits until there is no traffic, and then goes as if it were an ordinary red light. That's less serious because it's not dangerous like passing a school bus, but it's still illegal. Aside from those two things, FSD has been almost perfect. It drives better than a human and is very comfortable, but 13.2.x isn't ready for unsupervised.