frontpage.

Made with ♥ by @iamnishanth

Open Source @Github


Learning Software Architecture

https://matklad.github.io/2026/05/12/software-architecture.html
173•surprisetalk•3h ago•24 comments

Screenshots of Old Desktop OSes

http://www.typewritten.org/Media/
301•adunk•7h ago•128 comments

Postmortem: TanStack NPM supply-chain compromise

https://tanstack.com/blog/npm-supply-chain-compromise-postmortem
903•varunsharma07•15h ago•374 comments

EU to crack down on TikTok, Instagram's 'addictive design' targeting kids

https://www.cnbc.com/2026/05/12/tiktok-instagram-social-media-addictive-eu-crack-down.html
93•thm•1h ago•41 comments

A HN post with negative points – how?

https://news.ycombinator.com/item?id=48104663
31•donohoe•2h ago•17 comments

Docker images are MB; a full game engine compiles to 35MB WASM

https://bogomolov.work/blog/posts/wasm-vs-docker/
25•theanonymousone•2d ago•18 comments

Text Blaze (YC W21) Is Hiring for a No-AI Summer Internship

https://www.ycombinator.com/companies/text-blaze/jobs/P4CCN62-the-blaze-no-ai-summer-internship
1•scottfr•36m ago

They Live (1988) inspired Adblocker

https://github.com/davmlaw/they_live_adblocker
328•tokenburner•11h ago•104 comments

If AI writes your code, why use Python?

https://medium.com/@NMitchem/if-ai-writes-your-code-why-use-python-bf8c4ba1a055
588•indigodaddy•15h ago•628 comments

Coursera and Udemy are now one company

https://blog.coursera.org/coursera-and-udemy-are-now-one-company-creating-the-worlds-most-compreh...
51•Anon84•1h ago•13 comments

UCLA discovers first stroke rehabilitation drug to repair brain damage (2025)

https://stemcell.ucla.edu/news/ucla-discovers-first-stroke-rehabilitation-drug-repair-brain-damage
374•bookofjoe•18h ago•72 comments

Rtwatch: Watch videos with friends using WebRTC

https://github.com/pion/rtwatch
42•nateb2022•2d ago•6 comments

Music has scales / raagas. What about storytelling in movies and prestige shows?

https://arc.quanten.co/archetype
24•phaedrus044•4h ago•29 comments

Claude Platform on AWS

https://claude.com/blog/claude-platform-on-aws
162•matrixhelix•11h ago•71 comments

Toxicity on Social Media – The Noisy Room

https://thenoisyroom.com
97•skm•5h ago•65 comments

Extremely Low Frequencies

https://computer.rip/2026-05-09-extremely-low-frequencies.html
108•pinewurst•8h ago•8 comments

Optimize for change not application performance

https://www.echooff.dev/blog/developer-experience-is-a-performance-feature
18•lo1tuma•2d ago•6 comments

Google says criminal hackers used AI to find a major software flaw

https://www.nytimes.com/2026/05/11/us/politics/google-hackers-attack-ai.html
202•donohoe•23h ago•147 comments

I let AI build a tool to help me figure out what was waking me up at night

https://martin.sh/i-let-ai-build-a-tool-to-help-me-figure-out-what-was-waking-me-up-at-night/
204•showmypost•15h ago•214 comments

Chasing Chicago's movable bridges (2014)

https://aresluna.org/seesaws-for-giants/
4•NaOH•2d ago•0 comments

Unitree GD01: China's $537k rideable transformer robot is now in production

https://gagadget.com/en/709729-unitree-gd01-chinas-537k-rideable-transformer-robot-is-now-in-prod...
49•rguiscard•2h ago•33 comments

Software Internals Book Club

https://eatonphil.com/bookclub.html
118•aragonite•10h ago•20 comments

I hate soldering

https://user8.bearblog.dev/rant/
141•James72689•4d ago•131 comments

Remembering Planet Source Code: Sharing Code Before GitHub Made It Easy

https://www.pietschsoft.com/post/2026/05/05/remembering-planet-source-code-sharing-code-before-gi...
26•pabs3•3d ago•3 comments

Houses are for living, not for speculation

https://en.wikipedia.org/wiki/Houses_are_for_living,_not_for_speculation
28•robtherobber•1h ago•2 comments

Boriel BASIC

https://zxbasic.readthedocs.io/en/docs/
53•AlexeyBrin•2d ago•18 comments

Nullsoft, 1997-2004 (2004)

https://slate.com/technology/2004/11/the-death-of-the-last-maverick-tech-company.html
296•downbad_•4d ago•82 comments

Show HN: A modern Music Player Daemon based on Rockbox firmware

https://github.com/tsirysndr/rockbox-zig
92•tsiry•2d ago•22 comments

Don't hijack my mouse pointer

https://ruky.me/dont-hijack-my-pointer/
7•rukshn•36m ago•3 comments

Interaction Models

https://thinkingmachines.ai/blog/interaction-models/
247•smhx•15h ago•30 comments

EU to crack down on TikTok, Instagram's 'addictive design' targeting kids

https://www.cnbc.com/2026/05/12/tiktok-instagram-social-media-addictive-eu-crack-down.html
90•thm•1h ago

Comments

sylware•1h ago
Yeah yeah, virtue signaling. Meanwhile, most EU online services are now gated behind one of the WHATWG cartel web engines (in practice, Google's Blink); that is, EU web sites are broken in favor of web apps.

They have to restore interop with noscript/basic HTML web engines (past, present, and future).

Then, they have to be careful with their file formats: for instance, you never give "carte blanche" to a disgusting format like PDF; you very carefully define an as-simple-as-possible subset of it (with some internal software for validation).

nanapipirara•1h ago
Yeah yeah, whataboutism.

I'm very happy they're taking a stance. I've seen too many messed up kids and there's no doubt the addictive design plays a big role in the problem.

soco•41m ago
I can't help noticing that every time, but really every time, the EU moves a pinky finger against the tech industry, a sizeable chunk of the comments here will be like the one above. I wonder: is it a general sentiment against the EU? Or against restricting technology? Or against humans? Or what?
eowln•38m ago
The sentiment that having to present our ID to use tiktok gives us the heebie-jeebies, and for good reason.

Also, nobody voted for the Commission.

ToucanLoucan•31m ago
Boiling kids' (and adults') brains probably makes them a decent chunk of money, either directly via salary or indirectly via stocks. Ensuring kids remain healthy makes no money. An unfortunately large slice of the tech sector doesn't give the tiniest shit about the health of our broader society, or any group in it, if it means their lines stop going up, or even go up slightly less fast.
watwut•20m ago
Imo, both. The more right-wing people started taking an aggressively anti-EU stance once Vance openly stood on the side of Orban and against the EU and democracies in general.

And some people see tech companies as worship-worthy, so trying to restrict them is a kind of blasphemy.

Mashimo•33m ago
Is ending endless scrolling really virtue signaling? Don't you think it will have a measurable effect?
FinnKuhn•52m ago
I think especially restricting endless scrolling is a good thing overall to reduce the addictiveness of social media and its harmful effects.

HN having pages instead of a feed or endless list is one of the things I really like about it.

nanapipirara•35m ago
For sure.

The other thing I really love about HN is that titles are all supposed to be boring and to the point. The guidelines[1] for titles are excellent and I wish more of the web and honestly legacy media too would behave that way. Things that are of no interest to me are not trying to waste my time and attention.

[1] https://news.ycombinator.com/newsguidelines.html

ekjhgkejhgk•33m ago
> I think especially restricting endless scrolling

The actual point is that they are designed to be addictive. "endless scrolling" is just an implementation detail. If you "ban endless scrolling", they'll still be using every other trick to make it addictive.

bschwarz•36m ago
Imagine the pressure on Instagram and Tiktok to serve better content if they were forced to pick out, say, 100 short videos per person per day. And not just for kids, adults need a break from this addiction machine as well.
thiago_fm•36m ago
Why should only kids be protected from addiction?

I have a hard time understanding this.

We have plenty of adults with terrible social media addiction that is destroying their lives, and nothing being done about it.

gib444•31m ago
Makes it an easier sell politically. If you position it as dangerous to kids in particular, your opposition then looks like they're encouraging child harm.
yipbub•34m ago
Thanks, I'm an adult and I need it too
mrosenbjerg•32m ago
Had the exact same thought
garrettjoecox•31m ago
At what point should the responsibility fall on the parent to protect their children from harm?

Don’t get me wrong, if I had my way TikTok wouldn’t exist for anyone, adults included. It’s just so strange to me that so many parents hand their 7-year-olds unrestricted access to TikTok and expect someone else to keep their kid safe.

Mashimo•26m ago
> the responsibility of a parent to protect their children from harm

I agree with you, but only in theory, because that's where we are now and it does not seem to be working that well.

Maybe through more education? But then again, I think reducing addictive tactics like endless scrolling could be part of a two-pronged attack.

With alcohol we have education on what happens, but we also have laws that regulate it.

perarneng•25m ago
It's not so easy: they need phones and social media to communicate with their friends. They also need to fit in and find an identity. The algorithms that basically all engagement engines run on are harmful for humanity as a whole. They are marketed as recommendation engines, but they are 100% about engagement, which is why the content you see mostly generates dopamine by being fun, or rage by being provocative. They're built to serve one purpose: to keep people using the platform as much as possible. Not because the platform is good, but because it serves content that maximizes engagement.

I read a post by someone saying his wife worked for a snack company. They used MRI scans to determine how much salt (or sugar) they should put in the snacks to maximize the response in the brain. Sounds disturbing, right?

Well, engagement engines are the same thing. They're artificial intelligence optimized to get people to react and stay addicted. Basically AI doing harm. It's not about what is best for the individual's health; it's about what generates the most money for the owner of the platform.

It should not be allowed to build a business around something that exploits human brains. Basically, biohacking our brains for profit.

kioleanu•20m ago
I am from Eastern Europe and I’ve been living for many years in Western Europe. Where I come from, kids get their first phones when they start school at 6 (there’s a pre-school year), simply because every other kid has one. I keep coming back in my mind to two examples from my birth country. The first: a friend’s kid carrying an 8-inch smartphone in his hand everywhere, because the phone was as big as half his thigh and he would otherwise have needed a bag for it. The second was on a visit to the zoo: I was on a bench next to a family with two young children in a stroller. Both children, who couldn’t have been older than 4 or 5, were scrolling TikTok, which was showing them children’s content!

In contrast, in Western Europe, my son is now in the sixth grade. More than half his class doesn’t have phones, phones are absolutely forbidden on school grounds and at school activities, and they are now going on a class trip where they were told there’s a pay phone at the hotel in case they want to call their parents. Our son promptly informed us that he’d rather buy a pack of Pokémon cards than call us, and that 3 days isn’t so long anyway.

And it is not only at school: he travels for tournaments with his team every other week, and mobile phones are absolutely forbidden on the team bus. The children read, play games (including chess on a magnetic board), sing, and exchange stories for hours at a time.

conception•29m ago
This is pretty easy to solve. If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present. If the user decides what they see, you aren’t, à la social media 1.0.
shiandow•26m ago
And when does the user decide? Must a platform do nothing to stymie spam, or even illegal content, to qualify as impartial?

I suppose the answer could be that only platforms that do indeed allow spam or worse are impartial, but that is a tricky position to be in.

leogiertz•17m ago
The mechanism would be that if the user has chosen to follow an account, then posts from that account fall under common carrier. If the platform chooses to show you other posts, then that's under its responsibility.
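The split described above can be sketched in a few lines. Everything here is purely illustrative (hypothetical function and bucket names, not any real platform's API or any actual legal test):

```python
# Illustrative sketch of the proposed liability split: posts from accounts
# the viewer follows fall under "common carrier"; anything the platform
# injects on its own initiative is the platform's editorial responsibility.
# All names are hypothetical.

def classify_post(author: str, followed: set[str]) -> str:
    """Return the liability bucket for a single post."""
    return "common_carrier" if author in followed else "platform_curated"

followed = {"alice", "bob"}
feed = ["alice", "advertiser_x", "bob", "viral_account"]
buckets = [classify_post(author, followed) for author in feed]
# Followed accounts land in "common_carrier"; the injected
# recommendations land in "platform_curated".
```

The hard part, as the replies note, is everything this sketch leaves out: moderation, spam filtering, and ranking within the followed set.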
stingraycharles•22m ago
This is one of those things that don’t translate to legal reality very well, as then you have to define “what is an algorithm”.

Is adding advertisements an algorithm?

Is including likes an algorithm?

Is automatically starting the next video after a previous one has finished an algorithm?

Is infinite scroll an algorithm?

Etc

randunel•19m ago
Everything other than sorting the list of entities by a standard measurement unit (time, length, mass, temperature, amount) needs to be covered by this law.

The moment you add other entities to the list (e.g. ads in between posts), then it's also subject to the same restrictions.
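The test proposed above could be sketched roughly like this. The data shapes are invented for illustration, and "standard measurement unit" is narrowed to time only; it is not a statement of what any law actually requires:

```python
# Illustrative check: a feed passes only if it is exactly the user's own
# items sorted by a standard measurement (here, time, newest first).
# Injecting any other entity (e.g. an ad) makes it fail.

from datetime import datetime

def is_plain_chronological(feed: list[dict], user_items: list[dict]) -> bool:
    """True if the feed is exactly the user's items, newest first."""
    expected = sorted(user_items, key=lambda p: p["time"], reverse=True)
    return feed == expected

posts = [
    {"id": 1, "time": datetime(2026, 5, 12, 9)},
    {"id": 2, "time": datetime(2026, 5, 12, 10)},
]
chronological = sorted(posts, key=lambda p: p["time"], reverse=True)
with_ad = chronological[:1] + [{"id": "ad", "time": None}] + chronological[1:]

is_plain_chronological(chronological, posts)  # passes: plain time sort
is_plain_chronological(with_ad, posts)        # fails: injected entry
```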

stingraycharles•13m ago
This effectively means “every online platform ever” and would also have included MySpace and the OG Yahoo etc, and as such would not really single out the truly bad actors.

And then we’ll end up with another cookie-banner-style law that had good intentions but actually missed the point entirely.

randunel•9m ago
I stopped using Facebook around 2015-ish, when they stopped allowing sorting by date. Prior to that, hi5 and the like also allowed sorting by date. So no, not every online platform ever.
progval•6m ago
It even includes email providers with a spam filter.
bee_rider•3m ago
Maybe MySpace should be covered. I mean, MySpace probably(?) had the technical capacity to act maliciously in the manner that modern social media sites do; the business model just hadn’t evolved to its modern toxic state yet.

The cookie banner law is fine for the most part. Sites that do the malicious-compliance thing of over-prompting the user for permissions are providing a strong signal that they are bad actors. It’s about as much as we can expect without banning them entirely…

andybak•16m ago
This kind of complex legislation already exists in many areas of the law, revenue collection being the most obvious one. We could choose to treat "societal harm" the way we treat "tax collection".

I'm not saying there aren't infinite edge cases and second-order effects - but we tolerate those already for many things. I'm not pretending this is simple or even desirable - I'm merely stating it's possible if we want to do it.

My biggest fear is that (like the UK Online Safety Act) this will act to favour the huge corporations, because they are the only ones that can afford a team of lawyers. Any legislation should aim to carve out exceptions to avoid indirectly helping monopolies.

stingraycharles•10m ago
Great example. These companies are already experts at circumventing taxes, what makes you think they can’t weasel their way around some arbitrary written law?

Just look at the malicious compliance that Apple and Google have around the App Store stuff, they’ll find a way to comply with the law and implement different addictive dark patterns.

I’m not saying that I disagree that these companies need to be regulated, I absolutely do. I just think it’s going to be a complicated process, and not “oh just ban everything that’s an algorithm”.

And I have absolutely 0 faith in companies like Meta willfully complying.

3form•8m ago
This doesn't differ much from the legal reality that I've seen. Terms need to be defined, yes. It will require work to do so. And that work should be done even if it's a bother.
orbital-decay•8m ago
An "algorithm" here is a method of selecting the content to display. You're listing presentation mechanisms, not selection mechanisms; presentation has nothing to do with the selection. Selecting the next video in the infinite scroll would be the algorithm, not the infinite-scrolling mechanism itself.
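The selection-versus-presentation distinction can be made concrete with a toy sketch (all names hypothetical; the random choice stands in for whatever selection policy a platform uses):

```python
import random

# Toy illustration of the selection vs. presentation split:
# `pick_next` is the "algorithm" (it decides WHAT to show);
# `render_feed` is mere presentation (infinite scroll is just a loop).

def pick_next(candidates: list[str], rng: random.Random) -> str:
    """Selection: choose the next video. Swapping this body for an
    engagement-maximizing model is what the regulation would target."""
    return rng.choice(candidates)

def render_feed(candidates: list[str], n: int, seed: int = 0) -> list[str]:
    """Presentation: an 'infinite scroll' of n items built from pick_next."""
    rng = random.Random(seed)
    return [pick_next(candidates, rng) for _ in range(n)]

feed = render_feed(["cats", "news", "music"], n=5)
```

Under this reading, banning "infinite scroll" changes only `render_feed`, while the addictive part lives entirely in `pick_next`.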
anzerarkin•25m ago
I don’t think this is only a kids issue.

A lot of adults need this too. The addictive apps are very well designed, while most blockers are either too easy to ignore or too annoying to keep using.

I built a small iOS blocker because I had the same problem. Making it strict enough to actually work without making people hate it is the main challenge.

Pesthuf•25m ago
Tell me: why are these algorithms suddenly okay when the victim turns 18?

They are bad for everyone and if you’re willing to regulate them, make them illegal to be used on anyone.

Mashimo•9m ago
Just from this article it's not clear whether methods like endless scrolling or "watch next video" are going to be regulated based on user age or not.

It just says that the platforms that use such methods often target kids.

palata•5m ago
Same as with cigarettes: it's a lot easier to regulate things for kids, because we as a society tend to agree that they need to be protected. It's much harder with adults, because there is much less of a consensus.
hnthrowaway0315•18m ago
But they are so profitable, and we need them to track people around and create a police state efficiently. Ah let's keep them but just fine them as well for the show.
evanjrowley•4m ago
The most on-brand solution for the EU would be to require mobile phone users to upload brain scans in real-time so the state can check for neural activity associated with addiction.