
Landmark L.A. jury verdict finds Instagram, YouTube were designed to addict kids

https://www.latimes.com/california/story/2026-03-25/social-media-lawsuit-trial-meta-google-verdict
91•1vuio0pswjnm7•1h ago

Comments

yacin•1h ago
this has to be the first of many right? fingers crossed this leads to some meaningful change.
2OEH8eoCRo0•1h ago
It's a huge deal because it was the bellwether case for over 1,000 other similar cases.
yacin•1h ago
ah yup:

> It comes on the heels of a Delaware court decision clearing Meta’s insurers of responsibility for damages incurred from “several thousand lawsuits regarding the harm its platforms allegedly cause children” — a ruling that could leave it and other tech titans on the hook for untold future millions.

trollbridge•1h ago
Yep. The insurance covers accidents and negligence, not deliberate decisions to inflict harm on children for financial gain.
guzfip•58m ago
Sounds too good to be true. I’ll hold my breath.
AlienRobot•49m ago
I wonder at what point children become such a liability for platforms that it's easier to just ban all children altogether.

Children don't have disposable income to buy ads/subscriptions. They don't have experience to write about. The only thing they have that adults don't is time which translates into engagement metrics.

In an ideal world, the adults who buy/manage the computers would create age-restricted accounts for children, and the OS would give this information to the browser, which would just transmit it via HTTP. This is the safest method to verify ages. If an operating system doesn't want to support this, it's ultimately the adult's responsibility to install one that does. This would mean there would be no burden on adults (the majority of the planet) to verify their ages, so there would be no burden on the platforms to restrict ages either.

If platforms could verify ages without inconveniencing their main user base, I wonder if platforms would just start banning all minors, or if there is some reason to allow minors on the platform that justifies all the liability surrounding them.
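The OS-to-browser-to-server handoff described above can be sketched in a few lines. Everything here is invented for illustration: the header name "X-Account-Age-Bracket" and its values are hypothetical, since no such standard exists today.

```python
# Sketch of the proposed age-signal flow: the OS flags a child account,
# the browser attaches a header, and the server gates restricted features.
# Header name and values are hypothetical, not any real standard.

def can_access_feed(headers):
    """Server-side gate: deny the recommendation feed to flagged child
    accounts. Anyone without the header is treated as an adult, so the
    majority of users carry no verification burden."""
    return headers.get("X-Account-Age-Bracket") != "minor"
```

One property of this scheme is that the platform never learns an adult's identity; it only ever sees the presence or absence of a child flag.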

germinalphrase•6m ago
Nobody takes “age-restricted account[s] for children” seriously.

Parental controls and age restrictions are almost universally half-baked, buggy fig leaves meant to deflect negative attention from software and content providers.

jeffbee•57m ago
You mean it's the first of many appeals, I assume.

Trial courts will decide pretty much anything. Then the case gets appealed over whether the trial court correctly interpreted things you probably perceive as uncomplicated, like the 1st Amendment.

onlyrealcuzzo•1h ago
How is any app/website that 1) appeals to kids, 2) sells attention, 3) does A/B testing NOT guilty of this?
embedding-shape•1h ago
I guess ultimately it depends on if the app/website authors do so "negligently" or not.

> Jurors were charged with determining whether the companies acted negligently in designing their products and failed to warn her of the dangers.

So if you do so while providing warnings and controls for people, that might make it OK in the eyes of the law?

sampullman•1h ago
I think there's a little more nuance than that, but it seems roughly correct.

Wouldn't it be better if apps/websites targeting kids didn't use A/B testing to be more addictive?

steve-atx-7600•46m ago
For context, facebook is so dystopian when I login once every few years that I’m not sure I’ll ever use it again. And, I hate wading through the YouTube cesspool to find some educational content I like. But, I don’t think it makes sense to ban a/b testing or optimization in general. Some company could use it, for example, to figure out how to teach math to kids in a way that’s as engaging as possible. This would be “more addictive” technically.
sampullman•25m ago
That's a good point, I'm not 100% sure it's worth throwing away the potentially beneficial uses. There might not be a solution that's both feasible to implement and avoids banning useful things. In the end I usually come back to it being the parent's responsibility to monitor usage, limit screen time, etc., but it hasn't been working so well in practice.
ramon156•45m ago
They'd find another method. Why are we allowing this in the first place?

I don't have an answer to fix this whole mess, but it starts with our attitude towards addiction. We've built a system that rewards addiction in all sorts of places. Granted, every addiction is different, and I'm of the opinion that it's not (drug = bad), it's how you use it and react to it. We can control the latter, but we choose to ignore it because we're too busy with anything else. This is a tale as old as time...

aaomidi•28m ago
Relative to how long it takes the law to catch up with what's going on, YouTube and Facebook have been around for only a tiny amount of time.
bluefirebrand•9m ago
They have been around long enough to have done unknowable damage to entire generations of humans
schmidtleonard•40m ago
> more nuance

Not enough to diffuse liability. 15 years ago when recommender algorithms were the new hotness, I saw every single group of students introduced to the idea immediately grasp the implication that the endgame would involve pandering to base instincts. If someone didn't understand this, it's because

> It is difficult to get a man to understand something, when his salary depends on his not understanding it. - Upton Sinclair

guzfip•59m ago
It probably helps when you suppress research that shows you’re harming children and allow human traffickers to fester on your platform with 17 warnings or whatever.
steve-atx-7600•57m ago
How’s this different than tv that a kid might see that has ads and programming targeting kids?

I watched 80s horror movies when I was in elementary school and had nightmares for years. Should I sue now?

How about parents be held responsible for how they care for their kids or not? Maybe a culture that judged parents more strongly for how they let their kids spend their time would be an improvement.

jeffbee•56m ago
The difference is largely in the way that the legal caste perceives themselves to be aligned with media but opposed to tech.
everdrive•51m ago
Being able to find some basis for comparison between two things does not render them equivalent, and this is an extremely frequent fallacy I see with regard to technology discussion on HN.
steve-atx-7600•37m ago
I understand what you’re saying, I personally don’t like or use social media, but I don’t agree that these companies are at fault after reading this article and others. I’d rather be wrong and learn something than think I’m right, so I welcome further criticism.
everdrive•25m ago
I agree with you that parents need to ultimately be responsible for keeping their kids off social media. I think there are a few problems here:

- Social media is still somewhat new, and the broader public is only now discovering that it's a clear net negative both personally and for society. Because this is such a new realization, I think a LOT of people have not really figured out how this problem should be dealt with (both personally, via social norms, but also with regard to laws and regulations).

- No matter how awesome of a parent you are, 100% of your kid's friends will have social media and they will introduce it to your kid. That may do less harm than if they have it themselves, but some harm will still be done.

- There are network effects to consider. It's true that it's your personal fault if you use cocaine -- however we also understand that cocaine is so addictive that it really cannot be used safely. Social media is metaphorically the same. It's a personal failing if you're a social media addict, however broadly almost everyone is susceptible to it. In my mind, that is an argument for regulation.

Now that said, I have zero faith that our government can actually build sensible regulation here.

F7F7F7•22m ago
They strategically use patterns that directly trigger the release of dopamine in the brain.

They've created algorithms with slot-machine-like experiences that keep kids hooked to the screen.

These algorithms feed users barely moderated content that plays to their worst instincts, with almost surgical precision when they want to elicit engagement.

Then when research shows them the harm they're causing, they bury it, hire lobbyists, and double down.

Switch out a few words up there and you have the Big Tobacco playbook.

card_zero•28m ago
Right, like social media and addictive drugs for instance.
parpfish•26m ago
When it comes down to it, I’m not sure how you differentiate an “addictive” product from a well-made product that I choose to keep using.

When people say that Tetris and Civilization are “addictive” they aren’t implying anything malicious about the development, it’s more of a compliment about the game (and maybe a little lament about staying up too late).

But the addictive nature of social media feels different and I can’t figure out what that distinction is.

card_zero•22m ago
People will now say "the algorithm" and "dopamine", explaining nothing. You see, social media is truly addictive because it's been honed to be addictive in some way that isn't specified or known or actually true.

OK, let me try to analyze it:

1. Humans are idiots.

2. We have idiot glitches where we obsess over some particular thing. This is our own business and our own fault, and is impossible to tease apart from just liking stuff a lot and benefitting from it.

3. These glitches tend to accumulate in certain areas, and then some companies find themselves in the position of profiting from human glitchy idiocy, even though they didn't want to be behaving like scammers.

4. Then some of them get cynical about it and focus on that market segment, the obsessed idiots. This can include gambling and social media.

everdrive•20m ago
I think this represents a strong misunderstanding of what addiction is, and how it works. I mean this respectfully, and not combatively -- I expect you have never had problems with addiction.

When it comes to behavioral psychology research, there is a strong understanding of concepts such as behavioral reward schedules: ratio-based rewards, interval-based rewards, variable-ratio rewards. People have a very clear understanding of what sort of stimulus is and is not prone to addiction. You can get a mouse in a cage hopelessly addicted to pressing a lever for a reward depending on what reward schedule you use, and this does not translate to a mouse who can just get the reward at a regular interval (or perhaps merely a less-addicting interval). The mouse in the cage pressing a button set to a variable-ratio reward is equivalent to an old person using a slot machine in a very literal and direct way. This also translates to social media with endless scrolling: so many of the stories are duds, but the variable interval means the extremely enticing (or enraging) story just might be the next one.
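The fixed-versus-variable distinction described above can be made concrete with a toy simulation. This is illustrative only; the 1-in-10 payout rate is arbitrary, and the function names are made up.

```python
import random

def payouts(schedule, presses, seed=42):
    """Count rewards over a number of lever presses.

    'fixed'    : a reward on every 10th press (predictable).
    'variable' : each press pays out with probability 1/10, so the
                 next press always *might* be the big one -- the
                 slot-machine / endless-scroll pattern.
    """
    rng = random.Random(seed)
    total = 0
    for press in range(1, presses + 1):
        if schedule == "fixed":
            total += press % 10 == 0
        else:
            total += rng.random() < 0.1
    return total
```

Both schedules deliver roughly the same number of rewards per thousand presses; only the unpredictability differs, and in the animal literature it's that unpredictability that drives the most persistent responding.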

close04•15m ago
> Tetris and Civilization are “addictive” they aren’t implying anything malicious about the development, it’s more of a compliment about the game

Because it's a figure of speech, not a clinical diagnosis. Literal and figurative addictions are different beasts.

Intent, premeditation, and scale are major differentiators. When they know they will cause harm, concentrate and fine-tune it for effect, turn it into a firehose, and target it at specific individuals, it's very, very different from what random ads, games, or movies do. These companies literally designed their products with the intent to make them addictive and to target children, knowing the full implications and ignoring the harm they caused.

You're comparing a drug dealer who only sells to kids to a store clerk who also sells ice cream to kids. It doesn't take more than scratching the surface to realize the similarity is very fleeting.

roxolotl•40m ago
Both things can be true. Parents can share responsibility. But it is also the case that Facebook actively suppressed research that showed that children using their platforms experience emotional harms. It is also the case that around the time you were in elementary school discussions about children’s programming had been ongoing for years and eventually regulations were put in place[0].

0: https://en.wikipedia.org/wiki/Regulations_on_children's_tele...

steve-atx-7600•25m ago
I can agree that they acted to harm society knowingly. I used to think regulation could help, and maybe it can, but if there were some way to shape the culture to value, for example, educational TV programming, I think that would be the most powerful influence on tech/media companies. Regulation could serve to inform parents that "this programming/platform is known to rot your kid's mind," like a nutrition label, and some day hopefully parents will be more likely to disallow it, as some do knowing how much sugar is in sodas.
everdrive•53m ago
Correct, selling attention inevitably leads to harm.
wffurr•44m ago
As a parent, the only solution is sticking to ad-free subscription services. PBS is a godsend here, but there are other good options out there too. It's tragic that public broadcasting funding was cut when there are clear harms in the free* commercial options.

*Except for your time and mental health of course

everdrive•38m ago
Agreed. Libraries have books and DVDs, and you have things like the classical stations. You also have playgrounds and walks in the park, etc. (I'm also a parent of two young children.)

Always doing wholesome stuff with your kids is certainly not easy or trivial, but there is a cascading effect here. If your child does not expect to be able to just watch TV all the time, it's easier to keep them interested in other things. Once that expectation is burned in, you'll be fighting it for a while, and a small child will _never_ say "I've had enough youtube, I don't need any more."

So I really don't want to be self-righteous about always doing wholesome stuff with your kids (we definitely do not succeed 100% of the time) -- but rather point out that letting them use addictive media has negative, cascading consequences that actually do make it harder for you as a parent. It's analogous to drinking to relax. You get relief now, and pay for it later. Not actually a good tradeoff much of the time.

SirFatty•49m ago
algorithm would be the key word I think.
parpfish•28m ago
A/B testing is one way to make things “addictive” but you can also make addictive products without it.

A really good designer could make a highly engaging app, or an editor could write clickbait headlines, all without testing.

esafak•10m ago
These products maximize revenue through engagement with advertisements.
DavidMcLaughlin•17m ago
A/B testing is very, very different to handing over control of your content to a reward function that optimizes for time spent over any other criteria.

We had 10+ years of products like Facebook, Twitter, YouTube, hell even LinkedIn with a basic content model of "you build your own graph of people who you pull content from"; their job was to show it to you and put ads in there to fund the whole enterprise. If I decided to follow harmful content? That was a pact between me and the content creator, and YouTube was nothing more than a pipe the content flowed through. They were able to build multi-billion dollar businesses off of this. That's really important: this was enormously profitable. But then the problem happened that people's graphs weren't interesting enough, and sometimes they'd go on the thing and there were no new posts from people they followed, and this was leaving money on the table. So they took care of that problem by handing over control of the feed to the reward function.

More accurately, especially for Meta products: they completely took control away from you. You didn't even have the option to retain the old, chronological social graph feed anymore. And it was ludicrously profitable. So now the laws of capitalism dictate that everyone else has to follow suit. I now have extensions on my browser for Instagram and YouTube to disable content from anything I don't follow - because I still find these apps useful for that one original purpose they had when they blew up and became mainstream. Why are these browser extensions? Why can't I choose to not see this stuff in their apps? That's the major regulation hole that led to this lawsuit, imo.

It's the same thing you see with people blaming smartphones for brainrot. We've had 15 to 20 years of smartphones with more or less the same capabilities as they have today, and for the vast majority of that time my phone didn't make books less interesting or make me struggle to do chores or manage my time. For a full decade or more I saw my phone as a net positive in my life, was proud to work for Twitter, and generally saw technology like the Louis CK bit about the miracle of using a smartphone connected to Wi-Fi on an airplane. But in the last five years or so, things have noticeably and increasingly gone to shit. Brainrot is a thing. All my real-life friends who are the opposite of terminally online or technical are talking about it. I don't use TikTok, but it seems like it is absolutely annihilating attention spans. The topic of conversation over drinks is how we've collectively self-diagnosed with ADHD and struggle with all kinds of executive function... but also are old enough to remember a time when none of this existed. Complete normies are reading Dopamine Nation and listening to Andrew Huberman trying to free themselves.

I don't know what the exact solution is, but there's at least a simpler time we can point to when we all had smartphones and we were all connected via platforms and we all posted and consumed stupid pictures of each other and it wasn't.... _this_.

KaiserPro•4m ago
I think there is a fourth portion that is probably more important:

Actively ignoring harm caused by your product. TV/radio have sold attention, but there were pretty strict rules on what you can/can't broadcast, and to whom (ignoring cable for the moment). It's the same for services: things that knowingly encourage damaging behaviours are liable for prosecution.

ramesh31•1h ago
I've heard about "landmark" cases against these companies over and over again for the last decade. There seems to be at least one every couple of years. And yet literally nothing has ever happened or changed.
petcat•46m ago
Since these are civil lawsuits, it just takes more people coming forward to sue. There are plenty of cases where a jury found a defendant liable for damages only for the defendant to continue the bad behavior and subsequent juries awarding ever-increasing and compounding punitive damages. Big Tobacco and Purdue Pharma (went bankrupt) are examples of this pattern. Monsanto was famously hit hard with massive "repeater" damages after they continued selling and marketing Roundup despite prior judgements.

The exact same can happen to Big Tech. The goal is to get them to stop the bad behavior now.

mrbluecoat•24m ago
I feel the same way. They're just going to appeal the case until they find a layer of the legal system where they have leverage.
pautasso•46m ago
Everyone is now posting on social media about how the verdict that "Social Media is Addictive" is going viral.
paulkon•45m ago
Just needs a health warning label, like on alcohol or cigarettes. Then on to the high-sugar products, and a quarter of the grocery store.
alexlesuper•45m ago
We have health warnings for food that contains lots of sugar, fat and/or sodium in Canada
ddoolin•33m ago
If we want to compare it to alcohol/cigarettes, then kids shouldn't be allowed to use this either.
pearlsontheroad•26m ago
and the government should tax it accordingly
Hobadee•37m ago
Is the addictiveness of social media great? No. But the blame shouldn't be placed squarely on the companies either. What happened to personal responsibility? I was addicted to Facebook, I realized it, and I disconnected from it. I had withdrawals for a while (pulling out my phone and trying to open the app I had deleted without really thinking about what I was doing) but I quit. I know I am addicted to YouTube shorts, so I stay away from them. Occasionally I'll go on a bender and a few hours will slip by without me realizing, but while I know YouTube is designing them to be addictive, I blame myself for falling for it.

There are plenty of things in life that can be addicting; drugs, sex, money, power, adrenaline, entertainment, technology... The list goes on. If we remove everything addicting from life, you better believe something else will rise up to take its place.

The solution therefore isn't to remove everything addicting from life, but rather to raise everyone with the forethought to know what might be addictive, the self-awareness to realize when you are addicted to something, and the self-control (and support systems if and when necessary) to stop.

simonh•34m ago
The problem is that internal communications inside these companies raised concerns about the manipulativeness, and even deceptiveness of the algorithms and tactics they were using.

They weren't just consciously creating an attractive platform, they were consciously creating a manipulative platform.

nkrisc•33m ago
Yes, personal responsibility is important. That doesn't mean we need to allow companies to attempt to addict as many people as they can.

The question we should be asking: are these technologies a net-positive to society?

pearlsontheroad•32m ago
Everyone should at least be a conscientious junkie.
imiric•30m ago
On one hand: sure.

On the other, it's very different when companies explicitly design their products to be as addictive as possible.

We've been through this with Big Tobacco already. Nicotine and other tobacco substances are addictive on their own, but tobacco companies were prosecuted for deliberately making cigarettes as addictive as possible, besides also marketing to children. The parallels with Big Tech and social media are undeniable.

ddoolin•29m ago
Maybe this applies more towards adults, but I don't think the correct answer for kids is only "just have self-control," something kids are notorious for not having. Certainly there's a lot of parental responsibility here but we can simultaneously hold companies responsible for their part too.
ValentinPearce•23m ago
It also is a situation where the ubiquity of these companies makes it exceptionally difficult for parents to regulate access.
ValentinPearce•24m ago
If they are liable for making the thing addictive, it does mean it is their fault. In this case, it specifically says it's designed to be addictive to children, whose personal responsibility is probably not expected.
CarVac•23m ago
We can't raise other people. We can prohibit the addicting things like newsfeeded Facebook.
gervwyk•27m ago
now do Candy Crush..
baggy_trough•26m ago
Doritos now liable for creating a good tasting chip? This is madness.
bknight1983•10m ago
Normally I don't see people walking down the street staring at their Doritos
OptionOfT•6m ago
One could argue that the ultra-processed food industry is doing exactly what the tobacco industry did with respect to making its products addictive.

There is a difference between creating a food that tastes good and creating a food that tastes good but instantly makes you want to eat the whole bag.

absoluteunit1•26m ago
Oh man if they think YouTube and Instagram are addicting they should see what Roblox does lol
bogdanoff_2•22m ago
The solution to this would be a law forcing these sites to allow third-party suggestion algorithms, so that you can choose who and how content is being suggested to you.

It could be perhaps as simple as allowing third-party websites and apps for watching Youtube on your phone. And it's okay if this would be a premium paid feature, so there's no counter argument that "it costs them money to host videos".

This is not an entirely new idea either. Before Spotify became popular, people would integrate Last.FM into their media players to get music recommendation based on their listening history, and you could listen to music via YouTube directly on the last.fm website.

dmbche•20m ago
Or algorithms have to be submitted and approved by a government body before being allowed to be implemented and are frequently audited
outime•18m ago
Virtually nobody would choose to pay a subscription for the non-addictive app version, and I'd even say this suggestion is a bit insulting to anyone who isn't high-income.
mentalgear•10m ago
The solution to all of Big Tech's monopolies is actually pretty simple: Interoperability must become a law.

Cory Doctorow wrote a great article on it:

"Interoperability Can Save the Open Web" https://spectrum.ieee.org/doctorow-interoperability

> While the dominance of Internet platforms like Twitter, Facebook, Instagram, or Amazon is often taken for granted, Doctorow argues that these walled gardens are fenced in by legal structures, not feats of engineering. Doctorow proposes forcing interoperability—any given platform’s ability to interact with another—as a way to break down those walls and to make the Internet freer and more democratic.

Most notably, he retells how early Facebook used to siphon data from its competitor MySpace and act on users' behalf there (e.g. replying to messages via Facebook), and then, once the Zuck(er) was top dog, moved to make these basic interoperability actions illegal to prevent anyone doing to him what he did to others.

saadn92•9m ago
Third-party recommendation algorithms would be interesting, but I think they'd only address one layer of the addictive design the verdict is actually about. Autoplay, infinite scroll, notification timing, the variable reward patterns from likes and comments -- those are all independent of which algorithm picks the next video. You could swap in the most wholesome recommendation engine imaginable and a kid is still gonna sit there for hours if the UI is designed around endless content with no natural stopping points.
heyitsaamir•4m ago
Bluesky does this. In fact, the For You algorithm is a community built algorithm and way more popular than the native Discover algo.
GardenLetter27•20m ago
Mandatory age verification is coming.
highstep•15m ago
otherwise known as mandatory identification
_kidlike•4m ago
my thoughts exactly... this "verdict" came with very suspicious timing.
superkuh•13m ago
It's amazing that a jury of people completely ignorant of what medical addiction is managed to make this discovery despite thousands of scientists around the world being unable to confirm this hypothesis. Which is to say: this is extreme bullshit which has nothing to do with reality or science or empirical study and instead is based entirely on feels and popular memes about "dopamine hits" (no basis in reality).
nottorp•4m ago
Just kids? Not adults?
bknight1983•4m ago
When you put something out there, there's a question of ownership for how people end up using it.

- Some think "if you use it incorrectly, it's your fault," and probably agree with the statement that Palantir is not evil software and that one must "change the administration."

- Some think "if you use it incorrectly, it's the creator's fault," and then you have safety labels on everything (see Prop 65).

It's a spectrum of risk between the user and the creator. My opinion is that there's enough scientific evidence to show that social media has a negative impact on kids and teenagers, as their brains are still developing. I think a social media ban for kids is a good thing (similar to a driver's license or a drinking age).

Show HN: Generate context maps and event flows from Markdown files

https://github.com/ea-toolkit/architecture-catalog
1•rajanavakoti•54s ago•0 comments

Sky UK TV and Now TV Customers Can Now Get HBO Max Basic with Ads

https://www.ispreview.co.uk/index.php/2026/03/sky-uk-tv-and-now-tv-customers-can-now-get-hbo-max-...
1•alexchapman•1m ago•0 comments

AI and bots have officially taken over the internet, report finds

https://www.cnbc.com/2026/03/26/ai-bots-humans-internet.html
2•arbuge•2m ago•0 comments

US inflation will surge to 4.2% on [oil price] shock, warns OECD

https://www.ft.com/content/e8bcac46-eba1-4985-be31-fc913186895f
1•alecco•2m ago•1 comments

Polsia: Honest Tool Feedback

https://polsia.com
1•indieept•4m ago•1 comments

AI that fixes your production errors and opens a PR – while you sleep

https://www.inariwatch.com/
1•jesusbr•4m ago•0 comments

Is Big Tech Facing a Big Tobacco Moment?

https://www.nytimes.com/2026/03/26/business/dealbook/meta-youtube-social-media-tobacco.html
1•mitchbob•4m ago•1 comments

More Tech Workers Max Out Their A.I. Use

https://www.nytimes.com/2026/03/20/technology/tokenmaxxing-ai-agents.html
1•gmays•5m ago•0 comments

Show HN: Burn0 – One import to see what every API call costs

https://github.com/burn0-dev/burn0
1•mhabeebur•6m ago•0 comments

Earth's magnetic field may be more powerful than we thought

https://www.scientificamerican.com/article/earths-magnetic-field-may-be-more-powerful-than-we-tho...
2•Brajeshwar•7m ago•0 comments

Adding Offline Mode and Custom Servers to an MMORPG (2025)

https://plantbasedgames.io/blog/posts/09-adding-offline-mode-and-custom-servers-to-an-mmorpg/
1•Vedor•7m ago•0 comments

I benchmarked bulk insert into PostgreSQL from Java (also via DuckDB / Arrow)

https://sqg.dev/blog/java-postgres-insert-benchmark/
1•uwemaurer•8m ago•0 comments

Dr. AI Ain't So Bad

https://b2bs.substack.com/p/dr-ai-aint-so-bad
1•oopsiremembered•9m ago•0 comments

Ask HN: In the age of AI, why is incident handling still manual?

1•dense_rep•9m ago•0 comments

Computational Conversations: beyond static UIs and chatbots

https://computational.chat/
1•pchiusano•9m ago•0 comments

7 days of autonomous agent search outperformed FlashAttention-4 and CUDNN

https://twitter.com/bingxu_/status/2036983004200149460
1•antinucleon•10m ago•0 comments

Urban Expansion in the Age of Liberalism

https://worksinprogress.co/issue/urban-expansion-in-the-age-of-liberalism/
1•surprisetalk•10m ago•0 comments

Making art with CSS gradients and corner-shape and skew, oh my

https://cassidoo.co/post/css-wavy-art/
1•surprisetalk•10m ago•0 comments

Spy: Comptime for Python

https://antocuni.eu/2026/03/25/inside-spy-part-2-language-semantics/
3•semidashka•10m ago•0 comments

Underrated sources of mental tension in meditation

https://sashachapin.substack.com/p/underrated-sources-of-mental-tension
1•surprisetalk•10m ago•0 comments

Does the Internet know what time it is?

https://alexsci.com/blog/clock-skew/
1•surprisetalk•11m ago•0 comments

The (End of) Productivity Arbitrage

https://gavinpineapple.substack.com/p/the-end-of-productivity-arbitrage
1•gavinpineapple•11m ago•0 comments

IronGlass Brings Legendary Soviet Cinema Lenses to Mirrorless Cameras

https://petapixel.com/2026/02/19/ironglass-brings-legendary-soviet-cinema-lenses-to-mirrorless-ca...
1•PaulHoule•14m ago•0 comments

AI Agent Has Root Access (and That's a Problem)

1•aerostack•17m ago•0 comments

Show HN: Agent Skill Harbor – a GitHub-native skill platform for teams

6•hatappo•21m ago•0 comments

Preprint Review: "Intelligent AI Delegation"

https://www.patreon.com/posts/153993948
1•grahamlee•21m ago•0 comments

Frosty 153 AI Sub Agents for Snowflake Open Source

https://github.com/Gyrus-Dev/frosty
1•MalviyaPriyank•21m ago•1 comments

How do you get your first real users for a trust-based product?

1•keyshield•21m ago•1 comments

Asbestos, talc, and The Lancet's 1977 publication

https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(26)00558-1/fulltext
3•bjourne•21m ago•0 comments

Show HN: Full graphical desktop running on a 128MB VPS Alpine+XRDP+WindowMaker

https://tierhive.com/blog/tierhive-howto/alpine-minimal-remote-desktop-on-a-128mb-vps
4•backtogeek•22m ago•1 comments