frontpage.

The Genus Amanita

https://www.mushroomexpert.com/amanita.html
1•rolph•54s ago•0 comments

We have broken SHA-1 in practice

https://shattered.io/
1•mooreds•1m ago•1 comments

Ask HN: Was my first management job bad, or is this what management is like?

1•Buttons840•2m ago•0 comments

Ask HN: How to Reduce Time Spent Crimping?

1•pinkmuffinere•3m ago•0 comments

KV Cache Transform Coding for Compact Storage in LLM Inference

https://arxiv.org/abs/2511.01815
1•walterbell•8m ago•0 comments

A quantitative, multimodal wearable bioelectronic device for stress assessment

https://www.nature.com/articles/s41467-025-67747-9
1•PaulHoule•10m ago•0 comments

Why Big Tech Is Throwing Cash into India in Quest for AI Supremacy

https://www.wsj.com/world/india/why-big-tech-is-throwing-cash-into-india-in-quest-for-ai-supremac...
1•saikatsg•10m ago•0 comments

How to shoot yourself in the foot – 2026 edition

https://github.com/aweussom/HowToShootYourselfInTheFoot
1•aweussom•10m ago•0 comments

Eight More Months of Agents

https://crawshaw.io/blog/eight-more-months-of-agents
3•archb•12m ago•0 comments

From Human Thought to Machine Coordination

https://www.psychologytoday.com/us/blog/the-digital-self/202602/from-human-thought-to-machine-coo...
1•walterbell•13m ago•0 comments

The new X API pricing must be a joke

https://developer.x.com/
1•danver0•14m ago•0 comments

Show HN: RMA Dashboard – fast SAST results for monorepos (SARIF and triage)

https://rma-dashboard.bukhari-kibuka7.workers.dev/
1•bumahkib7•14m ago•0 comments

Show HN: Source code graphRAG for Java/Kotlin development based on jQAssistant

https://github.com/2015xli/jqassistant-graph-rag
1•artigent•19m ago•0 comments

Python Only Has One Real Competitor

https://mccue.dev/pages/2-6-26-python-competitor
3•dragandj•20m ago•0 comments

Tmux to Zellij (and Back)

https://www.mauriciopoppe.com/notes/tmux-to-zellij/
1•maurizzzio•21m ago•1 comments

Ask HN: How are you using specialized agents to accelerate your work?

1•otterley•23m ago•0 comments

Passing user_id through 6 services? OTel Baggage fixes this

https://signoz.io/blog/otel-baggage/
1•pranay01•23m ago•0 comments

DavMail Pop/IMAP/SMTP/Caldav/Carddav/LDAP Exchange Gateway

https://davmail.sourceforge.net/
1•todsacerdoti•24m ago•0 comments

Visual data modelling in the browser (open source)

https://github.com/sqlmodel/sqlmodel
1•Sean766•26m ago•0 comments

Show HN: Tharos – CLI to find and autofix security bugs using local LLMs

https://github.com/chinonsochikelue/tharos
1•fluantix•27m ago•0 comments

Oddly Simple GUI Programs

https://simonsafar.com/2024/win32_lights/
1•MaximilianEmel•27m ago•0 comments

The New Playbook for Leaders [pdf]

https://www.ibli.com/IBLI%20OnePagers%20The%20Plays%20Summarized.pdf
1•mooreds•27m ago•1 comments

Interactive Unboxing of J Dilla's Donuts

https://donuts20.vercel.app
1•sngahane•29m ago•0 comments

OneCourt helps blind and low-vision fans to track Super Bowl live

https://www.dezeen.com/2026/02/06/onecourt-tactile-device-super-bowl-blind-low-vision-fans/
1•gaws•30m ago•0 comments

Rudolf Vrba

https://en.wikipedia.org/wiki/Rudolf_Vrba
1•mooreds•31m ago•0 comments

Autism Incidence in Girls and Boys May Be Nearly Equal, Study Suggests

https://www.medpagetoday.com/neurology/autism/119747
1•paulpauper•32m ago•0 comments

Wellness Hotels Discovery Application

https://aurio.place/
1•cherrylinedev•33m ago•1 comments

NASA delays moon rocket launch by a month after fuel leaks during test

https://www.theguardian.com/science/2026/feb/03/nasa-delays-moon-rocket-launch-month-fuel-leaks-a...
1•mooreds•33m ago•0 comments

Sebastian Galiani on the Marginal Revolution

https://marginalrevolution.com/marginalrevolution/2026/02/sebastian-galiani-on-the-marginal-revol...
2•paulpauper•36m ago•0 comments

Ask HN: Are we at the point where software can improve itself?

1•ManuelKiessling•37m ago•2 comments

Meta's Teen Accounts Are Sugar Pills for Parents, Not Safety for Kids

https://overturned.substack.com/p/metas-teen-accounts-are-sugar-pills
56•kellystonelake•4mo ago

Comments

keanb•4mo ago
This is a bit one-sided innit. It doesn’t include the explanation given by Meta, just their rejection of the claims.
JohnMakin•4mo ago
They don't give an explanation, just say that the researchers' claims are wrong. Here is the article quoted:

https://www.bbc.com/news/articles/ce32w7we01eo

softwaredoug•4mo ago
Is there anyone else like me whose teens are just disinterested in social media? My teen spends a lot of time online, but mostly it's group texts / chats with friends.
paxys•4mo ago
Do you define TikTok and Snapchat as social media? Because teens are definitely interested in those.
kellystonelake•4mo ago
Over 20% of adolescents met criteria for “pathological” or addictive social media use, with an additional ~70% at risk of mild compulsive use. Teens themselves often recognize the problem—many say social media makes them feel “addicted” or unable to stop scrolling, even when it negatively affects their mood. The Surgeon General highlighted that teens commonly report social media makes them feel worse “but they can’t get off of it.”
softwaredoug•4mo ago
One lifehack a parent told me about -- instead of buying your kid a phone, buy them a cellular-capable smartwatch they can still text / call with to some extent.
cynicalsecurity•4mo ago
I don't quite understand why those activists expect companies to watch kids. It's parents' job. Facebook and Instagram are like big malls. Don't leave your child unattended. A mall is not a kindergarten. Educate parents on how they can protect their kids and make them responsible for it. That is how it's supposed to work. Now internet companies are forced to become nannies for both children and their parents, which is ridiculous. And we must suffer the UK's fascist internet laws and the upcoming EU chat-control nonsense because some parents can't watch their children.
pjc50•4mo ago
Do people think it is practical to watch children 24/7, including while the child is at school and the parent is at work?
bonoboTP•4mo ago
Half their ailments are from social media, the other half from obsessive helicopter parenting.
kellystonelake•4mo ago
Meta markets products as safe for kids and offers products to reinforce this for parents and regulators but they’re not actually keeping kids safe. Meanwhile, kids die. That’s less an issue of parenting and more one of corporate responsibility.
Argonaut998•4mo ago
I generally hate social media and their antics but the one thing I’m taking their side on, is this. Children’s exposure to content on the internet is a parenting issue. It is impossible for Meta, ByteDance etc to filter any and all possible content that may not be suitable for minors. Parents should know that.

Zuckerberg famously doesn’t (didn’t?) let his children use Facebook. Perhaps everyone else should take a hint.

pjc50•4mo ago
How is it possible for parents of average technological ability and limited means to do what a multi billion dollar platform cannot?
Argonaut998•4mo ago
My parents didn’t know how to turn on a PC or use a phone and they knew what I was looking at or talking to until I was 16 years old.

There are all kinds of services that parents can use now to filter this even further than what was possible 10-15 years ago

kellystonelake•4mo ago
The point of this research is that these services are often ineffective at "filtering" yet, as your comment demonstrates, make people (parents, regulators) feel like the platforms are safer than they are.
Argonaut998•4mo ago
Parents should be mature enough to not take a scandalous social media company too seriously without any scrutiny.

Note I’m not saying that Facebook don’t profit from being as ineffectual as possible, but ultimately that parents should know better.

a456463•4mo ago
I think it is wild that people struggle to understand what you are saying: take responsibility for your children instead of finding excuses. I had full internet access growing up and still I knew better.
pjc50•4mo ago
My parents let me go and hang out with friends and commute to school by train. I guess everyone is expected to be full helicopter these days.

One of my friends was busted at school for distributing porn downloaded from BBS on floppy disks.

Waterluvian•4mo ago
In concept you're not necessarily wrong. But I think this is one of those "why can't people just <do something that's impossible for people who don't live the life you imagine everyone to live>?"

Back in the day, hordes of kids were just set loose on the city to find empty lots to fuck around in, because a lot of families are just scraping by and the whole concept of full-time supervision of children is laughably naive. Now they're on the TikToks and the Facestagrams, which has its own set of advantages and disadvantages.

jhanschoo•4mo ago
This is something that I want to emphasize, even though all this discussion is of course centered on the present-day US context. It is not realistic to expect parents to curate their children's world; it is too much supervision to expect from two people, and historically it was not the case. Parents do have a domain in which they are able to supervise children, but beyond that they rely on a tripartite social contract with their communities, immediate (school, family friends, neighbors, etc.) and distal (Internet, government social programs, parenting/baby products, entertainment), to help provide a safe-enough environment for their children to apprehend the world in, even at times when supervision is not possible.

In this case, Meta's product, which it says is suitable for teens, simply doesn't meet the expectations of safety that most parents have, and it's good that there is reporting on it to inform them.

bartread•4mo ago
> Zuckerberg famously doesn’t (didn’t?) let his children use Facebook. Perhaps everyone else should take a hint.

And I think he was right to do this.

But he’s also the person who created Facebook, who allowed it to be used by children, so it’s extremely hypocritical, no?

Why doesn’t he make Facebook for over 18s only?

Money is why.

He’ll protect his own children whilst at the same time thinking it’s fine to harm other people’s. He thinks this whilst knowing full well that not all parents are created equal and that many parents won’t (or perhaps can’t) do what is best for their children’s welfare where social media is concerned. And this is because he values making money more, and because he’s an out-of-touch and elitist manchild who refuses to take responsibility for what he’s created.

So, yes, it’s Meta’s - and specifically Mark Zuckerberg’s - fault that children are exposed to harmful content, and both they and he should absolutely be held to account for it.

Fundamentally social media companies are media companies. Just because it’s a new form of media doesn’t mean they should get to dodge the scrutiny and responsibilities other media companies are subject to.

gruez•4mo ago
>But he’s also the person who created Facebook, who allowed it to be used by children, so it’s extremely hypocritical, no?

>Why doesn’t he make Facebook for over 18s only?

Should non-smokers be allowed to be tobacco executives? I think everyone agrees that smoking is bad, but whether the CEO partakes in it shouldn't really be a relevant factor either way.

anang•4mo ago
Is it ok if a tobacco executive downplays risks with smoking while at the same time forbidding their own children from smoking?

I think that’s a more accurate analogy, and I think it also would be reprehensible behavior.

kellystonelake•4mo ago
Agree
jchw•4mo ago
It's kind of amazing just how horrifically wrong this is all playing out. It actually feels like it's playing out worse than the worst case scenarios I could come up with.

First there is the obvious question: who is giving teenagers unfettered access to the Internet? Phones cost money. Home Internet costs money. Mobile data costs money. The best you can say is kids could get online using McDonald's WiFi with a phone they bought for lawnmower money, but we don't have to play pretend. We know how they got phones and Internet, the same way kids and teens were exposed to TV. Apparently though, despite this obvious first step in accountability, it's all shrugs. This step is apparently so unimportant that it's not even worth mentioning. I hate to just bitch about parents because I absolutely know parenting isn't easy, is important for society, and that the conditions we're in make it hard to feel "in control" anymore. On the other hand, this isn't exactly a new problem. All the way back in 1999, South Park: Bigger, Longer and Uncut addressed basically the same god damn thing. And I don't mean to equate obscenity on TV with the kinds of safety risks the Internet can pose, but rather to point at the deflection of blame that parents engage in for things they directly facilitated. Seriously, the "Blame Canada" lyrics somehow feel as prescient as ever now:

    Blame Canada
    Shame on Canada
    For the smut we must stop, the trash we must smash
    The laughter and fun must all be undone
    We must blame them and cause a fuss
    Before somebody thinks of blaming us
Though honestly, none of this means social media companies aren't to blame. Everyone knew we were selling sex and ads to kids. I think Twitter and Tumblr were acutely aware that they were basically selling sex to kids, and if anything tried as hard as possible to ensure their systems had no way to account for it. (On Twitter, many people have always wanted an option to mark their accounts as 18+ only, but as far as I know you still can't. A few years ago I think they added a flag they can apply to accounts that hides them, but users still can't set it themselves. And although Twitter has sensitive-content warnings, it doesn't actually let you specify that something is explicit, only that it is sensitive, or maybe that it contains nudity. I think this blanketing is intentional: it provides some deniability.) For their role in intentionally blurring lines so they could sell sex and ads to kids while mining their data, they should be penalized.

But instead, we're going to destroy the entire Internet and rip it apart. Even though the vast majority of the Internet never had the problems that crop up with kids on social media at any particularly alarming scale, we're going to apply broad legislation that enforces ISP-level blocks. In my state there was a proposal to effectively ban VPNs and Internet pornography altogether!

I think ripping apart the Internet like this is basically evil for all parties involved. It's something regulators can do to show that they really care about child safety. It's something parents can support instead of taking accountability for their role in not doing the thing called "parenting". And I bet all of the massive social media companies will make it out of this just fine, essentially carving their own way around any legislation, while the rest of the Internet basically burns down, as if it really needed anything to help that along.

We will never learn.

dfxm12•4mo ago
This is naive, given the situation. Meta is promoting a product under reportedly false pretenses to convince parents to allow their kids into the Meta platform early. In a way, it's like advertising cigarettes with cartoons (except here the advertising is targeting the parents, not the kids).
kellystonelake•4mo ago
this
observationist•4mo ago
It's a parenting issue. There may be kids who can use it responsibly, to a specific purpose. It's up to parents to decide and monitor and parent that, and I think it's fine for companies to not take it upon themselves to act for the parents. Part of the goals of parenting is to teach children how to engage with the world responsibly, and for better or worse, social media is now a fixture.

In a perfect world, the internet shouldn't be used by anyone under 18 without monitoring by their parents. That doesn't mean we should legislate criminal penalties for parents who fail to "correctly" parent, nor should we penalize companies whose service or product is used poorly or whose use results in a negative outcome.

Some things are cultural and social, and government isn't the right tool to fix problems. The cost of governance - loss of privacy and agency, unnecessary intrusion and bureaucracy, mistakes and waste of money and time - far exceeds the cost of letting society figure its shit out.

Yeah, there will be tiktok casualties, zombies, people with feed-fried brains. There will even be suicides and tragedies and stupid avoidable deaths from meme challenges. That's better than the alternatives, and it's not ok to put the responsibility for those negative social outcomes on companies when it's a failure of parenting. It's tragic and terrible, but we shouldn't let sympathy for parents allow shifting the blame to social media or AI companies.

That being said, there should be guardrails - no algorithm tuning to target children, rapid detection, reporting, expulsion, and lifetime bans for predators on a platform, no advertising to children, and so forth. Require parental consent and monitoring, with weekly activity summaries emailed to parents, or things like that - empowering parents and guardians and giving them tools to make their life easier, that'd be admirable.

Platforms like Roblox are effectively so large that it's impossible to moderate them responsibly, so they get infested with predators and toxic situations.

I think it's probably going to require society to limit how big platforms and companies can get - if the service or product or platform cannot be managed responsibly at a certain scale, then they're not allowed to operate at that scale anymore.

HWR_14•4mo ago
Why not just prevent people under 18 from using social media?
2OEH8eoCRo0•4mo ago
It opens HN's other favorite can of worms: age verification.
a456463•4mo ago
It is totally a parenting issue. Nothing to be solved by regulation or anybody else. That is the "BIG BAD WORLD". And this is from someone who absolutely loathes all the Big Tech companies: Apple, FB, Google, MSFT and Amazon. And I've worked in tech for 18+ years.

EDIT: What adults need to understand is that IG is an ad platform first, not an IM or connection platform, and to leave it. If you can't convince adults, then children are a totally lost cause. Adults face the same problems; they don't go away when you're not a teen. Where your teen goes and what they do is up to you as a parent.

kellystonelake•4mo ago
It would still be wrong for a company to be offering safety products that aren’t effective to parents and marketing them as effective, right?
a456463•4mo ago
Yes. For sure!
tartoran•4mo ago
Social media is toxic for adults and especially for children. Parents should get themselves off social media and then children will likely follow suit.