
Show HN: PastePlop – yet another Mac clipboard manager

https://bendansby.com/apps/pasteplop.html
1•webwielder2•1m ago•0 comments

Warp is now Open-Source

https://github.com/warpdotdev/warp
1•doppp•2m ago•0 comments

Nvidia Nemotron 3 Nano Omni

https://developer.nvidia.com/blog/nvidia-nemotron-3-nano-omni-powers-multimodal-agent-reasoning-i...
1•qainsights•2m ago•0 comments

Tridimensional Visualization of a Blackbird Song [video]

https://www.youtube.com/watch?v=EgWMo4BrKBs
1•vinnyglennon•3m ago•0 comments

Ask HN: What do you check before launching a web app?

1•pagelensai•3m ago•0 comments

Show HN: How to become an Anti-founder, THE MANUAL

https://manual.cochranblock.org
1•cochranblock•3m ago•0 comments

Biggest US airlines spent $1.2B more on fuel in Q1

https://sherwood.news/business/the-6-biggest-us-airlines-spent-1-2-billion-more-on-fuel-in-q1-and...
1•speckx•4m ago•0 comments

Our Uncertain Uncertainties

https://kevinkelly.substack.com/p/our-uncertain-uncertainties
2•nowflux•4m ago•0 comments

You're the Bread in the AI Sandwich

https://every.to/context-window/you-re-the-bread-in-the-ai-sandwich
2•gmays•4m ago•0 comments

The Download: Musk and Altman's legal showdown, and AI's profit problem

https://www.technologyreview.com/2026/04/28/1136479/the-download-musk-altman-openai-trial-ai-prof...
1•joozio•4m ago•0 comments

From GitHub to Codeberg/Forgejo

https://www.jonashietala.se/blog/2026/04/28/from_github_to_codebergforgejo/
1•lawn•5m ago•0 comments

Doofioso (2006)

https://scottaaronson.blog/?p=75
1•Tomte•6m ago•0 comments

Remote Code Execution on Github with a single Git push

https://twitter.com/wiz_io/status/2049153209982140718
1•ramonga•6m ago•0 comments

The Royal Game (2020)

https://codemetas.de/2020/11/22/The-Royal-Game.html
1•tosh•8m ago•0 comments

Humpback whale 'Timmy' being transported towards ocean

https://www.dw.com/en/germany-stranded-whale-timmy-being-transported-towards-ocean-in-special-bar...
1•Tomte•9m ago•0 comments

How to Acquire a Country: A Thought Experiment

https://flyingsolo.bearblog.dev/how-to-acquire-a-country/
1•ankitdce•10m ago•0 comments

SXSW Used AI-Powered Trademark Tool to Censor Dissent on Instagram

https://www.404media.co/sxsw-used-ai-powered-trademark-tool-to-censor-dissent-on-instagram/
1•cdrnsf•11m ago•0 comments

Building a Fast Multilingual OCR Model with Synthetic Data

https://huggingface.co/blog/nvidia/nemotron-ocr-v2
1•ibobev•14m ago•0 comments

DeepSeek-V4: a million-token context that agents can use

https://huggingface.co/blog/deepseekv4
2•ibobev•15m ago•0 comments

An Aristotelian understanding of object-oriented programming

https://dl.acm.org/doi/10.1145/353171.353194
2•b-man•15m ago•0 comments

Adaptive Ultrasound Imaging with Physics

https://huggingface.co/blog/nvidia/raw2insights-adaptive-ultrasound-imaging
1•ibobev•15m ago•0 comments

General Motors says it expects $500M tariff refund after SCOTUS ruling

https://abcnews.com/Business/general-motors-expects-500-million-tariff-refund-after/story?id=1324...
1•testing22321•17m ago•0 comments

AI's Economics Don't Make Sense

https://www.wheresyoured.at/ais-economics-dont-make-sense-ad-free/
2•speckx•17m ago•0 comments

Ask HN: Is Apple's Weather Service Down?

2•thomascountz•18m ago•1 comments

Claude Leak Confirms It: LLM Systems Are Architecture, Not Prompts (Orca)

https://github.com/gfernandf/agent-skills
1•gfernandf1•19m ago•1 comments

A tradeoff in defining database schemas

https://www.natemeyvis.com/a-tradeoff-in-defining-database-schemas/
1•Brajeshwar•19m ago•0 comments

Show HN: Browser timers that float above your other windows

1•jmbuilds•21m ago•0 comments

Understanding LangGraph as a stateful execution system

https://internals.laxmena.com/p/langgraph-internals-how-production
1•laxmena•22m ago•0 comments

Why does walking through doorways make us forget? (2016)

https://www.bbc.com/future/article/20160307-why-does-walking-through-doorways-make-us-forget
6•thunderbong•22m ago•0 comments

Show HN: FastSvelte – FastAPI and SvelteKit Starter Kit for Python SaaS

https://fastsvelte.dev/
1•turtledevio•22m ago•0 comments

Google and Pentagon reportedly agree on deal for 'any lawful' use of AI

https://www.theverge.com/ai-artificial-intelligence/919494/google-pentagon-classified-ai-deal
109•granzymes•1h ago

Comments

morkalork•1h ago
Will lawful use be determined in secret courts a la NSA and FISA?
Sanzig•57m ago
Doubtful it will even get that far; the DoJ will simply draft an appropriate fig-leaf memo with a predetermined conclusion and the government will plow on ahead.

https://en.wikipedia.org/wiki/Torture_Memos

stephbook•53m ago
They simply say they have that memo. Who knows whether they even drafted it for real? And if anyone starts looking, Gemini can quickly draft one itself. Nice!
vrganj•31m ago
Don't be silly.

"When the president does it, that means that it is not illegal." - Richard Nixon

kentm•4m ago
Also the Supreme Court, half of Congress, and apparently something like 40% of the American populace.
ceejayoz•1h ago
Who defines "lawful" if Google and the Pentagon disagree?

> The classified deal apparently doesn’t allow Google to veto how the government will use its AI models.

Seems concerning?

belzebub•1h ago
There's big air quotes energy in their statement
tdb7893•53m ago
Especially concerning given how creative the executive branch can be when it comes to what laws mean. With little oversight, it seems guaranteed that it will be used for unlawful activities (despite whatever tortured argument some lawyer will have put into a memo somewhere).
f33d5173•50m ago
Lawful is presumably defined in the usual, common sense, i.e., we can do whatever the f we want until a court physically forces us not to.
dmd•41m ago
And since the court has no way to physically force anything - that's the executive branch's function, (it's right there in the name) - lawful has no meaning whatsoever if it's the executive branch that wants to break the law.
shevy-java•48m ago
It kind of reminds me of a mix of Skynet in Terminator and Minority Report. But nowhere near as interesting. More annoying than anything else.

I am kind of mad at James Cameron here. Skynet was evil but interesting. Real life controlled by Google is evil but not interesting - it is flat out annoying.

ApolloFortyNine•45m ago
This has to be one of the strangest "debates" in history.

Congress and the courts obviously.

If you think there's a hole in the law, tell your congressman; don't, for some reason, try to put Google or any AI company above the government.

ceejayoz•42m ago
> Congress and the courts obviously.

The first is fully neutered. The second is far too slow.

"Nothing unlawful" needing to be in the contract is inherently concerning, as it's typically the default, assumed state of such a thing.

deepsun•23m ago
"follow the law" in contracts IMO is there to be able to claim a "breach of contract" by one party.
calgoo•23m ago
Please! That ship sailed a long time ago. Sure, tell your congressman, who is most likely bribed (lobbying is bribing, let's use the real words) by the same companies to accept the deal. The courts can try, but who is going to enforce it when the people above say that it's fine?
CobrastanJorji•31m ago
That's presumably the trick, and it's not a subtle one; it's why the article puts it in quotes in the headline. Google gets to claim that it stood up for principles because it boldly insisted that the government obey the law, and the government will claim that whatever it decides to do is lawful. It's the same as what OpenAI did, except not handled buffoonishly.
ethagnawl•29m ago
The classified aspect is probably the most concerning. How can I write my representative (and expect a form letter response six weeks later) if I don't know what I'm objecting to or even if I should be objecting?
cooper_ganglia•12m ago
Why would you write a letter if you don't know what you're objecting to or even if you should be objecting?
ceejayoz•6m ago
Can't I object to not knowing?
cooper_ganglia•5m ago
No, that's what classified means.
ceejayoz•4m ago
Surely I can complain about overclassification of things that should not be classified?
impulser_•13m ago
No it doesn't at all. Private corporations shouldn't be telling the government what it can and can't do. That's the job of the people. You want private corporations overriding your vote?
cooper_ganglia•13m ago
Google should never be determining what is lawful or not.
jcgrillo•1h ago
It's pretty funny how these guys are all becoming some kind of internet version of, like, Halliburton. It seems pretty desperate. B2C and B2B applications didn't pan out I guess?
zarzavat•53m ago
It's one of the two identified uses for AI that are profitable today: writing code and blowing up schools. They are desperate to show the market that the technology is anything more than a money pit.
tombert•1h ago
When my sister and I would play Monopoly as kids, we had lost the manual, so whenever we didn’t like the outcome of whatever happened, we would make up rules about what was right. Technically, then, it was very easy to stay compliant while still doing well, because we could rewrite the rules.

Also, since I was older, I feel like I was able to get away with those redefinitions a lot more often…

cucumber3732842•53m ago
The big reason it's "obvious" when tech megacorps do it is because big tech is new to the game and doesn't have an existing regulatory capture system already up and running and legitimized like medical, civil engineering, energy, agriculture, chemical, etc, do.

If this were 3M making nasty stuff for Northrop to put in bombs and drop on brown people or Exxon scheming up something bad in Alaska or bulldozing a national park for solar panels or some other legacy BigCo doing slimy things that are in the interests of them and the government but against the interest of the public they'd have 40yr of preexisting trade group publications, bought and paid for academic and media chatter, etc, etc, they could point to and say "look, this is fine because the stuff we paid into in advance to legitimize these sorts of things as they come up says it is" though obviously they'd use very different words.

SecretDreams•11m ago
> If this were 3M making nasty stuff for Northrop to put in bombs and drop on brown people or Exxon scheming up something bad in Alaska or bulldozing a national park for solar panels or some other legacy BigCo doing slimy things that are in the interests of them and the government but against the interest of the public they'd have 40yr of preexisting trade group publications, bought and paid for academic and media chatter, etc, etc, they could point to and say "look, this is fine because the stuff we paid into in advance to legitimize these sorts of things as they come up says it is" though obviously they'd use very different words.

My friend, this paragraph needed some periods. I could not follow what you were trying to say - but it seemed interesting enough to consider retyping.

hgoel•1h ago
How well does this hold up under legal scrutiny, when previous actions indicate that the Pentagon would retaliate against Google if they didn't accept this "lawful use only" farce?

Could Google back out of this agreement later by arguing that they were coerced?

Not trying to suggest that Google would be opposed to doing evil, but curious about how solid this agreement would be in practice.

john_strinlai•1h ago
there is 0 reason that the definitions of 'lawful' for the purposes of these agreements should be classified.
svachalek•29m ago
There's a reason, you just won't like it.
mullingitover•54m ago
Reminder that this administration has some absolute howler theories about what constitutes lawful behavior[1].

[1] https://www.nytimes.com/2025/09/20/us/politics/tom-homan-fbi...

shevy-java•50m ago
The beginning of Skynet 6.0.
qznc•44m ago
And that is news-worthy because unlawful use is normal?
ChrisArchitect•41m ago
https://archive.ph/FyzNS
ChrisArchitect•41m ago
One source: https://www.reuters.com/technology/google-signs-classified-a... (https://news.ycombinator.com/item?id=47931336)
Brian_K_White•39m ago
What a handy word "lawful".
Imnimo•39m ago
Unsurprising from Google, but still bad. If Google has no right to object to a particular use, this is equivalent in practice to "any use, lawful or not".
anygivnthursday•38m ago
Is Iran already a vibe war, or are those just coming?
anematode•38m ago
Who could have seen this one coming. From yesterday: https://www.cbsnews.com/news/google-ai-pentagon-classified-u... ("Hundreds of Google workers urge CEO to refuse classified AI work with Pentagon").

Any AI researcher who continues to work here is morally compromised.

devin•34m ago
That's what the 7 figure salaries are for.
testfrequency•28m ago
It’s funny to me how many progressive people I know and am friends with work at these AI companies despite being from marginalized demographics (trans, gay, Latino, Black).

They still have faded Bernie stickers on their cars, organize for No Kings, say “fuck SF, I’m in the East Bay for life, fuck tech” - and yet make seven figures Monday through Friday supporting the death of society and democracy.

I don’t dare say anything, though, because “money is money” and the Bay is expensive... but I do sure as shit judge every single person I know who joined OAI, Anthropic, Google, and Meta.

site-packages1•23m ago
I would suggest looking inwards if this is how you really feel.
gambiting•18m ago
I'm curious what is that you're suggesting, exactly.
site-packages1•8m ago
I made another comment above. People contain multitudes. Different contexts, different choices, not everyone is in a box defined by the viewer's world view. You can't really know what's going on with someone else, in their heads, in their context, so give them some grace. Instead, this person's "friends" are "hypocrites" who were "lured" into their choices. It's very condescending. I am suggesting the poster re-examine their own views on other people in light of this.
testfrequency•18m ago
I mean no harm in saying what I said, I love my friends. I just can’t stomach the hypocrisy, it’s what the companies are preying and feeding off of.

My friends are incredibly bright and good at what they do, it’s why they all have the roles they have. It makes me sad (and frustrated) knowing they are lured in by enough money dangling in front of them that makes them swallow their souls and identity, while fuelling the fire in the same breath.

I have a deep amount of respect and gratitude for my friends (and anyone else) who chooses to work at non-profits, and more ethical - mission based companies for less. I hate how much these AI companies and roles are offering people, it’s completely forced lots of gifted people into a war machine.

site-packages1•11m ago
Do you suspect there is any chance they are fully independent adult human beings with full agency, who have looked at the pros and cons, and chosen to make the choices they did with clear eyes? Do you think there's any context that might square their choices with their own internal principles that don't make them hypocrites? I mean these as real questions. For "friends you love" you really seem to take a dim view of their intelligence.
beernet•23m ago
Agreed. Just shows that big money doesn't dilute small character.
tjwebbnorfolk•29m ago
Why is it morally compromising to work with the military of the country you live in?
plaidthunder•22m ago
I'm not anti-military as a rule but... c'mon. Opinions on the US military vary.

In extremis, were the people working for Pol Pot just good patriots with no moral culpability?

We could surely at least agree that there are cases where working for the military of your home country doesn't fully excuse you from your actions.

In fact, I think international tribunals have existed which operated on just those principles.

mrexcess•21m ago
We can all agree that working for the Nazi government’s military would be morally compromising, right?

You propose that other governments' militaries would not be so compromising. Seems reasonable.

But the question then becomes, what is the operative distinction between the two?

cooper_ganglia•16m ago
I genuinely can't tell if you're serious or trolling. This feels like low-tier ragebait.

The operative distinction is that "lawful use" in the United States of America does not mirror Nazi Germany in even the slightest way.

CamperBob2•13m ago
This government doesn't GAF what is "lawful" and what isn't. Was what happened to Pretti and Good in Minneapolis lawful? Would you work for ICE/CBP with no qualms at all?

See also the new national sport of hunting for fishing boats off the South American coast. Is that "lawful?"

And yes, since you went there: everything the Nazis did was "lawful." To the extent it wasn't "lawful," they made it "lawful."

cooper_ganglia•5m ago
All of those things are lawful and 100% justified, yes. Don't attack law enforcement with a deadly weapon, whether it's a vehicle or gun.

ICE is objectively more effective at protecting American citizens and interests than any conflict in Iraq or Afghanistan ever was.

Retrofitted "fishing boats" packed full of narco-terrorists and fentanyl being shipped to the US are entirely lawful to blow sky-high once they're in international waters.

exe34•3m ago
> Don't attack law enforcement with a deadly weapon, whether it's a vehicle or gun.

How do you attack law enforcement with a gun while on your knees, with your arms pinned behind you and the gun is holstered? It's interesting how we can watch the same video, and some people only see what they are told to see.

exe34•4m ago
Lawful use in the US is whatever Dementia Don says it is.
orochimaaru•29m ago
Why is it morally wrong for a US citizen to work with their government?
_vertigo•27m ago
It’s not morally wrong per se, but just because you are working with your government does not mean what you’re doing is necessarily moral.
cooper_ganglia•24m ago
Just because you are working with your government does not mean what you’re doing is necessarily immoral, either.
_alternator_•18m ago
Correct. It depends. For example, it might depend on what the collaboration is likely to result in. Perhaps it would be more likely to be moral if there were some boundaries in place, like "no mass domestic surveillance" or "no fully autonomous weapons".

But the US government currently believes it is legal to blow up civilian drug traffickers and to wage war without congressional approval. So at some point, yes, collaboration is immoral.

Forgeties79•18m ago
Who said otherwise? Clearly it’s about facilitating specific acts by the government. Why are y’all acting like it was so wildly broad? No one said “working with the government is inherently immoral.”
Jtarii•16m ago
Hegseth bombed a girls school in Iran last month. I think it's fair to doubt the moral worth of anyone assisting this admin.
conartist6•9m ago
It's ok, they weren't Christian girls, so of course they're in hell now. ...where Pete will go!

Hey, I think I'm starting to get how this organized religion thing works. Maybe I'll join a few to make sure I go to allllll the good places

conartist6•7m ago
I'm dripping with sarcasm here, but as far as I know that's actually what macho Pete believes. He believes he blew those girls to hell with god's own fury. Fuck you, Pete, fuck you.
mattnewton•14m ago
Idk about morality, but it’s certainly a way to stop dystopian mass surveillance nightmares if everyone capable of building one refuses.

So if you live in the US and don’t want one government agency in the US to have this power (that is ambiguous under current law), one way you can try to avoid it is by refusing to sell it to them and urging others to do the same.

It’s a long shot, sure, but it certainly seems more effective than hoping the legislature wakes up and reins in the executive these days.

psychoslave•13m ago
Given most government policies and direct engagement in all kinds of monstrosities over the last millennia, there is really no reason to limit the case to the USA, indeed.
finghin•7m ago
The acts of the government being wrong in an upsetting amount of cases would be a big reason.
tyre•4m ago
It’s not, but legal is not the same as ethical.

For a long time, and probably still, it was legal for the US to torture enemy combatants. It was never ethical.

declan_roberts•25m ago
Thankfully Russia, China, etc. have the same qualms as we do in the United States and will refuse to send their brightest engineers to work on weapons so they don't become "morally compromised"!!!
gambiting•20m ago
I don't know if you're being sarcastic (sounds like you are!), but indeed a lot of engineers left Russia after the war in Ukraine started, as they didn't want to be drafted and didn't want to contribute to the war effort in some way, even if indirectly. Of course, many stayed or even willingly help. See how many engineers from Iran work abroad too, for moral and other reasons.

The point is - this happens everywhere; it's not just some weird western thing.

cooper_ganglia•14m ago
Great idea! I would gladly welcome anyone that has a problem with U.S. national security interests to also leave the country, as quickly as possible!
site-packages1•23m ago
> Any AI researcher who continues to work here is morally compromised.

Arguably it's exactly the opposite. In the same way we ask billionaires to pay their taxes because the regulatory regime is what allowed them the structure to make their billions in the first place, the national security of the country the AI researchers are in is what allows them to make a vast salary to work on interesting, leading edge capabilities like AI. They should feel obligated to help the military.

2OEH8eoCRo0•23m ago
Is it any less moral than surveilling your neighbors or turning your neighbors against each other with social media?
thisisauserid•6m ago
I agree that it is immoral to obey some laws. Which ones are you saying are immoral here?
acheron•6m ago
Google has been working to ruin the Internet, technology in general, and pretty much the world for the past quarter century. Anyone who has ever worked there is already morally compromised. Frankly, “contributing to national defense” is a giant step up from their normal tasks of “turning the world into an ad-infested zero-privacy dystopia”.
cdrnsf•37m ago
Lawful is meaningless in the context of the Trump administration. Should Google waver (which they won't), they'll be declared a supply chain risk or otherwise bullied into submission.
Ritewut•13m ago
Google holds immense power in their position. Trump can make their life very difficult but Google can make life for Trump very difficult as well. They have no need to kneel, they are choosing to.
f33d5173•10m ago
what immense power?
sailfast•33m ago
This all works if you assume that any action the government takes must be “lawful”. The assumption here is that the Pentagon is obeying the law and any unlawful use would go through normal reporting / violation channels - same as any illegal order or violation or whistleblower report.

The Pentagon does not want Google or anyone else deciding what they can and cannot use their AI for. They’re saying we won’t break the law, and that should be enough for you - pinky swear!

And that seems to be enough for Google. Though I might request some agentic auditing capability to verify, rather than take them at their word.

Next step: is Google FedRAMP’d yet for this and for classified enclaves? Or do they also go through Palantir’s AI vehicle?

vrganj•32m ago
See also: https://en.wikipedia.org/wiki/IBM_and_the_Holocaust

Capital and Big Tech have always been opportunistic enablers, not principled actors. Corporate Values have always been nothing but internal propaganda. "Don't be evil", what a farce.

flufluflufluffy•29m ago
> We remain committed to the private and public sector consensus that AI should not be used for domestic mass surveillance or autonomous weaponry without appropriate human oversight.

And so starts the lying to our faces. The public and private (from your own employees!) consensus is that it should not be used for those things at all, regardless of “human oversight.”

calgoo•16m ago
I hate this part: `domestic mass surveillance`

So the rest of the world is fine to spy on; it's the domestic part they don't agree with. So go on, destroy lives all around the world, helping the powers that be build the fascist state. It's fine to use Gemini to tell which building to blow up; it's fine for Gemini to wrongly identify people and cause hundreds or thousands of deaths by telling the military who to attack.

ctoth•26m ago
Huh. I never realized the T-800 runs on Android. Makes sense, I guess.
psychoslave•17m ago
Do no evil. Well, don't do anything illegal, at least. I mean, let's not do anything different from whatever we wish at the moment.
chabes•3m ago
Snakes. All of them
ripvanwinkle•3m ago
One observation.

Having your work used by the government in ways you disagree with feels similar to having your taxes used in ways you disagree with.

When you pay taxes you have no say in the bombs acquired with that money and where they are dropped. The latter, though, doesn't seem to provoke the same pushback.