
Oracle gave its new CFO $26M in stock after firing up to 30k workers

https://moneywise.com/news/top-stories/oracle-gave-its-new-cfo-26m-in-stock-after-firing-up-to-30...
1•robtherobber•57s ago•0 comments

Data breach at European fitness chain Basic-Fit [pdf]

https://corporate.basic-fit.com/docs/Basic-Fit%20informs%20members%20of%20an%20unauthorised%20dat...
2•lode•5m ago•0 comments

Hybrid search (BM25/vectors/RRF) barely improved over pure semantic

1•pjmalandrino•5m ago•0 comments

NavIC's Clock Crisis, and the Indian Clocks That Could Fix It

https://swarajyamag.com/technology/navics-clock-crisis-and-the-indian-clocks-that-could-fix-it
1•robertlangdon•5m ago•0 comments

Show HN: Cryptographic Timestamps for Human Testimony (2 HTML files, no server)

https://github.com/Bardockthegreat/thomas-more-witness-protocol-
1•Bardockthegreat•9m ago•0 comments

Tell HN: AI is bringing back waterfall, here's what I've found

1•keepamovin•13m ago•1 comment

A Git helper tool that breaks large merges into parallelizable tasks

https://github.com/mwallner/mergetopus
1•schusterfredl•15m ago•1 comment

If You're Only Running One Claude Code Session, You're Not Going Fast Enough

https://www.scape.work/blog/you-are-not-going-fast-enough
1•bgnm2000•17m ago•0 comments

We built a Green Screen Remover tool to automate batch green screen removal

https://ugcmaker.io/green-screen-remover
1•MiaTaylor•18m ago•0 comments

Ask HN: How are you handling runtime security for your AI agents?

2•saranshrana•19m ago•0 comments

Site ranks #1 on Google. ChatGPT has never heard of you

https://www.spotlight.cx/blog/keywords-are-dead
1•soorajsanker•23m ago•0 comments

LLMs don't know how to think

https://tictacguy.github.io/Meta-Reasoning/
2•tomolomolo•25m ago•0 comments

Digital Experience Consulting Company – ViitorCloud

https://viitorcloud.com/capabilities/digital-experiences/
1•Olivia_Watson•25m ago•0 comments

Hypotheses for Why Models Fail on Long Tasks

https://www.lesswrong.com/posts/jLZwydRtwRguCjEnd/5-hypotheses-for-why-models-fail-on-long-tasks
1•joozio•26m ago•0 comments

Show HN: Mantyx – Automate and accelerate your Software Operations

https://mantyx.io/software-operations
1•mantyx•26m ago•0 comments

Combining Rate and Instructions to Create Beautiful Madness

https://biggieblog.com/combining-rate-and-instructions-to-create-beautiful-madness/
1•panic•28m ago•0 comments

"Cursor Agent Is a Rebranded Claude Code"

https://twitter.com/jasonkneen/status/2043435856849940818
1•nreece•29m ago•0 comments

Community pushback on GitLab issues overhaul

https://gitlab.com/gitlab-org/gitlab/-/work_items/590689
1•jplunien•35m ago•0 comments

Intuit compressed months of tax code implementation into hours

https://venturebeat.com/data/intuit-compressed-months-of-tax-code-implementation-into-hours-and-b...
1•cpeterso•39m ago•0 comments

Fees for seas: a history of taxing waterways

https://www.ft.com/content/9a5294cf-0b64-4201-b88c-12ba586bb4fd
2•uijl•44m ago•1 comment

ChatGPT praises mood and 'bedroom/DIY texture' of fart sounds

https://www.pcgamer.com/software/ai/chatgpt-will-praise-the-mood-and-bedroom-diy-texture-of-fart-...
2•wesfenlon•46m ago•0 comments

The coordination tax: six years watching a one-day feature take four months

https://www.indiehackers.com/post/the-coordination-tax-six-years-watching-a-one-day-feature-take-...
2•max_flowly_run•48m ago•0 comments

Is AI Really gonna take our jobs?

1•PotatoAditya•48m ago•0 comments

Security Best Practices for Speedify Self-Hosted Servers

https://support.speedify.com/article/1070-security-best-practices-for-speedify-self-hosted-servers
1•goodburb•49m ago•0 comments

Show HN: sqlc-gen-sqlx, a sqlc plugin for generating sqlx Rust code

https://github.com/mathematic-inc/sqlc-gen-sqlx
1•jrandolf•53m ago•2 comments

What is gravity? – A 7-minute read

https://corpusk.info/what-is-gravity.html
1•nik_slusarenko•58m ago•0 comments

AI Changed What We Build. Then It Changed Who We Hire

https://www.hauser.io/ai-changed-what-we-build-then-it-changed-who-we-hire/
1•bkfh•1h ago•0 comments

Show HN: Xlg – Jq for APIs

1•alykirk•1h ago•1 comment

Programming Used to Be Free

https://purplesyringa.moe/blog/programming-used-to-be-free/
1•yeputons•1h ago•0 comments

The P Source: How humanities scholars changed modern spycraft (2020)

https://paw.princeton.edu/article/p-source
3•walterbell•1h ago•0 comments

The AI Layoff Trap

https://arxiv.org/abs/2603.20617
42•armcat•1h ago

Comments

rvz•1h ago
Let’s take AGI to its inevitable raw conclusion. Not by the definition (ab)used by clueless VCs screaming about abundance, but by what is already happening using the worst case:

The abundance of mass layoffs and job displacement due to funding and building of AI systems is the true definition of AGI.

We might as well get there faster instead of delaying it. You have already seen Oracle and Block attribute their layoffs to AI, so it is happening right now.

So why delay any further? Just get it over with.

doesnt_know•1h ago
Get where faster? Get what over with?

Aren’t you talking about destroying livelihoods? Pushing people into poverty and/or homelessness? What is the benefit exactly?

Closi•44m ago
I guess the argument would go that a new economic model will be required at that stage.

There isn't much point in having people do jobs they don't like, which are trivial to automate, just for money. But at the point where there aren't enough economically useful things for everyone to do, the current system falls down.

> What is the benefit exactly?

Well one benefit would be international competitiveness. The country that does it slowest will be the country doing more work for less output.

samrus•38m ago
The paper is suggesting such a new economic model. Do you have another proposal?

palmotea•6m ago
> I guess the argument would go that a new economic model will be required at that stage.

> ...but at the point where there isn't enough economically useful things for everyone to do, the current system falls down.

Not necessarily. To quote the Bobs from Office Space: "He won't be receiving a paycheck anymore, so it will just work itself out naturally." No need to change, just let the plebs die out.

SilentM68•37m ago
Exactly!

As of now, there is no benefit to regular working people. Perhaps in the future, great abundance will occur, but as of now, there will only be job loss, fear, neo-luddism, and blame.

Believe me when I say that I know people, some close to me, who are experiencing fear due to automated systems being installed and tested where they work. They are essentially witnessing the start of their replacement by an automated robot workforce.

Whatever is planned in terms of AI being used to help people needs to happen sooner rather than later, because all I am seeing is chaos on the horizon.

mullingitover•56m ago
The article is saying that the solution here isn’t to just throw up our hands and commit suicide as a nation, it’s to simply tax the AI, punishing the negative externality.

Seems like the obvious answer to the prisoner’s dilemma problem where everyone wants to lay off their workforce, but expects that they’ll be the only ones to get this bright idea.
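The prisoner's-dilemma framing can be made concrete with a toy two-firm game (all payoff numbers below are hypothetical, chosen only to illustrate the structure): each firm gains by laying off while the other keeps its staff, yet mutual layoffs destroy demand and leave both worse off than mutual restraint.

```python
# Toy 2-firm layoff game with hypothetical payoffs.
# Each firm chooses "keep" or "layoff"; values are (firm_a_payoff, firm_b_payoff).
PAYOFFS = {
    ("keep", "keep"): (3, 3),      # demand intact, normal profits for both
    ("layoff", "keep"): (4, 1),    # A cuts costs while B's workers still spend
    ("keep", "layoff"): (1, 4),    # symmetric case
    ("layoff", "layoff"): (2, 2),  # both cut costs, but demand collapses
}

def best_response(opponent_move: str, player: int) -> str:
    """Return the move that maximizes this player's payoff given the opponent's move."""
    def payoff(move: str) -> int:
        pair = (move, opponent_move) if player == 0 else (opponent_move, move)
        return PAYOFFS[pair][player]
    return max(["keep", "layoff"], key=payoff)

# "layoff" strictly dominates for both players...
assert best_response("keep", 0) == "layoff"
assert best_response("layoff", 0) == "layoff"
# ...yet mutual layoff (2, 2) is worse for both than mutual keep (3, 3),
# which is why the comment argues a tax (changing the payoffs) is needed.
```

A tax on automation, in this framing, subtracts enough from the "layoff" payoffs that "keep" becomes the best response.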

friendzis•34m ago
> it’s to simply tax the AI, punishing the negative externality.

That "simply" is working overtime here.

542458•18m ago
What’s a bit hard for me to rationalize is why market shifts are considered a negative externality here. We didn’t tax moulding machines because they reduced the demand for sculptors.

Don’t get me wrong, I think the end goal of “Tax those who can pay for it to build a social safety net” is reasonable, I just don’t buy the “negative externalities” argument.

542458•54m ago
I am thoroughly unconvinced that the “AI-based layoffs” are actually caused by AI displacing workers, and aren’t just regular layoffs caused by other factors with a smokescreen of “Actually we’re laying people off because we’re doing really well, please don’t dump your stock”.

rsalus•1h ago
Great paper.

> If AI displaces human workers faster than the economy can reabsorb them

Big if.

ithkuil•33m ago
Consequential enough that it's reasonable to plan for it.
paulpauper•59m ago
> If AI displaces human workers faster than the economy can reabsorb them, it risks eroding the very consumer demand firms depend on.

That is a huge "if" though, and I am not sure the latter follows from it. When the US transitioned away from assembly lines, or when agriculture stopped dominating employment, it's not as if consumer spending collapsed as a consequence.

whazor•34m ago
Especially since AI providers are struggling to scale up.

thephyber•23m ago
When did the US transition away from agriculture?

When did the US transition away from assembly lines?

I don’t think you have thought through either one of these and I don’t think they are comparable to what we expect to see for AI’s changes to the job market.

semiinfinitely•55m ago
Neo-luddism dressed up in economic jargon. The authors suggest the only effective tool is to tax companies based on how much automation they achieve. Penalizing efficiency is a guaranteed recipe for stagnation; if we'd done this at any point in our past, we would never have made it out of the Dark Ages.
paulpauper•48m ago
The Chicago School would probably hate it
jaccola•47m ago
It’s funny, I think most people roll their eyes when Trump says things like “you’ll be tired of winning, you’ll say ‘please no more winning’”.

But recommendations to tax efficiency are unironically that (just dressed in more serious language). “Please stop giving us what we want so efficiently, we want to work more for it!”

gostsamo•19m ago
Those winning and those asking for this to stop are two different categories of people. The former are the capital holders and the latter are those with no source of capital in their future. If you can merge the two categories, we may talk again. Until then, you need to come up with something better than trickle down in a world where there is no trickling.
palmotea•18m ago
> But recommendations to tax efficiency are unironically that (just dressed in more serious language). “Please stop giving us what we want so efficiently, we want to work more for it!”

You're trying to make it sound ridiculous, but most people aren't pure consumers. They're laborers and consumers. Policies that hurt while wearing the consumer hat may be more than justified by the benefits while wearing the labor hat.

thewhitetulip•45m ago
Let companies manage to automate 100% of their processes by having their own country/law/constitution!

Companies produce goods which people consume. If you hand everything over to the oligarch class, how will people consume products built by companies???

samrus•42m ago
You're not looking at anything past the first-order effects. The idea of work is that people participate in the economy. What does a post-work economy look like? How do people get the cash flows necessary to participate in things like housing and food when their way of contributing to the economy was automated away?
esseph•34m ago
I expect that when we actually get to that point, we'll have an even worse version of the K-shaped economy.
jay_kyburz•23m ago
I don't believe there will be a "post work" economy. Some people will turn to farming, others will turn to crime and mischief (hunter gatherers).
jay_kyburz•30m ago
I agree, and I often wonder why companies pay any tax at all, rather than just hitting the shareholders as their wealth grows. There was a post a few months ago about taxing unrealized gains that I thought was very interesting.

The company itself has an impact on our society and needs to be "governed", so it seems reasonable for them to pay for that governance. Actually, it seems unfair that you can claim no profit and get out of paying for that governance.

I'd be really interested to know if companies pay the true cost of their impact to society, or if individual income tax has to pick up the tab.

ralfd•5m ago
Companies do generally act as tax collectors rather than the final bearers of the cost, as taxes just increase the price for the end consumer.
Epa095•12m ago
Note that companies are usually taxed on surplus (profit), not revenue. So we are already 'punishing' the efficient ones, we are just doing it in a relatively neutral way.

In systems with progressive income tax, the total tax income from a company with 1 employee making $X is more than if the company had 2 employees making $X/2 each, so we essentially 'punish' using highly skilled labour over more, less-skilled workers.

There are no perfect taxes, and current tax systems have adapted to a lot of practical concerns. One of those is that it's easier to tax money as it moves around than when it sits still (wealth or property tax); another is that it's easier to tax people than abstract entities like companies, since people have a harder time moving. For the same reason, it's easier to tax the middle class than the owner class, since the richer you are, the easier it is to move yourself and those you care about to wherever taxes are low these days.

All these practical concerns have made it such that one of the most common ways for the state to get a share of its society's productivity is income tax. But it is not an 'economic law' that it must be like that. If more and more of society's productivity and wealth creation happens with little employment income involved, we will have to find other ways to tax it.
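The progressive-tax point above can be checked with a quick sketch. The bracket thresholds and rates here are hypothetical (not any real tax code), but the structural result holds for any progressive schedule: one employee earning $X yields more tax than two employees earning $X/2 each.

```python
# Hypothetical progressive brackets: (upper_limit, marginal_rate).
# 0% up to 20k, 20% from 20k to 50k, 40% above 50k.
BRACKETS = [(20_000, 0.0), (50_000, 0.20), (float("inf"), 0.40)]

def income_tax(income: float) -> float:
    """Tax owed under the hypothetical progressive schedule above."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income > lower:
            # Tax only the slice of income that falls inside this bracket.
            tax += (min(income, upper) - lower) * rate
        lower = upper
    return tax

one_earner = income_tax(100_000)       # 30k * 0.20 + 50k * 0.40 = 26,000
two_earners = 2 * income_tax(50_000)   # 2 * (30k * 0.20)        = 12,000
# The single high earner yields more than double the tax of two half earners,
# which is the 'punishment' for concentrating pay in skilled labour.
```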

drivebyhooting•53m ago
The trick is bypassing the human consumer as well. Companies satisfy (human) consumer needs as a byproduct of profit maximization. But human consumers are inefficient. They have to sleep, require medical care, etc.

A purely machine economy would be far more efficient. Therefore in the limit we should eliminate reliance on human labor and consumption to build a more perfect and efficient world.

paulpauper•49m ago
The "economy on a chip" thought experiment .
samrus•41m ago
The humans consume to fulfill needs; how do those needs get fulfilled in a post-human economy?
exitb•34m ago
Having large amounts of people with unfulfilled needs is not exactly a novel idea.
ithkuil•33m ago
Just train machines on the huge corpus of human needs so they can need things like no human has needed things before.

What can possibly go wrong

tonyedgecombe•25m ago
> The humans consume to fulfill needs

That's not how the capital class thinks of human consumption.

giacomoforte•21m ago
You jest, but isn't this the logical conclusion? A sufficiently smart AGI has no need for humanity, at all.
leokennis•20m ago
Idea! Maybe these now redundant humans can be turned into a kind of battery, so they serve as a source of energy for the machines?

Perhaps it's then smart to give the humans a brain/computer interface, to make them dream/think they are living in a normal society so they don't revolt.

slopinthebag•52m ago
Still looking for the AI in the room. Where is it exactly? Surely nobody is claiming LLMs are AI?
querez•50m ago
AI doesn't need to be AGI to be useful. Surely, you've tried Claude Code before and found it more helpful than Clippy?
IshKebab•43m ago
Of course it's AI. It's not AGI yet.
nurettin•41m ago
While technically inaccurate, there is so much talk around it that when someone says AI everyone assumes LLM at this point.
xyzal•50m ago
I use AIs for coding with moderate success, but the more I work with them, the more I am convinced that "intelligence on tap" is a pipe dream, especially in domains where logical thinking in novel (ie not-in-dataset) contexts is required.

Recently, I tasked it with studying a new Czech building-permit law in conjunction with some waste-disposal regulations, and the result was just tragic. The model (Opus 4.6) just could not stop drawing conclusions from obsolete regulations in its training dataset, even when given the full text of the new law. The usual "you are totally right" also applied, and its conclusions were, most of the time, obviously wrong even to a human with cursory knowledge of the subject.

I ended up studying the relevant regulations myself over the weekend.

lukan•47m ago
"The model (opus 4.6) just could not stop drawing conclusions from obsolete regulations in its training dataset"

To be fair, humans are also often like this. If some rule/law/model was deeply ingrained into them, they often cannot stop thinking in terms of that rule, even if they are clearly in a new context (like a new country).

xyzal•43m ago
When the mandatory speed limit in my country was reduced from 60km/h to 50km/h in cities, 95 percent of people instantly adapted.
lukan•18m ago
But that is pretty much the same rule, just the numbers slightly adjusted. What do you think would happen if they changed traffic from the right to the left lane?
xyzal•12m ago
Heh, that would surely be funny :) But most people at least know there is a new permit law, and if they are not sure, they know to seek expert guidance. The model, even with explicit notification, is unable to reflect upon this fact. How is it supposed to be useful then?
lukan•5m ago
Oh, most people would know in theory for sure, but once they start driving, habit would kick in and they'd end up on the wrong lane pretty quickly.

At least that is what happened to me in Australia. I only had a year of driving practice back then, but driving on the right side was already deeply ingrained and I had to be really aware of what I was doing.

But to be clear, I am not arguing models have real understanding of anything - I know they don't. My point was that humans can be similar in pretending to have understood something; if their core model is different, they will fall back into old patterns quickly.

ithkuil•30m ago
I wonder what percentage of the job space truly depends on the current edge we have over machines.

I think it's reasonable to worry that way before machines are more reliable than the average human (let alone more reliable than a highly trained human) they can pose a significant disruption to the job market which will send shockwaves throughout society

isoprophlex•43m ago
So... the solution is basically "pay tax on the demand that you're destroying".

We can all hate on the premise (ai is good enough to do this) and/or the solution presented (centrally enforced taxation), but you gotta admit:

the messaging from SV's AI leaders about how "ai will take all your jobs" is confused as fuck, because if so, who will be on the consuming end of things?

ithkuil•12m ago
If such a tax introduced an asymmetry favouring human employment, then there should be enough buying power left to create some demand.

palmotea•10m ago
> the messaging from SV's AI leaders about how "ai will take all your jobs" is confused as fuck, because if so, who will be on the consuming end of things?

Maybe SV's AI leaders and other assorted trillionaires. A capitalist economy that drops any pretense of serving the needs of anyone except a tiny elite.

khalic•38m ago
While I agree with the general sentiment that this requires monitoring and study, the abstract is _very_ tendentious: it states multiple hypotheses as facts and doesn't provide any measurements or alternatives to the authors' preferred solution.

This isn't a scientific study, it's a militant manifesto.

bwhiting2356•37m ago
If robotics progress starts to pick up, I'll take this more seriously. Right now, there's practically infinite demand for labor in construction, manufacturing, agriculture and many other industries. All kinds of good projects could be happening; if you dig into why they aren't, labor-intensive work is a factor. Why didn't the hydroponics project take off? Why is that still an empty lot instead of a new home? Why isn't there live theatre in this small city? Why is there a pothole in the bike lane?

Sharlin•35m ago
Infinite demand, maybe, but not at wages that most people are willing to accept. Of course, if there's literally no other work, then previously-middle-class people will take what's available and become homeless because the wage doesn't pay the bills (which are, in places, extremely inflated due to decades of jaw-droppingly bad housing and transport policies). Sounds like a highly desirable future.
eru•33m ago
Yes, but thanks to Baumol's cost disease productivity increases in other sectors can have spillover effects in terms of wages.
tonyedgecombe•31m ago
That assumes the current high wages are here to stay. This seems unlikely if AI consumes most white collar jobs.
bananaflag•34m ago
Yeah, I am always disappointed by how little is automated in construction and how slow humans are at this activity. It feels like an exclave of the Dark Ages in the Information Age.
tonyedgecombe•30m ago
Construction is interesting because productivity has actually fallen in recent decades.
bwhiting2356•2m ago
Safety and quality have increased.
seanmcdirmid•31m ago
Isn’t this more a function of how the American construction market is just really messed up somehow (corruption?)? In China, actual things get built fairly cheaply and quickly. You just don’t see workers hanging around watching one guy dig a hole like you do in the states. I would guess that automation is the only way out of the mess we are in, since just throwing more money and people at the problem just seems to make it worse.
thierrydamiba•29m ago
What’s different about the market in China that enables this?
teiferer•24m ago
Could you expand on the corruption claim?
debatem1•28m ago
So you work in one of these fields, right? Hydroponics, homebuilding, theatre construction, pothole repair?
bwhiting2356•4m ago
I currently work as a software engineer, but I've worked in the past in restaurants (dishwasher/prep cook), doordashing, as a musician, as moving help. If AI automates software I'll just do something else.
enraged_camel•27m ago
>> If robotics progress starts to pick up, I'll take this more seriously. Right now, there's practically infinite demand for labor in construction, manufacturing, agriculture and many other industries.

Don't be so sure: https://www.nytimes.com/2026/04/08/business/economy/blue-col...

efitz•22m ago
AI layoffs are very shortsighted IMO and should be viewed by investors as a sign of weakness in management or the business itself.

If everyone is going to increase productivity by some factor k per employee, then kx is the new norm of overall productivity for a firm of x employees.

If you lay off some percentage y of your work force, then your expected output will only be k·x·(100−y)/100. In other words, you will not realize the same productivity gains as your competitors that chose not to lay off.

Yes, I realize it is more complex than that because of reduced opex, but the diminishing returns kick in very quickly.
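The arithmetic in the comment above can be sketched directly. The variables k, x, y are the commenter's; the specific numbers are illustrative only.

```python
def post_layoff_productivity(k: float, x: int, y: float) -> float:
    """Total firm output after a y-percent layoff, with per-employee gain factor k."""
    return k * x * (100 - y) / 100

# Illustrative numbers: per-employee gain k = 1.5, a 100-person firm, 20% laid off.
competitor = post_layoff_productivity(1.5, 100, 0)    # keeps everyone -> 150.0
our_firm = post_layoff_productivity(1.5, 100, 20)     # cuts 20%      -> 120.0
# The firm that kept its staff ends up 25% more productive than the one that cut,
# before accounting for the opex savings the comment concedes.
```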

yobbo•15m ago
Start by shifting taxation from worker incomes to corporate incomes?