frontpage.

My Gripes with Next.js

https://www.maartenhus.nl/blog/my-gripes-with-next-js/
1•MrHus•36s ago•1 comment

What Declarative Languages Are

https://semantic-domain.blogspot.com/2013/07/what-declarative-languages-are.html
1•iamwil•41s ago•0 comments

Tesla proposes $1T award for Musk if he hits targets

https://www.bbc.com/news/articles/cdx29qv4nvvo
1•onemoresoop•57s ago•0 comments

I used an AI triage bot to close 85 GitHub issues in a weekend

https://bagerbach.com/blog/using-ai-agents-to-do-months-of-work-in-a-weekend/
1•bagerbach•2m ago•0 comments

Post Cognitive Pied Piper

https://cognitivefish.substack.com/p/post-cognitive-pied-piper
1•RealRedNinja•3m ago•0 comments

OpenAI announces two "GPT-OSS" open AI models, and you can download them today

https://arstechnica.com/ai/2025/08/openai-releases-its-first-open-source-models-since-2019/
1•PaulHoule•3m ago•0 comments

Show HN: The AI Alternative to Bloomberg Which Banks and Firms Are Using

https://useallmind.ai
1•AllaTurca•4m ago•0 comments

Banana Straightener: AI image generation that gets it right

https://velvetshark.com/banana-straightener-ai-image-generator-that-works
1•LorenDB•5m ago•0 comments

ATC/OSDI '25 Joint Keynote: Accelerating Software Dev: The LLM (R)Evolution [video]

https://www.youtube.com/watch?v=dk3y3o3vhhU
1•zdw•5m ago•0 comments

When to Hire a Computer Performance Engineering Team

https://www.brendangregg.com/blog/2025-08-04/when-to-hire-a-computer-performance-engineering-team...
1•yread•5m ago•0 comments

Use singular nouns for database table names

https://www.teamten.com/lawrence/programming/use-singular-nouns-for-database-table-names.html
1•Bogdanp•5m ago•0 comments

Building Technical Vision (2022)

https://yusufaytas.com/building-a-technical-vision/
2•jatwork•6m ago•0 comments

RRAM and the AI Hardware Revolution

https://thepotentialsurface.substack.com/p/rram-and-the-ai-hardware-revolution
2•Annabella_W•7m ago•0 comments

Starless: We accidentally vanished our most popular GitHub repos

https://www.elastic.co/blog/starless-github-repos
2•jamietanna•7m ago•0 comments

Acoular – Acoustic Testing and Source Mapping Software

https://www.acoular.org
1•tomsonj•7m ago•0 comments

Are automated retries of specs a good pattern?

https://automationpanda.com/2021/06/14/are-automated-test-retries-good-or-bad/
1•lackoftactics•7m ago•0 comments

Trying to break free from online addiction

https://gobino.be/trying-to-break-free-from-online-addiction/
1•speckx•8m ago•0 comments

Criticism in the Age of AI

https://countercraft.substack.com/p/criticism-in-the-age-of-ai
1•crescit_eundo•12m ago•0 comments

OpenArchiver: Open-source platform for email archiving

https://github.com/LogicLabs-OU/OpenArchiver
3•thunderbong•14m ago•0 comments

1TB Raspberry Pi SSD on sale now for $70

https://www.raspberrypi.com/news/1tb-raspberry-pi-ssd-on-sale-now-for-70/
3•sohkamyung•14m ago•0 comments

TIL: Mastodon Has Lists

https://fedi.tips/how-to-use-the-lists-feature-on-mastodon/
2•laktak•14m ago•0 comments

OntoMotoOS – A Meta-Operating System Framework for AI Governance

2•nettalk83•15m ago•0 comments

Mark Zuckerberg Sues Mark Zuckerberg

https://techcrunch.com/2025/09/04/mark-zuckerberg-sues-mark-zuckerberg/
4•harrisreynolds•17m ago•0 comments

The false promise of WiFi 7 on iPhone 16 models

https://techloot.co.uk/ios/iphone-16-promises-blazing-fast-wifi-7-speeds-but-a-hidden-160-mhz-lim...
2•yrcyrc•17m ago•2 comments

US sanctions Palestinian groups who asked for Israel war crimes

https://www.cnn.com/2025/09/04/middleeast/trump-rubio-israel-palestinian-sanctions-hnk-intl
4•NomDePlum•18m ago•0 comments

Faster Rust Builds on Mac

https://nnethercote.github.io/2025/09/04/faster-rust-builds-on-mac.html
3•mkj•18m ago•0 comments

What the splinternet means for big tech. Unpleasant new trade-offs, for starters

https://www.economist.com/business/2025/09/04/what-the-splinternet-means-for-big-tech
1•bookofjoe•19m ago•1 comment

Elon Musk could become first trillionaire under new Tesla pay deal

https://www.independent.co.uk/news/business/elon-musk-tesla-pay-package-trillion-salary-b2820903....
2•doctaj•23m ago•0 comments

Strategies for Securing Non-Human Identities

https://www.cerbos.dev/blog/strategies-for-securing-non-human-identities
1•GarethX•23m ago•0 comments

What the panic about kids using AI to cheat gets wrong

https://www.vox.com/technology/458875/ai-cheating-data-education-panic
1•Wowfunhappy•25m ago•0 comments

OpenAI eats jobs, then offers to help you find a new one at Walmart

https://www.theregister.com/2025/09/05/openai_jobs_board/
127•rntn•2h ago

Comments

amelius•1h ago
For full irony, they take your data, then use that data to replace your job.
apwell23•1h ago
they don't even replace any jobs, they suck out all the money that was previously going to employ people doing useful things.

I feel like we need a new word for money going to datacenters instead of paychecks. 'AI taking jobs' implies AI is doing the work, which is not the case.

milkshakes•1h ago
what does this even mean?
apwell23•1h ago
what part is not clear to you?
trollbridge•1h ago
Essentially displacing other jobs into power generation / energy extraction (something that, generally, we don’t want more of, since very little data centre energy is green) and into huge investments in mass-produced, AI-capable servers which become obsolete rapidly.
muskyFelon•1h ago
It also seems like a lot of these layoffs due to AI are just regular layoffs.

Let's reduce headcount and spin it as AI disruption! That way we don't have to acknowledge we overhired during COVID AND our stock price will go to the moon, as they say.

apwell23•57m ago
Yes. I see this with my own employer.

Crazy how these CEOs are so brazenly and openly committing fraud. The market and investors are playing along because the stock price is going up. The board doesn't give a fuck.

USA is one giant casino right now.

smt88•1h ago
It's not irony so much as it is brazenly criminal. No one should be in a position of training someone else's AI to replace their job function without consenting to it and without being compensated.

The most valuable thing AI can do right now is write code, and it couldn't do that without thousands of StackOverflow volunteers.

mlnj•1h ago
No more laws to be enacted on AI in the USA for 10 years, thanks to the billionaires. Pouring money into the elections has been a great ROI for the ultra-wealthy.
nickthegreek•1h ago
that was removed from the final legislation.
toader•37m ago
Wouldn't the language/framework documentation be sufficient?
tjr•11m ago
At what point in the system?

If you trained an LLM on (say) everything on the internet except for programming, and then trained it on the Python.org online documentation, would that be enough for it to start programming in Python? I've not tried, but I get the impression that it needs to see lots and lots of examples first.

SwtCyber•8m ago
Full irony mode unlocked: they mine your data for free, feed it into a black box, then pop out a model that can do your job
milkshakes•1h ago
https://openai.com/index/expanding-economic-opportunity-with...
x187463•1h ago
How many jobs have they actually 'eaten'? For now, we mostly have AI labs and those in proximity claiming some number of jobs will be obsolete, but the actual job loss has yet to manifest.
dfxm12•1h ago
Salesforce CEO confirms 4,000 layoffs ‘because I need less heads’ with AI - https://www.cnbc.com/2025/09/02/salesforce-ceo-confirms-4000...
lernedsomecode•1h ago
this stuff is all a cover from CEOs. We're in the middle of a downturn and they're pretending layoffs like this are due to AI so as not to spook investors.
halfmatthalfcat•1h ago
You have to trust they are actually telling the truth and not using it as a convenient scapegoat. What sounds better to shareholders: “we’re replacing jobs with AI” or “we hired too many during the COVID hiring glut and need to lay more people off”?
mattacular•1h ago
Does anybody really believe that Salesforce, of all companies, has successfully replaced 4,000 real and necessary jobs with AI? Or is that more likely just an excuse to justify more layoffs in the tech industry for the usual reasons?
api•1h ago
Lots of companies are using AI or RTO as excuses to just downsize, since layoffs for normal reasons don’t look as good.
adabyron•1h ago
I really wish journalists & investors who speak publicly would call this out more.

Though, like non-GAAP earnings & adjusted EBITDA, very few care. Those that do are often the old, technical, conservative & silent type of investor instead of podcasters or CNBC guests. RIP Charlie M.

eastbound•1h ago
Very probably those CEOs use AI to write the speech, asking “What’s the least antagonizing way of explaining the layoffs?”
nilkn•1h ago
There's no doubt it can function as a convenient cover, but that doesn't mean it's having no effect at all. It would be naive to assume that the introduction of a fundamentally new general-purpose work tool across millions of workers in nearly every industry and company within the span of a couple years has not played any role whatsoever in making teams and organizations more efficient in terms of headcount.
techpineapple•1h ago
They did say this was specifically in customer service, which, if there's one department where I believe you might be replaced by AI, would be it.

Alternatively, though, if the market is bad and they're not launching as many new products or appealing to as many new customers, customer support may be a cost center you’d force to have “AI efficiencies”.

dathinab•1h ago
tbh. for companies like SalesForce I always assume they have a lot of bloated unnecessary jobs which are done well enough by the people having them to needing a external reason to firing them

in addition SalesForce grew in employment size in 2025 AFIK and 4000 jobs are for them only around ~~5%, which means it's to small to be a meaningful metric if you don't fully trust what their press department does (and you shouldn't)

still I see people using modern AI for small productivity boosts all over the place including private live (and often with a wastely underestimate risk assessment) so in the best case it's only good enough to let people process more of the backlog (which otherwise would be discarded due to time pressure but isn't worthless) and in the worst case will lead to idk. 1/3 of people in many areas losing their job. But that is _without_ major breakthrough in AI, just based one better applying what AI already can do now :/ (and is excluding mostly physical jobs, but it's worse for some other jobs, like low skill graphic design positions)

threetonesun•48m ago
During the just-post-pandemic hiring spree I remember talking to some software developers who were doing very light coding in what I would usually think was a business analyst role. Those roles were both bloat that was lost once free money stopped flowing, and easily replaced (or reduced) with AI.

And as software developers, it would be silly if we didn't think that businesses would love to find a way to replace us, as the software we have created has done for other roles over the past 60 years.

adabyron•1h ago
Those 4,000 were "customer support" positions & Salesforce just happens to also sell an AI product for customer support. They also underperformed expectations in their last earnings.

Companies like IBM & Klarna have made news for reducing positions like these & then re-hiring them.

AI, like most tech, will increase productivity & reduce headcount but it's not there yet. Remember, the days are long & the years are short.

mikelitoris•1h ago
I am sure he has had plenty of "heads" as a bigwig CEO. Also, it's "fewer" heads, not "less".
ofjcihen•1h ago
Salesforce's CEO needs to consider replacing them with security product architects so they can figure out a way to send me logs that aren’t crap
SoftTalker•1h ago
Wait until his customers figure out that they don't need Salesforce anymore.
Ianjit•1h ago
CRM needs to convince the markets that it isn't an AI loser. By saying it has been able to use AI to automate internally, CRM is hoping the market will believe that its AI is good enough to also sell to customers. Unfortunately for CRM, after the earnings print the market still thought it was an AI loser.
Spooky23•1h ago
It’s going to ramp. I have a team where 60% of the staff (6/10) is retiring in a year.

Their function is around reconciling utilization and bills from multiple related suppliers with different internal stakeholders. They do a bunch of analysis and work with the internal stakeholders to optimize or migrate spend. It is high ROI for us, and the issue is both finding people with the right analytical and presentation skills and managing the toil of the “heavy” work.

Basically, we’re able to train interns to do 80% of the processing work with LLM tooling. So we’re doing to promote two of the existing staff and replace 2/6 vacancies with entry level new grads, and use the unit to recruit talent and cycle them through.

In terms of order of magnitude, we’ll save about $500k in payroll, spend $50k in services, and get same or better outcomes.

Another example is we gave an L1 service desk manager Gemini and made him watch a YouTube video about statistics. He’s using it to analyze call statistics and understand how his business works without a lot of math knowledge. For example, he looked at the times when the desk was at 95th-percentile call volume and identified a few ways to time-shift certain things to avoid the need for more agents or reduce overall wait times (a rough sketch of that kind of analysis is at the end of this comment). All stuff that would require expensive talent and some sort of data analysis software… which frankly probably wouldn’t have been purchased.

That's the real AI story. Stupid business people are just firing people. The real magic is using the tools to make smart people smarter. If you work for a big company, giving Gemini or ChatGPT to motivated contracts and procurement teams would literally print money for you due to the stuff your folks are missing.
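
To make the call-volume example concrete, here is a minimal sketch of that kind of percentile analysis in Python with pandas. Everything in it is assumed for illustration: the call_log.csv file, its "timestamp" column, and the hourly grouping are hypothetical, not details from the actual service desk.

    # Hypothetical sketch: find the hours where call volume sits at or above the
    # 95th percentile, i.e. the peaks worth time-shifting work around.
    import pandas as pd

    # Assumed input: one row per call, with a parseable "timestamp" column.
    calls = pd.read_csv("call_log.csv", parse_dates=["timestamp"])

    # Count calls per hour.
    calls["hour"] = calls["timestamp"].dt.floor("h")
    hourly = calls.groupby("hour").size().rename("volume").reset_index()

    # Keep only the hours at or above the 95th percentile of hourly volume.
    threshold = hourly["volume"].quantile(0.95)
    peaks = hourly[hourly["volume"] >= threshold].copy()

    # Summarize which weekday / hour-of-day slots the peaks fall into.
    peaks["weekday"] = peaks["hour"].dt.day_name()
    peaks["hour_of_day"] = peaks["hour"].dt.hour
    print(peaks.groupby(["weekday", "hour_of_day"]).size()
               .sort_values(ascending=False).head(10))

The same question could be put to a spreadsheet or to Gemini directly; the point is only that the underlying analysis is a percentile cut over hourly counts, which is exactly the kind of toil the tooling takes off people's plates.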

adabyron•1h ago
This is a great way of looking at it.

Do you use tech to grow your business or increase dividends?

Also, reducing staff via attrition shows far better management skills than layoffs, which imo say more about the CEO & upper management.

conartist6•51m ago
Except that we know that even if you can't put your finger on what it is right now, there's something in the way of "ChatGPT just literally prints money" being at all realistic. It's pretty obvious in the markets if anyone has figured out a repeatable strategy for printing money.

This is to say that we know from looking at outcomes over the long term that the kinds of concrete gains you're describing are offset by subtler kinds of losses which most likely you would struggle to describe as decimal numbers but which are equally real in their impact on your business.

LargeWu•12m ago
If LLM-based tools were such a money printing machine with a big ROI, why would you ever want to downsize? Keep redirecting that excess human capacity into building bigger and better capabilities to generate more revenue. Hire cheap grads to do the work.

Except it seems like the opposite is happening. CS grads have high unemployment. Companies are laying off staff.

The rhetoric doesn't seem to add up to the reality.

api•1h ago
The doomer stuff about AI is another kind of hype. We are at the top of the bubble hype cycle.
malfist•1h ago
Amazon won't hire junior engineers because of AI unless they come directly from a college pipeline, and even those jobs have shrunk significantly in number. Even opening an L4 (junior) role requires VP approval these days.
nilkn•1h ago
It's rarely (maybe never) a direct one-to-one elimination of jobs. Most attempts to replace a single complete human job with an AI agent are not successful (and large companies, generally, aren't even attempting that). Rather, the phenomenon is more like a diffuse productivity gain across a large team or organization that results in a smaller headcount need over time as its final net effect. In practice, this materializes as backfills not approved as natural attrition occurs, hiring pipelines thinned out with existing team member workloads increased, management layers pruned, teams merged (then streamlined), etc.
torginus•49m ago
I mean one of the services at work had a custom HTML dashboard. We eliminated it and replaced it with Grafana.

I worked on both - my skillset went from coding pretty bar charts in SVG + JavaScript to configuring Grafana, Dockerfiles, and Terraform templates.

There's very little overlap between the two, other than general geekiness, but thankfully I'm still doing OK.

patrick451•16m ago
Seems like a bad decision. Grafana is awful compared to a bespoke solution.
LeifCarrotson•37m ago
When my father joined an attorney's office as recently as the '80s, there was a whole team of people he worked with: Of course, there were the attorneys who actually argued the cases, but also legal assistants who helped create various briefs, secretaries who took dictation (with a typewriter, of course) to write those documents, receptionists who managed the attorneys' schedules and handled memos and mail, filing clerks who helped keep and retrieve the many documents the office generated and demanded in a giant filing room (my favorite part of visiting as a kid: row after row of rolling shelves, with crank handles to put the walkways where they were needed to access a particular file), and librarians who managed a huge card catalog and collection of legal books and acquired those that were not in the collection as needed... it was not a small team.

When he retired a few years ago, most of that was gone. The attorneys and paralegals were still required, there was a single receptionist for the whole office (who also did accounting) instead of about one for each attorney, and they'd added an IT person... but between Outlook and Microsoft Word and LexisNexis and the fileserver, all of those jobs working with paper were basically gone. They managed their own schedules (in digital Outlook calendars, of course), answered their own (cellular) phones, searched for documents with the computers, digitally typeset their own documents, and so on.

I'm an engineer working in industrial automation, and see the same thing: the expensive part of the cell isn't the $250k CNC or the $50k 6-axis robots or the $1M custom integration, those can be amortized and depreciated over a couple years, it's the ongoing costs of salaries and benefits for the dozen humans who are working in that zone. If you can build a bowl screw feeder and torque driver so that instead of operating an impact driver to put each individual screw in each individual part, you simply dump a box of screws in the hopper once an hour, and do that for most of the tasks... you can turn a 12-person work area into a machine that a single person can start, tune, load, unload, and clean.

The same sort of thing is going to happen - in our lifetimes - to all kinds of jobs.

dathinab•1h ago
entry "throw away, no skill" positions for programming are also kinda going away, or at least get collapsed to a small fraction of positions

to be fair this positions never made that much sense as they tend to cause more trouble then they are helping on the long run, but they exist anyway

and companies should know better then throwing away "junior, not yet skilled, but learning" positions (but then many small startups also are not used to/have the resources to teach juniors, which is a huge issue in the industry IMHO)

but I imagine for many of the huge "we mainly hire from universities"/FANG companies it will turn into a "we need only senior engineers and hire juniors only to grow our own senior engineers", this means the moment to you growth takes too long/stagnates by whatever _arbitrary_ metric you get kicked out fast. And like with their hiring process they have the resources, scale, and number of people who want to work for them to be able to really use some arbitrary imperfect effective discriminatory metrics.

Another aspect is that a lot of the day to day work of software engineering is really dump simple churn, and AI has the potential to massively cut down the time a developer needs to do that, so less developers needed especially in mid to low skill positions.

Now the only luck devs have is that there is basically always more work which was cut due to time pressure but often isn't even supper low priority, so getting things done faster might luckily not map one to one to less jobs being available.

EthanHeilman•1h ago
In my personal experience I've seen:

- OCR eat a good chunk of data entry jobs,

- Automated translation eat a number of translation jobs,

- LLMs eat quite a few tier I support roles.

I don't have numbers though, maybe people are still doing data entry or hiring translators on Mechanical Turk.

torginus•40m ago
I have a friend who used to do book translations. Due to some craftsman-union rules, a minimum rate for translations was set (and of course that's what everybody pays). Machine translation didn't decrease these rates, but they haven't been increased in 15 years, which has let inflation completely eat them up.

Initially, machine translation was way worse (by professional standards) than people assumed, essentially useless; you had to rewrite everything.

As time went on and translation got better, the workflow shifted from doing it yourself to doing a machine pass and rewriting it to be good enough. (Machine translation today is still just 'okay', not professional quality.)

On the rates set 15 years ago, you could eke out a decent-ish salary (good, even, if you worked lots of hours and were fast). Today, if you tried to do the work by hand, you'd starve to death.

jdietrich•27m ago
Also illustrators, voiceover artists and customer service agents. Commercial photographers have seen their income from stock image services collapse and they are now seriously worried about the impact Nano Banana will have on work like product and fashion photography.

The question is no longer whether AI will put people out of work, but how many and how quickly.

Springtime•1h ago
I mean, I keep web pages about cases that previously used a human and have now been replaced with neural network models. Some of which include:

- Translation. See: Gizmodo firing its Spanish translation team and switching exclusively to LLMs.

- Editors. See Microsoft replacing their news editors at MSN with LLMs.

- Customer service. Various examples around the world.

- Article graphics for publications. See: The San Francisco Standard (which used it for various articles for a period), Bleepingcomputer.com, Hackaday (selectively, before some pushback).

- Voice acting. The game The Finals used synthetic voices for its announcers.

potato3732842•1h ago
I think a lot of clerical tasks where people cross defined things against other defined things, which could be somewhat automated before only at significant expense, can now be somewhat automated cheaply enough to be worth doing.

So jobs being killed by AI are basically being killed the same way that office number-crunching technology killed administrative assistant positions and put those tasks onto other people.

Take for example a purchasing department at a big company. Some project needs widgets. Someone crosses the specs against what their suppliers make, then takes the result of that and makes a judgement call. AI replaces that initial step, so a team of N can now do the work that formerly took a team of N + Y. Bespoke software could have replaced that step too, but it would have been more expensive, less flexible, etc., since there's all this work required to turn human-facing content into machine-parsable content, including the user's input, and the juice simply wasn't worth the squeeze. With AI doing all that drudgery on an as-needed basis, the juice now is worth the squeeze in some applications.

SwtCyber•10m ago
We've definitely seen impact in areas like copywriting, customer service, and basic coding tasks, but it’s more nibbling at the edges than wholesale devouring
101008•1h ago
Why does a company like OpenAI, which is so big and has a clear, challenging goal, care about online academies, etc.? What's the % they can get from it? Is it even worth it? They should focus all their minds on their (demanding) main goal, but it seems they are distracted with these stupid things.

This sounds so weird to me, and I feel I am missing something.

splatzone•1h ago
It feels like a proactive PR move to me. When the shit hits the fan in a few years and jobs are vanishing, they can point to this as one of the many examples of how they're fighting the good fight for humanity
empath75•1h ago
Probably because they have thrown as much money and manpower as they possibly can at the core problem, and investing more into it yields diminishing returns. Building out a platform is the logical next step, and there are probably hundreds of revenue-positive businesses they can build on top of it by just throwing some software developers at the problem. They are going to be releasing a _lot_ of products over the next few years that aren't just "GPT 6" or whatever.

You might as well ask why google built an ad company or email or video, or a browser or a phone OS etc, when they should have spent more money on their core search engine.

keiferski•1h ago
IIRC this is a background premise in William Gibson’s The Peripheral. Most jobs are either for a tech company, or at a Walmart-esque store that has eaten all other retail.
A_D_E_P_T•1h ago
That book was weirdly prescient. IIRC Gibson himself noted that he was too on-the-nose with The Jackpot, which has made it very difficult to write the sequels.

Hyperstition is a real thing... if, that is, you're William Gibson.

nicce•27m ago
I haven’t read the book, and I should.

The series was really good. Too bad they cancelled it for being too expensive.

an0malous•11m ago
People don’t cancel things because they’re expensive; costs are a known and predictable quantity. They cancel things because they don’t make enough to justify the costs.
apocalyptic0n3•3m ago
In this case, it actually was due to unknown and unpredictable costs. They were about to start production on S2 when the strikes hit. The delays spiked the costs of an already expensive show (S1 cost an estimated $175M), which made it less tenable for Amazon. This happened to quite a few shows during the pandemic and during the strikes – renewed because it made financial sense, then the pandemic/strike spiked the cost, and then they canceled the renewal.
keiferski•6m ago
I liked the first half of the series, but it gradually became a generic superhero series.

The book, however, is excellent. Definitely recommend.

colesantiago•1h ago
Nice tongue-in-cheek from the Register and OpenAI to tickle the 'doomer' narrative.

But the reality is that there will be new high-skilled jobs from AI thanks to Jevons' paradox: more companies using AI will lead to a huge demand for highly skilled people who can use AI in more ways than we do today.

It's not so much about being replaced; there will be new jobs for people to do.

I guess for those people being 'replaced' it is a 'skill issue'

esafak•12m ago
What kind of new jobs?
ryanackley•1h ago
There is something kafkaesque about these giant tech companies restricting what you can talk to the AI about in the name of ethics while at the same time openly planning to replace you in the workforce.
muldvarp•1h ago
Is there? I don't see any contradiction there.

For me it's funny that the first time most programmers ever think about the ethics of automating away jobs is when it's their own jobs being automated.

scott_w•1h ago
He didn't say "contradiction," he said "kafkaesque," meaning "characteristic or reminiscent of the oppressive or nightmarish qualities of Franz Kafka's fictional world" (according to Google).
d_sem•56m ago
I'm being facetious, but life in the Rust Belt after industrial automation is kinda close. Google Maps a random Detroit east-side neighborhood to see what I mean.
capyba•29m ago
But it wasn’t industrial automation that ruined Detroit. It was the automakers’ failure to compete with highly capable foreign competition.
dgfitz•23m ago
> It was the automakers’ failure to compete with highly capable foreign competition.

I contend it was when Dodge won the court case deciding that shareholders were more important than employees. It’s been a slow burn ever since.

muldvarp•54m ago
I don't see why it would be "kafkaesque" either.

In fact I fail to see any connection between those two facts other than that both are decisions by OpenAI to allow or not allow something to happen.

MSFT_Edging•41m ago
It's oppressive and nightmarish because we are at the mercy of large conglomerates tracking every move we make and kicking our livelihoods out from under us, while also censoring AI to make it more amenable to pro-corporate speech.

Imagine if ChatGPT gave "do a luigi" as a solution to Walmart tracking your face, gait, device fingerprints, location, and payment details, then offering that data to local police forces for the grand panopticon to use for parallel reconstruction.

It would be unimaginable. That's because the only way for someone to be in the position to determine what is censored in the chat window, would be for them to be completely on the side of the data panopticon.

There is no world where technology can empower the average user more than those who came in with means.

jajko•26m ago
Yeah, but what we are all whining about here (apart from the folks working on LLMs and/or holding a lot of stock in them, a non-trivial and vocal group here) has already hit many other jobs in the past. Very often thanks to our own work.

It is funny, in the worst way possible of course, that even our chairs are not as stable as we thought they were. Even automation can somehow be automated away.

Remember all those posts stating how software engineering is harder, more unique, somehow more special than other engineering, or other types of jobs generally? Seems like it's time for some re-evaluation of those big-ego statements... but maybe it's just me.

gjsman-1000•17m ago
There are two kinds of programmers:

0. The people who got into it just as a job

1. The people who thought they could do it as art

And #1 is getting thrashed and thrown out the window by the advent of AI coding tools, and by the revelation that companies didn’t give a darn about their art. Same with AI art tools and real artists. It even raises the question of whether programming should ever have been viewed as an art form.

On that note, programmers collectively have never minded writing code that oppresses other people. Whether with constant distractions in Windows 11, building unnecessarily deadly weapons at Northrop Grumman, or automating away the livelihoods of millions in “inferior” jobs. That was even a trend, “disrupting” traditional industries (with no regard for what happens to those employed in said traditional industry). Nice to see the shoe just a little on the other foot.

For many of you here, keep in mind your big salary came from disrupting and destroying other people’s salaries. Sleep well tonight and don’t complain when it’s your turn.

MSFT_Edging•2m ago
What's sad is engineering is very much an art. Great innovation comes from the artistic view of engineering and creation.

The thing is, there's no innovation in the "track everything that breathes and sell the data to advertisers and cops" market.

They might get better at the data collection and introspection, but we as a society have gotten nothing but streamlined spyware and mental illness from these markets.

myko•7m ago
Having used agentic AI (Claude Code, Gemini CLI) and other LLM-based tools quite a bit for development work, I just don't see them replacing developers anytime soon. Sure, a lot of my job now is cleaning up code created by these tools, but they are not building usable systems without a lot of developer oversight. I think they'll create more software developer roles and specialties.
MSFT_Edging•5m ago
> Yeah but what we are all whining here about has hit many other jobs already in the past.

I'm less talking about automation and more about the underpinnings of the automation and the consequences in greater society. Not just the effects it has on poor ole software engineers.

It is quite ironic to see the automation hit engineers, who in the past generally did not care about the consequences of their work, particularly in data spaces. We have all collectively found ourselves in a local minimum of optimization, where the most profitable thing we can do is collect as much data on people as possible and continually trade it back and forth between parties who have proven they have no business holding said data.

torium•26m ago
If you don't see why this is oppressive, that's really a _you_ problem.
ryanackley•52m ago
It's not a comment on the ethics of replacing jobs but on the hypocrisy of companies using "ethics" as the reasoning for restricting content.

They are pursuing profits. Their ethical focus is essentially a form of theater.

muldvarp•46m ago
Then why are you even talking about the replacement of jobs?
ryanackley•31m ago
I've already explained it. I don't know how to break it down any further without coming off as patronizing. You seem dead-set on defending OpenAI and not getting the point.
BeetleB•33m ago
Replacing jobs is not an ethical issue.

Automation and technology have been replacing jobs for well over a century, almost always with better outcomes for society. If it were an ethical issue, then it would be unethical not to do it.

In any case, which jobs have been replaced by LLMs? Most of the actual ones I know of were BS jobs to begin with - jobs I wish had never existed. The rest are cases where CEOs are simply using AI as an excuse to execute layoffs (i.e. the work isn't actually being done by an LLM).

bugglebeetle•29m ago
Ask ChatGPT to explain consequentialism to you.
nicce•22m ago
Yeah, the issue is that there is no common benefit if a private company is the only one doing the replacement. Are we ready for AGI before we solve the issues of capitalism? Otherwise, society may get a harsh reset.
bgwalter•48m ago
The number of bullshit jobs has been growing since the Internet, and programmers have facilitated them by adding unnecessary complexity.

This time the purported capabilities of "AI" are a direct attack on thinking. Outsourcing thinking is creepy and turns humans into biorobots. It is different from robotic welding on an assembly line.

Even if new bullshit jobs are created, the work will just be that of a human photocopier.

[All this is written under the assumption that "AI" works, which it does not but which is the premise assumed by the persons quoted in the Register article.]

muldvarp•38m ago
I don't see how thinking about some source code is an innately more human activity than welding. Both can be done by humans, both couldn't be done by anything but humans until automation came along and now both can be done by humans and automated systems.

I also fail to see how LLMs can turn humans into "biorobots". You can still do all the things you could do before LLMs came along. The economic value of those things just decreased enormously.

bgwalter•30m ago
Then go weld. There are still some positions for humans.
ActionHank•47m ago
I think many of us question the ethics of lying to sell a product that cannot deliver what you are promising.

All the good devs that I know aren't worried about losing their jobs, they are happy there is a shortcut through boilerplate and documentation. They are also equally unhappy about having to talk management, who know very little about the world of dev, off the ledge as they are getting ready to jump off with their AI wings that will fail.

Finally, the original point was about censorship and control of information, not automating jobs.

an0malous•13m ago
While training their models on pirated and scraped content
SwtCyber•13m ago
It's like being lectured on ethical behavior by the thing that's actively eating your lunch
muldvarp•10m ago
Why do you think automating software development is any less ethical than automating other jobs (which many software developers actively engaged in)?
Printerisreal•12m ago
"Rules for thee, but not for me"
bgwalter•1h ago
"AI will be disruptive. Jobs will look different, companies will have to adapt, and all of us – from shift workers to CEOs – will have to learn how to work in new ways," she said in a blog post.

We don't have to do anything. People always listen to this propaganda from the wealthy and think the latest gadgets are inevitable.

We can go on a general strike until copyright is restored and the "AI" companies go bankrupt. Journalists can write sharper and sharper articles. You can refuse to use "AI". If mandated in a job, demonstrate that it slows you down (probably true anyway).

You can demand an investigation into Suchir Balaji's demise (actually Elon Musk recently endorsed a tweet demanding just that; you can, too).

You can boycott robotaxis. You can stop watching YouTube channels that do product placement of humanoid robots. You can do a lot of things.

j1000•46m ago
Finally some common sense here. These are exactly my thoughts: how come, in an age of instant information flow, people cannot gather together under one idea? How could the French Revolution happen without Messenger and WhatsApp? People had to walk and talk about ideas.

AI CEOs are talking like they are oracles but they just need to please stakeholders.

thoroughburro•12m ago
> how come in age of instant information flow people cannot gather together under one idea

Because instant misinformation is even faster and satisfies their biases.

muldvarp•14m ago
This won't happen. Even if Western countries outlaw LLMs (they won't; people care about unemployed software developers just as much as most software developers cared about unemployed translators), China won't. And why would we even do that? Because software developers like their cushy jobs and don't want to do blue-collar work?
bgwalter•6m ago
China FOMO? Xi Jinping has warned twice about an "AI" bubble:

https://www.ft.com/content/9c19d26f-57b3-4754-ac20-eeb627e87...

https://www.wsj.com/tech/ai/china-has-a-different-vision-for...

China restricts "AI" across the country to prevent cheating:

https://nypost.com/2025/08/19/world-news/china-restricts-ai-...

Why would we do that? Because we can! Why would we let some rich brats who have never created anything and just steal human knowledge get even richer and control and surveil us all?

thoroughburro•9m ago
> We can go on a general strike until copyright is restored and the "AI" companies go bankrupt. Journalists can write sharper and sharper articles. You can refuse to use "AI".

This sounds like naive wishful thinking. Add “theoretically” after every “can” for real life.

kordlessagain•54m ago
The amount of arrogance in technology is staggering.

Instacart IPO'd. OpenAI hit $300B valuation. Different companies, different industries—yet look closer and you'll find the same names signing both checks.

ai-christianson•53m ago
We're replacing managers with AI at our startup.
daedrdev•37m ago
Why are people saying it's AI and not the current administration running the economy into the ground?
nerpderp82•35m ago
The endgame for all AI companies is to replace all of labor. Everyone's use of AI through their chat and API endpoints is training those replacements right now.
SwtCyber•6m ago
And when the dust settles, the same companies will turn around and sell us the cure for the problem they engineered
avodonosov•28m ago
And captcha forces users to train neural networks for free, with a plan to then replace the users with those neural networks :)

Moreover, website owners even pay for captcha. It should be the other way around: people who participated in training the neural nets should share in the profit and ownership of the networks, at the very least.

mensetmanusman•25m ago
It turns out you can’t have a thriving economy if everyone is just working service jobs as floor stockers and baristas.

If we are lucky, AI provides a huge accelerant to insourcing manufacturing with advanced automation. This is an energy-saving play to reduce the total distance that objects are shipped, which wastes a lot of energy, especially as the objects are of lower value.

SwtCyber•16m ago
It's like setting your house on fire and handing you a fire extinguisher... for a fee