
Show HN: Polynya – Turn your Postgres into workspaces for AI

https://polynya.dev/
2•hasyimibhar•54s ago•0 comments

Show HN: Matrirc – run irssi in 2026, talk to people on Matrix

https://github.com/pawelb0/matrirc
2•pawelb0•1m ago•0 comments

AT&T Unix PC – What went wrong? [video]

https://www.youtube.com/watch?v=_x3uxKfFI-0
1•naves•1m ago•0 comments

Ace Technical Preview: GitHub Next's Agentic Workspace – Maggie Appleton [video]

https://www.youtube.com/watch?v=ClWD8OEYgp8
1•chaoxu•3m ago•0 comments

ELI: Explain Like I'm for any ArXiv Paper

https://eli.voxos.ai/
1•Falimonda•7m ago•0 comments

Agentic AI Security

https://www.straiker.ai/
1•noashavit•8m ago•0 comments

Show HN: PatchWork extracts your full career history and writes resumes for you

https://usepatch.work/
1•mcohrs•8m ago•1 comment

U.S. Mint Buys Drug Cartel Gold and Sells It as 'American'

https://www.nytimes.com/2026/04/26/world/americas/us-mint-gold-drug-cartel-colombia.html
8•mikhael•11m ago•0 comments

Butterflies are in decline across North America, a look at the Western Monarch

https://www.smithsonianmag.com/science-nature/butterflies-are-in-dramatic-decline-across-north-am...
17•1659447091•12m ago•0 comments

Security issues found within rust-coreutils

https://discourse.ubuntu.com/t/an-update-on-rust-coreutils/80773
2•birdculture•13m ago•1 comment

Kitty-graphics.el v0.5.0: tmux support for images inside terminal Emacs

https://cashmere.rs/blog/kitty-graphicsel-v050-tmux-support-sixel-performance-typst-support
1•cashmere1337•19m ago•0 comments

Ask HN: Anyone want to collaborate on a local-first AI-based research assistant

2•venkatram-s•19m ago•0 comments

Humanoid Data

https://www.technologyreview.com/2026/04/21/1135656/humanoid-data-robot-training-ai-artificial-in...
1•gnabgib•22m ago•0 comments

Rapunzel: Tree style tabs for codex, Claude Code and Gemini

https://github.com/salmanjavaid/rapunzel/tree/main
1•WasimBhai•25m ago•1 comment

If an AI tutor that adapts to your learning style

https://tutoraimvp.netlify.app/index.html
1•Avia_Studio•27m ago•0 comments

1:59:30: Sabastian Sawe Shatters the 2-Hour Barrier at 2026 London Marathon

https://www.letsrun.com/news/2026/04/15930-sabastian-sawe-shatters-the-2-hour-barrier-at-2026-lon...
18•nradov•27m ago•2 comments

Remembering the 1984 Unix PC. Why did it fail so hard?

https://tech.slashdot.org/story/26/04/26/2038235/remembering-the-1984-unix-pc-why-did-it-fail-so-...
2•MilnerRoute•28m ago•0 comments

Claude Design Is Real Design

https://diverging.run/checkpoints/claude-design-is-real-design/
1•shay_ker•29m ago•0 comments

TRELLIS.2: Native and Compact Structured Latents for 3D Generation

https://microsoft.github.io/TRELLIS.2/
4•stavros•29m ago•0 comments

Two Athletes Break Sub-2-HR Marathon in Adizero Adios Pro Evo 3

https://news.adidas.com/running/two-adidas-athletes-sabastian-sawe-and-yomif-kejelcha-break-the-s...
2•canucker2016•31m ago•0 comments

New HEIC to JPG/PNG Converter

https://heyc.runtime-hub.com/
1•RunTimeZero•32m ago•0 comments

Charity Guinness record: 9-day stream raised almost $70M for cancer

https://streamer.guide/blog/latwogang-breaks-guinness-record-charity-stream-2026
2•halonn•32m ago•0 comments

The New Linux Kernel AI Bot Uncovering Bugs Is a Local LLM on Framework Desktop

https://www.phoronix.com/news/Clanker-T1000-AMD-Ryzen-AI-Max
5•guerby•35m ago•0 comments

Anonymous IRQ Handlers

https://trident64.github.io/anonymous-irq-handlers/
2•adunk•36m ago•0 comments

Show HN: Tiao, A two-player turn-based board game

https://playtiao.com
1•trebeljahr•37m ago•0 comments

Show HN: AI memory with biological decay (52% recall)

https://github.com/sachitrafa/YourMemory
10•SachitRafa•37m ago•5 comments

Forcing Scammers to Pass a "Face Captcha" [video]

https://www.youtube.com/watch?v=odFq0xgTrko
1•akavel•39m ago•0 comments

Sawe smashes two-hour mark to 'move goalposts for marathon running'

https://www.bbc.com/sport/athletics/articles/crm1m7e0zwzo
37•berkeleyjunk•39m ago•6 comments

The Preservation Sequences, Part 1: Less Dead

https://nectome.substack.com/p/the-preservation-sequences-part-1
1•bcjordan•41m ago•0 comments

Show HN: Mdlens – Reduce token spend and boost retrieval on Markdown-heavy repos

https://github.com/Dreeseaw/mdlens
1•dreeseaw•41m ago•0 comments

AI should elevate your thinking, not replace it

https://www.koshyjohn.com/blog/ai-should-elevate-your-thinking-not-replace-it/
120•koshyjohn•1h ago

Comments

CorbenDallas•1h ago
There are plenty of engineers who simply can't think; AI will not change anything in this regard.
joe_mamba•1h ago
How do you get through an engineering degree without being able to think?

Even my colleagues who cheated their way through uni still needed critical thinking to pull that off and get away without being caught.

People might hate this but being a good cheat requires a lot of critical thinking.

ironman1478•1h ago
You don't need a 4.0 to graduate. And even if you got one, a lot of grades are composed of tests, not projects. You can just memorize your way through things if you're dedicated enough.

It's not really that hard to get a degree in engineering if your only goal is the degree itself.

johndough•1h ago
> a lot of grades are composed of tests, not projects

(Take home) projects are easier than ever thanks to AI. In the past, you at least had to track down some person to do the work for you.

awesome_dude•1h ago
Mate, have you never had to deal with over-confident graduates who think they've got the complete answers, but, in reality, they only have a sliver of the whole picture in their minds?
operatingthetan•1h ago
That is different than the suggestion that one could graduate with a CS degree and "never think." Which is absurd.
lispisok•1h ago
Grade inflation, and schools passing kids who should fail in order to game metrics and keep collecting student loans, is a problem. I wouldn't consider hiring anybody from my alma mater who didn't score a standard deviation or more above average on the tests.
spacechild1•1h ago
OP should have put "engineers" in double quotes. Many software developers like to describe themselves as engineers although they don't have an actual engineering degree. A lot of software development resembles plumbing more than engineering, so most devs don't really need an engineering degree anyway, but they should be more honest about what they're actually doing rather than trying to elevate themselves with fancy titles.

You are, of course, right that the idea that someone could finish a serious engineering degree without being able to think is ridiculous.

vips7L•1h ago
Half of my graduating class could barely program.
spacechild1•19m ago
What did you study?
whstl•11m ago
Yep. Way more than half of the people I interview can't even do a very basic FizzBuzz, even with guidance. Those are people with a degree, job experience and reference letters.
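The exercise mentioned is small enough to spell out; here is a minimal Python sketch of the classic FizzBuzz screening question (any interviewer's exact wording may vary, but this is the usual form):

```python
def fizzbuzz(n):
    """For 1..n: "Fizz" for multiples of 3, "Buzz" for multiples of 5,
    "FizzBuzz" for multiples of both, otherwise the number itself."""
    out = []
    for i in range(1, n + 1):
        word = ("Fizz" if i % 3 == 0 else "") + ("Buzz" if i % 5 == 0 else "")
        out.append(word or str(i))
    return out

print(fizzbuzz(15))
# → ['1', '2', 'Fizz', '4', 'Buzz', 'Fizz', '7', '8', 'Fizz', 'Buzz',
#    '11', 'Fizz', '13', '14', 'FizzBuzz']
```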
shagie•41m ago
A degree is passing the test. Not all degree programs get into more advanced topics nor do they necessarily require that someone is able to work through how to solve a problem that they haven't seen before.

--

A lot of students (and developers out there too) are able to follow instructions and pass the test.

A smaller portion of them are able to divide up a task into the "this is what I need to do to accomplish that task".

Even fewer of them are able to work through the process of identifying the cause of a problem they haven't seen before and work through to figure out what the solution for that problem is.

--

... There are also a lot of people out there who aren't even able to fall into the first group without copying and pasting from another source. I've seen the "stack sort" (https://xkcd.com/1185/, https://gkoberger.github.io/stacksort/) at work professionally: people copying and pasting from Stack Overflow (back in the day) without understanding what they're writing.

Now they do it with AI: take the contents of the Jira description, paste it into some text box, submit the new code as a PR, take the feedback from the PR, paste it back into the box, and repeat a few times. I've seen PRs with "you're absolutely correct, here are the updates you requested" sent back to me for review again.

This is not a new thing. AI didn't cause it, but AI is exacerbating the issue in professional programming: the people who are not much more than some meat between one text box and another (yes, I'm being a bit harsh there), and the people who need instructions but don't understand design, get to be more "productive" while overwhelming the more senior developers.

... And this also becomes a set of permanent training wheels on developers who might be able to learn more if they had to do it. That applies at all levels. One needs to practice without training wheels and learn from mistakes to get better.

what-the-grump•36m ago
I don't know, but I can point at more than half of the people I work with who can't think, and every time they try to, it takes a whole group of people who can think to undo their mess. They all have degrees and I don't.

So what does that tell me?

Better yet, for about 30% of them, having the LLM produce the slop directly would have yielded better outcomes; having them slop something nets terrible slop. At least LLM output I can reshape, because even the LLM won't do something that stupid.

quantum_state•27m ago
Not being able to think properly seems to be the real issue. That's one of the reasons the SE domain is mostly in ruins. AI won't help; it will only delay a bigger mess.
taurath•11m ago
Ever since the standard office setup went from offices or cubicles to bullpens and hot desks, there has been less and less time to think, and all of that is a management decision to ship things as fast as possible.
sharts•1h ago
Meh, there are plenty who rise in their careers while being mediocre.
joe_mamba•1h ago
The tech industry lost the plot when Scrum Masters and Agile coaches became highly paid con men who wasted everyone's time and added no value while raking it in. AI doesn't impact something already broken.
operatingthetan•1h ago
When was tech not bureaucratic and political?
joe_mamba•1h ago
The 60s, 70s, 80s, 90s - basically before Google and Meta found out that ads and money printing run the world, back when the tech industry was run by nerds with mullets, New Balance sneakers and khaki shorts.
operatingthetan•1h ago
Oracle, HP, Microsoft, Cisco, IBM, Apple, Xerox and countless other names were internally bureaucratic and political in the 80's and 90's. Like famously so.
joe_mamba•1h ago
Every single one of those companies you mentioned was lean, agile and run by skilled motivated nerds with mullets and thick glasses in the beginning when they started in a garage.

And every single major company becomes bureaucratic and political after 30+ years in the business when the original founders are long retired, and the Wall Street friendly beancounters take over, caring only about the quarterly reports.

operatingthetan•57m ago
You are changing your argument by adding this: "when they started in a garage."

'Lean agile' tech companies are by far the exception, not the rule.

Look at OpenAI and Anthropic, both fairly new companies that are excessively political already. This 'garage stage' of lacking politics is a myth. Read old stories about Microsoft: when it was 15 people, it was political.

sheepscreek•1h ago
AI is creating problems. This isn't one of them. Engineers are now going to think at a higher level of abstraction. No one misses coding in assembly.
orblivion•1h ago
Compilers are a layer of abstraction that we can ask another human about. Some human is there taking care of it. Until we get to the point where we trust AI with our survival it would be good to be able to audit the entire stack.
andsoitis•1h ago
any human can read the code an AI produces.
hun3•1h ago
How can you read a language you didn't learn?
kirth_gersen•1h ago
for now. some people seem to think we should make ai native programming languages and just let them be black boxes. which is a bad idea imo
dawnerd•1h ago
Have you tried to sift through a whole lot of vibe-coded slop? It's really mentally draining to see all of the really bad techniques they fall back on just to brute-force a solution.
cyclopeanutopia•1h ago
Nope, not anymore. Many already forgot how to do that and it's not a joke.

And putting aside the vanishing skill, there is also an issue of volume.

cyberpunk•58m ago
So... Our jobs are safe then? I mean, assuming we don't also atrophy to the same extent as the 'many'?
cyclopeanutopia•48m ago
I'm just saying that I already see that people are outsourcing all the thinking to the models - not only code generation and reviews, but even design - the part that "senior engineers" without imagination think only they are capable of doing.

It's worrying how much trust is being put in those systems. And my worry is not about the job anymore, but our future in general.

cyberpunk•31m ago
It's a bit of a weird place to be in as a senior engineer who has spent 2 decades perfecting his craft.

So, on one hand, I'm also kinda sad at how quickly we've thrown the guardrails away, but on the other -- it's... Well. It's just work.

Turns out, no one ever really cared how elegant or robust our code was and how clever we were to think up some design or other, or that we had an eye on the future; just that it worked well enough to enable X business process / sale / whatever.

And now we're basically commoditised, even if the quality isn't great, more people can solve these problems. So, being honest, I think a lot of my pushback is just a kinda internal rebellion against admitting that actually, we're not all that special after all.

I'm just glad I got to spend 20 years doing my hobby professionally, got paid really well for it, and often times was forced to solve complicated problems no one else could -- that kept me from boredom.

I think the shift we are seeing now, as 'previously' knowledge workers is that work becomes a lot more like manual labour than what we've really been doing up until now. When there's no 'I don't know' anymore, then you're not really doing knowledge work, right?

I guess I'll just ride the wave, spew out LLM crap at work, and save the craft for some personal projects. I'll certainly have the capacity now that work is a no-op.

cyclopeanutopia•16m ago
Yeah, but the thing is, it's not "just work". Software now has a really big impact on the world and on actual lives.

In the corporate world, we are typically detached from real-world consequences, and looking at the people around me, they really don't think about such things - but I do. And I really care, because "relaxed" standards might result in errors that amount to stuff like identity theft, or stolen money, shit like this, even on the smallest scale.

Obviously we can't prevent everything, but it seems like we, as an industry, have decided to collectively YOLO and stop giving a shit at all. And personally I don't like that it is me who is losing sleep over this, while the people who happily delegate all their thinking to LLMs sleep better than ever.

threethirtytwo•27m ago
I think those of us who have years of experience under our belts are safe. If we're older, the knowledge is ingrained, and atrophy of that knowledge will be limited, given that it's already "imprinted" onto our brains.

Our futures are safe in this sense; in fact, it's even beneficial, as we may be the last generation to have these skills. Humanity's future, on the other hand, is another open question.

orblivion•58m ago
Unless people can't think without the AI.
cyclopeanutopia•1h ago
> No one misses coding in assembly.

That's just your opinion, and it's provably false.

First, there are still people who don't like high level languages and don't use them, because they find assembly better.

Second, I personally work in a field where I need to consult the source of truth, the actual binary, and not the high level source code - precisely because the high level of abstraction is obscuring the real mechanics of software and someone needs to debug and clean up the mess done by "high level thinkers".

High level programming languages are only an illusion (albeit a good one) but good engineers remember that illusion is an illusion.

threethirtytwo•31m ago
When people communicate they speak in terms of the overwhelming generality of reality. There's always at least one guy that is an extreme exception.

I can tell you this, the person you're replying to comes from the overwhelming majority/generality. You, on the other hand, are that one guy.

Of course even my comment is a bit general. You're not "one" guy literally. But you are an extreme minority that is small enough such that common English vernacular in software does not refer to you.

cyclopeanutopia•30m ago
Thank you.
hun3•1h ago
You can write unambiguous (UB-free) code and the compiler's output will be deterministic. There will even be a spec that explains how your source maps to your program's behavior. An LLM has neither.

Also, if you need to control performance, you still need to know how CPU caches and branch prediction work, both of which exist at the abstraction level of assembly.

nickandbro•1h ago
I think there are engineers that can’t think without AI. But the best think with it. Unfortunately, we are now living in a day and age where simply ignoring AI is no longer an option.
fnordpiglet•1h ago
There were always engineers who didn’t think and depended on crutches around them like senior engineers and politicizing the perf cycle. Most people got into this because their parents told them it makes a lot of money, and they never had the drive and curiosity to develop the passion required to truly think through the problems in computing and computer science. They will continue to use crutches to survive. Those that are driven by the problems for the problems will continue to think and use AI as a tool for leverage. This is no different than any other assistive technology.
saadn92•1h ago
Hard disagree. I feel like I'm thinking a lot more now because I have so many parallel projects going on at the same time. AI has allowed me to really, truly create in a way that I've never done before. Yes, my coding skills probably aren't as sharp as they used to be, but my system design skills are at an all time high. Don't blame the tool.
klodolph•1h ago
What part do you disagree with? It sounds like you don’t disagree with either the title of the article or its contents.

> In talking to engineering management across tech industry heavy-weights, it's apparent that software engineering is starting to split people into two nebulous groups:

> The first group will use A.I. to remove drudgery, move faster, and spend more time on the parts of the job that actually matter i.e. framing problems, making tradeoffs, spotting risks, creating clarity, and producing original insight.

enraged_camel•1h ago
The HN title is heavily editorialized. Actual article title is far less controversial: "A.I. Should Elevate Your Thinking, Not Replace It"
klodolph•1h ago
Ah, I was thinking of the editorialized HN title.
Jcampuzano2•1h ago
"Hard disagree because it doesn't affect me personally"

There is already research literally showing that on average it is a net loss for focus, learning and critical thinking skills.

dawnerd•1h ago
> Yes, my coding skills probably aren't as sharp as they used to be

If not the tool, then who's to blame? It's very clear that people who rely on LLMs for coding lose their skills. Just because you have a lot of parallel tasks going at once doesn't mean you're producing quality work. Who's reviewing it? Are you just blindly trusting it?

jnpnj•1h ago
But is the debate about "fleshing out a system spec" or "the ability to come up with, plan, and explore various ideas to solve problems elegantly on a budget"? I think these two sides are always conflated as one when discussing LLM impact on users.
relativeadv•1h ago
I work with others who have made this same claim. For those people, when I observed their work during demo days the unmentioned thing is that they were going to the AI for system design questions as well. This was framed as "just using it as a sounding board" but what was actually done was not merely a sounding board but instead was asking for solutions. Anchoring bias being what it is, these felt like good ideas and they kept them.

It's the feeling of having done a lot of thinking for themselves without having actually done so.

idle_zealot•1h ago
If 1% of people using the tool end up like you, and 99% end up as drooling invalids, I think it would be insane not to blame the tool. If a tool that's incompatible with humans isn't to blame for that incompatibility, what is to blame for the harm done? Human nature? The point of a tool is to be used by humans.
xyproto•55m ago
Even if a tool can only be used for lobotomizing humans, the usage of the tool is where the main blame should be placed.
Ekaros•1h ago
For how many parallel projects can you really keep a proper mental model in your head at one time? Or put in enough effort to seriously consider all aspects? I think the number varies between simple and more complex projects. But still, could that number be lower than many think it is?
SpicyLemonZest•34m ago
It really depends on who you consider the "many" to be. I've seen people who claim they can meaningfully iterate on 10 projects simultaneously, and I'm skeptical of that. My personal experience is that my decisions are noticeably degraded at 3-4 parallel workstreams, and with even the simplest projects I'm non-functional past 6.

But I can juggle 2 workstreams in a day easily, and I can trivially swap projects in and out of the "hot path" as demanded by prioritization or blockers; before LLM coding both of those were a lot harder.

SirYandi•29m ago
So you'll have a beautifully designed system with rotting bones? A system constrained to the same patterns seen in training data. Not terrible, good enough.

I don't know, I don't doubt you're more productive. Broadly so. But the depth and rigor I think may be missing, as the article suggests.

As an aside, I suppose it's a good time for those nearing the end of their careers, those who no longer need to learn, to cash out and go all in on AI.

Leonard_of_Q•22m ago
The real question is whether you'd be able to continue doing your work if someone took your toys away and said "here's a nickel, kid, go buy yourself a real computer". I'm not referring to whether you'd be able to keep up your productivity, since it is clear you couldn't, just like a carpenter with a nail gun works faster than one with a hammer and a bucket'o'nails. Could you do the work, starting with the design, followed by boilerplate, and finishing with a working system? The carpenter could, albeit slower, since his tools only speed up the mechanics of his work. Coding agents do much more than that: they take away part of the mental modelling which goes into creating a working system. The fancier the tool, the more work it takes out of your hands.

Say that the aforementioned toy thief comes by in a year or two, after the operating systems (etc.) you're targeting have undergone a few releases with breaking changes. A number of APIs have been removed, others have been deprecated, and new ones have been added. You were used to telling the agent to 'make it work on ${older_versions} as well as ${newest version}', but now you're sitting there with a keyboard at your fingertips and that stupid cursor merrily blinking away on the screen. How long would it take you to become productive again? What if the toy thief waits 5 years before making his heist? What if the models end up rebelling or sink into depression, and the government calls upon you to save your economic sector?

When cars first appeared, it took quite some knowledge and experience to even get the things started, let alone keep them running. Modern cars are far better in all respects, and as a result modern drivers often don't have a clue what to do when the 'Check Engine' light appears. More recent cars actively resist attempts by their owners to fix problems, since this is considered 'too dangerous' - which can be true in the case of electric cars. That's the cost of progress; it is often worth it, but it does make sense to realise what it would take to go back in time to the days when we coded our software outside in the rain, uphill both ways, with only a cup of water to quench our thirst. In the dark. With wolves howling in the woods. OK, you get my drift.

Will there be something like 'software preppers' who prepare for the 'AIpocalypse' by keeping their laptops in shielded containers while studiously chugging along without any artificial assistance? Probably. As a hobby, at least, just like there are 'survivalist preppers' who make surviving some physical apocalypse their goal in one way or another.

zulux•1h ago
Yes.... and I can't think without compiled languages. Missed out on assembler.

Becoming dependent on a technology is to be expected. I'm pretty sure 95% of us are dependent on packaged meat and don't know how to hunt.

awepofiwaop•1h ago
I'm seeing plenty of internal work where I ask someone about their code, they ask Claude, and reply with "Claude says...".

That's substantively different than going from assembly to C.

puapuapuq•1h ago
I am that someone, wondering why you can't just ask Claude yourself.
ben_w•1h ago
Every time things change, the change itself is different.

I remember some of my earlier issues with various languages. `Dim A, B as Int`, in VisualBasic one of them is an Int the other is a Variant, in REALbasic (now Xojo) they're both Int. `MyClass *foo = nil; [foo bar];` isn't an error in ObjC because sending a message to nil is a no-op.

Or how, back when I was a complete beginner, if I forgot a semicolon in Metrowerks, the compiler would tell me about errors on every line after (but not including!) the one where I forgot the semicolon.

"Docs say", "Compiler says", "StackOverflow says", "Wikipedia says"; either this tool is good enough or it isn't. It not being good enough means we're still paid to do the thing it can't do; that only stops when nobody needs to, because it can do the thing. The overlap, when people lean on it before the paint is dry, is just a time for quick-and-dirty. LLMs are in the wet-paint/quick-and-dirty phase. You could get stuff done by copy-pasting code you didn't understand from StackOverflow, but you couldn't build a career from that alone. LLMs are better than StackOverflow, but still not a full replacement for SWeng, not yet.

awesome_dude•1h ago
In answer to the headline - it's not, no more than calculators stopped people from thinking.

It's changing the way we think, and reason.

Speaking as a BE focused Go developer, I'm now working with a typescript FE, using AI to guide me, but it scares the shit out of me because I don't understand what it's suggesting, forcing me to learn what is being presented and the other options.

No different to asking for help on IRC or StackOverflow - for decades people have asked and blindly accepted the answers from those sources, only to later discover that they have bought a footgun.

The speed at which AI is able to gather the answers from StackOverflow coupled with its "I know what I am talking about" tone/attitude does fool people at first, just like the over-confident half assed engineers we have always had to deal with.

Unlike those human sources, we can forcefully pushback on AI and it will (usually) take the feedback onboard, and bring the actual solution forward.

Thus proving the engineer steering it still has to know what they are doing/looking at.

journal•1h ago
A.I. is creating engineers who can't WORK without it
0xbadcafebee•1h ago
No, AI is not creating that group of people. They already existed. They were the people who would google for StackOverflow snippets and copy+paste them without even reading the entire snippet, much less understand them. Same people, new tool.
clutter55561•44m ago
Exactly what I posted as well!
stavros•1h ago
Skills you don't need, atrophy. Skills you need, don't. It's very simple, and the "you won't have the skills you used to need but don't need any more!" line of reasoning is tired and invalid.
miyoji•1h ago
That's not how it works, unfortunately. Skills you use stay fresh, skills you don't practice get rusty and fade away. You might need things you aren't using anymore.

If you never walk, your legs get weak, you gain weight, your aerobic system loses capacity, and you lose the ability to walk. You don't need it, you say, because you have your car and your mobility scooter and you'll always have these things. Your crutches don't make you weaker, you can still do everything the walkers can do, you say.

Good luck with the nature hike!

stavros•1h ago
Sure. What are these programming skills you never need but that you're going to need at some indeterminate time in the future?
jasonjmcghee•1h ago
There are plenty of engineers that couldn't work without a modern IDE or in languages without memory management.

Or without the ability to use a library from GitHub / their package manager.

It doesn't feel THAT much different to me.

"Engineer" as a term might drift. There are "web developers" that can only use webflow / wordpress.

bpye•1h ago
At least today, it isn't practical for most people to run these models locally - I think adding a dependency on a cloud service is different enough from a local (possibly open source) tool like an IDE.
jasonjmcghee•45m ago
Slack, GitHub, Figma, AWS, etc

Lots of people use firebase, supabase etc.

Many people's jobs are centered around using Salesforce

It all makes me uncomfortable - I want to be able to work without internet. But it's getting more difficult to do.

StrauXX•24m ago
Self hosting at a reasonable scale is much cheaper than people think. I am running clusters of DGX Spark machines with BiFrost load balancers in our company and for client projects. They work flawlessly!

128 GB unified memory, an Nvidia chip and an ARM CPU for just around 3k€ net. They easily push ~400 input and ~100 output tokens per second per device on, say, gpt-oss-120b. With two devices in a cluster, that's enough performance for >20 concurrent RAG users or >3 "AI augmented" developers.

And they don't even pull that much power.
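Those capacity figures pass a back-of-envelope check; the per-user token rates below are my assumptions for illustration, not numbers from the comment:

```python
# Capacity sketch for the two-device cluster described above.
# Known from the comment: ~100 output tokens/s per device on gpt-oss-120b.
# Assumed (hypothetical): a RAG user consumes ~10 output tok/s sustained,
# an "AI augmented" developer ~60 output tok/s.
devices = 2
output_tps_per_device = 100
total_output_tps = devices * output_tps_per_device  # 200 tok/s

rag_user_tps = 10  # assumption
dev_tps = 60       # assumption

print(total_output_tps // rag_user_tps)  # 20 concurrent RAG users
print(total_output_tps // dev_tps)       # 3 developers
```

Under those assumed rates the arithmetic lands right at the quoted ">20 RAG users or >3 developers".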

torben-friis•1h ago
The huge difference is that we don't know the cost we're going to end up with.

Will you have AI at the cost of a slack subscription? At the cost of a teammate? Will it not be available and you'll have to hire anthropic workers with AI access?

heipei•55m ago
Local AI models are already more than capable of writing code that surpasses the ability of any bad or even mediocre engineer. That is not something we need to worry about.

In a way, this is less of a cost issue than the fact that some/many engineers do not seem to be willing or able to host things themselves anymore and will happily outsource every part of their stack to managed services, be it CDN, hosting, databases, etc. I don't know why that's not more alarming than the LLMs.

Jcampuzano2•1h ago
Engineer as a term has already drifted vastly, since nobody in the field of "Software Engineering" is actually an Engineer if we go by strict definitions.

Engineers are accredited and in some countries even come with a title.

jjtheblunt•32m ago
i think you accidentally overlooked accredited engineers who happen to be writing software
lkmill•3m ago
as an actual engineer i just feel sad. i should probably feel happy but i like solving problems. fml i have become a luddite.
vict7•30m ago
IDEs are free. Libraries are free. Languages are free. This is becoming more like an internet subscription where you’re at the mercy of Anthropic the same way you may be at the mercy of Comcast.

I’m sure you can see the difference between a garbage collector and a nondeterministic slop generator

But it feels good to equivocate, so here we are.

ares623•3m ago
"What kind of engineer are you" - Jesse Plemons wearing bright-red sunglasses
joe_the_user•1h ago
Post title is completely misleading relative to the article. Article title: "A.I. Should Elevate Your Thinking, Not Replace It"
_pdp_•1h ago
Huberman: Your brain has a region that only grows when you do things you don't want to do

...or, as I interpret it, your brain grows only when it does things that are difficult.

If you remove the difficulty, it will atrophy into a hum of mindless chit-chat.

Engineering the data structures and control flows from scratch is completely different from asking an LLM to scaffold them for you.

clutter55561•41m ago
I love programming, but I don’t love working. I’m about 10 years away from retiring and can’t wait. Does that count? ;-)
erxam•1h ago
Here's the question I want to posit, and nobody who's against AI has managed to answer it satisfactorily: what is in it for me if I were to acquire all those skills?

I don't give a shit about this career. I don't give a shit about engineering. I despise every second of it. There's nothing to aim for other than being a drone that does whatever is asked of it.

If AI can reduce my mental workload, why wouldn't I want to delegate everything over to it so I can save my faculties for what I truly enjoy? For the art of a worthless craft?

_pdp_•1h ago
Some people enjoy working with computers. :) It is not always about the money. It is also about having fun and learning new things.

For you, judging from what you say, it seems that you are not cut out for it.

So yes, use LLMs.

sumeno•1h ago
Why are you employable if the AI does everything for you?
m4rkuskk•1h ago
Before AI I would spend multiple days mapping out my database tables and queries while now I ask AI to propose multiple different approaches and I pick the best one. But then on the other hand I’m working on 10 features at the same time and have to carefully look through them. But I can see that I’m totally dependent on the AI now. Creating a full plan by yourself feels like a waste of time, since you know the AI can create the same or better plan in a split second. So when Claude is down, I end up not being productive at all.
halamadrid•1h ago
This is true. Speaking only from personal experience: my team had started treating AI like a superintelligent being.

“AI suggested we do it that way”

And we’ve been degrading our systems rapidly for the last several weeks. We’ve decided to pause, reflect, and change how we use AI on tasks that are not dead simple.

joshcramer•1h ago
First, it was pencil and paper. Then it was calculators. Then computers! It’s a slippery slope, this technology business.
Unmotivator2677•1h ago
That's why I don't use AI for any personal projects; I like to keep my mind sharp. Unless it's a project that incorporates AI in some way, but even then I don't use AI to code it. But at work I don't care, I do what I am paid for. If my manager wants me to entirely vibe code using Claude, that's his choice, and I will not be the one paying for the technical debt it creates.
chromacity•1h ago
Aaand it's the second "AI is bad" story on the front page today that's evidently generated by AI.
srcreigh•1h ago
Is it wise to understand everything that AI does for you?

Let’s say a person has 10 units of learning per week. Is the author actually claiming that that person must not deliver any results beyond their 10 units?

It makes some sense to have say 20 units of results and prioritize which ones to fully comprehend.

I suspect APIs / libraries / languages / platforms will have more churn due to AI. New platform new system need to learn. Once every 5 years might become every year or even more frequent. That would be a sort of inflation of knowledge and skills. It would affect the decision making about how to spend one’s 10 units per week.

addaon•54m ago
> Let’s say a person has 10 units of learning per week.

This is… not how humans work? If you have the time and energy to learn ten things, and then spend time babysitting a random number generator to produce evidence of 10 more units of work, you’re paying an opportunity cost compared to someone who spends the time learning an eleventh thing. You can argue who has more short term value to a company… but who is the wiser person after a thirty year career?

conqrr•1h ago
This is a huge concern and I fully agree with the post. Even if you think you are not fully giving in to AI, or that this was always the case, it still affects YOU and everyone else.

1. Software, often, isn't built in a vacuum. Lots of companies are shoving AI down throats whether you like it or not. Most Big Tech is heavily using metrics to push toward 100% AI-generated code. Reviewing is a nightmare.

2. New entrants (new grads etc.) are largely AI-first and are losing out on the safety and reliability habits that are enforced automatically when you learn coding without AI.

IMO, teams need to agree on a set of principles for AI usage, with concrete examples of where and how to use it. Perhaps it's much more useful in parts of your system that are fast-evolving and don't have too much core logic, like testing frameworks.

Simply discarding it as 'yet another tool' is part of the problem.

staticshock•1h ago
The eloquence with which this point gets (repeatedly) made is continuing to improve each time I read it. However, I still feel like we haven't nailed it. That is, we are not yet at the "aphorism" stage of the discourse (e.g. "the medium is the message", "you ship your org chart", "can't make 9 babies in a month"), in which the most pointed version of this critique packs a punch in just a few words that resonate with the majority of people. That kind of epistemological chiseling takes years, if not decades. And AI certainly won't do it for us, because we don't know how to RL meaning-making.
xnx•52m ago
This concept won't reach that point because when you chisel too hard it crumbles. There are countless lower level tasks that typical programmers no longer learn how to do. Our capacity for knowledge is not unlimited so we offload everything we can to move to the next level of abstraction.
ua709•40m ago
I get your point, I just wonder how accurate it is. We basically never look at the output of the compiler, so I agree that tool allows one to operate at a higher level than assembly. But I always have to wade through the output from AI so I’m not sure I got to move to the next level of abstraction. But maybe that’s just me.
staticshock•26m ago
That's true, but I think it's beside the point. The flip side of that argument, which is equally true, goes something like, "not doing cognitive push-ups leads to cognitive atrophy."

There are skills we're losing that are probably ok to lose (e.g. spatial memory & reasoning vs GPS, mental arithmetic vs calculators), primarily because those are well-bounded domains, so we understand the nature of the codependency we're signing up for. AI is an amorphous and still growing domain. It is not a specific rung in the abstraction hierarchy; it is every rung simultaneously, but at different fidelity levels.

xyproto•57m ago
Calculators and computers create engineers who can't think without them either. There are many problems with AI, but from my point of view, the title's author has not thought things through.
AndrewDucker•50m ago
We teach kids basic maths before we give them calculators.

University degrees certainly used to teach computing fundamentals without you having a computer in front of you.

samuelknight•53m ago
We are in a transition phase where you need systems and coding skill but you can't be sufficiently productive without AI.
Waterluvian•51m ago
I think AI can generally be utilized in two ways:

1) you use it to help write code that you still “own” and fully understand.

2) you use it as an abstraction layer to write and maintain the code for you. The code becomes a compile target in a sense. You would feel like it’s someone else’s code if you were asked to make changes without AI.

I think 2) is fine for things like prototypes, examples, references. Things that are short lived. Where the quality of the code or your understanding of it doesn’t matter.

I think people get into trouble when they fool themselves and others by using 2) for work that requires 1). Because it’s quicker and easier. But it’s a lie. They’re mortgaging the codebase. And I think the atrophy sets in when people do this.

kylebyte•22m ago
And any push to use 2 to build infra to make 1 easier is hard to sell when a lot of engineers think AI will be able to perfectly do 1 in some nebulous time in the near future.
p_stuart82•5m ago
the thing is, it doesn't even feel like mortgaging. shipping, features going out, everything looks fine. then something breaks and you realize you can't debug your own code without asking the model again.
clutter55561•51m ago
AI isn’t creating the problem, it is just showing the problem. Those who did not want to learn before AI did so reluctantly, mixing Google and SO. Now they ask AI. An existing problem found a new solution.

Personally, I really enjoy using AI. I have created my own cascade workflow to stop myself from “asking one more question”. Every session is planned. Claude and Codex can be annoying as hell (for different reasons). Neither is sufficiently smart for me to trust them. I treat them as junior devs who never get tired, know a lot of facts but not necessarily how to build.

hpbc5•49m ago
Theory of Bounded Rationality and its implications is something they should teach everyone.
julienfr112•43m ago
Structural engineers can't build bridges or towers any more without CAD or FDM either.
smj-edison•34m ago
On the point of avoiding the struggle of learning, I think it's easy to swing too far the other direction and go back to not using modern development tools. I think it is doing a new learner a disservice by saying something like "don't use GDB/REPL/AI tool to learn, since you'll never learn the fundamentals". I think all of these tools allow for learning, if that's how the learner engages with them. So I hope that AI becomes integrated in the learning process, as far as it accelerates and doesn't replace understanding.
TrackerFF•31m ago
For all we know, we're in the early stages of making traditional (software) engineering obsolete. As in, we don't know if the role of software engineer as we know it today will still exist in 10, 15, or 20 years.

I mean, right now we're at the stage where any user can get AI to make software for them that solves very specific things, with almost no technical knowledge needed.

My prediction is that software engineers will be rendered obsolete first. After that, small businesses will disappear, as users can simply get those products/services directly via AI.

dyauspitr•10m ago
Convenience is king. We became fat and unhealthy because high calorie foods are cheap and easy. We will become stupid because AI will do our thinking for us. There’s no way around it. Only a small percentage of the population are capable of perpetual self control. The old world forced you to be healthy, there was no other choice. Now there are like 15 things you have to have self control to do the hard work at even though you can get the same results the easy way. Working out, dieting, “proper” social interaction, sleep timing, child rearing, social meetups, career networking etc. The list is never ending and none of it is organic like it used to be.
teaearlgraycold•6m ago
I think many of us have interviewed people with 10+ YoE, and resumes that seem impressive, and then seen them fail to do much of anything in evaluations. I expect this problem to get significantly worse. There will be a class of people tucked into organizations where they can get away with sitting in meetings and YOLOing AI code for years.
fermatf•3m ago
For the last couple of weeks, I have been using AI to speed up my thinking process. Instead of thinking something through myself to reach a conclusion, I let AI brainstorm for me and then select. Not for everything, but I found it faster with AI. Having the taste to select among the AI output is important, though.