frontpage.

Ask HN: Anyone Using a Mac Studio for Local AI/LLM?

44•UmYeahNo•1d ago•27 comments

Ask HN: Ideas for small ways to make the world a better place

10•jlmcgraw•9h ago•17 comments

Ask HN: Non-profit, volunteers run org needs CRM. Is Odoo Community a good sol.?

2•netfortius•4h ago•1 comment

Ask HN: Non AI-obsessed tech forums

18•nanocat•7h ago•13 comments

Ask HN: 10 months since the Llama-4 release: what happened to Meta AI?

42•Invictus0•1d ago•11 comments

AI Regex Scientist: A self-improving regex solver

6•PranoyP•11h ago•1 comment

Ask HN: Who wants to be hired? (February 2026)

139•whoishiring•4d ago•513 comments

Ask HN: Who is hiring? (February 2026)

312•whoishiring•4d ago•511 comments

Tell HN: Another round of Zendesk email spam

104•Philpax•2d ago•54 comments

Ask HN: Any International Job Boards for International Workers?

2•15charslong•7h ago•1 comment

Ask HN: Is Connecting via SSH Risky?

19•atrevbot•2d ago•37 comments

Ask HN: Why LLM providers sell access instead of consulting services?

4•pera•17h ago•13 comments

Ask HN: Has your whole engineering team gone big into AI coding? How's it going?

17•jchung•1d ago•12 comments

Ask HN: What is the most complicated Algorithm you came up with yourself?

3•meffmadd•19h ago•7 comments

Ask HN: How does ChatGPT decide which websites to recommend?

5•nworley•1d ago•11 comments

Ask HN: Is it just me or are most businesses insane?

7•justenough•1d ago•5 comments

Ask HN: Mem0 stores memories, but doesn't learn user patterns

9•fliellerjulian•2d ago•6 comments

Ask HN: Is there anyone here who still uses slide rules?

123•blenderob•3d ago•122 comments

Ask HN: Anyone Seeing YT ads related to chats on ChatGPT?

2•guhsnamih•1d ago•4 comments

Ask HN: Does global decoupling from the USA signal comeback of the desktop app?

5•wewewedxfgdf•1d ago•2 comments

Kernighan on Programming

170•chrisjj•4d ago•61 comments

We built a serverless GPU inference platform with predictable latency

5•QubridAI•2d ago•1 comment

Ask HN: How Did You Validate?

4•haute_cuisine•1d ago•4 comments

Ask HN: Does a good "read it later" app exist?

8•buchanae•3d ago•18 comments

Ask HN: Have you been fired because of AI?

17•s-stude•3d ago•15 comments

Ask HN: Cheap laptop for Linux without GUI (for writing)

15•locusofself•3d ago•16 comments

Ask HN: Anyone have a "sovereign" solution for phone calls?

12•kldg•3d ago•1 comment

Test management tools for automation heavy teams

2•Divyakurian•1d ago•2 comments

Ask HN: OpenClaw users, what is your token spend?

14•8cvor6j844qw_d6•4d ago•6 comments

Ask HN: Has anybody moved their local community off of Facebook groups?

23•madsohm•4d ago•18 comments

Ask HN: Is it still worth learning a new programming language?

16•xparadigm•1mo ago
I have been writing Python code for a few years now, but I feel like LLMs can write much better code than me. I used to keep myself updated with newer technology, but now I am losing interest. I was interested in learning Rust, but I don't find any motivation now since I can just vibe code with Rust. Any thoughts on that?

Comments

ben_w•1mo ago
I found myself losing interest but for different reasons. A different magic tool besides LLMs that promised to solve problems I never had, but which has become necessary as most job adverts called for it.

Reactive.

Not just React itself, but the paradigm, so also SwiftUI.

I never had a problem with "massive ViewControllers"; the magic that's supposed to glue it all together is just a little bit fragile and hard to debug when it does break, and the syntactic sugar (at least for SwiftUI) is just self-similar enough that I keep mixing it up.

But learning new languages? Nah, I'm currently learning/getting experience with JavaScript and Ruby by code-reviewing LLM output.

Antibabelic•1mo ago
If you don't actually want to write code there's no reason to learn anything. The question is, if LLMs can write much better code than you, what does your employer need you for?
kevin061•1mo ago
Your employer needs you because writing code was never the hardest part of programming and software engineering in general. The hardest part is managing expectations, responsibilities, cross-team communication, multi-domain expertise, corporate bureaucracy, and pushing back against unnecessary requirements and constraints. LLMs can solve none of that, and they are especially terrible at pushing back.
gus_massa•1mo ago
I agree. I remember once the full specification I got was

> Enough

After talking for 4 hours and 3 coffee cups, I got enough corner cases and the main case to understand what they wanted. A week later I got a list of criteria that could be programmed. Five years later most of the unusual but annoying rough corners were fixed. We still had a button to manually approve the weird cases.

2rsf•1mo ago
LLMs are far from perfect; any production-grade code must go through human inspection unless it's a tiny ad hoc app. This means you need to be familiar with the language and the environment to get good-quality code.
chistev•1mo ago
Specialize
gaganuk•1mo ago
What kind of LLM can write prod-grade code right now? I think LLMs should merely be used as a tool.
lordkrandel•1mo ago
Only juniors can think that. You can "vibe code" with Rust? And who is doing the reviews? Verifying requirements, performance, security? You must know the language very well to operate at a senior level.
kypro•1mo ago
Agreed. There's no way someone can vibe code production-quality code today...

Interestingly as AI models are becoming "more competent" I'm finding more and more issues with AI generated code in the project I work on...

Whenever AI is used by a more junior dev (or a senior dev who simply can't be assed) you always find strange patterns which a senior would never have done...

Typically the code works, but there might be subtle security issues or just unusual coding patterns where it's clear an LLM has written slop; instead of taking a step back and reconsidering its approach when errors crop up, the LLM tends to just add layers of complexity to patch over the slop.

These problems obviously compound if left unchecked.

I actually prefer how things were last year, when coding models were less competent, because at least if a problem was hard enough they'd get nowhere. Today they're good enough to keep hacking until the slop they write is just about working.

As for OP's question, though: I suspect there's less point today in playing around with different technologies just to get a basic understanding of how they work (LLMs can do this). But if you want to be able to guide LLMs towards good solutions and ensure the code produced in the era of AI is good, then having engineers with a deep understanding of the technologies they're using is very important.

apothegm•1mo ago
If you don’t know the language better than the LLM, you can’t notice when it’s making terrible decisions. Use the LLM to accelerate your ramp-up and make you more productive while learning, but it’s not a replacement for learning.
eevmanu•1mo ago
I find the premise reasonable, though I do have an observation.

The current AI hype may have placed us in a filter bubble or echo chamber, shaping our conclusions. These highly specialized algorithms can nudge or reward us for thinking in specific ways.

Regarding programming languages, there is immense value in understanding internal primitives.

As an example, consider concurrency primitives. Different languages provide different levels of abstraction: high-level library support in Python, the event-loop structure in JavaScript, compiler-level implementations in Rust and C++, runtime-intrinsic mechanisms in Go and Java, and virtual-machine intrinsics, as in Erlang.

By viewing languages through this lens, you recognize that each implements these primitives differently, allowing you to choose the most effective tool for the job.
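To make that concrete, here is a minimal sketch of the Python end of that spectrum (assuming only Python 3.8+ and the standard asyncio library; the task names and delays are made up for illustration): concurrency expressed through a high-level library and a cooperative event loop, rather than through compiler-level or runtime-level machinery as in Rust, Go, or Erlang.

    import asyncio

    async def fetch(name: str, delay: float) -> str:
        # Stand-in for an I/O-bound task; await hands control back to the event loop.
        await asyncio.sleep(delay)
        return f"{name} finished after {delay}s"

    async def main() -> None:
        # asyncio.gather is the library-level primitive here: the three tasks are
        # interleaved cooperatively on one event loop, not scheduled by the OS or a VM.
        results = await asyncio.gather(
            fetch("a", 0.2),
            fetch("b", 0.1),
            fetch("c", 0.3),
        )
        for line in results:
            print(line)

    asyncio.run(main())

The three waits overlap, so the whole run takes roughly 0.3s rather than 0.6s; the same fan-out in Go would reach for goroutines and channels, and in Erlang for processes on the BEAM, which is exactly the difference in primitives described above.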

If your goal is to assess the short-term economic value of a technology, your logic is understandable. However, learning new languages and tools remains worthwhile. When AI agents begin invoking these tools on the fly, you may not know whether a specific choice is the most effective one, and without that knowledge you will lack the grounding to challenge the AI's decisions.

In the long run, making the effort to master these concepts yields far greater value as a software engineer. It enables you to understand the rationale behind applying a precise tool to a precise task.

There are valid arguments supporting various perspectives on this. However, while any approach can be useful, this discussion highlights the need for wisdom: the awareness of one's own biases. As I noted earlier, filter bubbles can distort judgment. Continuously questioning your conclusions helps ensure you move toward the best outcomes. I hope you find this recommendation useful.

whatamidoingyo•1mo ago
For the longest time, I wanted to really dive deep into lower-level learning (e.g. C, Assembly, HDL, chips). LLMs temporarily killed my motivation to continue learning C. I wanted to build a clipboard history similar to Windows 11's, but for a Linux-based OS. I prompted ChatGPT for the code, and it spit some out. It was pretty bad, nowhere near a finished project. I deleted the LLM code and started anew.

I remembered why I wanted to learn this stuff. It's not for money, or to look cool.

It's for the fascination I have for computing.

How do electrons flow through a wire? How do the chips within a computer direct that flow to produce an image on a screen? These questions are mind-blowing for me. I don't think LLMs can kill this fascination. Although, for web programming, sure. I always hated front-end programming, and now I don't really have to do it (I don't have the same fascination for the why of such tech). So will I ever learn a new front-end framework? Most likely not.

sloaken•1mo ago
Nand to Tetris - you can thank me later
whatamidoingyo•1mo ago
Haha, I've taken it. Incredible course. I'll raise your suggestion to the Turing Complete game :)
softwaredoug•1mo ago
LLMs can write. Often with more clarity than I can. But I still like to write, because writing is thinking. And I want to hone my thinking about the problem.

The same can be said about coding. Code to think and explore a problem. See how different languages help you approach a problem. Get a deeper understanding about a topic by coding.

sky2224•1mo ago
My question for you is: how do you not learn a language by having an LLM aid in writing it for you?

What I've found now is LLMs allow me to use/learn new languages, frameworks, and other technologies while being significantly more productive than before. On the flip side, as others have mentioned, I shoot myself in the foot more often.

Basically, I output more, I see more pitfalls upfront, and I get proficient sooner. This can be the case for you if you take an active development approach rather than a passive one where you just blindly accept whatever a model outputs.

gethly•1mo ago
If you want to be a serious programmer, you have to learn a compiled language, not the Python garbage. And definitely not Rust, as that is for seasoned programmers. Java/Scala/Kotlin, Zig, Odin, Jai (next year), C, C++, C#, Go, and Swift are safe bets with employment opportunities that also let you venture into other languages with ease.

Forget about LLMs. Those are for normies.

I totally understand the lack of motivation. Doing something just for the sake of doing it is pointless. So you have to find a project to do, and the language will then become merely a tool instead of the whole point. Find out what you need or want to build (maybe a 3D engine, a mobile application, a database, a search engine, image recognition/OCR, robotics, or some Arduino automation, whatever) and just start working on it.