
From GPT-4 to GPT-5: Measuring Progress in Medical Language Understanding [pdf]

https://www.fertrevino.com/docs/gpt5_medhelm.pdf
38•fertrevino•2h ago•15 comments

Uv format: Code Formatting Comes to uv (experimentally)

https://pydevtools.com/blog/uv-format-code-formatting-comes-to-uv-experimentally/
113•tanelpoder•4h ago•79 comments

Crimes with Python's Pattern Matching (2022)

https://www.hillelwayne.com/post/python-abc/
106•agluszak•5h ago•38 comments

Happy 0b100000th Birthday, Debian

https://lists.debian.org/debian-devel-announce/2025/08/msg00006.html
13•pabs3•3d ago•0 comments

An interactive guide to SVG paths

https://www.joshwcomeau.com/svg/interactive-guide-to-paths/
188•joshwcomeau•3d ago•20 comments

Elegant mathematics bending the future of design

https://actu.epfl.ch/news/elegant-mathematics-bending-the-future-of-design/
36•robinhouston•3d ago•0 comments

Show HN: Changefly ID + Anonymized Identity and Age Verification

https://www.changefly.com/blog/2025/08/anonymized-identity-and-age-verification-a-new-era-of-privacy-for-changefly-id
9•davidandgoli4th•5h ago•2 comments

AI tooling must be disclosed for contributions

https://github.com/ghostty-org/ghostty/pull/8289
492•freetonik•6h ago•257 comments

DeepSeek-v3.1 Release

https://api-docs.deepseek.com/news/news250821
262•wertyk•5h ago•59 comments

My other email client is a daemon

https://feyor.sh/blog/my-other-email-client-is-a-mail-daemon/
86•aebtebeten•16h ago•17 comments

Beyond sensor data: Foundation models of behavioral data from wearables

https://arxiv.org/abs/2507.00191
189•brandonb•10h ago•41 comments

Miles from the ocean, there's diving beneath the streets of Budapest

https://www.cnn.com/2025/08/18/travel/budapest-diving-molnar-janos-cave
98•thm•3d ago•13 comments

Show HN: Splice – CAD for Cable Harnesses and Electrical Assemblies

https://splice-cad.com
21•djsdjs•3h ago•4 comments

Text.ai (YC X25) Is Hiring Founding Full-Stack Engineer

https://www.ycombinator.com/companies/text-ai/jobs/OJBr0v2-founding-full-stack-engineer
1•RushiSushi•3h ago

Weaponizing image scaling against production AI systems

https://blog.trailofbits.com/2025/08/21/weaponizing-image-scaling-against-production-ai-systems/
311•tatersolid•12h ago•83 comments

How well does the money laundering control system work?

https://www.journals.uchicago.edu/doi/10.1086/735665
176•PaulHoule•11h ago•175 comments

The Onion Brought Back Its Print Edition. The Gamble Is Paying Off

https://www.wsj.com/business/media/the-onion-print-subscribers-6c24649c
63•andsoitis•2h ago•11 comments

Beyond the Logo: How We're Weaving Full Images Inside QR Codes

https://blog.nitroqr.com/beyond-the-logo-how-were-weaving-full-images-inside-qr-codes
36•bhasinanant•3d ago•14 comments

Using Podman, Compose and BuildKit

https://emersion.fr/blog/2025/using-podman-compose-and-buildkit/
241•LaSombra•13h ago•79 comments

Philosophical Thoughts on Kolmogorov-Arnold Networks (2024)

https://kindxiaoming.github.io/blog/2024/kolmogorov-arnold-networks/
8•jxmorris12•3d ago•0 comments

Show HN: OS X Mavericks Forever

https://mavericksforever.com/
289•Wowfunhappy•3d ago•120 comments

Building AI products in the probabilistic era

https://giansegato.com/essays/probabilistic-era
85•sdan•6h ago•50 comments

The power of two random choices (2012)

https://brooker.co.za/blog/2012/01/17/two-random.html
41•signa11•3d ago•3 comments

Privately-Owned Rail Cars

https://www.amtrak.com/privately-owned-rail-cars
91•jasoncartwright•12h ago•130 comments

Mirage 2 – Generative World Engine

https://demo.dynamicslab.ai/chaos
14•selimonder•3h ago•4 comments

Mark Zuckerberg freezes AI hiring amid bubble fears

https://www.telegraph.co.uk/business/2025/08/21/zuckerberg-freezes-ai-hiring-amid-bubble-fears/
673•pera•13h ago•679 comments

The contrarian physics podcast subculture

https://timothynguyen.org/2025/08/21/physics-grifters-eric-weinstein-sabine-hossenfelder-and-a-crisis-of-credibility/
152•Emerson1•7h ago•180 comments

Launch HN: Skope (YC S25) – Outcome-based pricing for software products

38•benjsm•9h ago•30 comments

I forced every engineer to take sales calls and they rewrote our platform

https://old.reddit.com/r/Entrepreneur/comments/1mw5yfg/forced_every_engineer_to_take_sales_calls_they/
246•bilsbie•9h ago•171 comments

The Core of Rust

https://jyn.dev/the-core-of-rust/
142•zdw•8h ago•118 comments

A Decoder Ring for AI Job Titles

https://www.dbreunig.com/2025/08/21/a-guide-to-ai-titles.html
51•dbreunig•5h ago

Comments

extr•4h ago
Seems about right. My official title at work is "AI Engineer". What does that mean exactly?

- I'm not a researcher and not fine tuning or deploying models on GPUs

- I have a math/traditional ML background, but my explanation of how transformers, tokenizers, etc work would be hand-wavy at best.

- I'm a "regular engineer" in the sense I'm following many of the standard SWE/SDLC practices in my org.

- I'm exclusively focused on building AI features for our product, I wear a PM hat too.

- I'm pretty tuned in to the latest model releases and capabilities of frontier models, and consider being able to articulate that information part of my job.

- I also use AI heavily to produce code, which is helpfully a pretty good way to get a sense for model capabilities.

Do I deserve a special job title...maybe? I think there's definitely an argument that "AI Engineering" really isn't a special thing, and considering how much of my day to day is pure integration work with the actual product, I can see that. OTOH, part of my job and my value at work is very product based. I pay a lot of attention to what other people in the industry are doing, new model releases, and how others are building things, since it's such a new area and there's no "standard playbook" yet for many things.

I actually quite enjoy it since there's a ton of opportunity to be creative. When AI first started becoming big I thought about going the other direction: leveraging my math/ML background to get deeper into GPUs and MLOps/research-lite kind of work. Instead I went in a more producty direction, which I don't regret yet.

apwell23•3h ago
> I pay a lot of attention to what other people in the industry are doing, new model releases, and how others are building things,

what do you think of the recent MIT news that 95% of gen AI projects don't do anything valuable at all?

extr•3h ago
Sounds kind of aggressive, but the number is probably up there.
Nevermark•2h ago
> what do you think of the recent MIT news that 95% of gen AI projects don't do anything valuable at all?

Worth noting that a project that ends up “doing nothing” isn’t the same as a project that had/created no value.

That goes even for projects that, in hindsight, were deterministic lemons.

Assuming compute resources continue scaling up and architectures keep improving, AI change now has an everything, everywhere, all the time scope. Failing fast is necessarily going to happen at a substantial scale.

IanCal•1h ago
Not sure the majority of projects do anything valuable at all.
janalsncm•23m ago
You would need to compare that to the baseline value creation from non-AI projects.
tamimio•3h ago
Who cares? The word "engineer" is meaningless now, and anyone can be a self-proclaimed engineer whenever they feel like it. Will anyone double-check, or even reject you for an engineering job when you are not one? Absolutely not! Take a bootcamp, submit plenty of PRs correcting typos, pass the interview with the help of AI, and you've basically made it. Dreams come true!
glitchc•3h ago
Yeah, the term "engineer" has been diluted into oblivion, and we only have ourselves to blame for not protecting it.
GLdRH•3h ago
In German you're not even an engineer if you don't sometimes wear a hard hat or hold a screwdriver.
tamimio•3h ago
Agree 100%. Even blue-collar workers guard their profession. Hell, I was talking to a friend, and they rejected her for a retail job because she had never worked in retail before. Engineering, on the other hand, has zero gatekeeping; it's a sign-spinner job right now. Just do a few humiliation rituals like daily standup and you're the perfect candidate!
fakedang•1h ago
In Dubai, the poor underpaid folks cleaning the roads and gutters late at night are called "Cleaning Engineers" and "Garden Engineers". It's honestly sad, almost a mockery.
quesera•33m ago
In the US, we've had "sanitation engineer" as the euphemistic neologism for "worker paid to pick up your garbage bins" for 50(?) years.
GuinansEyebrows•1h ago
protecting it? ha! we’re just the first group of greater fools who thought it applied to us in the first place (hell, i became an “engineer” with an Associates degree!). just because we benefited from the prestige doesn’t always mean we’re actually held to the classical standards of engineers.
andrew_lettuce•3h ago
We all know the AI part is largely meaningless because of the hype and nonsense, but what defines you as an engineer? When you consider that classical engineers are responsible for the correctness of their work, combining it with AI seems like a joke
extr•3h ago
Hard to tell what you're even trying to say here. I am obviously responsible for the correctness of my work. "AI Engineer" does not generally mean "AI-Assisted Engineer"; I thought that was clear from my post.
potatolicious•3h ago
> "When you consider that classical engineers are responsible for the correctness of their work"

Woah hang on, I think this betrays a severe misunderstanding of what engineers do.

FWIW I was trained as a classical engineer (mechanical), but pretty much just write code these days. But I did have a past life as a not-SWE.

Most classical engineering fields deal with probabilistic system components all of the time. In fact I'd go as far as to say that inability to deal with probabilistic components is disqualifying from many engineering endeavors.

Process engineers for example have to account for human error rates. On a given production line with humans in a loop, the operators will sometimes screw up. Designing systems to detect these errors (which are highly probabilistic!), mitigate them, and reduce the occurrence rates of such errors is a huge part of the job.

Likewise even for regular mechanical engineers, there are probabilistic variances in manufacturing tolerances. Your specs are always given with confidence intervals (this metal sheet is 1mm thick +- 0.05mm) because of this. All of the designs you work on specifically account for this (hence safety margins!). The ways in which these probabilities combine and interact is a serious field of study.

Software engineering is unlike traditional engineering disciplines in that for most of its lifetime it's had the luxury of purely deterministic expectations. This is not true in nearly every other type of engineering.

If anything the advent of ML has introduced this element to software, and the ability to actually work with probabilistic outcomes is what separates those who are serious about this stuff vs. demoware hot air blowers.
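The tolerance-stacking point above can be made concrete with a toy Monte Carlo sketch (the sheet spec is taken from the comment; treating the ±0.05mm tolerance as a 3-sigma bound on a normal distribution is an assumption for illustration):

```python
import random

def sheet_thickness(nominal=1.0, tol=0.05, sigmas=3):
    # Treat the +-tol spec as a 3-sigma bound on a normal distribution.
    return random.gauss(nominal, tol / sigmas)

def stack_height(n_sheets=3):
    # Total height of a stack is the sum of independent sheet thicknesses.
    return sum(sheet_thickness() for _ in range(n_sheets))

random.seed(0)
samples = [stack_height() for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"stack mean ~= {mean:.3f} mm, std ~= {var ** 0.5:.4f} mm")
# Variances add, so the stack std is ~sqrt(3) * 0.0167 ~= 0.029 mm,
# much tighter than the naive 3 * 0.05 mm worst case would suggest.
```

This is the "serious field of study" in miniature: independent variances add, so combined tolerances grow like the square root of the number of components, not linearly.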

extr•3h ago
Nicely said, I'm going to borrow some language here. I've talked a little to my coworkers about how it's possible the future of SWE looks more like "build a complex system with AI and test it to death to make sure it fits inside the performance envelope you require".
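That "test it to death" idea can be sketched as a statistical acceptance test. Everything here is invented for illustration: the flaky component, its 97% success rate, and the 95% envelope threshold:

```python
import random

def flaky_extractor(text: str) -> bool:
    # Stand-in for an AI component that succeeds probabilistically.
    return random.random() < 0.97

def acceptance_test(component, n_trials=10_000, required_rate=0.95):
    # Estimate the success rate empirically and check that it fits
    # inside the performance envelope the system requires.
    successes = sum(component("example input") for _ in range(n_trials))
    rate = successes / n_trials
    return rate, rate >= required_rate

random.seed(42)
rate, ok = acceptance_test(flaky_extractor)
print(f"observed success rate {rate:.3f}, passes: {ok}")
```

In practice the component would be a model call and the trials a labeled eval set, but the shape is the same: measure the distribution of outcomes, then gate on it.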
dbreunig•3h ago
I will be thinking about this comment for a bit. Thanks for this perspective!
simonw•3h ago
This comment is excellent.
whatevertrevor•3h ago
You're right in a descriptive manner, but I also think the parent comment's point is about correctness and not determinism.

In other engineering fields correctness-related-guarantees can often be phrased in probabilistic ways, e.g. "This bridge will withstand a 10-year flood event but not a 100-year flood event", but underneath those guarantees are hard deterministic load estimates with appropriate error margins.

And I think that's where the core disagreement between you and the parent comment lies. I think they're trying to say that people pushing AI-generated code are often getting fuzzy on speccing out the behavior guarantees of their own software. In some ways the software industry has _always_ been bad at this: despite working with deterministic math, surprise software bugs are plentiful. But vibe-coding takes this to another level.

(This is my best-case charitable understanding of what they're saying, but also happens to be where I stand)

potatolicious•30m ago
> "I think they're trying to say AI generated code-pushers are often getting fuzzy on speccing out the behavior guarantees of their own software."

I agree, and I think that's the root of the years-long argument of whether programmers are "real" engineers, where "real engineering" implies a level of rigor about the existence of and adherence to specifications.

My take, though, is that this unseriousness really has little to do with AI and everything to do with the longstanding culture of software generally. In fact I'd go as far as to say that pre-LLM ML was better about this than the rest of the industry at large.

I've had the good fortune to be working in this realm since before LLMs became the buzzword - most ML teams had well-quantified model behaviors! They knew their precision and recall! You kind of had to, because it was very hard to get models to do what you wanted, plus companies involved in this space generally cared about outcomes.

Then we got LLMs, when you can superficially produce really impressive results easily, and the dominance of vibes over results. I can't stand it either, and mostly am just waiting for most of these things to go bust so we can go back to probabilistic systems where we give a shit about quantification.
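The "well-quantified model behaviors" bookkeeping mentioned above is simple to sketch; the labels and predictions here are made up:

```python
def precision_recall(y_true, y_pred):
    # Precision: of everything flagged positive, how much was right?
    # Recall: of everything actually positive, how much did we catch?
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy evaluation set: 1 = positive class.
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0]
p, r = precision_recall(y_true, y_pred)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.80 recall=0.80
```

Pre-LLM ML teams typically tracked exactly these numbers on a held-out set for every model they shipped.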

Fellshard•2h ago
This seems to me patently absurd, because LLMs are not part of the probabilistic environment of the domain you're engineering; rather, you're injecting new probabilistic inputs into your system. That seems to me to be a wholly different category, and wildly misrepresents how an engineer is supposed to operate and think.
potatolicious•22m ago
> "because LLMs are not part of the probabilistic environment of the domain you're engineering; rather, you're injecting new probabilistic inputs into your system"

You do this as a process engineer also. You don't have to have a human operator inserting the stator into the motor housing, you could have a robot do it (it would cost a lot more) and be a lot more deterministic.

After the stator is in the housing you don't need to have a human operator close it using a hand tool. You could do it robotically in which case the odds of failure are much lower. That also costs a lot.

You choose to insert probabilistic components into the system because you've evaluated the tradeoffs around it and decided it's worth it.

Likewise you could do sentiment analysis of a restaurant review in a non-probabilistic manner - there are many options! But you choose a probabilistic ML model because it does a better job overall and you've evaluated the failure modes.

These things really aren't that different.
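The sentiment-analysis contrast above can be sketched. A deterministic lexicon classifier (word lists invented here) is one of the "non-probabilistic" options being traded off against an ML model:

```python
# A deterministic, rule-based sentiment classifier: the same input always
# gives the same output, and its failure modes are fully enumerable.
POSITIVE = {"great", "delicious", "friendly", "amazing", "loved"}
NEGATIVE = {"terrible", "bland", "rude", "awful", "slow"}

def lexicon_sentiment(review: str) -> str:
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(lexicon_sentiment("The food was delicious and the staff friendly"))  # positive
print(lexicon_sentiment("Slow service and bland pasta"))                   # negative
```

The deterministic version fails predictably (negation, sarcasm, unseen vocabulary); the probabilistic model does better overall but fails stochastically. Choosing between them is the tradeoff evaluation the comment describes.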

nlawalker•2h ago
The author’s definitions suggest you should have “Applied” in your title, which I like, but my impression is that “applied” roles so vastly outnumber “creation of models” roles globally that it’s actually the latter that would benefit from a modifier. For now, you have to rely on context (mostly the nature of the company’s primary output) when trying to interpret something like a job posting or an acquaintance’s title.
janalsncm•29m ago
It’s not that crazy to add a couple of domain-specific prediction heads to a BERT-family pretrained model and then do a quick fine tuning. By volume that’s less common but I would guess most people are just using things off the shelf and might not even consider themselves AI engineers. I have no frame of reference though.
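The "couple of domain-specific prediction heads on a BERT-family model" setup can be sketched in PyTorch. The head names, sizes, and stand-in encoder below are all illustrative; in practice the encoder would be a pretrained checkpoint (e.g. from HuggingFace):

```python
import torch
import torch.nn as nn

class MultiHeadClassifier(nn.Module):
    """Two domain-specific heads on top of a frozen BERT-style encoder.

    `encoder` is any module returning a (batch, seq, hidden) tensor.
    """

    def __init__(self, encoder, hidden=768, n_topics=5, n_sentiments=3):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False  # "quick fine-tune": train the heads only
        self.topic_head = nn.Linear(hidden, n_topics)
        self.sentiment_head = nn.Linear(hidden, n_sentiments)

    def forward(self, input_ids):
        hidden = self.encoder(input_ids)  # (batch, seq, hidden)
        pooled = hidden[:, 0, :]          # [CLS]-style pooling on token 0
        return self.topic_head(pooled), self.sentiment_head(pooled)

# Stand-in encoder so the sketch runs without downloading weights.
toy_encoder = nn.Sequential(nn.Embedding(1000, 768))
model = MultiHeadClassifier(toy_encoder)
topics, sentiments = model(torch.randint(0, 1000, (2, 16)))
print(topics.shape, sentiments.shape)
```

Freezing the encoder and training only the linear heads is what keeps the fine-tune "quick": a tiny fraction of the parameters, so it runs on modest hardware.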
jimbobimbo•3h ago
"Forward Deployed Engineer" is a bodyshop with LLM.
Avicebron•2h ago
They could just call it "Field Service Tech" like the rest of the universe. I understand using title inflation/deflation to keep pushing the engineer title (and pay expectation) into the dirt, but still, this is dumb.
dbreunig•2h ago
I also dislike the term. It feels concocted to evoke “tacticool” vibes.

Unless you’re pushing new firmware onto a drone in Ukraine, FDE is stolen valor.

jordanb•2h ago
Pretty sure this title came from Palantir, who got it from the military.
bigiain•1h ago
"Forward Deployed Software Engineer - This role includes working in locations that include risks of getting shot and possibly killed"
Squeeeez•1h ago
Well, you probably don't want to serve "Backwards Deployed Engineers" to your clients.
gdbsjjdn•2h ago
I thought this was going to be satire. Software engineer job titles are already pretty bogus (Senior Principal Distinguished Engineer, anyone?), and the AI trend has only created more jobs with nebulous descriptions around "doing AI".
ceejayoz•2h ago
I wanna just be a webmaster again.
hervature•2h ago
I think you mean webmain.
layer8•2h ago
How about promptmaster? ;)
trhway•1h ago
AI-snake charmer.
kingbob000•2h ago
Yeah, same. I was actually disappointed when I saw that they were taking the titles seriously
dude250711•2h ago
Senior Anything-But-C-Level-Compensation-Package Engineer.
crorella•1h ago
I saw an "Exalted engineer" once, not kidding.
notatoad•1h ago
you might not be kidding, hopefully they were.

when you decide titles don't matter and let people choose their own, you get some titles that weren't created in total seriousness.

nphardon•2h ago
Still not clear to me what is meant by "AI" now. My sense is that it's a marketing term for LLMs. Is that accurate? Do people now consider any ML project to be AI?
dbreunig•2h ago
You should read the post. You might find the “domain” discussion interesting.
nphardon•2h ago
That's what I was alluding to; I don't think it defines AI, do you? These pieces seem like classical ML to me, plus LLMs. Is that AI? Like, from a technical standpoint, is it clearly defined?
layer8•2h ago
It’s not clearly defined. Nowadays by default it means generative AI (https://en.wikipedia.org/wiki/Generative_artificial_intellig...).
tomrod•1h ago
AI is defined by algorithmic decision making. ML, a subset, is about using pattern matching with statistical uncertainty in that decision making. GenAI uses algorithms of classical ML, including deep learning based on neural networks, to encode and decode input to output, with the input jargonized as a prompt. Whether diffusion or next-token prediction, the patterns are learned during ML training.

AI is not totally encapsulated by ML. For example, reinforcement learning is often considered distinct in some AI ontologies. Decision rules and similar methods from the 1970s and 1980s are also included though they highlight the algorithmic approach versus the ML side.

There are certainly many terms used and misused by current marketing (especially the bitcoin bro grifters who saw AI as an out of a bad set of assets), but there actually is clarity to the terms if one considers their origins.

nphardon•1h ago
"AI is not totally encapsulated by ML": that's the part I haven't been able to put my finger on. I understand that it's not encapsulated; ML is not intelligence, it's gradient descent. So what is in the set AI - {ML}?
adenta•2h ago
Should’ve included “Member of Technical Staff”
gamerDude•1h ago
I assume if you are applying to AI roles, you use AI to find and possibly apply for you. So, we don't even need to understand what the titles mean because AI can do it for us.

I'm tempted to use /s, but then again...

latexr•1h ago
> Even when you live and breathe AI, the job titles can feel like a moving target. I can only imagine how mystifying they must be to everyone else.

> Because the field is actively evolving, the language we use keeps changing. Brand new titles appear overnight or, worse, one term means three different things at three different companies.

How can you write that and not realise “maybe this is all made up bullshit and everyone is pulling titles out of their asses to make themselves look more important and knowledgeable than they really are, thus I shouldn’t really be wasting my time giving the subject any credence”? If you’re all in on the field and can’t keep up, why should anyone else care?

janalsncm•35m ago
I agree some analysis of job postings or pay distributions by title would’ve made this article stronger. The titles are less relevant than the job descriptions, which are task specific and not bullshit.
rvz•1h ago
Almost nobody here wanted to be an 'AI researcher' until late 2022 when the money started pouring into AI researchers.

Now with this article clearly defining each of these roles (AI researcher being the most serious out of the rest) everyone now suddenly wants to be one.

"AI" is a vast field which spans beyond deep learning and LLMs. Unless you are very serious and fully interested in actually advancing the field, don't bother.

Why not robotics or electrical engineer? Not cool enough?

janalsncm•38m ago
My heuristic has been

ML engineer => knows pytorch

AI engineer => knows huggingface

Researcher => implements papers

I know these heuristics are imperfect but I call myself an MLE because it’s closest to my skillset.