frontpage.

Valve releases Steam Controller CAD files under Creative Commons license

https://www.digitalfoundry.net/news/2026/05/valve-releases-steam-controller-cad-files-under-creat...
491•haunter•3h ago•160 comments

Appearing Productive in the Workplace

https://nooneshappy.com/article/appearing-productive-in-the-workplace/
181•diebillionaires•2h ago•65 comments

From Supabase to Clerk to Better Auth

https://blog.val.town/better-auth
65•stevekrouse•1h ago•19 comments

The bottleneck was never the code

https://www.thetypicalset.com/blog/thoughts-on-coding-agents
374•Anon84•2d ago•255 comments

BYD overtakes Tesla and Kia as the best-selling EV brand in key overseas markets

https://electrek.co/2026/05/05/byd-overtakes-tesla-kia-best-selling-ev-brand-key-overseas-markets/
35•doener•29m ago•9 comments

Show HN: I built an open-source email builder, alternative to Beefree/Unlayer

https://play.templatical.com
34•oahmadov•2h ago•11 comments

CARA 2.0 – “I Built a Better Robot Dog”

https://www.aaedmusa.com/projects/cara2
394•hakonjdjohnsen•2d ago•48 comments

What makes a good smartphone camera?

https://cadence.moe/blog/2026-05-05-what-makes-a-good-smartphone-camera
36•zdw•1d ago•18 comments

Setting up a Sun Ray server on OpenIndiana Hipster 2025.10

https://catstret.ch/202605/srss-hipster202510/
105•jandeboevrie•8h ago•31 comments

Google tools for customizing searches

https://cardcatalogforlife.substack.com/p/google-has-a-secret-reference-desk
39•maxutility•14h ago•4 comments

Colombia hosts talks on exiting fossil fuels as global energy crisis deepens

https://www.latimes.com/environment/story/2026-04-26/colombia-hosts-talks-on-exiting-fossil-fuels...
58•PaulHoule•2h ago•25 comments

Knitting bullshit

https://katedaviesdesigns.com/2026/04/29/knitting-bullshit/
362•ColinEberhardt•13h ago•157 comments

The Thinking Plant's Man (2025)

https://www.sciencehistory.org/stories/magazine/the-thinking-plants-man/
46•benbreen•1d ago•10 comments

Reverse-engineering the 1998 Ultima Online demo server

https://draxinar.github.io/articles/2026-05-01-uodemo-reverse-engineering.html
186•notsentient•12h ago•47 comments

CNN founder Ted Turner, a pioneer of cable TV news, dies at 87

https://www.cnn.com/2026/05/06/us/ted-turner-death
104•pseudolus•3h ago•80 comments

Batteries Not Included, or Required, for These Smart Home Sensors

https://coe.gatech.edu/news/2026/04/batteries-not-included-or-required-these-smart-home-sensors
161•gnabgib•3d ago•58 comments

Multi-stroke text effect in CSS

https://yuanchuan.dev/multi-stroke-text-effect-in-css
252•cheeaun•14h ago•33 comments

A Theory of Deep Learning

https://elonlit.com/scrivings/a-theory-of-deep-learning/
4•elonlit•23h ago•0 comments

Going Full Time on Open Source

https://jdx.dev/posts/2026-04-17-going-full-time-on-open-source/
40•thunderbong•1h ago•2 comments

Coverage Cat (YC S22) Seeks Fractional Engineer to build AI Growth Toolkit

https://www.coveragecat.com/careers/engineering/fractional-growth-engineer
1•botacode•6h ago

245TB Micron 6600 ION Data Center SSD Now Shipping

https://investors.micron.com/news-releases/news-release-details/industry-leading-245tb-micron-660...
207•neilfrndes•15h ago•156 comments

Show HN: Tilde.run – Agent Sandbox with a Transactional, Versioned Filesystem

https://tilde.run/
85•ozkatz•2h ago•71 comments

YouTube, your RSS feeds are broken

https://openrss.org/blog/youtube-your-feeds-are-broken
287•veeti•17h ago•99 comments

Wolfenstein 3D for Gameboy Color on custom cartridge (2016)

https://www.happydaze.se/wolf/
109•ksymph•2d ago•19 comments

Agents can now create Cloudflare accounts, buy domains, and deploy

https://blog.cloudflare.com/agents-stripe-projects/
584•rolph•15h ago•331 comments

Google Cloud fraud defense, the next evolution of reCAPTCHA

https://cloud.google.com/blog/products/identity-security/introducing-google-cloud-fraud-defense-t...
22•unforgivenpasta•54m ago•9 comments

StarFighter 16-Inch

https://us.starlabs.systems/pages/starfighter
608•signa11•16h ago•335 comments

Vibe coding and agentic engineering are getting closer than I'd like

https://simonwillison.net/2026/May/6/vibe-coding-and-agentic-engineering/
165•e12e•3h ago•228 comments

Egg Intake and the Incidence of Alzheimer's Disease in Adventist Health Study-2

https://www.sciencedirect.com/science/article/pii/S0022316626001902
19•Stratoscope•1h ago•15 comments

Chrome downloads a 4GB AI file without user consent, researcher alleges

https://www.engadget.com/2166113/chrome-downloads-a-4gb-ai-file-without-user-consent-researcher-a...
12•netfortius•1h ago•1 comment

Link Time Optimizations: New Way to Do Compiler Optimizations

https://johnnysswlab.com/link-time-optimizations-new-way-to-do-compiler-optimizations/
39•signa11•11mo ago

Comments

sakex•11mo ago
Maybe add the date to the title, because it's hardly new at this point
vsl•11mo ago
...or in 2020 (the year of the article).
Deukhoofd•11mo ago
What do you mean, new? LTO has been in GCC since 2011. It's old enough to have a social media account in most jurisdictions.
jeffbee•11mo ago
Pretty sure MSVC ".NET" was doing link-time whole-program optimization in 2001.
andyayers•11mo ago
HPUX compilers were doing this back in 1993.
jeffbee•11mo ago
Oh yeah, well ... actually I got nothin'. You win.

I will just throw in some nostalgia for how good that compiler was. My college roommate brought an HP pizza box that his dad secured from HP, and the way the C compiler quoted chapter and verse from ISO C in its error messages was impressive.

abainbridge•11mo ago
Or academics in 1986: https://dl.acm.org/doi/abs/10.1145/13310.13338

The idea of optimizations running at different stages in the build, with different visibility of the whole program, was discussed in 1979, but the world was so different back then that the discussion seems foreign. https://dl.acm.org/doi/pdf/10.1145/872732.806974

srean•11mo ago
Yes, and if I remember correctly there used to be Linux distros that shipped all of their binaries LTO'ed.
phkahler•11mo ago
I tried LTO with Solvespace 4 years ago and got about 15 percent better performance:

https://github.com/solvespace/solvespace/issues/972

Build time was terrible: a few minutes vs. 30-40 seconds for a full build. Have they done anything to use multiple cores for LTO? It only used one core for that.

Also tested OpenMP which was obviously a bigger win. More recently I ran the same test after upgrading from an AMD 2400G to a 5700G which has double the cores and about 1.5x the IPC. The result was a solid 3x improvement so we scale well with cores going from 4 to 8.

wahern•11mo ago
Both clang and GCC support multi-core LTO, as does Rust. However, you have to partition the code, so the more cores you use the less benefit to LTO. Rust partitions by crate by default, but it can increase parallelism by partitioning each crate. I think "fat LTO" is the term typically used for whole-program, or at least in the case of Rust, whole-crate LTO, whereas "thin LTO" is what you get when you LTO partitions and then link those together normally. For clang and GCC, you can either have them automatically partition the code for thin LTO, or do it explicitly via your Makefile rules[1].

[1] Interestingly, GCC actually invokes Make internally to implement thin LTO, which lets it play nice with GNU Make's job control and obey the -j switch.
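To make the flags concrete, a minimal sketch (the file names are made up; `-flto=auto` needs GCC 10+, and `-flto=thin` is clang's ThinLTO):

```c
/* add.c: a function in its own translation unit. Without LTO, the call
 * from main.c crosses an object-file boundary and cannot be inlined;
 * with LTO, the inliner sees across translation units.
 *
 *   fat (whole-program) LTO:   gcc   -O2 -flto       add.c main.c -o app
 *   parallel LTO (GCC 10+):    gcc   -O2 -flto=auto  add.c main.c -o app
 *   thin (partitioned) LTO:    clang -O2 -flto=thin  add.c main.c -o app
 */
int add(int a, int b) { return a + b; }
```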

WalterBright•11mo ago
Link time optimizations were done in the 1980s if I recall correctly.

I never tried to implement them, finding it easier and more effective for the compiler to simply compile all the source files at the same time.

The D compiler is designed to be able to build one object file per source file at a time, or one object file which combines all of the source files. Most people choose the one object file.

srean•11mo ago
I think MLton does it this way.

http://mlton.org/WholeProgramOptimization

Dynamically linked and dynamically loaded libraries are useful though (paid for with its problems of course)

tester756•11mo ago
Yea, generating many object files seems like a weird thing. Maybe it was a good thing decades ago, but now?

Because then you need to link them, thus you need some kind of linker.

Just generate one output file and skip the linker

WalterBright•11mo ago
I've considered many times doing just that.
tester756•11mo ago
And what was the result/conclusion of such considerations?
WalterBright•11mo ago
Not worth the effort.

1. linkers have increased enormously in complexity

2. little commonality between linkers for different platforms

3. compatibility with the standalone linkers

4. trying to keep up with constant enhancement of existing linkers

yencabulator•11mo ago
Not maybe. Sufficient RAM for compilation was a serious issue back in the day.
kazinator•11mo ago
Sure, and if any file is touched, just process them all.
adrian_b•11mo ago
Some compilers had incremental compilation to handle this during development builds.

Then only the functions touched inside some file would be recompiled, not the remainder of the file or other files.

Obviously, choosing incremental compilation inhibited some optimizations.

adrian_b•11mo ago
Generating many object files is pointless for building an executable or a dynamic library, but it remains the desired behavior for building a static library.

Many software projects that must generate multiple executables are better structured as a static library plus one source file with the "main" function for each executable.

WalterBright•11mo ago
One thing the D compiler does is generate a library in one step (no need to use the librarian). Give it a bunch of source files and object files on the command line, specify a library as the output, and boom! The library is created directly (compiling the source files and adding the object files).

I haven't used a librarian program for maybe a decade.

senkora•11mo ago
In C++, there is a trick to get this behavior called "unity builds", where you include all of your source files into a single file and then invoke the compiler on that file.

Of course, being C++, this subtly changes behavior and must be done carefully. I like this article that explains the ins and outs of using unity builds: https://austinmorlan.com/posts/unity_jumbo_build/
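The mechanism itself is tiny; a hypothetical unity.c (the included file names are made up) just textually merges the translation units:

```c
/* unity.c: a "jumbo" translation unit. The compiler now sees every
 * definition at once and can inline across former object-file boundaries.
 * The catch: file-scope `static` names and leftover macros from each .c
 * file now share one namespace, which is where the subtle behavior
 * changes come from. */
#include "parser.c"
#include "codegen.c"
#include "main.c"
```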

WalterBright•11mo ago
> this subtly changes behavior

The D module design ensures that module imports are independent of each other and are independent of the importer.

YorickPeterse•11mo ago
For Inko (https://inko-lang.org/) I went a step further: it generates an object file for each type, instead of per source file or per project. The idea is that if e.g. a generic type is specialized into a new instance (or has some methods added to it), only the object file for that type needs to be re-generated. This in turn should allow for much more fine-grained incremental compilation.

The downside is that you can end up with thousands of object files, but for modern linkers that isn't a problem.

dooglius•11mo ago
It sounds like this would prevent the inherent concurrency you would get out of handling files separately?
WalterBright•11mo ago
It's complicated and not at all clear. For example, most modules import other modules. With separate compilation, most of the modules need to be compiled multiple times; with all-together compilation, each is compiled only once.

On the other hand, the optimizer and code generator can be run concurrently in multiple processes/threads.

Remnant44•11mo ago
Link-time optimization is definitely not new, but it is incredibly powerful - I have personally hit situations where the inability to inline functions from a static library without LTO cut performance in half.

It's easy to dismiss a basic article like this, but it's basically a discovery that every junior engineer will make, and it's useful to talk about those too!

srean•11mo ago
The inline keyword should really have been intended for call sites rather than definitions.

Perhaps language designers thought that if a function needs to be inlined everywhere, it would lead to verbose code. In any case, it's a weak hint that compilers generally treat with much disdain.
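For comparison, the stronger per-definition knobs (GNU C attribute extensions, not ISO C, and still attached to the definition rather than the call site) look roughly like this:

```c
/* `inline` alone is a weak hint the optimizer may ignore; GCC and Clang
 * honor these attributes much more strictly (GNU extensions, not ISO C). */
static inline __attribute__((always_inline)) int sq(int x) { return x * x; }

static __attribute__((noinline)) int cube(int x) { return x * x * x; }
```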

lilyball•11mo ago
ffmpeg has a lot of assembly code in it, so it's a very odd choice of program for this kind of test, as LTO is presumably not going to do anything to the assembly.
mcdeltat•11mo ago
Different .c/.cpp files being a barrier to optimisation always struck me as an oddly low bar for the 21st century. Yes I know the history of compilation units but these days that's not how we use the system. We don't split code into source files for memory reasons, we do it for organisation. On a small/medium codebase and a decent computer you could probably fit dozens of source files into memory to compile and optimise together. The memory constraint problem has largely disappeared.

So why do we still use the old way? LTO seems effectively like a hack to compensate for the fact that the compilation model doesn't fit our modern needs. Obviously this will never change in C/C++ due to momentum and backwards compatibility. But a man can dream.

kazinator•11mo ago
LTO breaks code which assumes that the compiler has no idea what is behind an external function call and must not assume anything about the values of objects that the code might have access to:

    void handle_secret(void) {
        struct secret obj;
        /* ... fill obj with key material, use it ... */
        securely_wipe_memory(&obj, sizeof obj);
        return;
    }
The compiler peeks into securely_wipe_memory and sees that it has no effect, because obj is a local variable with no "next use" in the data-flow graph. Thus the call is removed.
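A common defense, sketched here rather than taken from the comment: route the wipe through a volatile function pointer, which the compiler must assume could change at run time, so the call cannot be elided even under LTO (C11's optional `memset_s` exists for the same reason):

```c
#include <string.h>

/* The volatile qualifier on the pointer means the compiler may not assume
 * it still points at memset, so it cannot peek through the call and delete
 * the "dead" store, even with LTO enabled. */
static void *(*const volatile secure_memset)(void *, int, size_t) = memset;

void wipe(void *buf, size_t len) {
    secure_memset(buf, 0, len);
}
```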

Another example:

    void f(void *object) {
        /* ... work that may allocate and trigger GC ... */
        gc_protect(object);
        return;
    }
Here, gc_protect is an empty function. Without LTO, the compiler must assume that the value of object is required for the gc_protect call and so the generated code has to hang on to that value until that call is made. With LTO, the compiler peeks at the definition of gc_protect and sees the ruse: the function is empty! Therefore, that line of code does not represent a use of the variable. The generated code can use the register or memory location for something else long before that line. If the garbage collector goes off in that part of the code, the object is prematurely collected (if what was lost happens to be the last reference to it).
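One way to write a gc_protect that survives this, sketched here under the assumption of GNU C: an empty inline asm that claims to read the pointer, which the compiler cannot see through and so must treat as a real use.

```c
/* The empty asm lists `p` as an input and clobbers memory; since the
 * compiler cannot look inside asm, the value must stay live up to this
 * point even when LTO would otherwise prove the function empty.
 * (GNU C extension; MSVC would need a different barrier.) */
static inline void gc_protect(void *p) {
    __asm__ volatile("" : : "g"(p) : "memory");
}
```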

Some distros have played with turning on LTO as a default compiler option for building packages. It's a very, very bad idea.

djmips•11mo ago
So slow
jordiburgos•11mo ago
Any idea on the performance improvements with LTO?