Though from what I gather from the story, part of the speedup comes from how Android composes its build stages.
I.e. speeding up by not downloading everything only helps if you don't need everything you download. And it adds up when you download multiple times.
I'm not sure they can actually provide a speedup in a tight developer cycle with a local git checkout and a good build system.
Looks like it's similar in some ways. But they also don't tell too much and even the self-hosting variant is "Talk to us" pricing :/
And as for pricing... are there really that many people working on O(billion) lines of code that can't afford $TalkToUs? I'd reckon that Linux is the biggest source of hobbyist commits and that checks out on my laptop OK (though I'll admit I don't really do much beyond ./configure && make there...)
I.e. this isn't something battle-tested by hundreds of thousands of developers 24/7 over the past years, but a simple commercial product sold by people who liked what they used.
Well, since Android is their flagship example, anyone who wants to build custom Android releases for some reason. With the way things are, you don't need billions of lines of your own code to maybe benefit from tools that handle billions of lines of code.
> As the build runs, any step that exactly matches a prior record is skipped and the results are automatically reused
> SourceFS delivers the performance gains of modern build systems like Bazel or Buck2 – while also accelerating checkouts – all without requiring any migration.
Which sounds way too good to be true.
At the start, snapshot the filesystem and record all files read and written during the step.
Then, when the step runs again with the same inputs, apply the diff from last time.
Some magic to hook into processes and do all of this transparently seems possible.
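The skip-and-replay idea could be sketched roughly like this (my own toy illustration, not SourceFS's actual mechanism — a real system would intercept filesystem calls instead of requiring declared inputs and outputs; `CACHE`, `digest`, and `run_step` are all hypothetical names):

```python
import hashlib
import subprocess

CACHE = {}  # hypothetical in-memory cache: (command, input digest) -> recorded outputs

def digest(paths):
    """Hash the contents of every declared input file."""
    h = hashlib.sha256()
    for p in sorted(paths):
        with open(p, "rb") as f:
            h.update(hashlib.sha256(f.read()).digest())
    return h.hexdigest()

def run_step(cmd, inputs, outputs):
    """Skip the command and replay recorded outputs when inputs are unchanged."""
    key = (tuple(cmd), digest(inputs))
    if key in CACHE:
        # Exact match with a prior record: apply the diff from last time.
        for path, data in CACHE[key].items():
            with open(path, "wb") as f:
                f.write(data)
        return
    subprocess.run(cmd, check=True)
    # Record what the step produced so the next identical run can skip it.
    CACHE[key] = {}
    for path in outputs:
        with open(path, "rb") as f:
            CACHE[key][path] = f.read()
```

The hard part a real tool solves is discovering `inputs` and `outputs` automatically, e.g. via a FUSE filesystem, rather than trusting a declared list.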
I was a build engineer in a previous life. Not for Android apps, but some of the low-effort, high-value tricks I used involved:
* Do your building in a tmpfs if you have the spare RAM and your build (or parts of it) can fit there.
* Don't copy around large files if you can use symlinks, hardlinks, or reflinks instead.
* If you don't care about crash resiliency during the build phase (and you normally should not, each build should be done in a brand-new pristine reproducible environment that can be thrown away), save useless I/O via libeatmydata and similar tools.
* Cross-compilers are much faster than emulation for a native compiler, but there is a greater chance of missing some crucial piece of configuration and silently ending up with a broken artifact. Choose wisely.
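The "don't copy, link" trick from the list above is a one-liner in most languages; here is a small Python sketch (the `place` helper is my own illustrative name, not from any particular build system):

```python
import os
import shutil

def place(src, dst):
    """Prefer a hardlink over a copy; fall back when linking isn't possible."""
    if os.path.exists(dst):
        os.remove(dst)
    try:
        os.link(src, dst)  # instant: no file data is copied
    except OSError:
        # Cross-device link, or a filesystem without hardlink support.
        shutil.copy2(src, dst)
```

On filesystems with reflink support (btrfs, XFS, APFS), a copy-on-write clone gives the same speed win while keeping the files independent.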
The high-value high-effort parts are ruthlessly optimizing your build system and caching intermediate build artifacts that rarely change.
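A minimal version of caching rarely changing intermediates is a content-addressed store: key each artifact by a hash of the sources it was built from. This is a bare-bones sketch under that assumption (`CACHE_DIR`, `cache_key`, `fetch`, and `store` are hypothetical names; a real setup would use a shared, garbage-collected directory):

```python
import hashlib
import os
import shutil
import tempfile

# Hypothetical cache location for this sketch.
CACHE_DIR = os.path.join(tempfile.gettempdir(), "artifact-cache")

def cache_key(source_files):
    """Key an artifact by the content of the sources it was built from."""
    h = hashlib.sha256()
    for p in sorted(source_files):
        with open(p, "rb") as f:
            h.update(f.read())
    return h.hexdigest()

def fetch(key, dst):
    """Restore a cached artifact; return False on a cache miss."""
    path = os.path.join(CACHE_DIR, key)
    if os.path.exists(path):
        shutil.copy(path, dst)
        return True
    return False

def store(key, artifact):
    """Save a freshly built artifact under its content key."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    shutil.copy(artifact, os.path.join(CACHE_DIR, key))
```

The build wrapper then becomes: compute the key, `fetch` on a hit, otherwise build and `store`.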