When Google wanted engineers to use AI features, it turned them on in Cider-V by default. And if you turned them off, later updates would turn them back on. This is very good for your adoption metrics, but might not tell you exactly what you want to know about engineer happiness.
Such a dominant IDE also allows management to ignore the long tail of users who aren't using it.
You have access to an extremely powerful remote workstation that, via Chrome Remote Desktop, functions almost identically to a local one from a UI perspective. Plus, no one builds anything locally, even on that machine. There is a huge, absolutely amazing distributed build system that everyone uses for everything. (Again, Android and Chromium are different.)
So you don't really need a powerful local machine. I held out for a long time--there were a lot of growing pains in the early days. But eventually it got really, really good.
Afterwards I was issued a 12" Pixelbook, and it was much more usable than I had expected! I could ssh into a Linux box to run builds and tests. Cider worked perfectly. It was snappy enough to serve as a thin client even on a 4K screen.
The article is framed around "all Googlers" but there is still a very large contingent of Googlers who cannot use these tools.
I do think Google will continue to get results out of their tooling, as long as they are investing in the tooling. But that is not zero cost. Is it worth it for what they are doing? Largely seems to be.
But it isn't like they are that much more successful at software projects than any other company? They are still largely an ads company, no?
Sure, the money is mostly in ads, but serving Search, AI, YouTube, and all the rest at the scale Google does requires a technical tour de force. Does Google do it better than everyone? Absolutely not. But it does it better than many.
Certainly it isn't the _only_ way to do it--other companies also manage to do it. But not all that many at the same scale. It's an existence proof that you can.
Consider that they spend more on building and supporting this central IDE than most companies could dream of losing in productivity by not having it.
The aspect I miss is the distributed compilation hinted at in the article. I remember using distcc and the like back at the end of the 1990s, but that never seemed to happen in the Java world, and tooling like Maven is structured to make everything one long dependent chain. Shame.
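For anyone who never saw it, the distcc workflow looked roughly like this (host names here are placeholders, and the core counts are illustrative; this is a sketch of the classic setup, not a tuned recipe):

```shell
# List the machines willing to accept compile jobs, local box first.
# distcc reads this variable to decide where to farm out compilation.
export DISTCC_HOSTS="localhost buildbox1 buildbox2"

# Drive a parallel build through distcc, scaling -j to the total
# number of cores across all hosts rather than just the local ones.
make -j16 CC="distcc gcc" CXX="distcc g++"
```

Each host runs a distccd daemon; preprocessing happens locally and only the compile step is shipped out, which is why it parallelizes so well for C/C++ and never mapped cleanly onto Maven-style build graphs.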
It's also nice that it stores all my preferences in the cloud, so switching machines is seamless (helpful when my MacBook broke a couple of weeks ago and I had to use a loaner Chromebook for a day).
It's also well integrated with google3 and codesearch, and seamlessly runs tests on remote machines with tmux integration and all.
Not all of Google's tooling is my favorite (their source control, for one), but the IDE is great.
My recollection from 2009-2011 is that emacs and vim were the dominant editors (just as the TV show Silicon Valley depicted), with a decent-sized minority using Eclipse and IntelliJ, both of which had official support for Google tooling. The command line still largely ruled, even though the official Google developer workstation was Goobuntu, Google-flavored Ubuntu. This reflected the overall developer population of the time.
I think Cider was actually invented a little earlier than the article describes. I have vague memories of some engineers experimenting with web-based IDEs that would integrate directly with Critique (the code-review software) as early as 2013-2014. Its use was not widespread when I left in 2014; there was still the impression that it wasn't powerful enough for daily driving.
When I came back in 2020, emacs/vim use was much lower, again probably reflecting differences in the general population of developers. Many more of the developers had been trained in the post-2010 developer ecosystem of VSCode, IntelliJ, etc, and this was reflected in tool usage at Google too. I'd say IntelliJ was the dominant IDE, with Cider a close second and Cider-V just starting to take market share. You still had to pry emacs and vim from a grizzled old veteran's hands.
By 2022 I'd transferred to an Android team, and Android Studio with Blaze was the dominant IDE there, even as general IntelliJ usage in the company was falling. Cider just didn't have the same Android-specific support. Company-wide, Cider-V was growing the fastest, taking market share from both IntelliJ and Cider.
By 2024 Cider-V was dominant and there started to be a concerted push to standardize on it, particularly since new AI agent tools were coming out and they couldn't be supported on all editors that Googlers wanted to use.
As of my departure in 2026, the company-wide push was to standardize on Antigravity [1], which, as I understand it, won a turf war within the developer tools org and got blessed as the "official" Google AI coding agent. This also has the effect of concentrating developer time dogfooding Google's external AI coding offering, which hopefully should improve its quality. There's still significant Cider-V usage, but it's dropping, and execs are pushing Antigravity hard.
Now, ironically, with so many extensions and so much LLM compute bolted on, users seem to forget that they originally chose Cider because it was lightweight.