I often think about what an incredible bargain YouTube was at $1.65 billion.
So just the simple fact of "existing" on the Internet now costs money. Not only that: with their ads, you can target competitors' names as search keywords, so you need to outbid your competitors to keep your users from getting directed to them by "accident". When I lost uBlock last month, I was surprised how many sites buy ads for their own names.
To be clear, they have been doing this for so long that I was first told about it by a friend in the travel sector almost 20 years ago, along with an explanation of why it ends up absorbing a frightening proportion of the spare money in the economy.
The whole reason Google had such a nightmare over Facebook was that FB is the only thing that broke their monopoly on this, which is why FB also prints money.
It's interesting that not many investors see this and that there is so little investment in an alternative browser. It is literally software worth hundreds of billions of dollars. I hate Sam Altman as an individual, but I have to admit that he could see it, which is probably why he tried to grab Chrome when an opportunity presented itself.
But, as a Google employee for about a decade at this point, I really do think that the company is trash when it comes to planning and coordination and is getting worse at it over time. I think that there are a lot of things it never really needed to learn how to do because the ads money is just so outrageous.
It's hard to tell if it's actually turning a profit when they only report revenue for it.
Amazon reports today. AWS has been slipping and at current growth rates would drop to the number 2 cloud provider in a few years' time… let’s see if that trend holds true today.
Is that why Azure was down, maybe it was out celebrating? Hahah
We are at a point in this bubble where planned capex is approaching the theoretical limits of actual energy capacity. Let’s see when they start pumping coal companies because those plants get restarted… not even joking.
Even the labour market for the required skills is absurdly overvalued; if they don't see it as a major priority, they won't pay the necessary rates for it.
They rolled out search without building a search engine and could've done the same with AI.
Coal retirement in the US is almost entirely driven by economics. If someone needs the power, coal plants will stay open and capacity factors will go up.
https://www.eia.gov/outlooks/steo/report/elec_coal_renew.php
A lot of folks agree, though, that the smarter play would have been to intentionally sit more on the sidelines and wait for the bubble to burst, then buy up stuff cheap in the resulting fire sales. AWS should have just focused on the core compute and storage services, which they do well.
AWS simply doesn’t have the right talent to do well in AI at the moment and the folks they did have mostly fled elsewhere.
Cloud is great but it's just borrowing someone else's machine for a fee. It's like a hyper-scalable and granular mainframe, but like all mainframes, the client is powerless without it.
We need to get our acts together and not allow ourselves to be smothered in silicon nimbus (clouds) and lose track of the open sky that is the internet.
I am not suggesting going back to the 2000s with a small tower PC under someone's desk with a blinking light running a business critical cron job.
Just that we need more appetite to take responsibility for our servers and systems. Yes, it takes time to manage and get up to speed, but that's empowering, right...?
I'd rather not; instead these compute clouds should be treated similarly to utilities - you don't really want to be running your own generators for electricity, do you? So why is it that we can set up utilities properly, where everybody can trust that they stay on and usable at a low enough price?
My theory is that vendor lock in is the cause. Cloud should be a commodity, but it isn't because nobody in the cloud business wants to be a commodity. Hence, they have every incentive to prevent it from happening.
>In Belgium, we’ll soon install the first ever battery-based system for replacing generators at a hyperscale data center. In the event of a power disruption, the system will help keep our users’ searches, e-mails, and videos on the move—without the pollution associated with burning diesel.
https://blog.google/inside-google/infrastructure/cleaner-dat...
Disclosure: I work at Google, but not on anything related to this.
S3 goes down and your entire data infrastructure (and the systems attached to it) is out of reach. Local backups that can be deployed and run like S3, as if your services were still up, aren’t very common unless I’m mistaken, and would mean doing the thing you got S3 to handle in the first place. A generator, by comparison, is much easier to set up and solves the problem in a 1:1 way, with zero dependence on local utilities, and it kicks on almost literally the second you lose power. Yeah, they can’t run forever and aren’t the most cost efficient, but people run them for weeks and have 100% function of all the electronics in their home again.
I am not responsible for building and deploying these systems, I just depend on some of them at my job and interact with S3/MediaConvert a ton. My understanding has always been that backups that can be restored very quickly are the aim, not trying to keep everything attached to S3 running as if it’s still up. But if I am wrong, please let me know - I would love to hear more from folks about this! I actually find this whole dance very interesting.
What I always feared as a user was that they'd invent a new billable metric, which has happened a few times. Have you ever seen a utility add new metrics at this pace? The length of your monthly usage report shows all those line items at $0 that could eventually be charged. Let that sink in.
Another interesting element is that all the higher level services are built on core services such as S3/EC2. So the vendor lock-in comes from all the propaganda that cloud advocates have conditioned young developers with.
Notice how core utilities in many countries are state monopolies. If you want cloud to be a true utility, perhaps that's the way to get started. The state doesn't need a huge profit, but it needs sovereignty and to keep foreign agents out of its DCs. Is it inefficient? Of course. But if all you really need is S3/EC2 and some networking / queuing constructs, perhaps private companies can own the higher tier / lock-in services while guaranteeing they run on such a utility. That would give their users reduced egress fees from a true utility, which doesn't need (and is not allowed to have) a 50x profit on that line item.
It’d lead to direct comparisons between clouds (as someone who has estimated moving legacy workloads, it’s a shitload of work to get a reasonable “drive away” price for one of them, and the work is duplicated to do it for the other).
If they are truly commodities then that goes away, as do their margins to a large degree - none of them want that.
When it comes to software engineering at scale, nothing beats Google. They already operate the world’s largest (and profitable) information index, they design their own chips, and they employ the best engineers who know how to deliver massive systems cost-effectively. They can sustain large investment far longer than any other company simply because of how profitable all their existing businesses are. I just wish they would get their act together when it comes to product management.
NVIDIA also seems like a risky bet given all the money they're passing around in circles to themselves via multi-billion dollar "investments"
Beyond that there are a lot of new chip companies attacking the market: https://news.ycombinator.com/item?id=45686790
Apple on the other hand... (though they're behind in other regards)
I don't see a natural monopoly anymore.
It’s everyone’s wet dream that Apple buys Anthropic, but I’m guessing their valuation now is already too high even for Apple. And why would Amazon let that happen?
It’s a solid match but not happening unless the AI market craters and Apple swoops in to buy them for vastly less.
They might say: have fun making single-digit or negative profit margins on AI; we’ll sell subscriptions and iPhones and extended warranties.
It makes no sense for Apple to waste money on having their own LLM when LLMs are becoming commoditized - just choose the one that will give them the best deal.
Alphabet also has massive market share with YouTube, solid cloud offerings, Waymo (which already has driverless/unsupervised robotaxis), and their old standby, search, which is morphing into more AI answers when you throw questions into your browser.
Alphabet's P/E ratio is lower than the S&P 500's. Some may argue the S&P is mostly being held up by tech and AI related companies, so stock in Alphabet could be seen as undervalued and NVIDIA as overvalued.
Hardware almost always ends up getting commoditized, so investing in hardware companies for the long haul maybe carries more risk than investing in mainly software companies.
From the technical/talent/funding side, I agree with you. With Alphabet, the real risk is that they lose interest/focus and kill (or pivot) products early. I don't really see that happening in the short term (considering the hype/investments/cash around AI) but maybe in the medium to long term if things start to get stale.
Even with that risk though, I agree with you.
I can’t understand how they’re bleeding talent.
Unless there are antitrust concerns, but that's hard to see, because Google is competition and Anthropic will probably remain the third, more niche player.
IMHO Microsoft is a lot closer to creating a real business around AI because they understand enterprise customers. In fact, they were first to offer private OpenAI deployments with AuthN/AuthZ that you could drop into your own VPC, with the ability to handle sensitive data, while everyone else was handing out API keys off a public endpoint like a startup. Don't get me wrong, Google has had some okay technology (e.g. Gemini 2.5 is middle of the pack), but they seem like a very consumer-oriented company and don't seem to know how to market it.
Google loses on all of those fronts.
Your mention of "VC funding" may be a placeholder for conversational purposes but just to clarify, OpenAI is way beyond the small pocketbooks of typical Venture Capitalists. They have investments from huge sovereign wealth funds (e.g. billions from Middle East oil funds) and trillion-dollar corporations like NVIDIA, Microsoft, AMD, etc.
In other words, OpenAI has embedded itself into the money and tech ecosystem much more deeply than the dotcom failed startup Webvan burning through the limited cash of Sequoia VC.
There are lots of heavily funded vested interests that don't want to see Google be the ultimate winner (or only winner).
This definitely doesn't look like it's going to be a winner takes all market.
OpenAI doesn't release MAUs, but it's possible Gemini could surpass OpenAI in MAUs (but really unlikely DAUs) by the end of 2026.
Google can kick ass in AI all day, but soon Gemini is going to be saying shit like, "it sounds like your air conditioner is broken, maybe you should call Clearwater HVAC - they're the best!"
Once Google has to start making money from AI, talking to Gemini is going to feel like talking to that friend that invited you to dinner only to try to pitch you on an MLM.
Wouldn't one possible solution be just adding a sidebar with ads for products/services relevant to the current topic? They don't need to change the model output to have ads. (Not advocating for ads, just pointing out there are lots of ways to deliver them.)
I think the winner will be either xAI or Meta. I'm leaning more toward xAI, since Elon seems to always have 100% conviction even with seemingly bad ideas, and he is way more technical than Sam or Zuck.
The biggest risk for Google is that it is already a behemoth and it has lost a lot of what made it so special. I also think they have a huge conflict of interest when it comes to AI, because it is a threat to their core business.
But, all that aside, you know what they're really good at? Google Cloud. I'm a user of all the major cloud providers and nothing beats GCP's interface. Azure? Complicated, buggy and unreliable; there's some exploit every quarter. AWS? Overly complex security policies even to deploy a basic app, very enterprise focused and startup-unfriendly. GCP? You can be up and running on a serverless/VM instance in your lunch break. Simple, reliable, scales effortlessly. We serve 10M+ visitors on there and we've had zero issues with them in the last half decade.
They suck at a lot of other things and they have a lot of other problems. But boy, are they good at Cloud. No wonder even Apple is their customer. It's one of the few products from Google where you can say "it just works".
"Normal" is what most of us want.
But if you are a consultant, why pray tell are you spending that much time in the console except for occasional monitoring? Everything is either CLI commands or infrastructure as code.
And the issue with Google in particular is that their customer support and enterprise go-to-market suck. The sales guys at ProServe used to run circles around GCP and never really took them seriously or even had talking points against them. For big contracts we mostly had to compete with Azure, because big boring enterprise was already using Microsoft.
When we did have to compete against GCP - and their sales team didn’t blow up the deal themselves - we just had to say “do you really want to trust Google with your workloads? Look at their history of abandoning products and raising prices”
Apple is a customer of all of the big cloud providers and was on stage at last year’s re:Invent.
AWS throws money and people at startups. I’ve been on three sides - working at a 60 person startup who hosted on AWS, working at ProServe and now a third party consulting company where many deals come about because of AWS funding.
Sarcasm, of course, for those who didn’t catch it - it’s hard to make jokes online.
Google has so much cash from other services…
Tangentially, it's sad to see AI become yet another attention factory. Sora 2 is OAI trying to get the jump on this, but I think it's quite likely that they'll get clobbered by Google and Meta when those companies start juicing their users' attention in the same way - they have the ad buyers and the infrastructure for that business already; it's literally how they've made money this whole time.
They've already successfully monetised their chatbot (Gemini)[1] while all the others are still wondering how to get there.
-------------------
[1] I've added about 15-20 entries to the settings to prefix all chats with "don't show me video links", but apparently the "show YouTube link" behaviour is at the system layer and cannot be overridden at all.
So, "search revenue" means ads, right? So with youtube ads, 66B out of 100B revenue is from advertising?
In other words Alphabet is still an advertising company first, and foremost?
TLDR: Google shows the user an ad for a product they were already planning to buy at the very last minute.
That’s not by choice, of course. It’s difficult-to-impossible to turn off.
Google has the best infrastructure, its own chips (TPUs), and a huge customer base. Ben Thompson (Stratechery) and Horace Dediu (Asymco) have referenced Clayton Christensen’s “The Innovator’s Dilemma” about the difference between a disruptive technology and a sustaining technology.
AI is going to help BigTech that exists today get bigger.
The cloud providers - AWS, Google and Microsoft - are exposing models as just another API to their existing customers. Google added AI overviews and an AI tab. Even Apple is going to come out ahead, because developers who want local (free) inference can depend on iPhones having decent hardware and local models.
Facebook is using AI for better ad targeting.
Where does that leave OpenAI? It’s been said plenty of times that no one is going to care about their models once they’re a commodity - you could put any decent model behind ChatGPT and it would be just as popular.
Is that true? Google just has an infinite money printing machine and a city's worth of engineers. They don't strike me as a company that fits the description you wrote.
I admit I'm biased at present since they killed off support for the Nests in my house.
fidotron•4h ago
Anyone remotely familiar with Google as a third party developer will notice the pattern: this will ramp up until it is almost your entire job simply dealing with their changes, most of which will be not-quite-actual-fixes to their previous round of changes.
This is not unique to Google, but it is a strategy employed to slow down development at other companies, and so help preserve the moat.
refulgentis•4h ago
Worked at Google for 7 years, and your post reminds me it is time to share a secret: it is Koyaanisqatsi* and people's base instincts unbridled, no more. There is no quicker route to irrelevancy than being the person who cares about something from last year's OKRs.
* to be clear, s/Koyaanisqatsi/too big to exist healthily and yet it does exist -- edited this in, after I realized it's unclear even if you watched the movie and know the translation of the word
refulgentis•3h ago
* no net headcount increases, random firings, and any new headcount should be overseas. i.e. we have the same # of people we did in 2021 with 50% more to do.
refulgentis•3h ago
To be clear, I agree with you, and am puzzled by the lack of consequences from the real world for the stuff I saw. But that was always the mystery of Google to me, in a nutshell: How can we keep getting away with this?
fidotron•3h ago
A large part of that is the Google-are-super-geniuses PR effort. Anyone pointing out that Google's products don't reflect this to their boss faces having their own credibility reduced instead.
a4isms•3h ago
> Watch out when your competition fires at you. Do they just want to force you to keep busy reacting to their volleys, so you can’t move forward?
> Think of the history of data access strategies to come out of Microsoft. ODBC, RDO, DAO, ADO, OLEDB, now ADO.NET – All New! Are these technological imperatives? The result of an incompetent design group that needs to reinvent data access every goddamn year? (That’s probably it, actually.) But the end result is just cover fire. The competition has no choice but to spend all their time porting and keeping up, time that they can’t spend writing new features. Look closely at the software landscape. The companies that do well are the ones who rely least on big companies and don’t have to spend all their cycles catching up and reimplementing and fixing bugs that crop up only on Windows XP.
—Joel Spolsky, "Fire and Motion," 2002
https://www.joelonsoftware.com/2002/01/06/fire-and-motion/
adzm•2h ago
Notably all of these still work, even if not getting new updates.
cube00•3h ago
How do you handle the slow console?
CamouflagedKiwi•2h ago
Plus Python is notoriously not easy to deploy for - a Go (or Rust or whatever) binary would have almost no dependencies to worry about.
sgt•3h ago
I guess the big providers learn from each other.
Pxtl•3h ago
https://steve-yegge.medium.com/dear-google-cloud-your-deprec...
> Dear RECIPIENT,
> Fuck yooooouuuuuuuu. Fuck you, fuck you, Fuck You. Drop whatever you are doing because it’s not important. What is important is OUR time. It’s costing us time and money to support our shit, and we’re tired of it, so we’re not going to support it anymore. So drop your fucking plans and go start digging through our shitty documentation, begging for scraps on forums, and oh by the way, our new shit is COMPLETELY different from the old shit, because well, we fucked that design up pretty bad, heh, but hey, that’s YOUR problem, not our problem.
> We remain committed as always to ensuring everything you write will be unusable within 1 year.
> Please go fuck yourself,
> Google Cloud Platform
mg•3h ago
Those rarely change.
Esophagus4•3h ago
So you either:
1) postpone all your updates for years until a bad CVE hits and you need to update or some application goes end of life and you’re screwed because updating becomes a massive exercise
2) do regular updates and patches to the entire stack, including Linux, in which case, you’re in the same position you were before with running on the stack rot treadmill
So you might’ve moved the rot to a different place, but I don’t know if you’ve reduced any of it. I’ve owned stuff deployed off of vanilla VMs and I actually found it harder to maintain because everything was a one-off.
jillesvangurp•2h ago
I see a lot of teams being overly conservative about keeping their stuff up to date, running years-out-of-date software with lots of known and long-since-fixed bugs of all varieties, performance issues that have long since been addressed, etc. All in the name of stability.
I treat anything that isn't up to date as technical debt. If an update breaks stuff, I need to know, so I can either deal with it or document a workaround or a (usually temporary) version rollback. While that does happen, it doesn't happen a lot. And I prefer knowing about these things because I've tried it over being ignorant of the breakage because I haven't updated anything in years. That just adds to the hidden pile of technical debt you don't even know you have. Ignorance is not an excuse for not dealing with your technical debt. Or worse, compounding it by building on top of it and creating more technical debt in the process.
Dealing with small changes over time is a lot less work than dealing with a large delta all at once. It's something I've been doing for years. If I work on any of my projects, the first thing I do is update dependencies. Make sure stuff still works (tests). Make sure deprecated APIs are dealt with.
Esophagus4•2h ago
If the business understands that you need time to work on these things :’)
mg•2h ago
To me "I wish they would stop deprecating stuff" sounds like any part of the stack has something like a 1% or even 10% chance in any given year to be shut off.
I would expect that by carefully choosing your stack from open source software in the Debian repos, you can bring the probability of any given part being gone with no successor to less than 0.1% per year. As an example - could you imagine Python becoming unavailable in 2026? Or SQLite? Docker?