What an absurd amount of money - if only it were invested in energy-sector scientific research and development, or healthcare, or anything else practical.
I really hoped to see compact molten salt nuclear reactors in operation before 2030.
Love this for them.
- Softbank's motto
Haven’t you heard? AGI is going to solve every problem for us!!!
Provided they keep cost growth slower than revenue and don't get disrupted by another model provider/commodification etc.
I hope that’s “real” revenue and not the cyclic quid pro quo that seems to be propping the whole thing up.
For average people, the global competitors are putting up near-identical services at 1/10th the cost. Anthropic, Google, and OpenAI may have a corporate sales advantage on security and domestic alignment, or be 5% better at some specific task, but the populace at large isn't going to cough up $25 for that difference. Beyond the first month or two of the novelty phase, it's not apparent the average person is willing to pay for AI services at all.
You could squeeze 25 USD/month out of all US people on average, and claim the US government gives you "free AI".
Incidentally that’s how much SoftBank lost on WeWork.
And Ethiopia couldn't get US$5 billion loan for a dam to service 60 million people.
The world is sick.
I suggest you stop viewing the actions of multi-billion-dollar multinational corporations as anything similar to individual action.
https://www.reuters.com/business/softbanks-wework-once-most-...
We should all care how resources are used, even if we don't own them. That investment is going to make things more expensive for everyone. See RAM, SSD, and GPU prices.
xAI already brings gas turbines on-site, and I think the trend of on-site energy generation will grow, which will open an opportunity (by providing well-financed demand) for compact/mobile/safer nuclear. Big Tech companies are among the best positioned for any new tech development. I expect nuclear engineer positions to open up at Google and the like :)
I don't understand how this is the top comment. LLMs have unlocked a lot of value for me personally, and arguably for the society as a whole. They are also one of the coolest technologies I've tried in years. As a technologist, I'm really glad that money is pouring in and allowing us to find its limits.
At this stage, why not go public? Yes, they would need to manage quarterly financial reports and answer to shareholders, but they have reached a size that would put them in the top 20 on the NASDAQ. Those public companies are doing well, so it seems like a logical next step.
Anthropic doesn't have all the free users, but it's also raising absurd amounts and has similar costs.
On the other hand, they would probably set things up in a way where they still control all the voting power.
1) We are in a huge investment bubble right now and it's going to burst.
2) LLMs are extremely useful right now for certain niche tasks, especially software engineering.
3) LLMs have the potential to transform our world long-term (~10 yr horizon), on the order of the transformations wrought by the internet and mobile.
4) LLMs don't lead directly to AGI (no continuous learning), and we're not getting AGI any time soon.
This is an extremely obvious point, but it bears repeating. I feel the assumption of an implicit link (in both truth and falsehood) between these fairly independent assertions can cause people to talk past each other about the really important questions in play here.
Regarding The Great Bubble, I am very, very bearish about OpenAI in particular. They've had a good three-year run on consumer mindshare thanks to their first-mover advantage, but they have no moat, trouble monetizing most of their users, little luck building sticky consumer products beyond chatbots, and models no better than Anthropic's, Google's, or even the best Chinese open-weight models six months later.
My bet would be on Google and Apple together (with Gemini powering Siri, for now) destroying OpenAI in the consumer AI market over the next 2-3 years. Google has first-rate models... but more than that, both Google and Apple have the enormous advantage of owning underlying platforms they can use to put their own AI chat in front of consumers. Google has a mobile OS, the leading browser, and search. Apple has the premium hardware and the other, premium, mobile OS. They also benefit from the current regulatory climate being less antitrust-minded than it was. And they don't have to monetize their AI offerings (no ads in Gemini; ChatGPT is adding them) and can run them at a loss for as long as it takes to eat up OpenAI's market share. If they partner up, as they seem to be doing, OpenAI should be very, very afraid.
Don't get lost in the tech scene sauce: programming is a small sliver of what people are using LLMs for. OpenAI's report in September pegged it at ~4% of tokens going to software generation. Sure, Anthropic is probably 80% or something, but only a small sliver of LLM users are using Anthropic. The real share is probably even lower if you count Google's AI Overviews. We hate it, but I have never seen a regular person skip over it.
The question is if regular people will pay cell phone level subscription costs ($70-$100/mo) for LLMs. If so, then we are probably not in a bubble, and the ROI will have a 5-10 yr horizon, which is totally tenable.
500,000,000 people paying $75/mo is $450B/yr. Inference is cheap too, it's training that is ludicrously expensive. Don't be fooled by the introductory pricing we have today either, that's just to get you dependent.
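A quick back-of-envelope check of that revenue claim (the subscriber count and price are the comment's hypotheticals, not real figures):

```python
# Hypothetical figures from the comment above, not actual market data.
subscribers = 500_000_000   # assumed paying users
price = 75                  # assumed USD per month

annual_revenue = subscribers * price * 12
print(f"${annual_revenue / 1e9:.0f}B/yr")  # → $450B/yr
```

So the stated $450B/yr is arithmetically right, though whether half a billion people would pay cell-phone-level prices is the open question.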
And yeah, Chinese models, but look at what they did to TikTok. No way they are going to let the Chinese government be people's confidant, and no way is more than 0.01% of people gonna home-lab.
Data centers need power (H100s are ~700W each), and recent capacity additions were mostly pre-allocated. Chip supply is also constrained by CoWoS packaging, not fab capacity, and expansions take years.
If power, packaging, and GPUs are fixed in the near term, does $100B mostly drive inflation in AI infrastructure prices rather than materially more deployed compute? Are we seeing the real cost of a usable GPU cluster rise faster than actual capacity?
Has anyone modeled what $100B actually buys in deployable compute over the next 2–3 years given these constraints—and whether that figure is shrinking as more capital piles in?
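A very rough sketch of that model, with loudly hypothetical inputs: the per-GPU price and the infrastructure overhead multiplier below are illustrative assumptions, not sourced numbers; only the ~700 W per H100 figure comes from the comment above.

```python
# Hedged back-of-envelope: what might $100B buy in deployable compute?
capital = 100e9              # USD to deploy

gpu_cost = 30_000            # ASSUMPTION: all-in price per H100-class GPU, USD
infra_overhead = 0.5         # ASSUMPTION: extra 50% for networking, buildings, cooling
cost_per_deployed_gpu = gpu_cost * (1 + infra_overhead)

gpus = capital / cost_per_deployed_gpu
power_mw = gpus * 700 / 1e6  # ~700 W per H100, per the comment above

print(f"~{gpus / 1e6:.1f}M GPUs, needing ~{power_mw:,.0f} MW for the GPUs alone")
```

Under these assumptions that is roughly 2 million GPUs drawing on the order of 1.5 GW, which is exactly why the power and CoWoS-packaging constraints matter: if supply is fixed in the near term, more capital chasing the same capacity mostly bids up the price per deployed GPU rather than adding compute.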