I wouldn't be surprised if mankind evolves similarly to an organism and ends up spending 20% of all the energy it produces on AI. That's about 10x what we use for software at the moment.
But then more AI also means more physical activity. When robots drive cars, we will have more cars driving around. When robots build houses, we will have more houses being built, etc. So energy usage will probably go up exponentially.
At the moment, the sun sends more energy to earth in an hour than humans use in a year. So the sun alone will be able to power this for the foreseeable future.
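That hour-vs-year claim actually checks out. A quick sanity check in Python; the ~600 EJ/yr figure for human primary energy use is my assumption, not from the comment:

```python
import math

SOLAR_CONSTANT_W_M2 = 1361      # irradiance at the top of the atmosphere
EARTH_RADIUS_M = 6.371e6
HUMAN_ANNUAL_ENERGY_J = 6.0e20  # ~600 EJ/yr of primary energy, rough figure

# Sunlight intercepted by Earth's cross-sectional disc
intercepted_w = SOLAR_CONSTANT_W_M2 * math.pi * EARTH_RADIUS_M ** 2
one_hour_j = intercepted_w * 3600

print(f"one hour of sunlight / one year of human energy ≈ "
      f"{one_hour_j / HUMAN_ANNUAL_ENERGY_J:.2f}")
```

With these inputs the ratio comes out right around 1, so "an hour of sun per year of civilization" is roughly accurate rather than just a slogan.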
Not sure about the exact numbers, but I guess that at the moment normal roofs and solar panels absorb very roughly about the same percentage of sunlight.
So if in the future solar panels become more efficient, then yes, the amount of sunlight turned into heat could double.
Maybe that can be offset by covering other parts of earth with reflective materials or finding a way to send the heat back into the universe more effectively.
And also, people should paint their roofs white.
Some could, but a dark roof can be beneficial in the winter
This doesn't seem true. In SF, Waymo with 300 cars does more rides than Lyft with 45k drivers. If self driving cars interleave different tasks based on their routes, I imagine they would be much more efficient per mile.
A car driving from A to B will cost less than 50% of the current price. Which will unlock a huge amount of new rides.
> With more than 700 vehicles in its fleet - 300 of which operate in San Francisco - Waymo is the only U.S. firm that runs uncrewed robotaxis that collect fares.
Those numbers are from April 2025.
https://waymo.com/blog/2025/05/scaling-our-fleet-through-us-...
Seems like we are way too early in the adoption curve to tell. Currently the average number of passengers per trip is >1.0 across the whole fleet. Some day, I'd expect that to dip below 1.0, as people send an empty car to pick up the dog from the vet, or circle the block to avoid having to pay for parking, etc.
Kind of sounds like Jevons paradox? https://wiki.froth.zone/wiki/Jevons_paradox
The discussion about nuclear vs solar remind me of the discussions about spinning HDs versus solid state drives when they were new.
GAFAM's nuclear deals are mere announcements, intentions.
On the other front, most are already making progress. https://www.reuters.com/sustainability/climate-energy/micros...
I don't think there will be much carbon intensive energy creation in a few decades from now. It does not make sense economically.
Anyway I hope you're right, but so far global CO2 output is still growing. All the other energy has only come on top of carbon intensive energy, it hasn't replaced any of it. Every time we build more, we find new ways of spending that much energy and more.
I remember how my friends and I discovered email in 1999 and went "Yay, in the future we'll all do this instead of sending letters!". And it took about 20 years until letters were largely replaced by email and the web. And when the first videos appeared on the web, it was quite clear to us that they would replace DVDs.
Similar with the advent of self driving cars and solar energy I think.
> The carbon intensity of electricity used by data centers was 48% higher than the US average.
If the people always talking about how cheap solar is want to fix this, find a way to make that cheapness actually make it into the customer's electric bill.
There are periodically news articles and such about data centers in Iceland, of course, but I get the impression it's mostly a fad, and the real build-outs are still in Northern Virginia as they've always been.
The typical answer I've seen is that Internet access and low latency matter more than cooling and power, but LLMs seem like they wouldn't care about that. I mean, you're literally interacting with them over text, and there's already plenty of latency - a few extra ms shouldn't matter?
I'd assume construction costs and costs of shipping in equipment also play a role, but Iceland and Canada aren't that far away.
As for power, that's what I was referring to with geothermal and hydro - Iceland and Quebec both have famously cheap electricity. The former would need a large increase in capacity, for sure, but Quebec already pumps out a lot of power (and regularly sells it to the Northeastern US).
Not saying it wouldn't be difficult, by any means, but it does seem like all the right incentives are there.
The main complaint about energy usage is it will damage the environment, which will (indirectly) reduce quality of life.
It's a question of which factor wins.
Increasing demand can lead to stimulus of green energy production.
Even with things like orphaned natural gas that gets flared otherwise - rescuing the energy is great but we could use it for many things, not just LLMs or bitcoin mining!
If you would have built 10GW of solar or nuclear to replace other generation, and instead the data center operators fund a 20GW build so that 10GW can go to data centers, then the data centers haven't displaced any of that replacement: the same 10GW still retires the dirty generation. And the economies of scale may give the non-carbon alternatives a better cost advantage, so you can build even more.
You could do it better than we are doing now, but you'll always have people saying: "that's unfair, why are you picking on me"
Mind you people won't like that since we're so used to using the atmosphere as a free sewer. The idea of having to pay for our pollution isn't palatable since the gasses are mostly invisible.
Though it's sad that we're talking about market solutions rather than outright bans for the majority of applications like we did for leaded gas.
Meanwhile the people with a 10 year old car they drive 5000 miles a year will keep it until it's a 20 year old car, at which point they'll buy another 10 year old car, but by then that one will run on electricity.
Then you could theoretically ban it, but by then do you even need to?
You don't have to ban existing cars; they will phase themselves out. Give it X years and ban the sale of any non-hybrids for all but a few niche applications. Then in X+Y years, ban all combustion engines other than niche applications.
But ultimately, we need to be serious about this, and half the population and the governments of most western countries are not serious. Many people still believe that climate change is a hoax, and ridiculous ideas like hydrogen cars and ammonia burning ships are still getting funding.
What we should do instead is start at the other end. Envision a world that would be sustainable, that we would want to live in, and decide which incentives have to exists for us to get there, fairness be damned.
Maybe that says the fees aren't yet high enough for high income people to change behavior, but I'm willing to bet they never truly will be due to the influence this subset of the population holds over politics.
As for the stove, how much it uses is directly related to the kind of cooking you do, and for how long.
Even if rich people don’t consume much more energy than poor people (I have no idea, just engaging with your idea as stated), they must be buying something with their money… carbon taxes should raise the price of goods with lots of embodied carbon.
If they aren’t consuming much energy and they aren’t buying stuff with much embodied carbon… I dunno, I guess that’s the goal, right?
Rivers gotta run, suns gotta shine, winds gotta blow.
Making electricity so abundant and efficient is probably more solvable. You can’t solve stu… society
> But as more of us turn to AI tools, these impacts start to add up. And increasingly, you don’t need to go looking to use AI: It’s being integrated into every corner of our digital lives.
Forward looking, I imagine this will be the biggest factor in increasing energy demands for AI: companies shoving it into products that nobody wants or needs.
I think the bigger underrated concern is if LLMs fall into an unfortunate bucket where they are in fact generally useful, but not in ways that help us decarbonize our energy supply (or that do, but not enough to offset their own energy usage).
I’m sorry. I’m being blocked by some mysterious force from understanding what “actual human” means. And I don’t know how to get you in contact with your car manufacturer. Would you like me to repeat my 20-step suggestion on how to troubleshoot ‘why does my shitty car put the A/C on freezer mode whenever “Indian Summer” tops the charts in Bulgaria’, but with more festive emojis?
I think of this a little every time Google gives me another result with the AI summary and no option for anyone to turn it off. Apparently worldwide there are 8+ billion searches every day.
So yes, I'd like to disable this completely. Even if it's just a single birthday candle worth of energy usage.
It's from 2021 so won't cover the 2022-onwards generative AI boom.
From the Wikipedia summary it sounds like it's about machine learning algorithms like classification, AlphaGo and concerns about ethics of training and bias.
On that note, what’s the energy footprint of the return to office initiatives that many companies have initiated?
That’s a lot of big assumptions - that the job getting replaced was tedious in the first place, that those other “more productive” job exists, that the thing AI can’t necessarily do will stay that way long enough for it not to be taken over by AI as well, that the tediousness was not part of the point (e.g. art)…
Like driving an Uber or delivering food on a bicycle? Amazing!
I doubt this is going to change.
That said, the flip side of energy cost being not a big factor is that you could probably eat the increase of energy cost by a factor of say 2 and this could possibly enable installation of short term (say 12h) battery storage to enable you to use only intermittent clean energy AND drive 100% utilization.
But everyone knows fuel economy is anything but a knowable value. Everything matters, from whether it has rained in the past four hours, to temperature, to the loading of the vehicle, to the chemical composition of the fuel (HVO vs traditional). How worn are your tires? Are they installed the right way? Are your brakes dragging? The possibilities are endless. You could end up with twice the consumption.
By the way, copy-pasting from the website is terrible on desktop firefox, the site just lags every second, for a second.
Oil might be able to carry more heat but it's more expensive to use.
Oil immersion is something nerds like to think is amazing but it's just a pain in the ass for negligible benefits. Imagine the annoyance of doing maintenance.
https://dgtlinfra.com/data-center-water-usage/
https://datacenters.microsoft.com/sustainability/efficiency/
"The carbon intensity of electricity used by data centers was 48% higher than the US average."
I'd be fine with as many data centers as they want if they stimulated production of clean energy to run them.
But that quote links to another article by the same author. Which says
"Notably, the sources for all this power are particularly “dirty.” Since so many data centers are located in coal-producing regions, like Virginia, the “carbon intensity” of the energy they use is 48% higher than the national average. The paper, which was published on arXiv and has not yet been peer-reviewed, found that 95% of data centers in the US are built in places with sources of electricity that are dirtier than the national average. "
Which in turn links to https://arxiv.org/abs/2411.09786
Which puts the bulk of that 48% higher claim on
"The average carbon intensity of the US data centers in our study (weighted by the energy they consumed) was 548 grams of CO2e per kilowatt hour (kWh), approximately 48% higher than the US national average of 369 gCO2e / kWh (26)."
Which points to https://ourworldindata.org/grapher/carbon-intensity-electric...
For the average of 369 g/kWh. That's close enough to the figure in the table at https://www.epa.gov/system/files/documents/2024-01/egrid2022...
which shows 375 g/kWh (after converting from lb/MWh).
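The arithmetic behind the headline figure is easy to reproduce against either national baseline:

```python
paper_avg = 548    # gCO2e/kWh, weighted data-center average from the paper
us_avg_owid = 369  # gCO2e/kWh, Our World in Data national average
us_avg_epa = 375   # gCO2e/kWh, EPA eGRID 2022, converted from lb/MWh

print(f"vs OWID baseline: {(paper_avg / us_avg_owid - 1) * 100:.1f}% higher")
print(f"vs EPA baseline:  {(paper_avg / us_avg_epa - 1) * 100:.1f}% higher")
```

So the 48% claim barely moves with the choice of national average; it's the state-level figures that look off.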
But the table they compare against shows:

VA 576 g/kWh
TX 509 g/kWh
CA 374 g/kWh

while the EPA table shows:

VA 268 g/kWh
TX 372 g/kWh
CA 207 g/kWh
Which seem more likely to be true. The paper has California at only marginally better than the national average for renewables (which I guess they needed to support their argument, given the number of data centers there).

I like arXiv. It's a great place to see new ideas, and the fields I look at have things I can test myself to see if an idea actually works. But I would not recommend it as a source of truth. Peer review still has a place.
If they were gathering emissions data from the states themselves, they should have calculated the average from that data, not pulled the average from another, potentially completely different measure. Then their conclusions would have been valid regardless of whatever scaling factor they brought into their state calculations. The numbers might have been wrong, but the proportion would have been accurate, and it is the proportion that is being highlighted.
This program posts news to thousands of machines throughout the entire civilized world. Your message will cost the net hundreds if not thousands of dollars to send everywhere. Please be sure you know what you are doing. Are you absolutely sure that you want to do this? [ny]
Maybe we need something similar in LLM clients. It could be phrased in terms of how many pounds of atmospheric carbon the request will produce.
Even driving your car around you at least are somewhat aware of the gas you are burning.
Or, even worse, God forbid, think about how much carbon is produced to create a single bottle or carton of water. Then consider how casually people down bottles of water.
A google search claims water use accounts for 12.7% of US energy.
Another search gave 11.7% of US energy going to powering AI (projected to increase to roughly 25% by 2030).
Taking into account that hydropower provides 6.2% of US energy, I feel comfortable saying your statement isn't true.
To further strengthen my statement, I would like to point out another statistic. NPR gives us an estimate of 300K gallons of water use/day to cool the average data center. That pretty much guarantees an LLM query produces more carbon than my filling a cup from a gravity fed water system filled by rain.
Funny how this suddenly became a thing after electrification became a thing. Need to find a new way to wag the finger after all.
Is there a way to quantify this? My experience as well is that the tire particulate pollution has mostly been an anti-EV talking point.
The oil industry is a conglomerate of degenerates spamming boomer logic all the way down to the workers. Their memes propagate throughout society and lead to the other boomer characteristic of rewriting personal and societal history.
The finger waggers now are being programmed to pretend they talked about tire particulates, and the carheads are being programmed to pretend they never cared about 0-60. This is another "We have always been at war with Eastasia", just like they all opposed the Iraq war from day 1 and didn't cancel the Dixie Chicks, et cetera.
This may have been discussed in specialist literature somewhere but even when I did ecology courses in university circa 2001ish, I never heard about tire particulates, while I did hear a lot about greenhouse gasses.
It's a concern but not a civilization ending concern like climate change. I low key resent these attempts to move goalposts to satisfy the writer's urge for negativity.
Consider that a bus has six to ten tires that each weigh around ten times more than a typical car tire. This is presented as the alternative to cars, is it even any different? Not implausible that it could actually be worse, especially if the bus isn't at full occupancy at all times.
Meanwhile the weight difference between EVs and petroleum cars is emphasized in the complaints, even though it isn't very large, while the much larger weight difference between any cars and buses is ignored. Because the point isn't to complain about tires, it's to complain about EVs.
And if the point actually was to complain about tires then you still wouldn't be talking about EVs, you would be talking about tires and how to make them shed less or construct them out of lower toxicity materials etc.
To make those sorts of calculations easy, you can ignore all the pressure/usage/etc nonsense and just do basic math on tire dimensions (including min/max tread depth and width, not just radius, though I typically ignore siping and whatnot) and typical longevity. Volume lost per mile driven is basic high-school arithmetic, and the only real questions are regarding data quality and whether the self-imposed constraints (e.g., examining real-world wear rather than wear given optimal driving or something) are reasonable.
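For the curious, that basic math looks something like this; the tire size (a common 225/45R17 passenger tire), tread depths, and 40k-mile life are illustrative assumptions, not data:

```python
import math

# Hypothetical example tire: 225/45R17, driven 40k miles to the legal minimum
width_m = 0.225        # section width, approximating the tread band width
new_tread_m = 0.008    # ~8 mm tread depth when new
worn_tread_m = 0.0016  # ~1.6 mm legal minimum
life_miles = 40_000    # assumed longevity

# Rubber volume shed over the tire's life:
# tread-band circumference x width x depth lost (ignoring siping, as above)
outer_radius_m = (17 * 0.0254 / 2) + 0.45 * width_m  # rim radius + sidewall
circumference_m = 2 * math.pi * outer_radius_m
volume_cm3 = circumference_m * width_m * (new_tread_m - worn_tread_m) * 1e6

print(f"≈ {volume_cm3:.0f} cm³ shed over {life_miles} miles, "
      f"≈ {volume_cm3 / life_miles * 4:.2f} cm³/mile for four tires")
```

With these guesses it lands on the order of a few thousand cubic centimeters of rubber per tire lifetime, i.e. a few tenths of a cm³ per vehicle-mile.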
Harder rubber seems like it could make a difference, but then you could also put tires with harder rubber on a car.
You can get a heavier vehicle to have the same pressure at the road by using more and bigger tires, but then the problem is that the tires are bigger and there are more of them.
> plus its normal driving patterns have less wear than typical Tesla use.
Isn't a city bus constantly starting and stopping, both as a result of city traffic and picking up and dropping off passengers?
> To make those sorts of calculations easy, you can ignore all the pressure/usage/etc nonsense and just do basic math on tire dimensions (including min/max tread depth and width, not just radius, though I typically ignore siping and whatnot) and typical longevity.
I tried plugging these in, and it still comes out that a 6-wheel commercial bus has several times the tire wear of a 4-wheel light truck, rather than being the same.
And I expected the difference to be even more, but I guess that goes to show how much the weight argument is motivated reasoning if ~7x the weight is only ~3x the tire wear and then people are complaining about something which is only ~1.2x the weight.
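A sketch of that comparison, with every tire figure here being my assumption rather than measured data:

```python
import math

def rubber_per_mile_cm3(rim_in, width_m, aspect, tread_loss_m, life_miles, n_tires):
    """Rubber shed per vehicle-mile, from tire geometry (all inputs assumed)."""
    outer_r_m = rim_in * 0.0254 / 2 + aspect * width_m  # rim radius + sidewall
    circumference_m = 2 * math.pi * outer_r_m
    vol_cm3 = circumference_m * width_m * tread_loss_m * 1e6
    return vol_cm3 / life_miles * n_tires

# Illustrative figures only: 4-wheel light truck vs 6-wheel city bus
truck = rubber_per_mile_cm3(17, 0.265, 0.70, 0.0075, 50_000, 4)
bus = rubber_per_mile_cm3(22.5, 0.295, 0.80, 0.013, 90_000, 6)

print(f"truck ≈ {truck:.2f} cm³/mile, bus ≈ {bus:.2f} cm³/mile, "
      f"ratio ≈ {bus / truck:.1f}x")
```

With these guesses the bus comes out in the 2x to 3x range per vehicle-mile, which flips to well below a car once you divide by passenger count.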
Pardon me if I ask the obvious question, but did you divide your result by the average number of people moved? Because that's the actual utility of mass vs. individual transport. I would find it rather surprising if tire wear were the one measure where buses didn't win out.
Tire temperature will also play a big role in tire wear, and I wouldn't expect bus tires to get very hot, only rolling half the time and at lower speeds than the typical car.
And of course you also have to factor in passenger count. Buses generally carry more than just 1 or 2 people, while the vast majority of cars will have 1 or 2 people most of the time. And even if a bus's tires wore out twice as fast as a car's, that is still less wear per person than a car.
No, bus tires do not typically last 500k miles. <100k is the norm, and really no more than a long-life car tire.
They do get retreaded more often than car tires do, but that just means they get new rubber added regularly.
My city buses in peak travel hours have anywhere from 20 to 75 people on them. Even if we assume that every one of those folks would have carpooled (which rarely happens), we're looking at a lot of cars, and thus tires, on the road.
This is really the problem with buses outside of extremely high density areas. (And extremely high density areas should have subways.)
You get off work at 5PM, you want to go to an entertainment venue and then go home at 10PM. You can find a full bus a 5:15PM that will take you there because it's rush hour, but then you can't get home on the bus because there is no bus service after 9PM. Which means you can't take the bus there during rush hour either, because you need your car to be there so you can get home.
Or, you can run mostly-empty buses in the darkness hours, but there goes your efficiency.
The biggest problem with tailpipe emissions used to be horrendous smog. That was mostly solved in many places, and now the biggest problem is the impact on the global climate.
The biggest issue with childhood mortality used to be disease. Now we (correctly) focus more on accidental deaths.
EVs solved tailpipe emissions, but they’re not perfect. Their biggest problem is just something else.
There’s been decades of lies about climate change. And once the truth got out, society was already massively dependent on fossil fuels. For cars specifically, it was a deliberate policy to make e.g. the US car-dependent. And once the truth became undeniable, the cope was switched to people’s “carbon footprint” (British Petroleum). In fact there are rumors that the cope echoes to this day.
Zoom out enough and it becomes obviously unproductive to make “mass ignorance” the locus of attention.
Emissions should be fixed on the production side (decarbonization) not on the demand side (guilt/austerity).
Scaling up battery production makes EVs more appealing on the demand side. How do you disincentivize fossil fuel production?
That’s what I said?
> Reducing demand is e.g. requiring you to drive fewer miles.
Demand can be reduced by increasing fuel prices. Either at the pump through consumption tax or at production. The effect remains the same.
> You get a car that doesn't run on petroleum and CO2 goes down while vehicle miles can stay the same or go up.
Obviously. What does that have to do with disincentivizing fossil fuel production or consumption?
The problem with fossil fuels isn’t that they pollute, but that most of the negative impact of that pollution is borne by others. This results in an artificially low price which distorts the market and results in economically inefficient overuse.
Capture that cost by making producers pay a tax of the corresponding amount, and market forces will use the “right” amount of fossil fuels in ways that are a net benefit.
Do you really think the average person could get within 2 orders of magnitude when estimating their carbon footprint for a year?
Why? AI isn't a human being. We have no obligation to be "fair" to it.
I can picture an Elizabeth Holmesian cartoon clutching her diamond necklace.
"Oh, won't somebody think of the tech billionaires?!"
If you don't freak out about running your shower or microwave for a couple seconds or driving a few hundred feet
The basic premise of the modern tech industry is scale. It's not one person running a microwave for a couple of seconds, it's a few billion people running a microwave for the equivalent of decades.
So yes let’s hand wring over AI and continue to do nothing about everything else. And we’ll probably do nothing about AI either, but the endless articles will no doubt keep people distracted.
Political lobbying.
“We care”. Because British Petroleum told us to care.[1] Now the new scapegoat grift is how many prompts you use to make your “creative” wedding invitations. Never mind industrial use, though. We’ll just hammer every little API or endpoint, slurp up the data, then do the same thing tomorrow, because why cache? Keep it simple. It’s not our cost.[2] No one will scold us (that can get to us).
Public service announcement huh. Let’s just fall for it again.
[1] https://news.ycombinator.com/item?id=44046315
[2] https://drewdevault.com/2025/03/17/2025-03-17-Stop-externali...
If memory serves, Jet A is not taxed at all federally in the case of for-profit corporations (while non-commercial users DO pay a tax!), and many states also either do not tax it or tax it very little.
It's completely insane that we do not tax fuel usage for probably the most energy-intensive way to move people and/or goods, when often that movement of people is entirely frivolous.
Eh. It's not Bill Gates and Alice Walton. Sometimes the obvious answer is the real one: It's the fossil fuel industry.
> It's completely insane that we do not tax fuel usage for probably the most energy-intensive way to move people and/or goods, when often that movement of people is entirely frivolous.
That one's just the arbitrage problem. Planes move around. If there is an international flight to a country that doesn't tax jet fuel (or taxes it less) then the plane is going to fly into LAX with enough fuel still in the tank to get back to the other jurisdiction and fill up again. Which actually increases fuel consumption because fuel is heavy and they otherwise wouldn't want to do that.
This is the same reason the EU doesn't tax jet fuel.
Any reason that can't be treated as a fuel import and taxed accordingly? I understand current laws may not allow it but is that legislation impossible to write?
Joe driving to work spends a larger fraction of his income on fuel and thus fuel tax than his rich counterpart.
This is true for all "let the consumer/poluter pay" taxes, they're all regressive. They say: it's fine to burn up the world as long as you're rich.
Personally I like the idea of setting the price for emitting 1 ton of CO2 equivalent emissions to the realistic cost of capturing 1 ton of CO2. At least, that seems like a reasonable end goal for a carbon tax, since that could fully account for the negative externality. This would of course be obscenely expensive, which would be a strong incentive to lower carbon emissions where possible, and for people to consume less of products that require large emissions to make or use.
The carbon tax would also have to apply to imported goods to be effective, and tracking how much tax should apply to imports would be even more difficult than doing so for domestic sources of pollution.
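To get a feel for the magnitude of "obscenely expensive": at an assumed direct-air-capture cost of $600/tCO2 (estimates vary widely) and the standard ~8.9 kg of CO2 per gallon of gasoline, the pump price impact would be:

```python
capture_cost_per_ton = 600.0  # $/tCO2, assumed direct-air-capture cost
kg_co2_per_gallon = 8.9       # combustion CO2 from one gallon of gasoline

tax_per_gallon = capture_cost_per_ton * kg_co2_per_gallon / 1000
print(f"carbon tax ≈ ${tax_per_gallon:.2f} per gallon of gasoline")
```

On the order of $5 per gallon at today's capture costs, falling as capture gets cheaper, which is exactly the incentive structure described above.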
Please see my comment again. Under a revenue-neutral carbon tax everyone gets money back. But they don't realize it. Costs only go up for people who emit more carbon than average.
Exhibit A: Canada's recent repudiation of their carbon tax because nobody knew they were getting rebates. https://www.cbc.ca/news/politics/carbon-tax-rebate-rebrand-1...
> Talk to some outside of your activist bubble
That's quite condescending btw. Is it "activism" to try to avert a calamity that will increase the cost of living by a lot more 20 years from now? I think it's good fiscal sense. Long-term thinking and planning. Y'know adult shit.
> Humans are, on average, selfish beings
And easily swayed by stupid arguments. Exhibit B: Canada's recent repudiation of the carbon tax because fossil fuel industry propaganda convinced everyone that the tax was the cause of price increases. Now prices will stay the same (because the market will bear them) but no one will get any rebate money.
https://en.m.wikipedia.org/wiki/Sales_taxes_in_the_United_St...
Those were all voted in. It's a type of tax that people are happy with for whatever reason.
Even flying would only cost about 10% more for example. And most other activities have carbon free alternatives they can shift to rather than just eat the cost. Which is kind of the point.
Are solar panels convenient? All polysilicon today is made with fossil fuels, and the R&D to make it with renewable energy is still in-progress. Not to mention that we ship them across the ocean with fossil fuel.
Same thing with steel – both are critical input materials and can be made without fossil fuels, but they aren’t today. Maybe a carbon tax would fix that!
They've been implemented all over the world, because they're effective. They cover 31% of emissions in developed nations.
To whatever degree you could say they are unpopular, they're unpopular in regions where the government doing stuff about climate change (or just "the government doing stuff") is unpopular, which makes it odd to single out putting a price on carbon specifically
See where they are used here: https://carbonpricingdashboard.worldbank.org/
This is one of the more hysterical things I have heard.
How would we even know it was translating animal language correctly?
You can see traffic. It's easy to understand the dangers of a collision: when you drive into something unexpectedly, your body takes a hit and you get frightened, since you immediately realise it might cost you a lot of money, though you don't know for sure.
Being subtly manipulated by a disgustingly subservient fake conversationalist is another thing altogether.
You underestimate the pervasiveness of AI, and in particular, ChatGPT. It is quite popular in the blue collar trades.
And yeah, a lot of them probably regard everything that ChatGPT tells them as fact.
"Rolling coal" is the practice of modifying a diesel engine—usually in pickup trucks—to increase the amount of fuel entering the engine, which causes the vehicle to emit large, thick plumes of black smoke from the exhaust. This is often done by tampering with or removing emissions control devices.
Yes.
Llama 3.1: ~0.7 to 1.5 grams of CO2
Rolling coal event: ~10,000 to 100,000+ grams of CO2
At the top of that range, that's roughly 70 lb of gasoline!
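Sanity-checking those figures; the per-query number is the upper Llama estimate above, and ~3.17 kg of CO2 per kg of gasoline burned is standard combustion chemistry:

```python
co2_per_query_g = 1.5          # upper Llama 3.1 estimate above
rolling_coal_co2_g = 100_000   # upper rolling-coal estimate above
g_co2_per_kg_gasoline = 3170   # CO2 from burning 1 kg of gasoline

queries = rolling_coal_co2_g / co2_per_query_g
fuel_lb = rolling_coal_co2_g / g_co2_per_kg_gasoline * 2.205
print(f"one event ≈ {queries:,.0f} LLM queries, burning ≈ {fuel_lb:.0f} lb of fuel")
```

So one rolling coal event at the high end is in the tens of thousands of LLM queries.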
I refuse to believe that anyone with a functioning brain would choose to engage in it. Are you saying you do?
The only purpose is to scapegoat the possible environmental or economic fallout. Might as well put it on individuals. Like what’s always done.
I’ve already seen it on the national broadcast, where some supposed experts were wagging their fingers about using AI for “fun”. Making silly images.
Meanwhile we’re gonna put AI to good use in arms races: more spam (automated applications, ads, ads, ads, abuse of services) and anti-spam. There’s gonna be so much economic activity. Disruptive.
Likewise, I doubt that USENET warning was ever true beyond the first few years of the network's lifetime. Certainly when everything was connected via dial-up, yes, a single message could incur hundreds of dollars of cost when you added up the few seconds of line time it took to send it across the whole world. But that's accounting for a lot of Ma Bell markup. Most connections between sites and ISPs on USENET were done through private lines that ran at far faster speeds than what you could shove down copper phone wiring back then.
Are you saying all of that new capacity is needed to power non-LLM stuff like classifiers, adtech, etc? That seems unlikely.
Had you said that inference costs are tiny compared to the upfront cost of training the base model, I might have believed it. But even that isn't accurate -- there's a big upfront energy cost to train a model, but once it becomes popular like GPT-4, the inference energy cost over time is dramatically higher than the upfront training cost.
You mentioned batch computing as well, but how does that fit into the picture? I don't see how batching would reduce energy use. Does "doing lots of work at once" somehow reduce the total work / total energy expended?
Well, partly because they (all but X, IIRC) have commitments to shift to carbon-neutral energy.
But also, from the article:
> ChatGPT is now estimated to be the fifth-most visited website in the world
That's ChatGPT today. They're looking ahead to 100x-ing (or 1,000,000x-ing) the usage as AI replaces more and more existing work.
I can run Llama 3 on my laptop, and we can measure the energy usage of my laptop--it maxes out at around 0.1 toasters. o3 is presumably a bit more energy intensive, but the reason it's using a lot of power is the >100MM daily users, not that a single user uses a lot of energy for a simple chat.
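To put rough numbers on that, here's a sketch assuming a 100 W laptop draw (0.1 toasters, with a toaster at roughly 1 kW) and 30 seconds of generation per response; both figures are guesses, not measurements:

```python
laptop_power_w = 100.0       # ~0.1 toasters; a toaster draws roughly 1 kW
seconds_per_response = 30.0  # assumed local generation time for a short reply
daily_users = 100e6          # order-of-magnitude active users

joules_per_response = laptop_power_w * seconds_per_response
fleet_kwh_per_day = joules_per_response * daily_users / 3.6e6  # J -> kWh

print(f"{joules_per_response:.0f} J per response; "
      f"{daily_users:.0e} users/day ≈ {fleet_kwh_per_day:,.0f} kWh/day")
```

Even at laptop-class efficiency the aggregate lands in the tens of MWh per day. The scale is the story, not the per-user cost.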
This seems like a classic tragedy of the commons, no? An individual has a minor impact, but rational switching to LLM tools by the collective will likely have a massive impact.
Something to temper this: lots of these AI datacenter projects are being cancelled or put on hiatus because the demand isn't there.
But if someone wants to build a nuke reactor to power their datacenter, awesome. No downsides? We are concerned about energy consumption only because of its impact on the earth in terms of carbon footprint. If its nuclear, the problem has already been solved.
Wait, any sources for that? Because everywhere I go, there seems to be this hype for more AI data centers. Some fresh air would be nice.
AI seems like it is speedrunning all the phases of the hype cycle.
"TD Cowen analysts Michael Elias, Cooper Belanger, and Gregory Williams wrote in the latest research note: “We continue to believe the lease cancellations and deferrals of capacity points to data center oversupply relative to its current demand forecast.”"
If you want to know more about energy consumption, see this 2 part series that goes into tons of nitty-gritty details: https://blog.giovanh.com/blog/2024/08/18/is-ai-eating-all-th...
The article uses open source models to infer cost, because those are the only models you can measure since the organizations that manage them don't share that info. Here's what the article says:
> The largest of our text-generation cohort, Llama 3.1 405B, [...] needed 3,353 joules, or an estimated 6,706 joules total, for each response. That’s enough to carry a person about 400 feet on an e-bike or run the microwave for eight seconds.
I just looked at the last chat conversation I had with an LLM. I got nine responses, about the equivalent of melting the cheese on my burrito if I'm in a rush (ignoring that I'd be turning the microwave on and off over the course of a few hours, making an awful burrito).
How many burritos is that if you multiply it by the number of people who have a similar chat with an LLM every day?
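A rough sketch of that multiplication (my assumptions, not the article's: 100 million users a day, nine responses each, and the 6,706 J/response figure quoted above):

```python
# Back-of-envelope scaling of the per-response estimate quoted upthread.
# All three inputs are assumptions for illustration, not measured figures.
J_PER_RESPONSE = 6_706          # Llama 3.1 405B estimate from the article
RESPONSES_PER_USER = 9          # my chat from the comment above
USERS_PER_DAY = 100_000_000     # assumed daily user count

daily_joules = J_PER_RESPONSE * RESPONSES_PER_USER * USERS_PER_DAY
daily_mwh = daily_joules / 3.6e9    # 1 MWh = 3.6e9 J
print(f"{daily_mwh:,.0f} MWh/day")  # on the order of 1.7 GWh/day
```

That is a lot of burritos, but still small next to, say, US daily electricity generation (around 11 million MWh).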
Now that I'm hungry, I just want to agree that LLMs and other client-facing models aren't the only ML workload and aren't even the most relevant ones. As you say adtech has been using classifiers, vector engines, etc. since (anecdotally) as early as 2007. Investing algorithms are another huge one.
Regarding your USENET point, yeah. I remember in 2000 some famous Linux guy freaking out that members of Linuxcare's sales team had a 5 line signature in their emails instead of the RFC-recommended 3 lines because it was wasting the internet or something. It's hard for me to imagine what things were like back then.
So they used to send this message, but then it stopped, I assume. Costs dropped a lot, or the benefits outweighed all the associated costs. The same can happen here.
How is this even quantifiable?
How about this. Before using AI to make fake images and help kids cheat on their homework, we take it offline and use it to solve its own problem of energy use.
You know why this does not happen? Because the goal is profit, and the profit comes not from solving real, important problems, but from making people think it is helping them solve made-up problems.
Edit: it's sad that I'm not sure if the downvotes are because people can't tell that this is sarcasm, or because they know that it is.
A few big consumers in centralized locations aren't changing the grid as much as the energy transition from fuels to electricity is.
> In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023.
As we all know, the generative AI boom only really kicked into high gear in November 2022 with ChatGPT. That's five years of "AI" growth between 2017 and 2022 which presumably was mostly not generative AI.
> There is a significant caveat to this math. These numbers cannot serve as a proxy for how much energy is required to power something like ChatGPT 4o.
Otherwise this is an excellent article critiquing the very real problem that is opacity of these companies regarding model sizes and deployments. Not having an honest accounting of computing deployed worldwide is a problem, and while it's true that we didn't really do this in the past (early versions of Google searches were undoubtedly inefficient!), it's not an excuse today.
I also wish this article talked about the compute trends. That is, compute per token is going significantly down, but that also means use of that compute can spread more. Where does that lead us?
Even where it matches, sorta (the 400 feet e-bike thing), that only works out for me because I use an AMD card. An NVIDIA card can have several times the generation speed at the same power draw, so it all falls down again.
And the parameters they tried to standardize their figures with (the 1024x1024 thing) are also a bit meh, because the SAME number of pixels in a different aspect ratio can have huge variations in generation speed and thus power usage. For instance, for most Illustrious-type checkpoints the speed is about 60% higher at aspect ratios other than 1024x1024. It's all a bit of a mess.
The racks I am personally responsible for consume 17.2kW. That’s consistent across the year; sure things dip a bit when applications are shut down, but in general 17.2kW is the number. Presuming a national average of 1.2kW per home, each rack of equipment I oversee could potentially power 14 houses. I am responsible for hundreds of these racks, while my larger organization has many thousands of these racks in many locations worldwide.
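A quick sanity check on that math, using the numbers stated above (the 1.2 kW average household draw is the commenter's assumption):

```python
RACK_KW = 17.2   # per-rack draw stated above
HOME_KW = 1.2    # assumed national average household draw

homes_per_rack = RACK_KW / HOME_KW
print(f"{homes_per_rack:.1f} homes per rack")  # ~14.3, matching the "14 houses" claim
```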
I’ve found no other way to let the scale of this sink in. When put this way it is very clear: the price isn’t worth it to humanity. Being able to get, say, DoorDash, is pretty neat! But not at the cost of all our hoarded treasure, and certainly not at the cost of the environment on the only planet we have access to.
The work done by AI will only ever benefit the people at the top. Because to be frank: they won’t share. Because the very wealthy have hoarding disorder.
Each!
We are in no meaningful sense torching the biosphere to get AI.
This is condescending and rude. It also strikes me as obviously wrong. The 'response' isn't emotional in the slightest; it just explains the emotional and cognitive experience of acquiring understanding. It's a reasonable, well-reasoned explanation of the difficulty of intuitively grasping how much energy these data centers, and thus the companies that use them, consume -- and then of the shock many experience when it dawns on them.
> For example, an A320 aloft uses the same energy as two thousand of your hypothetical racks (2.5 tons of kerosene per hour).
> Each!
> We are in no meaningful sense torching the biosphere to get AI.
What exactly is the reasoning here? That airplanes use a lot of energy, just like data centers or compared to data centers, and therefore that AI isn't ecologically damaging -- or isn't "meaningfully" damaging, whatever that means? That's not just wrong. It's a non sequitur.
There's a simpler, easier, better way to wrap our heads around the data, one that doesn't require false dichotomies (or something like that): humans are torching the biosphere both to get AI and to travel.
No. It is not "just like data centers". That is my point. The amount of energy used by transportation is several orders of magnitude more than the energy used by information technologies. The energy used to fly back and forth to Nevada to write this whiny article was more than was needed to train some of the latest models. The whole topic is totally nonsense.
That said, your response was emotional as well: as if your words were cast down from on high, a gift from the gods! Your arrogance and rudeness should inspire some self-examination, but I am not hopeful on that front.
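For what it's worth, the A320 comparison lands in the right order of magnitude. A sketch, assuming ~43 MJ/kg for jet fuel (a typical figure, my assumption) and the 17.2 kW racks from upthread:

```python
KEROSENE_MJ_PER_KG = 43.0   # typical jet-fuel energy density (assumption)
BURN_KG_PER_HOUR = 2_500    # "2.5 tons of kerosene per hour" from the quote
RACK_KW = 17.2              # per-rack draw from upthread

aircraft_mw = BURN_KG_PER_HOUR * KEROSENE_MJ_PER_KG / 3600  # MJ per second == MW
racks_equivalent = aircraft_mw * 1_000 / RACK_KW
print(f"{racks_equivalent:,.0f} racks")  # ~1,700, close to the quoted "two thousand"
```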
Unless your racks can only serve 14 customers.
Table A1, PDF page 29:
(P.S. check your spelling!)
... So don't? Explicitly shift the cost to the customer.
If I want to hook up to the energy grid with 3-phase power, I pay the utility to do it.
If a business wants more power and it isn't available, then the business can pay for it.
Then only businesses that really need it will be willing to step up to the plate.
No amount of "accounting" or "energy needs prediction" will guard against regulatory capture.
Training SOTA models will, like steel mills or other large industrial projects, require a lot of environmental footprint to produce. But my prediction is that over time the vast majority of use cases in the hands of users will be essentially run on device and be basically zero impact, both in monetary cost and environment.
Which I already thought was odd, because London would need all that electricity to see through the giant mountain of poop piled up by all the horses the British use for transportation.
Not sure about the "comprehensive" claim here if end-to-end query chains were not considered.
For example, the mobile wireless nodes used by the majority of users are totally ignored as contributors to energy consumption. The wireless power amplifiers (PAs) on both the user and base-station sides are notorious for their inefficiency, running at less than 50% in practice even though in theory they can reach around 80%. Almost all current AI applications are cloud-based, not local-first, so the end users' energy consumption and contribution need to be counted.
Impressive how Big Tech refuses to share data with society for collective decisions.
I'd also recommend the Data Vampires podcast series:
https://techwontsave.us/episode/241_data_vampires_going_hype...
https://techwontsave.us/episode/243_data_vampires_opposing_d...
https://techwontsave.us/episode/245_data_vampires_sacrificin...
https://techwontsave.us/episode/247_data_vampires_fighting_f...
"Hard to test", but very obviously true if you make any attempt at guessing based on making a few assumptions... like they seem comfortable doing for all the closed source models they don't have access to being run in conditions they're not testing for. Especially considering they're presenting their numbers as definitive, and then just a couple paragraphs down admit that, yeah, they're just guessing.
Regardless, I know for a fact that a typical commercial shoot uses way more energy than riding an e-bike across the TMZ (considering they're definitely using cars to transport gear, which gives you less than 4 miles for the same energy).
If you don't want to go there, it doesn't really matter how much energy the human uses because the human will just use the same energy to do something else.
Humans have got to exist and need to work to eat. They don't really, necessarily, existentially need to be 10x more productive with the help of AI.
But I'll be honest, that's not really a solid argument, because it could rapidly lead to the question of why they do this exact job in the first place, instead of e.g. farming or whatever else there might be that can be called a net positive for humanity without reservations.
Nearly any other daily activity of a consumer in the developed world uses orders of magnitude more energy and resources than scrolling TikTok on a phone.
Examples?
– Driving to work: commuting burns far more fuel in a week than your phone uses in a year.
– Gym sessions: heated, lit, air-conditioned spaces plus transit add up quickly.
– Gaming or watching TV: bigger screens, bigger compute, easily 100x higher power needs vs. phone gaming.
– Casually cooking at home: using a metric ton of appliances (oven, stove, fridge, pans) powered like twice a week, replaced every ~10 years.
– Reading print media: a daily newspaper or weekly book involves pulp, ink, shipping, and disposal.
– Streaming on a laptop or smart TV: even this draws more power than your phone.
– Taking a shower: the hot water energy use alone dwarfs your daily phone charge.
Of course not doing any sports or culture is also not what societies want, but energy-wise a sedentary, passive TikTok lifestyle is as eco-friendly as it gets vs. any other real-world example.
Phones are basically the least resource-intensive tool we use regularly. Externalities, context, and limited human time effects matter a lot more than what one phone uses vs the other.
Even e-readers already break even with books after 36 paper equivalents.
https://www.npr.org/2024/05/25/1252930557/book-e-reader-kind...
I expect rapid progress in both model efficiency and hardware specialization. Local inference on edge devices, using chips designed specifically for AI workloads, will drastically reduce energy consumption for the majority of tasks. This shift will free up large-scale compute resources to focus on truly complex scientific problems, which seems like a worthwhile goal to me.
I can imagine that doing some clever offloading to a normal programs and using the LLM as a sort of "fuzzy glue" for the rest could improve the efficiency on many common tasks.
The low hanging fruit has been plucked by said silicon development process and while remarkable improvement in AI efficiency is likely it is highly unlikely for that to follow a similar curve.
More likely is slow, incremental process taking decades. We cannot just wish away billions of parameters and the need for trillions of operations. It’s not like we have some open path of possible improvement like with silicon. We walked that path already.
Maybe photonics...
It's just hard to replicate the power and efficiency of CUDA.
> AI is unavoidable
> We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode
This is surely meant to be an objective assessment, not a fluff piece.
I found this article to be a little too one sided. For instance, it didn’t talk about the 10x reductions in power achieved this past year — essentially how gpt4 can now run on a laptop.
Viz, via sama “The cost to use a given level of AI falls about 10x every 12 months, and lower prices lead to much more use. You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period. Moore’s law changed the world at 2x every 18 months; this is unbelievably stronger.” https://blog.samaltman.com/three-observations
1/ did not exist before
2/ does not replace/reduce previous/other power (some, very much more critical and essential) usages.
3/ a LOT of tasks are still way more energy/time-efficiently done with regular existing methods (dedicated software, or even by hand), but are still asked/improperly routed to AI chatbots that ... statistically guess the answer.
It also leads to automation and efficiency, even if it isn’t a fully linear path.
AI isn't a waste. We can't let environmental consciousness get in the way of rather natural human development. Especially CO2. (I have different opinions about biodiversity because of the irreversible loss. I also believe that we have the technology to pause and reverse climate change, but don't pursue it because of degrowth ideologies.)
While “economic value” ≠ “solving climate change”, without enough tax revenue costly transitions are impossible.
It's revealing that for some, it's easier to imagine Earth without life than Earth without capitalism.
For _whom_, really?
Shareholders of AI tools producing companies?
Shareholders of companies that pretend to replace people and verifiable working processes with poorly understood black boxes?
(I can't help but notice the _same_ playbook as with crypto, NFTs, Web3, metaverse, and the same enabling-hardware provider).
Value, automation, efficiency will not solve the climate change challenges, if they are not directed towards it aggressively, as well as humanity acceptance and well-being.
Alas, they are directed towards a very little few bank accounts. Violence and subjugation, in many of their forms, is directed towards the others. It's not by accident.
Give some concrete examples and stats?
> It also leads to automation and efficiency, even if it isn’t a fully linear path.
Ditto.
He’s one of the most knowledgeable people in the world on the topic. It’s like saying that the CEO of BMW isn’t citable in conversations about expected cost decreases in electric cars.
Any environmental topic is susceptible to a huge amount of groupthink. The world isn’t so binary as people make it out to be. It is far from truth that LLMs=bad for environment, any more than computers=bad for the environment.
What I mean is that I have a healthy level of skepticism with Altman. He has to constantly battle for funding. Surely he must be knowledgeable about LLMs, but he's the CEO of the largest AI company in the world, and his PR needs him to look like the "most knowledgeable person in the world on the topic." I think that title should go to the engineers and developers working on these technologies, not a capital founder.
All that said, I agree that LLM's being bad for the environment is a complex topic. I think it would be more accepted if people had safety nets and could be excited for AI to take their job instead of having to be terrified, or if AI isn't just used as another tool for increasing wealth inequality.
Altman mentions a 150x increase in efficiency; you claim that trend will continue through to GPT-6. At that point these models would be 22,500x as efficient as they currently are, which would mean generating a 10-hour-long video would cost around the same amount of electricity as running your microwave for 15 minutes. Will you have some introspection if that doesn't come to pass?
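The compounding arithmetic behind that figure, for reference (two more generations at the same 150x gain, as the parent's extrapolation assumes):

```python
gain_per_generation = 150   # GPT-4 -> GPT-4o token-price drop cited by Altman
generations = 2             # extrapolated twice more, to "GPT-6"

total_gain = gain_per_generation ** generations
print(total_gain)  # 22500
```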
I haven't seen them merge a pull request in less than 3 days. However simple.
The team lead, in 2 years, contributed less than 20 pull requests, all of which were 1 to 3 line changes to CSS or similar.
(cgroups, as per a sibling comment, are addressed in this write-up as "not maximally satisfying")
They could go even go further and report themselves as a perf issue, like when google's lighthouse reports google analytics.
This is outrageous. People still struggle to access fresh water (and power), but hey, "sustainability is all to our company" is always promoted as if something nice is being done on the behemoths' side. BS. What a waste of resources.
I truly condemn all this. To this day I do still refuse to use any of this technology and hope that all this ends in the near future. It's madness. I see this as nothing more than next-gen restrictive lousy search engines, and as many have pointed out ads are going to roll soon. The more people adopt it the worse will be for everyone.
I always emphasize this: 10-15 years ago I could find everything through simple web searches. Everything. In many cases even landing on niche and unexpected but useful and interesting websites. Today that is a difficult/impossible task.
Perhaps there is still room for a well-done traditional search engine (haven't tried Kagi but people in general do say nice things about it) to surface and take the lead but I doubt it, when hype arrives especially in the tech industry people follow blindly. There are still flourishing "ai" startups and from night to day everyone has become a voice or expert on the subject. Again: BS.
Traditional web engines and searches were absolutely just fine, and I was quite impressed with their outputs. I remember it. What the heck has happened?
Of note, cooling is water evaporation, so the water will inevitably come back to us as good as new. This contrasts with uses that actually pollute water.
Very quickly skimming, I have some trouble taking this post seriously when it omits that the larger DeepSeek one is a mixture-of-experts that will only use 12.5% (iirc) of its components for each token.
The best summary of text energy use I've seen is this (seemingly more rigorous, although its estimates are consistent with the final numbers made by the present post): epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use
Estimates for a given response vary widely, from a "typical" query (0.3 Wh; 1,080 joules) to a maximal-context query (40 Wh; 144,000 joules). Assuming most uses don't come close to maximizing the context, the energy use of text seems very small compared to the benefits. That being said, the energy use for video generation seems substantial.
I would be interested in seeing the numbers corresponding to how LLMs are typically used for code generation.
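The unit conversions in those estimates check out, if anyone wants to verify:

```python
def wh_to_joules(wh: float) -> float:
    return wh * 3_600  # 1 Wh = 3,600 J

typical_j = wh_to_joules(0.3)   # 1,080 J for a "typical" query
maximal_j = wh_to_joules(40)    # 144,000 J for a maximal-context query
spread = maximal_j / typical_j  # ~133x between the two estimates
```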
- "by 2028 [...] AI alone could consume as much electricity annually as 22% of all US households."
What would the 22% be if compared against all US energy use instead of just US households?
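A very rough sketch: residential use is on the order of 38% of US electricity sales (my assumption, roughly the EIA figure), so 22% of household consumption works out to around 8% of all US electricity. Note that's electricity only, not total US energy:

```python
RESIDENTIAL_SHARE = 0.38  # assumed residential share of US electricity sales
AI_VS_HOUSEHOLDS = 0.22   # figure quoted above

share_of_all_electricity = AI_VS_HOUSEHOLDS * RESIDENTIAL_SHARE
print(f"{share_of_all_electricity:.1%}")  # ~8.4% of all US electricity
```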
I guess it becomes okay when the companies guzzling the energy are some of the biggest tech employers in the world, buttering your bread in some way.
VR is back in its niche.
3DTV... I've not seen that marketed for quite a while now. Things that are fads will die out eventually.
Crypto meanwhile is in an odd space: everyone knows blockchains are too much of a hassle and that the volatility is too high, so they're using centralized exchanges and "stablecoins" (shadow dollars). There's still a huge amount of money there but not quite as much as the stadium ads / FTX peak.
It's probably coming back in a few years via lightfield displays, though for some reason Google seems to think the killer app for those is videoconferencing.
Whether the cost/benefit works out in the case of AI is another question.
On one hand, the cost of compute per token has gone down a lot, and will continue to go down, because that's exactly the economic incentives at play. We had a little short-term nonsense where "the bigger the better" was all the rage, but inference was never this way, and now training is also pushing in this direction.
But on the other hand, less compute per token means it can be more broadly deployed. And so there is likely more energy use, not less, in the long run.
Economic chaos is one thing, Skynet is another. And then people with the power are stupid and evil. Joy.
Companies like Apple and Google are both building data centers and trying to make on-device AI a thing. Unfortunately, they also keep inventing new, more expensive algorithms.
It’s at least plausible that most LLM use will become cheap enough to run on battery-limited devices like laptops and phones, though it’s not what most people are betting on.
[1] https://epoch.ai/data-insights/llm-inference-price-trends
Build more nuclear, build more solar. Tax carbon.
Erm ... that's a weird date considering this article came out yesterday. They actually pledge to triple the world's nuclear capacity by 2050[1]
There are a couple of weird things like that in this article, including the classic reference to "experts" for some of its data points. Still ... at least somebody's trying to quantify this.
[1] https://www.world-nuclear-news.org/articles/amazon-google-me...
Yes, much of what is being promoted is slop. Yes, this bubble is driven by an overly financialized economy. That doesn't preclude the possibility of AI models precipitating meaningful advancements in the human condition.
From refrigeration to transportation, cheap and abundant energy has been one of the major driving forces in human advancement. Paradoxically, consuming cheap energy doesn't reduce the amount of energy available on the market. Instead it increases the size of the market.
I'm worried about the environmental impacts of this, but from everything I've seen society values model output more. Curious to watch this over the rest of the decade.
How many batteries do you need to power a datacenter at night?
The conditions for that don’t exist everywhere, but we don’t need these datacentres everywhere. LLMs aren’t latency-sensitive.
Unfortunately the current government of the US is entirely beholden to fossil fuel interests so they're actively hostile to offshore wind.
Turning 4% of the U.S.'s electricity into cat videos and online shopping and advertisements and heat already sounds like a lot of use. Maybe the rapid rise of AI use is what's alarming people?
Contrast the 2016 study[0] of data center energy use where use was recently flat because of efficiency improvements in 2010-2020 but historically there was a ton of growth in energy consumption since ~1990; basically we have always been on a locally exponential growth curve in data center energy use but our constant factors were being optimized by the hyperscalers in that 2010-2020 period.
We also need to compare the efficiency of AI with other modes of computation/work. The article goes into detail on the supposed actual energy use but there's a simple metric; All the large companies provide costs per unit of inference which can put a hard ceiling on actual energy cost. Something like $20/1M tokens for the best models. METR used a 2M token budget. So you can currently price out N hours of work at $40 from whichever latest METR benchmarks come out and have a worst case cost for efficiency comparison.
Lastly, if we're not on a trend toward having Dyson swarms of compute in the long run then what are we even doing as a species? Of course energy spent on compute is going to grow quadratically into the distant future as we expand. People are complaining about compute for AI but compute is how we figure things out and get things done. AI is the latest tool.
[0] https://eta.lbl.gov/publications/united-states-data-center-e...
I can't decide whether they think it won't be that bad, or the scientific forecasts are wrong, or that they just don't care because whatever turmoil results from serious climate change, they'll be able to rise above it, and well, fuck the rest of humanity.
I'm disgusted by our sense of priorities. (Though maybe I shouldn't be since we live in a country that values subsidizing the industrial-military complex over the health and education of its citizens.)
Environmental, social, and governance (ESG) is shorthand for an investing principle that prioritizes environmental issues, social issues, and corporate governance.