I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. That's not a lot when you factor in the number of people who use it compared to the number who take the flight.
Whatever you’re imagining the impact to be, it’s probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
Please show your math.
One Nvidia H100 DGX AI server consumes 10.2kW at 100% utilization, meaning that running one server flat out for a year uses roughly 89 MWh, about as much electricity as eight typical US homes consume in that year. And this is just a single 8-GPU server; it excludes the electricity required by the networking and storage hardware, let alone the electricity required to run the facility’s climate control.
xAI alone has deployed hundreds of thousands of H100 or newer GPUs. Let’s SWAG 160K GPUs = ~20K DGX servers = >200MW for compute alone.
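The SWAG above is easy to sanity-check. A minimal sketch, assuming the figures quoted in this thread (10.2 kW per 8-GPU DGX H100 server, ~10,500 kWh/year for a typical US home):

```python
# Back-of-the-envelope check of the xAI deployment numbers above.
# Assumed figures: 10.2 kW per 8-GPU DGX H100 server at full utilization,
# ~10,500 kWh/year for a typical US home.
GPUS = 160_000
GPUS_PER_SERVER = 8
KW_PER_SERVER = 10.2
HOME_KWH_PER_YEAR = 10_500

servers = GPUS // GPUS_PER_SERVER             # 160K GPUs -> 20K servers
compute_mw = servers * KW_PER_SERVER / 1000   # ~204 MW for compute alone

# One server running continuously for a year:
server_kwh_per_year = KW_PER_SERVER * 8760    # ~89,352 kWh
homes_equivalent = server_kwh_per_year / HOME_KWH_PER_YEAR  # ~8.5 homes

print(servers, round(compute_mw), round(homes_equivalent, 1))
```

That lands at roughly 204 MW of compute draw, before networking, storage, and cooling overhead are counted.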
The H100 is old, too. The state-of-the-art GB200 NVL72 draws 120kW per rack.
Musk is targeting not 160K but literally one million GPUs deployed by the end of this year. He has brought in multiple mobile natural gas turbines, which he is now operating without any environmental permits or controls, to the detriment of the locals in Memphis.
This is just one company training one typical frontier model. There are many competitors operating at similar scale and sadly the vast majority of their new capacity is running on hydrocarbons because that’s what they can deploy at the scale they need today.
I’d like to understand what this math was before accepting this as fact.
If their energy consumption actually was so small, why are they seeking to use nuclear reactors to power data centres now?
Because demand for data centers is rising, with AI as just one of many reasons.
But that’s not as flashy as telling people it takes the energy of a small country to make a picture of a cat.
Also interesting that we’re ignoring something here – big tech is chasing cheap sources of clean energy. Don’t we want cheap, clean energy?
AI is the driver of the parabolic spike in global data center buildouts. No other use case comes close in terms of driving new YoY growth in tech infra capex spend.
Sir we do not make reasonable points in here, you’re supposed to hate AI irrationally and shut up.
Sure we do. Do we want the big tech corporations to hold the reins of that though?
If cheap(er/better) energy is invented then that’s good, why would tech corpos be able to “hold the reins” of it exclusively?
Well, patents and what have you are a thing. I’m mostly thinking that I wouldn’t want e.g. Facebook to run any nuclear reactors or energy grids. That’s something I prefer the government does.
Nuclear reactors already exist, that’s not new tech.
Didn’t xitter just install a gas powered data center that’s breaking EPA rules for emissions?
Yes, yes it did. And as far as I can tell, it’s still belching it out, just so magats can keep getting owned by it. What a world
https://tennesseelookout.com/2025/07/07/a-billionaire-an-ai-supercomputer-toxic-emissions-and-a-memphis-community-that-did-nothing-wrong/
To be fair, nuclear power is cool as fuck and would reduce the carbon footprint of all sorts of bullshit.
Because training has diminishing returns, meaning that achieving the same size of improvement seen between, say, GPT-3 and GPT-4 will require exponentially more power for GPT-5. In 2022 and 2023, OpenAI and DeepMind both predicted that reaching human-level accuracy might never be achieved, the latter concluding it would be impossible even with infinite power.
So to get as close as possible, they will need to secure as much power as possible. Academic papers identify power as the one true bottleneck.
And academia will work on that problem. It reminds me of Intel processors that were once “projected” to need kilowatts of power, until smart people designed other kinds of chips and now they don’t need 2,000 watts.
Academia literally got cut by more than a third and Microsoft is planning to revive breeder reactors.
You might think academia will work on the problem but the people running these things absolutely do not.
Found the American.
Did the EU suddenly develop a tech industry overnight or are you unaware where all the major AI companies are located?
It’s the volume of requests, plus power consumption that’s unrelated to any particular request, at least I have to assume. It certainly doesn’t help that Google has forced me to make a request to their AI every time I run a standard search.
Seriously. I’d be somewhat less concerned about the impact if it were only used voluntarily. Instead, AI is compulsively shoved into every nook and cranny of every digital product simply to justify its own existence.
The power requirement for training is ongoing, too: mere days after Sam Altman released a very underwhelming GPT-5, he began hyping up the next one.
I’ve also never seen a calculation that took my VPS costs into account. The fckers scrape half the internet, warming up every server in the world connected to the internet. How much energy is that?
That’s not small…
100’s of Gigawatts is how much energy that is. Fuel is pretty damn energy dense.
A Boeing 777 might burn 45,000 kg of fuel, at an energy density of 47 MJ/kg. Which comes out to… 600 Megawatts
Or about 60 U.S. houses’ worth of energy usage for a year.
It’s an asinine way to measure it, to be fair: not only is it incredibly ambiguous, but almost no one has any reference for how much energy that actually is.
That’s not ~600 Megawatts, it’s 587 Megawatt-hours.
Or in other terms that are maybe easier to understand: 5875 fully charged 100kWh Tesla batteries.
I usually liken it to video games, ya. Is it worse than nothing? Sure, but that flight or road trip, etc., is a bigger concern. Not to mention that even before AI we’ve had industrial energy and water usage that isn’t sustainable… almonds in CA alone are a bigger problem than AI, for instance.
Not that I’m pro-AI cause it’s a huge headache from so many other perspectives, but the environmental argument isn’t enough. Corpo greed is probably the biggest argument against it, imo.
A flight to Europe’s worth of energy is a pretty asinine way to measure this. Is it not?
The number also isn’t that small, being ~600 Megawatts of energy.
However, training cost is considerably less than prompting cost, making your argument incredibly biased.
Similarly, the numbers released by Google seem artificially low; perhaps their TPUs are massively more efficient given that they are ASICs. But they didn’t seem to disclose which model they used for this measurement. It could be their smallest, least capable, and most energy-efficient model, which would be disingenuous.
A Megawatt is a unit of power, not energy. It means nothing without a duration attached, like Megawatt-hours.