What about renewables + battery storage? Does it take much longer to build? I can imagine getting a permit can take quite a long time, but what takes so long to set up solar panels and link them to batteries, without even having to connect them to the grid?
How many batteries is that? If we're talking solar and you have, say, a 300MW datacenter that needs to operate for 12 hours without sun, you need at least two of the largest battery installs in the world[1] at 1700MWh each. That doesn't factor in cloudy days.
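A quick check of that sizing, using only the figures in the comment above (300 MW load, 12 hours of storage, 1,700 MWh for the largest existing install):

```python
# Back-of-the-envelope battery sizing, using the numbers quoted above.
load_mw = 300                # assumed datacenter load
hours_without_sun = 12       # assumed overnight window
largest_install_mwh = 1700   # figure cited above for the largest existing install

storage_needed_mwh = load_mw * hours_without_sun                      # 3,600 MWh
print(storage_needed_mwh, storage_needed_mwh / largest_install_mwh)   # 3600, ~2.1 installs
```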
Another POV is, if datacenters are really constrained by power, by all means, offer users a discount when their queries utilize solar. Millions of Americans drive further to save cents to fill up their tanks - you can’t say there isn’t precedent among normal people to deal with this. The better question is, is it really a constraint?
Doesn't really work, as the biggest cost is buying the GPUs etc., which have to be paid for, and leaving them idle when the sun isn't shining doesn't pay off the purchase costs. There are industries where this does work, though.
Haha, how do you figure? With how much time people spend playing League of Legends, watching TikTok, and standing in line for "free" shit, I think their time is actually quite flexible.
Reciprocating natural gas engines can be moved from [concrete] pad to pad and be up and running in under 24 hours. The portable turbines take longer but they’re still fast.
Acquiring enough solar panels and battery storage still takes a very long time by comparison.
The power density of solar is also much lower - coordinating different land parcels, routing power, and getting easements increases the time required vs. on-prem gas turbines.
Takes much longer to build, requires a much larger up-front investment, and requires a lot more land.
The footprint needed when trying to generate this much power from solar or wind necessitates large-scale land acquisition plus the transmission infrastructure to get all that power to the actual data center, since you won't usually have enough land directly adjacent to it. That plus all the battery infrastructure makes it a non-starter for projects where short timescales are key.
I think it's funny that at no point in the article do they mention the idea of simply making LLMs more efficient. I guess that's not important when all you care about is winning the AI "race" rather than selling a long-term sustainable product.
What makes you think that the entire process isn't being made more efficient? There are entire papers dedicated to pulling more FLOPs out of GPUs so that less energy is wasted on simply moving memory around. Of course, there are also inference-side optimizations like speculative decoding and MoE. Some of these make the training process more expensive.
The other big problem is that you can always scale up to absorb any efficiency gains. I do wonder if this will eventually level off, though. If performance somehow plateaus, then presumably the efficiency gains will catch up. That being said, that doesn't seem likely in the near future.
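As a side note on the speculative decoding mentioned above, here is a minimal toy sketch of the accept/reject rule it relies on; the vocabulary and both distributions are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def speculative_step(p_target, q_draft):
    """One accept/reject step of speculative sampling over a toy vocabulary."""
    proposed = rng.choice(len(q_draft), p=q_draft)        # cheap draft model proposes a token
    if rng.random() < min(1.0, p_target[proposed] / q_draft[proposed]):
        return proposed                                   # accepted without extra sampling
    residual = np.maximum(p_target - q_draft, 0.0)        # rejected: resample from leftover mass
    return rng.choice(len(residual), p=residual / residual.sum())

# Made-up 4-token vocabulary; the draft is close to the target, so most proposals
# are accepted, which is where the inference-energy savings come from.
p = np.array([0.5, 0.3, 0.15, 0.05])   # "expensive" target model distribution
q = np.array([0.4, 0.4, 0.10, 0.10])   # "cheap" draft model distribution
samples = [speculative_step(p, q) for _ in range(10_000)]
print(np.bincount(samples, minlength=4) / len(samples))   # ≈ p, i.e. output distribution preserved
```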
> This is a really long way of saying "We need to burn fossil fuels to make more money."
Like every other industry in the world?
I’m kind of amazed that AI data centers have become the political talking point for topics like water usage and energy use when they’re just doing what every other energy-intensive industry does. The food arriving at your grocery store and the building materials that built your house also came from industries that consume a lot of fossil fuels to make more money.
The difference is they are new. It’s not rational but people on the whole generally are ok with the status quo of how the sausage is made largely because they don’t really think about it. But new systems being spun up provide an entry point for a discussion. Ideally that discussion can then be widened and open up an opportunity for wider scale change. Or nothing happens and it all becomes the new status quo which most don’t think about again.
The difference is that the food industry at least feeds me and a house provides me shelter. The people in charge of building AI data centers intend to replace all labor, including my own, and leave me begging for peanuts, while also making energy and many goods more expensive and ruining the environment.
More like it’s a really long way to say the government has utterly failed at making sure electricity generation and transmission capacity keeps up with demand so datacenters have been forced to get creative with alternative ways to power themselves. These companies absolutely want to use renewable energy from the power grid but the government blew it.
Really cool in depth report, thanks for sharing. It's very interesting to see what these big datacenter deployments are actually doing. Go look at the oil price charts for the last 25 years and you'll see why it makes a ton of sense economically.
I also love how you can see the physical evidence of them pitting jurisdictions against each other from the satellite photos with the data center on one side of a state border and the power generation on the other.
And all without the proper permits! Using 35 generators when they were only allowed 15! Yay! So glad we're allowing AI companies to break law after law after law to build models that still can't logically reason through a basic Towers of Hanoi.
The problem is that most of the AI labs are popping up in TX, which has a uniquely isolated electrical grid. Recall how the Texas cold snap a few years ago took down the grid for days. Turns out if you build a grid based on short-term profit motives, it's not going to be flexible enough to take on new demand.
It's not the grid's technological limitation. We could have lived in a world with a more connected grid, more nimble utility commissions, and a lot less methane/carbon emissions as a result.
Boom’s pivot to trying to build turbines for data centers wasn’t surprising when data center deployments started using turbines. Either their CEO saw one of the headlines or their investors forwarded it over and it became their new talking point.
What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.
Boom doesn’t actually have a turbine yet. Their design partner publicly pulled out of their contract with Boom a while ago.
Boom has been operating on vaporware for a while. It’s one of those companies I want to see succeed but whatever they’re doing in public is just PR right now. Until they actually produce something (other than a prototype that doesn’t resemble their production goals using other people’s parts) their PR releases don’t mean a whole lot.
> What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.
My first thought when seeing that article is “I can buy one of these right now from Siemens or GE, and I could’ve ordered one at any time in the last 50 years.”
> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.
This seems like a big reach for me. Their largest engine (and it is absolutely massive) "only" produces 80MW of power. The Brayton cycle is unbeatable if you need to keep scaling power up to ridiculous levels.
I mean, the claim is certainly nonsensical in the sense that this isn't something Wärtsilä just "realized". They have been in the power plant business for decades. In the oldest financials they have online (the annual report for year 2000) their power plant sales are larger than their marine engine sales.
Really makes me wonder about anything else I've read on Semianalysis. Like, it is such an insane thing to claim and so easy to check. And they just wrote it anyway, like some kind of pathological fabulists.
But what's the part that seems like a "big reach"? Are you saying they didn't sign those contracts? That their customers are making a mistake?
This is coming from a group that does analysis on the semiconductor and cloud industries and provided very expensive access to their models and info. They are the citation.
So I guess it’s not a bubble then since these companies are raking in the big revenues? Or maybe they are counting all those circular investments as revenues somehow?
If you do the math, that's $10-$12 per watt year. There's approx 24×365.25=8766 hours in a year, so assuming that the datacenters would be running 24×7, that boils down to $1.14 to $1.37 in revenue per kWh. That's not a bad deal if power really is a major part of the expense.
As far as I can tell, power isn't actually a major part of the expense, it's dwarfed by the capex. Just the amortization on the GPU will be an order of magnitude higher than the cost of the power to run the GPU at 100%. (Assuming a 5 year depreciation period.)
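A sketch of both calculations above; the revenue figures are the article's as quoted upthread, while the GPU price, power draw, depreciation period, and electricity rate are assumptions for illustration:

```python
# Part 1: the article's revenue claim converted to $/kWh, as in the comment above.
kwh_per_gw_year = 1e6 * 8766                     # kWh from 1 GW running all year
print([round(rev / kwh_per_gw_year, 2) for rev in (10e9, 12e9)])   # [1.14, 1.37] $/kWh

# Part 2: capex vs electricity for one GPU; every number here is an assumption.
gpu_price_usd = 30_000            # assumed datacenter GPU price
depreciation_years = 5            # assumed straight-line depreciation
gpu_power_kw = 1.0                # assumed GPU plus its share of cooling/overhead
electricity_usd_per_kwh = 0.08    # assumed industrial electricity rate

amortization_per_year = gpu_price_usd / depreciation_years              # $6,000
power_cost_per_year = gpu_power_kw * 8766 * electricity_usd_per_kwh     # ~$700
print(round(amortization_per_year / power_cost_per_year, 1))            # ~8.6x
```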
> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive. xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines.
Wow, "truck-mounted gas turbines"? Who else could have mastered such a futuristic tech in so short a time? Seriously, who wrote this? Grok? And let's ignore that this needless burning of fossil fuel is making life on Earth harder for everyone and everything else.
I'm no fan of Musk, but you've got to admit it was a clever way to achieve the goal. SemiAnalysis don't do fanboy articles - their research is pretty in-depth. So they are stating it as they see it.
The problem ordinary people all over the world have is that governments are allowing this to happen. Maybe if there were stricter regulation, it would prevent players such as Musk from coming up with such "innovations".
"Getting a permit for 15 turbines after having illegally used 35 turbines that then poisoned the air for the residences around the turbines" is a clever way to achieve the goal? I wouldn't call doing a blatant illegal action "clever", but rather sociopathic.
Yeah I guess I'm not the target audience for this because I assumed that "the power problem" was "massive increase in electricity costs for people despite virtually unchanged usage on their part", not "AI companies have to wait too long to be able to start using even more power than they already are":
> Nicole Pastore, who has lived in her large stone home near Baltimore’s Johns Hopkins University campus for 18 years, said her utility bills over the past year jumped by 50%. “You look at that and think, ‘Oh my god,’” she said. She has now become the kind of mom who walks around her home turning off lights and unplugging her daughter’s cellphone chargers.
> And because Pastore is a judge who rules on rental disputes in Baltimore City District Court, she regularly sees poor people struggling with their own power bills. “It’s utilities versus rent,” she said. “They want to stay in their home, but they also want to keep their lights on.”
I understand the instinct but if people seriously think that they are solving any problem by unplugging cell phone chargers, they are simply bad at math. Human time is easily worth more than that, even when working at minimum wage.
That said, it obviously sucks that utility prices are rising for people who cannot effortlessly cover that (not to speak of the local pollution, if that's an issue). Maybe a special tax on hyperscalers to offset that cost to society would be a reasonable way to soften the blow, but I have not done the math.
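For what it's worth, a rough version of the phone-charger math from the first paragraph above; the standby draw, electricity rate, and time spent are all assumptions:

```python
# Illustrative assumptions throughout.
standby_watts = 0.25             # assumed idle draw of a charger left plugged in
price_per_kwh = 0.20             # assumed residential electricity rate, $/kWh
seconds_per_day_unplugging = 20  # assumed daily effort spent plugging/unplugging
min_wage_per_hour = 7.25         # US federal minimum wage

dollars_saved = standby_watts / 1000 * 8766 * price_per_kwh      # ~$0.44 per year
hours_spent = seconds_per_day_unplugging * 365 / 3600            # ~2 hours per year
print(round(dollars_saved, 2), round(hours_spent * min_wage_per_hour, 2))  # ~0.44 vs ~14.70
```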
They are not necessarily bad at math, but they probably aren't electricians or EEs or have ever needed or been asked to calculate how much power a cell phone charger uses.
Mom/Dad used to unplug things and turn lights off, so they do too.
And the air quality around these plants is poor, leading to health problems for the neighbors.
This short-term, destructive thinking should be criminalized.
I think it's time to discuss changing the incentives around AI deployment, specifically paying into a UBI fund whenever human jobs are replaced by AI. Musk himself raised the idea.
It can't be "criminalized" if govt and justice system is effectively actively bribed by the AI cartel because AI-related GDP "growth" is only veneer hiding the economical fuckups of the government
In the case of Grok's turbines, no emissions controls means sick people. Plus all the CO2 pushing climate collapse faster which hurts every coming generation.
Gas plants are not bad… but imagine 400 MW of gas plants in a concentrated area. You'll always have NOx and SOx byproducts whenever you're burning gas.
Gas is certainly less of a problem than coal, but they still produce plenty of bad stuff: nitrogen oxides and bad VOCs like formaldehyde that are well studied to increase risk of asthma and some types of cancer. I certainly wouldn’t want to live close to one.
The only way to solve problems like this IMO is to price in the externalities. Tax fossil fuels for the damage they do, in order to reveal their true cost. Then they will never look like the most affordable option, because they're not.
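One way to make "price in the externalities" concrete; the emission intensity, carbon prices, and baseline generation cost below are assumptions for illustration, not figures from the article:

```python
# Illustrative assumptions, not figures from the article.
gas_intensity_kg_per_kwh = 0.45           # assumed CO2 per kWh of gas-fired electricity
carbon_prices_per_tonne = (50, 100, 190)  # assumed carbon prices, $/tCO2
baseline_cost_per_kwh = 0.04              # assumed wholesale cost of gas power, $/kWh

for price in carbon_prices_per_tonne:
    externality = gas_intensity_kg_per_kwh * price / 1000       # $/kWh
    print(price, round(baseline_cost_per_kwh + externality, 3))
# At the higher carbon prices the "true" cost is roughly 2-3x the sticker price.
```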
True. The same is true for nuclear energy. I have never heard of a nuclear power plant that did not receive substantial subsidies throughout its lifetime. Not to mention the nuclear fuel and the effort required to create it and later to store it.
The natural gas turbines used are relatively efficient as far as engines go. Having them on-site makes transmission losses basically negligible.
Nothing short of full solar connected to batteries, produced without any difficult-to-mine elements, will make some people happy, but as far as pollution and fuel consumption go, data centers aren't really a global concern at the same level as things like transportation.
I'm honestly curious whether you yourself are even aware of the disingenuousness of this argument. It's fairly impressive in its density!
1. Nobody complained about the efficiency of natural gas turbines. You can efficiently do a lot of useless stuff with deep negative externalities, and the fact it's efficient is not all that helpful.
2. Saying "the extreme far end would not be satisfied even by much better solutions" is not an excuse not to pursue better solutions!
3. There are many dimensions of this that people care about beyond the "global concern" level regarding "pollution and fuel consumption."
4. There are many problems that are significant and worth thinking about even if they are not the largest singular problems that could be included by some arbitrarily defined criteria
> I'm honestly curious whether you yourself are even aware of the disingenuousness of this argument.
Unnecessarily condescending and smug, but I’ll try to respond.
That said, you’re putting forth your own disingenuous assumptions and misconceptions. The natural gas turbines are an intermediate solution to get up and running due to the extremely long and arduous process of getting connected to the grid.
Arguing pedantry about the word efficiency isn’t helpful either. The data centers are being built, sorry to anyone who gets triggered by that. The gas turbines are an efficient way to power them while waiting for grid interconnect and longterm renewables to come online.
Disingenuous is acting like this is a permanent solution to the exclusion of others. The whole point is that it gets them started now with portable generation that is efficient.
> The data centers are being built, sorry to anyone who gets triggered by that.
Unnecessarily smug?
Beyond that, they can be stopped. They're being met with a lot of resistance in the Midwest as developers attempt to build them without much understanding of the impact on public utilities. People are catching on to the fact that the energy and water consumption is pushing up costs for residents. A lot of assumptions are supporting this argument.
> The gas turbines are an efficient way to power them while waiting for grid interconnect and longterm renewables to come online.
I like the gymnastics of wordplay here. Efficient only when you look at them through the lens of some ephemeral timeframe that may or may not exist.
The gas turbines are hopefully an intermediate solution due to the long and not guaranteed process of grid connection and renewable buildout. History is of course full of such bets that did not work out the way their proponents hoped.
> The data centers are being built, sorry to anyone who gets triggered by that.
It's obvious that you're starting from your conclusion and working backwards, which is probably why your initial comment was full of so much motivated reasoning to begin with.
In your mind, is there any set of negative externalities that would justify not building the data centers, or at least not building them now, or at least not building them now in specific areas that require these types of interim solutions?
This is exactly right. These are glorified emergency generators, and grid power is ordinarily far cheaper; especially for interruptible loads like training new models (checkpointing work in progress and resuming it later is cheap and easy). The article mentions that quite clearly.
> as far as pollution and fuel consumption data centers aren’t really a global concern at the same level as things like transportation.
"Not at the same level" doesn't remove the concern about this unnecessary pollution. Stop changing the subject from the environmental problems that AI usage can cause through its increased power consumption.
Natural gas engines are efficient!
Ok! But what about the pollution they produce to nearby neighborhoods? What about the health repercussions? Do human lives not matter?
And imagine all this poorly located, overpriced, haphazardly thrown together and polluting infrastructure will basically get flushed down the toilet once either the AI bubble pops, or they figure out a new way of doing AI that doesn't require terawatts of power.
> However, AI infrastructure cannot wait for the grid’s multiyear transmission upgrades. An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually. Getting a 400 MW datacenter online even six months earlier is worth billions. Economic need dwarfs problems like an overloaded electric grid. The industry is already searching for new solutions.
wow, that's some logic. Environmentally unsound means of extracting energy directly damage the ecosystem in which humans need to live. The need for a functioning ecosystem "dwarfs" "problems" like billionaires not making enough billions. Fixing a ruined ecosystem would cost many more billions than whatever economic revenue the AI generated while ruining it. So if you're not harnessing the sun or wind (forget about the latter in the US right now, btw), you're burning things, and you can get lost with that.
This kind of short sighted thinking is because when folks like this talk about generating billions of dollars of worth, their cerebellums are firing up as they think of themselves personally as billionaires, corrupting their overall thought processes. We really need to tax billionaires out of existence.
The dialog around AI resource use is frustratingly inane, because the benefits are never discussed in the same context.
LLMs/diffusers are inefficient from a traditional computing perspective, but they are also the most efficient technology humanity has created:
> AI systems (ChatGPT, BLOOM, DALL-E2, Midjourney) and human individuals performing equivalent writing and illustrating tasks. Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts.
>>xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines.
These generators polluted the nearby historically black neighborhoods in Memphis, Tennessee with nitrogen oxides. Residents are afraid to open their windows, with the elderly, children, and those suffering from conditions like COPD particularly affected. Lawsuits alleging environmental racism are pending.
xAI says cleaner generators will be installed, but I think this episode shows that we cannot allow public interests to be compromised by the private sector so easily just because they scream: Jobs! Investment!
https://time.com/7308925/elon-musk-memphis-ai-data-center/
> xAI says cleaner generators will be installed, but I think this episode shows that we cannot allow public interests to be compromised by the private sector so easily just because they scream: Jobs! Investment!
80ish% of people in the US live less than 100 miles from their hometown.
It would be wise to see "Jobs! Investment!" as little more than a mafioso-like threat to agrarian, stay-in-one-place, work-to-live types. "Sure is a nice Shire you got there. Better hope it doesn't suffer from a lack of investment in jobs."
Threats of it all imploding are taken seriously by a lot of people.
https://www.mentalfloss.com/culture/generations/millennials-...
So what if it does? That's normal with the passage of time. As long as human biology exists, humans will solve those problems. Beyond that, obligation is just socialized memes, ethno-objects that come and go with the generations.
Everyone alive now who worries about the propagation of our culture sure does not seem concerned that Latin fell out of common use. That they aren't spending their lives keeping old traditions alive should make it obvious that old traditions don't mean that much to the living.
Politicians and the rich need us servicing the debt they so graciously took on to invest in jobs, or we would be free to police them.
The phrasing 'historically black neighborhoods' feels like it pushes a specific agenda rather than just addressing the pollution.
It implies that if this were happening near a non black neighborhood, it wouldn’t be as egregious, which is a strange moral stance.
Also 'historically' is irrelevant. Pollution hurts the people living there now.
The phrase implies that powerful companies know that historically black neighborhoods don’t have the resources to mount a legal defense against abnormal pollution from data center generators, so the smart choice is to put all the pollution near historically black neighborhoods.
The agenda, as it is every day, is how to externalize costs so that megacompanies don’t have to spend more money to keep our environment clean.
You’re conflating race with poverty.
It feels racist to expect people to assume a neighborhood is 'resource poor' just because it is 'historically black'.
Also, the OP explicitly states that lawsuits are pending. Clearly, the community was able to mount a legal defense
> It feels racist to expect people to assume a neighborhood is 'resource poor' just because it is 'historically black'.
Statistically poverty is correlated with race. For reasons to do with (quite recent) history.
Statistics are not a license to assume.
Crime rates also statistically correlate with demographics, but if I assume a specific person is a criminal based on that stat, I would (rightly) be called racist.
Expecting people to assume 'historically black' == 'poor' similarly feels racist.
Statistically, crime is also correlated with race. Are we ready to make similar assumptions then?
Crime is correlated with poverty, which is correlated with race.
> It implies that if this were happening near a non black neighborhood, it wouldn’t be as egregious, which is a strange moral stance.
I read it the other way: that it simply wouldn't happen in a white neighborhood.
That makes sense. For some reason though I still sense a hint of desire for retribution in the original comment
It makes more sense to word it like this when you take historical trends into consideration: towns drowned for lakes or dams, highways routed along redlined areas, thriving neighbourhoods erased to create parks… often preceded by violence and little to no compensation.
I think this is an uncharitable interpretation.
My interpretation is it would be less likely to happen near a wealthy neighborhood compared to a poor neighborhood. Why talk about race if it's not about race?
I'm a bit skeptical about this. I know diesel generators make these kinds of pollutants, but I haven't heard the same about natural gas.
My city has a big NG facility downtown that pipes heated water to a bunch of buildings, and it is surrounded by condos. I've never heard anything about it impacting the air (other than CO2 which is a global and not local issue).
Every building here (except for those connected to district heating systems), large and small, has a natural gas boiler or furnace. We also have several NG plants generating electricity within city limits. Again, localized pollution is not what concerns people about these things. Coal plants, on the other hand, tended to be way outside the city when they were still in operation.
Burning gas always creates stuff you don't want to be breathing. These small portable turbines were allowed to run dirtier than a full-size NG plant because the premise was that they are small and temporary. But then xAI put 40 of them in a parking lot and fired them all up at the same time, which is quite illegal but xAI also controls the government of both Tennessee and the USA, so residents are fucked.
You hear AI folks including Trump's AI Tsar David Sacks frequently promoting what happened in Tennessee as the future of AI power generation. They're calling it "behind the meter" power generation. Understand what this is: generating gigawatts of power with dozens or hundreds of "small" gas turbines all stacked in one place. An instant, on-demand toxic triangle coming to a data center project near you.
Global issues start locally. See: tragedy of the commons
Gas furnaces and stoves are known polluters of indoor air: https://www.psehealthyenergy.org/gas-stoves-and-indoor-air-p...
Large gas plants are probably relatively clean overall, but the temporary, portable gas generators used by eg the xAI datacenter are not as tightly regulated and aren’t inspected or controlled in the same way. Given the particular corporate agent involved, I’d be surprised if any care at all were being taken to minimize air pollution caused by these portable generators.
That is true of gas stoves, but gas furnaces don't exhaust into the house.
Lower-efficiency gas furnaces don't have a completely sealed exhaust and rely on a draft for pollutant evacuation. This usually works well enough when properly installed and maintained, but it can be a source of indoor air pollution, although typically a minimal one.
And there are also decorative and/or supplemental gas heating devices which exhaust into the home.
>I'm a bit skeptical about this. I know diesel generators make these kind of pollutants, but I haven't heard the same about natural gas.
It is about the high temperature and pressure in a gas turbine, not about natural gas itself. That is why diesel engines do it too, while it isn't such an issue for regular gas engines, nor for "simple" LNG burners/heaters.
What xAI does here sounds horrendous. 270MW of gas turbines dumping their exhaust straight into the neighborhood. It is like 1000 diesel trucks running their engines at full power 24x7 near your house.
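A quick sanity check on the "1000 diesel trucks" comparison above; the per-truck engine output is an assumption:

```python
turbine_capacity_mw = 270  # figure from the comment above
truck_engine_kw = 300      # assumed full-power output of a heavy diesel truck (~400 hp)

print(round(turbine_capacity_mw * 1000 / truck_engine_kw))  # ~900 trucks, same ballpark as "1000"
```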
We can sue to shut down pollution generators? Finally, I can get rid of that annoying airport...
TFA said it's all legal and explicit federal policy. You don't have to like it, but some people are going to have to make minor sacrifices if the majority want AI services. Look on the bright side, when these people all have personal robot doctors caring for them well into their 100s they will be grateful they didn't listen to the NIMBYs
To be honest, as soon as someone starts talking about "historically black" this and that I'm sure they're pulling a bullshit job just like the "critically endangered" snail-darters and the "historical" laundromats of San Francisco.
Why is the skin tone of the residents of the affected community relevant?
In the US, we have a living history of discriminatory policies based on race
https://www.thesidewalksymposium.com/blog/the-enduring-shado... , here is a quick overview of redlining in Memphis
Yeah. I've heard about it. So this wouldn't be a problem if it affected a different group of people?
Probably not, because if it affected white neighborhoods, it either wouldn't be enacted, would be shut down after complaints, or would receive enough bad press to be shut down.
That's a lot of assumptions. If they wanted that to be the point of the article they could have done it a lot more explicitly.
Because the people who decided where to locate it and the people in government who could do something to stop it make decisions about how much they care based on those folks’ skin color. If those generators were placed near a rich white neighborhood, the government response would be wildly different.
Mississippi in particular is well known at the state government level to actively choose not to enforce environmental regulations in areas where its Black citizens live.
And TFA addresses this. South Memphis was a community largely composed of freed slaves, where manufacturers set up shop, the military dumped waste (now a superfund site), and people have continued to mark the area for polluting industries for generations.
To be fair they would definitely do this to rural and/or poor white people too.
Maybe. East Palestine OH got a decent amount of political attention.
It’s cute they describe this as a solution to _the_ power problem. It’s a solution to _their_ power problem. We have a grid problem. This massive amount of investment would be an incredible time to do something about it. Instead we’ve got an administration hostile to modern energy solutions and an industry hostile to everyone. Really depressing to see all this money go up in smoke in such a massive short sighted rush.
I previously worked directly for some of the power generation manufacturers listed in the article and later on the grid/power transmission side.
My takeaway is they get it correct enough, but offer no deep insight into the power generation industry.
I was surprised by and learned a few things from the article though. Definitely gives me some ideas about reaching out to old contacts to see if there are any opportunities in building models and analytics for the new demand.
Focusing on Bloom is fun because they're new and have startup vibes, but Innio and Cat are really having a resurgence of demand for their generators, and building diesel/natural-gas engines is much simpler than building gas turbines. I'm sure the heads at GE wish they hadn't sold that business off now.
On steam/gas turbine blade manufacturing, there most certainly are more than 4 big players, and many are US-based. You have to remember this is an old industry with existing supply chains and maintenance companies.
As long as the demand for new data centers doesn't lose steam, these onsite options will continue to flourish. Federal grid access builds are currently a 10+ year wait, and they are reworking the system to be "fast", only 5-6 years for build-outs now. They're also changing how the bidding process works, which was touched on here. You need skin in the game if you want to be taken seriously now. There are so many requests from companies arbitraging who can give them the best deal/timeline. Now you need to put money up if you even want a call back.
So we had some onsite generation moves from the lower end - residential solar, etc. - and now we have it from the higher end - fossil fuel generation at datacenters. If that creates high-efficiency generators, then that may drive "onsite" further into the mid-segment. That may also affect the grid's role, nudging it from hierarchical delivery to network sharing/rebalancing, and may even lead to separate local grids (like 100+ years ago). That would also give fossil fuels new demand (and would also create a market for small/compact nuclear). Kind of a disintegration wave.
Kinda proving that these are a bad deal for communities - very few jobs and tax revenues, but enjoy the increased asthma and cancer we all get to pay for.
Does anyone know a really good source for basic information estimating what % of global carbon emissions come from AI training and AI inference, both 1) now and 2) in the future if we believe AI companies' capacity projections? I would really like to read a detailed analysis of this that avoids both AI hype and anti-AI hysteria. It's an important question, but it excites strong reactions that tend to cloud the facts.
Yes, all sources are biased, but some are useful. And I know that it's hard to get solid data on this from AI companies, but we must have at least a rough estimate?
Please don't tell me to ask ChatGPT about it :)
US grid carbon intensity is 0.384 kgCO2/kWh (source: ourworldindata). US datacenter energy use in 2023: 176 TWh (excluding crypto; source: US Congress). How much of that is AI, I couldn't find.
So that's 67Mt CO2, I hope I haven't misplaced my decimal point, please double check. That would be 1.3% of the 5Gt of CO2 the US emits per year.
https://ourworldindata.org/grapher/carbon-intensity-electric...
https://www.congress.gov/crs-product/R48646#_Toc207199546
For global emissions and future trends, the IEA estimates about 500TWh/year globally today, and 1000TWh/year in 2030 (base scenario). Assuming these use the current US grid carbon intensity, that would be about 200MtCO2 today and 400 in 2030. Global CO2 emissions today are 40Gt/year, so that would be 0.5% today and 1% in 2030 (if global emissions stay stable).
https://www.iea.org/data-and-statistics/charts/global-data-c...
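The arithmetic above, spelled out so the decimal points can be checked:

```python
intensity_kg_per_kwh = 0.384   # US grid carbon intensity (Our World in Data)

# US, 2023: 176 TWh of datacenter demand, excluding crypto
us_dc_mt = 176e9 * intensity_kg_per_kwh / 1e9            # kWh -> kg -> Mt
print(round(us_dc_mt), round(us_dc_mt / 5000 * 100, 1))  # ~68 Mt, ~1.4% of 5 Gt

# Global datacenter demand today and in 2030 (IEA base scenario), at US grid intensity
for twh in (500, 1000):
    mt = twh * 1e9 * intensity_kg_per_kwh / 1e9
    print(round(mt), round(mt / 40_000 * 100, 1))         # ~192 Mt (0.5%), ~384 Mt (1.0%)
```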
Part of what bothers me with AI energy consumption isn't just how wasteful it might be from an ecological perspective, it's how brutally inefficient it is compared to the biological "state of the art" — 2000kcal = 8,368 kJ. 8,368 kJ / 86,400 s = 96.9 W.
So the benchmark is achieving human-like intelligence on a 100W budget. I'd be very curious to see what can be achieved by AI targeting that power budget.
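Spelling out that budget, with a single datacenter GPU (~700 W board power, an approximate figure for comparison, not from the thread) as a reference point:

```python
kcal_per_day = 2000
human_watts = kcal_per_day * 4184 / 86_400    # 8,368 kJ spread over a day ≈ 96.9 W

gpu_watts = 700                               # approx. board power of one modern datacenter GPU
print(round(human_watts, 1), round(gpu_watts / human_watts, 1))  # ~96.9 W, ~7.2x
```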
Is it though? When I ask an LLM research questions, it often answers in 20 seconds what it would take me an entire afternoon to figure out with traditional research.
Similarly, I've had times where it wrote me scientific simulation code that would take me 2 days, in around a minute.
Obviously I'm cherry-picking the best examples, but I would guess that overall, the energy usage my LLM queries have required is vastly less than my own biological energy usage if I did the equivalent work on my own. Plus it's not just the energy to run my body -- it's the energy to house me, heat my home, transport my groceries, and so forth. People have way more energy needs than just the kilocalories that fuel them.
If you're using AI productively, I assume it's already much more energy-efficient than the energy footprint of a human for the same amount of work.
> it often answers in 20 seconds what it would take me an entire afternoon to figure out with traditional research.
In that case I think it would be only fair to also count the energy required for training the LLM.
LLMs are far ahead of humans in terms of the sheer amount of knowledge they can remember, but nowhere close in terms of general intelligence.
Training energy is amortized across the lifespan of a model. For any given query for the most popular commercial models, your share of the energy used to train it is a small fraction of the energy used for inference (e.g. 10%).
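A toy version of that amortization argument; every number below is illustrative rather than measured:

```python
# Purely illustrative numbers.
training_energy_gwh = 50        # assumed one-off training energy for a large model
lifetime_queries = 500e9        # assumed queries served before the model is retired
inference_wh_per_query = 1.0    # assumed energy per served query

training_wh_per_query = training_energy_gwh * 1e9 / lifetime_queries
print(training_wh_per_query, f"{training_wh_per_query / inference_wh_per_query:.0%}")  # 0.1 Wh, 10%
```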
For this kind of thinking to work in practice you would need to kill the people that AI makes redundant. This is apart from the fact that right now we are at a choke point where it's much more important to generate less CO2 than it is to write scientific simulation code a little quicker (and most people are using AI for much more unnecessary stuff like marketing)
> For this kind of thinking to work in practice you would need to kill the people that AI makes redundant.
That is certainly not a logical leap I'm making. AI doesn't make anybody redundant, the same way mechanized farming didn't. It just frees them up to do more productive things.
Now consider whether LLM's will ultimately speed up the technological advancements necessary to reduce CO2? It's certainly plausible.
Honest question - what are artists being freed up to do that’s more important? DoorDash?
you didn't consider the 18+ years we have with almost no productivity and the extra resources required to sustain life
Ah! And don't get me started about how specific its energy source must be! Pure electricity, no less! Where a human brain comes attached with an engine that can power it for days on a mere ham sandwich!
How so? A human needs the entire civilisation to be productive at that level. If you take just the entire US electricity consumption and divide it by its population, you'll get a result that's an order of magnitude higher. And that's just electricity. And that's just domestic consumption, even though US Americans consume tons of foreign-made goods.
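Rough numbers behind that claim; total US electricity consumption and population are approximations:

```python
us_electricity_twh = 4000    # approximate annual US electricity consumption
us_population = 335e6        # approximate
hours_per_year = 8766

watts_per_person = us_electricity_twh * 1e12 / us_population / hours_per_year
print(round(watts_per_person))  # ~1360 W of electricity alone, vs ~100 W metabolic
```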
How much energy did evolution "spend" to get us here?
I agree human brains are crazy efficient though.
try to calculate 12312312.123213 * 123123.3123123
A computer uses orders of magnitude less energy than a human.
It's all about the task, humans are specialized too.
EDIT: maybe add a logarithm or other non-linear functions to make the gap even bigger.
GenAI completely fails to even get the right answer to numeric problems
A GenAI does not, however.
That’s about the energy a laptop or two uses at full tilt.
You can't compare a training run that produces a file which can be run forever after to a human day
Inference itself is also very costly!
But either way, how many human lives are spent making that file?
Not really.
I can generate images or get LLM answers in below 15 seconds on mundane hardware. The image generator draws many times faster than any normal person, and the LLM even on my consumer hardware still produces output faster than I can type (and I'm quite good at that), let alone think what to type.
An LLM gives AN answer. Ask for even a few more than that and it gets confused, but instead of acting in a human-like way, it confidently proceeds forward with incorrect answers. You never quite know when the context got poisoned, but reliability drops to 0.
There are many things to say on this. Free is worthless. Speed is not necessarily a good thing. The image generation is drivel. But...
The main nail in the coffin is accountability. I can't trust my work if I can't trust the output of the machine. (and as a bonus, the machine can't build a house. It's single purpose).
Okay, but this has vanishingly little to do with the comment chain you replied to, which was about energy efficiency.
Is "faster" really what we are talking about right now? It could be a lot faster to take a helicopter to work everyday too, versus riding a bike.
Also, why are people moving mountains to make huge, power obliterating datacenters if actually "its fine, its not that much"?
Speed highly correlates with power efficiency. I believe my hardware maxes out somewhere around 150W. 15 seconds of that isn't much at all.
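Quick arithmetic on that claim, using the 150 W and 15 seconds stated above:

    # Energy for one 15-second generation at an assumed 150 W draw
    watts, seconds = 150, 15
    wh = watts * seconds / 3600
    print(wh)   # 0.625 Wh - roughly what a 60 W bulb uses in about 38 seconds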
> Also, why are people moving mountains to make huge, power obliterating datacenters if actually "its fine, its not that much"?
I presume that's mostly training, not inference. But in general anything that serves millions of requests in a small footprint is going to look pretty big.
> It could be a lot faster to take a helicopter to work everyday too, versus riding a bike.
Great analogy.
It's not a good analogy at all, because of what they said about mundane hardware. They're specifically not talking about any kind of ridiculous wattage situation, they're talking about single GPUs that need fewer watts than a human in an office to make text faster than a human, or that need 2-10x the watts to make video a thousand times faster.
There's a billion users. Why do we make massive cities and factories and fields if humans only need 2000 calories a day?
idk, why?
Beyond wasteful, the linked article can't even remotely be taken seriously.
> An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually.
What? I let ChatGPT swag an answer on the revenue forecast and it cited $2-6B rev per GW year.
And then we get this gem...
> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.
So now we're going to be spewing ~486 g CO₂e per kWh using equipment that wasn't designed to run 24/7/365 on these workloads? (Rough arithmetic on that figure below.) Datacenters choosing these forms of power should have to secure a local vote and be held to annual measurements of NOx, CO, VOCs and PM.
This article just showcases all the horrible bandaids being applied to procure energy in any way possible with little regard to health or environmental impact.
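For what it's worth, the ~486 g CO₂e/kWh figure is roughly what you'd expect from simple-cycle gas generation; a rough sanity check with assumed values (combustion only, ignoring methane leakage and upstream emissions):

    # Assumed: ~181 g CO2 per kWh of natural-gas heat, ~37% electrical efficiency
    # for simple-cycle turbines / reciprocating engines.
    g_co2_per_kwh_thermal = 181
    electrical_efficiency = 0.37
    print(g_co2_per_kwh_thermal / electrical_efficiency)   # ~490 g CO2 per kWh of electricity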
> What? I let ChatGPT swag an answer on the revenue forecast and it cited $2-6B rev per GW year.
This article is coming from one of the premier groups doing financial and technical analysis on the semiconductor industry and AI companies.
I trust their numbers a hundred times more than a ChatGPT guess.
Are you sure they don't have a vested interest? At least ChatGPT gave me sources.
It doesn't matter who they are if there's nothing backing it up.
The entire article is predicated on the assumption that this is profitable long-term.
Again: > An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually.
Yet this figure isn't justified at all, nor is it stated what an "AI cloud" actually is or how they got to those numbers.
I often like SemiAnalysis' work, but there's parts of this article that are shockingly under-researched and completely missing critical parts of the narrative.
> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive.
> Again, clever firms like xAI have found remedies. Elon's AI Lab even pioneered a new site selection process - building at the border of two states to maximize the odds of getting a permit early!
The energy strategy was to completely and almost certainly illegally bypass permitting and ignore the Clean Air Act, at a tangible cost to the surrounding community by measurably increasing respiratory irritants like NOx in the air around these communities. Characterizing this harm as "clever" is wildly irresponsible, and it's wild that the word "illegal" doesn't appear in the article once, while at the same time handwaving the fact that permitting for local combustion-based generation (for these reasons!) is one of the main factors to pushing out timelines and increasing cost.
[1] https://time.com/7308925/elon-musk-memphis-ai-data-center/
[2] https://www.selc.org/news/resistance-against-elon-musks-xai-...
[3] https://naacp.org/articles/elon-musks-xai-threatened-lawsuit...
It’s called “Semi” analysis for a reason. Dylan Patel is the Jim Cramer of industry reporting for this sector.
A more appropriate word is "sly", not "clever".
I enjoyed the detailed article despite how depressing it is. I never can blame business for finding a market-palatable solution.
However, it is worth saying that xAI’s “solution” was illegal, unhealthy for the local constituents, and stinks of corruption, https://insideclimatenews.org/news/17072025/elon-musk-xai-da....
Our kids are not going to be happy we spun up more CO2 generation for this.
their uploaded minds will enjoy the infinite AI slop though
I hear they LOVE Sanctuary Moon.
What about renewables + battery storage? Does it take much longer to build? I can imagine getting a permit can take quite a long time, but what takes so long to set up solar panels and link them to batteries, without even having to connect them to the grid?
How many batteries is that? If we're talking solar and you have, say, a 300MW datacenter that needs to operate for 12 hours without sun, you need at least two of the largest battery installs in the world[1] at 1700MWh each (rough math below). That doesn't factor in cloudy days.
[1] https://www.heise.de/en/news/850-MW-World-s-largest-battery-...
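Rough math behind that comparison:

    # Back-of-the-envelope storage sizing for the scenario above
    datacenter_mw = 300
    hours_without_sun = 12
    largest_install_mwh = 1700                                 # per the linked article

    storage_needed_mwh = datacenter_mw * hours_without_sun     # 3600 MWh
    print(storage_needed_mwh / largest_install_mwh)            # ~2.1x the world's largest battery install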
Another POV is, if datacenters are really constrained by power, by all means, offer users a discount when their queries utilize solar. Millions of Americans drive further to save cents to fill up their tanks - you can’t say there isn’t precedent among normal people to deal with this. The better question is, is it really a constraint?
Doesn't really work, as the biggest cost is buying GPUs etc., which have to be paid for, and leaving them idle when the sun isn't shining doesn't pay off the purchase costs. There are industries where this does work though.
The customers' time is not flexible like that.
And every second a GPU is not working, it's not making money.
You both are talking about this stuff as if it is a new concept. Demand-based pricing is already commonplace for both electricity and compute.
The demand for both compute and electricity is higher while people are awake and using them.
> The customers' time is not flexible like that.
A lot of the super expensive queries are flexible. Especially the agentic coding ones. And higher use naturally follows the sun anyway.
> And every second a GPU is not working, it's not making money.
Some companies already have more chips than they can feed, so if that continues then sure why not let it idle part of the night.
> The customers' time is not flexible like that.
Haha, how do you figure? With how much time people spend playing League of Legends, watching TikTok, and standing in line for "free" shit, I think their time is actually quite flexible.
Reciprocating natural gas engines can be moved from [concrete] pad to pad and be up and running in under 24 hours. The portable turbines take longer but they’re still fast.
Acquiring enough solar panels and battery storage still takes a very long time by comparison.
The density required for solar is also much lower - the coordination between different land parcels and routing power and getting easements increases the time required vs. on prem gas turbines.
Takes much longer to build, requires a much larger up-front investment, and requires a lot more land.
The footprint needed when trying to generate this much power from solar or wind necessitates large-scale land acquisition plus the transmission infrastructure to get all that power to the actual data center, since you won't usually have enough land directly adjacent to it. That plus all the battery infrastructure makes it a non-starter for projects where short timescales are key.
Land. Compute what surface area you need for 1 GW of solar.
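Taking that literally, a rough sketch with assumed (and very site-dependent) figures:

    # Assumed: ~30 W of nameplate capacity per m^2 of total project land
    # (panels, spacing, roads, etc.) and a ~25% capacity factor.
    nameplate_w_per_m2 = 30
    capacity_factor = 0.25

    area_1gw_nameplate_km2 = 1e9 / nameplate_w_per_m2 / 1e6           # ~33 km^2 for 1 GW of panels
    area_1gw_average_km2 = area_1gw_nameplate_km2 / capacity_factor   # ~130 km^2 for 1 GW around the clock
    print(area_1gw_nameplate_km2, area_1gw_average_km2)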
I think it's funny that at no point in the article do they mention the idea of simply making LLMs more efficient. I guess that's not important when all you care about is winning the AI "race" rather than selling a long-term sustainable product.
If you make it more efficient, then you train it for longer or make it larger. You're not going to just idle your GPUs.
And yes of course it's a race, everything being equal nobody's going to use your model if someone else has a better model.
They are already power-constrained. Any efficiency improvements would immediately be allocated to more AI.
What makes you think that the entire process isn't being made more efficient? There are entire papers dedicated to pulling out more FLOPs from GPUs so that less energy is being wasted on simply moving memory around. Of course, there's also inference side optimizations like speculative decoding and MoE. Some of these make the training process more expensive.
The other big problem is that you can always increase the scale to absorb any energy-efficiency gains. I do wonder if they'll eventually level this off, though. If performance somehow plateaus then presumably the efficiency gains will catch up. That being said, that doesn't seem likely in the near future.
This is a really long way of saying "We need to burn fossil fuels to make more money."
It didn't make long-term sense for our world before AI. It makes no more sense with AI.
> This is a really long way of saying "We need to burn fossil fuels to make more money."
Like every other industry in the world?
I’m kind of amazed that AI data centers have become the political talking point for topics like water usage and energy use when they’re just doing what every other energy-intensive industry does. The food arriving at your grocery store and the building materials that built your house also came from industries that consume a lot of fossil fuels to make more money.
The difference is they are new. It's not rational, but people on the whole are generally OK with the status quo of how the sausage is made, largely because they don't really think about it. But new systems being spun up provide an entry point for a discussion. Ideally that discussion can then be widened and open up an opportunity for wider-scale change. Or nothing happens and it all becomes the new status quo which most don't think about again.
The difference is that the food industry at least feeds me and a house provides me shelter. The people in charge of building AI data centers intend to replace all labor, including my own, and leave me begging for peanuts, while also making energy and many goods more expensive and ruining the environment.
FYI Fairly certain this account is a bot. Acct created in 2023 and has 36k karma
We need food and housing. We don't need AI.
More like it’s a really long way to say the government has utterly failed at making sure electricity generation and transmission capacity keeps up with demand so datacenters have been forced to get creative with alternative ways to power themselves. These companies absolutely want to use renewable energy from the power grid but the government blew it.
Really cool in depth report, thanks for sharing. It's very interesting to see what these big datacenter deployments are actually doing. Go look at the oil price charts for the last 25 years and you'll see why it makes a ton of sense economically.
I also love how you can see the physical evidence of them pitting jurisdictions against each other from the satellite photos with the data center on one side of a state border and the power generation on the other.
Economic * need dwarfs problems like an overloaded electric grid.
*greed.
We are well past the point that any economic growth at all is anything but a distribution of income problem.
"xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines."
So they solved the power problem by consuming more fossil fuel. Got it.
And all without the proper permits! Using 35 generators when they were only allowed 15! Yay! So glad we're allowing AI companies to break law after law after law for models that still can't logically reason through the basic Towers of Hanoi.
https://techcrunch.com/2025/07/03/xai-gets-permits-for-15-na...
The problem is that most of the AI labs are popping up in TX, which has a uniquely isolated electrical grid. Recall how the Texas cold snap a few years ago took down the grid for days. Turns out if you build a grid based on short-term profit motives, it's not going to be flexible enough to take on new demand.
It's not a technological limitation of the grid. We could have lived in a world with a more connected grid, more nimble utility commissions, and a lot less methane/carbon emissions as a result.
Title should be "AI labs are raping the planet"
I found Boom's pivot much less confusing after this article.
For those like me that are missing context:
https://qz.com/boom-supersonic-jet-startup-ai-data-center-po...
Boom’s pivot to trying to build turbines for data centers wasn’t surprising when data center deployments started using turbines. Either their CEO saw one of the headlines or their investors forwarded it over and it became their new talking point.
What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.
I had been under the mistaken impression that the turbines in airplanes were more different from the turbines in power plants than they actually are.
Boom doesn’t actually have a turbine yet. Their design partner publicly pulled out of their contract with Boom a while ago.
Boom has been operating on vaporware for a while. It’s one of those companies I want to see succeed but whatever they’re doing in public is just PR right now. Until they actually produce something (other than a prototype that doesn’t resemble their production goals using other people’s parts) their PR releases don’t mean a whole lot.
> What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.
My first thought when seeing that article is “I can buy one of these right now from Siemens or GE, and I could’ve ordered one at any time in the last 50 years.”
What I didn't get is that, AFAIR, Boom doesn't build engines - aren't they using some old 50s-60s fighter jet engines?
> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.
This seems like a big reach for me. Their largest engine (and it is absolutely massive) "only" produces 80MW of power. The Brayton cycle is unbeatable if you need to keep scaling power up to ridiculous levels.
I mean, the claim is certainly nonsensical in the sense that this isn't something Wärtsilä just "realized". They have been in the power plant business for decades. In the oldest financials they have online (the annual report for year 2000) their power plant sales are larger than their marine engine sales.
Really makes me wonder about anything else I've read on Semianalysis. Like, it is such an insane thing to claim and so easy to check. And they just wrote it anyway, like some kind of pathological fabulists.
But what's the part that seems like a "big reach"? Are you saying they didn't sign those contracts? That their customers are making a mistake?
They likely use multiple engines.
Isn't spinning up huge amounts of power on inefficient engines going to make climate impacts worse?
Interesting choice of names: "Solar Turbines" - a wholly owned Caterpillar subsidiary that designs and manufactures industrial gas turbines.
That said, it is all pretty impressive.
> An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually.
Citation needed.
This is coming from a group that does analysis on the semiconductor and cloud industries and provides very expensive access to its models and data. They are the citation.
So I guess it’s not a bubble then since these companies are raking in the big revenues? Or maybe they are counting all those circular investments as revenues somehow?
You can make a lot of money but still be in a bubble if speculation is significantly higher than (actual or potential) revenue.
I think that's most people's assumption. It's not that AI is worthless, but that it's significantly less valuable than investors are betting on.
Revenue isn't profit, and the presence or absence of either, separately or jointly, isn't sufficient to determine there is a bubble.
I mean, if so then they are lying through their teeth.
Based on what? I’m inclined to trust a well known industry analyst over an HN comment with no basis.
If you do the math, that's $10-$12 per watt-year. There are approximately 24×365.25 = 8766 hours in a year, so assuming the datacenters run 24×7, that boils down to $1.14 to $1.37 in revenue per kWh. That's not a bad deal if power really is a major part of the expense.
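Spelled out, with the same 100%-utilization assumption:

    # Reproducing the arithmetic above
    revenue_low, revenue_high = 10e9, 12e9      # the article's claimed $/GW/year
    hours_per_year = 24 * 365.25                # 8766
    kwh_per_gw_year = 1e6 * hours_per_year      # kWh from 1 GW running flat out

    print(revenue_low / kwh_per_gw_year, revenue_high / kwh_per_gw_year)   # ~$1.14 and ~$1.37 per kWh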
As far as I can tell, power isn't actually a major part of the expense, it's dwarfed by the capex. Just the amortization on the GPU will be an order of magnitude higher than the cost of the power to run the GPU at 100%. (Assuming a 5 year depreciation period.)
“Could”, sure... and I “could” fly if I strapped a jet engine to my ass
... and, all this for what ?
> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive. xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines.
Wow, "truck-mounted gas turbines"? Who else could have mastered such a futuristic tech in so short a time? Seriously, who wrote this? Grok? And let's ignore that this needless burning of fossil fuel is making life on Earth harder for everyone and everything else.
I'm no fan of Musk, but you've got to admit it was a clever way to achieve the goal. SemiAnalysis don't do fanboy articles - their research is pretty in-depth. So they are stating it as they see it.
The problem ordinary people all over the world have is that governments are allowing this to happen. Maybe stricter regulation would prevent players like Musk from coming up with such "innovations".
"Getting a permit for 15 turbines after having illegally used 35 turbines that then poisoned the air for the residences around the turbines" is a clever way to achieve the goal? I wouldn't call doing a blatant illegal action "clever", but rather sociopathic.
https://techcrunch.com/2025/07/03/xai-gets-permits-for-15-na...
https://www.politico.com/news/2025/05/06/elon-musk-xai-memph...
You've posted the same thing like 4 times now.
I'm not sure if this matters. The illegality of the article's central premise seems like an important point excluded from the article.
Power problem: solved
Natural Gas supply problem: worsened
Carbon in the atmosphere problem: worsened
Yeah I guess I'm not the target audience for this because I assumed that "the power problem" was "massive increase in electricity costs for people despite virtually unchanged usage on their part", not "AI companies have to wait too long to be able to start using even more power than they already are":
> Nicole Pastore, who has lived in her large stone home near Baltimore’s Johns Hopkins University campus for 18 years, said her utility bills over the past year jumped by 50%. “You look at that and think, ‘Oh my god,’” she said. She has now become the kind of mom who walks around her home turning off lights and unplugging her daughter’s cellphone chargers.
> And because Pastore is a judge who rules on rental disputes in Baltimore City District Court, she regularly sees poor people struggling with their own power bills. “It’s utilities versus rent,” she said. “They want to stay in their home, but they also want to keep their lights on.”
https://www.bloomberg.com/graphics/2025-ai-data-centers-elec...
I understand the instinct but if people seriously think that they are solving any problem by unplugging cell phone chargers, they are simply bad at math. Human time is easily worth more than that, even when working at minimum wage.
That said, it obviously sucks that utility prices are rising for people who can not effortlessly cover that (not to speak of the local pollution, if that's an issue). Maybe some special tax to offset that cost to society towards hyper scalers would be a reasonable way to soften the blow, but I have not done the math.
How many paid hours do they get? Human time isn't fungible with paid hours.
They are not necessarily bad at math, but they probably aren't electricians or EEs or have ever needed or been asked to calculate how much power a cell phone charger uses.
Mom/Dad used to unplug things and turn lights off, so they do too.
And the air quality around these plants is poor, leading to health problems for the neighbors.
This short term, destructive, thinking should be criminalized.
I think it's time to discuss changing the incentives around AI deployment, specifically paying into a UBI fund whenever human jobs are replaced by AI. Musk himself raised the idea.
https://www.indexbox.io/blog/tech-leaders-push-for-universal...
It can't be "criminalized" if the government and justice system are effectively being bribed by the AI cartel, because AI-related GDP "growth" is only a veneer hiding the government's economic fuckups.
I assumed gas plants are pretty good in terms of air quality?
Coal plants are bad.
In the case of Grok's turbines, no emissions controls means sick people. Plus all the CO2 pushing climate collapse faster which hurts every coming generation.
https://www.politico.com/news/2025/05/06/elon-musk-xai-memph...
Gas plants are not bad… but imagine 400 MW of gas plants in a concentrated area. You'll always have NOx and SOx byproducts whenever you're burning gas.
It depends on if they treat the exhaust to remove nitrogen oxides. Not sure what the standard is for this kind of plant though.
Gas is certainly less of a problem than coal, but they still produce plenty of bad stuff: nitrogen oxides and bad VOCs like formaldehyde that are well studied to increase risk of asthma and some types of cancer. I certainly wouldn’t want to live close to one.
The word 'pollution' appears exactly one time in this entire thing, the word 'community' or 'communities' never.
The only way to solve problems like this IMO is to price in the externalities. Tax fossil fuels for the damage they do, in order to reveal their true cost. Then they will never look like the most affordable option, because they're not.
True. The same is true for nuclear energy. I've never heard of a nuclear power plant that did not receive substantial subsidies throughout its lifetime. Not to forget the nuclear fuel and the efforts required to create it and later to store it.
This website appears to be very AI heavy in articles. I think it's fair to say these articles are biased because of that.
The natural gas turbines used are relatively efficient as far as engines go. Having them on-site makes transmission losses basically negligible.
Nothing short of full solar connected to batteries produced without any difficult-to-mine elements will make some people happy, but as far as pollution and fuel consumption go, data centers aren't really a global concern at the same level as things like transportation.
I'm honestly curious whether you yourself are even aware of the disingenuousness of this argument. It's fairly impressive in its density!
1. Nobody complained about the efficiency of natural gas turbines. You can efficiently do a lot of useless stuff with deep negative externalities, and the fact it's efficient is not all that helpful.
2. Saying "the extreme far end would not be satisfied even by much better solutions" is not an excuse not to pursue better solutions!
3. There are many dimensions of this that people care about beyond the "global concern" level regarding "pollution and fuel consumption."
4. There are many problems that are significant and worth thinking about even if they are not the largest singular problems that could be included by some arbitrarily defined criteria
> I'm honestly curious whether you yourself are even aware of the disingenuousness of this argument.
Unnecessarily condescending and smug, but I’ll try to respond.
That said, you’re putting forth your own disingenuous assumptions and misconceptions. The natural gas turbines are an intermediate solution to get up and running due to the extremely long and arduous process of getting connected to the grid.
Arguing pedantry about the word efficiency isn’t helpful either. The data centers are being built, sorry to anyone who gets triggered by that. The gas turbines are an efficient way to power them while waiting for grid interconnect and longterm renewables to come online.
Disingenuous is acting like this is a permanent solution to the exclusion of others. The whole point is that it gets them started now with portable generation that is efficient.
> The data centers are being built, sorry to anyone who gets triggered by that.
Unnecessarily smug?
Beyond that, they can be stopped. They're being met with a lot of resistance in the Midwest as they're being built without much understanding of their impact on public utilities. People are catching on to the fact that energy and water consumption is pushing up costs for residents. A lot of assumptions are holding up this argument.
> The gas turbines are an efficient way to power them while waiting for grid interconnect and longterm renewables to come online.
I like the gymnastics of wordplay here. Efficient only when you look at them through the lens of some ephemeral timeframe that may or may not exist.
The gas turbines are hopefully an intermediate solution due to the long and not guaranteed process of grid connection and renewable buildout. History is of course full of such bets that did not work out the way their proponents hoped.
> The data centers are being built, sorry to anyone who gets triggered by that.
It's obvious that you're starting from your conclusion and working backwards, which is probably how your initial comment was full of so much motivated reasoning to begin with.
In your mind, is there any set of negative externalities that would justify not building the data centers, or at least not building them now, or at least not building them now in specific areas that require these types of interim solutions?
This is exactly right. These are glorified emergency generators, and grid power is ordinarily far cheaper; especially for interruptible loads like training new models (checkpointing work in progress and resuming it later is cheap and easy). The article mentions that quite clearly.
> as far as pollution and fuel consumption data centers aren’t really a global concern at the same level as things like transportation.
Being at a lower level than transportation doesn't remove the concern about this unnecessary pollution. Stop changing the subject from the environmental problems that AI usage can cause through its increased power consumption.
Natural gas engines are efficient!
Ok! But what about the pollution they produce to nearby neighborhoods? What about the health repercussions? Do human lives not matter?
https://www.politico.com/news/2025/05/06/elon-musk-xai-memph...
Yeah, that headline made me think "Oh good, there's some solution on the horizon that won't require absurd amounts of electricity."
Not so.
And imagine all this poorly located, overpriced, haphazardly thrown together and polluting infrastructure will basically get flushed down the toilet once either the AI bubble pops, or they figure out a new way of doing AI that doesn't require terawatts of power.
Coincidentally the USA is more than self sufficient in natural gas and is a net exporter. Drill baby drill!
supply is finite
> However, AI infrastructure cannot wait for the grid’s multiyear transmission upgrades. An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually. Getting a 400 MW datacenter online even six months earlier is worth billions. Economic need dwarfs problems like an overloaded electric grid. The industry is already searching for new solutions.
wow, that's some logic. Environmentally unsound means of extracting energy directly damage the ecosystem in which humans need to live. The need for a functioning ecosystem "dwarfs" "problems" like billionaires not making enough billions. Fixing a ruined ecosystem would cost many more billions than whatever economic revenue the AI generated while ruining it. So if you're not harnessing the sun or wind (forget about the latter in the US right now, btw), you're burning things, and you can get lost with that.
This kind of short sighted thinking is because when folks like this talk about generating billions of dollars of worth, their cerebellums are firing up as they think of themselves personally as billionaires, corrupting their overall thought processes. We really need to tax billionaires out of existence.
TL;DR: by saying "fuck the environment."
TLDR: They're not reducing power consumption, they're just also using gas now. Buckle up for higher prices, the AI slop factory needs more power.
The dialog around AI resource use is frustratingly inane, because the benefits are never discussed in the same context.
LLMs/diffusers are inefficient from a traditional computing perspective, but they are also the most efficient technology humanity has created:
> AI systems (ChatGPT, BLOOM, DALL-E2, Midjourney) and human individuals performing equivalent writing and illustrating tasks. Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts.
Source: https://www.nature.com/articles/s41598-024-54271-x