The AI Briefing

Exploring the massive energy demands of AI data centers, where cooling systems consume nearly as much power as the compute itself. Discussion covers innovative cooling solutions and the path to efficiency.

AI Data Center Cooling Crisis: The Hidden Energy Cost

Key Topics Covered
Global Energy Impact
  • Data centers projected to use 2-4% of global electricity
  • AI driving unprecedented spike in compute demands
  • Real-time access to large language models requiring massive processing power
The Cooling Challenge
  • 40% of data center power goes to compute operations
  • 38-40% of data center power dedicated to cooling systems
  • Nearly equal energy split between computing and cooling
Innovative Cooling Solutions
Underwater Data Centers
  • Microsoft leading underwater compute deployment
  • Ocean cooling provides natural temperature regulation
  • Concern: Large-scale deployment could warm surrounding ocean water
Underground Mining Solutions
  • Finland pioneering repurposed mine data centers
  • Cold bedrock provides natural cooling
  • Risk: Potential ground warming and permafrost impact
The Path Forward
  • Chip efficiency as the ultimate solution
  • More efficient processors = less heat generation
  • Potential 20% electricity cost reduction through improved chip design
  • Consumer impact: Lower costs could reduce wholesale electricity prices
Environmental Considerations
  • Heat displacement challenges across all solutions
  • Scale considerations for environmental impact
  • Need for sustainable cooling innovations
Key Takeaways
  • Every AI query has a hidden energy cost
  • Cooling accounts for roughly 40% of data center energy usage
  • Innovation in both cooling methods and chip efficiency crucial for sustainable AI
  • Economic benefits of efficiency improvements extend to consumers
Contact
tom@conceptofcloud.com
Recorded in snowy Washington, DC
Chapters
  • 0:00 - Introduction: AI's Growing Energy Footprint
  • 1:47 - The Shocking 40% Cooling Reality
  • 2:27 - Creative Cooling Solutions: Ocean to Underground
  • 4:16 - The Future: Chip Efficiency and Consumer Impact

What is The AI Briefing?

The AI Briefing is your 5-minute daily intelligence report on AI in the workplace. Designed for busy corporate leaders, we distill the latest news, emerging agentic tools, and strategic insights into a quick, actionable briefing. No fluff, no jargon overload—just the AI knowledge you need to lead confidently in an automated world.

Hello and welcome to today's AI briefing.

My name is Tom.

For anyone who's new to The AI Briefing, we come up with short news snippets and information, stuff that helps people cut through the noise and understand what's actually going on in the AI environment.

Because there's always a lot going on.

And so today, we're in a rather chilly, rather snowy Washington DC.

And we're going to talk about AI cooling demands in the data centers, the ones required to drive all the compute that needs to happen to allow you to ask ChatGPT your favorite questions.

So...

Over the course of the last few years, of course, the use of data centers and compute to be able to drive LLMs and everything that goes with them has increased dramatically.

Analysts say that data centers will use 2 to 4 percent of global electricity in the near term, which of course is a substantial amount when you think about it, just powering computers. Now, as a society we make a lot of use of computers, but 2 to 4 percent is a large number.

And of course, it's going up with the amount of processing that's required to facilitate all these LLMs: ChatGPT, Claude, whatever.

So AI, of course, is causing a spike in all of this, because of the requirement for people to get real-time access to huge corpora of data, ask all these questions, actually find out what's going on and get sensible results, which is the big problem when it comes to AI: trying to get some sensible answers out of all of this.

The electricity consumption, though, isn't just compute; of course, it's also the cooling that is required to keep the systems running.

So according to a recent study, roughly 40% of data center power went on compute, and then a further 38 to 40% of data center power went on the cooling systems needed to actually keep that compute cool.

Now, there's obviously snow on the ground here for anyone who's watching the YouTube video; for anyone who is listening to the podcast, hopefully you can hear the snow under my feet.

And of course, 40% of the electricity consumption just to cool the systems is a big deal.

So there are many different programs underway, where companies and organizations are trying to find different ways to cool the data centers and the compute within them.

And so, for example, it's not snow related, but Microsoft has in recent years put data centers in the water.

I know they're not the only organization to do so, but they're the most prominent.

And they sank a bunch of compute into the ocean.

Now, of course, that sounds great.

But the problem is, if you did that on a huge scale, then you would also potentially warm
the ocean around the compute.

Other countries, like Finland, are working on burying their data centers deep underground in mines, repurposed mines, that type of stuff, where you've got access to a very cold climate and very cold bedrock that can also help drive the cooling.

Of course, the trade-off is that you end up warming the earth that surrounds them.

Now, is that going to be so much that it causes, you know, permafrost melting, what have you?

I don't know.

But if you did it on a large enough scale, of course, the heat has to go somewhere.

So I guess the point of this podcast today, just briefly, is to talk through that and understand that everything has a knock-on impact.

And the more that we rely on LLMs and the more that we make use of them, then until more efficient compute chips come into play, the power usage, and the cooling consumed to deal with it and get that heat out of the computers and into the environment, will keep on increasing.

Over the course of the next few years, I suspect an awful lot of different organizations and industries will start to look at that and, you know, start to expand on these cooling programs and different ways to be able to deal with it.

Chip efficiency, though, of course, will end up being the key driver of this.

If you can make the chips

super efficient and not leverage an awful lot or not produce as much heat as they
currently do, then that will end up being a much more effective way of being able to

control the amount of heat and cost of electricity that's used to then cool the data
centers.

And so, of course, the knock-on impact of this is also to the consumer.

Because if you could then spend 40 percent less on your electricity bill, well, maybe not a full 40 percent, but say you knock 20 percent off your electricity bill because your chips created half as much heat and needed half as much cooling, then that's a dramatic reduction in electricity costs for the data center, which would then go and impact both electricity prices on the wholesale market, and also the amount it would cost the data center to run your compute models.
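
To make that back-of-the-envelope maths concrete, here is a minimal sketch of the arithmetic, using the rough figures from the episode (about 40 percent of facility power on compute, about 40 percent on cooling) and assuming, purely for illustration, a hypothetical 100 MW facility and chips that halve the cooling load:

```python
# Illustrative sketch of the savings estimate discussed above.
# Assumptions (not hard data): a hypothetical 100 MW facility,
# ~40% of power spent on cooling, and new chips that produce
# half the heat and therefore need half the cooling.

facility_power_mw = 100.0      # total electricity draw of the facility
cooling_share = 0.40           # share of that power spent on cooling

cooling_before_mw = facility_power_mw * cooling_share    # 40 MW on cooling today
cooling_after_mw = cooling_before_mw * 0.5               # halved cooling load -> 20 MW
saved_mw = cooling_before_mw - cooling_after_mw          # 20 MW no longer needed
saved_pct = saved_mw / facility_power_mw * 100           # as a share of the whole bill

print(f"Cooling drops from {cooling_before_mw:.0f} MW to {cooling_after_mw:.0f} MW")
print(f"That takes roughly {saved_pct:.0f}% off the facility's total electricity bill")
```

Halving a roughly 40 percent cooling load removes about 20 percent of the facility's total electricity demand, which is where that rough 20 percent figure comes from.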

Anyway, there we go.

I figured I would use the snow as an example of some of the ways that people are looking at cooling compute as the demand for it ever increases.

So I hope you found this useful.

If you have, drop comments into the box below.

If you're looking online, you can always reach out to me at tom@conceptofcloud.com.

And in the meantime, I will leave you with that and have a great rest of your day.

Thanks for tuning in.