How AI can solve its own energy crisis | Varun Sivaram

with Varun Sivaram

Published November 3, 2025

About This Episode

Host Elise Hu introduces a talk by grid futurist Varun Sivaram about the looming clash between rapidly growing AI data center demand and an aging electricity grid. Sivaram explains how making AI data centers flexible in when and where they consume power can relieve grid stress, unlock existing unused capacity, and accelerate the integration of cheap renewable energy. He describes Emerald AI's "Emerald Conductor" software, real-world demonstrations, and industry collaborations aimed at turning AI from a grid threat into a key ally for a cleaner, more reliable energy system.

Topics Covered

Disclaimer: We provide independent summaries of podcasts and are not affiliated with or endorsed in any way by any podcast or creator. All podcast names and content are the property of their respective owners. The views and opinions expressed within the podcasts belong solely to the original hosts and guests and do not reflect the views or positions of Summapod.

Quick Takeaways

  • AI data centers are driving massive new electricity demand that current grids are not prepared to handle, risking delays, higher prices, and more fossil fuel use.
  • Roughly half of existing power system capacity goes unused on average, creating a huge opportunity if AI data centers can flex their demand over time.
  • By modestly reducing power use for short periods during grid peaks, flexible AI data centers could unlock up to 100 gigawatts of new AI capacity on current U.S. grids.
  • Emerald AI's "Emerald Conductor" software uses temporal and spatial flexibility to pause batchable AI workloads and move others to regions with spare power.
  • A real-world demonstration in Phoenix showed 256 GPU servers cutting power consumption by 25% for three hours while still meeting AI performance requirements.
  • Flexible AI data centers can act as fast, large "shock absorbers" for the grid, enabling greater use of intermittent solar and wind and reducing the need for expensive grid upgrades.
  • Industry cooperation between utilities, data center operators, hardware makers, and software firms is critical to fully realize AI's potential as a grid stabilizer.
  • Sivaram argues that with smart orchestration, society can have rapid AI innovation alongside abundant, affordable, reliable, and clean energy.

Podcast Notes

Podcast introduction and framing of AI's energy challenge

Show and host introduction

Elise Hu identifies the show as TED Talks Daily and herself as the host[2:57]
She notes the show brings new ideas every day to spark listeners' curiosity

High-level framing of AI's energy problem

Demand for AI computing power is growing at an exponential rate[3:08]
Elise acknowledges excitement about the AI revolution alongside concerns
She highlights a "terrible truth" that AI's soaring energy needs are taxing the world's resources at an unprecedented rate[3:20]

Introduction of guest and topic

Elise introduces Varun Sivaram as a grid futurist[3:23]
She says he shares his work on how to meet AI's enormous energy needs
She previews that he will discuss developing flexible AI data centers that could help power the energy grid[3:27]
She notes the goal is to support the AI boom responsibly[3:33]

Phoenix case study: AI data centers helping the grid

Setting the scene in Phoenix, Arizona

Varun describes a blistering hot day in Phoenix when a million air conditioners were driving up demand on the power grid[3:44]
This day represents a typical peak stress scenario for the grid during extreme heat

AI servers behaving differently than expected

He explains that a cluster of energy-hungry AI servers at an Oracle data center "bucked the trend" by actually helping the grid[3:48]
For three hours, these AI computers dropped their power consumption by 25% during a day of peak demand[4:01]
The reduction was described as "perfectly timed relief" during peak demand periods
Despite the power reduction, advanced NVIDIA chips continued meeting stringent performance requirements[4:12]
He lists tasks such as training, fine-tuning, and querying large language models as workloads that still met performance needs

Emerald AI's role in the demonstration

Varun says his team at Emerald AI orchestrated this first-of-a-kind demonstration of flexible AI computing[4:27]
He notes that Google has also made impressive strides in this area[4:32]
He implies that multiple players are working on similar flexible computing concepts, not just his company

Potential global impact of scaling flexibility

Scaling such technologies across the U.S. and globally could help solve the challenge of powering the AI revolution[4:44]
He connects this to building a more reliable, affordable, and clean power grid[4:51]
He argues that, far from undermining the grid, AI could actually help save it[4:57]

Reimagining the AI power challenge and Varun's career shift

Need to reimagine how to power AI

Varun says that to understand how AI can help the grid, we need to reimagine the challenge of powering AI[5:03]

Varun's background in energy

He notes he has reinvented his own career to address this new framing[5:05]
For 15 years he worked as an energy executive and as America's lead clean energy diplomat[5:12]
During that period, his focus was on building more clean energy

Shifting focus from supply to demand

He points out that energy supply is only half of the equation[5:19]
He founded Emerald AI to focus on demand instead[5:20]
The company's aim is to help AI intelligently use energy, support grids, and unlock massive stranded power capacity that already exists
He warns that without such demand-side capabilities, we face an impending crisis[5:38]

Impending collision between AI data centers and the electricity grid

Description of the two massive networks

Varun describes a historic collision between two multi-trillion dollar networks[5:43]
The first network is rapidly growing AI data centers[5:45]
The second network is an aging electricity grid utterly unprepared for all the new demand[5:50]

Risks: U.S. competitiveness in AI

He says America risks falling behind in AI due to grid constraints[6:00]
In Virginia, called the data center capital of the world, it takes up to seven years to connect new data centers to the grid[6:08]

Risks: Rising power prices for communities

He notes that power prices are soaring for communities as new grids and power plants are built[6:11]
In 2025, data center demand drove up the average annual household power price in Columbus, Ohio, by $240[6:21]
He frames this as just the beginning of a trend as data center demand grows

Risks: Scale of future data center demand

He states that data centers currently account for 4% of U.S. power demand[6:30]
By 2030, that share is projected to rise to 12%[6:33]
He analogizes this increase to adding another Germany to the U.S. power grid

Risks: Increased fossil fuel use

He explains that fossil fuels are set to power the boom in AI data centers, which require reliable power today[6:42]
In the U.S., natural gas is powering most AI growth[6:49]
He notes that countries like India will see rising coal use, increasing global carbon emissions[6:52]
He emphasizes that this trajectory is not inevitable; it "doesn't have to be this way"[6:58]

Flexibility as the key concept to align AI and the grid

Reframing AI as a potential grid ally

Varun argues that the biggest new user of electricity could actually become the grid's greatest ally[7:06]
He identifies "flexibility" as the key to achieving this[7:12]

Defining flexibility vs efficiency

He distinguishes flexibility from efficiency, noting they are different concepts[7:14]
Efficiency refers to using less energy overall, whereas flexibility is about when energy is used[7:17]
He states that if AI were just a little more flexible in when it uses energy, it could consume vast amounts of otherwise stranded power[7:27]
He emphasizes that this stranded power already exists on today's grids

Analogy: electric power system as a superhighway

Varun asks the audience to think of the electric power system as a superhighway[7:31]
He says the grid faces peak "rush hour" just a few hours per month[7:36]
He uses the hottest summer day in Phoenix as an example when air conditioning demand peaks[7:41]
On those days, grids risk being overwhelmed by massive new data centers that may soon consume more than a gigawatt[7:45]
He notes this is more power than the entire state of Vermont uses

Underutilization of grid capacity

He explains that most of the time, power plants operate well below full capacity[8:06]
Transmission lines also usually carry less power than they could, analogous to an underused highway[8:00]
On average, throughout the year, half of the power system's capacity goes unused[8:06]

Hypothetical: AI data centers as flexible users

He poses the question: what if during peak rush hour periods, AI data centers could dynamically reduce their power consumption?[8:12]
The idea is that they could take advantage of spare capacity throughout the rest of the year[8:23]
He likens this to briefly taking a few 18-wheelers off the road to let remaining traffic flow smoothly[8:26]
He quantifies the needed flexibility: if AI data centers were flexible less than 2% of the year, trimming demand by a quarter for a couple of hours at a time[8:39]
Under those conditions, the U.S. could fit up to 100 gigawatts of new data centers on existing power grids
He values this at $4 trillion of AI investment that could be unlocked without waiting years for infrastructure
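The quantities cited above can be sanity-checked with back-of-the-envelope arithmetic. The 2%, 25%, and 100 GW figures come from the talk; everything else in this sketch is illustrative:

```python
# Back-of-the-envelope check of the flexibility figures cited in the talk.
HOURS_PER_YEAR = 8760

flex_share = 0.02        # flexible less than 2% of the year (from the talk)
trim_fraction = 0.25     # trimming demand by a quarter (from the talk)
new_capacity_gw = 100    # up to 100 GW of new data centers (from the talk)

# How many hours of curtailment per year that flexibility implies.
flex_hours = flex_share * HOURS_PER_YEAR
print(f"Flexible hours per year: {flex_hours:.0f}")   # ~175 hours

# Peak-hour relief offered by that fleet: 25% of 100 GW.
relief_gw = trim_fraction * new_capacity_gw
print(f"Peak relief: {relief_gw:.0f} GW")             # 25 GW
```

At roughly 175 hours per year, split into a-couple-of-hours episodes, this is consistent with the talk's "rush hour just a few hours per month" framing.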

Broader energy system implications

He acknowledges that America will still need more energy to power a growing economy[9:05]
He notes that data centers, factories, and other electricity users will join in driving new demand[9:11]
By making AI data centers flexible, society can more prudently expand the grid[9:14]
He suggests this buys time to build clean nuclear or geothermal power plants[9:23]
He adds that flexible AI data centers acting as giant shock absorbers can help integrate intermittent but cheap solar and wind power[9:30]
This integration can drive down the cost of energy for AI itself

Emerald Conductor: An AI to manage AI energy use

Purpose of Emerald Conductor

Varun says his team is building the software brain to give AI data centers flexibility[9:44]
He describes it as an "AI for AI" and names it the Emerald Conductor[9:54]

Concept of spatio-temporal flexibility

Emerald Conductor works by harnessing what they call spatio-temporal flexibility[9:59]
He admits the term sounds fancy but says it's based on a simple idea[10:01]

Temporal flexibility: pausing batchable jobs

He explains that not all AI jobs are created equal[10:11]
Some workloads, such as training or fine-tuning an AI model, are considered batchable[10:15]
He includes conducting deep research and running massive scientific simulations in the batchable category[10:17]
These tasks are important but don't need to be completed immediately[10:24]
Software can intelligently pause or slow these workloads when the grid is stressed[10:28]
The system can then speed them back up when plenty of power is available
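The temporal-flexibility idea described above can be sketched as a toy scheduler that pauses batchable jobs on a grid-stress signal. The `Job` class, job names, and `grid_stressed` flag are hypothetical illustrations, not Emerald AI's actual API:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    batchable: bool      # training, fine-tuning, and simulations can wait
    paused: bool = False

def respond_to_grid(jobs, grid_stressed):
    """Pause batchable jobs while the grid is stressed; resume them otherwise."""
    for job in jobs:
        job.paused = grid_stressed and job.batchable
    return jobs

jobs = [
    Job("train-llm", batchable=True),
    Job("chatbot-inference", batchable=False),  # must answer promptly
]
respond_to_grid(jobs, grid_stressed=True)
print([(j.name, j.paused) for j in jobs])
# -> [('train-llm', True), ('chatbot-inference', False)]
```

Only the training job pauses; the latency-sensitive chatbot keeps serving users, which is what makes the intervention invisible to them.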

Spatial flexibility: moving workloads geographically

He introduces spatial flexibility using the example of a user query to a generative AI chatbot[10:42]
Such jobs cannot be paused because the response must be prompt[10:09]
However, these jobs can be moved across the country at the speed of light[10:51]
He contrasts the difficulty of building new electric power transmission with the existing network of fiber optic cables[10:57]
He calls this use of fiber optics "virtual transmission"[11:00]
Workloads can be shifted from a data center in a city where the grid is strained, such as Phoenix on a hot day[11:08]
They can be moved to a data center in a region with abundant power, like the windswept Great Plains[11:08]
The AI workloads still get done, but the stressed grid gets a break at the critical time[11:22]
He emphasizes that the user never notices because an AI is orchestrating AI behind the scenes[11:27]
In this setup, data centers become smart, cooperative partners to the power grid[11:31]
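The spatial side can be sketched the same way: route a latency-sensitive query to whichever region currently has the most spare grid capacity. The region names and headroom figures below are made up for illustration:

```python
def route_query(headroom_mw):
    """Pick the data center region with the most spare grid capacity (in MW)."""
    return max(headroom_mw, key=headroom_mw.get)

# Hypothetical headroom: Phoenix is strained on a hot afternoon,
# while the windswept Great Plains has abundant power.
headroom_mw = {"phoenix": 5, "great_plains": 120, "virginia": 40}
print(route_query(headroom_mw))  # -> great_plains
```

A real orchestrator would also weigh network latency and data locality, but the core "virtual transmission" decision is a routing choice like this one.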

Phoenix demonstration: proving flexible AI in practice

Details of the Phoenix demonstration

Varun refers back to the earlier demonstration he mentioned and clarifies that it truly happened[11:39]
In May 2025 in Phoenix, Arizona, they used a cluster of 256 GPU servers[11:42]
They ran a mix of AI workloads: some highly flexible, others entirely inflexible, and many in between[11:51]
During one hot afternoon, their software received a signal that the local utility was about to reach peak demand[12:00]
In response, Emerald Conductor reduced the cluster's power load by 25% for exactly the three hours requested by the grid[12:06]
This reduction was coordinated while preserving required AI performance for workloads
He concludes they proved AI data centers can "flex" when the grid is tight and "sprint" when users need them[12:18]

From technology proof to industry adoption

Varun says proving the technology was only the first step[12:22]
He identifies the hardest part as convincing the enormous energy and AI industries to cooperate and change operations[12:24]

Traditional utility assumptions about demand

For over a century, utilities have assumed users cannot simply reduce power consumption during grid rush hours[12:36]
He acknowledges limited existing programs, like utilities requesting homes to adjust thermostats[12:48]
Utilities may also ask large industrial loads to dial down consumption[12:50]
However, he characterizes these interventions as tiny and marginal[12:56]

Why AI data centers are fundamentally different loads

He asserts that AI data centers are fundamentally different from traditional loads[12:58]
They are massive energy users compared with tiny household loads that must be aggregated[13:06]
They respond faster and more gracefully than large manufacturing facilities[13:12]
They can move workloads around the country at the speed of light, which no other energy user can do[13:14]
These characteristics give AI data centers a transformative potential to be flexible for grid support

Industry initiatives and partnerships to scale flexibility

EPRI's DCFlex and regional demonstrations

Varun expresses excitement about initiatives that bring together energy and technology industries, citing EPRI's DCFlex[13:28]
He mentions upcoming demonstrations in the United States and with National Grid in the United Kingdom[13:36]
In these, Emerald will showcase how AI workloads can flex and move across regions[13:42]
They will prove that software like Conductor can orchestrate AI workloads in concert with on-site energy equipment such as batteries[13:52]
Combining workload orchestration with batteries can deliver even more flexibility to power grids

Partnership with NVIDIA and reference design

Varun says Emerald is partnering with NVIDIA[14:01]
Together they are building a reference design for next-generation data centers, which he also calls AI factories, to be power flexible[14:04]
The goal is that utilities seeing this certification can more swiftly connect grid-friendly AI factories[14:13]

Vision: AI as a catalyst for a better energy system

Shortening timelines for AI infrastructure

Varun argues that rather than waiting years for grid upgrades, society can build AI infrastructure now[14:19]
He frames this as a way to sharpen competitive edge[14:26]

Avoiding grid crises and price spikes

He says that flexible AI data centers can provide relief before the grid hits a breaking point, avoiding rolling blackouts[14:30]
He proposes that instead of increasing power prices, prices could go down[14:40]
The mechanism is more effective utilization of existing energy infrastructure by flexible AI data centers[14:43]
This can defer expensive upgrades to the grid[14:50]

Aligning AI growth with clean energy deployment

He contrasts the earlier risk of goosing demand only for fossil fuels with a better alternative[14:56]
He says AI's soaring energy needs could instead encourage more clean energy onto the grid at home and abroad[14:58]
He notes that solar is currently the cheapest, fastest-growing power source on the planet[15:08]
He asks listeners to imagine flexible AI data centers capable of ramping their energy consumption to match daytime solar peaks[15:16]
He also envisions shifting AI loads to better integrate clean energy onto the grid[15:25]

Closing vision: having it all with AI and energy

Varun declares that the AI revolution is here[15:27]
He says he believes society can "have it all": breakneck innovation and massive AI investments[15:35]
He adds to this vision abundant, affordable, reliable, and clean energy for all[15:41]
He concludes that an AI for flexible AI infrastructure could be a linchpin for the future energy system[15:50]
He ends his talk with a thank you[15:52]

Outro and TED curation information

Context of the talk and partner mention

The narrator states that the talk was given by Varun Sivaram at a TED Countdown event in New York[16:02]
The event was in partnership with the Bezos Earth Fund in 2025[16:05]

TED curation guidelines and show credits

Listeners are invited to learn more about TED's curation at ted.com/curation-guidelines[16:12]
The narrator notes that TED Talks Daily is part of the TED Audio Collective[16:16]
The talk was fact-checked by the TED Research Team[16:19]
Production and editing credits are given to Martha Estefanos, Oliver Friedman, Ryan Green, Lucy Little, and Tansika Sangmarnivong[16:25]
The episode was mixed by Christopher Fasey-Bogan, with additional support from Emma Taubner and Daniela Balarezo[16:30]
Elise Hu says she will be back with another idea and thanks listeners for listening[16:39]

Lessons Learned

Actionable insights and wisdom you can apply to your business, career, and personal life.

1. Treat large energy users as flexible, schedulable resources rather than fixed, uncontrollable loads to unlock hidden capacity in existing systems.

Reflection Questions:

  • Where in your own work or organization do you currently treat demand or workloads as fixed when some portion could be made flexible?
  • How could you redesign one key process so that parts of it can be paused, slowed, or shifted without harming outcomes?
  • What concrete step could you take this month to pilot a small-scale flexibility experiment in your operations or schedule?

2. Reframing a problem from the supply side to the demand side can reveal entirely new solution spaces and business opportunities.

Reflection Questions:

  • What challenges in your field are currently being attacked mainly from the supply side, and how might they look different if you focused on demand instead?
  • How could shifting your perspective from "How do we get more?" to "How do we use what we have more intelligently?" change your current strategy?
  • Which project are you working on right now where deliberately exploring a demand-side solution would be worth a dedicated brainstorming session?

3. Orchestrating tasks by their time sensitivity and location constraints is a powerful way to increase resilience and efficiency in complex systems.

Reflection Questions:

  • How clearly have you categorized your responsibilities into urgent, time-flexible, and location-flexible buckets?
  • In what ways could you reassign or reschedule work so that time-critical tasks get priority while flexible tasks absorb periods of slack or constraint?
  • What system or tool could you implement this week to better track and route different types of work based on their flexibility?

4. Cross-industry collaboration is essential when solving problems that span multiple infrastructures, like technology and energy, because no single sector can implement systemic change alone.

Reflection Questions:

  • Which of your current challenges realistically requires cooperation from people or organizations outside your usual domain?
  • How might engaging stakeholders from another industry change the design or feasibility of the solution you're pursuing?
  • What is one concrete outreach action you could take in the next two weeks to start a cross-domain collaboration related to your work?

5. Designing new infrastructure to be "grid-friendly" or system-friendly from the outset reduces friction, speeds deployment, and aligns innovation with societal constraints.

Reflection Questions:

  • When you start new projects or build new capabilities, how often do you explicitly consider how they will interact with existing systems and constraints?
  • How could you modify a current initiative so that it makes life easier for key partners, regulators, or upstream/downstream users instead of harder?
  • What design principle could you adopt going forward to ensure new initiatives are easier to integrate into the broader environment they depend on?

Episode Summary - Notes by Blake
