
by David Brown, PhD, EDGE Faculty Director and Snow Family Business Professor, Fuqua School of Business
It seems like every day brings fresh headlines about the voracious energy demands of artificial intelligence. According to a recent IEA report, data centers alone will drive nearly half of the growth in U.S. electricity demand by 2030. Tech companies are scrambling to secure enough power for compute, while grid operators are struggling to keep pace. The economic promise of AI is increasingly shadowed by mounting energy costs – residential electricity prices in the U.S. have risen by 6.5% over the past year – prompting concern over whether consumers are unfairly bearing the burden and whether our grid can keep up.
To address these challenges, we need a diverse toolkit of solutions – technological, economic, and policy-driven. One of the most promising areas, in my view: using AI itself to help optimize and manage our energy systems.
This has been a focus of my recent research, along with colleagues across Duke and other institutions, and I’d like to share some key lessons and exciting opportunities ahead.
Rethinking generation: Flexibility as the foundation
Traditional grid planning – centered on “unit commitment” strategies built around average load forecasts – remains the bedrock of current practice. But net-load volatility makes this approach increasingly ill-suited: today’s grid faces wild swings in demand from AI workloads and equally unpredictable swings in supply from intermittent renewables.
That uncertainty needs to be built into system planning from the start: generation and storage assets must be explicitly modeled as the dynamic resources they truly are. In recent work with my coauthor Jim Smith (Dartmouth Tuck), we explore this concept in a paper titled “Unit Commitment without Commitment” (available Open Access here). We develop a novel flexible dispatch technology based on stochastic optimization and show that it can deliver impact at scale, lowering Duke Energy’s generation costs by an estimated 2-3% today, and by 5-6% or more by 2030.
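To give a flavor of what planning over scenarios (rather than an average forecast) buys you, here is a minimal sketch of a two-stage dispatch problem with recourse. It is a toy model, not the algorithm from the paper: the costs, scenarios, and probabilities are invented, and a one-dimensional grid search stands in for a real solver.

```python
import numpy as np

# Toy two-stage dispatch with recourse (illustrative only; not the model
# from "Unit Commitment without Commitment"). Stage 1: commit baseload
# output b before net load is known. Stage 2: in each scenario, a more
# expensive peaker covers any shortfall; surplus baseload cost is sunk.
base_cost = 20.0   # $/MWh for energy committed in advance (assumed)
peak_cost = 80.0   # $/MWh for real-time peaker energy (assumed)
scenarios = np.array([40.0, 55.0, 70.0, 95.0])   # net-load scenarios, MW
probs = np.array([0.25, 0.35, 0.25, 0.15])       # scenario probabilities

def expected_cost(b):
    """Expected cost of committing b MW: sunk baseload + expected recourse."""
    shortfall = np.maximum(scenarios - b, 0.0)   # what the peaker must cover
    return base_cost * b + peak_cost * (probs @ shortfall)

# The stochastic plan minimizes expected cost across all scenarios.
grid = np.linspace(0.0, scenarios.max(), 1001)
b_stochastic = grid[np.argmin([expected_cost(b) for b in grid])]

# Benchmark: commit to the average forecast, as deterministic planning does.
b_average = probs @ scenarios

print(f"average-forecast commitment: {b_average:5.1f} MW, "
      f"expected cost {expected_cost(b_average):7.1f}")
print(f"stochastic commitment:       {b_stochastic:5.1f} MW, "
      f"expected cost {expected_cost(b_stochastic):7.1f}")
```

Even on this tiny example, the scenario-aware commitment beats the average-forecast plan by a few percent of expected cost, because it prices in the asymmetry between cheap committed energy and expensive real-time recourse.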
Importantly, unlike today’s power-hungry AI models, our algorithms run in seconds on a laptop. If AI is going to help solve AI’s energy problem, we need lean, efficient, and low-carbon solutions.
Work continues: collaborators and I are actively exploring related questions, such as how to overcome transmission bottlenecks and how to scale these flexible dispatch methods to manage millions of distributed energy resources. One exciting example: PG&E’s recent virtual power plant (VPP) experiment in Northern California tapped about 100,000 home batteries to deliver over 500 MW to the grid during evening peak hours. From the perspective of our dynamic unit commitment technology, a VPP is conceptually the same as an integrated utility: just bigger. My experience with unit commitment leads me to strongly believe that AI-based optimization and pricing algorithms could unlock even more value from these emerging systems.
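As a back-of-the-envelope illustration of that “just bigger” point, the sketch below aggregates a hypothetical fleet of home batteries into a single virtual resource. The per-home figures (5 kW, 7 kWh) are assumptions chosen only to land near the reported scale; they are not PG&E program data.

```python
from dataclasses import dataclass

@dataclass
class Battery:
    max_kw: float         # inverter power limit offered to the event
    available_kwh: float  # energy the homeowner has made available

def fleet_limits(batteries):
    """Sum per-home limits into one virtual resource's power and energy."""
    power_mw = sum(b.max_kw for b in batteries) / 1000.0
    energy_mwh = sum(b.available_kwh for b in batteries) / 1000.0
    return power_mw, energy_mwh

# ~100,000 enrolled homes; 5 kW / 7 kWh per home are illustrative values.
fleet = [Battery(max_kw=5.0, available_kwh=7.0) for _ in range(100_000)]
power_mw, energy_mwh = fleet_limits(fleet)
print(f"virtual plant: {power_mw:.0f} MW, {energy_mwh:.0f} MWh "
      f"(~{energy_mwh / power_mw:.1f} h at full output)")
# To a dispatch model, this is simply one more storage unit with a very
# large nameplate: the same math that applies to an integrated utility.
```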
Load flexibility: An untapped opportunity
It’s not just about the supply side. There’s enormous potential in making demand more flexible, too.
The idea of shifting or shaping electricity demand isn’t new – “demand response” has been around since the 1970s – but its moment may finally be arriving. A recent paper from Duke colleagues Tyler Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell, “Rethinking Load Growth”, finds that the U.S. grid could accommodate roughly 80 GW of additional load today, provided we permit just a tiny amount of curtailment (~0.25% annually). That might mean briefly delaying some of the largest AI training jobs, just enough to maintain grid stability. To put that in context, Duke Energy Carolinas and Progress recently reported record peak loads of about 37 GW.
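It helps to translate that 0.25% figure into time. One loose reading (my back-of-the-envelope arithmetic, not a calculation from the paper itself) treats it as a fraction of the 8,760 hours in a year:

```python
# Translating "~0.25% annual curtailment" into hours of flexibility.
hours_per_year = 8760
curtailment_rate = 0.0025   # ~0.25% of annual hours, read loosely

curtailed_hours = curtailment_rate * hours_per_year
print(f"{curtailed_hours:.0f} hours/year, "
      f"about {curtailed_hours / 12:.1f} hours/month")
# -> roughly 22 hours a year. If new data-center load can ramp down or
#    defer work for about two hours a month during system peaks, that
#    headroom becomes usable without building new generation.
```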
This work has understandably drawn keen attention from tech companies, utilities, and regulators alike. The findings have enormous implications not only for longer-term questions of resource adequacy, but for short-term grid operations as well.
The future: Where flexible generation meets flexible demand
Matching electricity supply with demand has always been the core task of grid operators. But what if we could optimize both sides of that equation at once?
That’s the Holy Grail: scheduling flexible loads (like data centers) in tandem with flexible generation and storage. And it’s more important than ever. AI isn’t just driving up average demand – it’s increasing variability. These workloads are bursty and unpredictable, driven by erratic training schedules and surges in user demand (hello, GPT-5). Capturing that uncertainty in our models – and building systems to manage it – will be essential to creating a resilient, efficient grid.
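As a concrete, if simplified, illustration: the sketch below schedules a hypothetical deferrable training job against an assumed hourly cost curve, filling the cheapest hours first. A greedy pass stands in for the full joint optimization of flexible load, generation, and storage, and all numbers are invented.

```python
import numpy as np

# Hypothetical setup: a deferrable training job needs 12 MWh within an
# 8-hour window and can draw at most 3 MW in any hour. We place it into
# the lowest-cost hours first, a greedy stand-in for joint dispatch.
marginal_cost = np.array([35.0, 30, 25, 60, 90, 85, 40, 28])  # $/MWh
energy_needed = 12.0   # MWh the job must receive before its deadline
max_draw = 3.0         # MW cap per hour

schedule = np.zeros_like(marginal_cost)
for h in np.argsort(marginal_cost):            # cheapest hours first
    schedule[h] = min(max_draw, energy_needed - schedule.sum())
    if schedule.sum() >= energy_needed:
        break

print("hour:", "  ".join(str(h) for h in range(len(schedule))))
print("MW:  ", "  ".join(f"{p:.0f}" for p in schedule))
# The job lands in hours 0-2 and 7, sitting out the evening peak
# (hours 4-5): exactly the kind of load shaping described above.
```

In a real system the cost curve itself depends on how generation and storage are dispatched, which is what makes co-optimizing both sides, under uncertainty, the hard and valuable problem.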
Duke’s role: Collaboration, innovation, and impact
I am grateful to be part of a vibrant community at Duke that’s engaging with these questions from every angle.
This spring, I moderated a terrific panel at the EDGE Executive Council on AI and sustainability, featuring experts from Google’s “AI for Sustainability” team, Sust Global, and Stem, Inc. Kudos to my EDGE teammates Katie Kross and Dan Vermeer for organizing.
Load growth was also a recurring theme at this year’s “Billions to Trillions” summit at Duke in early April, focused on how public policy can catalyze private investment in climate solutions. The summit was the sequel to the inaugural Billions to Trillions event in 2024, and I was delighted to see Duke’s convening power on full display again, with a packed Geneen Auditorium at Fuqua.
Duke’s new Deep Tech Initiative, founded by Fuqua and OpenAI colleague Ronnie Chatterji and led by Sanford Professor David Hoffman, is another powerful platform. Alongside focal areas such as cybersecurity and quantum computing, environmental sustainability is a key pillar. Several Deep Tech affiliates are already collaborating to advance sustainable AI research, boosted by efforts like Rethinking Load Growth and related new funding from the ALIGN Project (Advanced Load Integration for Grid Needs) at Duke, which aims to take a deeper dive into the impact potential of load flexibility.
On the policy front, I had the pleasure of joining a related “Sustainable AI Policy” dinner in Washington, D.C., hosted by the Nicholas Institute and Grove Climate Group. The event convened around 40 leaders from tech, utilities, markets, and government – exactly the kind of cross-sector dialogue we need. As sustainable AI research efforts at Duke continue to ramp up, I expect an uptick in events like this.
Keeping the conversation going
Many of you in the EDGE community are tackling these and related challenges in your own work. Please stay in touch – we’d love to hear from you, learn from your experiences, and continue building a future where AI and sustainability go hand in hand.