The Data Center Frontier Show

Welcome to The Data Center Frontier Show podcast, telling the story of the data center industry and its future. Our podcast is hosted by the editors of Data Center Frontier, who are your guide to the ongoing digital transformation, explaining how next-generation technologies are changing our world, and the critical role the data center industry plays in creating this extraordinary future.

Listen on:

  • Apple Podcasts
  • Podbean App
  • Spotify
  • Amazon Music
  • iHeartRadio
  • PlayerFM
  • Podchaser
  • BoomPlay

Episodes

Wednesday Dec 10, 2025

In this panel session from the 2025 Data Center Frontier Trends Summit (Aug. 26-28) in Reston, Va., JLL’s Sean Farney moderates a high-energy panel on how the industry is fast-tracking AI capacity in a world of power constraints, grid delays, and record-low vacancy.
Under the banner “Scaling AI: The Role of Adaptive Reuse and Power-Rich Sites in GPU Deployment,” the discussion dives into why U.S. colocation vacancy is hovering near 2%, how power has become the ultimate limiter on AI revenue, and what it really takes to stand up GPU-heavy infrastructure at speed.
Schneider Electric’s Lovisa Tedestedt, Aligned Data Centers’ Phill Lawson-Shanks, and Sapphire Gas Solutions’ Scott Johns unpack the real-world strategies they’re deploying today—from adaptive reuse of industrial sites and factory-built modular systems, to behind-the-fence natural gas, microgrids, and emerging hydrogen and RNG pathways. Along the way, they explore the coming “AI inference edge,” the rebirth of the enterprise data center, and how AI is already being used to optimize data center design and operations.
During this talk, you’ll learn:
* Why record-low vacancy and long interconnection queues are reshaping AI deployment strategy.
* How adaptive reuse of legacy industrial and commercial real estate can unlock gigawatt-scale capacity and community benefits.
* The growing role of liquid cooling, modular skids, and grid-to-chip efficiency in getting more power to GPUs.
* How behind-the-meter gas, virtual pipelines, and microgrids are bridging multi-year grid delays.
* Why many experts expect a renaissance of enterprise data centers for AI inference at the edge.
Moderator:
Sean Farney, VP, Data Centers, Jones Lang LaSalle (JLL)
Panelists:
Tony Grayson, General Manager, Northstar
Lovisa Tedestedt, Strategic Account Executive – Cloud & Service Providers, Schneider Electric
Phill Lawson-Shanks, Chief Innovation Officer, Aligned Data Centers
Scott Johns, Chief Commercial Officer, Sapphire Gas Solutions

Tuesday Dec 09, 2025

Recorded live at the 2025 Data Center Frontier Trends Summit in Reston, VA, this panel brings together leading voices from the utility, IPP, and data center worlds to tackle one of the defining issues of the AI era: power.
Moderated by Buddy Rizer, Executive Director of Economic Development for Loudoun County, the session features:
Jeff Barber, VP Global Data Centers, Bloom Energy
Bob Kinscherf, VP National Accounts, Constellation
Stan Blackwell, Director, Data Center Practice, Dominion Energy
Joel Jansen, SVP Regulated Commercial Operations, American Electric Power
David McCall, VP of Innovation, QTS Data Centers
Together they explore how hyperscale and AI workloads are stressing today’s grid, why transmission has become the critical bottleneck, and how on-site and behind-the-meter solutions are evolving from “bridge power” into strategic infrastructure.
The panel dives into the role of gas-fired generation and fuel cells, emerging options like SMRs and geothermal, the realities of demand response and curtailment, and what it will take to recruit the next generation of engineers into this rapidly changing ecosystem.
If you want a grounded, candid look at how energy providers and data center operators are working together to unlock new capacity for AI campuses, this conversation is a must-listen.

Monday Dec 08, 2025

Live from the Data Center Frontier Trends Summit 2025 – Reston, VA
In this episode, we bring you a featured panel from the Data Center Frontier Trends Summit 2025 (Aug. 26-28), sponsored by Schneider Electric. DCF Editor in Chief Matt Vincent moderates a fast-paced, highly practical conversation on what “AI for good” really looks like inside the modern data center—both in how we build for AI workloads and how we use AI to run facilities more intelligently.
Expert panelists included:
Steve Carlini, VP, Innovation and Data Center Energy Management Business, Schneider Electric
Sudhir Kalra, Chief Data Center Operations Officer, Compass Datacenters
Andrew Whitmore, VP of Sales, Motivair
Together they unpack:
How AI is driving unprecedented scale—from megawatt data halls to gigawatt AI “factories” and 100–600 kW rack roadmaps
What Schneider and NVIDIA are learning from real-world testing of Blackwell and NVL72-class reference designs
Why liquid cooling is no longer optional for high-density AI, and how to retrofit thousands of brownfield, air-cooled sites
How Compass is using AI, predictive analytics, and condition-based maintenance to cut manual interventions and OPEX
The shift from “constructing” to assembling data centers via modular, prefab approaches
The role of AI in grid-aware operations, energy storage, and more sustainable build and operations practices
Where power architectures, 800V DC, and industry standards will take us over the next five years
If you want a grounded, operator-level view into how AI is reshaping data center design, cooling, power, and operations—beyond the hype—this DCF Trends Summit session is a must-listen.

Tuesday Dec 02, 2025

On this episode of The Data Center Frontier Show, Editor in Chief Matt Vincent sits down with Rob Campbell, President of Flex Communications, Enterprise & Cloud, and Chris Butler, President of Flex Power, to unpack Flex’s bold new integrated data center platform as unveiled at the 2025 OCP Global Summit.
Flex says the AI era has broken traditional data center models, pushing power, cooling, and compute to the point where they can no longer be engineered separately. Their answer is a globally manufactured, pre-engineered platform that unifies these components into modular pods and skids, designed to cut deployment timelines by up to 30 percent and support gigawatt-scale AI campuses.
Rob and Chris explain how Flex is blending JetCool’s chip-level liquid cooling with scalable rack-level CDUs; how higher-voltage DC architectures (400V today, 800V next) will reshape power delivery; and why Flex’s 110-site global manufacturing footprint gives it a unique advantage in speed and resilience.
They also explore Flex’s lifecycle intelligence strategy, the company’s circular-economy approach to modular design, and their view of the “data center of 2030”—a landscape defined by converged power and IT, liquid cooling as default, and modular units capable of being deployed in 30–60 days.
It’s a deep look at how one of the world’s largest manufacturers plans to redefine AI-scale infrastructure.

Friday Nov 28, 2025

Artificial intelligence is completely changing how data centers are built and operated. What used to be relatively stable IT environments are now turning into massive power ecosystems. The main reason is simple — AI workloads need far more computing power, and that means far more energy.
We’re already seeing a sharp rise in total power consumption across the industry, but what’s even more striking is how much power is packed into each rack. Not long ago, most racks were designed for 5 to 15 kilowatts. Today, AI-heavy setups are hitting 50 to 70 kW, and the next generation could reach up to 1 megawatt per rack. That’s a huge jump — and it’s forcing everyone in the industry to rethink power delivery, cooling, and overall site design.
At those levels, traditional AC power distribution starts to reach its limits. That’s why many experts are already discussing a move toward high-voltage DC systems, possibly around 800 volts. DC systems can reduce conversion losses and handle higher densities more efficiently, which makes them a serious option for the future.
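The efficiency argument above comes down to simple multiplication: each conversion stage loses a few percent, and the losses compound. The sketch below illustrates that arithmetic with assumed per-stage efficiencies — the figures are illustrative placeholders, not measurements from the episode.

```python
# Illustrative comparison of cumulative conversion efficiency for a
# multi-stage AC distribution chain vs. a simplified ~800 V DC chain.
# All per-stage efficiencies below are assumptions for illustration.
from functools import reduce

def chain_efficiency(stage_efficiencies):
    """Overall efficiency is the product of each stage's efficiency."""
    return reduce(lambda a, b: a * b, stage_efficiencies, 1.0)

# Typical AC path: UPS double conversion, PDU transformer, rack PSU, DC-DC to chip
ac_stages = [0.96, 0.98, 0.94, 0.97]
# Simplified high-voltage DC path: one rectification stage, then DC-DC to chip
dc_stages = [0.975, 0.97]

ac_eff = chain_efficiency(ac_stages)   # ~0.86
dc_eff = chain_efficiency(dc_stages)   # ~0.95

rack_load_kw = 600  # a next-generation rack figure from the discussion above
print(f"AC chain: {ac_eff:.1%}, DC chain: {dc_eff:.1%}")
print(f"Input power for a {rack_load_kw} kW rack: "
      f"AC {rack_load_kw / ac_eff:.0f} kW vs DC {rack_load_kw / dc_eff:.0f} kW")
```

Even with generous assumptions, fewer stages means less waste heat to remove — which is why the density jump and the DC discussion are two sides of the same problem.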
But with all this growth comes a big question: how do we stay responsible? Data centers are quickly becoming some of the largest power users on the planet. Society is starting to pay attention, and communities near these sites are asking fair questions — where will all this power come from, and how will it affect the grid or the environment? Building ever-bigger data centers isn’t enough; we need to make sure they’re sustainable and accepted by the public.
The next challenge is feasibility. Supplying hundreds of megawatts to a single facility is no small task. In many regions, grid capacity is already stretched, and new connections take years to approve. Add the unpredictable nature of AI power spikes, and you’ve got a real engineering and planning problem on your hands. The only realistic path forward is to make data centers more flexible — to let them pull energy from different sources, balance loads dynamically, and even generate some of their own power on-site.
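The "pull energy from different sources" idea can be sketched as a simple priority dispatch: serve the load from preferred sources first and fall back to the next tier as each one saturates. The source names and capacities below are hypothetical, and this is a toy model, not any vendor's actual control logic.

```python
# Minimal priority-based dispatch across multiple energy sources.
# Hypothetical source names/capacities; illustrative only.

def dispatch(demand_kw, sources):
    """Fill demand from sources in priority order; return per-source draw."""
    allocation = {}
    remaining = demand_kw
    for name, capacity_kw in sources:
        draw = min(remaining, capacity_kw)
        allocation[name] = draw
        remaining -= draw
        if remaining <= 0:
            break
    if remaining > 0:
        raise RuntimeError(f"Unserved load: {remaining} kW")
    return allocation

# Prefer on-site renewables, then the grid connection, then backup generation.
sources = [("solar", 2_000), ("grid", 10_000), ("gensets", 5_000)]
print(dispatch(12_500, sources))
# → {'solar': 2000, 'grid': 10000, 'gensets': 500}
```

A real controller would add ramp rates, spinning reserve, and price signals, but the core balancing decision is this waterfall.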
That’s where ComAp’s systems come in. We help data center operators manage this complexity by making it simple to connect and control multiple energy sources — from renewables like solar or wind, to backup generators, to grid-scale connections. Our control systems allow operators to build hybrid setups that can adapt in real time, reduce emissions, and still keep reliability at 100%.
Just as importantly, ComAp helps with the grid integration side. When a single data center can draw as much power as a small city, it’s no longer just a “consumer” — it becomes part of the grid ecosystem. Our technology helps make that relationship smoother, allowing these large sites to interact intelligently with utilities and maintain overall grid stability.
And while today’s discussion is mostly around AC power, ComAp is already ready for the DC future. The same principles and reliability that have powered AC systems for decades will carry over to DC-based data centers. We’ve built our solutions to be flexible enough for that transition — so operators don’t have to wait for the technology to catch up.
In short, AI is driving a complete rethink of how data centers are powered. The demand and density will keep rising, and the pressure to stay responsible and sustainable will only grow stronger. The operators who succeed will be those who find smart ways to integrate different energy sources, keep efficiency high, and plan for the next generation of infrastructure.
That’s the space where ComAp is making a real difference.

Tuesday Nov 25, 2025

In this episode of the DCF Show podcast, Data Center Frontier Editor in Chief Matt Vincent sits down with Bill Severn, CEO of 1623 Farnam, to explore how the Omaha carrier hotel is becoming a critical aggregation hub for AI, cloud, and regional edge growth. A featured speaker on The Distributed Data Frontier panel at the 2025 DCF Trends Summit, Severn frames the edge not as a location but as the convergence of eyeballs, network density, and content—a definition that underpins Farnam’s strategy and rise in the Midwest.
Since acquiring the facility in 2018, 1623 Farnam has transformed an underappreciated office tower on the 41st parallel into a thriving interconnection nexus with more than 40 broadband providers, 60+ carriers, and a growing hyperscale presence. The AI era is accelerating that momentum: over 5,000 new fiber strands are being added to the building, with another 5,000 strands expanding Meet-Me Room capacity in 2025 alone. Severn remains bullish on interconnection for the next several years as hyperscalers plan deployments out to 2029 and beyond.
The conversation also dives into multi-cloud routing needs across the region—where enterprises increasingly rely on Farnam for direct access to Google Central, Microsoft ExpressRoute, and global application-specific cloud regions. Energy efficiency has become a meaningful differentiator as well, with the facility operating below a 1.5 PUE, thanks to renewable chilled water, closed-loop cooling, and extensive free cooling cycles.
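For readers unfamiliar with the metric: PUE (Power Usage Effectiveness) is total facility energy divided by IT equipment energy, so 1.0 is the theoretical ideal and lower is better. A quick sketch, with illustrative numbers rather than 1623 Farnam's actual meter readings:

```python
# PUE = total facility energy / IT equipment energy. Illustrative values only.

def pue(total_facility_kw, it_load_kw):
    """Compute Power Usage Effectiveness from concurrent power readings."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Example: 10 MW of IT load plus 4 MW of cooling, lighting, and distribution
# overhead yields a PUE of 1.4 — under the sub-1.5 figure cited above.
print(pue(14_000, 10_000))
# → 1.4
```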
Severn highlights a growing emphasis on strategic content partnerships that help CDNs and providers justify regional expansion, pointing to past co-investments that rapidly scaled traffic from 100G to more than 600 Gbps. Meanwhile, AI deployments are already arriving at pace, requiring collaborative engineering to fit cabinet weight, elevator limitations, and 40–50 kW rack densities within a non–purpose-built structure.
As AI adoption accelerates and interconnection demand surges across the heartland, 1623 Farnam is positioning itself as one of the Midwest’s most important digital crossroads—linking hyperscale backbones, cloud onramps, and emerging AI inference clusters into a cohesive regional edge.

Friday Nov 21, 2025

In this episode, Matt Vincent, Editor in Chief at Data Center Frontier, is joined by Rob Macchi, Vice President of Data Center Solutions at Wesco, to explore how companies can stay ahead of the curve with smarter, more resilient construction strategies. From site selection to integrating emerging technologies, Wesco helps organizations build data centers that are not only efficient but future-ready. Listen now to learn more!

Tuesday Nov 18, 2025

In this episode of the Data Center Frontier Show, we sit down with Ryan Mallory, the newly appointed CEO of Flexential, following an October leadership transition from his predecessor, Chris Downie.
Mallory outlines Flexential's strategic focus on the AI-driven future, positioning the company at the critical "inference edge" where enterprise CPU meets AI GPU. He breaks down the AI infrastructure boom into a clear three-stage build cycle and explains why the enterprise "killer app"—Agentic AI—plays directly into Flexential's strengths in interconnection and multi-tenant solutions.
We also dive into:
Power Strategy: How Flexential's modular, 36-72 MW build strategy avoids community strain and wins utility favor.
Product Roadmap: The evolution to Gen 5 and Gen 6 data centers, blending air and liquid cooling for mixed-density AI workloads.
The Bold Bet: Mallory's vision for the next 2-3 years, which involves "bending the physics curve" with geospatial energy and transmission to overcome terrestrial limits.
Tune in for an insightful conversation on power, planning, and the future of data center infrastructure.

Thursday Nov 13, 2025

On this episode of the Data Center Frontier Show, DartPoints CEO Scott Willis joins Editor in Chief Matt Vincent to discuss why regional data centers are becoming central to the future of AI and digital infrastructure. Fresh off his appearance on the Distributed Edge panel at the 2025 DCF Trends Summit, Willis breaks down how DartPoints is positioning itself in non-tier-one markets across the Midwest, Southeast, and South Central regions—locations he believes will play an increasingly critical role as AI workloads move closer to users.
Willis explains that DartPoints’ strategy hinges on a deeply interconnected regional footprint built around carrier-rich facilities and strong fiber connectivity. This fabric is already supporting latency-sensitive workloads such as AI inference and specialized healthcare applications, and Willis expects that demand to accelerate as enterprises seek performance closer to population centers.
Following a recent recapitalization with NOVA Infrastructure and Orion Infrastructure Capital, DartPoints has launched four new expansion sites designed from the ground up for higher-density, AI-oriented workloads. These facilities target rack densities from 30 kW to 120 kW and are sized in the 10–50 MW range—large enough for meaningful HPC and AI deployments but nimble enough to move faster than hyperscale builds constrained by long power queues.
Speed to market is a defining advantage for DartPoints. Willis emphasizes the company’s focus on brownfield opportunities where utility infrastructure already exists, reducing deployment timelines dramatically. For cooling, DartPoints is designing flexible environments that leverage advanced air systems for 30–40 kW racks and liquid cooling for higher densities, ensuring the ability to support the full spectrum of enterprise, HPC, and edge-adjacent AI needs.
Willis also highlights the importance of community partnership. DartPoints’ facilities have smaller footprints and lower power impact than hyperscale campuses, allowing the company to serve as a local economic catalyst while minimizing noise and aesthetic concerns.
Looking ahead to 2026, Willis sees the industry entering a phase where AI demand becomes broader and more distributed, making regional markets indispensable. DartPoints plans to continue expanding through organic growth and targeted M&A while maintaining its focus on interconnection, high-density readiness, and rapid, community-aligned deployment.
Tune in to hear how DartPoints is shaping the next chapter of distributed digital infrastructure—and why the market is finally moving toward the regional edge model Willis has championed.

Tuesday Nov 11, 2025

In this episode of the Data Center Frontier Show, DCF Editor-in-Chief Matt Vincent speaks with Ed Nichols, President and CEO of Expanse Energy / RRPT Hydro, and Gregory Tarver, Chief Electrical Engineer, about a new kind of hydropower built for the AI era.
RRPT Hydro’s piston-driven gravity and buoyancy system generates electricity without dams or flowing rivers—using the downward pull of gravity and the upward lift of buoyancy in sealed cylinders. Once started, the system runs self-sufficiently, producing predictable, zero-emission power.
Designed for modular, scalable deployment—from 15 kW to 1 GW—the technology can be installed underground or above ground, enabling data centers to power themselves behind the meter while reducing grid strain and even selling excess energy back to communities.
At an estimated Levelized Cost of Energy of $3.50/MWh, RRPT Hydro could dramatically undercut traditional renewables and fossil power. The company is advancing toward commercial readiness (TRL 7–9) and aims to build a 1 MW pilot plant within 12–15 months.
Nichols and Tarver describe this moonshot innovation, introduced at the 2025 DCF Trends Summit, as a “Wright Brothers moment” for hydropower—one that could redefine sustainable baseload energy for data centers and beyond.
Listen now to explore how RRPT Hydro’s patented piston-driven system could reshape the physics, economics, and deployment model of clean energy.

Copyright Data Center Frontier LLC © 2019
