The Data Center Frontier Show

Nomads at the Summit: AI Models and their Corresponding Infrastructure Needs

Update: 2025-09-25
Description

In this DCF Trends "Nomads at the Summit" podcast episode, Chris James, CEO of NoesisAI, delivers a sweeping, insight-rich overview of how different classes of AI models—from LLMs and RAG to vision AI and scientific workloads—are driving a new wave of infrastructure decisions across the data center landscape. With a sharp focus on the diverging needs of training versus inference, James breaks down what it takes to support today's AI, from GPU-intensive clusters with high-speed interconnects and liquid cooling to inference-optimized, edge-deployed accelerators. He also explores the rapidly shifting hardware ecosystem, including the rise of custom silicon and heterogeneous computing, and where the battle between NVIDIA, AMD, Intel, and hyperscaler-designed chips is headed. Whether you're designing for scalability, sustainability, or the bleeding edge, this conversation offers a field guide to the infrastructure behind intelligent computing.

Endeavor Business Media