Tech Field Day Podcast

62. A Different Type of Datacenter is Needed for AI

Learn More from the AI Infrastructure Field Day Presentations

AI demands specialized data center designs because its hardware utilization and networking needs differ from those of general-purpose computing. This Tech Field Day Podcast episode features Denise Donohue, Karen Lopez, Lino Telera, and Alastair Cooke. Network design has been a consistent part of the AI infrastructure discussions at Tech Field Day events. The need for a dedicated network to interconnect GPUs sets AI training and fine-tuning networks apart from general-purpose computing networks. The vast power demand of high-density GPU servers points to a further need for different data centers with liquid cooling and massive power distribution.

Model training is only one part of the AI pipeline; business value is delivered by AI inference, which has a different set of needs and a closer eye on financial management. Inference will likely require servers with GPUs and high-speed local storage, but not the same networking density as training and fine-tuning. Inference will also need servers adjacent to the general-purpose infrastructure that runs existing business applications. Some businesses may be able to fit their AI applications into their existing data centers, but many will need to build or rent new infrastructure.
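To give a rough sense of the power discussion, the sketch below estimates rack-level draw for dense GPU servers. The per-GPU wattage, per-server overhead, and servers-per-rack figures are illustrative assumptions, not numbers cited in the episode.

```python
# Back-of-the-envelope rack power estimate for dense GPU training servers.
# All figures are illustrative assumptions: ~700 W per training-class GPU,
# 8 GPUs per server, ~2.5 kW of CPU/NIC/fan/storage overhead per server,
# and 4 such servers per rack.

GPU_WATTS = 700           # assumed per-GPU draw under training load
GPUS_PER_SERVER = 8       # common density for training servers
SERVER_OVERHEAD_W = 2500  # assumed CPUs, NICs, fans, local storage
SERVERS_PER_RACK = 4      # assumed rack population

server_kw = (GPU_WATTS * GPUS_PER_SERVER + SERVER_OVERHEAD_W) / 1000
rack_kw = server_kw * SERVERS_PER_RACK

print(f"~{server_kw:.1f} kW per server, ~{rack_kw:.1f} kW per rack")
# ~8.1 kW per server, ~32.4 kW per rack -- well beyond the power and
# air-cooling budgets many existing enterprise racks were designed for,
# which is why liquid cooling and heavier power distribution come up.
```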

Host:

Alastair Cooke, Tech Field Day Event Lead

Panelists:

Karen Lopez

Lino Telera

Denise Donohue

Follow the Tech Field Day Podcast on X/Twitter or on Bluesky and use the hashtag #TFDPodcast to join the discussion. Listen to more episodes on the podcast page of the website.

Follow Tech Field Day for more information on upcoming and current event coverage on X/Twitter, on Bluesky, and on LinkedIn, or visit our website.
