The Artificial Intelligence Podcast

Interview #84 Hagay Lupesko, SVP of AI Inference at Cerebras Systems

Duration: 47:34

Join Hagay Lupesko, SVP of AI Inference at Cerebras Systems, for a deep dive into the rapidly evolving world of AI inference. Hagay breaks down why inference has overtaken training as the dominant AI workload, how Cerebras' wafer-scale chip architecture delivers 10-20x faster performance than NVIDIA GPUs, and why CUDA is no longer the moat many think it is. He also explains why DeepSeek wiping $600 billion off NVIDIA's market cap in a single day was both a foundational and a deeply misunderstood moment for the industry, discusses the growing energy crisis in AI infrastructure, and considers what it will take to support the explosive rise of AI agents in the enterprise.
