MLOps.community podcast

Inside Uber’s AI Revolution - Everything about how they use AI/ML


Kai Wang joins the MLOps Community podcast LIVE to share how Uber built and scaled its ML platform, Michelangelo. From mission-critical models to tools for both beginners and experts, he walks us through Uber’s AI playbook—and teases plans to open-source parts of it.

// Bio


Kai Wang is the product lead of the AI platform team at Uber, overseeing Uber's internal end-to-end ML platform, Michelangelo, which powers 100% of Uber's business-critical ML use cases.


// Related Links


Uber GenAI: https://www.uber.com/blog/from-predictive-to-generative-ai/


#uber #podcast #ai #machinelearning


~~~~~~~~ ✌️Connect With Us ✌️ ~~~~~~~


Catch all episodes, blogs, newsletters, and more: https://go.mlops.community/TYExplore

MLOps Swag/Merch: https://shop.mlops.community/

Connect with Demetrios on LinkedIn: /dpbrinkm

Connect with Kai on LinkedIn: /kai-wang-67457318/

Timestamps:


[00:00] Rethinking AI Beyond ChatGPT

[04:01] How Devs Pick Their Tools

[08:25] Measuring Dev Speed Smartly

[10:14] Predictive Models at Uber

[13:11] When ML Strategy Shifts

[15:56] Smarter Uber Eats with AI

[19:29] Summarizing Feedback with ML

[23:27] GenAI That Users Notice

[27:19] Inference at Scale: Michelangelo

[32:26] Building Uber’s AI Studio

[33:50] Faster AI Agents, Less Pain

[39:21] Evaluating Models at Uber

[42:22] Why Uber Open-Sourced Michelangelo

[44:32] What Fuels Uber’s AI Team
