The Daily AI Show podcast

From Pokémon Go to Open Jarvis


This episode focused on the shift toward local, always-on AI systems and the tools making that possible. The conversation started with Pokémon Go as an example of users generating valuable spatial AI data, then moved into NVIDIA GTC, inference hardware, and the broader push toward on-device agents. The second half centered on building workflows with Claude Code, Open Jarvis, mobile coding limitations, Google’s new embeddings model, and how agent permissions change the way people work with coding tools.


Key Points Discussed


00:01:38 Pokémon Go as unpaid spatial AI field work

00:07:13 NVIDIA GTC and the shift from training to inference

00:10:08 How chipmakers plan for agentic AI and local inference

00:24:16 Stanford Open Jarvis and fully on-device personal AI agents

00:30:05 Beth’s Podcast Buddy build and weekend app experiments with Claude Code

00:31:45 Claude’s one million token context window discussion

00:34:06 Claude usage limits doubling outside peak hours

00:35:45 What Claude Code on a phone can and cannot do

00:38:34 Google’s new embeddings model for locating objects and multimodal search

00:41:05 Brian’s cruise ship hot-and-cold app idea using geolocation and embeddings

00:43:29 How Claude remote works from a phone

00:50:41 Bypass permissions mode and the risks of letting coding agents run freely

00:55:37 Codex full access mode and why Karl prefers its UI

00:58:57 Brian’s story about building for fun versus building on deadline


The Daily AI Show Co-Hosts: Brian Maucere, Andy Halliday, Beth Lyons, and Karl Yeh
