Hi there,
Welcome back to Untangled. It’s written by me, Charley Johnson, and supported by members like you. Help me make it better?
This week I’m sharing a conversation I had with Beth Rudden — founder of Bast AI, former chief data officer for a $34 billion division at IBM, and someone building a genuinely different vision of what AI could be.
🏡 Untangled HQ
Coming Up
* Stewarding Complexity: Our next session is about finding and using the agency you actually have — even inside institutions that weren’t designed for it.
* Untangled Collective: Your expense approval workflow is making decisions. So is your classification system, your algorithm, and your org chart. This session gives you a map of all of it — and shows you where to actually push.
* Stewarding AI: How to Build Responsible Principles, Workflows, and Practices will take place July 3, 10, 17, and 24. It opens to the waitlist tomorrow. Enrollment is capped, so join the waitlist if you want dibs on signing up.
🧶 Deep Dive
Your data isn’t exhaust. It’s a belonging.
Even the tech CEOs with the most to lose from the narrative bubble popping are quietly conceding that the scaling law was never actually a law. We’ll eventually let go of the equally silly notion that intelligence — or AGI, or whatever we’re calling it this quarter — is simply an emergent property of scale. Probably around the same time we admit that attaching sensors to people’s extremities was not the path to ‘embodied intelligence.’ Anyway!
In the meantime, the story props up the technology. And the technology keeps doing what it does: making up false information, encoding historical biases as neutral truth, and generating a mix of sloppy and genuinely useful outputs.
Because we’ve anointed a few tech CEOs as our AI-narrators-in-chief, they get to decide what the data represents and what it means. Knowledge! Intelligence! Truth! Beth is building an alternative system that allows meaning to form the old-fashioned way: through interactions between people and systems.
The critique starts with a claim about data that sounds simple but isn't: decontextualized data doesn't contain meaning. It carries patterns and associations. The distinction is fundamentally about whose meaning and knowledge grounds the AI system. This might sound academic, but it matters a great deal. Take health care as an example: as Beth notes, seventy percent of patients don't fully understand their outpatient procedures. A caregiver asks "why is my husband acting weird after his accident?" The clinical record says "behavioral dysregulation." The gap between those two descriptions is where comprehension lives, and it's invisible to any system that treats both as equivalent tokens.
When patients and caregivers interact with clinical information, they generate something that doesn't exist anywhere else: a record of how humans actually try to understand medical knowledge, where they get stuck, what vocabulary they use, and what they're really asking beneath the surface question. Beth calls this interaction data, and it's where meaning lives.
From this you can start to build an ontology, a formal map of what exists within a domain and how concepts relate to each other. Here are the concepts in this field, here is how they connect, here is where each piece of knowledge sits relative to everything else. Without such a map to understand against, AI systems simply produce statistical approximation rather than understanding. They pattern-match from frequency with no principled sense of how the patterns relate. The ontology is what offers the system ground truth.
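One way to picture an ontology is as a set of typed relations between concepts, the same triple structure used in knowledge graphs. Here is a minimal sketch in Python; the concept names and relation labels are illustrative examples drawn from the caregiver scenario above, not Bast AI's actual schema.

```python
# Minimal sketch: an ontology as a set of (subject, relation, object)
# triples. All terms below are illustrative, not a real clinical schema.

class Ontology:
    def __init__(self):
        self.triples = set()

    def add(self, subject, relation, obj):
        self.triples.add((subject, relation, obj))

    def related(self, subject, relation):
        # Everything connected to `subject` by `relation`.
        return {o for s, r, o in self.triples
                if s == subject and r == relation}

onto = Ontology()
onto.add("behavioral dysregulation", "is_a", "clinical term")
onto.add("behavioral dysregulation", "plain_language",
         "acting differently than usual")
onto.add("behavioral dysregulation", "associated_with",
         "traumatic brain injury")

# With this map, a system can ground a caregiver's plain-language
# question in the clinical concept instead of treating the two
# phrasings as unrelated tokens.
print(onto.related("behavioral dysregulation", "plain_language"))
# → {'acting differently than usual'}
```

The point of the sketch is the structure, not the scale: each triple records a human judgment about how two concepts relate, which is exactly the kind of context that frequency-based pattern matching discards.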
This isn’t an approach without challenges. Every organization contains multiple competing ontologies. The C-suite has one map of how knowledge is organized. Frontline workers have another. These disagreements aren’t accidental — they reflect different positions in the power structure, different relationships to risk. When you formalize an ontology, you’re making a political choice about whose map becomes the standard. But I’d much rather make an intentional choice about what knowledge matters than no choice at all — and you can navigate through this complexity by triangulating across different perspectives representing different positionalities.
Beth has long described data as an artifact of human experience — carrying the fingerprints of its making, the lineage of decisions. But during a recent museum visit in Vancouver, a curator explained how her institution approaches Indigenous collections: these aren’t artifacts in our care. As Beth explains, they’re belongings. Artifacts can be extracted, cataloged, and owned. Belongings require consent and ongoing relationship with their communities of origin. Data isn’t an artifact of human experience. Data is a belonging.
The current AI economy is built on the opposite assumption — harvesting people’s data without consent, using poorly compensated annotators, treating the exhaust of human experience as raw material. I couldn’t agree more with the alternative vision Beth is articulating: people whose data contributes to AI systems get compensated. They choose whether to monetize their experiences. The lineage and provenance aren’t overhead. They’re the infrastructure.
That’s a long way from where we are. But I left the conversation feeling hopeful knowing someone is building toward it.
🙏 Share & Earn
Help me build this community of people thinking differently about technology and earn free rewards (e.g. 1:1 coaching sessions, even free entry into one of my courses). Just share your personal link far and wide.
💫 Work With Me
Here are 4 ways I can help:
* Facilitation: I can help facilitate your team through complex and fraught dynamics, so that they can achieve their purpose.
* Advising: I can help you navigate uncertainty, make sense of AI, and facilitate change in your system.
* Organizational Training: Everything you and your team need to cut through the tech-hype and implement strategies that catalyze true systems change. (For either Stewarding AI or Systems Change for Tech & Society Leaders)
* 1:1 Leadership Coaching: I can help you facilitate change — in yourself, your organization, and the system you work within.
This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit untangled.substack.com