
What is your data trying to tell you? In this episode, John Dues talks to Andrew Stotz about why most leaders misread data, overreact to single results, and miss the real story. Discover how Deming thinking exposes when change is truly happening, and how to use a process behavior chart to listen to what your system is actually saying. Plus, find out why nine years of 'stable' results may still demand transformation. Tune in and rethink data-driven leadership!
0:00:02.2 Andrew Stotz: My name is Andrew Stotz, and I'll be your host as we dive deeper into the teachings of Dr. W. Edwards Deming. Today, I'm continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. The topic for today is understanding variation is the key to data analysis. John, take it away.
0:00:27.8 John Dues: Andrew, it's good to be back. Yeah. So, we've just started the school year in Ohio, so I thought doing a session on goal setting would be a good place to kick off the year. And I was thinking a lot of leaders, school leaders and leaders in general, are setting goals around this time period. And I was really thinking about having this Deming lens. I was thinking, how did I set goals before I sort of started understanding this approach? And it's, you know, this is one of those things where if you really stop and think about it, goal setting is a lot harder than it seems at first glance. Things like, how do you set a reasonable goal? And then once you've gotten to that place, how do you know if things are improving? How do you know if things are getting worse? And I was thinking how powerful this understanding variation method is for folks that may be struggling with those questions.
0:01:32.9 Andrew Stotz: Yeah. In fact, that's a great question for the listener and the viewer. Like, how do you set goals? How did you set goals in the past? How have you improved that? And I was thinking when you were speaking, I didn't set goals. I gave proclamations. You know, I want to see 20% of this and that. And they were just stretch targets without any means or methods. So yeah, interesting.
0:01:55.2 John Dues: Yeah. How do you set the target? Was it arbitrary? Is it based on some standard that you heard somewhere? A lot of times you have no idea what's behind that target, or you've associated it with something that's familiar. Like in my case, we often set goals that sort of mimic the grade scale. So, you know, 80% is a common goal for something like test scores, you know.
0:02:23.7 Andrew Stotz: But they don't even call them goals anymore. They call them, let me remember, I think it's called KPIs.
0:02:30.0 John Dues: KPIs, targets, you know, lots of different things for sure. And I think what I've seen is that a lot of the reason goal setting is so hard is because, well, one, you misinterpret your data in the first place. And a lot of that misinterpretation, at least in the education sector, is because leaders don't have the knowledge. They don't know about natural variation. They're typically making comparisons between some current performance level and some previous value. But those two data points don't convey the behavior of that data across time. So, what we do, and what I did before I discovered this method, is overreact to a single data point. Probably less frequently, you underreact to the data, because you don't have this understanding of how much the data moves up and down naturally, almost no matter what you're doing. Now, that's not always the case, but that's the case I've found in a lot of situations. And so until you start to take that into account, those natural ups and downs, then you just misinterpret the data over and over again, usually by overreacting, is what I've seen.
0:03:54.9 Andrew Stotz: Yeah.
0:03:56.0 John Dues: So, you know, I think as a starting point, a lot of people in the Deming community will be familiar with this. But others listening to this probably have never heard of this idea of dividing variation into, I've heard it described as, like, two flavors. There's the routine variation, what I call natural variation, where things vary naturally no matter what you're doing. And then there's exceptional variation, where things are so different that there is reason to pay attention. And what I found through studying this is, the key is knowing how to tell the difference between those two types of variation. And if you can't do that, there's lots of confusion, lots of wasted effort. And so that's really where the power of this methodology comes into play. And for anybody that's studied this, you sort of realize that you have to have a tool to make that differentiation. It's not arbitrary. And so that's where what I call the process behavior chart, which some people call the control chart, comes into play, because that tool allows us to tell what type of variation is present. And it also allows us to tell if the system is predictable or unpredictable. And once we have that understanding, then we can chart an improvement roadmap that makes sense.
0:05:21.8 Andrew Stotz: Yeah. In fact, I've applied two of these things. One of them was to my pass rates and admission rates: I applied the process behavior chart, the control chart, based upon your recommendation a long time ago. And it did help me to think, in my case, I wanted to break out of that standard outcome that I was getting. And so I realized something has to change substantially in the system in order to get a different result than this variation I was getting. That was the first thing. And then the second one, a couple of nights ago, I was giving a lecture and I was using your work that you and I have talked about, as well as Mike Rother's stuff on goal setting and having the target. And then there's that obstacle. And what I realized when I gave a little talk on it, and I used the diagram and I showed the obstacle, is that it became kind of apparent to everybody, like, oh, yeah, there's an obstacle there that we don't know how to solve.
0:06:27.6 John Dues: Yeah.
0:06:28.3 Andrew Stotz: And that's where PDSA came in. And we started talking about that, as you have taught previously. So, yeah, I'm excited to hear what you have to say today.
0:06:38.2 John Dues: Yeah. And the Mike Rother model, I mean, he does have this target, this long term target that's pretty hard to hit. And you don't really know what you're going to do. But the difference in the situation I'm describing is that, in Mike's model, that target is knowingly outside of the current capability of the system. And they're coming together as a team and saying, how do we get to that target six months from now or a year from now? And we're working towards that, rather than someone having just arbitrarily set some target without a realization that the system isn't capable of hitting it currently. Those are two completely different scenarios. Yeah. So, I think I'll share my screen. Well, actually, before I do that, I would just say, too, because I know sometimes when I introduce these things, a lot of times people get scared away because they think the math is hard. And what I would say there is that the creation of a process behavior chart probably takes about fourth grade level math skills. You really only need to do addition, subtraction, multiplication, and division.
0:07:49.3 John Dues: That's it. But the thinking, I think, actually can be taught all the way down to the kindergarten level. And I've actually seen kindergartners explain the data on a process behavior chart. So, if anybody gets scared away at this part, the math is simple and the thinking is also pretty simple and powerful once you sort of have the basics. So, I'll go ahead and share my screen so the folks that are watching have a visual to follow along on. And for those that don't, I'll do my best to describe it. When we're talking about a process behavior chart, and this one's sort of an annotated version so that things are clear. But basically a process behavior chart is just a time sequence chart. It has upper and lower natural process limits, and we plot data for some measure that we're interested in. And the chart typically has a central line so that we can detect a trend of those plotted values toward perhaps either limit. So, this particular chart, the data is the percent of students who scored proficient or higher on the Ohio third grade reading state tests from spring 2004 through 2015.
0:09:06.8 John Dues: So, I've labeled sort of some of those key parts of the chart. So, just kind of call those out. Again, the red lines are the lower and upper natural process limits, sort of bound where you'd expect the data to be in a stable system.
0:09:21.1 Andrew Stotz: And those are 1, 2, 3 standard deviations or what?
0:09:28.1 John Dues: Well, this particular chart, it's what I call a process behavior chart. So it's actually not standard deviation. It's based on a measure of dispersion called the moving range. And then there's a formula that smarter people than me figured out for how to use that moving range to set the red lines. But the important thing to know about the limits is that they're set empirically. And that just means that they're based on the data. And so they are where they are, not where I want them to be necessarily. I don't get to choose where they are; how wide they are and where they're placed numerically is based on the data itself. And then that green center line for this particular chart is the average of all the blue dots. And then the blue dots are, again, each year of testing data.
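For readers who want to see the arithmetic behind the limits John describes, here is a minimal sketch of how the natural process limits on an individuals (XmR) chart are typically computed from the moving range. The proficiency percentages below are invented for illustration, not the actual Ohio data, and the 2.66 scaling constant is the standard one from the process behavior chart literature, not a value specific to John's chart.

```python
# Minimal sketch of the limits calculation for an individuals (XmR) process
# behavior chart. The percentages below are illustrative, not the actual
# Ohio third grade reading results.
scores = [79.0, 77.5, 80.2, 76.8, 78.9, 80.3, 77.0, 78.4, 79.6, 76.2, 78.8, 79.3]

# Central line: the plain average of the plotted values.
center = sum(scores) / len(scores)

# Moving ranges: absolute differences between consecutive points.
moving_ranges = [abs(later - earlier) for earlier, later in zip(scores, scores[1:])]
average_moving_range = sum(moving_ranges) / len(moving_ranges)

# Natural process limits: central line +/- 2.66 times the average moving range.
# (2.66 is the standard XmR scaling constant; these are not standard deviations.)
upper_limit = center + 2.66 * average_moving_range
lower_limit = center - 2.66 * average_moving_range

print(f"Central line:         {center:.1f}")
print(f"Upper natural limit:  {upper_limit:.1f}")
print(f"Lower natural limit:  {lower_limit:.1f}")
```

As John notes, the whole calculation is addition, subtraction, multiplication, and division; the limits come out of the data rather than being chosen.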
0:10:19.4 Andrew Stotz: 2004 to 2015 as the x-axis, yep.
0:10:27.0 John Dues: Yeah. So, you have a decade plus of data here. So, a good amount of data. So, you can kind of see how things are performing over time on this third grade reading test. And the purpose of the chart, like we talked about, is to separate those two flavors of variation, the routine and the exceptional. And this chart is a really great example of just natural or routine variation. So, I'm looking for patterns in the data, like a single data point that falls outside of those red process limits. And you don't see that. The results for these years instead are just bouncing around an average of about 78.5%. Now there's some years where it's a little higher than that and some years where it's a little lower. But the point is none of those increases and decreases are meaningful. There's only that natural variation present. But the problem is, in the typical data analysis method, what I call the old way, the simple limited comparison, a leader will rely on comparisons between the current figure and some previous value.
0:11:48.9 John Dues: And probably the most common and why I chose this data, at least in my world, is a leader will compare last year's test scores and this year's test scores. That's very, very common. But the problem is, again, that what I'm calling a limited comparison, the comparison between two years of data, it doesn't take natural variation into account. So, what happens is we try to ascribe meaning to those increases or decreases between data points when in reality there's often no difference to be found. And I have a really great example of this. Let me switch my screen here. So, there's a lot of information here, but it's pretty simple to understand. So, this is a snapshot from 2017/2018 state test results. And so this is a document that was published by our Department of Education here in Ohio back during those school years. And the thing is, it may be eight or so years old, but it's as relevant today as when it was published eight years ago. We're still making the same sort of mistakes. So, we're basically, when we look at the data in this chart, we're basically being led to believe that there's been this meaningful decline in performance in third grade ELA.
0:13:16.4 John Dues: That's what's signified by that red arrow in the first row of the table. So, the ELA data says that in '16/'17, 63.8% of our third graders were proficient. And in the following year, 61.2% were proficient. And there's this red down arrow to say, oh, things got worse this school year, at least when it was published. But then if you look at the blue box, for those that have video, the text says we're not supposed to worry because, "third grade saw decreases this year, but has maintained higher proficiency than two years ago." So then you start to think, well, which is it? Should I be worried about my third grade ELA state test scores because of the most recent decrease, as of when this was published? Or should I not worry because the scores are better than they were two years prior to that?
0:14:21.7 Andrew Stotz: And that depends what side of the argument you're on.
0:14:24.4 John Dues: Depends what side of the argument you're on. What story do you want to tell with this data, right?
0:14:30.3 Andrew Stotz: So, it's bad enough to be potentially misled by this probably common variation, but then to have both sides of an argument be misled at the same time.
0:14:41.0 John Dues: Right, yeah. Ultimately it seems like what they're trying to do is show improvement because you have this big headline up here that says, Ohio students continue to show improved achievement in academic content area.
0:14:55.2 Andrew Stotz: Yay!
0:14:58.5 John Dues: But there's a way to actually answer these questions definitively using this method, right? And so what I did was I took the data from the three years of the state testing for third grade ELA from this state education department publication, and I just plotted it on a process behavior chart. And then I continued plotting it for the more recent data that's happened since this, because three data points isn't a lot, so I kept plotting it. And so now we have, going all the way back to the first year of data in this state testing document, we have 2015/2016 data, and of course now we have data all the way up through the end of the last school year, 2024/2025. So, we have nine data points. So I plotted it, right? It looks like this. So, here's those same data as the first three data points, spring 2016, spring 2017, spring 2018.
0:15:58.3 John Dues: That's from the table from the previous slide. And then I've continued plotting things for, you know, spring of '19, '21, '22, '23, '24, and '25. So, now we have nine years of data. And what we can see is just what I would have predicted, even if I had only had those three years from the state testing document to work with and not the more recent data: there's no evidence of improvement. It's definitive. And so you see these nine data points. They're just simply bouncing around this average of 61%. That's what the green line shows. It's almost perfectly balanced, in fact. So, three of the points are actually below the average. One point is almost right on the line, the average line. And then there's five points above. And if you follow it from point to point, it increases, then decreases, then increases, then decreases, then increases very slightly for three or four years in a row. Right? But there's no signals or patterns in this data to indicate any changes of significance. Right? So claims like, yeah, we've declined in this most recent year from that testing document, or, oh, we shouldn't worry too much because it's better than two years ago. All of that is nonsense.
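To make the signal-versus-noise judgment John describes concrete, here is a small sketch of the simplest detection check: flag any point that falls outside the natural process limits. The values and limits below are hypothetical placeholders, not the published Ohio figures, and real process behavior chart analysis often adds further run-based rules beyond this single check.

```python
# Sketch of the simplest signal check on a process behavior chart: flag any
# point that falls outside the natural process limits. Data are hypothetical.
def find_signals(values, lower_limit, upper_limit):
    """Return indices of points falling outside the natural process limits."""
    return [i for i, v in enumerate(values) if v < lower_limit or v > upper_limit]

proficiency = [63.8, 61.2, 59.5, 60.4, 62.1, 61.7, 58.9, 62.5, 60.8]  # hypothetical
signals = find_signals(proficiency, lower_limit=52.0, upper_limit=70.0)  # assumed limits

if signals:
    print("Exceptional variation at points:", [i + 1 for i in signals])
else:
    print("Only routine variation: the values are just bouncing around the average.")
```

In the case John is describing, a check like this would report no signals, which is exactly why the year-over-year up and down arrows in the state report are not evidence of anything.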
0:17:24.6 Andrew Stotz: So, the title should have been nine years of no improvement.
0:17:29.7 John Dues: Nine years of no improvement. Nine years of stable data. And the thing is, a lot of data looks just like this state testing data over time. Not only in education, but in other things. And how I've heard this described by people that use this methodology is that claims of improvement are often nothing more than writing fiction. And I think that's a very good description for what we see here. And the thing is, I'm not trying to throw the person that wrote that document under the bus. All I'm saying is that there's a better way to be looking at data like this, a way that makes more sense.
0:18:24.9 Andrew Stotz: It made me think of the Mark Twain quote, rumors of my demise are greatly exaggerated.
0:18:39.9 John Dues: Give me one second here. My screen switched on me. There we go. Okay. So, when I think about this data, there's no real decline in performance, there's no real increase in performance. It's just stable performance. I think the key for leaders, systems leaders especially, is that this system, the way we would describe it, is producing predictable results, and it's performing as consistently as it is capable. And so it's a waste of time to try to explain the natural variation in a stable system, because there's no simple, single root cause for that noise.
0:19:24.5 Andrew Stotz: And I think there's an even better way of saying it. It's not a waste of time, it's a waste of your career.
0:19:32.6 John Dues: That'd be a very apt way of describing this.
0:19:36.0 Andrew Stotz: It kind of goes back to the point that Dr. Deming made, which was that a manager could spend his life putting out fires and never improve the system. If every little thing above and below the line were a mini emergency, and a response was made every year because of the under or over, it would just be whack-a-mole.
0:20:01.9 John Dues: Yeah. But I think the thing for people to understand is that when I say this system is performing as consistently as it's capable of, I mean this performance is what this particular system is capable of. But just because it's stable and predictable, like this one is, you know, up above 61% one year and then down below it a little bit or right on the line, that doesn't mean that stable means acceptable. It doesn't mean stable is satisfactory.
0:20:37.1 Andrew Stotz: I'm thinking that this is neutral, you know, it's an observation rather than a judgment.
0:20:42.5 John Dues: Yeah. It's just what is. It's the process is producing what you would expect it to produce because it's stable and predictable.
0:20:49.8 Andrew Stotz: I want to just mention that my mind's wandering because I know that you help people with these types of charts. And when I was working with a hospital here in Thailand, they had a great room that they set up that was all blacked out, and it was full of these great computer screens and technicians, like 10 of them in this room. But the room was dead silent, blacked out 100%. And they were radiologists, and all the x-rays, MRIs, and everything that were being done on the machines outside were coming into them, and then they were making their judgments on it. And then they would submit that, and then the doctors would very quickly get a read on that. And I was just thinking, imagine being a person that all day long is looking at these types of charts. Like, just any system can be described by the... And then what's your judgment on this? Yep, common cause. That's it.
0:21:50.9 John Dues: Yeah. And I think it obviously doesn't mean that there isn't work to be done. Like in this case, even though it's stable and predictable, so if I was putting a bet down on what the results are for spring 2026, at the end of this school year, I'd put my money somewhere between, let's call it 55% and 65%. And I'd be right almost every single time, I think, as long as nothing changes. But that doesn't mean, like I said, it doesn't mean there's not work to be done because when you look at this, this means that about 60% of third graders are proficient in any given school year on this Ohio third grade state test, which means that two in five students are not reading proficiently. So, the improvement roadmap, there has to be some fundamental changes to how we do third grade reading instruction, curriculum, assessment. Something fundamental has to change if we want to get a different set of results.
0:22:54.8 Andrew Stotz: And one of the things that I've kind of come to believe in my life, right or wrong, I'm not exactly sure, but it's like having traveled to so many countries and seen so many places, I kind of feel like people get what they demand. Like the population of a country, if they don't demand certain behavior from politicians, they don't get it. And so on the one hand, this is a neutral thing, but I think you can also make a judgment that the population of Ohio is not in a continuous uproar to see this change.
0:23:39.0 John Dues: Yeah. Well, I would say very few people even have this picture in their head, whether it's educators or the general public, because every time we get one of these state testing reports, it usually has only two or three years of data. So no one even remembers what happened.
0:24:01.9 Andrew Stotz: I agree that they don't have clarity, real good clarity like you're bringing us here. They have an understanding of what's happening generally. And the reason why I'm mentioning that is because part of the benefit of trying to understand the state of a system is to understand the level of change or work or new thinking that has to go into modifying it. Let's just say that the population was in an uproar and they decided that they wanted to get to 90% proficiency from 60%. The level of rethinking is such a huge thing. And I think what this chart tells me is, that's kind of what's set in stone. And in order to move beyond what's set in stone, there is a whole lot of work and a whole lot of new thinking that has to go into that. And it must be continuous. And that's part of the constancy of purpose. Otherwise you do it for three years and then a new guy comes in and he changes it. And then next thing you know, it's not sustained.
0:25:17.4 John Dues: Yeah. I mean, yes, you'd have to do something significant and then you'd have to stick with it. That constancy of purpose phrase is right on because you'd have to, first you'd have to develop the right plan and then you'd want to test it. But then once you started seeing some evidence of improvement, you'd have to stick with that plan for a decade or more to see those types of results. And that's really hard when the political will shifts, the focus shifts, you have a pandemic, whatever the thing is, you have less money for school, whatever that thing is or any combination of that, it makes it very challenging to sustain.
0:25:57.8 Andrew Stotz: And the reason why I'm raising this point is because it just kind of really hits me that take away Ohio, take away education, take away all of those things and just produce a control chart on any process in any business, in any school, and you're gonna see the current state.
0:26:17.3 John Dues: Yep, absolutely. Yeah. You can use this in any setting, any data that occurs over time, you could use this methodology.
0:26:24.8 Andrew Stotz: And one of the questions I have in my mind as I was thinking is, why change it? The level of effort required to sustainably change that is just incredible. And you could argue that, okay, there are companies that build a competitive advantage by saying, that's not the status quo that we want to exist in, and therefore we're gonna create a whole new business built around something different that produces a result that's considerably better than that. It happens for sure, but in our lives, if we were to see that, we're much more likely to just let it be.
0:27:03.6 John Dues: Yeah. Yeah. And when you get it down, when you sort of zero in and get down to the sort of local level, there are schools that sort of performed in this sort of general fashion that made changes at the building level and then got significantly different results. So, it becomes a little easier. It's not easy, but it becomes easier when you're talking about a single school building and coordinating the efforts there versus trying to do that across all the school buildings.
0:27:32.9 Andrew Stotz: And I think this is what, when Dr. Deming talks about leadership, this is what he's talking about.
0:27:39.1 John Dues: Yep. Yep. And I think, you know, the good thing is here, if this is resonating with you, whether you're a school leader or the leader of some other type of organization, you've probably struggled to interpret your most important data. Before I discovered this method, I didn't really have a method per se. I'd put numbers in a table and then try to look at them and ascertain what was going on. And so I think, you know, if you've never heard of this, it's totally fine. Most of us were never taught how to understand variation in our data. But I think there's two big ideas I would take from this, as we've talked about. The first is just taking natural variation into account. Just meaning plot your data over time, plot your dots, and look at how it's moving up and down over time. So, that's the first big idea, this idea of natural variation. Things are going to move up and down just naturally, no matter what's happening, even if nothing of significance has occurred.
0:28:47.6 John Dues: And then big idea two is that you can use this chart, this process behavior chart methodology to differentiate between those two types of variation that I talked about, the routine or natural variation, and then the exceptional variation. And then once you do that, you're gonna get some very powerful insights into what your data looks like, because people are gonna say, oh, I know why that happened. I know why that looks like that. Now that I see it like this, I have an understanding for why the patterns look like they do. And then you can start to turn that sort of type of analysis into better outcomes. And that's really the point of doing this is that you, you know when to react, when not to react, you are making sound decisions based on a logic, a logical model, a logical data model. And the best part is it's very simple. Like I said, a fourth grader can do the math required to create the chart. And I've seen kids as young as five or six interpreting the data in a chart. So, that means that we can all do it for sure. It's not actually that difficult.
0:30:00.6 Andrew Stotz: Yeah. And I was just thinking of Newton's law of inertia, meaning an object stays at rest until acted on by an outside force.
0:30:12.7 John Dues: Yeah.
0:30:13.8 Andrew Stotz: And I think what you're showing is the state of inertia.
0:30:18.5 John Dues: Yeah. Yeah. Yeah. The state of inertia. And I think it's just, you know, you don't know what you don't know. But once you see this... I did some of this figuring out on my own, reading about it, listening to other people talk about it, but I also talked to a lot of people and got a lot of guidance. So if this has piqued your interest, my suggestion is reach out to somebody that has done this before, at least at the start. Because, you know, while I am saying you can create a chart with fourth grade math, and I've seen kindergartners analyze the charts, there is some learning, there is some technicality to it. And so I think if you have a coach, even better, because you're gonna learn it so much faster and be able to turn that learning into results so much faster.
0:31:07.0 Andrew Stotz: And maybe the starting point is trying to figure out, of all the different measures that I've got in my business, in my school, in my life, what's one that I get regularly? And I like data that comes out more often than annually, because otherwise it's just such a long process. So if I have daily data, weekly, monthly, you know, those types of data points, then from that, what's one thing in your life that would be a data point that you'd like to look at? And I would even argue the first step is just to start collecting it into, let's say, an Excel file and just collect that raw data. And you can make a chart of that raw data. And the benefit of the process behavior chart, the control chart, is that it gives you tools within that chart to help you interpret. But even if you just start by figuring out what data point you wanna look at, start collecting it, do a month or two of getting that data, and then you can start saying, okay, now I'm gonna apply these tools, nothing wrong with that.
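If you want to turn a series you have started collecting into a picture like the ones discussed here, a minimal sketch might look like the following. It assumes your values are already in a plain Python list (typed in by hand or exported from a spreadsheet) and that the matplotlib library is installed; the data shown are invented for illustration, not from any real process.

```python
# Minimal sketch: plot a collected series as a simple process behavior chart.
# Assumes matplotlib is installed (pip install matplotlib); data are invented.
import matplotlib.pyplot as plt

values = [6, 4, 7, 5, 6, 8, 5, 6, 7, 5, 6, 4]  # e.g. reports published per week

center = sum(values) / len(values)
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
average_moving_range = sum(moving_ranges) / len(moving_ranges)
upper = center + 2.66 * average_moving_range
lower = center - 2.66 * average_moving_range

plt.plot(range(1, len(values) + 1), values, marker="o")  # the plotted dots
plt.axhline(center, color="green")                       # central line (average)
plt.axhline(upper, color="red", linestyle="--")          # upper natural process limit
plt.axhline(lower, color="red", linestyle="--")          # lower natural process limit
plt.title("Process behavior chart (illustrative data)")
plt.xlabel("Week")
plt.ylabel("Reports published")
plt.show()
```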
0:32:21.2 John Dues: Yeah. And you wanna show it to people, like whether that's teachers or students, you wanna show them the data that you're collecting because they're gonna be a part of that improvement process, no matter what type of data that you're looking at, at least in schools, you're always gonna want the front line people to be a part of that process.
0:32:39.4 Andrew Stotz: And the way I did that in my area of research, when I was an analyst with a research team, was that I wanted to see the data on the output of our research operation. How much did we produce? I didn't have a strong opinion as to whether we should produce more or less or whatever. I just wanted to understand it. And so I started plotting that data on a weekly basis, and I labeled it pretty well. And then I just put it up on the wall, and I didn't talk about it. And I put it up, and people looked at it, and I didn't go and explain it, and I didn't put control limits or anything like that. I just put the data up. And I remember a Thai lady that worked for me came to me, and she said, I figured you out. And I was like, what are you talking about? And she said, I was out to lunch with a friend of mine, and she asked me, how many reports do you publish a month? And my employee told her, I publish six reports in a month. And her friend said, what?
0:33:45.4 Andrew Stotz: And she said, how many do you do? She said, I only do two in a month. And she said, what are Andrew's targets for you? My God, to get six reports? And then my employee said to her, he doesn't have any target for me. And then that employee of mine came back to see me after that lunch, and she said, I get it. You just put it up on the wall, and it raised the awareness for all of us, and we all looked at it, and then it influenced the way we thought about our job without you telling me, get four or six or two. And so, I did that exact same thing when I worked at Pepsi, when I joined in 1989, at the factory in Buena Park in particular, where I would put up on the wall, here's everybody's error rates from last night. And I would post that, and then the employees would just look at it and go, that's wrong. Okay. Fine, great, tell me. Let's look at the data.
0:34:44.8 Andrew Stotz: And I kept all the underlying data that was manual in my hands in stacks, and then they would go, oh, okay, so I did get that wrong. Let me fix that. And then I fixed it and put it back up, but it didn't look much prettier after I fixed it. And then all of a sudden, people started looking at it, and then they started having new information they never had. And I hadn't studied with Dr. Deming by that time, so I didn't even understand anything to do with the chart, but just putting up the chart without any major commentary is fascinating.
0:35:12.9 John Dues: Yeah. It starts those conversations, starts getting people sort of more involved, more engaged with the work. Yeah, I think those are all really smart moves that we often don't do.
0:35:25.2 Andrew Stotz: And I think that was why my boss suggested I go to a Deming seminar, because he saw me starting to do that, and then he had heard about Deming and knew a little bit, and then he was like, yeah, this guy could be suited for that.
0:35:36.6 John Dues: Yeah. It sounds like it was fate or something like that.
0:35:41.6 Andrew Stotz: Yeah. Definitely. So, I'm going to wrap up just by saying that, for the listeners and the viewers out there, I think a big takeaway is figure out that one data point, just one. You don't need five, just one that comes out consistently, daily, weekly, monthly, you know, something that's relatively regular, and then start collecting that data. Write it down. You know, there are times I just write it down on a manual chart in my notebook. Write it down there. You don't even need Excel. Just start collecting that data and thinking about the collection of the data, what time of the day you get it or what time of the week or what time of the month, and then start collecting it. Then the second stage is, obviously, if you can go to an expert, someone like John or others, reach out to them on LinkedIn or elsewhere and say, hey, I've got this data. Can you help me? And then they can easily do the calculations and then send you back the Excel file and say, here it is with all the calculations, which you did for me on one of mine, and that was great. And then get that help, and then start to move yourself slowly into the process, because I think one of the things that I take away from it is that this really is the present, and it is an accurate representation of what the system is capable of.
0:37:10.2 John Dues: That's right. Yeah.
0:37:10.8 Andrew Stotz: And if you don't understand that, then you're just going to be beating your head against the wall. So, anything you would add?
0:37:18.9 John Dues: No, just that you beat your head against the wall and you make stuff up about what is happening. That's often what happens. Yeah.
0:37:27.0 Andrew Stotz: Then you become AI. You're hallucinating.
0:37:30.1 John Dues: Yes.
0:37:31.0 Andrew Stotz: Well, John, on behalf of everyone at the Deming Institute, I want to thank you again for this discussion. And for listeners, remember to go to deming.org to continue your journey. And you can find John's book, Win-Win: W. Edwards Deming, The System of Profound Knowledge, and the Science of Improving Schools, on amazon.com. This is your host, Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming, and that is that people are entitled to joy in work.