Episode 5: The Pursuit of Perfection: Dominant Architectures and Dynamics
Guest: Dr. Steve Spear (Part 1)
On this episode of The Idealcast with Gene Kim, Dr. Steve Spear talks about the primary characteristics of dynamic learning organizations, through the lens of their structure and the resulting dynamics, and how those characteristics enable such organizations to win and dominate in the marketplace.
From his 1999 Harvard Business Review article “Decoding the DNA of the Toyota Production System” to his bestselling book The High-Velocity Edge to his monomaniacal advocacy for the scientific method employed by everybody about everything all the time, Steve's influence on the successful pursuit of excellence and perfection is undeniable.
Discussing everything from the importance of curiosity and experimentation, fast feedback, mission orientation, leadership, healthcare organizations, and military strategy and organization to, of course, Toyota, Steve and Gene explain why organizations behave the way they do and demonstrate why dynamic learning organizations are so successful.
Dr. Steve Spear (DBA MS MS) is principal for HVE LLC, the award-winning author of The High-Velocity Edge, and patent holder for the See to Solve Real Time Alert System. A Senior Lecturer at MIT’s Sloan School and a Senior Fellow at the Institute, Spear’s work focuses on accelerating learning dynamics within organizations so that they know better and faster what to do and how to do it. This has been informed and tested in practice in multiple “verticals” including heavy industry, high tech design, biopharm R&D, healthcare delivery and other social services, Army rapid equipping, and Navy readiness.
High velocity learning concepts became the basis of the Alcoa Business System, which led to hundreds of millions in recurring savings; the Pittsburgh Regional Healthcare Initiative's “Perfecting Patient Care System,” credited with sharp reductions in complications like MRSA and CLABs; Pratt & Whitney's “Engineering Standard Work,” which when piloted led to winning the engine contract for the Joint Strike Fighter; the operating system for Detroit Edison; and the Navy's high velocity learning line of effort, an initiative led by the Chief of Naval Operations. A pilot with a pharma company cut the time for the ‘hit to lead’ phase in early stage drug discovery from twelve months to six.
Gene Kim is a Wall Street Journal bestselling author, researcher, and multiple award-winning CTO. He has been studying high-performing technology organizations since 1999 and was the founder and CTO of Tripwire for 13 years. He is the author of six books, including The Unicorn Project (2019), and co-author of the Shingo Publication Award-winning Accelerate (2018), The DevOps Handbook (2016), and The Phoenix Project (2013). Since 2014, he has been the founder and organizer of DevOps Enterprise Summit, studying the technology transformations of large, complex organizations.
In 2007, ComputerWorld added Gene to the “40 Innovative IT People to Watch Under the Age of 40” list, and he was named a Computer Science Outstanding Alumnus by Purdue University for achievement and leadership in the profession.
He lives in Portland, OR, with his wife and family.
Gene Kim (00:00:00):
This episode is brought to you by IT Revolution, whose mission is to help technology leaders succeed through publishing and events. You're listening to The Idealcast with Gene Kim, brought to you by IT Revolution. In this episode of the Idealcast, I am so delighted that I have on Dr. Steve Spear, who has been a mentor of mine ever since I first took his workshop on high velocity learning at MIT in 2014. His book, The High-Velocity Edge, is one that I've studied over and over, over the years. You can easily spot it in any pile of my books because nearly one third of the pages have been dog-eared because something on that page struck me as especially profound or brilliant. He is famous for many things, but he's probably most famous for writing one of the most downloaded Harvard Business Review papers of all time in 1999. It was called “Decoding the DNA of the Toyota Production System.” This is based in part on his doctoral dissertation that he did at the Harvard Business School. And in support of that, he worked on the manufacturing plant floor of a tier one Toyota supplier for six months.
Gene Kim (00:01:13):
Since then he's extended his work beyond just high repetition manufacturing work to engine design at Pratt & Whitney, to the building of the safety culture at Alcoa, to helping create genuinely safe healthcare systems. And he was part of the US Navy initiative to create high velocity learning across all aspects of the enterprise. I believe Dr. Spear's work is extremely important. Just as command and control leadership was a hallmark of the 20th century, I believe that dynamic learning organizations will be the basis of the 21st century. In this episode, I learn about how his journey of high velocity learning started, where that journey took him, and then explore with him his mental model of dominant architectures, structure, and dynamics, and how it can explain so much of why organizations behave the way they do. He also talks at length about what conditions must be present for organization-wide learning to occur and how it allows you to achieve amazing goals and to dominate in the marketplace. Steve, welcome to the podcast.
Steve Spear (00:02:15):
Hey Gene, thanks for having me. Much appreciated.
Gene Kim (00:02:18):
Steve, I took the liberty of introducing you in my words, but could you introduce yourself in your own words and describe what you've been working on these days?
Steve Spear (00:02:27):
Yeah. So do you want me to go from the beginning to the end or from the present back to the [crosstalk 00:02:32]
Gene Kim (00:02:31):
Yeah, let's do the beginning to the present, how's that?
Steve Spear (00:02:35):
All right, let me just anchor on where we are today. I think I've become a monomaniacal advocate for the scientific method employed by everybody about everything all the time. And what I mean by that, if you think about the scientific method, it sounds sort of egg headed, pointy headed or whatever else. But fundamentally, the scientific method requires that before we take an action, we make a declaration of prediction about what action we think we're going to take, the conditions in which we think we're going to act, and the consequences of what our actions will be in those conditions. And then it asks us to seek feedback very fast, very frequently, to find out where we're wrong. I think we all do that naturally. No one walks out the door and just goes through their day rolling dice, flipping coins, to figure out what they're going to do with their next step.
Steve Spear (00:03:27):
We're all constructing predictions about our actions and their outcomes. And I think in most cases, most of us actually learn in some fashion from the surprises when our actions yield outcomes which weren't expected and weren't desired. I think we do that naturally. The thing is, and this makes reference back to your point about command and control, as experimental as I think most people are outside their professional undertakings, inside their professional undertakings, this whole notion of predicting for the purpose of finding flaws and thinking through the experience of flaws and doing, that's suppressed. And because it's constantly suppressed, it kind of locks us into repeating actions based on the same mistaken assumptions.
Gene Kim (00:04:15):
So much of your work has influenced my thinking for nearly a decade. It's not at all a joke when I say that the publication of the [inaudible 00:04:23] handbook was delayed by over a year because, after I took your class, I realized there was so much that was missing from it, and I felt that needed to get integrated into the book. So before we get into that, I'm wondering if you can rewind the clock and share with us how you ended up working on the plant floor at a tier one Toyota supplier and ended up having your book foreword written by the famous and recently departed Dr. Clay Christensen.
Gene Kim (00:04:47):
Gene here. I now realize I should introduce who Dr. Clay Christensen is. The Economist described him as the most influential management thinker of his time. Among many other things, he was the author of the famous 1997 book The Innovator's Dilemma. He coined the phrase disruptive innovation while he was a professor at the Harvard Business School. And if you read Steve Spear's book, The High-Velocity Edge, the first thing you will read is what I believe to be the most incredible foreword ever written by Dr. Clay Christensen. And I will actually read it to you later in this interview.
Steve Spear (00:05:22):
So there isn't a certain irony that my work is really meant to help people create value quicker, easier, better, than they would have otherwise. So the fact that it delayed the delivery of your book by over a year, I think we can [inaudible 00:05:39].
Gene Kim (00:05:38):
And I'm grateful for it.
Steve Spear (00:05:39):
Thank you. Yeah, nothing like having a manuscript sit on your desk for another year to really make you feel comfortable and at ease. But in terms of where this all got started, it actually goes back more than 25 years. I started graduate school in the late 1980s, when American manufacturing was facing this existential challenge from Japanese firms. And when I got to MIT's Sloan School in 1989, the number of courses that were labeled Japanese management of some form of technology, it could be plain technology, data technology, accounting technology, whatever else it was, a good portion of the course catalog was framed that way.
Steve Spear (00:06:20):
And the simple reason was that Japanese firms had demonstrated their capacity to generate and deliver a whole lot more value, faster, easier into the marketplace than anybody else. And I was one of a very big cohort of MITers and others who were curious as to what was going on there to bring lessons back here. And part of that involved taking classes at MIT, making a summer long and then a year long venture into Japan to see what I could learn. But that was some background and motivation. And the way I ended up at Toyota is, as I worked through my first pass of graduate education in these periods in Japan, I still wasn't satisfied that I had gotten to a meaningful answer as to what was different there that explained the experience here.
Steve Spear (00:07:09):
Anyway, a fantastic opportunity came up. A former MIT professor opened and then kept open a door while I dilly-dallied in my decision making. Kent Bowen, who was a critical mentor to me, Kent created an opportunity for me to stay in the graduate student space, which is a tremendous luxury, giving you time to dig deeply and just dig around looking for something that seems interesting and important. And in conversation with Kent, we said, "It seemed odd that here you had Toyota, which had been the subject of so much study in the 1980s and 1990s around lean manufacturing, and there were all these really energetic and honest efforts to learn what Toyota had been doing. And yet there was no second Toyota. They were still crushing their competition in terms of manufacturing, quality, productivity, time to market with new designs, profitability per unit. Pick any metric relevant in the auto industry for the major makers, and Toyota was the leader or number two on all of them."
Steve Spear (00:08:14):
And so we had this opportunity to approach people at Toyota and say to them, "Look, you have been the subject of so much study and you've been the object of so much imitation or flawed imitation, what is it that you're doing that other people have missed?" And the response we got back was, "Well, we can't tell you." And we were like, "What do you mean you can't tell us? You've been open to your major competitors. Why can't you tell us?" And they said, "No, no, no, you don't understand. It's not that we wouldn't tell you if we could, but the way we learn this is a very immersive learn-while-doing experience. And it's not that we wouldn't tell you, it's just that we don't know how. But if you're willing to learn our management system the way we learn it, see whatever you find out."
Steve Spear (00:09:04):
Well anyway, this was sort of like that scene in The Matrix where the guy is offered a blue pill and a red pill, because it turned out learning it the way they do it required this deep karate kid immersion with very little explicit direction in terms of what the basic thinking was and how it works. But lots and lots of hands-on trying to solve problems. And largely learning by the fact that you try to solve a problem and your mentor would say, "Well, you might want to try that again." Never telling you what the right answer was, or even why your answer was wrong. Only, "You might want to try that again." And anyway, that process went on a good six months at a tier one supplier, and then at dozens and dozens of other Toyota and Toyota-affiliated plants that I visited over the course of, I don't know, three, four or five years.
Gene Kim (00:09:56):
Can you connect the dots to your dissertation, Clay Christensen, and where that took you from there?
Steve Spear (00:10:03):
Yeah, so when I first went into Toyota, if you think about the literature at the time about the company, the assumption was that Toyota had tools for production control that were superior to other people's tools. If you look at what lean manufacturing became, it was almost a checklist or recitation of various tools: if you had more of them, you were more lean; if you had fewer of them, you were less lean. And so based on that, I just assumed that what had happened was that there was some critical tool or tools that people had missed. So I went looking for tools. And I'm doing this karate kid immersion, "Go give this a try, now try it again. Oh boy, try it yet again."
Steve Spear (00:10:51):
And what I was coming to appreciate is that for all the trying and iterative learning I was doing, I was never being introduced to a tool. What I was being introduced to was a way of thinking and a way of acting which required more and more this idea that I make declarations about what I was expecting so I could be surprised by the reality I encountered. And to be honest, look, I didn't realize that at first through my own experience. I think I'm more dense than not, and too dense to have appreciated it just on my own. One of the things that really struck me is, as I started visiting Toyota suppliers, both in North America and some in Japan, concurrent with my own immersive experience, I would ask people if they would show me and explain to me what work they did and how they did it.
Steve Spear (00:11:39):
And what I discovered is that in most places, when you ask someone about their work, they'll just show it to you. "Oh, here's the objective I have. And here's the sequence of steps and the tasks for which I'm responsible." And what would invariably happen in the Toyota setting, and I'm talking about shop floor associates, no advanced education, maybe some technical training on the job, et cetera, is they would give me almost a dissertation. They'd say, "Well Spear, something you have to understand, I've been asked to do such and such a task, put a seat in the car, weld something, form something, whatever else it is. And ideally, I'd be able to do that in a defect-free way with no wasted action, right on demand when asked by my customer and immediately in the unit of use that they needed. That would be the ideal, but there was a problem."
Steve Spear (00:12:31):
"And in order to resolve that problem, here's what I did to modify my work, to close the gap between what I was currently doing and that ideal." And I said, "Oh, so can I see that?" "No, no, no, you can't see that one because when I started using that one, there was another problem. And in order to resolve that problem, I created this other corrective action to try and close the gap on the ideal." I said, "Oh, can you show me how that one works?" And another no. "I don't use that one anymore because when I used that one, there was a problem." And at that point in the conversation, when they really started building up an emotional momentum on... They'd pull out a binder and start flipping me through past versions of how they did their work. And if that work had a life span of more than a few weeks, there might be dozens of pages in this binder.
Steve Spear (00:13:22):
And they could walk me through the series of experiments they had conducted from their first conception of how to do the work through the layer upon layer upon layer of corrective action to close the gap with this ideal of defect-free, on demand, safe, immediate, and so forth. And as I'm listening to this, this wasn't from people with PhDs, this wasn't people with advanced degrees trained in the life sciences. These were people working on shop floors, explaining how they had reduced spatter from a welding process, how they reduced smoke from the same process. Anyway, what started to really strike me, and we put this in that first paper you referenced, is what Toyota seemed to do with its management system. I think we said they had taken work which most people thought of in very physical terms, and used it as the platform on which to build a community of scientists where everything was subjected to this combination of prediction and feedback. Now that for us was the huge aha about what separated them from essentially everybody else we had observed or participated with.
Gene Kim (00:14:35):
Steve, that is so great. So I'm going to read you part of the end of the foreword that Dr. Christensen wrote. He said this, "I count having been one of Steve Spear's colleagues and advisors to be one of my foremost credentials. I hope from the pages of this book you'll be able to learn from Steve even a fraction of the valuable insights I've gotten as I've worked with him. In his field, he has no peers. Dr. Clayton Christensen." So could you describe Dr. Christensen's involvement in your work?
Steve Spear (00:15:06):
Yes. So I was tremendously fortunate as a graduate student to have two really fantastic champions for my work, for the ideas in the work, the evolution of those ideas, and just for my development as a human being. To put some context on this, I met Clay right when I arrived back into graduate school, this is for the second or third time. And Kent Bowen also, who was enormously influential in my development. I was, I don't know, still in my late twenties, early thirties then. Unmarried, no family. And these guys not only coached the development of my intellectual capital, but really had a very big impact on me as a human being, and ultimately as a husband, a father, a member of the community. So like I said, I'm blessed in so many ways to have had Clay and Kent in my corner over decades at this point.
Steve Spear (00:16:06):
So as far as how I think I connected with Clay, if you think about his work on disruptive innovation, he takes a look at the incumbents in a field, those who are really cranking along solidly in a groove. And he's trying to explain why it is that those folks who are so solidly in a groove get disrupted by these much smaller, less resource-rich firms. And his explanation goes something like this. The incumbents get into a groove and they start chasing ever more rewarding, profitable, prestigious things that are in that groove. And as they do so, the groove becomes somewhat of a rut and they try to do more and more of what they've already been doing. And in doing so... And Clay has the technical explanations about the construction of business models, this and that. But fundamentally what happens is they lose sight of things that are outside their groove.
Steve Spear (00:17:09):
And in particular, the needs of people who aren't being satisfied by the work being done inside the groove. And what Clay is really describing is, as organizations become more and more dominant in their chosen field, they become less and less aware and less and less curious about what's going on around them. So anyway, that I think is Clay's message. And I think he carried it over, not only to the importance of being curious about what's out there professionally and commercially, but I think Clay lived a life which was really a testament that we should do that as individuals. And make sure that when we're in our own personal groove, we have an eye towards those who are out there whose needs aren't being satisfied. So anyway, that's, I think an appropriate tribute to Clay.
Steve Spear (00:17:55):
In terms of his connection to my own work, there were two. One, I think, was this issue around curiosity. Because Clay's explanation for why incumbents got disrupted is that they became incurious and consequently vulnerable. And here I was studying an organization which was almost energetically paranoid in cultivating curiosity across its workforce. And I think the other thing that really caught Clay's eye is that we had opportunity to investigate a really bonafide and significant anomaly, which was the ability of Toyota, in a very, very competitive industry, not only to have created a lead for itself, but sustained advantage over decades, when in fact the levelness of the playing field should not have allowed that. So a huge, gigantic insight for me was this creation of a workforce in which everyone was part of a community of scientists. I think the other big insight is when I started looking at the healthcare field and trying to understand... So the community of scientists explains why Toyota was so successful.
Gene Kim (00:19:10):
Steve, that is so great. I love the phrase when your groove turns into a rut. I'd love to revisit that, but only after this. Let's talk about how you took what you learned at Toyota, where everyone is a part of a community of scientists. Where did you apply that beyond just Toyota?
Steve Spear (00:19:30):
So if you go back to the original motivation for looking at Toyota in the first place, it was this anomalous ability to create far more value with far less effort than anybody else. On any given day, the effort in versus the value out, the ratios were just crazily good compared to what was typical in the industry. As I was completing my dissertation and coming to this conclusion about a community of scientists subjecting everything to this aggressive feedback to reveal faults in our thinking that became expressed as faults in our doing, both Clay and a doctor student... Not a medical student, but a doctor who was a student at the school, started raising my awareness to what was a growing frustration in the American healthcare delivery system.
Steve Spear (00:20:20):
When you consider the huge, huge, innate talent of the people in the field, the sophistication of the education and training they receive, the resources committed to the professional undertakings, basically we should all have wellness and health, the equivalent of having pet unicorns. It should be off the charts. And in fact, the system doesn't achieve anywhere near its theoretical potential. It's actually a source of often great disappointment in terms of access to care, cost of care, quality of care. So around 2000, when we started looking into healthcare, reports were coming out of the Institute of Medicine documenting the hundreds of thousands of Americans who died due to avoidable causes in hospitals each year.
Gene Kim (00:21:12):
Gene here. In The High-Velocity Edge, Steve writes about the risk of getting hurt or being killed while in the hospital. The probability of that happening is higher than for flying in a commercial airliner, taking a walk, riding a bike, parachuting, hang gliding, or even BASE jumping, where you jump off of fixed structures such as a building or a cellular base station with a parachute. These statistics are horrifying and show the dangers that exist for patients inside of the incredibly complex world of modern healthcare systems. Okay, back to Steve.
Steve Spear (00:21:46):
So anyway, Clay and this doctor student, this fellow John [inaudible 00:21:50] said, "Hey, the stuff you're learning at Toyota about converting everything into a feedback-generating experiment, would that work in healthcare?" And I said, "I don't really know, but perhaps we can run an experiment." Now it turned out that there was an opportunity to run an experiment at a small community hospital here in the Boston area, part of the larger Beth Israel Deaconess Medical Center network. We started working with the staff there on things like access to beds, medication administration, overburden on nurses. And wouldn't you know that if you started encouraging people to take for granted that they were entitled to an ideal experience of delivering care in this ideal fashion of defect-free, on demand, to match the needs of patients, et cetera, in a way that's safe for doctors and nurses and pharmacists and technicians, et cetera, that if they started with that and actually had some compromise in their ability to have this ideal experience, then they should call that out as a refutation of some foundational belief and investigate the source of the disappointment and take corrective action.
Steve Spear (00:23:06):
And we started to get some positive results in this very, very small care setting. Now, out of the coincidence of simultaneously working with folks at Alcoa and the tremendous civic-mindedness of Alcoa CEO Paul O'Neill, we got invited to present what we were learning about Toyota. And from this tiny little pilot here in the Boston area, we had a chance to present what we were learning so far to a group that was assembled by what was called the Pittsburgh Regional Healthcare Initiative. And I remember I went with this fellow, John [inaudible 00:23:41], and Kent Bowen down to Pittsburgh one day at the invitation of Paul O'Neill, and we presented to dozens of community leaders, both from the medical centers and elsewhere. We started explaining just the basic thinking we had discovered. That anything designed by human beings is going to have flaws, so best when you design it, design it in such a way that the flaws are quickly evident. And when the flaws make themselves evident, respect that as a bonafide signal of opportunity to get better.
Steve Spear (00:24:10):
And we just walked them through. And what do you know, as we walked through this explanation, in this room of dozens of people, two stood up and said, "We've tried just about everything else. And I guess there's no harm in trying this too." So they said, "Can you spend a little time with us to help us see how this applies in practice?" And amongst those two or three people who stood up that day, one was Gail Wolf, who was a chief nursing officer in the UPMC system. There's another guy, a cardiologist named Mark Schmidthoffer, who not only became a friend and a colleague, but a coauthor. And a third guy who stood up, whether it was in that room that day or later on, was a fellow named Rick Shannon, who's gone on to great things at University of Pennsylvania, University of Virginia, and now Duke. They invited us in to see if the basic thinking we had been absorbing from Toyota and testing out in a Boston [inaudible 00:25:08] if it would work on a large scale. And it did, but I'll pause there.
Gene Kim (00:25:13):
No, that's great Steve. In fact, those are some of my favorite parts of your book, High Velocity Edge. Before we depart this topic, I feel like you have to share what some of the findings were and walk through some of the amazing results that you were able to generate with the team.
Steve Spear (00:25:29):
So I'll tell you what, the healthcare experience, particularly in Pittsburgh, was a source of another really important insight as my work evolved. And so just to reset, we went to look at Toyota to try and understand the source of anomalously strong performance. And the conclusion we came out of that with was feedback. The more feedback you can generate in a situation, and the better you are at reacting to that feedback as a bonafide signal of opportunity, the better off you are. That ties into the comment about creating a community of scientists. So that was our explanatory theory for success. Working in healthcare gave us the opportunity to construct an explanatory theory for failure.
Steve Spear (00:26:21):
And what we had observed was clinicians, so highly motivated to act in ways that were in the best interest of their patients. And look, we're all in the midst of this whole COVID thing now, many of us, at least, are fortunate enough that we have a comfortable place to which we can retreat. And we can talk to our friends on video conferences like we're doing right now. But we're all impressed by the healthcare providers who haven't shied away from the situation. And they're putting themselves in a high risk, high hazard environment, more hours than they ever signed on for, to take care of other people.
Steve Spear (00:27:03):
So anyway, we're working in Pittsburgh, and it's just a constant reminder of the tremendous altruistic character of this workforce. And yet, they're struggling to get the whole to be anywhere near the sum of the parts. Consider the parts: their innate ability, the resources, et cetera.
Steve Spear (00:27:28):
And we started to ask the question, "Well, what's going on here which is so different from the Toyota environment where the whole was so much greater than the sum of the parts?" And what struck us, perhaps because of the aspiration to meet other people's needs right away, is that healthcare workers, when they encountered a problem, rather than taking a pause, reflecting on the expression of the problem and the cause of the problem, and taking time to construct a meaningful corrective action to prevent the problem from recurring, they worked around the problem.
Steve Spear (00:28:04):
And again, you have to think about the contrast, right? Because the motivation, the incentive to work around problems was so positive. There's a person who needs my attention right now, and I'm going to do whatever is required to deliver on what I thought my promise was to them. But the consequence of these workarounds, in a cumulative fashion, was so corrosive, because what it meant was that if someone encountered something that was difficult and worked around it, the difficulty was still in the system, but now invisible to somebody else. And once something else got worked around, it too was still in the system but invisible to somebody else.
Steve Spear (00:28:48):
And sometimes, those invisible hazards, trip wires, tripped somebody up. Or sometimes, they got worse and worse and worse, and individually caused great harm or somehow combined in just the right way to cause great harm. But it was this relentlessness of workarounds in order to maintain an operational tempo that really caught our attention, as both an explanation for the under-performance and a very striking, almost one-to-one contrast with this idea of designing work with tests built in to reveal error. And when it's revealed, deal with it.
Gene Kim (00:29:29):
There's so much there in terms of the corrosive, cumulative harm caused by daily workarounds. But let's put that in a box for a moment and let's go back to the notion of when a groove gets turned into a rut.
Gene Kim (00:29:45):
What Steve revealed about what goes wrong inside of healthcare systems is so interesting and compelling. I promise we'll get back to it later in this interview.
Gene Kim (00:29:54):
So, one of the things that I'm so grateful to you for is being able to have these working calls with you once or even twice a week to better understand some of these core concepts that have dominated your thinking for decades. The first concept that really caught my attention was the notion of a dominant architecture. I think this really verbalizes what Dr. Christensen's work was about: that dominant architectures do what they're designed for. Can you explain dominant architectures to us and why they're so important?
Steve Spear (00:30:25):
Oh, sure. So, if you think about the early stages of any enterprise, it typically starts with some number of people, maybe small, maybe large, recognizing that there's an opportunity. That there's a need in the marketplace, there's a social problem to be solved. Whatever it is, there's an opportunity to make something better for somebody else. What exactly it is that's wrong, it's not so clear. What exactly the nature of the solution is, that's also not clear. What exactly it is that that group of people have to do to create and deliver the solution, that's also not clear. So essentially, you don't know what the problem is. You don't know what the solution is, and even if you knew the first two, you don't know how to generate the solution. So, what that invites is a lot of experimentation, whether it's formal scientific method experimentation or perhaps it's a more trial and error experimentation.
Steve Spear (00:31:25):
But it invites testing ideas. Eventually, you hope that you start getting answers, right? You have a better articulation of what the need is or the problem is that has to be resolved. You have a better articulation of what the solution is that'll be successful in satisfying those needs. You'll have a better articulation of what work has to be done by whom, in what way to bring that solution into... or to create and deliver that solution. And when that happens, when you start converging on answers to those critical questions about what need, what solution, and what approach, you can start getting clarity in terms of roles, responsibilities, and relationships. So, when we get started on something Gene, I don't know what you do. You don't know what I do. We don't know what Anne does, but we each try something.
Steve Spear (00:32:14):
But eventually, things start to emerge and there's a role for Gene, a role for Steve, a role for Anne. And once we have those roles, things start to solidify, which is a good thing. Okay? Because what we're doing is we're reducing uncertainty, we're increasing clarity, we're giving direction. You show up on the job, you know what you're supposed to do and why. I know what I'm supposed to do and why, all of that. So let's just pause there. That's all on the positive side. The architecture, and what I mean by architecture, and here I'm drawing upon stuff I've talked about with Clay Christensen, Carliss Baldwin, Steve Wheelwright, Kim Clark, and others, is this idea that an organization, like a technical device, makes assignments.
Steve Spear (00:32:58):
In a technical device, it's function mapped onto part. In an organization, it's responsibility mapped onto person. That is, we start getting clarity about what we're supposed to be doing and how we're supposed to be doing it. An architecture for our relationship starts to emerge. And again, that's a good thing. This is a source of efficiency, of efficacy, of multiplicative impact, et cetera.
Gene Kim (00:33:23):
One of the examples that you gave me, to make this concrete, was how these dominant architectures are at first absent and how one is created. You gave me the example of the dawn of the automobile: as you're trying to design and manufacture one, it wasn't even clear what the dominant architecture should be. Can you give me the example of how vague... In fact, the example that you gave me was the steering column. Matter of fact, even before you can talk about the steering column, you have to first talk about the transmission of power to the wheels. Is the metaphor of steering going to be one of a tiller on a ship? Is it like a buggy, with the reins? Walk us through why the absence of a dominant architecture mattered so much for automobile design and manufacturing.
Steve Spear (00:34:08):
Yeah, yeah. So, I think it was Kim Clark's research on the topic that most influenced my thinking. He recounted exactly what you were saying, is that when people started thinking about taking motors or engines and using them as propulsive mechanisms for vehicles, there wasn't any kind of agreement. Was it going to be battery-powered? Steam-powered? Some kind of combustion? The nature of the source of the mechanical power, that wasn't even clear. The metaphor, is this going to be a boat on land? Or a version of a horse-drawn carriage?
Steve Spear (00:34:43):
And we know the horse one won, because we use horsepower and not sail power as our metric. But that also wasn't known. But eventually, what did happen is that people tried and tested ideas. Some worked better than others. The ones that worked less well got discarded. The ones that worked better, kind of in a Darwinian process of variation, selection, retention, got retained and then replicated. And what came out of that was the basic structure of a car which we're all familiar with, which is an internal combustion engine somewhere in the front or the back, a drive shaft connecting that to the wheels, a body around that, and an interior. Now, once that architecture started to emerge, you could get the clarity of role and responsibility that you and I were discussing.
Steve Spear (00:35:29):
That you can have styling do something which was not independent of, but dependent in a predictable fashion on, design, engineering, manufacturing engineering, production, et cetera, et cetera. So that's why, back before Henry Ford got his start, the productivity in making a car was minuscule, because you didn't have sort of an Adam Smith clarity on division of labor, roles, responsibilities, et cetera. And as this dominant architecture started to emerge, it allowed someone like Henry Ford to build a moving assembly line and, with that, the huge, huge increase in efficacy, efficiency, productivity, and output.
Gene Kim (00:36:11):
Oh, this is great. And so, that dominant architecture allowed for a division of labor, allowed teams to be able to work independently, or at least with known dependencies on each other.
Gene Kim (00:36:22):
So those are the great things about being able to build around or build within a dominant architecture. Let's talk about some of the downsides, when the dominant architecture actually conspires to prevent the achievement of objectives in daily work.
Steve Spear (00:36:36):
Right. So, as we were talking, the dominant architecture is a response to unanswered questions, in terms of what should we be doing and how should we be doing it? Once we start getting answers, we come up with... We've been calling it dominant architecture. Other people might call it routines, business processes, et cetera. But it's the same idea, which is roles, responsibilities, and relationships.
Steve Spear (00:37:04):
Now, the thing about that is that the whole reason we've gotten to these legacy business processes, these dominant architectures, is because we generated answers to questions. We generated solutions to problems. That raises the issue: what happens when the problems change? The question we answered in terms of what to do, the question we answered in terms of how to do it, what happens when those are no longer good answers, because the questions, the problems in society, have changed and we need different answers in terms of what to do and how to do it?
Steve Spear (00:37:38):
Well, if we try to depend on the dominant architecture for that, the dominant architecture will still want to generate the things it's always generated. It was built to generate those things. It's very good at generating those things. It's not too good at generating these other things, which we actually don't even yet understand what they are and how to achieve them.
Gene Kim (00:38:05):
In fact, in the healthcare example, you talked about the daily workarounds and the heroics, and people really trying to fight the system, go around the system. That in my mind really says, "This is what happens when the dominant architecture is impeding." What are they working around? They're really working around the dominant architecture. Is that a correct-
Steve Spear (00:38:24):
Yeah, Gene. I think that's a fantastic statement. If you think about the dominant architecture which guided how medical care and medical practices were established, and how much medical training still is delivered, it goes back to when medical science was really in its infancy, and we depended on the model of the doctor who was entitled to lay on hands. And for my doctor friends, I understand that is a profound privilege in their profession, that they can provide value to someone through the laying on of hands, both for the purpose of diagnosis and treatment.
Steve Spear (00:39:01):
And back in the day, and probably until at least the 1960s, maybe into the 70s, the notion was that there was the doctor who was the hub of care delivery. This was the doctor who could lay on hands and draw upon his or her expertise to assess, diagnose, and treat, and everything else was in support of that interaction between the patient and the doctor. Well, if that's your way of thinking, then it gets reflected in how you organize. So you organize a practice around a doctor or a few doctors, you organize hospitals around departments which are framed not in terms of diseases, processes, or procedures, but around medical specialties. A department of pulmonology, cardiology, et cetera, et cetera. And then you create social relationships within those same institutions which have as their model this once dominant architecture of the physician as the hub of value creation.
Steve Spear (00:39:59):
And you even train that way, right? Which is, as people go through, they become more and more specialized in their profession, with perhaps less and less understanding and context of what adjacent professions do. All right, well, here's sort of the curse from the blessing. In the last 30-some-odd years, medical science has advanced exponentially, and society's ability, its hypothetical ability or theoretical ability, to assess, diagnose, and treat has gone simply off the charts. The thing is, in order to take full advantage of the medical science we now have, the single individual physician can't be the sole point of contact. And I encourage you to think, even on your next visit to the doctor, if they ever come offline and you see them in person, think about the large number of people who will do something value-creating for you, even if it's just a routine exam.
Steve Spear (00:40:59):
And forget about if you're going in for something serious and something complicated. But if that's what the science wants, which is this highly integrated, interactive, complex system, while the dominant social architecture, the dominant organizational architecture, is still built around this physician as a hub, it's going to force people to work around, firefight, cope. Kind of work under the radar to execute on the roles and responsibilities and relationships that science wants, and that technology wants, but which the organization doesn't necessarily endorse.
Gene Kim (00:41:35):
Steve, so great. Notions, constructs like dominant architecture help one see something, at least in my case, that I didn't see before. And so, let's talk about the components of what makes up a dominant architecture. In your book, The High-Velocity Edge, there was actually one sentence that really caught my attention, which is the notion of structure and dynamics.
Gene Kim (00:41:57):
And I remember asking about this in a conference room in the Sloan building at MIT and being utterly dazzled by how simple it was, and yet how it could powerfully explain so much of our collective experience and reveal some surprising insights. So, can you describe briefly what structure and dynamics are and how it could possibly describe all the ways that organizations actually work?
Steve Spear (00:42:22):
Yeah. Connecting this question about structure and dynamics to architecture, dominant or otherwise, the structure of any system involves the inherent answers to a number of questions. First of all, what's the purpose of the system? Why does it exist? What value is it going to create? And then, in order to deliver on that purpose or promise, there have to be questions about assignment of responsibility, or assignment of function in a technical system, but assignment of responsibility in an enterprise. And it's not only assignment of responsibility, that you do A and Steve does B and Anne does C, but that with that responsibility, there are relationships in terms of dependencies. In doing A, you're doing A because B depends on A. And in doing B, I'm doing it dependent on you doing A, but also because Anne, in doing C, is dependent on me doing B.
Steve Spear (00:43:22):
All right, so we could restructure that. We could say that it's actually... Steve doing B is not dependent on Gene, he's dependent on Anne, right? Which is a different structure. Or Anne is dependent on Gene, and whatever else. So it's this mapping of roles, responsibilities, and relationships which gets us to the structure. The structure has a lot of determining influence on the dynamics, the way in which the system behaves, right off. So for example, we say, "C depends on B, which depends on A." Then the dynamics are going to be that A creates something, gives it to B, and B absorbs it, creates something else, and gives it to C. All right? There's that.
Steve Spear (00:44:06):
So, just to stay on the topic of structure, what is mapped to what, who is connected to whom is deliberately or inadvertently going to have a very profound effect on how a system behaves on its dynamics. Anyway, let's pause there and we can pick up, if you want.
Gene Kim (00:44:28):
I love that. I mean, essentially what you're saying is that structure is, very concretely, how we organize the teams and the interfaces between those teams. In the software world, it is the architecture we work within. And dynamics is almost everything else, something which leaders can influence through cultural norms and so forth. But what I find so dazzling about this is that so much of how the system behaves is really dominated by the structure that we create. Is that an overstatement?
Steve Spear (00:44:56):
Right. No, no. I don't think it's an overstatement at all. So, you mentioned teams, so I'll make reference to one of my favorite books which is Team of Teams by Stanley McChrystal and some of his colleagues. I highly recommend that book, and very early on in the book, it describes-
Gene Kim (00:45:15):
I just read that you're in the acknowledgements of the book, which I only learned like a year ago.
Steve Spear (00:45:18):
Oh, yeah, yeah. Well, yeah. I met that group and they said, "Oh yeah, we're working on writing a book," and I said, "Well, I've written a book. Maybe I could take a look at your manuscript." And so they were very generous, let me see the manuscript and, in a sense, add some comments, and as a thank-you for that, I got in the acknowledgements section. It's a terrific book. General McChrystal, Stan McChrystal, describes his experience taking over the Joint Special Operations Command, which was tasked with defeating Al Qaeda in Iraq. And Al Qaeda in Iraq was a bunch of nihilistic sociopaths. It was terrible what they were doing.
Steve Spear (00:45:58):
What he found was that when he went to the headquarters, he found all these really fantastically trained and incredibly motivated specialists, whether it was the Army Rangers who could descend out of helicopters in the middle of the night and raid a place, or the CIA analyst who could [inaudible 00:46:18] through electronic and all sorts of other media in English and Arabic and this and that, and tribal languages and what not, to make sense of what was in there. And then the Navy SEALs and other special operators who could act out in very, very remarkable ways, he said, "Here I had responsibility for all these fantastic people with such an important and ethical mission of stopping this nihilistic sociopathology." And he said, "And yet, we weren't winning."
Steve Spear (00:46:49):
And he started to try and figure out why that was, and he realized that the pieces weren't configured into a whole appropriate to the mission. So the Rangers would go out and do their raids, and they'd collect all sorts of potentially valuable intelligence in tote bags and whatnot. The CIA analysts wouldn't necessarily go through it in a fashion timed to when it arrived, and then the Navy SEALs would work off of information provided to them but, again, also out of sync.
Steve Spear (00:47:16):
And so what General McChrystal describes is trying to take all these individual teams and align them towards the common purpose of beating Al Qaeda in Iraq. So, working off of that mission, working backwards, saying, "Well, in order to beat Al Qaeda in Iraq, there are certain things that people on the end, the pointy end of the spear have to do." What do they need?
Steve Spear (00:47:38):
They need information to know where to go and when to go and why to go, et cetera, et cetera. Well, the people that give them the information, what do they need? Well, they need inputs. Well, who's going to go out and collect those inputs? And he starts creating processes across all these different, what we'd say in a commercial setting, functional silos, to get everyone aligned on meeting the immediate need of the downstream value creator, by making sure that every step along the way, from far upstream to far downstream, was highly aligned with the mission of the organization overall.
Gene Kim (00:48:15):
Right, and so to use the language of structure and dynamics, there was something profoundly unsuited in the before world, where there was almost an over-parochial nature to each one of the functional silos. In fact, I love that phrase from the book, that the squad was the boundary outside of which everyone else sucked. Right?
Steve Spear (00:48:36):
Right, right, right.
Gene Kim (00:48:36):
It was almost impossible for these teams to work together towards a common objective, and then came the transformation to one that was really focused on mission orientation. So in some ways, the structure was unchanged, right? The Navy SEALs still reported up through the Secretary of the Navy, right? And the Army Rangers still reported to the Secretary of the Army, and yet, as it gets toward the mission on the ground, they're working in a very different manner. Can you talk to that?
Steve Spear (00:48:59):
Yeah, so your reference is fantastic, and it makes me think that we identify heavily with people with whom we share common purpose and common effort, and if we don't have common purpose or we don't have common effort, our sense of identification is no longer of us. It's them, and General McChrystal describes exactly that, which is that the divisions weren't just Army and Navy. That even within the guys in the Navy in the SEALs community, that they would draw divisions between which class they were in. "Oh, you went to a summer class, I went to a winter class. Well, you guys had it easy because we had more hypothermia," or "If we were in the same class or the same season, then what boat were you in?" "Oh, well, you were on that boat. Everyone knows the truth about that boat."
Steve Spear (00:49:49):
I think the root cause of that was that whether we're defining by boat, by class, by service, we identify by what we have in common. And so a big part of what he was trying to do is come up with some kind of common unifying thread which had to do with mission and the essential role that each of us have in the achievement of that mission.
Gene Kim (00:50:16):
This episode is brought to you by IT Revolution, whose mission is to help technology leaders succeed and their organizations win in the marketplace through publishing and events. I'm so excited about the 13 books that IT Revolution has published to date, including The [inaudible 00:50:31] Project, DevOps Handbook, Accelerate, Project to Product and Making Work Visible. In 2020, the list of amazing and important titles continues to grow. This year, we've published one book with two more scheduled for later in the year.
Gene Kim (00:50:45):
Agile Conversations by Douglas Squirrel and Jeffrey Fredrick is a fabulous book on how leaders need to show up in order to get the outcomes we need and to bring everyone on board. It's not just to get buy-in, but to get everyone fully working together towards our common objectives. Later this year, Sooner Safer Happier by Jonathan Smart will be released. It's a surprisingly breathtaking look at modern leadership practices. Until I read it, I didn't realize how much of it was explicitly addressed in organizational learning and leadership books in the last 50 years. And The (Delicate) Art of Bureaucracy by Mark Schwartz, it's brilliant. Do I need to say more?
Gene Kim (00:51:23):
He talks about how bureaucracies are such a major part of what drives daily work and how it can be tamed and enlisted to support all of our most important endeavors. Learn more about these books at ITRevolution.com.
Gene Kim (00:51:41):
I think one of the most vivid examples, I think the story that he tells in the book was that they went from sighting to capture in less than 40 minutes, with the people on the front lines, enlisted people, intelligence agents being able to make all the decisions necessary to actually make the capture. Whereas in the old world, it would have taken weeks, potentially months to get to [crosstalk 00:52:02]-
Steve Spear (00:52:02):
Right, right, right.
Gene Kim (00:52:02):
... to actually get the plan approved, let alone executed.
Steve Spear (00:52:06):
Yeah, yeah. And when you go from sighting to capture and the time frame is anything but minutes, or maybe hours, there's no capture for that sighting in that operating environment.
Gene Kim (00:52:18):
So, [crosstalk 00:52:19]-
Steve Spear (00:52:18):
And Gene, don't we get the same thing even in the commercial world? Where our equivalent of sighting is the identification of a market need and our equivalent of capture is that we're able to marshal the resources, organize them in the right way, that way we can actually deliver a solution onto that need in a way that's mutually beneficial. And if we're slow either in the sighting or the capture, the commercial equivalent of that, we just say, "Oh, man. I wish. I wish I had seen faster, I wish I had acted faster." And those folks who are more agile and adroit, they do the commercial equivalent of sighting and capture and reap disproportionate rewards for doing so.
Gene Kim (00:53:02):
Absolutely. So, let's go back to the Team of Teams example. I think there's so many things that we can sort of reveal through the lens of structure and dynamics. So, we talked about some of the concrete things that they changed structurally. They were sort of matrixed into these mission teams, right? So instead of a parochial focus, they had one that was mission-focused. And let's talk about some of the dynamics. I remember one of them was General McChrystal, one of the things he would do is he would assemble these war rooms where everybody who was relevant to the problem was assembled in one place, as opposed to separate offices. He had the habit of putting conference calls on speakerphone so that everybody could hear.
Gene Kim (00:53:40):
What are other things that changed in the dynamics that we can see through the team of teams examples?
Steve Spear (00:53:48):
Yeah, so if you take your description about these common displays, common conference rooms, et cetera, I think what they were trying to achieve there is give everyone visualization of where their part fit into the whole. So that's one thing. But also, because of issues related to real time display, everyone on the conference call, et cetera, not only was there visualization of where hypothetically their piece fit into the whole, they could also see where their piece was ill-fitting. And that could be a trigger for, and a source of information about, what corrections needed to be made to get the pieces to better align.
Steve Spear (00:54:34):
And Gene, I think that starts getting into the dynamic piece, right? Which is, the structure is that the process flows A to B to C. And by saying it goes A to B to C, and not A to B to D back to C or whatnot, that establishes certain cycle times and lags, et cetera, et cetera. But then to add onto that, to layer on top of that this additional feedback dynamic, which is, all right, we've made a prediction about what our relationships should be, and if they are that, we'll achieve our mission. And then to provide the feedback to tell us we're wrong, that creates the opportunity to start the self-corrective, self-improving motions which, in the absence of feedback, wouldn't be possible.
Gene Kim (00:55:20):
Steve, I'm thinking about one example in the book. I mean, I think they were in so many situations where they were ready to start the raid, but they were missing one component. I think it was like a special type of helicopter or drone surveillance. And after having to scrub these missions, they were able to actually inject a change in the rules in terms of taskings so that would free up these particularly scarce resources. It seems like one example of one of these kinds of self-correcting elements of the story. Does that resonate with you?
Steve Spear (00:55:52):
Yeah, it sure does. Look, Gene, it's not peculiar to special operations commands in Iraq. We did work with DTE Energy, provider of electricity and gas in Detroit and large parts of Michigan. And they had field service crews who got their assignments, which was to go out and either do residential calls, business calls, safe-dig kind of things. These guys and gals would get out to their trucks ready to go, and the trucks weren't ready to leave. They were missing parts. They were missing supplies. They were missing tools. Perhaps the engine didn't start because, in a Michigan winter, the engine block heater hadn't operated properly. What they did is they started saying, "Well, something that seems as banal as whether the truck will start or not is having huge impact on the efficacy of our organization."
Steve Spear (00:56:42):
And so what they started doing is, "Well, we've got to change the dynamics. What we have to do is get out of a mindset where our field service workers have these problems and just sort of have to suck it up, and instead start getting into a situation where, at the first sign of difficulty, the plug isn't working for the engine block heater, or these supplies are missing, I can't locate this or that, we're going to have that realization of deficiency trigger an escalation that in turn will result in a corrective action."
Steve Spear (00:57:15):
And for what it was worth, the ratios were insane. So the folks working at some of these service stations, these were the places from which the trucks would originate in the morning, the crews would originate in the morning, from the time they got started at these service stations, within a few months they doubled and tripled the productivity of their crews without any added hours or overburden on the crews. They were able to do twice and three times as much work in support of the needs of the community as they had when they started. And again, it always was about shifting the dynamic from just accept and absorb problems to use problems as a trigger to make things better.
Gene Kim (00:57:58):
Awesome. And so there's another example, from sighting to capture, where how decisions were made was changed, the notion that the people working the problem could make decisions, that decision making was pushed as far down the organization as possible. Would you call that structure, or is that part of dynamics?
Steve Spear (00:58:20):
Yeah. So look, you start to think about why it is necessary in some organizations to have a situation which is local to you and to me, but we've got to go way, way up to get permission to deal with it. And I think that gets into a problem of visualizing the process in the enterprise. Let's start off with the positive. If you know your role and the relationships your role implies, and I know my role and the relationships my role implies, then both of us can have a pretty good conceptualization about what impact changing our behavior will have on the system. So that's one thing. And if you and I both have a very strong visualization of how our part fits into the whole, and consequently we have a very strong sense of how changing our behavior will or won't affect the environment around us, then that gives us an awful lot of latitude to act and also to censor ourselves. And there are times that we think we have to act, but we realize that our action will have implications way beyond where we want, perhaps second-order deleterious effects, and so we escalate.
Steve Spear (00:59:33):
Now let's flip this around. What if we're working in an organization where we lack a clear visualization of the structure? I don't know how my part fits into the whole. I just know I have a part, and I'm audited on the part and I'm checked on the part, but I don't know how it fits into the whole. If I have a problem, there's a lot of reason you don't want me to act in response to that problem, because I don't know the unintended consequences of my actions, because I don't know how my part fits into the whole. So I don't know how my altering my role, the delta in my role, will affect my relationships. So in that case, we have to escalate and escalate and escalate up through the bureaucracy till we find, eventually, someone who's got enough system perspective to say, "Steve, that's okay. You won't screw with Gene if you make that change."
Gene Kim (01:00:23):
And to concretely land this point, what was present in the structure in Team of Teams that allowed the frontline workers to be able to make all the decisions without having to escalate all the way up the chain?
Steve Spear (01:00:38):
Yeah. So I think it gets back to what you were describing and crediting General McChrystal and his colleagues with doing, which is they created these various platforms on which people could see how they fit into the larger whole, the large conference room, the displays, et cetera. And they could also see how what they were doing was somehow compromising the performance of the whole. And with that in mind, their role in this larger system and the consequences of their actions or inactions on the performance of that system in terms of achieving its mission, then that could give them latitude to do the local corrections that improved how well they fit into the whole thing. Yeah.
Gene Kim (01:01:23):
And so it occurs to me that one of the most concrete examples is the Toyota andon cord, probably the most famous tool in the Toyota production system. But let's talk about that in the context of the dynamics of an organization. Now, one of the things I love about the parsimonious nature of structure and dynamics is how much we can describe what we see, kind of in this almost absurdly mechanistic setup, right? We have structure, we have dynamics, we have things that emit signals. And then we have things that receive signals. And there are things that we can do in dynamics that allow signals to be amplified. And there are things that we can do that enable or guarantee that signals will be extinguished.
Gene Kim (01:02:06):
So our common friend, Dr. Sidney Dekker, who wrote Just Culture, Safety Differently, and many other books, is credited with saying that "a safety culture is one where it's safe to tell the boss bad news." So obviously, I think we can create a picture in our head of a dynamic where we encourage people to say that we're obviously ignorant and we don't have the tools we need to solve the problem at hand. And we can probably create an equally vivid picture of one where no one's going to speak up, because everyone knows that the person who raises their head will have it lopped off. Can you talk about that?
Steve Spear (01:02:40):
Yeah. So Gene, taking it to the absurd, it begs the question: if you're a leader, why would you want anything other than bad news, right? Because if you get good news, what does good news say? Good news says that the predictions you've made, in terms of the objectives for your enterprise, the roles and responsibilities, the methods, everything, are working. And essentially good news says there's nothing for you to do, Gene. The predictions you made are being validated in practice. And in fact, not only is there nothing for you to do, we don't want you to do anything, because you can only make the situation worse.
Steve Spear (01:03:17):
So in order for you to have value as a leader, you actually need something wrong, some bad news where you can engage in some collaborative, creative problem solving to make things better. So look, now we can acknowledge that human egos and human psychology being what they are, that if our days are spent getting nothing but bad news, we would be crushed by that burden. And so we have to give people good news because of ego gratification, self esteem, just positive feedback, et cetera. But really in terms of us as value creators, in order to change and be part of the incremental improvement, we need bad news, not good news.
Gene Kim (01:04:03):
So I think there were three examples that you say are paragons of building bad news, scientific thinking, and feedback into our standard work. One was the Toyota production process. I love the example you gave of the Apollo space program, and when I say Apollo space program, I really mean Mercury, Gemini, and Apollo. And then part of the work of Admiral Rickover. Can you talk about what they all have in common? What are the critical components of dynamic rapid learning?
Steve Spear (01:04:34):
Yeah. So in the case of NASA accepting on itself the challenge to send a man to the moon and return him safely to Earth by the end of the decade, the real articulation of Toyota's management system in the late '50s, and the creation of the culture within the Navy's naval reactors program, I think at each of these critical inflection points, leaders, and I suppose everyone else, looked at the challenge ahead of them and said, "We're just not capable." It's kind of like Wayne's World: "We're not worthy. We're not worthy." But really, that was the self-description: we're not capable. So [inaudible 01:05:16] quickly, Toyota in the 1950s made its first venture into the United States market with a car that was horrible. It was the Toyopet. It fell apart. It rusted. It underperformed. And Toyota was really inefficient in making an ineffective car.
Steve Spear (01:05:33):
And at that point the Toyota leadership had a choice of blaming the quality of the workforce, the quality of the material suppliers, the quality of the capital equipment, et cetera. That was an alternative. But the alternative they actually took was to say, "The reason we're inefficient at making an ineffective car is that we simply don't know how to harness the resources available to us well enough. And if we want to make an effective car, then what we have to do is see the deficiencies in our knowing and the deficiencies in our doing, and make steady progress from where we are to where we need to be." So that's the Toyota folks: we're just not capable, and the way we become capable is we learn by recognizing our deficiencies.
Gene Kim (01:06:17):
Again, not to focus too much on the tool, but the notion of the Andon Cord was that anyone on the front line would be thanked for exposing an ignorance, a deficiency, when trying to solve a problem that the dominant architecture or the current processes didn't foresee.
Steve Spear (01:06:33):
Right. That's right. I mean, basically the way the Andon Cord works is that you, Gene, have asked me, Steve, to do something I can't. And I'm calling it to your attention so that the deficiencies in my doing become a reflection of deficiencies in your thinking, your planning, your designing. And we're going to use the deficiency in my doing, the bad news, as a trigger to get together and try and improve on your thinking and my doing, our thinking, our doing.
Gene Kim (01:07:02):
And the way that's institutionalized amplifies signals within the system to self correct.
Steve Spear (01:07:09):
Right.
Gene Kim (01:07:09):
Let's go to the Apollo program. What examples there demonstrate the learning dynamic inside the Apollo space program?
Steve Spear (01:07:16):
Yeah. So if you think about President Kennedy's challenge in 1961, he said, "All right, what we want to do is launch a rocket from the Earth, have it head towards the moon. And when it gets to the moon, have some portion of that rocket descend to the moon, then ascend off the moon's surface. And then, again in some combination, bring those astronauts back to the Earth safely." And the folks at NASA said, "Hey, that sounds great. We have no idea how to do that, but it sounds like a really inspirational, exciting thing to do."
Steve Spear (01:07:45):
And so they organized the program built on the premise of not yet being capable. It never appears that they sat down and tried to think their way to the right answer. There was no sort of conclave of NASA geniuses going into a room and coming out and saying, "Oh, this is what Apollo 11 will look like. This is how we're going to build it. This is what we're going to ask Neil Armstrong, Buzz Aldrin, and Michael Collins to do. And if everyone follows our design intent, we'll get a man to the moon and return him safely to the Earth by the end of the decade."
Steve Spear (01:08:22):
Instead, what NASA did is they said, "Well, what's the first problem we have?" And it turned out it was launching a rocket that did not explode and destroy its payload. And they went through a lot of effort to solve for that problem, then to figure out, once it's launched, how do you get it to land? And it was only after solving that first layer of problems that they trusted themselves to ask Alan Shepard to prove out the solutions they had developed so far and take the first American manned flight into space. Again, it was about 15 minutes, a suborbital flight, got nowhere near the moon. But it was the validation of the things they had accomplished so far.
Steve Spear (01:09:02):
And then as you go through the Gemini Program-
Gene Kim (01:09:04):
Now Steve, if my memory's correct-
Right. So we're not even talking Apollo yet, right? This is not even Gemini yet. This is Mercury, the first of the two programs before the Apollo program.
Steve Spear (01:09:14):
Yeah. Early 1960s, ticker-tape parade, all because the guy went up and took a 15-minute flight. Then you start thinking about the distances, right? Because the moon is 250,000 miles away, and let's say Alan Shepard went 250 miles from the surface of the Earth. So he went one tenth of 1% of the way there, on a flight of about 15 minutes, whereas the Apollo missions ran beyond a week. So what he did, in terms of order of magnitude, was pretty close to zero. But you know what? The starting point was zero. And this was actually a big accomplishment.
Gene Kim (01:09:47):
Right. So march us through the challenges that Mercury, Gemini, and then Apollo worked through, to be able to achieve the mission in what looks like a single step.
Steve Spear (01:09:56):
Yeah. There were a lot of flights spread over several programs. We won't go into all the detail, but if you think about it: you've got Alan Shepard, first flight, goes up and comes down. John Glenn goes up, orbits a couple of times, comes back down. Then you go through Gus Grissom and some of the other astronauts. And then you finally get to Gordo Cooper, who takes a very, very long flight on Mercury and really gets to the limit of human performance, in terms of how long one person can travel in a space capsule before being incapacitated. Then you get to the Gemini program, which became the platform on which a lot was learned, not about long-distance travel, but long-endurance travel, where crews of two could go up. And because it was crews of two, they could orbit, not exactly indefinitely, but for longer periods, because one could work while the other rested, and then they could switch roles.
Steve Spear (01:10:44):
And once you had crews of two on longer durations, then you could do some other practice: flying in formation, rendezvous, spacewalks outside the capsule, on and on. And then that picks up into the Apollo program where, having built up a tremendous repository of knowledge about how to manage long-duration flights, you could start asking questions about long duration and long distance. And again, not to consume a lot of this with just a nerd version of inside baseball, but you see the same approach on the Apollo program. The reason it took until Apollo 11 to land on the moon is because eight, nine, and ten had the job of capturing a lot of the incremental learning that had to be pulled together so Neil Armstrong and Buzz Aldrin could actually land safely and then come back off the surface of the moon.
Gene Kim (01:11:36):
I think this is awesome. In fact, on one of our calls, you had this wonderful phrase: it took countless small steps to make what looked like one giant leap, right? It was this incremental learning and this dynamic of ruthless iteration that eventually got to Apollo 11.
Steve Spear (01:11:54):
Yeah. Neil Armstrong gave a very poetic line when he said, "That's one small step for man, one giant leap for mankind." It wasn't exactly a small step, but calling it one was fairly truthful. Because when you ask the average person who Neil Armstrong was, they say, "Oh, he was the first American to go to the moon." And it's true, he was the first one to step there, but he wasn't the first one there. The crew of Apollo 10 had actually gone all the way to the moon. They had orbited the moon, and they actually started descending to it. They got within 47,000 feet of the surface before they aborted their mission, an abort that was planned, not due to an emergency.
Steve Spear (01:12:33):
And the reason for that, that was Apollo 10, was that the crews of Apollo 8 and Apollo 9 had done a lot of the homework about how you get to the moon and a lot of the piloting necessary to descend successfully. So when Neil Armstrong said, "Hey, me and Buzz, we took a small step," it wasn't exactly a small step, but it wasn't a giant leap either. What they accomplished was the last 47,000 feet of descent, a little bit of walking around, and then 47,000 feet of ascent back to their rendezvous. Everything up to that 47,000 feet on the way to the moon, and everything from the 47,000 feet back, had been done on Apollo 10 and previous missions.
Gene Kim (01:13:15):
That's so interesting. So that was really the increment of learning between Apollo 10 and Apollo 11.
Steve Spear (01:13:26):
We were joking about Alan Shepard getting one tenth of 1% of the entire mission accomplished. And you could sort of argue that the accrual of Apollo 11 was also about one tenth of 1%, in terms of what it added to what was already known. And again, I'm saying this in a very provocative way, to get attention and hopefully generate some humor. But let's be realistic: the heroism, the bravery, the capability of these people is just incomprehensible to me. I'm joking only to make the point.
Gene Kim (01:14:02):
One of my learnings is that the people who get all the credit are the ones doing the one tenth of 1% at either extreme.
Steve Spear (01:14:08):
Yeah. Yeah. I guess that's our aspiration, Gene. We want to be one tenth of 1%.
Gene Kim (01:14:14):
Awesome. So before we revisit the space program, let's talk about the concrete, observable learning dynamics within the US Navy's naval reactors program. Tell us about it.
Steve Spear (01:14:24):
Yes. So the background on that is that coming out of World War II, a lot of people were enticed by the potential of using fission not in a destructive fashion, but in a productive fashion, to create energy. The US tasked a guy named Hyman Rickover with leading the charge on figuring out how to do this in a meaningful fashion. And it's funny, because from anecdotes about Rickover, and if you see video of him, he comes across as a very aggressive, even condescending personality. But I think beneath that superficial layer was someone who had a fair amount of positive humility. Because when he was tasked with the challenge of building the program, his starting point was, "I have no idea what the answer is. And the only way to reach the answer is, as aggressively and as energetically as possible, to find out what we don't know, so that we can use that to orient ourselves towards where we should invest time and effort to figure out something new."
Steve Spear (01:15:34):
And in chapter five of my book, I give various examples from very, very early on in the program, where Rickover realized that his challenge as program leader wasn't the science and technology, or necessarily the organization. It was creating an organizational dynamic in which it was not only safe, but actually required, for people to reveal things they discovered, things they didn't understand, as a trigger to inquire further.
Gene Kim (01:16:04):
Let's go through a couple of them. The ones that come to mind are just how extensively he would model how important it is to be able to say, "I don't know." And you gave an example of how he presented himself in a class full of other-
Steve Spear (01:16:18):
Right. So this story goes back to literally the first days of the program, where Rickover took what must've then been a small team to Oak Ridge, Tennessee, to learn what had been discovered by the Manhattan Project people. Rickover at the time was a senior officer, a captain, an O-6, in the Navy, but wasn't yet an admiral, wasn't a flag officer. And it's time for the first presentation, and Rickover made a point of being first in the class, center seat, front row. As soon as the lecture started, his hand goes up to interrupt. And when the instructor, with deference to Rickover's rank and age and seniority and whatnot, asked him if he had a question, Rickover says, "Yeah. What the hell are you talking about? I'm having a really hard time tracking you." So the instructor slows down, backs up, tries to re-explain it. And then minutes later, Rickover's hand goes up yet again. The guy says, "Oh, do you have another question?" And Rickover says, "No, it's the same damn question. I have no idea what you're talking about."
Steve Spear (01:17:13):
And this goes on for several cycles, according to accounts in the memoir of one of his colleagues. Finally the instructor loses any sense of deference towards rank and seniority and age and goes, "It's clear this is over your head. Perhaps you would like a tutorial this evening." To which Rickover says, "Yeah, name the time and place." Anyway, the way this account goes in the book, The Rickover Effect by a guy named Ted Rockwell, Rickover and his team go back to the room assigned for the tutorial. They get there punctually and discover that every seat is already taken. And the reflection this guy Rockwell has is that these other people, who had dutifully been taking notes and nodding their heads in signs of acknowledgment that they were tracking what was going on, also hadn't followed the lecture in any meaningful fashion. They had just been too ashamed to raise their hands and draw attention to themselves.
Steve Spear (01:18:11):
It's kind of interesting how Rockwell explains this, because he does it twice in his book. The first time he says, "Wow, at first I thought here was the captain, and later the admiral, so humble and self-secure that he was willing to raise his hand and admit he didn't know. And man, that was really a good example for the rest of us." And later on in The Rickover Effect, Rockwell reflects further. He says, "Come to think of it, there was no way that Rickover was stumped by what was being discussed. He was the smartest guy in the room, and plus, he was obsessive-compulsive, and he probably had reviewed all the literature and material available about the topic. There's no doubt that he understood the topic better than the instructor himself." So Rockwell starts reflecting in the book about why Rickover would have been raising his hand and interrupting.
Steve Spear (01:19:05):
And he comes to the conclusion that Rickover realized that he was now tasked with something for which there was nowhere near a dominant architecture, no architecture at all. This wasn't a case of managing a program where people come in and just demonstrate their ability to execute already-established routines. Everything had to be invented. And the obstacle to invention was the unwillingness to admit ignorance. So this guy Rockwell starts reasoning through what was going through Rickover's head. And he said, "Yeah, he was tasked with an important undertaking: build a fission-based reactor." And that meant the invention of new science and new technology and new policies, procedures, training, contracting, et cetera. But he said when Rickover was really thinking about the most fundamental obstacle to success, it was people's unwillingness and hesitation to admit they didn't know. And so, day one, he took the opportunity to say, "This is how we run the program. If you don't know the answer, or you're having a problem, you raise your hand and draw attention to the fact, so that we can get together and come up with a better solution than the one we have."
Gene Kim (01:20:18):
I love it. And so, to use the language of structure and dynamics, by doing that, Rickover's constant modeling of the behaviors he wanted caused signals to be amplified to a degree they never would have been otherwise.
Steve Spear (01:20:35):
That's right. That's right.
Gene Kim (01:20:37):
And there's another example. I think the example you gave was reporting when there's too much radiation, but he also wanted to know if there was too little radiation. We don't care about just one side of the variance curve, right?
Steve Spear (01:20:50):
Right. Yeah. So this is another anecdote that's in this book, The Rickover Effect, by this engineer, Ted Rockwell. And Rockwell describes an experience where he's trying to internalize Rickover's insistence that we make a strong declaration of what we believe to be true, in large part to see where we're wrong. And Rockwell is describing the first set of tests on the shielding that would go around the reactor to protect the rest of the ship and the crew and all of that.
Steve Spear (01:21:23):
And he describes how his engineering team is so excited, because they'd worked so hard on the design and the construction of this prototype that was going to be tested, and they're just ready to push the button and get this thing to run. Laced around it are all sorts of sensors to measure emissions and escapes and that sort of thing. He too is excited, and he's about to authorize and say, "All right, let's run the thing," when it's, "Wait, wait, wait, wait a second. Why did we design the shield?"
Steve Spear (01:21:52):
And the answer back is, "Because we've been told what acceptable emissions are, and we've designed the shielding to keep emissions below that." He says, "All right. Take that acceptable level," and I don't remember what it actually was, but let's say it's 10. He says, "How do we react to an 11?" "Oh, well, everyone understands that. If the goal was 10, and the acceptable limit is 10, then 11 is bad news."
Steve Spear (01:22:15):
He said, "Hey, all right, got that." He says, "Well, if we've got sensors and we've designed to 10, how do we react to a nine?" Now, think about how most of us would react to a nine: nine is better than 10, so we would celebrate.
Steve Spear (01:22:31):
Look at that. We're so smart that not only did we accomplish 10, we have secret smart. Not only the known smart, which got us to 10, but the secret smart, which gave us that turbocharge push to nine. And Rockwell challenged back, and he said, "Well, did any of us design for a nine?" "No, no, no. We designed for a 10." And he said, "So what does nine actually tell us?"
Steve Spear (01:22:56):
And where he got the team to concede was that nine wasn't a sign of super smart or turbo smart or secret smart, that a nine was an indication that they had gotten something wrong, that there was something they didn't understand about the reaction itself, the way the materials fissioned, the materials that surrounded it, et cetera, et cetera. There was something they simply didn't understand. Maybe the emission actually was nine, but it was designed to be 10.
Steve Spear (01:23:24):
Maybe there was something about the sensor. There was something wrong that the sensor read a nine and not a 10, and this was in part what Rockwell uses as an example to illustrate this discipline of engineering, that every action is taken based on belief. And if we want to get better and better and quicker and quicker, what we have to do is make sure that we articulate what we believe to be true, that this type of design will achieve a 10, so that if we're wrong, whether it's an 11 or a nine, we know where we were wrong and where we need to inquire further.
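Steve's account of the nine-versus-ten exchange can be sketched in a few lines of code: declare the predicted value up front, then treat a deviation in either direction, not just a reading above the limit, as a gap between belief and reality that demands investigation. This is an illustrative sketch only; the function name, the numbers, and the tolerance are all invented for the example, not taken from the actual program.

```python
# A sketch of the two-sided discipline Rockwell describes: we declare what
# we believe the reading should be, then treat ANY deviation, high or low,
# as a sign that something in our understanding is wrong.
# The values and tolerance here are illustrative, not historical.

def check_reading(predicted: float, measured: float, tolerance: float = 0.5) -> str:
    """Compare a measurement against a declared prediction."""
    if measured > predicted + tolerance:
        # Worse than designed: obvious bad news.
        return "investigate: emissions above design"
    if measured < predicted - tolerance:
        # Better than designed is NOT "secret smart": something in our model
        # of the reaction, the materials, or the sensors is wrong.
        return "investigate: emissions below design"
    return "prediction validated"

print(check_reading(predicted=10, measured=11))  # investigate: emissions above design
print(check_reading(predicted=10, measured=9))   # investigate: emissions below design
print(check_reading(predicted=10, measured=10))  # prediction validated
```

The point of the sketch is the second branch: a nine against a declared ten is a trigger to inquire further, not a cause for celebration.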
Gene Kim (01:23:59):
Gene here. I want to take a moment to reflect on what Steve is saying and its implications for coding, and for things like blameless postmortems. In the context of coding, a lot of us are familiar with assertions and things like test-driven development. Assertion statements cause a program to fail when things we assert aren't true, such as a certain variable being greater than zero or a string being non-empty. Similarly, in test-driven development, we write the tests before we write the code.
Gene Kim (01:24:28):
So we start with a failing test, and we continue to iterate and refactor until the tests finally pass. Both involve automating tests of what we think reality looks like, and both give us rapid and very visible feedback when our beliefs aren't actually true. I find assertion statements more astonishing the more I think about them. What they seem to suggest is that it's better to have the program fail right away than to continue on while the difference between reality as imagined and reality as it actually exists keeps getting bigger.
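The two practices named above can be shown in a few lines of Python. This is a minimal, hypothetical example, not code from any real system: an assertion that makes a program fail the moment a belief is violated, and a TDD-style test written as a declaration of expected behavior.

```python
# An assertion states a belief and halts the program the moment the belief
# is violated, instead of letting imagined and actual reality drift apart.
def average(values):
    assert len(values) > 0, "belief violated: expected a non-empty list"
    return sum(values) / len(values)

# Test-driven development states the expected behavior first; the code is
# then iterated on until these declared beliefs are validated.
def test_average():
    assert average([2, 4, 6]) == 4  # the prediction we commit to up front
    try:
        average([])                 # we also predict how failure surfaces
    except AssertionError:
        pass                        # bad news arrived immediately, as designed
    else:
        raise AssertionError("empty input should have failed fast")

test_average()
print("all declared beliefs validated")
```

The assertion on the empty list is the Andon Cord in miniature: the program pulls the cord at the first sign that its designer's thinking was wrong, rather than carrying a corrupted assumption forward.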
Gene Kim (01:24:59):
To use Spear's language, we want to constantly test our assumptions early, quickly, and frequently, where we have the strongest link between cause and effect and the ability to deflect bigger problems. The work that John [Allspaw 01:25:11] has been doing on studying how organizations respond to incidents and outages is equally fascinating to me. He focuses on studying outages and incidents because they matter: they disrupt business operations.
Gene Kim (01:25:26):
But what dazzles me about John's work is his belief that by understanding how organizations handle incidents, we get incredible clues about how organizations learn. He asks senior leaders: How often are teams reading the post-incident reviews? How often are those post-incident reviews being read by people outside the team? How often are these reports being referenced in code fixes? All these questions help us understand to what extent we are changing how daily work is performed by everyone in the organization.
Gene Kim (01:25:56):
So whether we're talking about things in code or things in our organization, the reason is the same: the more rapidly we learn, the more decisively we can out-learn our competition. Back to the interview. I don't think we can leave this topic without first demonstrating to what extent the Rickover program achieved its goals. You have a wonderful comparison between the US naval reactors program and its adversary for much of that time, the Soviet Navy. Tell us about that.
Steve Spear (01:26:27):
Look, the consequences are fantastic. The US was, quote unquote, first to market with an atomic-powered submarine, the Nautilus, underway on nuclear power in 1955, compared to the Soviets, who lagged. Now, the Soviets were recovering from the Second World War, et cetera. But over the following decades, the Soviet Union threw tremendous resources into its own submarine community, its own submarine program.
Steve Spear (01:26:53):
And maybe they got a late start, but they certainly had an opportunity to catch up. What we know about the Soviet experience with nuclear power onboard submarines is that they repeatedly, regularly had disasters, every year or two having a failure on board that cost the crew its lives, cost the Navy its ship, and caused the environment all sorts of degradation. Compare that with this Rickover effect, this discipline of engineering, this constant, aggressive pursuit of seeing what we don't know so we can get smarter and better.
Steve Spear (01:27:26):
The US Navy's experience with nuclear power since 1955 has been essentially perfect: no reactor failure that's caused environmental damage, no reactor failure that's caused human harm. And it's hundreds of ships since the Nautilus, with crews that could never have been influenced directly by Admiral Rickover, many born after his passing, and they still continue to rack up a perfect record.
Gene Kim (01:27:53):
This is great. So we've talked about these exemplars: Toyota, the US spaceflight program through Mercury, Gemini, and Apollo, and the US naval reactors program. [inaudible 01:28:05] set up a structure and dynamic so that weak signals are amplified. Let's talk about one example where these weak signals got suppressed.
Gene Kim (01:28:13):
And you and I were talking about a Harvard Business Review article by Dr. Amy Edmondson, Dr. Michael Roberto, and Dr. Richard Bohmer about what they called two types of cultures: an experimental culture, which we think was embodied by Mercury, Gemini, and Apollo; versus a standardized culture, which is probably best embodied by the US space shuttle program, where so many weak signals were suppressed by the system, leading to a horrendous loss of life and tragic outcomes. Can you talk about the conditions that lead to signals being suppressed, maybe in the context of the space shuttle program?
Steve Spear (01:28:56):
Yeah, so I think credit goes to NASA for both programs. There's the incredible inventiveness of the scientists and engineers, and the incredible bravery of the people who agreed to sit atop these rockets and be slingshotted into outer space. It's insane. I mean, many of us dream of being astronauts, but given the chance, would we actually accept the invitation to sit on top of one of those things?
Steve Spear (01:29:24):
I think most of us, even given the opportunity, would shy away. My impression of the Apollo program is that it accepted a challenge of sending a man to the moon and having him return safely to the earth for which there was no answer. And this was true of Admiral Rickover also and I think the senior leaders at Toyota back in the '50s: They had a problem for which no answer existed.
Steve Spear (01:29:49):
And so when they got into the business of defining their own personal objectives and what it meant to have accomplishment, accomplishment had to be discovery and lessons learned, because they were starting from a point at which there was no meaningful answer, and they were being asked to come up with a solution, one that actually worked well in practice. And so the folks at NASA in the Mercury, Gemini, and Apollo era had to justify what they were trying to do ...
Steve Spear (01:30:17):
Yes, end of decade, there was a standard: man to the moon and back safely. But when they had to explain budgets, I'm willing to bet ... and I'll bet that if you go back and look at congressional testimony and whatever else and memos and whatnot ... the justification is we need time and money to learn something which currently we don't know. All right. Now, let's move that forward.
Steve Spear (01:30:42):
You get to the space shuttle, which by its very title is an operational technology. It's not an experimental technology. It's not the space shuttle X: X-1, X-2, X-15, X-33, X-35. It's a shuttle, and a shuttle implies that the measure of success is that it shuttles things back and forth. And once you start framing things as being a shuttle, and setting standards of accomplishment and success by how many loads you shuttled back and forth, it creates these pressures for operational tempo, and [crosstalk 01:31:22].
Gene Kim (01:31:22):
In that paper, it says NASA was promoting the space shuttle to Congress as a cheap and reusable spacecraft. Frequent, on time: these were the words, the values, of the space shuttle program, to your point.
Steve Spear (01:31:33):
Right. And look, Gene, it would require very deep research, which I've not done, nor read, to understand why they felt compelled to define a manned spacecraft that way. Because we're back in an era when, if you want to send something into space that doesn't involve a human, you don't send a human; you send a rocket and a robot, launch a satellite, launch an experiment, whatever else it is.
Steve Spear (01:31:58):
You don't send a person. Why NASA felt that manned flight was important, and maybe there was some sort of agenda there, understanding how people behave in space so we could go back to the moon, long-distance travel, whatever it is, they felt they had to justify themselves in terms of this low-cost payload launching.
Gene Kim (01:32:16):
And in fact, it's a telling contrast with the early space program. They would send test pilots, who were basically signing up for high-risk endeavors, whereas in the space shuttle program, we sent up teachers. It just shows what we believed the risk of sending humans into space to be.
Steve Spear (01:32:32):
That's right. And all of that stuff, whether you call it a shuttle, or explain its purpose as low-cost, high-frequency delivery of payloads into space, or you put teachers, and once-upon-a-time astronauts who are now old senators, on board, what you're saying is, "We've got this sucker figured out." And once you have something figured out, your deliverable is no longer discovery; your deliverable is repetition.
Gene Kim (01:33:00):
And so connect the dots about how that leads to suppression of weak signals. For example, how we dealt with foam strikes upon launch, which led to these tragic events. I just mentioned foam strikes. This refers to the 2003 loss of the space shuttle Columbia.
Gene Kim (01:33:18):
In the HBR article written by Dr. Edmondson and colleagues, the authors describe how NASA managers spent some two weeks downplaying the seriousness of a piece of foam that broke off at launch and struck the shuttle's left wing, which had fatal consequences 16 days later when the Columbia disintegrated as it reentered the Earth's atmosphere. The article describes the two years the authors spent studying the incident, and the culture in which these weak signals were lost in the system, ultimately preventing fruitful action that could have saved the shuttle.
Steve Spear (01:33:50):
If your goal is discovering things that allow you to get to the moon and back safely, and your starting point is the self-admittance that you don't know how to do that, then your daily deliverable is something learned. And you can imagine in the early days of NASA that, yeah, there was a lot of talk about, "We built this. We ran this. We tested that," but it was all in service to the thing we learned. When you start shifting your priorities to the physicality of what got moved, what got made, then signals that say, "Hey, Gene, we're having a problem," run up against the pressure to say, "Yeah, yeah, but you have to launch," or, "You have to ship," or, "You have to make. You have to deliver."
Steve Spear (01:34:36):
And we see this suppression of weak signals in so many settings where there's this increased dominance of operational tempo as the requirement, whether it's the operational tempo to see patients and see patients and see patients, and consequently work around problems that actually impede the seeing of patients; or it's the operational tempo to launch space shuttles, to maintain some very rough approximation of what was originally promised in terms of cheap and frequent; or it's the pressures of operational tempo to deliver new models of things into the marketplace even though we haven't quite worked out all the bugs. All that pressure to adhere to the operational tempo allows us to be more and more accepting of things which once were not acceptable.
Gene Kim (01:35:29):
And in fact, that's the normalization of deviance: the foam strike is okay because it's happened before. And so therefore, that validates the notion that it can be safely ignored.
Steve Spear (01:35:40):
To be even more charitable, I think Diane Vaughan popularized that term, normalization of deviance. Well, it wasn't designed to do that, but it did it, and no ill consequence. Now, it was designed not to do that, it wasn't designed to do that, but it did it again. It's like, "Oh, I guess it must be okay." And I don't think Dr. Vaughan meant it in such a sloppy fashion.
Steve Spear (01:36:02):
I think what it was was that you have to think about the other side, the offsetting pressure to maintain an operational tempo. We didn't design it to do that; it went wrong, and we know it's wrong, but we have this operational tempo to which we have to adhere, and now we have a dilemma. Either we can come back and examine and study and investigate and experiment around this thing which went wrong, or we can keep to the operational tempo, but we can't do both.
Steve Spear (01:36:39):
And once we're in the face of that dilemma, then it takes a tremendous amount of self-discipline to say, "No, no. Of the two alternatives, I'm going to pick the one where discovery is the output and not parts made or launches completed."
Gene Kim (01:36:58):
Let's end with one last very relevant contemporary example, which is around the COVID-19 pandemic. You have another example of a system where it wasn't safe to tell bad news, and you talked about the setup of the surveillance network post-SARS where China, at one point in time, had a phenomenal network to be able to detect outbreaks. And yet, that sensor surveillance network went silent. Can you talk about some of your speculations about that and why it's important that we create a system where it's safe to tell bad news?
Steve Spear (01:37:34):
So I think this issue of operational tempo as a phrase that captures the pressure to keep going comes in. So what I understand ... and again, let's put my expertise in some context here. It's based upon reading one article which appeared in the New York Times about a week or two ago, so I'm not exactly the world's authority on this ... but according to the New York Times, so we're getting closer to the world authority on this, they said that after the SARS pandemic or epidemic that China didn't want to get caught flat-footed again by a highly infectious disease.
Steve Spear (01:38:13):
And so they created the infrastructure that if there were starting to be growing hotspots of infection locally, news of that could get elevated quickly towards central authorities so they can start making decisions. For example, are you going to quarantine, stop transportation, provide resources, et cetera, et cetera? And it makes perfect sense, which is as we understand from the exponential way in which disease propagates, seeing something early and containing it is of huge, huge, gigantic benefit.
Steve Spear (01:38:46):
Well, all good in theory, but then you have the local officials. Now, a local official is tasked with what? He's not tasked with discovering that things are going wrong. The whole reason you have local officials is to make sure that things are going right. The buses run on time; the trash is picked up; people are getting to school; no one is going hungry; people are staying healthy and well.
Steve Spear (01:39:11):
According to the article, what happened was as healthcare professionals started recognizing the presence of this highly infectious, potentially dangerous disease and started reporting it, the local officials through whom they had to report were like, "Nah, nah, nah, it must be an outlier. Nah, nah, it's an aberration. Nah, nah, that's the exception to the rule. Everything else is working fine."
Steve Spear (01:39:37):
And look, the cynical answer would be, "That's heartless. That's heartless bureaucrats who are just trying to protect their own turf and protect their own reputations." Yes and no. Those quote unquote heartless bureaucrats were exactly the same people who were responsible for making sure things were working well. And they probably wanted things to work well, and they probably were ill-equipped by both temperament and position and authority and whatever else to respond to things not working well. And Gene, this is just thinking out loud, but it does make us wonder if news of COVID got suppressed at the local official level because it ran counter to the pressure-
Gene Kim (01:40:23):
The smooth operation of daily life.
Steve Spear (01:40:26):
That's right. That's right. Here in the US, where we're a month or two or three behind China, our reporting mechanisms go through similar channels, which would be forced into a very difficult dilemma to report problems when in fact their job is, more often than not, to deliver good news. So it's something to worry about.
Gene Kim (01:40:47):
Well, Steve, I cannot tell you how much I appreciate every interaction that we've had, how much I've learned from you, and how much I value all these working sessions that we have. And I'm looking forward to being able to share more of what I've learned and what we're working on together in the weeks and months to come.
Gene Kim (01:41:03):
So thank you so much for the time today, and I'm so excited that you were able to share so many of your unique perspectives with not just me but everyone else. So can you tell everyone how to reach you?
Steve Spear (01:41:16):
Oh, thank you. Well, first, Gene, before I do that, let me say thank you. These conversations have been very much to my benefit, and I'm delighted they're to your benefit, and you're welcome for that, but thank you for the fact that they're to my benefit. Back to a basic value you and I both espouse, and I hope we live to some extent, is that you don't know what you're wrong about, and you don't know what you have to fix and improve, until you actually make a declaration about what you think is right and then find out that in fact you're not.
Steve Spear (01:41:46):
And the generosity you've had of time and the enthusiasm you've had of interest in talking about such topics has given me an opportunity to ... Well, it's been forcing me to say what I think is true and created the opportunity to find out that it's not and then cycle back and make it at least truer, if not truly true, so thank you for that. In terms of reaching me, a couple of ways. If people want to find me, LinkedIn, Steve Spear, I'm there somewhere.
Steve Spear (01:42:13):
Email, [email protected], short for High Velocity Edge llc.com. And one last thing I just want to plug, since you've given me the opportunity with the several mentions of my book, is that one of the things we've tried to do is give opportunity for people who have problems to call them out to those who need to know, by solving for the problem of workforces that are distributed and experts that are centralized and remote. The theme of this work has been that in order to learn, and learn at high velocity, you have to see problems in order to solve them. We created a product, See to Solve, and we have a website, www.seetosolve.com, so visit us there and see what we've got.
Gene Kim (01:42:58):
Awesome. Thank you so much, Steve, and to be continued.
Steve Spear (01:43:02):
Yes, sir. Thank you.
Gene Kim (01:43:05):
Thank you so much for listening to today's episode. Up next will be Dr. Steven Spear's DevOps Enterprise Summit presentations, both from 2019 and 2020, where he talks about the need to create a rapid learning dynamic as well as how to create them. The 2019 presentation talks about many of the case studies we talked about today but in more detail.
Gene Kim (01:43:26):
And in 2020, he talks about one of the most remarkable and historic examples of creating a dynamic learning organization at scale, which was in the US Navy at the end of the 19th century at the confluence of two unprecedented changes. One was in the underlying technologies found in ships, and the other was in the strategic mission that they were in service of. As usual, I'll add my reflections and reactions to those presentations. If you enjoyed today's interview of Steve, I know you'll enjoy both of those presentations as well.