Gene Kim (00:00:00):
Welcome to another episode of The Idealcast with Gene Kim. This episode is made possible with support from LaunchDarkly. Get total control of your code to ship quicker, reduce risk, and reclaim your nights and weekends. Head to launchdarkly.com to learn more. You're listening to The Idealcast with Gene Kim, brought to you by IT Revolution. In the last episode, I interviewed Dr. Ron Westrum, Professor Emeritus of Sociology at Eastern Michigan University. He developed the Westrum organizational typology model, which I suspect almost everyone in the DevOps community already knows about, because that model featured so prominently in the State of DevOps research, which I had the privilege of working on for six years with Dr. Nicole Forsgren and Jez Humble.
Gene Kim (00:00:57):
In this model, he asserts there is a continuum of safety cultures that fall into three general categories: pathological, bureaucratic, and generative. Pathological organizations are characterized by large amounts of fear and threat. People often hoard information, or withhold it for political reasons, or distort it to make themselves look better. In bureaucratic organizations, they protect departments. Those in the department want to maintain their turf, insist on their own rules, and generally do things by the book, their book. And generative organizations focus on the mission. How do we accomplish our goal? Everything is subordinated to good performance. If you haven't listened to that first interview, I recommend that you listen to it first, because this is a continuation of that first interview. And by the way, I've listened to that interview numerous times, because I get something new out of it each time. I guess I'm hoping that by taking good notes and by repetition, I can internalize what he talks about so eloquently.
Gene Kim (00:01:57):
Okay. In the second interview, I get to explore with him so many topics, expounding upon the topics we discussed last time, such as why he thinks creating generative cultures is more important now than it was, say, 100 years ago, his observations on the ever-increasing number of functional specialties and just how long this has been going on, the challenges that arise from having matrix organizations, and tools to overcome those challenges. And we'll learn more about the book he's working on, which is all about information flow within organizations. As usual, I'll break in with some observations, and this time, I'll also share some additional references that didn't make it into the last interview, as well as some corrections.
Gene Kim (00:02:39):
Okay, let's start the second interview. I had told Dr. Westrum just how much I loved, even adored, his book, Sidewinder: Creative Missile Development at China Lake. I was telling him that when I was in elementary and junior high school, I used to pore over books on airplanes, fighter jets, and missiles. I loved reading The Hunt for Red October. And so, it was so amazing to read his book, to learn about these famous missile programs described in such detail, and all the heroic efforts required to create them, and even the failed programs that never even made it into production. So many topics in that book will seem very familiar to anyone doing software development. So, let's jump into that question right now.
Gene Kim (00:03:24):
First off, Dr. Westrum, I know that in the last conversation that we had, I told you about just how much I loved that book, I adored it even, just because of how it really brought to life so many of these programs that I had read about in a very cursory way. One of the things that I especially loved was the fact that you contrasted the outcomes of the wildly successful Sidewinder project versus the first Sparrow missile program and the failed Falcon missile program. And I think what resonated so deeply with me was that this Sidewinder program from 60-plus years ago validated so many of the key principles of Agile and DevOps, which, in the case of Agile, go back over 20 years. I thought that the Sidewinder book showed how some of those underlying principles are required to create these outcomes, and it predates so much of what we think of as modern software development.
Gene Kim (00:04:15):
So, you spoke last time about the importance of generative cultures and the royal roads to where it leads. It seems to me that now more than ever generative cultures are needed in almost every domain, so can you speak to why this is the case? Why does it seem like generative cultures are more important now than, say, 100 years ago?
Ron Westrum (00:04:33):
Well, are they? I'm reading a book on... It's called Seize the Fire. It's about Lord Nelson and the Battle of Trafalgar. And basically, I think you could argue that Nelson had a generative culture, his band of brothers; that was the phrase he used. It worked extremely well, and it worked for the same kind of reasons that generative organizations typically work. [inaudible 00:04:54] this is needed so much more today, well, it's certainly true that we have a very complex civilization, certainly from the studies that have been done on high reliability organizations. A generative organization seems to be something that a high reliability organization needs. But there have always been high reliability organizations. I mean, think about building a bridge a couple hundred years ago: if you made a mistake, there were severe consequences. So, what is it about generative organizations that tends to keep you from making a mistake?
Ron Westrum (00:05:27):
And this goes to, basically, the heart of the Westrum continuum and so forth, but it's all about basically providing an organization that uses all the information that it's got. And you need to think about that for a moment. Do you have an organization that is effective in utilizing all the information you've got? Well, obviously in many cases, if we look at accident reports, which is what I used to spend a lot of time doing, accidents are often generated by a failure in information flow. Somebody didn't tell somebody else something or what they told them was not accepted. But in any case, information didn't flow.
Ron Westrum (00:06:04):
Well, so generative organizations, basically, are based on the idea of maximum utilization of the information you've got, but it's absolutely amazing how many organizations are not like that. And obviously, one of the reasons, as far as I'm concerned, essentially, is they don't have the right kind of culture. People don't trust each other. They hoard information, or they give out information which is misleading, or they fail to notify somebody about something that's important, and so on and so forth. It doesn't seem like it's a really deep idea, but it's amazing how many people don't do it. And this is something that has come up in one accident after another. In many cases, for instance, there have been previous instances of the same problem, and the takeaway from those previous accidents and so forth has not been absorbed. So, what is it about generative organizations that gets people to absorb things? The simplest answer is simply trust. Okay? But trust is very difficult to build, because we know that in many organizations there is not a will to tell the truth.
Ron Westrum (00:07:10):
I mean, for instance, there was a recent study by people at the Army War College. They found that, in regard to training, the training requirements are such that nobody can possibly do them all. However, you don't want to say to your commander, "Hey, I haven't done everything." The commander expects you to figure out what the important things are and to do those things. But you can't say that you haven't done the requirements, because it's a bureaucratic organization. And what they discovered, essentially, is that the organization, the Army, is such that people typically lie to the higher commanders above them. So, here you've got a situation where, basically, people give out commands which the people below them can't carry out.
Gene Kim (00:07:55):
And just a point of clarification, was this somewhat recent, in the last decade? Or was this from the post-Vietnam War era?
Ron Westrum (00:08:01):
I mean, the interesting thing is, and I meant to communicate with the guys at the War College, but basically, it is a standard problem in large organizations that people issue requirements which, number one, cannot be changed and, number two, essentially ask you to do something which you really can't do. So, what do you do? Do you lie? Yes, you lie because that seems to work.
Ron Westrum (00:08:28):
I mean, a perfect example of this was the Sidewinder guys. Having problems getting the qual shoots to work, they had a little harmonica, and when the pilot blew into the harmonica, it would allow the missile to launch. The problem was built into the system: the radio bandwidth had been made too small. And so, if you had the harmonica, you could warble the harmonica, and then the missile would launch. It sounds really crude, but that actually seemed to work.
Ron Westrum (00:08:57):
Now, of course, the thing is, the people who did this couldn't say that they'd done it, because it's not operationally correct, but if they didn't do it, they couldn't have the qual shoot. And there's lots of things like that in organizations where, unless you say that you're going to do something, you're in trouble. So you say you're going to do something which you can't do, and that's why you need to have trust. Because if you have trust, somebody could say, "Well, hey, I really can't do that," so then you don't have to do it, and then you can get through. But people don't like to do that. People don't like to say to the commanding officer, "Hey, we couldn't do this requirement," because it's required.
Gene Kim (00:09:37):
And to concretize that, that could be training standards, it could be a process checklist, it could be requirements-
Ron Westrum (00:09:45):
It could be a safety checklist.
Gene Kim (00:09:47):
A safety checklist.
Gene Kim (00:09:48):
Right. That could result in putting anyone, especially a frontline worker, in a position where they essentially have to cheat or lie, or represent something that is actually impossible.
Ron Westrum (00:09:58):
I mean, for instance, when the Challenger was launched, there were some 200 things that they had to check off which were, in principle, necessary to be able to launch the shuttle, but which they hadn't actually done. So, they basically had to sign off on all these different things and so forth, some of which were critical, and one of which, in the end, allowed the shuttle to be destroyed. But the basic structure of many organizations is so rigid that you can't get around these things. And if you don't have a generative structure which allows you some sort of bypass for whatever this is, then you're stuck. [inaudible 00:10:36] say you're going to do something which you can't do.
Gene Kim (00:10:40):
And it seems like, in this case, you're not suggesting bypass, you're actually suggesting some feedback into the process that allows the reshaping of the checklist to actually make it genuinely effective in service of whatever the goal is.
Ron Westrum (00:10:51):
Absolutely. I mean, there are so many cases where basically you've got a bad software program. Let me take something from your realm here. You'd have a bad software program, but there's no way to get to whoever created the program and say, "Hey, this isn't working out for me." And quite often, if there is a sort of "Do this if it's not working for you," and you do that and it doesn't work for you, there's no way you can have a higher-level discussion to comment on what's going on.
Gene Kim (00:11:20):
Right. So, when I hear bypass, in my head, it goes to a daily workaround or temporary workaround, and what you're suggesting is something deeper than that: that generative culture has to allow for the high-fidelity, bi-directional information flow that allows for creation of the appropriate checklist.
Ron Westrum (00:11:35):
Yes. I mean, I'll give you another example of this. One of the hospital directors that I talked to said that he would often go up to nurses on various wards and say, "What's your favorite workaround?" So, the nurse would [inaudible 00:11:50] and say, "Oh, this is what I do," and so forth. And, what, 50% of the time or something, the workaround was something which was not a good idea. But the nurse had done the workaround consistently, because that was the one thing that allowed her to do whatever the main operation was.
Ron Westrum (00:12:06):
And so, here you have a system in which people are doing things that they don't feel comfortable with, but they can't talk about it. So, one of the things that you have to cultivate as a leader, basically, is the willingness to listen to things that you might not want to hear. So, that's all part of the issue of trust, and that's what generative organizations do, is they build trust. So, it allows you to make that communication, and then the other person can comment on it and say, "Oh, I didn't realize you were doing that." And so they can fix it.
Gene Kim (00:12:37):
Got it. Maybe just to connect some dots to a point that you brought up in the previous interview: you talked about one of the contributing causes and factors that led up to the Fukushima accident. Wasn't it that there was an absence of understanding of a certain procedure that led to the accident? In that case, it wasn't hidden; it wasn't that someone was hiding the fact that they were changing the work procedures. One can imagine that in a more generative culture, where there's wider sharing of information, that event could have been deflected.
Ron Westrum (00:13:06):
Well, yes. I think that one of the interesting features of Japanese culture is there's tremendous unwillingness to contest a point made by higher authority. In one case, for instance, one of the scientists who had been asked about how high was the tsunami likely to be, said, essentially, it could be 40 feet high. And that wasn't a response which was considered to be acceptable, so that was left out of the report. Okay? So, what happened is they planned for a tsunami that might be 20 feet high, and the actual tsunami was 30 feet high. So, you have these situations where people know something, but that something isn't taken into consideration.
Ron Westrum (00:13:50):
It's very similar to the problem of hidden profiles that I've mentioned in another interview, and a hidden profile is when somebody basically has information that other people don't have. But simply because they're the only one that has it, they don't feel comfortable bringing it forward. The whole idea of a generative organization is that you feel comfortable telling the truth, and the truth may be something that other people may have difficulty with, but you bring it forward anyway. But in lots of organizations, there are rules like: A, you don't contradict the leader; or B, you don't bring up unpleasant information; or C, you don't bring up information that other people don't already know. So, there are all these hazards to ordinary communications. And ordinary communications are responsible for a lot of the kinds of things that we see in organizations when things have to go well, but if you can't communicate what you need to communicate, then that information does not enter into the final solution.
Gene Kim (00:14:51):
It seems like there's another factor at work in terms of why now versus 100 years ago, why generative culture is so important. And what comes to mind is the emergence of knowledge work. When the work doesn't involve creation of physical things, there are ways that inaccuracies will get surfaced when you try to manufacture something or integrate pieces together. And that is often not present when dealing with knowledge work or software, where ideas are often made of thought stuff, as a famous computer scientist once said. It seems like that increases the need for high-fidelity communications because, in the physical world, two things can't exist in one place, and that will reveal inaccuracies or errors. Whereas, in software specifically, there's no physicality that will reveal the error, and so that has to be made up for somewhere. Does that resonate with you?
Ron Westrum (00:15:43):
I think I'm getting your point. I keep thinking of physical examples like the Citicorp building, where basically the building had been built in a different way than the original designer intended, and for some reason, that wasn't communicated. And the only way that they discovered that there was a problem is that some student did a project and discovered... Well, actually, when the student asked the consultant, the person who had really designed the building, about this particular thing, the consultant's answer was, "Oh yeah, we took care of that." And then he had a second thought, and the second thought was, "Well, maybe we didn't take care of that." He said, "I've got to make sure the building was built as I intended it to be built." So, he called up the actual builders and said, "Well, did you use welds to make the connections of this building?" And they said, "No, we didn't. We used rivets." So, it turned out when they did some checks that, basically, the rivets would not hold the building up against a force of wind that would happen, for instance, every 16 years on average.
Ron Westrum (00:16:50):
So, in fact, the building was ready to fall apart under the right conditions, and nobody had noticed it, and nobody had thought to say anything about this. Here was something that needed to be checked, and it wasn't checked. And it was discovered by accident. So, the temptation, in a situation where something has not happened the right way, to say nothing, is an interesting issue. In this case, the consultant was a very honest person, and having called up the builders, he said, "We've got to change this." And they did change it, although during the process of changing it, they didn't tell anybody.
Ron Westrum (00:17:29):
But the thing is that we have to be aware that there are all sorts of pressures against telling the truth, against giving the critical pieces of information, and so forth. And what generative organizations do is help expose what Jim Reason has called the latent pathogens, the things that are going to trip you up that you don't know about. And unfortunately, in all the things we do in life, and I'm sure this is true of software like anything else, there are all sorts of assumptions that get made.
Ron Westrum (00:18:03):
And so, you're asking, well, what is it about the 21st century that makes this all more critical? Well, the truth of the matter is, we live in a complicated society, we have complicated machines, we have complicated software programs, and we need to know how things are going. And the amazing thing, of course, is that as you create all these different specialties that need to get together to make an electromechanical system work, you often get silos, and silos are one of the reasons, basically, that generative organizations are important: because they allow you to go between silos, between bodies of information that are kept apart. Okay?
Ron Westrum (00:18:43):
And so, the thing is the more you have a multidisciplinary context, which all electromechanical systems generate, the more you need to have a situation where somebody can say to somebody in a different silo, "Hey, have you thought about this? Or have you checked on this?" and so forth because, often, people haven't or didn't. And so, the thing about generative organizations is that because everybody has the same final goal in mind, they're all working on the same team, they basically can go between silos, or they can ask the dumb question. And in fact, in many cases, the dumb question is what reveals that you have a serious problem that needs to be addressed. And so, that's the deal. I mean, I think what we're seeing with the vaccine rollout is a similar kind of thing, is that people have assumptions, or things like that, that they're really not willing to reveal. And if they don't reveal them, then they're not going to make a decision based on that additional knowledge that they would get if they brought up the question.
Gene Kim (00:19:40):
Gene here. I wanted to break in to bring up a couple of points. First is a correction from the first interview with Dr. Westrum. He had mentioned a book called The Forge of Democracy. The correct title of that book is Freedom's Forge: How American Business Produced Victory in World War II, by Arthur Herman. It is an amazing book about William Knudsen, who was at Ford, and later General Motors, and who played a large role in mobilizing the US wartime production effort. It was, without doubt, one of the largest industrial efforts the likes of which the world had never seen before. Dr. Westrum had commented that Knudsen was one of those technical maestros.
Gene Kim (00:20:20):
Second, also in my first interview of Dr. Ron Westrum, I had mentioned several times the term CNO without defining it. CNO refers to the Chief of Naval Operations, the highest ranking officer in the US Navy. Dr. Westrum mentioned Admiral Thomas Moorer, who served as CNO from 1967 to 1970. He was also an experimental officer for the Sidewinder project at China Lake. And in a strange coincidence, I had just interviewed Admiral John Richardson a couple of weeks prior, who served as CNO from 2015 to 2019.
Gene Kim (00:20:56):
Third, earlier in this interview, Dr. Westrum mentioned Admiral Nelson, that is, Vice Admiral Horatio Nelson. According to his Wikipedia page, he lived from 1758 to 1805. "His inspirational leadership, grasp of strategy, and unconventional tactics brought about a number of decisive British naval victories, particularly during the Napoleonic Wars." I think in many circles, he is most famous for his role in the Battle of Trafalgar. Reading from that Wikipedia page, "The Battle of Trafalgar was a naval engagement between the British Royal Navy and the combined fleets of the French and Spanish navies during the Napoleonic Wars."
Gene Kim (00:21:35):
In my interview with Admiral John Richardson, he spoke so much about radical delegation. There's a great quote about how Admiral Nelson had achieved this: "Nelson was careful to point out that something had to be left to chance. Nothing is sure in a sea battle, so he left his captains free from all hampering rules by telling them that 'No captain can do very wrong if he places his ship alongside that of the enemy.'" In short, circumstances would dictate the execution, subject to the guiding rule that the enemy's rear was to be cut off and superior force concentrated on that part of the enemy's line.
Gene Kim (00:22:11):
All right, number four. Dr. Westrum mentioned the study done by the Army War College. This was a paper released in 2015 called Lying to Ourselves: Dishonesty in the Army Profession, by Dr. Leonard Wong and Dr. Stephen Gerras. I looked that publication up and read it, and holy smokes, it is a bit uncomfortable to read. This report is an unflinching self-assessment of the culture that Dr. Westrum described. I think the best way to convey that is to read the foreword by Douglas Lovelace, director of the Strategic Studies Institute and US Army War College Press. He writes, "One of the hallmarks of a true profession is its ability to assess and regulate itself, especially with respect to adherence to its foundational ethos. Such self-examination is difficult and often causes discomfort within the profession. Nonetheless, it is absolutely necessary to enable members of the profession to render the service for which the profession exists."
Gene Kim (00:23:12):
US military professionals have never shied away from this responsibility, and they do not today, as evidenced by this riveting monograph. Discussing dishonesty in the Army profession is a topic that will undoubtedly make many readers uneasy. It is, however, a concern that must be addressed to better the Army profession. Through extensive discussions with officers and thorough and sound analysis, Drs. Leonard Wong and Stephen Gerras make a compelling argument for the Army to introspectively examine how it might be inadvertently encouraging the very behaviors it deems unacceptable. The unvarnished treatment of this sensitive topic presented by the authors hopefully will be the start of a dialogue examining this critical issue.
Gene Kim (00:23:52):
So, the two authors are both researchers. Dr. Leonard Wong is a research professor in the Strategic Studies Institute at the US Army War College. He focuses on the human and organizational dimensions of the military. He's a retired Army officer and a professional engineer. Dr. Stephen Gerras is a professor of behavioral sciences in the Department of Command, Leadership, and Management at the US Army War College, and his PhD is in industrial and organizational psychology. It is a fascinating and uncomfortable read about what can happen in an organization when only certain types of information can flow upwards to senior leaders.
Gene Kim (00:24:35):
Which gets to item number five. I want to mention a book called The Generals: American Military Command from World War II to Today, by Thomas E. Ricks. This is an astonishing book that has come up several times in previous episodes. It was actually recommended to me by my friend Bryon Kroger, who helped create the Kessel Run Project inside of the US Air Force. I'm going to bring this book up later in the interview, but I'm going to leave two things with you now.
Gene Kim (00:25:02):
One thing that Mr. Ricks asserts is that the best periods of the US Army were likely during those periods when top generals were frequently relieved of their commands when their leadership, either military or civilian, didn't think they were performing to the necessary high standards. His observation is that this happens far less frequently now than, say, in World War II. The second point is that Mr. Ricks observes that during the Vietnam War, generals were very infrequently relieved of command, and he makes a very good case that this led to the conditions where the more senior the officer, the less they were trusted. I'll put a citation to this in the show notes.
Gene Kim (00:25:43):
Which leads me to my last point, which is that so much of what Dr. Westrum was talking about, about critical information flows and the need for lots of channels of information to exist, exists very much in software as well. I'm reminded of the book The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the-
PART 1 OF 4 ENDS [00:26:04]
Gene Kim (00:26:03):
Sanity, by Alan Cooper. I remember reading this book nearly 20 years ago. It was recommended to me after one of the worst moments of my professional career. It must have been around 2004, and a fellow engineer and I were watching a customer use our product for the very first time. We had allocated about three hours for this, and we were watching this person do what we imagined to be a pretty routine operation, and it took him 52 clicks. And the whole time he was apologizing; he kept saying, "There must be a better way to do this." And my reaction, as well as my fellow engineer's reaction, was, "No, there's not, because we are terrible people." This was one of the first times that any of us inside the company had actually seen how customers were using our product, and suddenly it became so clear why the job of the Tripwire administrator was always given to the most junior person on the team, or the newest member of the team: because no one wanted to do it. And so, those three hours were just horrifying.
Gene Kim (00:27:15):
And this also led to our journey of atonement. The three of us vowed to fix these problems that we had inadvertently created. We went to Cooper University, the training organization run by Alan Cooper, who wrote the book. This is where we learned about user personas and creating context scenarios; this is how we changed the way that we specified technology requirements. This is when we started building customer feedback into the development process. Listening to Dr. Westrum, it is so clear to me that this outcome was a direct result of not having information flows from the customer.
Gene Kim (00:27:52):
One of the reasons the [Sidewinder 00:27:55] project was so successful is that they had deeply studied the people who would be loading this ordnance onto airplanes, and how it would be used by pilots. The absence of these practices led to bad outcomes, both in the failed Falcon missile program, as well as in the far less successful Sparrow program. At Tripwire in 2004, we probably looked more like the Falcon and Sparrow programs than the Sidewinder program. Although, I'd like to think that as a result of all the investments that we made, we got much, much better at building more humane software.
Gene Kim (00:28:27):
So to summarize, I love the way that Dr. Westrum talks about generative organizations, where they create all these extra channels of information that the entire organization can use in service of the mission. And not only do we have all the information we need, we can revisit and review all our safety checklists, or our requirements, or the architecture that we are working within. Are they actually helping us or impeding us? And if they're impeding us, let's get rid of them and replace them with something that actually makes sense.
Gene Kim (00:28:57):
All right, let's go back to the interview. The phrase that really leaped out at me was, "Silos keep information apart." And I'm wondering if that too is a factor. Is it true that over the last 100 years, siloization has become increasingly prevalent, that the number of silos keeps increasing? Does that resonate with you?
Ron Westrum (00:29:21):
So, think about the difference between building a bridge 200 years ago and today. There are all kinds of issues that you have to deal with today. I mean, they didn't have airplanes back then, okay? So you have to think about airplanes flying into the bridge and so forth. If you were building a building, I mean, my favorite example is the Crystal Palace of 1851. I mean, here was something which was really a genius building at the time, but they didn't have to put any cables in it at all. I mean, literally, there was not even a heating system in it. So by the time they built the Citicorp building, by contrast, and certainly in today's world, the number of disciplines necessary to build a modern building creates silos. And obviously, if you have a union or something like that, and you have all the people in one silo in a union, then you can't... I mean, this happened at my university. It's like, I want to put a coat hanger on the wall. Well, I can't do that, because if I do that, I'll get in trouble and the union will scream and so forth.
Ron Westrum (00:30:25):
And I mean, just to give an example: in the building where I used to work, they had a fire on the top floor of the building. Because I read accident reports, I thought, I should have a copy of the accident report, okay? And I sent this request through two different chains in the bureaucracy, figuring that one of them would probably get through. No such luck. Either there isn't an accident report, which I think is not very smart, or there is an accident report that I will never see, because any request that I sent had to go all the way up my chain and all the way down the other chain. And it doesn't work, okay? And I mean, this is a university; this is not a really complicated system, except sociologically.
Ron Westrum (00:31:07):
But the interesting thing is that I can't do anything with Buildings and Grounds, because if I did, bad things would happen. And yet, here in my office, the simple thing of putting a coat hook on the wall: what I eventually did is I went to the janitor, and the janitor asked for the cooperation of Buildings and Grounds, and that's how the rack that I designed got put up on the wall. Okay? So you can imagine then, if you've got something like a radio system or an electromechanical system that is managing drones or something like that, it's much worse. The book that McChrystal wrote, called Team of Teams, that's what you've got. You have to have a team of teams where everybody feels that at some level, we're all on the same team. But when you've got people who are doing different things, to get all these people coordinated, the Navy, the Marines, the Air Force, blah, blah, blah, it goes on and on. And in the construction business basically, master contractors are supposed to solve all those problems. But as we know from building accidents and rework issues, they often don't.
Gene Kim (00:32:11):
Your example of the university system as a microcosm for this was startling to me. It seems like in some ways this is a low-stakes operation, right? The building facilities are probably not a critical part of running a successful university, and yet [inaudible 00:32:29] to prevent the free flow of information that you described. In the last interview, you had the magic wand where you ran NASA in the post-Columbia, post-Challenger era. You are now head of the university. How would you have set up the university so that the well-meaning professor can get the accident reports, while preserving the smooth flow of all the critical functions needed to make the university operate? Describe what that universe looks like.
Ron Westrum (00:32:55):
Well, let me describe the problem first. The problem is that basically, my university has a pathological aspect, in the sense that they have a tremendous fear that their mistakes are going to be outed. And so, what happens is that anything that could reflect badly on the administration is deeply buried. But having that happen only made people more willing to cover up, and more willing to silo off different parts of the university. And it's unbelievable.
Ron Westrum (00:33:30):
I mean, here's another example: I was giving a talk at General Motors many, many years ago. And so, everything was going well and so forth, but the TV, the thing that was doing the projection, wasn't in quite the right position. So I got up out of my chair, I moved toward the television, and about 12 people seized me and said, "You can't do that." Okay? Because I was about to do something that the union people were supposed to do. And so, built into the organization was an inability to cross the silo; I had to do this by the proper protocols. And it's so complicated that sometimes really terrible things can happen.
Ron Westrum (00:34:09):
In World War II, for instance, General Patton realized that some paratroopers he was about to drop were going to get shot down by the Navy gunners. But the system was so complicated that he couldn't get word out to the other ships that the paratroopers they were about to see were American paratroopers, and those paratroopers got shot down. Patton gave up in the end, trying to get the information through. Okay? A very sad instance and so forth. And there's lots of stuff like that; the military is full of catch-22s and other kinds of infelicities, let's say. But I'm not sure I responded to your original question. I think the truth is, we live in a world today where it's easy to have silos, and you think, "Well, hey, this is the 21st century. We're beyond that." We are not beyond that.
Gene Kim (00:34:53):
So you are now in charge of the university, congratulations. Can you at least describe, in high-level terms, how you would structure the organization such that the professor could get an accident report, in service of the greater good of the university?
Ron Westrum (00:35:10):
Right. If I were in charge, I would do the things that are likely to get information to flow. Number one, if you say you're going to do something, you do it. Keeping promises is one of the basic things; this is not a high-level principle, but it's something that informs everything you do. And the second thing is, you do things to encourage trust, because people who trust will communicate. Okay, so some positive policies essentially would be not to make departments compete against each other. So you don't encourage unfriendly competition, and along with that, you have to give up penalties. Well, a lot of people are not going to give up their ability to inflict damage and create penalties.
Gene Kim (00:35:57):
Are you talking about between peers, like between different schools, within the departments in schools?
Ron Westrum (00:36:04):
Yeah. That's right. I mean, I could tell you all sorts of stories. But the other thing is, you assume you're starting with a clean slate, and you don't. That's the problem, because the people who have been part of the problem in the past are going to hire the people who are supposed to take care of it in the future. And they are going to be enormously reluctant to hire people who are going to make them look bad. So they're not going to do it. They're going to hire people who appeal to them, because they recognize in those people the same sorts of dysfunctions that they have.
Gene Kim (00:36:34):
By the way, you had mentioned discouraging competition, and then we mentioned the peers of departments, schools, I forget what the hierarchy is. But it seems like, in order to make the necessary dynamic happen, in terms of you being able to ask for accident reports, being able to hang up a coat hook, it's not just the schools; it's also the infrastructure facilities. They too need to be a part of that conversation; otherwise, we're not going to reach that dynamic, this kind of ideal vision.
Ron Westrum (00:37:04):
Let's take a more technological example. In building the Hubble Telescope mirror, the technicians decided that they didn't want to talk to the engineers, because they considered the engineers sort of intrusive. So they literally walled themselves off, and every time the engineers knocked on the door, they would turn up the music. The consequence of this was that a very inadequate job was done of making the mirror perfect. So the mirror went into space in a flawed manner, okay? I mean, the launch was flawless, but the mirror was damaged; I mean, it was wrongly manufactured. And the problem was that the technicians who did the manufacturing didn't realize the consequences of what they were doing, because they wouldn't consult with the engineers. And we see this again and again in technological work. One of the Japanese nuclear accidents basically took place because the workforce was encouraged to experiment, to make things more efficient. The problem was that the workers didn't have engineering knowledge, and they created a situation where you would have a sub-critical mass or even a critical mass. They didn't check it with the engineers, and so what happened in the end, basically, was that a couple of people died because you had a radiation accident. And again and again, we see this sort of siloization and so forth.
Gene Kim (00:38:24):
Yeah, what it reminds me of is, I remember seeing a Nova episode on supercarriers; it was like a 10-part series on nuclear aircraft carriers. And there's a specific interview that I still remember. They interviewed a whole bunch of people on an aircraft carrier, like, "What's your job?" And I think they started with a fighter-bomber pilot, and [inaudible 00:38:45] "My job is to deliver ordnance onto the target." And then they interviewed the cook, and he said, "My job is to feed the pilots and the crew, so that we can land ordnance on targets." Everyone knew what their role was in an integrated system. And what you're talking about there seems like something similar: despite the silos, there is a sense of a common mission. And-
Gene Kim (00:39:05):
Even the, "Back office" infrastructure, people know their place in the system and what they do is in support of it.
Ron Westrum (00:39:10):
That's right. But the other thing is that built into the structure of the aircraft carrier's functioning is a willingness to always listen to the lowest man in the system, or man or woman that is, and respond to that person. So the air boss, who is the most important person in aviation on the carrier, is a person who will listen to anybody on the flight line who has a problem. And so what happens is that basically they get around the silos that you would otherwise have between the people arming the planes, or getting the planes ready and so forth, and the people who are managing the aviation itself. Because everybody realizes, if you don't do this job right, either the pilot won't take off with the right equipment or, when the pilot is landing, there won't be adequate communication.
Ron Westrum (00:39:59):
And so, what people have discovered about why you can have aircraft carrier landings that are safe is that everybody cooperates; everybody's on the same page. And so literally the lowest guy on the flight line can call a landing off if he sees something that is not right. It's sort of like, I think Toyota had one of these mechanisms in their system, where the lowest guy on the line can pull the plug on whatever's going on if something's wrong. Well, that requires an enormous amount of trust. How do you get that trust?
Gene Kim (00:40:33):
Right, yeah, the andon cord, where everyone's trained to pull the andon cord and is thanked as a result, whenever they do one of the 3,500 andon cord pulls in a given day. So I love the fact that we're establishing that yes, there are an ever-increasing number of specialties. And by the way, just for your amusement, two years ago, I learned that in hospitals, they have the hospitalist role. And even in the last couple of years, they created the nocturnist role, which is the hospitalist, but at night. So, within the last couple of years, there was actually the creation of a new role. It seems like what this suggests is that the role of functional organizations is getting larger. In other words, you do need a place where each one of these functional specialties can learn its craft and hold standards, and it seems like the challenges are increasingly around the integration of those functional specialties.
Gene Kim (00:41:20):
So in the Sidewinder book, there was this phrase that you wrote: "A matrix organization inevitably generates tension between the project and its interdisciplinary departments. Functional departments deliver expertise, project groups deliver prototypes," and this one line, "and both are likely to see themselves as primary."
Gene Kim (00:41:39):
I absolutely love that. And then you write later, "Matrix organizations are inherently unstable and require the utmost judgment from top management." So I was startled to see in one of our earlier correspondences that you had brought up Team of Teams, which you did again earlier, which I think is such an amazing example of this phenomenon, where you have these functional specialties that must integrate in order to achieve the mission. Can you talk about why matrix organizations are fundamentally unstable, and how do we cope with that inherent instability?
Ron Westrum (00:42:10):
Ah yes. Well, I mean, a good example of that is many years ago, I was told that essentially when McDonnell Douglas went to a matrix organization, they had problems that they didn't really know how to solve. And this is obviously very important, because it eventually became part of Boeing, and that same culture was carried over, with consequences which I think we all know. So the thing about it is, again, this is a basic issue of trust; everybody has to feel that they're on the same team, and that is very difficult to get, because many things encourage you to deal with the people in your silo, and to ignore basically what the consequences for the organization as a whole are likely to be. And this is especially true when you get people to compete against each other in a non-friendly way, which unfortunately is often what happens in organizations.
Ron Westrum (00:43:06):
And yes, matrix organizations are very tricky. They were obviously too tricky for McDonnell Douglas, or at least this is what I was told. And the interesting thing is that you do get organizations where you can get everybody working together on the same team. And this was in fact the gift that Boeing had for decades and decades: they got everybody to feel that they were part of a family. And one could sort of write that off and say, "Well, gee, it's just a phrase," but it isn't just a phrase; it meant something at Boeing. It meant that everybody would tend to work together. When you have teams that are not part of a family, however, people will work against each other. And it doesn't sort of coalesce; that fine-tuning that you need to do is not going to happen, basically, if people feel that they are at risk. And how do people feel that they're at risk? They see somebody have management come down hard on them for something that they did. Every time there's a heavy punishment [inaudible 00:18:05] out, you have a situation where people's trust is going to go down. You cannot have punishment as a regular part of everyday events and expect people to behave cooperatively.
Ron Westrum (00:44:17):
In one organization that I joined, and I won't say which one, basically there were all sorts of jokes about people taking advantage of other people. And the metaphor used was "taking your jar of petroleum jelly into the office." It's basically the metaphor that people used to describe conflicts, and I was appalled. I mean, I just said, "This is crazy," but that was the deal: everybody talked about basically being beaten up or whatever by other parts of the organization. And that was how it went on a day-to-day basis. So you can imagine the kind of conflicts and screw-ups that this is likely to generate.
Gene Kim (00:44:58):
I am grateful that LaunchDarkly has sponsored the DevOps Enterprise Forum for the last two years, where 60 of the best thinkers and doers I know come together for three days to create written guidance for the top problems facing this community. It's always been one of my favorite professional activities that I get to participate in. This sponsorship from LaunchDarkly makes possible the publication of this year's DevOps Enterprise Journal, where all of the hard work of this year's forum participants, as well as all the other journal authors, is published. LaunchDarkly is today's leading feature management platform, empowering your teams to safely deliver and control software through feature flags. By separating code deployments from feature releases, you can deploy faster, reduce risk, and rest easy. Whether you're an established enterprise like Intuit or an SMB like Glowforge, thousands of companies of all sizes rely on LaunchDarkly to control their entire feature life cycle and avoid anxiety-fueled sleepless nights. Break away from the stressful, nerve-wracking releases which so many organizations face. By decoupling deployments from releases, developers are free to test in production and ship code whenever they want, even on Fridays. If there's a problem caused by a feature, a flip of a switch can disable it without the need to roll back or redeploy code.
Gene Kim (00:46:14):
If you want to make releases a snoozefest, and start deploying more frequently and securely with less stress, head to launchdarkly.com. I trust that you are enjoying and learning as much from this interview of Dr. Ron Westrum as I am. If you are, you should know that Dr. Westrum will be presenting at the DevOps Enterprise Virtual Conference, which will be from May 17th to May 19th. I hope to see you there. Go to events.itrevolution.com/virtual for more information.
Gene Kim (00:46:47):
So maybe just to confirm my own understanding: I remember taking a course at MIT Sloan, and I think they portrayed three classic ways to organize. You have the cross-functional teams that are often characterized by being able to more quickly adapt to customers, because they're closer to the customers; they can all independently do what the customer asks. Then on the other side, you have pure functional organizations. So instead of optimizing for speed, you're really optimizing for expertise. And then in the middle, you have these matrix organizations, where you have people who have a competing set of loyalties: one to the, as you say, projects, and one to their functional department. And it sounds like the matrix form seems to be becoming more important. Is my understanding of the archetypes complete, and is it true that these kinds of matrix forms are becoming more important?
Ron Westrum (00:47:34):
Well, that's a very interesting question. I would argue basically that the issue is not so much which one of these things you have, but what is the system that connects all these things. In the original General Motors, for instance, the organization was built up of five or six different automobile companies that got put together. And you would think that something like that would be absolutely terrible. And in fact, I have a GM organization chart from 1920, and it's like a plate of spaghetti. And you think, "Well, how was all this managed?" Well, it was managed because the person who was running GM, a guy named [inaudible 00:48:15], had the ability to get all these things to work. However, the problem was that if he was not there, and he was one of these people who was a 24-hour man, he was everywhere doing everything, it was not a good scheme. And in the end, basically, that plate of spaghetti was replaced by a far more functional system under Alfred P. Sloan, and General Motors actually did extremely well during the years that the Sloan structure worked.
Ron Westrum (00:48:43):
As time went on, however, there were political reasons why it was changed into something that was way more complicated, basically so they wouldn't be broken up. And the clarity that had existed under Sloan went away, and so the organization was made more complex, but the people who were running it were no longer the technological maestros. Because, see, when GM was built, basically, they got not only the organizations, but they got the people who were running them. So those people, the profit-and-loss individuals at the top, were really good. So you had an organization that was loaded with people who knew what they were doing.
Ron Westrum (00:49:19):
And this is a very important point. Okay, so if you have an organization which is complicated and people don't know what they are doing, what do they do instead? And that explains a lot of what is wrong with Boeing; it explains a lot of what went wrong with General Motors, and eventually what went wrong with NASA as well. So the thing is that running an organization is a very tricky business. And if you've got a large organization like Boeing, for instance, getting everybody to work together is an art. And if you have an organization where basically the organization itself is a work of art, the last thing in the world you want to do is have somebody from a typical kind of hierarchical organization come in and mess around with that. And that's exactly what happened to Boeing: they had something that everybody tries to achieve, and then somebody said, "Well, that's not good; we're going to do something else." That was not a good idea. Okay.
Ron Westrum (00:50:21):
It's sort of like the problem that Southwest Airlines used to have with people who wanted to come in and find out why Southwest was so good. The visitors would always say, "Okay, well, what's the gimmick?" And the Southwest people would say, "There isn't a gimmick. We have all these practices and so forth, but you have to basically put all these practices together or it's not going to work." And so the visitors would scratch their heads and be puzzled and wonder, "Well, why can't we do that?" And the answer is because you've got to do all those things at once, and you have to have the people, the technological maestros, at the top who can get this to happen, okay? And so, when you try to take an organization that's a family, like Boeing was, and turn it into something else, it is not going to work as well. And with airlines and airliners, basically, you can make some really serious mistakes, because it is a very complicated thing, designing a big airliner.
Gene Kim (00:51:14):
So what advice would you have to help increase the likelihood of success for matrix organizations, which are so inherently unstable and require such judgment from top management?
Ron Westrum (00:51:25):
The simplest thing is, when somebody is doing a good job, you don't replace them, okay? If somebody's doing a good job, don't replace them. I think I mentioned [inaudible 00:51:36], I think it's number 21 or something like that: when the boss is a dope, everybody under him is a dope, or soon will be, okay? That has enormous applications when you're dealing with complicated organizations. And with Boeing, the dumbest thing they ever did was move the headquarters to Chicago, even though the engineers were in Seattle. I mean, I know this sounds awfully simplistic, but you don't do dumb things, okay? And they did. All right. And that's just the beginning. I mean, I could go on and so forth, but here's the deal-
PART 2 OF 4 ENDS [00:52:04]
Ron Westrum (00:52:03):
So if we are going to have organizations that can manage the complexity, the team of teams that you need for the 21st century, we have to study organizations that are doing it successfully. And basically, the people at Berkeley studied high-reliability organizations. That's what they were doing. They were trying to figure out: how do you get organizations to do the impossible? Oh, they talked about something like almost error-free operation, which is a joke. Of course, you can't have that.
Ron Westrum (00:52:33):
The issue is, if you have errors, they have to be fixed quickly. They have to be mitigated and so forth. They have to be noticed, et cetera, et cetera. Well, the thing is, all of this is an art. So the first thing you do is you study organizations that are able to do this, and you realize that there are certain kinds of people, these technological maestros, that are essential to make the organization work. At Boeing, they lost their technological maestros. I mean, they got rid of Alan Mulally. I'm sure that will... Somebody would say, "Well, Alan Mulally wasn't that important." I think Alan Mulally was that important, because he could do all this stuff, and they made a movie about it. It was called 21st Century Jet. It's a brilliant movie. Unfortunately, it was never put out in DVD form. It was only available, I think, on videotape.
Ron Westrum (00:53:19):
Yes. So I mean, just by doing that, some important lessons were lost. And in fact, I remember... I have to admit, I hate to admit when I make mistakes. I remember when they were doing the 777 originally, and they were going to do fewer flight tests, and I thought, "That'll never work; they're going to have all sorts of crashes, blah, blah, blah." They didn't. Why didn't they have the crashes that I thought they were going to have? And the answer is very simple: because they paid attention to each of the details, and they went through all the different procedures to make sure that everything was going to work out. So that was the organization then, which was changed into the organization that created the Dreamliner. And if you're in aviation, you know the problems that the Dreamliner had, and the 737 Max. Those were basically the creation of organizations which were inferior to the corporate culture that Boeing had possessed.
Gene Kim (00:54:12):
That's interesting. I guess what my head is filling in is a narrative that it was actually the merger or acquisition of McDonnell Douglas, where-
Ron Westrum (00:54:21):
That's part of it, yeah.
Gene Kim (00:54:22):
... That it was essentially a lot of their leaders who ended up on top. And that led to the destruction of some of the greatness that you were alluding to. Is there more to the story?
Ron Westrum (00:54:32):
Well, you change the corporate culture, and they did. The people from McDonnell Douglas basically wanted to have a different organizational culture. And I mean, there were structural changes too. They decided they would contract out a lot more of the design of the airplane. And this was a very bad move, because the more complicated you make the structure making the airplane, the more sophisticated the management has to be to be able to do it. The management that they got was not able to handle the complexity of the new organization. And so it's like the maestro that was directing the orchestra could not do the same kinds of things, because they had basically lied and cheated and so forth. This is-
Gene Kim (00:55:21):
So they chose a more complicated score, in your words, while lowering the skill level of the conductor, and that led to very bad outcomes.
Ron Westrum (00:55:28):
Yeah. I think the thing that you need to understand is, if you're building a large, complicated airliner, you need somebody who basically thinks of themselves as an orchestra maestro, not somebody who is going to come in with a shovel and whack people over the head.
Gene Kim (00:55:43):
It's almost like evoking the Falcon missile program, where now you're holding people to spec, right? Am I going in the wrong direction?
Ron Westrum (00:55:50):
No, no. That's an interesting comparison. So the thing about the Falcon missile is you had some of the best engineers in the country on the team that built it, okay? And each of the engineers concentrated on their particular part of the missile. But who was supposed to integrate all of these things and so forth? Back then, people from the Sidewinder project would go out to Hughes Aircraft, and they would come back and say, "Oh my God, those engineers are so smart. How can we possibly compete with them?"
Ron Westrum (00:56:21):
And so Howie Wilcox, one of the Sidewinder engineers, would put his arm around Lee Jagiello, who was the person who was typically going hysterical about this. And he said, "Look, first of all, all those people only get in each other's way. Second of all, we have a genius in Bill McLean, who will basically figure out the answer to most problems in 10 minutes. And third, we have a better design." So the Sidewinder team, which was small, was highly integrated. And they basically didn't make the kind of bad decisions that Hughes Aircraft made, even though Hughes had, as individuals, people who were probably superior to many of the Sidewinder people. But they didn't work together as a team.
Gene Kim (00:57:06):
Right. That's a very interesting thing to connect, at least, to the strategy of the Dreamliner and maybe the 737 Max. This is partly speculation and so forth; I'd want to characterize it as speculation.
Ron Westrum (00:57:19):
Absolutely. I mean, I'm speaking off the cuff here, so.
Gene Kim (00:57:23):
Yeah, it's super great, super insightful. Gene here. Holy cow. I loved how Dr. Westrum described and characterized the story of Boeing acquiring McDonnell Douglas, and then somehow McDonnell Douglas replaces all of Boeing's top managers with their own leaders. Essentially replacing the top conductor at Boeing with one who has less skill and who has chosen to manage a more complex score. How heartbreaking. So, Dr. Westrum had mentioned a Nova series called 21st Century Jet, about the making of the 777 airliner at Boeing. The Boeing 777 program took 5,000 people and nearly five years. It's a five-part series, five hours of documentary describing the making of the 777 airliner in an attempt to beat Airbus, which had emerged as a very fearsome competitor. It's an amazing documentary because it focuses on two main characters at Boeing: Alan Mulally, the project manager for this $2 billion program that is the 777, and Phil Condit, who was Boeing CEO from 1996 to 2003.
Gene Kim (00:58:35):
The 777 project would either make or break Boeing. They chose to launch this new aircraft in 1995, which was in the middle of an economic recession, when airlines were going out of business left and right. United Airlines also features prominently in this story, because they were the launch customer who helped underwrite some of the huge R&D costs for this program. There are so many things that I loved about this documentary. It's five hours long; it's hard to know which parts to pick. I will post a YouTube link in the show notes. If anyone knows how to watch these documentaries in a way where the creators actually get paid, please let me know and I will put that revised link into the show notes as well. The first thing I'll note is this program they called "Working Together." It was a new management method that attempted to reduce the effect of silos.
Gene Kim (00:59:29):
It's an almost laughable name, but there's this terrific testimonial from someone who worked on the 757 and 767 programs, who talked about how differently the 777 program was run. Despite the extremely tight and aggressive deadlines, teams didn't throw each other under the bus. Instead, there was a genuine, integrated problem-solving dynamic that was obviously very new at Boeing. The second thing that I absolutely loved was the story of how they landed the deal with United Airlines. There's a scene where the CEO of United Airlines wrote up a letter that they had the entire Boeing team sign, committing that Boeing would work together with United Airlines as a partner to achieve their goals.
Gene Kim (01:00:15):
The letter reads: "Boeing 777 objective: United plus Boeing plus Pratt & Whitney. In order to launch on time a truly great airplane, we have a responsibility to work together to design, produce, and introduce an airplane that exceeds the expectations of flight crews, cabin crews, and maintenance and support teams, and ultimately our passengers and shippers. From day one: best dispatch reliability in the industry, greatest customer appeal in the industry, user-friendly and exemplary work. October 15th, 1990, in Chicago." It was signed by the executive VP of operations at United Airlines, the executive VP of the Boeing Commercial Airplane Group, the executive VP at Pratt & Whitney, and an executive vice president and general manager of something at Boeing. This is so great. So they show a picture of this handwritten letter that was written by the CEO of United Airlines. And there's a scene of Alan Mulally describing this epic meeting to the entire 777 team, describing the promises they had made on behalf of the 777 project.
Gene Kim (01:01:34):
It is amazing. And Alan Mulally then describes how they brought in Phil Condit, the CEO of Boeing, and had him sign it too. And apparently Phil Condit says, "This is the new Boeing," and you can see this room full of the 777 team laughing and fully understanding the extent of the commitments that were made to United Airlines. There's a feeling of hushed awe and nervous laughter; it's such a cool scene. And at the end of the fifth episode, they actually go through the acceptance protocols of United Airlines accepting the first 777 jet. And it was actually the first jet that was accepted as-is in the history of the airline. Super interesting. The third thing that I thought was really remarkable was how Boeing won the ETOPS rating. So according to Wikipedia, ETOPS is short for Extended-range Twin-engine Operational Performance Standards. Long story short, before the 777, every twin-engine airliner was required to be within 120 minutes of an airport, just in case there was an engine failure. However, in order to make the 777 work for their customers, they had to figure out how to get an ETOPS rating of 180 minutes, thus vastly increasing the routes that they could fly. What was astonishing was how Boeing worked with the regulators to satisfy their biggest concerns. They went to the most conservative groups. There was actually the airline pilots' union, who arguably had the most at stake. They represented the pilots, who wanted to make sure that they weren't going to be flying an airplane that wouldn't be safe to fly more than 180 minutes from an airport. So the Boeing team went to the pilots' union and asked what it would take for them to be confident that they could fly with a 180-minute ETOPS limit.
Gene Kim (01:03:31):
And they based their entire testing program around that. And so this wasn't someone playing fast and loose with the regulatory bodies. One couldn't interpret this as regulatory capture, where the industry being regulated somehow takes control of the regulatory bodies. This seemed like such an earnest way to make sure that the spirit and intent of the regulations were fully met, and they launched this incredible extended and compressed testing protocol to make sure that the plane could actually meet or exceed the standards set by not only the pilots' union, but also the FAA. To me, it just demonstrated the creativity of the 777 team, not just to build this incredible airplane, but to make sure that it satisfied all the regulatory concerns. The last thing I'll mention is Alan Mulally. Dr. Westrum mentioned Alan Mulally as the technical maestro that made the 777 possible. From his Wikipedia page:
Gene Kim (01:04:28):
Alan Mulally was the executive vice president of Boeing and the CEO of Boeing Commercial Airplanes. He began his career with Boeing as an engineer in 1969 and was largely credited with their resurgence against Airbus in the mid-2000s. In 2015, Mulally was inducted into the International Air & Space Hall of Fame at the San Diego Air & Space Museum. Following the forced resignations of CEOs Phil Condit in 2003 and Harry Stonecipher, who was from McDonnell Douglas, in 2005, Mulally was considered one of the leading internal candidates for the CEO position. However, when Mulally was passed over in both instances, questions were raised about whether he would remain at the company, and he was then recruited to join Ford as their CEO. When automotive industry analysts asked how he was going to tackle something as complex and unfamiliar as the auto business when Ford was in such tough financial shape, Mulally responded, "An automobile has about 10,000 moving parts. An airplane has 2 million parts, and it has to stay up in the air all the time."
Gene Kim (01:05:38):
Alan Mulally was widely applauded for his work at Ford. In 2006, he had Ford borrow over $23.6 billion by mortgaging all of Ford's assets to "provide a cushion to protect for a recession or an unexpected event." In the 2007-2008 financial crisis, Ford was the only one of the Detroit Three that did not ask for a government loan. So Dr. Westrum is speaking at DevOps Enterprise, and his talk is on culture, and he presents an expanded version of this analysis of the culture change at Boeing, and it is absolutely fascinating. Okay, back to the interview.
Gene Kim (01:06:24):
You said something really interesting in the last interview. When I asked you about the Sidewinder project, you said you thought there were lessons learned. In fact, you said the generative culture really came from a thought experiment about, how do R&D organizations work? What is really required?
Ron Westrum (01:06:40):
Let me tell you something that I didn't tell you before, but which is important to understand. So where did the culture that China Lake had, that led to the Sidewinder, come from? And the answer is, it came from World War II. In World War II there was an organization which nobody speaks about anymore called OSRD, the Office of Scientific Research and Development. And all the laboratories of OSRD basically had a similar kind of motivation and a similar kind of operational code. And that code was basically very much similar to the Skunk Works that Lockheed created in developing jet planes. And so the issue was basically, you had people like Jack [inaudible 01:07:21], who said it was okay to do whatever you thought was good, as long as you did it on your spare time, which began at nine o'clock in the morning and ended at five o'clock at night.
Ron Westrum (01:07:34):
The deal is that OSRD had a remarkable way of doing projects so that they would be done quickly and rapidly, and the people who knew what they were doing were given the latitude to do them that way. All of that changed after the war was over. So the same scientists and engineers who during the war had basically operated in this sort of fast-and-loose system were then put back into their ordinary academic jobs.
Ron Westrum (01:08:05):
And the military then tried to create through a bureaucracy what had happened essentially through genius. And the result was that basically, in the few places where the OSRD norms still held, like China Lake, people were able to do these incredible things. Okay. And in fact, it was in the constitutional principles of China Lake that they were able to do the same things they'd done in World War II. And by contrast, the rest of the military apparatus that acquired weapons got more and more bureaucratic about how they did things. At China Lake, I was sitting in the historian's office one day, and she points to this whole line of books on the wall and she says, "See those books? Those are the rules that we have to follow now." She said, "Every time we do something good, they pass a rule so we can't do it again."
Gene Kim (01:09:02):
Right, very bureaucratic response.
Ron Westrum (01:09:05):
If you haven't studied the military procurement system, this will sound exaggerated to you, but it's not. And the most interesting thing is actually that a big study was done by, I think, Battelle or somebody, called Traces, where they actually looked at the origins of weapons systems that worked. And they found that in at least 50% of the cases, there was an illegal prototype, where somebody had broken the rules so they could do it better. So what was the obvious thing to do? Well, the answer is, you copy that illegal prototype system and you do more of that. But the answer the military came up with is, no, we don't want that. We don't want a hobby shop. We don't want people going off on their own and doing things. We want everything to be rational and controlled.
Ron Westrum (01:09:48):
So the problem was, that's not the right way to do it. Okay. And I mean, if you've ever seen The Pentagon Wars or other movies about stuff that we acquired that we shouldn't have, or that didn't work or whatever, those are stories basically where they did it the company way. This is a fact of life. And so why wouldn't you do it the way that works, using the smart knowledge? And this is a very important term in anthropology, smart knowledge, that allowed you to do that stuff. And the answer is, because the smart knowledge is not something that you can get without being very careful about the people that you hire to exercise it. It's much easier to do it the bureaucratic way, because then you can do it with ordinary people. But the results you get will not be the same. Because, like the Lockheed Skunk Works, they can do things other people didn't do. And the reason is because they basically cut back on the rules that they needed to use for the process, and they concentrated on the outcome.
Gene Kim (01:10:51):
What I found so surprising about... By the way, thanks for the illumination on that. And yet it doesn't reduce my surprise that you said that the generative characteristics came from an ideal R&D culture. So my question is, how do you bridge the world of R&D, where so much of the work is uncertain and often being done for the first time, to the world of operations? I know that you've already gone there, right? You described Team of Teams, which is primarily in the domain of operations. So how do you-
Gene Kim (01:11:31):
So in the operations management world, they use the terms design, operate, improve. How do you kind of make the steps to go from, okay, it worked for R&D, so it will also work in the higher-tempo domain of "operations"?
Ron Westrum (01:11:46):
Okay. So the people who've been doing the work on high reliability organizations have bridged that gap by studying people who manage aircraft carrier landings, or nuclear power operations, or in many cases chemical process plants. You had Steve come in and talk about Alcoa; it's a perfect example of this. Okay. So the thing about these high reliability organizations is they do things other people don't do. And a lot of it has to do with hiring people who have the skills to be able to do things that other people can't do. And you cannot compromise on the people that you hire. And that means, by the way, that they have to be hired by people who can recognize these skills. Okay? So this is why the university that I work for has such problems, because the people who typically do the hiring do not hire people who are going to make them look bad. But in many cases, the people who are better than you are the people you want to hire. So how do you get people to do that? Well, that's a really good question.
Ron Westrum (01:12:54):
I've been reading this book on the building of the Empire State Building. And so here is an instance of people hiring somebody who is really, really smart. This guy Stewart, who was the chief engineer, was an absolutely brilliant guy. So the problem is, when you've got people like this... the Tesla guy, what's his name?
Gene Kim (01:13:16):
Elon Musk?
Ron Westrum (01:13:17):
Yeah, Elon Musk. So here's a guy who basically is one of these people. Are you going to manage somebody like that? How do you manage somebody like that? And the answer is, you don't. It's like hitching your wagon to a [inaudible 01:13:33].
Ron Westrum (01:13:35):
So most of us are scared about what would happen if we do this. But the fact is that it's people like this who frequently develop the things that other people can't or don't or won't. So the technological maestro is the person who ought to be studied. In fact, I've often thought we ought to write a book on it, because technological maestros are so important in terms of getting these systems to work, but they don't grow on trees. I'm not sure how they do grow. I think this is a very interesting question. They teach each other, for instance; they recognize each other. So really you have to think about it: if you've got people who can do the job, they'll hire other people who can do the job. But if you have people who are not going to be able to do the job, they're going to hire people who are not going to make them feel intimidated.
Gene Kim (01:14:25):
Right. One of your laws, which, for what it's worth, the notion of the technical maestro and the corollaries of what happens when you have a dope at the top, was one of the most astonishing, startling insights I've heard in years. So thank you so much for that. So what I'm hearing is that in your mind it was just evident: what's applicable to R&D is applicable even to high-tempo operations. And so there were no steps that you even needed to manufacture to get there. So let's talk about the Team of Teams example. So one of the things that Steve and I have discussed at length, and forgive me that I can't quite verbalize it clearly because I haven't thought it through, but it seems like one of the novel structures that was required to make the Team of Teams story happen was that they actually created these kind of new information flows that allowed these more mission-oriented teams to do their work.
Gene Kim (01:15:29):
So maybe, to steal the Kahneman and Tversky language of Thinking, Fast and Slow: at the highest levels, so many activities still look the same. Objective setting, resource allocation at the geography level or the country level, that still happened at the top. But then they seem to create these other information flows and structures that allowed for much quicker, more dynamic conversations at the lower levels.
Gene Kim (01:15:59):
For example, they were able to trade air transport to get from here to there. They were able to trade capacity on surveillance platforms. They were able to transact with each other, whether it was men, materials, and so forth, or information. There's this kind of famous daily 90-minute phone call that happened every day, 365 days a year, where information was being shared across the organization, very different from what you would see in a typical hierarchical organization. Can you react to that? Does that resonate with your own experience, in terms of novel structures that are created to facilitate information flow? And that this is, in the same way that Kahneman and Tversky say, a new fast path that's activated to allow for these different ways of working?
Ron Westrum (01:16:53):
Well, it all depends on trust. I think this is just one of those kinds of things. So you have to have the confidence in yourself, basically, to subject the team that you're running to, essentially, the dominance of the important fact over personality, over departmental perquisites and things like that.
Ron Westrum (01:17:17):
And so the thing is that it takes somebody like McChrystal. Here's a person who in a traditional organization is likely to be a problem, because he speaks his mind. And even though he's accepted, essentially, the military structure, and it's really astounding to read his memoir of his experiences, he is a person who has absolute faith in himself. Well, if you have that faith, you have a humility toward, essentially, what's going on. And so if you have that necessary humility, you can do the cooperation with other people, because you don't have to win every contest. You don't have to be the one that comes up with the idea, that's the dominant...
PART 3 OF 4 ENDS [01:18:04]
Ron Westrum (01:18:03):
You don't have to be the one that comes up with the idea that's the dominant idea. And this is the tremendous difference between the person who is simply a bully and the person who's a maestro. The maestro is willing to recognize the idea if somebody else has it. And Wernher von Braun was exactly that kind of person. If you look at what led to our going to the moon, using the lunar orbit rendezvous system that we got, the LOR as they call it, the whole thing basically was a big discussion, and Wernher von Braun was managing the discussion, but he didn't have to have the right answer. All he had to do was recognize the right answer. And that requires a level of self-confidence which many managers do not have.
Ron Westrum (01:18:49):
So I think the other thing is, it's not surprising that eventually McChrystal got into trouble, as did David Petraeus, because they were people who basically were able to conceive of these very complicated things and figure out how to get them to work. At the beginning of World War II, General Marshall went through the list of generals, and he crossed off a lot of people because he knew they couldn't hack it. He crossed off, I think, hundreds of generals. And he said, "These people will never be able to do the job." Well, how did Marshall, first of all, have the knowledge to do this? And the answer is, he'd been training these people basically for the last 20 years, and he knew everybody.
Ron Westrum (01:19:36):
But the other thing is, basically, people recognized that Marshall knew what he was doing, and that's why he went from being, I think, a brigadier general at the beginning of World War II to a four-star general at the end, because they recognized that he had these unbelievable abilities to do things that other people could not do. So I think it would be interesting to reflect on Steve Jobs as an example of somebody like this, who was both a great leader of other people, but also, in the end, became kind of a problem for the organization.
Gene Kim (01:20:10):
Gene here. Dr. Westrum mentioned General George C. Marshall, both in this interview and in the first interview. I had asked him, what was General Marshall a maestro of? And he responded, "Of being a general." But after doing some research, it occurs to me that he was a maestro of much, much more. So in the book The Generals, which I mentioned earlier, by Thomas E. Ricks, the first third of the book seems to be written about General Marshall and how his work so influenced and contributed to the US victory in World War II. But then I remembered hearing this NPR Planet Money episode on the Marshall Plan. And in fact, General Marshall's impact there is maybe even bigger than in World War II, so I'll put a link to that in the show notes. Here are some notes I took after reviewing the transcript. So after World War II, Europe was in chaos.
Gene Kim (01:21:07):
Over 50 million Europeans were homeless. People were starving. The entire continent was desperate for basics like wheat and milk, also coal, tractors, and railroad lines. General George Marshall wanted to retire, but then was asked by President Truman to become his Secretary of State. And so the first big question is, what do you do with Germany? So I'm quoting: "Up until this point, America still thought of Germany as the enemy. President Roosevelt's Secretary of the Treasury, Henry Morgenthau, believed that the policy should be to de-industrialize Germany." Literally to, quote, "turn it into pastureland and take out all of its industrial power and push it out." Which is what victors typically did for thousands of years. But Marshall's stance was essentially that Europe is broke, it needs help. If it doesn't get aid, things are going to get uglier. So the Marshall Plan rebuilt Germany, but that money also convinced France to let a rebuilt Germany exist.
Gene Kim (01:22:08):
It created a Western bloc united against communism that would lead directly to the European Union. George Marshall won the Nobel Peace Prize for this work. The Marshall Plan ultimately cost American taxpayers $13.2 billion. Adjusted into today's dollars, that would be $150 billion. But that actually understates the number, because our economy is so much bigger now. "A better yardstick would be to compare against GDP. The Marshall Plan amounted to 5% of GDP." So that would be nearly a trillion dollars today. "The Marshall Plan wasn't just about cleaning up the ruins of a past war, it was about preventing a new one." Holy cow. That was an amazing NPR Planet Money episode. For anyone interested in economics, it is definitely worth listening to. And this also makes me want to read a biography of General George Marshall, covering not just his military background, but clearly the vision that created and shaped Western Europe.
Gene Kim (01:23:18):
To wrap this section up: I had asked Dr. Westrum last time, what was General Marshall a maestro of? And his answer was, "He was a maestro at being a general." But I would claim he was actually a maestro of much, much more than that. Two more things before we return to the interview. One, General Marshall was the Chief of Staff of the US Army, so that is the highest-ranking officer in the US Army. That is the Army's equivalent to the Chief of Naval Operations, which has come up numerous times over the last four interviews. And lastly, as much as we talk about these gifted individuals to whom society owes so much, I still believe that leadership is something that can be learned and taught. And there is a considerable body of research that supports this, such as the transformational leadership instrument, which showed up in the 2015 State of DevOps research.
Gene Kim (01:24:13):
So while there may be aspects of these technical maestros that are innate, even they cannot do it alone. So we need leaders who can combine the skills and gifts of these technical maestros with the ability to harness the full creative potential of the entire organization to achieve increasingly complex and difficult goals. With that, back to the interview.
Gene Kim (01:24:35):
Yeah, it just occurred to me as you were talking what those kinds of structures are. In fact, I'm going to reread the famous organizational typology model. All those things that I mentioned in the Team of Teams context are specific examples of how information is actively sought and shared. They're the specific ways that teams and functional groups bridge. And these are mechanisms by which new ideas are injected into the organization. Can you describe other mechanisms you've seen, in terms of novel ways that these high-trust, high-reliability, high-functioning, high-performing teams work? What are these kinds of structures that don't show up in the other ones? What are the ones that [inaudible 01:25:21] you?
Ron Westrum (01:25:22):
If you read both of the books on the Navy, It's Your Ship and Turn the Ship Around! and so forth, and you look at other studies, which are less well known, about generative organizations, what you find is that, first of all, people tend to create a high trust level first. They will do things that let the people at the working level know that they are respected, that the management has confidence in them. And this is often the case for people who in the past may not have been the best performers. So it requires, again, a tremendous level of confidence to be able to give these people the power to do, essentially, something that in the past they have not been very capable of doing. So creating confidence in the workforce is one of the first steps that you see in terms of creating a generative organization. Because people get that self-confidence from, A, getting better at what they do.
Ron Westrum (01:26:19):
And, B, when they're recognized. And this is a very important principle. It's that people have organizational justice: if they do something that's good, the people on top recognize it. The worst thing in the world is to have the wrong guy given the prize for something that somebody else did, because that's going to destroy the confidence of the people in the organization. So you have to be the kind of leader who listens, who goes down into the spaces where people are and finds out what they're doing, who really knows what's going on, and makes decisions based on a real, deep technical or socio-technical knowledge, and not sort of flair, or making it up, or just, basically, undue pride. One of the classic examples was a guy who was managing automobile companies and so forth. And they gave him the worst-performing factory.
Ron Westrum (01:27:13):
So the first thing that he did, basically, is he created new lockers for the men and got the lockers painted and so forth. And these are the people who in the past had not been doing a very good job. But when they saw that he was interested in them, when they saw that he was willing to give them the benefit of the doubt, they then wanted to perform for him. And when you get people who want to perform for the top guy or gal, that's 50% of the way right there. There was a case where, basically, on a project similar to Sidewinder, they were doing this test in the middle of the night in the desert, and the wind was howling and so forth.
Ron Westrum (01:27:56):
And one of the guys says, "I think the boss should be here." And the guy says, "Yeah, he should be here." So they called him up at home and it was two in the morning. And so the boss actually went out to the place where these guys were building this missile and so forth. And he worked with them until they got the thing right. And after that, there was no problem with people lacking confidence, because they knew he had confidence in them and they had confidence in him. It's building trust. And that trust is something that you cannot break. If you break the trust, if you don't do what you say you're going to do, or if you say something that obviously is not true, that trust goes away.
Gene Kim (01:28:36):
And can you give some concrete examples that you've seen, not from the very top, but of how the groups actually interact with each other? What are the novel things that you've seen happen kind of at these lower levels?
Ron Westrum (01:28:49):
The most common thing is the basic cooperation where people will train other people to do the stuff that they can do. So when workers cooperate, it breaks down the walls between the workers, if somebody says, "Well, I can see a better way that you could do this." There was a hospital, for instance, where that was the basis of improving their ability to do cardiac surgery. They had a cardiac surgery success rate that was okay, but they knew they could do better. So they would meet every week with the numbers on the surgeries: how many people survived, how many people didn't survive, and so forth. And every week they would sit down with the numbers and go over this stuff. And surgeons would go and watch other surgeons operate, to learn better techniques from the people around them.
Ron Westrum (01:29:38):
Well, it takes a lot of self-confidence and it takes a lot of cooperation to get to the point where you're willing to learn from other surgeons, because usually surgeons don't like to have other surgeons reflect on what they're doing. But when somebody is really doing something well, then, of course, it's something that you should emulate, that you should do yourself. And the interesting thing is that every week they had to sit down with these numbers and look at how people had actually made out. And not everybody could handle it. Some of the surgeons left; they didn't want to be judged by other people. When you're in a confident situation, you say, "Okay, well, clearly Bill is doing something I can't do. I need to learn that."
Gene Kim (01:30:20):
But what I loved about the surgeon example was that it didn't involve the boss. It was about something that was happening within the surgeon ranks. Is there another sort of novel practice that you see kind of emerging from within the front lines?
Ron Westrum (01:30:32):
Well, how do you get things that happen on the front lines that are good to be shared with the rest of the organization? That's a leadership issue. Because when you've got something good, then you want to share it. Now, by contrast, you always have to have the image of the pathological organization here. There, if you've got somebody who's a Cinderella, or you've got a team that's a Cinderella team, you're going to crush them.
Gene Kim (01:30:57):
I'm sorry, a Cinderella team? By that, you mean?
Ron Westrum (01:31:00):
Basically, the thing about Cinderella is nobody likes Cinderella, because she makes everybody else look bad. So if you've got a team... Here's the perfect example: this guy who taught these kids high school calculus. I can't remember what his name was. Oh-
Gene Kim (01:31:16):
Oh, yeah. It was the Stand and Deliver person.
Ron Westrum (01:31:19):
Stand and Deliver. So the thing is, that guy was hated by the rest of the school district; he made everybody look bad. So there's always two possibilities for reaction. One reaction is, "Okay, he's doing it great, so I want to do what he does." The other reaction is, "He's doing it great, and he's really a problem for the rest of us." That's the Cinderella problem. So China Lake was like that. They would make other people look bad because they were high performers. So rather than duplicating China Lake, basically the whole system thought, "These people are out of control, so we've got to put them back in their place." Well, the truth is, when you've got people like this, it's like you're managing a stallion. You can either let them run or you're going to cripple them. So a lot of people feel, "Well, if that stallion is going to make me look bad, I'm not going to go along with that."
Gene Kim (01:32:17):
Love it. And by the way, that was an interesting example that you gave of Stand and Deliver. Even there, you could see a totally different dynamic within that classroom than you would have seen in the more traditional classroom setting. That is fantastic. So last time, you mentioned that you're working on a new book about information flow. Can you tell us why you're working on that book and what areas you are pursuing? We ran out of time last time.
Ron Westrum (01:32:44):
In a few words, basically, what I do is I go through a series of case studies of pathological, bureaucratic, and generative organizations. And so the thing is, you'd like to say, "Well, hey, all pathological organizations are alike, you can generalize, and so forth. And all bureaucratic organizations are alike," and so forth. It's not that simple. What you've got is a family of different responses that turn out to be pathological or generative or bureaucratic. And so what I do is I look at some examples in detail. For instance, in the generative area, I look at Southwest Airlines, which everybody would recognize as a generative organization. And I look at the Mayo Clinic. So the Mayo Clinic is one of these things where you'd think, "Oh my God, this is so praised to the skies, it can't be that good."
Ron Westrum (01:33:31):
But the more you get into it, the more you realize it is that good. So what is it that makes it that good? Well, fortunately the Mayo Clinic has a book on lessons from the Mayo Clinic and so forth. But the truth is that most organizations are not going to do it, for the same reason that people go to Southwest and they say, "Well, this is fantastic, what's the gimmick?" And the bad news is, there is no gimmick. You've got to go through all this stuff that we go through, otherwise you're not going to get an organization that performs like we do. So the thing is, you can have any number of visitors and so forth. And it's amazing how people go to the Mayo Clinic and they come back, and it's like they got religion.
Ron Westrum (01:34:08):
Well, why would they not? It has done really well. I'm sure the same thing is true of the Cleveland Clinic and so forth. I know less about it, and it doesn't have the biographers and historians that Mayo does, but that's the deal. So by the same token, there are things that typically are pathological. We've seen lots of examples. I'm not going to go through them at the moment, because we can't go into them in detail. But suffice it to say, you've got organizations where you have bosses who basically make people feel terrible. I learned about them originally because one of my students said that his father used to go to work at such-and-such a place, and he would get sick on the way to work. That's a pathological organization. By the same token, you have bureaucratic organizations where everything is basically done by the book, or somebody's book.
Ron Westrum (01:35:02):
And that's perfectly okay for certain applications. A bureaucracy is good... if you look at the vaccine rollout and so forth, in many cases you've got really good bureaucracies. If you look at the recent presidential election we had, the fact is that, basically, it was bureaucratic organizations and the people in them who made it work, because they did it by the book. Well, the truth is, there's lots of things that you can do by the book.
Gene Kim (01:35:29):
Are you talking about the execution of the vote counting?
Ron Westrum (01:35:33):
Yes. So here were ordinary people, and I emphasize this, ordinary people, but they did the right thing. They did it by the numbers and so forth, and it worked. And the funny thing is that obviously some people don't believe that it worked, and they're convinced that there was some sort of hanky-panky, even though there's no evidence that there was any hanky-panky.
Gene Kim (01:35:55):
And so why are you writing this book?
Ron Westrum (01:35:57):
I want to understand this stuff better. And by writing a book, you can't just give a few examples; you've got to get in there and look at the principle. How does it work? Does it always work this way? And so on and so forth. And that's the deal: when I write the book, basically I'm educating myself as much as anybody else. Because in the process of doing this, I will find out things that I didn't suspect. Somehow, in looking at the Boeing stuff, this word "family" keeps coming up for the Boeing organization that worked. This is not an accident. The fact that people experienced it as a family, the trust, was what allowed it to do the things it did. And the destruction of that family culture was exactly the thing that has basically put Boeing in the crosshairs of everybody at the moment.
Gene Kim (01:36:47):
Yeah. That resonates with me so much in terms of why a book? To learn. What is another surprise that genuinely surprised you as you delved more deeply into the specific examples of the generative?
Ron Westrum (01:37:00):
Well, I think that the thing that astonished me the most essentially was realizing that places like China Lake were not necessarily created by accident. China Lake borrowed a culture that had worked in World War II. And the organization went out of its way to maintain that culture in peace time. So, here's an important point, is that a lot of times these things aren't so mysterious. This is I think what Tom Peters' great mistake was. It's that he always saw that the high performers were people who sort of did things in a weird and different way and so forth.
Ron Westrum (01:37:34):
But the thing is Peters should have known better than that, because it was from Peters that I learned about the study on Navy captains, which showed that typically the captains who were top people in the service fleet all tended to practice in a very similar way, not the same way, but a very similar way. And so Peters should have taken that and said, "Okay, look, here's the answer." The answer is basically, there are commonalities. But he never got to that point. It was like everybody was doing something that was typically different and you couldn't copy it and so forth. The fact is you can copy it. That's the deal. There are things that are common principles. It is to find those common principles that I'm writing this book.
Gene Kim (01:38:18):
That's awesome. And maybe just to forgive Tom Peters, one of the giants on whose shoulders so much other work was built.
Gene Kim (01:38:27):
Oh, very good. I want to make sure I convey this. And so we've now spent probably five plus hours together and it's just amazing listening to these recordings and it is as dazzling listening to the recordings as it is talking with you. So hopefully that conveys to what extent I'm just truly enjoying every interaction that we have and as well as how much I'm learning from it.
Ron Westrum (01:38:48):
Oh, it's great. I think DevOps has done a wonderful job of pursuing this stuff.
Gene Kim (01:38:53):
Wonderful. Anything I can do for you in the meantime?
Ron Westrum (01:38:56):
You have, Gene.
Gene Kim (01:38:58):
What exactly is that? If you don't mind me asking.
Ron Westrum (01:39:01):
Well, DevOps took my ideas seriously and they put them into effect. And I think that the Puppet Labs thing is a perfect example of that. I think this is the first serious empirical test of my ideas. I'm delighted to know that they worked out. It doesn't always happen that way. The funny thing about theorists, by the way, it's just an interesting insight-
Gene Kim (01:39:27):
Gene here. Oh no, I lost a little piece of that recording. We had a hilarious discussion about the theorists and the experimentalists. The people who do theory building versus those who do the work of theory testing. I can't wait to tell Dr. Nicole Forsgren about just how much fun we had talking about that. Back to the interview, where I asked Dr. Westrum how people can reach him and what in particular he would love people to reach out to him about.
Gene Kim (01:39:55):
Dr. Westrum, I cannot overstate how much I've learned in these two interviews. How can people reach you and what do you want people to reach you about?
Ron Westrum (01:40:06):
Well, the interesting thing is that I've been reached by people in India, for instance, who were interested in things that were happening there. I've been reached by people in Japan. And basically, it's just write me at EMU or Eastern Michigan University or using my site on Facebook and so forth. I'm happy to talk to people who have an example that they think is really good. And I've learned by the way, about many of the things that I talk about by other people saying, "Well, you need to look at this and you need to look at that." And so it was my students that told me about the Benfold and told me about China Lake and so forth. And I remember in both cases, I originally said, "Oh, I don't believe that. That sounds like BS to me."
Ron Westrum (01:40:51):
And it wasn't until I read it myself, that I realized that, "Hey, this is pretty interesting to look at." And it was.
Gene Kim (01:40:57):
Is there any specific help that you're looking for?
Ron Westrum (01:40:59):
Yeah. I think the thing is that if people are using these things and finding that they're working, that's great. If they think the generative stuff isn't working for them, I'd like to know why. It's a problem not having good critics. And I think that's one of the things. If people have a problem with it, I'd like to know. I'd like to know, hey. The funny thing is that I often find the reverse and people are delighted to know that there are generative organizations, because if you've been only in pathological organizations and you figure it always has to be that way, there's good news. It doesn't have to be that way.
Gene Kim (01:41:37):
I hope you learned as much as I did in this interview. So you may have noticed that Dr. Westrum kept bringing up the COVID vaccination rollout. What's amazing is that I've gotten a chance to research this area with him, along with Dr. Steven Spear, Dr. Nicole Forsgren and others. I can't wait to share more about this when we are a little further along.
Gene Kim (01:41:58):
But I can share some of what I've learned so far, because on the next episode, I interview Trent Green, chief operating officer of Legacy Health, a $2 billion integrated delivery health system based here in Portland, Oregon, who gave me an amazing three-hour tour of the vaccination center at the Portland Convention Center, where they are doing over 8,000 vaccinations per day, up from 2,000 per day in January.
Gene Kim (01:42:26):
It is the most amazing example of how human creativity has been unleashed in service of the most important societal mission right now, which is vaccinating everyone on the planet in the shortest amount of time. Joining me on that interview will be Dr. Steven Spear, and we learn about how these incredible improvements were made in the vaccination rollout, why these improvements have typically been so difficult in the healthcare setting, how these problems very much map to the problems of integrated problem solving, using slow structures and fast structures, and how lessons from the COVID vaccination rollout can potentially inform improving the entire healthcare delivery system. It's such an amazing interview and I can't wait to share it with you. So see you then.
PART 4 OF 4 ENDS [01:43:19]