THE IDEALCAST WITH GENE KIM

The Sociology and Typologies of Organizations, and Technical Maestros with Dr. Ron Westrum

Episode 17
|
Ron Westrum
|
Emeritus Professor
|
1h 49m

- Intro


In the first part of this two-part episode of The Idealcast, Gene Kim speaks with Dr. Ron Westrum, Emeritus Professor of Sociology at Eastern Michigan University. His work on organizational culture and his contribution of the Westrum organizational typology model have been instrumental in understanding what makes a high-performing organization across industries. For decades, he has studied complex organizations from medicine to aviation to the nuclear industry.

In part one of their conversation, Kim and Westrum talk about the stark contrast between NASA’s highly experimental culture of the Apollo space program versus the highly compliance-driven culture of the US Space Shuttle program, and Westrum’s opinions on how to bring that experimental culture back. They also discuss the origins of the Westrum organizational typology model and some of the insights that led to it. Finally, Westrum shares what organizations should do when things go wrong in complex systems.


- About The Guests
Ron Westrum

Emeritus Professor of Sociology at Eastern Michigan University

Dr. Ron Westrum is Emeritus Professor of Sociology at Eastern Michigan University. He holds a B.A. (honors) from Harvard University and a Ph.D. in Sociology from the University of Chicago. Dr. Westrum is a specialist in the sociology of science and technology and in complex organizations. He has written three books: Complex Organizations: Growth, Development and Change; Technologies and Society: The Shaping of People and Things; and Sidewinder: Creative Missile Development at China Lake. He has also written about fifty articles and book chapters. His work on organizational culture has been valuable to the aviation industry and to medical safety, as well as to other areas of endeavor. He has been a consultant to NASA, the National Research Council, and the Resilience Core Group. He is currently at work on a book on information flow cultures.


- You'll learn about
  • Why much of the body of knowledge around safety culture came from sociology as opposed to psychology.
  • How Westrum views the stark contrast in NASA between the highly experimental culture of the Apollo space program versus what has been characterized as a highly compliance-driven culture of the US Space Shuttle program.
  • Insightful and useful opinions on what would be required to bring that experimental culture back in NASA.
  • The origins of the Westrum organization typology model and some of the insights that led to it.
  • Why Westrum views the notion of a technical maestro as important to getting the desired outcomes.
  • What Westrum thinks should ideally happen when things go wrong in complex systems.

- Resources
  • State of DevOps Reports
  • Westrum organizational culture
  • The study of information flow: A personal journey by Ron Westrum
  • Sidewinder: Creative Missile Development at China Lake by Ron Westrum
  • Complex Organizations: Growth, Development and Change by Ron Westrum
  • Normal Accidents: Living with High-Risk Technologies by Charles Perrow
  • Crew resource management or cockpit resource management (CRM)
  • The Human Factor in Aircraft Accidents by David Beaty
  • The Naked Pilot: The Human Factor in Aircraft Accidents by David Beaty
  • United Airlines Flight 232
  • Cockpit Voice Recorder Database
  • Captain Al Haynes' 1991 lecture at NASA Ames Research Center
  • It's Your Ship: Management Techniques from the Best Damn Ship in the Navy by Michael Abrashoff
  • Apollo 13
  • Space Shuttle Challenger disaster
  • Space Shuttle Columbia disaster
  • CBS News article: "Readdy says 'no rationale' for spy satellite inspection"
  • Apollo 13 (1995) - Square Peg in a Round Hole Scene
  • Health inequalities among British civil servants: the Whitehall II study by Jane Ferrie, Martin J Shipley, George Davey Smith, Stephen A Stansfeld and Michael G Marmot
  • Facing Ambiguous Threats by Michael Roberto, Richard M.J. Bohmer, and Amy C. Edmondson
  • DevOps Enterprise Summit Virtual
  • NASA Cut or Delayed Safety Spending by Stuart Diamond
  • Mars Curiosity Rover Landing Space 2015
  • How Apple Is Organized for Innovation by Joel M. Podolny and Morten T. Hansen
  • Arthur Squires
  • The Tender Ship: Governmental Management of Technological Change by Arthur Squires
  • Jacob Rabinow
  • Federal Research and Development Expenditures and the National Economy: Hearings Before the Subcommittee on Domestic and International Scientific Planning and Analysis of the Committee on Science and Technology, U.S. House of Representatives, Ninety-fourth Congress, Second Session
  • Turn the Ship Around!: A True Story of Turning Followers into Leaders by L. David Marquet
  • Excellence in the Surface Navy by Gregg G. Gullickson, Richard D. Chenette and Reuben T. Harris
  • How NASA Builds Teams: Mission Critical Soft Skills for Scientists, Engineers, and Project Teams by Charles J. Pellerin
  • The Generals: American Military Command from World War II to Today by Thomas E. Ricks
  • George C. Marshall: Four (4) Volumes - Education of a General, 1880-1939; Ordeal and Hope, 1939-1943; Organizer of Victory, 1943-1945; Statesman, 1945-1959 by Forrest C. Pogue
  • Team of Teams: New Rules of Engagement for a Complex World by General Stanley A. McChrystal with Chris Fussell, David Silverman, Tantum Collins
  • Forge of Democracy by Neil MacNeil
  • Freedom's Forge: How American Business Produced Victory in World War II by Arthur Herman
  • Email Ron Westrum

- Transcript

Gene (00:00:00): Welcome to another episode of The Idealcast with Gene Kim. This episode is made possible with support from LaunchDarkly. Get total control of your code to ship quicker, reduce risk, and reclaim your nights and weekends. Head to launchdarkly.com to learn more. Today, I am so delighted that I have on Dr. Ron Westrum, Professor Emeritus of Sociology at Eastern Michigan University. His name will be so familiar to anyone who has read the State of DevOps Reports that I had the privilege of working on for six years with Dr. Nicole Forsgren and Jez Humble, from 2013 to 2019.

Gene (00:00:40): This is the cross-population study that spanned over 36,000 respondents and allowed us to understand what high-performing technology organizations look like, as well as the architectural practices, technical practices, and cultural norms that predict high performance. And without a doubt, some of the key insights into what cultural norms might look like were made possible by the Westrum organizational typology model.

Gene (00:01:06): In that model, Dr. Westrum asserts that there is a continuum of safety cultures that fall into three general categories: pathological, bureaucratic, and generative. I'm quoting from his 2014 paper, The Study of Information Flow: A Personal Journey: "Pathological organizations are characterized by large amounts of fear and threat. People hoard information, or withhold it for political reasons, or distort it to make themselves look better. Bureaucratic organizations protect departments. Those in the department want to maintain their turf, insist on their own rules, and generally do things by the book, their book. And generative organizations focus on the mission. How do we accomplish our goal? Everything is subordinated to good performance, to doing what we are supposed to do."

Gene (00:01:53): It is difficult to overstate how much the work of Dr. Westrum influenced our thinking. It features prominently not only in the State of DevOps research, but also in The DevOps Handbook and Accelerate books. And yet, as much as so many of us in the DevOps community have read Dr. Westrum's work, few of us have actually seen or heard him talk, which is why I was so excited when I was CC'd on an email between Dr. Westrum and Dr. Forsgren. I almost fell out of my chair. I reached out to him to see if we could talk. And holy cow, I learned so much, and he helped me connect some very disparate but very important thoughts.

Gene (00:02:31): For decades, Dr. Westrum has studied complex organizations from medicine, aviation, and the nuclear industry. And in 2013, he wrote a book which I absolutely loved, called Sidewinder: Creative Missile Development at China Lake. In this episode, we'll learn about why so much of the body of knowledge around safety culture comes from sociology as opposed to psychology, and how he views the stark contrast at NASA between the highly experimental culture of the Apollo space program versus what has been characterized as a highly compliance-driven culture of the US Space Shuttle program.

Gene (00:03:10): He has some very insightful and very useful opinions on what would have been required to bring that experimental culture back. We'll also hear the origins of the Westrum organizational typology model and some of the insights that led to it, and the very important notion of a technical maestro and why he views it as so important to get the outcomes that we want.
And what he thinks should ideally happen when things go wrong in complex systems.

Gene (00:03:37): This was such an interesting interview, and I learned so much from it. I asked him a week afterwards whether we could record a follow-on with some follow-up questions, and I'm so grateful that he said yes. So this is part one of a two-part interview. Dr. Westrum, I am so honored that I have the opportunity to interview you. I've introduced you in my own words. Can you introduce yourself in your words and describe what you've been working on these days?

Ron (00:04:03): Sure. So I'm basically an organizational sociologist. I specialize in the sociology of knowledge, which is how people find out about things, in terms of what I do. And I've been fascinated by organizations since I was probably in high school. And when I went to college, that's what I studied. And at Harvard, I studied with Harrison White, who was a brilliant theorist who had a PhD in physics and in sociology. Unusual combination.

Ron (00:04:34): When I got to Chicago, I was very fortunate to work with Albert Wohlstetter, who was probably something like the world's expert on nuclear strategy. And then I worked briefly for the Rand Corporation. Eventually I went to Eastern Michigan University and I began working up the stuff that I was trained to do, and eventually wrote a textbook on complex organizations. So that was in 1984. And it was in 1984 that I first saw this book by Charles Perrow on accidents.

Ron (00:05:07): So that really got me going, and I was really fascinated by the stuff that Perrow was doing. And very shortly thereafter I got together with people in the aviation community, with Jim Reason and Irving Janis and others, people from NASA and so forth. This was just at the time that cockpit resource management was becoming popular.

Ron (00:05:28): And cockpit resource management is basically working on information flow, getting the effective flow of information from the pilots so they can safely navigate the plane. So from that, basically... I didn't invent CRM. This was other people who were in NASA, the airlines, and so forth. But I got really interested in this stuff. And the other thing is that the aviation industry got really interested in me. Well, I'll just tell you an interesting anecdote: I remember going to a meeting one time, and I was probably the only sociologist present. And so a very nice guy who worked in Canadian safety research came up to me and said, "I can't understand why you're here. Why are you here?"

Ron (00:06:08): So the next day I gave a talk and he came up to me again, and he said, "My God, thank God you're here." And I thought, "Okay. They understood." So really what I was talking about was the culture of aviation. What is it that makes an aviation organization effectively process information? So that then became the focus of the things that I did subsequently.

Ron (00:06:33): And I've been writing things for aviation safety and medical safety and so forth since then, as a result of basically those early responses by people out in the organizational field to how they were doing their job. And it's sort of funny. I remember somebody coming up to me at a meeting one time and saying, "Are you the famous Ron Westrum?" I said, "I don't know that I'm famous. But I'm Ron Westrum."
And so that has been the response: people really like the cultural types that I came up with, and they find them very useful in interpreting organizational behavior.

Gene (00:07:08): Is there an easy answer for why it is that so many prominent figures in the safety field come from sociology? In fact, for that matter, I wouldn't even claim to understand what sociology is. I suspect you would not find it at all surprising. Can you help educate us on that?

Ron (00:07:23): Yeah, I'll tell you the truth, which is the important part. So the thing is, in aviation studies, in terms of what we've been looking at, what is it that determines safety? It's mostly been psychologists, either human factors people or people who are interested in cognitive psychology. And so I'm actually, believe it or not, a standout in the sense that I'm a sociologist. Perrow is a sociologist too. And actually there were more people like Jim Reason, who was a psychologist, and Irving Janis, who was a psychologist.

Ron (00:07:53): But the truth is what we're really looking at is social psychology. What's the difference between social psychology and sociology? Well, that would be a long discussion. But suffice it to say that the social psychologists tend to look at these things more as a psychological phenomenon, and sociologists tend to look at them as an organizational or group phenomenon.

Gene (00:08:12): Gene here. Okay, that was super interesting. The perspectives one brings when one comes from psychology, which is grounded in an individual's perspective, or when one comes from sociology. Dr. Westrum will speak more about that in just a second, but I wanted to break in for a moment and make a couple observations about the work around airline safety.

Gene (00:08:33): He mentioned CRM, which is cockpit resource management, or sometimes crew resource management, which is widely heralded as a breakthrough in improving flight safety by how flight crews are trained. I'm reading from the Wikipedia entry on CRM. "It is a set of training procedures for use in environments where human error can have devastating effects. Used primarily for improving safety, CRM focuses on interpersonal communication, leadership, and decision-making in the cockpit of an airliner. Its pioneer is David Beaty, a former Royal Air Force pilot who later wrote a book called The Human Factor in Aircraft Accidents, first published in 1969."

Gene (00:09:14): He wrote a more mainstream book called The Naked Pilot: The Human Factor in Aircraft Accidents, which was published in 2011. "The term cockpit resource management was coined in 1979 by a NASA psychologist, John Lauber, who had studied communication processes in cockpits for several years. While retaining a command hierarchy, the concept was intended to foster a less authoritarian cockpit culture, where co-pilots were encouraged to question captains if they observed them making mistakes. CRM grew out of the 1977 Tenerife Airport disaster, where two Boeing 747 aircraft collided on the runway, killing 583 people. A few weeks later, NASA held a workshop on the topic, endorsing this innovative training. In the US, United Airlines was the first airline to launch a comprehensive CRM program, starting in 1981. By the 1990s, it had become a global standard."

Gene (00:10:07): Okay, lastly, "CRM is concerned with the cognitive and interpersonal skills needed to manage resources within an organized system, as opposed to the technical knowledge and skills required to operate the equipment.
In this context, the cognitive skills are defined as the mental processes used for gaining and maintaining situational awareness, for problem solving, and for decision-making. Interpersonal skills are regarded as communications and the range of behavioral activities associated with teamwork."

Gene (00:10:38): Okay, there are so many interesting papers and stories that we could go into here. I've seen so many documentaries on this topic and read so many papers and transcripts of airline disasters. But I think there's one that is worth going into in more detail. And that is United Airlines Flight 232, which happened in 1989. I will revisit this incident during the next break-in and contrast it with some other well-studied incidents. Back to the interview.

Gene (00:11:06): When you gave that talk, what is it that people heard that said, "Oh, your work is relevant here"? What is the perspective that you bring that is useful, that might've been primarily seen as a psychological phenomenon?

Ron (00:11:20): Well, I think the thing is that culture is terribly important when we look at organizations, because culture basically is what essentially sets the organization's parameters. Just as each of us has a psychological reality that is set by our internal parameters and our life experience and so forth, organizations likewise have a set of parameters that are set by their experience and their orientation.

Ron (00:11:47): And so if you begin to ask yourself, well, what is it that is going to get an organization to do really well? Then culture is going to be in there. It's going to be one of those things that is very important. The problem with culture is that it seems like an airy-fairy kind of thing. It's sort of gossamer; it doesn't really have a hard meaning for people. Until you realize that many of the hard outcomes... In aviation, a hard outcome might be hitting the earth with your airplane.

Ron (00:12:13): Hard outcomes are shaped by basically the organization's culture. After the Columbia shuttle exploded, many people inside NASA said, "We've got to change the culture." At the time that NASA did what it did, I was working with Lockheed, and we were basically looking at that sort of organizational culture in Lockheed. And this was a very important thing for Lockheed, because they knew that NASA was going to be looking at their organizational culture.

Ron (00:12:37): At just that point NASA decided that it would actually pay for a big study of organizational culture in NASA. Now NASA is a huge organization. It's got maybe 10 national laboratories and so forth, an amazing amount of hardware and software and people and so forth. So this was going to be a very, very powerful study, potentially. However, they also really did not want to do it. The reason I know that is the way they set it up.

Ron (00:13:06): They said, "Okay, first of all, everybody has to submit their requests for funds in three weeks." Now, remember, this is an organization with, what, 100,000 people. And they're all rocket scientists, and you've got to submit your proposal in three weeks. So I of course joined a team, and the other people on the team were, in some cases, even brighter than I was, which is really nice.

Ron (00:13:26): And so we submitted a proposal, and it was cordially turned down. And the organization that got the job specialized in lost-time accidents. Keep that in mind for a moment. So their specialty was basically avoiding slips and falls and things like that.
So they also had to produce, in six months, some major change in the organization. Now, remember how big this organization is; remember how many professionals and their qualifications are part of this. So after six months they did a survey, basically, and they asked the managers and the workers, how much change has there been?

Ron (00:14:02): Well, they were delighted to hear from the managers that something like 46% thought a significant change had taken place. Okay, well, the punchline: when they asked the workers, it was like 18% who said a significant change had taken place. Now this is a difference that we see in a variety of different organizations when basically management does something that it thinks is impressive, and the workers don't. This is true in medicine, it's true in aviation, it's true in all kinds of things. So basically nothing important had happened. Now that's what you would expect after six months. So a couple months after that, NASA decided that they had fixed the culture.

Gene (00:14:40): Of the entire ecosystem of suppliers and operation. I mean, what's the scope here?

Ron (00:14:45): That's right.

Gene (00:14:46): The entirety.

Ron (00:14:48): It's a joke. It's a joke. And so of course they then fired this organization, I don't need to name it, that had been doing this study, because they said, "You fixed the culture." Well, I can't tell you how stupid this was. And if you believed it, you were stupid, because it's going to take more than that to do it. And typically, by the way, unfortunately it takes a change of leadership to make a big difference like that. You can get changes in organizations in a short time, but it really takes doing. And usually that's for a smaller organization. I mean, many people have read the story of the USS Benfold. There's a book called It's Your Ship. S-H-I-P.

Gene (00:15:28): Abrashoff.

Ron (00:15:30): Abrashoff, yeah. I mean, and you find this, everybody has read this in the nuclear business and stuff like that. It is a great book. Now, the interesting thing about Abrashoff's book is that he apparently didn't have any training in how to create a generative environment, but he managed to do it basically in about six months. Keep that figure in mind: six months. He took his ship from being essentially at the bottom of the performance continuum to the top. But this is a ship; it's 300 sailors, male and female, on board. You can do that.

Ron (00:16:02): If you're talking about NASA, this is a huge organization and it's going to take a lot of work. And I knew from the nature of this project that they had really not made any difference at all. If you looked at the people that they hired to do it and so forth, they were not cultural specialists. They were good at basically getting people to communicate and so forth, but not the kind of change you needed.

Ron (00:16:25): And so the whole thing was basically a shuck, an enormous shuck. And here is one of the biggest organizations in the United States, probably one of the most important in the long run. And here we have people who know basically nothing. That's really a terrible statement, but it's true. Basically nothing about organizational culture. So it's not surprising to me, then, that NASA hasn't changed that much.

Ron (00:16:49): And I don't want to go into the politics of NASA, because this is not something I'm an expert on, but certainly looking at it from the outside you have to notice the difference between what happened with Apollo 13 and what happened with the Challenger.
I mean, here's a perfect example... You want to talk about culture? Here's a perfect example. Apollo 13 was one of the examples of generative culture at its best. And the people in NASA basically felt failure was not an option.

Ron (00:17:18): Eugene Kranz said, basically, "We are not going to lose all the astronauts. Whatever we have to do, wherever we have to go in the world, whatever processes we have to go through, we are going to get those men back." It was a very, very strong commitment. And the reason there was a strong commitment is that Kranz had been in NASA at the time they had the Apollo 1 fire, where they killed three astronauts; they burned to death. And so he had decided that after that, that was not going to be the way it was. He was going to change the culture. He was going to make sure that they would not do anything that was irresponsible. So we go forward, essentially, a couple of decades. And all of a sudden we have a different NASA. It's not all of a sudden, it's gradual.

Ron (00:17:58): But all of a sudden we have a very different kind of culture, a culture that is very bureaucratic, a culture that is much more concerned about schedules than it is about people. And we have a spaceship that's out in space, and we have a mission control team on the ground who really don't seem to care about what happens to these astronauts, because they saw in launching Columbia that there had been a foam strike. Now, foam doesn't sound like a dangerous substance, but of course it's a matter of how fast something strikes your ship. So there was an engineer who was so concerned about this. He wanted the Air Force to take pictures of the shuttle and make sure that it wasn't damaged.

Gene (00:18:38): From a satellite, re-task the satellite so they could actually examine...

Ron (00:18:42): I don't know how the Air Force was going to do this. But basically they said, "Okay, we'll do that." The mission management team found out about the request and they canceled it. They said, "We don't need to do that. This is not a mission safety problem. This is simply a scheduling problem." But in fact the ship was damaged. And very shortly thereafter, when the ship tried to re-enter the atmosphere, it was destroyed. All those astronauts, seven people, died immediately when this thing exploded. How did this happen? Why was it that they didn't pay attention to the people in the organization who were demanding, literally demanding, to get the pictures that they needed to find out if this ship was safe? So here's a perfect example of cultural differences. And that was why, after Columbia, NASA said they had to find out how they could change the culture. Here we get to the crux of the matter.

Ron (00:19:31): So how are you going to study culture then? Because clearly what was going on with Apollo 13 was exactly what you want to see. High commitment, high competence, and in the end, the astronauts were brought home. I think if you read the story of Apollo 13, it will absolutely blow your mind how complicated it was to get that ship back to earth safely, and those astronauts back on an aircraft carrier. By contrast, what happened with Columbia is horrible. The mission management team was only convened for about two and a half weeks. Of those two and a half weeks, they took two weekends off. So here were the astronauts in space, hoping that they were safe. And the mission management team was ignoring the problems that menaced their health. And I'm sorry, I had forgotten his name earlier: Rodney Rocha.
He was the engineer who had made such a fuss.

Ron (00:20:19): He was responsible for structural integrity. All right. So here's an example of where culture is important, even though it's hard for many people to envision culture, and it's especially hard if you don't have a way of measuring it. How do you measure the culture of an organization? I think that's probably one of your key questions. How do you do that? And so the thing is that unless you can measure, you have no idea what's going on with the culture. But what you do know, and this is a perfect example of this, what you do know is that the culture makes a difference in how everything comes out.

Gene (00:20:53): Gene here. I just wanted to break in quickly and give some more information on what Dr. Westrum was just referring to. Because, believe it or not, this also came up in the workshop that I took at MIT with Dr. Steven Spear in 2014. So I finally found a citation to this. It is CBS News, March 14th, 2003, two months after the Columbia disaster. Quote, "NASA official William Readdy, Associate Administrator for Space Flight and a former shuttle commander himself, told the Columbia Accident Investigation Board he did not consider asking for a spy satellite inspection of Columbia's left wing during the doomed ship's mission because the agency had already concluded the shuttle could land safely."

Gene (00:21:37): End quote. The article goes on to say that decision has been the subject of second-guessing and criticism in the wake of the disaster. In the days after the mishap, shuttle program manager Ron Dittemore said no request for satellite imagery was made because the resolution of the imagery, based on past NASA experience, would not be high enough to reveal damage to individual tiles, and because Boeing analysis had concluded Columbia could land safely.

Gene (00:22:03): During Columbia's mission, however, Wayne Hale, a senior flight director, now serving as launch integration manager at the Kennedy Space Center, made inquiries about the possibility of Air Force help inspecting Columbia. But those initial efforts were terminated by senior management.

Gene (00:22:19): Quote, "The space shuttle program did not want any data. And in fact, there was never a formal mission operations directorate request made from the flight dynamics officers or the flight director," Steve Stich, a flight director himself, wrote in an email to a colleague. End quote.

Gene (00:22:38): So of course, just like in any post-mortem, we know the dangers of playing would have, could have, should have. With perfect knowledge of the past we can always come up with better decisions, but it doesn't actually usefully inform whether we'd make better decisions in the future. But the story does seem to affirm Dr. Westrum's notion that there was a real cultural difference in the appetite for information that NASA had between the Apollo era and the Space Shuttle era.

Gene (00:23:04): In the introduction I had given an abbreviated version of the Westrum organizational typology model. I'm going to read the entire table that you are likely all already familiar with. So there are three columns in this table: pathological, bureaucratic, and generative.
In pathological organizations, information is hidden, messengers of bad news are shot, responsibilities are shirked, bridging between teams and functional groups is discouraged, failure is covered up, and new ideas are crushed.

Gene (00:23:36): In the middle are bureaucratic organizations, where information may be ignored, messengers of bad news are tolerated, responsibility is compartmentalized, bridging between teams and functional groups is allowed but discouraged, the organization is viewed as just and merciful, and new ideas create problems.

Gene (00:23:57): Whereas in generative organizations, which are mostly high performers, information is actively sought, messengers are trained to tell bad news, responsibilities are shared, bridging between teams and functional groups is rewarded, failure causes a genuine sense of inquiry, and new ideas are welcomed.

Gene (00:24:17): I think clearly we are hearing those norms of the bureaucratic culture of the Space Shuttle program versus the generative culture of the Apollo era. So in my first break-in, I had mentioned United Airlines Flight 232 in the context of crew resource management, which through training had transformed the dynamics and information flows within the airline cockpit. Flight 232 is so interesting and memorable for so many reasons, but CRM is also credited with saving so many lives in this particular disaster.

Gene (00:24:51): I'll read from the Wikipedia entry. "This was a regularly scheduled flight from Denver to Chicago on July 19th, 1989. The DC-10 serving the flight crash-landed at Sioux City, Iowa, after suffering a catastrophic failure of its tail-mounted engine, which led to the loss of many flight controls. Of the 296 passengers and crew on board, 112 died during the accident. Despite the deaths, the accident is considered a prime example of successful crew resource management because of the large number of survivors and the manner in which the flight crew handled the emergency and landed the plane without conventional control."

Gene (00:25:28): I remember reading the cockpit transcript of this flight back in 1995, back when I was a graduate student at the University of Arizona. And then I remember reading a lecture that the captain of this flight, Captain Al Haynes, gave at the NASA Ames Research Center on May 24th, 1991. Last night, I read the transcript of that lecture, and it is as riveting as I remember it being when I first read it back in 1995, 25 years ago. I'll put a link to it in the show notes. I'm guessing it was probably an hour lecture. I'm going to read two portions of his talk.

Gene (00:26:04): The first is the nature of the failure. He said, "Well, on July 19th, Murphy's law caught up with us and we did lose all three hydraulic systems. And as a result, we had no ailerons to bank the airplane, we had no rudder to turn it, no elevators to control the pitch, we had no leading edge flaps or slats to slow the plane down, no trailing edge flaps for landing, we had no spoilers on the wing to help get us down or help us slow down once we were on the ground. And on the ground, we had no steering, nose wheel or tail, and no brakes. So what we had, what we went through today on one of the simulators, was the throttles on the number one and number three engines to control us."

Gene (00:26:47): So I'm imagining that earlier in the day, they must have had all the people attending the lecture go into a simulator to experience what it was like to try to control that aircraft. That must have been incredible.
And then later he talks about CRM. He says, "As for the crew, there was no training procedure for a complete hydraulic failure. We've all been through one failure or double failures, but never a complete hydraulic failure. But the preparation that paid off for the crew was something that United started in 1980 called cockpit resource management. All the other airlines are now using it. Up until 1980, we worked on the concept that the captain was the authority on the aircraft, and what he or she said goes. And we lost airplanes because of that. Sometimes the captain is not as smart as we thought they were. And we would listen to him or her and do what they said, and maybe they didn't know what they were talking about. On that day we had 103 years of flying experience there in the cockpit, trying to get that airplane on the ground. Not one minute of which we had actually practiced a complete hydraulic failure, not one of us. So why would I know more about getting that airplane on the ground under those conditions than any of the other three? So if I hadn't used CRM, if we had not let everybody put their input in..."

PART 1 OF 4 ENDS [00:28:04]

Gene (00:28:03): ... used CRM. If we had not let everybody put their input in, it's a cinch we wouldn't have made it. He continues, "If you read the cockpit voice recorder transcript, there's a lot of that going on. When are we going to put the gear down? I don't know. How are we going to get the landing gear? Maybe one of two ways, let's try it. So in short," Captain Haynes concludes, "CRM really paid off."

Gene (00:28:25): So one of the most famous and memorable aspects of this accident is that there just happened to be a DC-10 instructor in the cabin of the plane, Captain Fitch. Instructors have a reputation of not only having expertise, but also having lots of experience. So Captain Fitch joins the cockpit crew and stands in between Captain Haynes and the copilot to control the throttles, which were the only means of controlling the aircraft.

Gene (00:28:51): So here's what Captain Haynes said about that during the lecture. "We were told that there was a DC-10 captain in the back who was an instructor. And we like to think that instructors know more than we do. So I figured maybe Denny Fitch knew something that we didn't. So we asked him to come up. Well, he took one look at the cockpit, and that was the extent of his knowledge. It was sort of funny listening to and reading the transcript, because he's about 15 minutes behind us now. And he's trying to catch up. And everything he says to do we've already done. And after about five minutes, that's 20 minutes into this operation, he says, 'We're in trouble.' We thought, that's an amazing observation, Denny."

Gene (00:29:29): Laughter in the audience. "And we kid him about it, but he's just trying to catch up with our thinking. We were 15 minutes ahead of him. When he found out that he didn't have any new knowledge for us, he says, 'Now what can I do to help?' I said, 'You can take these throttles.' So he stood between us, not kneeling on the floor as the news media said. He took one throttle in each hand, and now he could manipulate the throttles together."

Gene (00:29:52): "With the number two throttle frozen, we couldn't get ahold of the throttles together. He could. And we said, 'Give us the right bank. Bring the wing up. That's too much bank. Try to stop the altitude.' He tried to respond. And after a few minutes of doing this, everything we do with the yoke he could correspond with a throttle.
So it was a synchronized thing between the three of us. With Dudley," I think that's the flight engineer, "still being able to do all of his communications. So that's how we operated the airplane, and that's how we got it on the ground."

Gene (00:30:21): So it is amazing to me that in the middle of this emergency they decided to integrate a new member onto their team. They quickly assessed the skills that he had, which were not as revelatory as they might have hoped. But they integrated him into the effort to control the airplane, and ended up landing on the ground. So clearly, this was not an individual effort, the realm of psychology. It was a collective and very dynamic effort from a group of individuals, the realm of sociology.

Gene (00:30:51): One last side note: in the lecture, Captain Haynes also mentions the interaction that they had very early in the emergency with the San Francisco area maintenance crew. Those are the maintenance experts sitting in San Francisco for each type of equipment that United flies. They have all the computers, all the logbook histories, the history of the aircraft, all the information that they can draw upon to help a crew that has a problem.

Gene (00:31:15): And as he says, "Unfortunately, in our case, there was nothing they could help us with. Every time they tried to find something that we could do, we had already done it or couldn't do it because we had no hydraulics." So take that phenomenon of Captain Fitch being 15 to 20 minutes behind the cockpit crew. This was even worse. In the last episode I talked about the slower, integrated problem-solving paths that occur when you go up and down the organizational hierarchy, and the much faster integrated problem-solving paths when it's within a team or across teams along sanctioned interfaces.

Gene (00:31:49): Fast and slow here are a proxy for four characteristics: frequency, speed or latency, granularity, and accuracy. Operations favor the first two, frequency and speed, whereas planning, preparation, and assessment favor granularity and accuracy, because we don't want to make plans on categorically false information.

Gene (00:32:11): I think the dynamics of the cockpit inside United Flight 232 present such great examples of these. Examples of the slower cognitive activities include establishing the CRM protocols, training, and rollout across all United States airlines, and setting up those maintenance crews, say, at the San Francisco Airport. Whereas the faster integrated problem solving includes figuring out how to keep the plane flying, assessing whether the maintenance team can actually help (in their case it didn't), assessing whether and how Fitch could help, integrating him into the flight operations, and establishing the new roles and responsibilities.

Gene (00:32:50): And holy cow, reading the actual cockpit transcripts, it is amazing to see how laissez-faire the communications between the cockpit crew and air traffic control were. Often there were significant misunderstandings and lots of need for repetition. In the lecture, Captain Haynes described his frustrations at the time with communications. And he had delegated all external communications to his flight engineer, because his focus was very much keeping that plane flying and figuring out how to get the plane landed on the ground.

Gene (00:33:22): Okay. We started this break-in by comparing the appetite for information within NASA during the Space Shuttle era versus the Apollo era.
Before we go back to that interview, I just want to mention that one of my favorite scenes in Apollo 13 was the scene where they have to figure out how to change or connect the CO2 filter in order to save the astronauts. Gene Kranz, as played by Ed Harris, said to all the engineers, "Go figure out how to fit a square peg into a round hole." In the next scene, the engineers are assembled with the mission of how to fit this, a big square plug, into that, a small round plug, using only that, a bunch of parts, including what look like space suits, tubing, and duct tape.

Gene (00:34:09): In this case, the Apollo 13 astronauts were able to leverage the slower, cognitive problem-solving capabilities of an enormous group of talented engineers to help them get safely back home. In contrast, on United Airlines Flight 232, because of the nature of the emergency, the only experts at diagnosing and flying a plane that had lost all three hydraulic systems were in that cockpit.

Gene (00:34:34): Okay. Back to the interview, where Dr. Westrum talks about the continuation of his quest to understand the dynamics of generative organizations.

Ron (00:34:43): Basically I started on this with research and development, because I knew that the nature of successful R&D teams had to be very different from the typical sort of bureaucratic organization where you do things by the book, and so forth. So what I did is, I began to think about, okay, what is the nature of the culture that is going to be highly successful at very complicated things?

Ron (00:35:04): So this is a really key question, okay? Because this is what has motivated many people to talk about high reliability organizations. Are you familiar with that category?

Gene (00:35:13): It's been probably ten years since I've read any papers there, but I'm certainly familiar with the vast body of research. But please, refresh my memory.

Ron (00:35:19): The thing about high reliability organizations is, they do things that are very difficult to do, and they do it on a routine basis. Now, Chuck Perrow's book is called Normal Accidents, and the reason that he calls them normal accidents is that you expect things to go wrong. The funny thing about high reliability organizations is, you don't expect things to go wrong, in the sense that you have a bad outcome. You may have mistakes and things like that, but you're able, essentially, to navigate through problems and so forth, and be successful in the end.

Ron (00:35:49): Well, Perrow is very pessimistic about this. And I think if you look at third world countries, I would agree with Perrow: you expect things to blow up, you expect trains to go off the rails, you expect airplanes to fall out of the sky. But ordinarily, we have people who manage high reliability organizations and they're successful. I mean, aviation is the perfect example of that. In spite of the fact that we have things like the 737 MAX and so forth, on the whole, basically, aviation is amazingly safe compared with, let's say, medicine. You're much more likely to die in a hospital, basically, than you are in an airplane.

Ron (00:36:25): I know, basically, a couple of people whose spouses died as a result of infections that they got in the hospital after there had been a successful operation. One of these people was a very close friend of mine.
And so it's heartbreaking when you've done successful and difficult surgery, and so forth, and then the patient, while recovering in the hospital, catches an infection and eventually that patient dies as a result of the infection.

Ron (00:36:49): Obviously, this is something people have worked on in medicine a lot, and there have been a lot of solutions to this, even in Michigan, by the way. We had a project called the Keystone Project, which was aimed, essentially, at eliminating a lot of the dangers of infection. And it was fairly successful.

Ron (00:37:04): So the thing about high reliability organizations, though, is they do difficult things on a routine basis with good outcomes. In the third world, this is typically not how it goes. You don't expect things to be successful, because so often, as I say, the train goes off the rails, or the nuclear reactor explodes, or airplanes fall out of the sky, and so forth. Often from very petty kinds of things.

Ron (00:37:29): So high reliability, then, is an example of a culture. It takes a certain kind of culture to be able to manage, for instance, an electric grid or a nuclear power reactor. And in the United States we've been very good, actually, in many cases, at inculcating a culture. In nuclear power, for instance, the nuclear power industry realized that if they were not successful, if there was another nuclear accident, basically, sort of like Three Mile Island, the nuclear power industry would literally go out of business. So they had to be perfect. Well, how did they get to be perfect? Well, I'll give you an example, all right? So one day I got a call from the head of a nuclear reactor. I won't tell you where this was. But he called up and he said, "Dr. Westrum," he said, "you're supposed to know something about information flow." And I said, "Yeah." And he said, "Well," he said, "how do we make sure that we hear the faint signals?"

Ron (00:38:22): Now, in the nuclear business, basically a faint signal is something that's a precursor of an accident, but it's often difficult to see. It's not right in your face; it's one of those things where you think, "Well, here... that's an indicator that something bad is going to happen," all right? So the answer... as a consultant, you don't ever want to be without an answer. The answer I gave him was, "I don't know." I can't tell you how bad I felt. I thought, "Well, why is it that I don't have an answer for this?"

Ron (00:38:49): So of course, needless to say, I worked until I got one. But the simplest answer is basically, you assure conditions for good information flow. And in this particular area, there are three things that you need. First of all, you need to have the employee identify with the organization. The employee has to feel that he or she is a part of this organization. They're not alienated and so forth; they're really totally engaged, all right? So that's number one.

Ron (00:39:12): Number two is, you have to make available to people information that they may not have immediately at hand. I mean, for instance, there was a famous nuclear reactor explosion in Japan. Well, it wasn't an explosion, it was a contamination, that took place because the workers were trying to find faster ways to do things. Well, one of the faster ways to do things was to put together, basically, something that was supposed to be only a subcritical mass, all right? And that was a killer. I mean, literally, it eventually killed two of the nuclear workers, and so forth, because it was not something the workers should've done.
They should've had access to the expertise of the engineering people.

Ron (00:39:49): But because there was a kind of silo, the technicians were in one silo and the engineers were in another silo, they didn't ask. Something similar, by the way, happened with the Hubble Space Telescope. The technicians who were working on the telescope had basically gotten one of the tests to pass by putting in some shims on the mirror, okay? Well, listen, it didn't make a very big difference in terms of inches, but it was a huge difference in terms of the ability of this huge mirror to focus.

Ron (00:40:18): So they should've asked the engineers. But they didn't like the engineers. And literally, every time the engineers knocked on the door where the technicians were working, they turned up the music. So when you've got silos like this, where the information is not passing, it can be really terrible. And in the case of Hubble, what it meant is that a $3 billion space mission had to be put together to go and fix the telescope. Fortunately, the mission came out okay, but that was $3 billion later. And what would've happened if some of those astronauts on that mission had died?

Gene (00:40:49): Oh, this is amazing. I'm dazzled. And I'll ask you another question, but you said there were three components. One was...

Ron (00:40:55): Oh, I'm sorry. So the third component is empowerment. The worker has to feel that he or she has the ability to come forward and make a case. I know this has become a very big deal now with what Amy Edmondson calls the fearless organization. And Amy Edmondson, I think, is a really fine researcher and a very good person, and she has made a very strong case that fear is a great danger for the signals that organizations need to have. You don't want to have fear.

Ron (00:41:21): So how do you drive out fear? Okay, well, that's a good question, isn't it? So anyway, that's the third element. So you've got identification, expertise, and empowerment. Those are the three things that you need to have. And that would've been a great answer to the question that this guy who was running the nuclear power operation had. But I didn't have it at the time, and I put it together later. But it's a logical embodiment of the stuff I've been doing.

Gene (00:41:46): Can we just back up a little? Because you've mentioned words like generative, and maybe just to put some context for this question: having read your papers, I was dazzled by the precision of how you describe the beliefs, behaviors, values, and information flows within organizations. So could you describe what the three Westrum typologies are? Maybe beyond the tables, which I'm very familiar with. Maybe speak to each one of them, and maybe even describe the aha moment of when these things crystallized into, in my mind, this almost Mendeleev periodic table. It was obvious once you see it, but until then, right?

Ron (00:42:27): Right. Okay, so let me answer your question. All right, so the first thing I did, and I'm going to make this a lot shorter than it really took, but basically, I began to think, "Okay, so what is basically the case with people who are doing an innovation? How do they have to think?" And the answer is, they have to think in a generative way. They have to think, "What would essentially provide a sort of perfect solution to the problem?"

Ron (00:42:49): So they have to be very focused on the mission.
And this is the key thing about generative organizations: highly focused on the mission. Now, by contrast, if you have bureaucratic organizations, you're focused on the method. You've got a book of rules, you've got organizational norms. And that sometimes can work extremely well, and it works extremely well when everything is known.

Ron (00:43:09): I mean, a good example would be the building of the Empire State Building in New York City. Because here was something where Starrett, the engineer in charge, was very, very, very organized. So he figured out exactly how this had to be done, and then they carried this out in 13 months. The Empire State Building went up in 13 months. And if you've ever looked at any of the pictures of the workers out on the 70th floor or something like that, out on a steel beam eating lunch, okay?

Gene (00:43:41): In the wind, right? I mean...

Ron (00:43:42): In the wind, yes. And just getting me out on a beam would be curtains, I'm afraid. Okay, so that's... The distinction number one is, basically...

Gene (00:43:51): Wait, I'm sorry. So why did you bring up the Empire State Building? The fact that it went up in 13 months?

Ron (00:43:55): Because they knew how to do it. Starrett knew how to do it. Now, when you've got something like the aviation system and you can't necessarily do things by the rules, you have to have expertise and skill and stuff like that. And even more complicated, you've got a nuclear power source, you've got an electric network; a lot of times you can't predict in advance what kind of information you need. You have to figure out how you're going to do this on the spot, okay?

Ron (00:44:23): Apollo 13 is the perfect example of basically a generative organization. They did not know how they were going to do that in the beginning. I mean, when they realized that that tank had blown up in space, what were they going to be able to do to get back to earth? And, fortunately, the people at Grumman on Long Island had thought through at least an issue associated with it, which was that they might need to use, essentially, the moon lander as a sort of lifeboat.

Ron (00:44:53): And it was because they used the moon lander as a lifeboat that they were able to cut down on the amount of power and oxygen and so on that was being used, because the moon lander needed a lot less oxygen and power than the command capsule, okay? Do you understand?

Gene (00:45:05): Absolutely. I love the Gene Kranz book. Just dazzling beyond belief.

Ron (00:45:10): Yes. So, the thing is, then, that they had to have the confidence and the expertise to be able to put all that together. Now, this is very different from the Empire State Building, where they actually knew how to do it; there was a formula to do it. And so, if you've got a formula, you can have a bureaucratic organization and it's just fine.

Ron (00:45:29): So one day I was at one of the auto companies, I don't want to say which one. And I realized that in addition to the two categories I had set up, generative and bureaucratic, there had to be a third category, pathological, okay? Because that's where I was. So I thought, okay, so now we've got three categories, okay? The sad thing is, many of us work in pathological organizations. We don't expect things to go wrong every day, in most cases, but in a pathological organization they can go wrong every day.

Ron (00:45:59): Where did I learn about this? Well, one of my students. One of my students said, "My father used to work at Fisher Body."
Fisher Body was an organization associated with General Motors. And he said, "Every day when my father drove to work he would get sick on the way to work." That's what living in a pathological environment is like. Because basically, the pathological organization is oriented toward pleasing the people at the top. It is what they want, what they need, and so forth, that drives the organization.

Ron (00:46:25): Just the opposite from generative, where the mission really takes priority. For bureaucratic organizations, there is something in between: basically, the structure, or the rules, or the department, or whatever, is the focus. But for pathological organizations, the focus is power, all right? Well, that's great if you're on top, but if you're not on top, it's pretty stressful.

Ron (00:46:46): The British did a famous study called the Whitehall Study. I don't know if you have heard of it. But in the Whitehall Study what they discovered is, the chance of having a heart attack went down every level you went up. People on top had the fewest heart attacks; people on the bottom had the most heart attacks. Now, consider that basically everybody in Britain, in principle at least, has the same health system. So what was the difference between the top and the bottom? And the answer was power.

Ron (00:47:10): So if you're powerful, it tends to make you healthy. However, if you're not on the top, and if you're somewhere down in the innards of the organization, then it's a very dangerous place to be, and people's reaction to that on a health level is often sickness. So I won't ask you about your organization, but I have asked people in big meetings that I've gone to to tell me what their organization was like.

Ron (00:47:34): So let me just give you an example. I went to the United Auto Workers. I won't tell you which auto organization this was from. But basically, I had about 50 people from 50 factories. And so I said, "Okay, you tell me, which is your organization? Generative, bureaucratic, or pathological, all right?" I knew the organization very well, because I'd worked there in the past.

Ron (00:47:57): So here's how it came out. As I expected, something like 20% were pathological. I expected a higher number for that, by the way. About 70% were bureaucratic. And 10% were generative. And I thought, "I wonder if it's really like that." So after I gave my talk I talked to one of the guys who came from a generative organization. I said, "Tell me what's generative about it." And he said, "Oh, we never do anything without management giving us a heads up, and so forth, or us giving them a heads up. So we work together on a cooperative basis."

Ron (00:48:29): Now, interestingly enough, people in Britain did a study of pharmacies and tried to figure out what sort of cultural level the pharmacies had. The people in the pharmacies resolutely refused to believe there were any generative organizations, because in their experience that is not how it went.

Gene (00:48:48): I'm sorry. I just want to make sure I got your point there about the United Auto Workers. The fact that 10%, you said, were generative?

Ron (00:48:56): Okay. Let me go through it again. So, as I said, I wasn't surprised to find that 20% of the organizations that they came from were pathological. And obviously, a lot of them were bureaucratic, 70%. But I was really surprised to find there were any generative organizations, because it's a bigger organization.

Gene (00:49:17): Right.
And did what you heard kind of satisfy your definition of generative? The fact that they seemed to be working together towards a common objective?

Ron (00:49:24):
I'll tell you what. What I did is, I gave them a list, the same list that was used by Jez Humble and the stuff with the-

Gene (00:49:31):
The DORA DevOps research, yes.

Ron (00:49:33):
Yes. So I used that same scale. And then I took these three categories and I said, "Now tell me what I didn't put in." And for each of these things, generative, bureaucratic, and pathological, they came up with additional identifiers. I looked over the list, and there wasn't anything that I disagreed with. I mean, they got it, okay? They understood. The fact is that, once you've got this stuff, it's a very powerful tool for helping you understand what's going on in your organization.

Gene (00:50:05):
I just wanted to break in to add a little bit more information about that Whitehall Study. I am reading directly from The Lancet paper. "The Whitehall study of British civil servants began in 1967 and showed a steep inverse association between social class, as assessed by grade of employment, and mortality from a wide range of diseases. Between 1985 and 1988 we investigated the degree and the causes of the social gradient in morbidity in a new cohort of over 10,000 civil servants."

Gene (00:50:36):
"In the 20 years separating the two studies, there has been no diminution in social class difference in morbidity. We found an inverse association between employment grade and prevalence of a whole host of diseases. Self-perceived health status and symptoms were worse in subjects in lower status jobs. There were clear employment grade differences in health risk behaviors, including smoking, diet, and exercise; in economic circumstances; in possible effects of early life environment, as reflected by height; in social circumstances at work, for example, monotonous work characterized by low control and low satisfaction; and in social supports."

Gene (00:51:14):
In their interpretations, I'm just cherry-picking a couple of sentences: "Men in the lowest grades had a three-fold higher mortality rate than men in the highest grades." They explore other factors that could relate to morbidity, such as smoking, obesity, less leisure-time physical activity, blood pressure, height. But they say, "Even after controlling for standard risk factors, the lowest grade still had a relative risk of two times higher mortality compared to the highest grade."

Gene (00:51:46):
They conclude, "One possible explanation of the remaining differences is job control and job support. Blood pressure at work was associated with job stress, including lack of skill utilization, tension, and lack of clarity in tasks." I will put a link to this paper in the show notes.

Gene (00:52:02):
So, yikes, here is my reaction. The goal of this podcast is to explore how we can truly unleash human creativity, both in service of better achieving organizational goals and to better serve the goals of everyone in that organization. I've heard so many stories about how, in generative organizations such as Toyota, Alcoa, and the Sidewinder missile project in Dr. Westrum's book, people talked about their time in those organizations representing the most satisfying point in their career.

Gene (00:52:35):
I think what the Whitehall Study shows is that there is a terrible cost and human toll of creating organizations where we don't achieve this ideal.
That it robs people of a sense of importance, of challenge and satisfaction in what they do, and even shortens their lives. Okay. Back to the interview.

Gene (00:52:54):
And I can personally attest, [inaudible 00:52:56] community, your instrument resonates so widely. And I would say, probably as a population, they probably resemble very much the population that you've studied. I'll tell you, I was a little bit surprised, after researching for this call, when I looked up the word generative: it didn't actually mean what I thought it meant. It was actually a pretty sterile definition, "capable of producing." Can you talk about what generative means to you? Why did you choose that word? And maybe, what other words had you considered?

Ron (00:53:23):
Absolutely. That's a great question. Actually, the truth is that I've always struggled with what to call the organizations that can do this, okay? Generative is what I ended up with in the end, but you could also call them creative. But if you're in the safety business, what is it that's going to make you an effective organization? That you can fix the problems that you've got.

Ron (00:53:46):
In fact, that's probably the best definition of generative: that you can fix the problems that you've got. Other people have asked me, "Why do you call it that? You could call it something else." And I said, "Good, get me a better word for it." Nobody has got a better word for it. Now, if you want to call them high reliability organizations, you can call them that, but it doesn't have the exact meaning. It's pretty close.

Ron (00:54:09):
But I think of the high performing organizations that I have looked at, at least 80% of them are generative. There are a few, I don't know, outliers. And actually, I would pick the Navy's nuclear power program; I'm not sure it's generative, okay? But they have good results. They also have some side effects. There's some stuff that they would probably not like to talk about, where nuclear materials were mishandled. But in terms of innovation, being able to get those things into ships and get them working, they're really good.

Gene (00:54:42):
There are two places I'd like to go. When you talk about Dr. Amy Edmondson's work: she actually wrote a paper with Dr. Michael Roberto about the two NASAs. That there's kind of one NASA that was the Apollo 13 era, the Gene Kranz era, with the high-risk, high-gain, exploratory culture, I think is what they called it. Where the weak signals were amplified and pounced on-

Ron (00:55:03):
That's right. Yes.

Gene (00:55:03):
... because they knew that that jeopardized the achievement of the mission, human life, and so forth. And then she called the Space Shuttle era a compliance culture, where it was all about maintaining an operational tempo, about low cost access to space, frequent access to space. And then that takes priority over everything else.

Gene (00:55:20):
So if you could go back in time and wave a magic wand, what would have happened as a result of the Challenger accident? What would you have wanted to have happened? How would you have driven out fear? Paint that story.

Ron (00:55:31):
Well, the Challenger is a little different than Columbia, but basically, I gave a talk one day, and actually the director of the safety division from NASA was there that day, all right? And I said, the thing is that you want an organization that can learn. You don't ever want to have the same accident twice.
And I said, "The Challenger and Columbia, it's almost like having the same accident twice." He agreed with that. I mean, I'm not going to tell you who he was and so forth, but that's the deal. So the thing is that what-PART 2 OF 4 ENDS [00:56:04]Ron (00:56:03):So the thing is, is that what was the common factor with the Challenger and Columbia? So one answer was they really cut back on safety. I mean, in both cases, they had cut back substantially on safety, 80% of the safety stuff was cut, all right? That's not a good sign if you're going to be in that space ship, okay? And most of those people have gone away who are typically watching over you.Gene (00:56:27):This is at the same time when they're sending retired senators to space, they're sending school teachers into space, this is kind of a very odd juxtaposition.Ron (00:56:35):Yes. Okay, so if you wanted to characterize those two kinds of things, compliance culture is another way of saying bureaucratic. But the thing about Apollo is that there was a gold-plated program. If people in the program wanted something, they got it and they got it in spades. So basically the program was being driven by the mission, and by the time they got to the shuttle... I don't know what happened to the technological maestros that were there during Apollo, but they weren't running the show. There were smart people, I'm sure that the mission control people were probably pretty good, but the people who were in the trenches were not the same people. So if I was going to give you an anecdote about NASA, I wouldn't pick that.Ron (00:57:13):I would look at basically how they decided to go to the moon, because here was an example of an answer that came up by somebody very, very low in the organization. In fact, it was not a particularly important individual in the organization, but he actually had the answer. But what I remember is basically he came from a laboratory, which was not basically the sort of kingpin laboratory for space missions, but he had figured out that the right way to do it was to use a lunar orbit. So using a lunar orbit was something that he had to support with all kinds of facts and figures. So the first time he gave a talk about this, he was basically laughed out of the place. I mean, people who are really smart, made fun of him. But he was sure that was the right way to do it. So he went back and he basically did a whole bunch of new calculations, and he came back again. Well, this time he still didn't have numbers that convinced people and so forth, but they listened to him.Ron (00:58:11):And then there was a third and a fourth try until... And he finally got to the point where Wernher von Braun stood up in a meeting and he said, "Does anybody not think this is the right thing to do?" So that's one of the characteristics of a generative organization is that people are really interested in getting the mission done and how it gets done is less important than the fact that it does get done.Gene (00:58:32):I am grateful that LaunchDarkly has sponsored the DevOps Enterprise Forum for the last two years, where 60 of the best thinkers and doers I know come together for three days to create written guidance for the top problems facing this community. It's always been one of my favorite professional activities that I get to participate in. 
This sponsorship from LaunchDarkly makes possible the publication of this year's DevOps Enterprise Journal, where the hard work of this year's forum participants, as well as all the other journal authors, is published. LaunchDarkly is today's leading feature management platform, empowering your teams to safely deliver and control software through feature flags. By separating code deployments from feature releases, you can deploy faster, reduce risk, and rest easy. Whether you're an established enterprise like Intuit or an SMB like Glowforge, thousands of companies of all sizes rely on LaunchDarkly to control their entire feature lifecycle and avoid anxiety-fueled sleepless nights. Break away from the stressful, nerve-wracking releases which so many organizations face.

Gene (00:59:35):
By decoupling deployments from releases, developers are free to test in production and ship code whenever they want, even on Fridays. If there's a problem caused by a feature, a flip of a switch can disable it, without the need to roll back or redeploy code. If you want to make releases a snooze fest and start deploying more frequently and securely with less stress, head to launchdarkly.com.

Gene (00:59:58):
I trust that you are enjoying and learning as much from this interview with Dr. Ron Westrum as I am. If you are, you should know that Dr. Westrum will be presenting at the DevOps Enterprise Virtual Conference, which will be from May 17th to May 19th. I hope to see you there. Go to events.itrevolution.com/virtual for more information.

Gene (01:00:21):
I got to take my son to the Mars Curiosity 2020 [inaudible 01:00:24] JPL, the day before it was shipped off to Florida. It was so great. But apparently it's part of the Curiosity... No, it was the one before that, but the one with the very novel sort of landing vehicle, where they lowered it on a crane. They had a Nova special on it, and it was apparently the same dynamic. The first time they showed it to people, I mean, they were laughed out of the room. And then they had to show how problematic it was, and yet it turns out that there was a culture that allowed that best idea to solve the problem. So I just love all these analogies.

Gene (01:00:57):
But maybe I can just go back to that question. So if you could go back in time, armed with everything that you have now, plus I'm giving you a magic wand, can you describe what an alternate path of events might've been that could have changed the culture at NASA? What would you imagine is the sequence of events that would have caused a different outcome?

Ron (01:01:12):
Well, what you have to understand essentially is that the organization's parameters are set by the people at the top, okay? So what they encourage, what they permit, and so forth. And of course, all these things are affected by budgets; they're also affected by other external forces. You see, if you have the right people at the top, basically, you can get the right people in the slots below them.

Ron (01:01:34):
A famous technologist named Jack Rabinow, a great American inventor, had, I think, something like 230 US patents. He had a series of laws about technology, and one of the laws was: if the boss is a dope, everybody under him is a dope or soon will be. Well, this is a very important point, because if you don't have an organization where people at the top are highly respected and really know what they're doing, then the people under them eventually are going to be like that. So this is a very important thing.
So what I would have said about NASA is that you needed to put people on top who would make the right decisions.

Ron (01:02:11):
Now let's take Challenger as an example of this. All right, so the thing about Challenger was basically that, and I forget which director did this, but he would always ask... It wasn't Kranz, it was the guy who came before him. But he would always ask people at the beginning, "Should we do this mission or not?" Just an up or down thing. Now, you would think, well, that's not very scientific, but the fact is that really forces people to choose which way they really want to go, which way they really think is good.

Ron (01:02:46):
And so the problem with NASA at the time of Challenger was they had to have a scientific proof, okay? And so the people from Morton Thiokol had to present a sort of scientific basis for what they did. But they weren't asked the question, "Well, really, is it a good thing to do this right now?" Okay? And so, because they couldn't make a scientific... Later analysis showed that actually the data was there, they just hadn't managed to process it yet.

Gene (01:03:16):
Despite that famous PowerPoint slide [crosstalk 01:03:21].

Ron (01:03:20):
Absolutely. Right, yes. There's a famous guy named Tufte who has done the study of graphics and so forth. And so yes, he has. He shows you the graphic that they used, which essentially disguised the fact that there was a relationship. So the fact is that if Wernher von Braun had been in charge, instead of whoever was in charge of that, he would have said, "Are you willing to go forward with this? Do you think this is a good idea?" And they could have said no, just on their gut, okay?

Ron (01:03:51):
Now that doesn't sound very scientific, but that actually is what basically you need to do. So von Braun, or somebody like that, who really had a lot of self-confidence, would have been able to take that and say, "Okay, we're not going to do it. If we don't feel good about it, we're not going to do it." But it was a bureaucratic organization; they needed some sort of science and so forth. And so it was-

Gene (01:04:14):
Very interesting.

Ron (01:04:15):
Yeah. Well, I mean, this is something that I got, basically, from looking at some of the... What did some of the leaders say about this kind of stuff? And so a lot of times, if you don't have a lot of self-confidence, you're not going to make a decision where you can't point out what it is that is wrong. They knew something was wrong, but they didn't have the guts, essentially, to go forward on it. And so the guys said, "Well, if you can't prove we shouldn't do it, we'll do it."

Gene (01:04:43):
Just to double-check, so Rabinow, is this who you mentioned in the Sidewinder book, the Rabinow's [crosstalk 01:04:49].

Ron (01:04:48):
Absolutely.

Gene (01:04:49):
Okay.

Ron (01:04:49):
Jack Rabinow. He invented the system the Post Office uses for reading ZIP codes.

Gene (01:04:55):
That's so cool. And then secondly, I mean, it seems like there's a whole bunch of literature, I think, Thinking, Fast and Slow, Kahneman, Tversky... It would say-

Ron (01:05:04):
Yes, brilliant guy.

Gene (01:05:06):
And I don't claim to have any mastery of the material, but I think what you're saying is that these leaders put a lot of credence and credibility in that sort of, what does the fast part of the brain...
The bodily feeling of, is this a good idea or not, versus the highly methodical, show the calculations.

Ron (01:05:25):
Well, it was interesting, because it comes out with a contrary answer to the one that was actually the right one. I mean, in general, it is really true that the sort of reflexive answer is probably wrong.

Gene (01:05:38):
Mm-hmm (affirmative), that's right, that is one of the...

Ron (01:05:40):
Okay. The question is who you're asking the question of. If you've got the right people, then their gut reaction is probably right. It's very difficult to make generalizations; you have to really look at the circumstances. But you're right, the Kahneman thing. I think Kahneman is an absolute genius. There's no question about it. I met Kahneman's pal, what was his name?

Gene (01:06:04):
Tversky?

Ron (01:06:05):
Tversky. Yeah, I met him one day. I had a very interesting discussion with him. He's a very smart guy. But the two of them are just absolutely brilliant.

Gene (01:06:14):
Well, in fact, one of the... I'm not sure this is one of the headliner findings in the [inaudible 01:06:18], but it's one that really just lingers with me. It turns out, and this is correlative, so we can't make the assertion of prediction, but there's this marvelous correlation between all the metrics, all the technical practices, cultural norms and so forth, and one question you can ask. On a scale of one to seven, to what degree do you fear doing deployments? So the deployments are kind of where things integrate and blow up. It turns out, one is we have no fear at all [inaudible 01:06:42], we just did one. Seven is we have existential fear of doing deployments. And the way I've kind of always thought of it is that it just shows how good the human brain is at sort of associating fear with problematic activities. And it seems like that question you're asking, up or down, right? That you're essentially signaling kind of this feeling, based on all your experience, expertise: go, no-go. Does that resonate with you?

Ron (01:07:06):
One of the things that... Like I said, it matters who you're asking the question of, okay? Because if the people at the top are experts, this is very different from a situation where they're not, where the expertise is somewhere lower in the organization. I will give an example from the Sidewinder book, which really shows the difference between knowing something and not knowing something. So one day the technical director came down to the spaces of the Sidewinder team, and somebody described them as the sweaty spaces that we were using to develop this missile.

Gene (01:07:37):
The technical director, was this McLean?

Ron (01:07:39):
It wasn't McLean. It was an admiral, actually, but the admiral had been around; he knew something. And so he says, "How's this Sidewinder project coming along?" And the guy says, "Well..." He said, "We're going to be done within a year." "Aha," says the admiral. He said, "Well, have you solved this gyro problem?" Because they had a problem with their gyroscope. And the guy said, "Well, no, we haven't solved the gyro problem yet." So the admiral says, "Well, here's what you're telling me." He said, "You haven't solved your problems yet, but you're going to be done within a year. Is that what you're trying to tell me?" And the guy who told me the anecdote said, "At that point, I realized this guy really knew something. He had put this together." Yeah.
And so that's the question: a gut reaction from somebody who's really knowledgeable is very different from one from somebody who's not.

Gene (01:08:23):
Yeah. I mean, this is so interesting. That law of, who's at the top matters a lot. And so now I'm wondering, I think in healthcare, and I don't claim any knowledge of the field, but I know that there's always been this question of, can the head administrator really do their job without coming from a clinical background, without being a doctor? Is there an obvious answer to that in your... Do you have an opinion on that?

Ron (01:08:50):
I have an opinion, but this is not my expertise. But my guess would be, basically, if you've got somebody at the top who doesn't know what they're doing, then Jack Rabinow's law about the guy at the top is applicable: probably the guy below him is not going to know what he's doing either. You don't want a situation where the people at the top are going to settle for answers that are half-assed, sorry, incomplete, from the people at the next level down. So I mean, my inclination would be yes, find somebody who is distinguished enough so they can ask the kind of questions that are the searching questions. In a generative organization, you have good people at the top because they know what they know, and if the people at the top don't know what they don't know, that's a very dangerous situation to be in.

Gene (01:09:39):
Can you describe what that looks like when there's a leader who knows? Can you sort of paint a picture?

Ron (01:09:44):
Absolutely. So let's look at the characteristics of a technological maestro. Okay, this is a term that is used in the history of technology a lot, technological maestro. So first of all, you're dealing with somebody who has very high energy; that's critical. The second thing is they know what questions to ask; that's critical. The third thing is they're good on the details too, and that's unexpected. You don't usually expect people who are good on the big things to be good on small things. But they also have to have a high standard, and they have to be willing to get immersed in the activity itself. So you would find, for instance, that the Chief of Naval Operations, if he or she is good, is going to be the kind of person who can ask the questions that could embarrass people at a lower level if they don't know what they're talking about.

Ron (01:10:30):
So this is what they told me at China Lake when I was doing the Sidewinder book. They said, we had directors who really knew what the story was, and the example about the admiral asking questions, that's what they expect. They expect people to know what they're doing. So with the technological maestro, you have a very different situation, because here's the guy on top, it could be a gal, who not only knows what the key questions are, but knows in detail what's probably going on, okay? And so a lot of times you'll find chiefs of Naval Operations will go walk around at night and ask people questions at times when they're low on blood sugar and so forth, and they're going to tell you the truth. You need people at the top who are highly knowledgeable, and that also will attract people below them.

Ron (01:11:20):
At China Lake, for instance, one of the biggest mistakes they ever made: they had some guy they didn't know what to do with, and so they kept promoting him upward.
But per Jack Rabinow's law about the boss being a dope, what happened is the whole organization eventually became infected by the sin of ignorance and so forth, because the guy at the top didn't know what he was doing. So when people below him were hired, it was this person who was making the decision.

Gene (01:11:45):
Yeah. I got goosebumps twice. One, the specific attributes of the technical maestro just deeply resonate with me, and I hadn't been introduced to those characteristics. I'm going to look that up, incredible. I think in popular culture, Steve Jobs was one of those.

Ron (01:12:00):
Absolutely, absolutely.

Gene (01:12:03):
Gene here. Holy cow, if you didn't notice, the notion of a technical maestro struck me as an utter revelation: you need a person with high energy, high standards, great in the large, great in the small, and someone who loves walking the floor, which I'll interpret as understanding the daily work of others.

Gene (01:12:24):
When I say revelation, I don't say that lightly. It seems something that is so very important. Once you see it, it's difficult or maybe even impossible to unsee it. After all, how can you have greatness if the person at the top doesn't know what they're doing? This is made even more profound by something else that Dr. Westrum mentioned, which is Rabinow's Rule Number 23 of Leadership, which is: if you have a dope at the top, you have, or soon will have, dopes all the way down.

Gene (01:12:57):
I mean, holy cow, right? Does that not summarize so well the conditions when things are going so incredibly well because of who's at the top, or when things are so frustratingly and maddeningly not going well, again because of who's at the top? So of course we have great examples of technological maestros that we can cite from popular culture, like, say, Steve Jobs, but this obviously goes much further than someone with a high popular profile or charismatic appeal. What this concept reminds me of is a Harvard Business Review article that Dr. Steven Spear and I have discussed at length, from Dr. Joel Podolny and Dr. Morten Hansen, called How Apple Is Organized for Innovation. It's a paper that, among many things, describes the functional nature of their organization and the need for the leaders to be very technical.

Gene (01:13:49):
There are lots of caveats, but one paragraph that stuck out to me was this. It reads, "One principle that permeates Apple is that leaders should know the details of their organization, at least three levels down, because that is essential for speedy and effective cross-functional decision-making at the highest levels. If a manager attends a decision-making meeting without the details at their disposal, the decision must either be made without the details or postponed. Managers tell war stories about making presentations to senior leaders who drill down into cells on a spreadsheet, lines of code, or test results on a product." And these results seem very consistent with the concept of the technological maestro. I'll have more to say on this topic later, but for now I wanted to give some concrete citations on this topic. It was surprisingly difficult to find references, but I did find some, thanks to Dr. Westrum's Sidewinder book, which is so splendidly cited. The term technological maestro comes from Dr. Arthur Squires. According to his Wikipedia entry, he was born in 1916 and died in 2012.
He received his PhD in chemical engineering from Cornell University, was a member of the Manhattan Project, and was a professor at Virginia Tech. He wrote several books, including The Tender Ship: Governmental Management of Technological Change, which "defends his thesis that governments are usually incompetent managers of technology projects." I'll put a link to his book in the show notes.

Gene (01:15:21):
And that gets to someone else who was mentioned, Jacob Rabinow, as in Rabinow's Laws. According to Wikipedia, Mr. Rabinow was a mechanical engineer who patented a number of revolutionary devices, including the first disc-shaped magnetic storage media, the magnetic particle clutch, the first self-regulating clock, and his famous reading machine, which was the first to use the best match principle, which was the basis for the reading, sorting, and processing machines used by banks and post offices. He held 209 patents. He was a VP at Control Data and won numerous awards. The best citation for Rabinow's Law about the implications of having a dope at the top is actually in the Congressional Committee on Science and Technology, where he was testifying on domestic and international scientific planning, it looks like in 1976; it's difficult to tell from the scanned copy that's in Google. I will put a link to that in the show notes.

Gene (01:16:22):
Okay, back to the interview, where I try to understand how the technological maestro concept would apply to Captain David Marquet, author of the book Turn the Ship Around, where arguably he didn't have the expertise, because, as the story goes, he was trained on an entirely different class of submarine. What then?

Gene (01:16:43):
I guess the counterexample I would bring up is there's a famous book that's up there with the Abrashoff book, Turn the Ship Around by Captain David Marquet. And so the story goes, he was trained on a certain class or subclass of an attack submarine, and at the last minute got reassigned to a new one, which he was not prepared for, and then essentially had to radically delegate, because the expertise that he had was not particularly relevant for the submarine that he was assigned to. So is there anything to reconcile there, that he inherited the captaincy without the expertise that he needed, and so therefore he had to rely on his staff out of necessity? How do you reconcile or relate?

Ron (01:17:21):
So what you do is you look for the subordinates who really know something.

Gene (01:17:24):
Yeah, right. So it's a very similar book to Abrashoff's book in the fact that... Very, very similar, in fact, it kind of was one of those worst-to-first stories. But here you take all of Abrashoff's skills and take that away, because "he wasn't trained for that class of subs."

Ron (01:17:40):
Ah, I didn't tell the entire Abrashoff story. So when I read Abrashoff's book, I thought this is a one-off. But then I looked at a book called Excellence in the Surface Fleet, which has never been published as far as I know; you can find it on the web. And what I found, basically, is that they did a study where admirals were asked to identify the best captains. The best captains basically had all the kinds of characteristics that Abrashoff had signaled in his book. The study that I'm referring to was done before Abrashoff did his mission. In fact, if he had read it first, he wouldn't have had to invent all the things that he invented, okay?

Ron (01:18:15):
But basically the key principles were already there.
All the good captains had similar characteristics. They didn't all manage the same way, but they all had similar characteristics. They were very supportive of their men, they allowed people to be creative, they listened carefully to what people told them, and so forth. And I haven't read this other book that you're referring to, but I would expect that you would find that he was a good example of the same kind of characteristics you'd find in the surface fleet. That he was very supportive of his men, and that he was able to listen successfully to what people told him.

Ron (01:18:50):
The ability to figure out which subordinates know what they're doing, though, is a characteristic that's not technical, it's personal, okay? So at China Lake, for instance, one of the guys who was an early leader of the organization had this ability to talk to engineers in great detail to find out what they knew, and why they knew it, and what they thought was going to happen, and so on and so forth. So you need to have this ability to be good with people, as well as being good with the technical stuff. Otherwise, your hiring is not going to be at the level you need.

Gene (01:19:23):
It's interesting, as a person who comes from a technical background, I got my graduate degree in computer science, this will sound naive to the point of absurdity, but wouldn't it be great if people were more predictable? Everything's easy, except for the people part; people are so highly idiosyncratic. I mean, to what degree do you need someone at the top that is not just good with people, but phenomenal?

Ron (01:19:45):
A people person, quite literally. Well, sometimes if the guy on top is good technically, but not with the people, he needs another person working with him. Bill McLean, who was head of China Lake for a while, would always get somebody else who was good at the administrative stuff, because he was not. He might be great with the [inaudible 01:20:05], but he was not good at the other stuff. They had the intelligence to put him with somebody who had those people skills, and that saved them many times. Because Bill McLean didn't care; he was interested in the technical details. He didn't care about the people.

Gene (01:20:19):
So let's go to the scenario, I'm going to put you in charge of NASA, all right? You did such a great job in your investigation that they said, "All right, you're the person in charge." That senior person's job, like that of most senior people, is to ensure the reliable and smooth operations of the organization, whether it's on time, on cost, et cetera. How do you achieve those goals and maintain reliable and smooth operations without devolving into a culture of compliance? What concrete examples have you seen where leaders pull that off?

Ron (01:20:53):
One answer to your question is looking at organizations that were bad that became good. So Abrashoff's book is a good example of how you turn around an organization that's pathological... It was so bad, actually, that when Abrashoff came on board originally to take command and the other captain left, all the sailors cheered when his departure was announced, okay? So that's one of the things [inaudible 01:21:17].

Ron (01:21:17):
First of all, you don't hire dummies. You hire people who are really good, who are going to challenge you. Understanding the technical part is very important, so you can understand what people are telling you, instead of just sort of going along with, "Well, this guy seems smart."
Well, that's not good enough; you need to know the details.

Ron (01:21:33):
But the other thing is, see, having those people skills is a critical thing for commanding officers, and an ability to build a workforce that's cooperative, that's mission-focused and so forth, that's something that's very difficult to acquire if you don't have it naturally. And some people just don't; there are a lot of smart people who are not necessarily smart with people, and cannot build a team, and do not have a good workforce.

Ron (01:21:56):
There's some very interesting stuff at NASA in terms of how they build teams. There's a book called How NASA Builds Teams, and I can't remember the guy who wrote it, but he's got the best thing I've seen since sliced bread in terms of using essentially Myers-Briggs kinds of qualities to manage the workforce. So this guy's an astrophysicist, they don't teach people skills in astrophysics, but he was smart enough to pull all this stuff together to help him design the teams and so forth. And the ability to be humble enough to realize there's something you don't know very well, and there's something you need to learn when you have a new job, that's a very important skill: just being humble, being able to build up from scratch. So this book, How NASA Builds Teams, is really almost required reading if you're in a technical area.

Gene (01:22:45):
Let's go from that leader who has all these admirable qualities. But there are certain endeavors where the problem to be solved is so vast that you're dealing with tens of thousands, maybe hundreds of thousands of people. And so, even as earnest as that person might be at the top, now you have to split the work up into teams. You have to have multiple levels of, presumably, some hierarchy.

Ron (01:23:06):
You have to get capable subordinates, okay? If you want to find a person who was really good at this, it was General Marshall, George C. Marshall. He was the person who could do all these things. So at the beginning of World War II, he literally was given lists of generals, I mean, hundreds of people. And he would go through the list and he would cross out the people who he knew were not any good. You could say, "Well, how could anybody know that much?" But that's why they chose George Marshall to be the Chief of the Army, because he had that ability, and he was a perfect example of a technological maestro.

Gene (01:23:41):
Interesting, my head would not have gotten there. By the way, I read the book The Generals, and that was such a wonderful description.

Ron (01:23:47):
I know this is an impossible assignment. I read the four-volume General Marshall biography. And you would say, "Well, gee, four volumes, that's really too much, you're just going to get totally bored." Every single volume had insights in it, I'll just give you one.

Gene (01:24:03):
Yeah...

PART 3 OF 4 ENDS [01:24:04]

Ron (01:24:03):
Every single volume had insights in it. I'll just give you one that blew me away, absolutely blew me away. When Queen Elizabeth was crowned in Westminster Abbey, you've got the coronation ceremony. So George Marshall comes in, he walks in the room. Everybody stands up. So Marshall looks around to see who has come in, okay? But you have to remember, this was the General Marshall who not only had basically run World War II for the US, but also created the Marshall Plan, which saved England and many, many, many other countries. So they knew; it was obvious that Marshall was that highly respected.
So he had those abilities. He was a maestro. And I just thought, "My god. I'm glad I got to volume four, because I found this particular thing," but there were so many other things.

Gene (01:24:51):
And maestro of what? Clearly he was great at personal relationships, at strategy. I mean, what would you say he was a maestro of?

Ron (01:24:58):
Being a general. He knew how to train people. He knew how to motivate people. He could tell how well people were doing. He always told the truth, and that's a very difficult thing to do. He always told the truth, and he always was honest, scrupulously honest, to the point where it's almost boring. And the thing is, it wasn't that other generals were incapable. I mean, obviously Eisenhower and Patton and so forth were also very highly gifted, but Marshall had a unique skill, the same kind of skill that's necessary to be a CNO. I've only talked to one CNO, and that was Admiral Moore, but it takes a special person. And what's interesting to see in what higher officers do, and higher corporate officials do, is the ability to understand where the organization has to go next. So you can be a terrific person by the assessment of your workmates and so forth, but if you can't figure out where the organization needs to go next, you're not going to be able to cut it. And this is the difference between being a Vice President and being a good department head.

Gene (01:26:04):
And these characteristics, to what extent can they be learned? On a scale of one to 10: one is, "No, you're born with it, or not." 10 is, "No, these are all, not attributes, but skills that are learnable. Teachable and learnable."

Ron (01:26:19):
I would plump down on the, "You've got to have it from the beginning." Maestros learn from other maestros, no doubt about it. They recognize each other. But of course, even maestros can make mistakes. I mean, I'll give you an example. Vannevar Bush, who was probably the smartest technologist in the United States in World War II, a really brilliant man, thought that you couldn't put an atomic bomb in a missile. And he was famous for having said, basically, "Intercontinental ballistic missiles are just not going to work, because you can't put an atomic bomb in them." He was wrong. Okay. You can't be perfect; perfect is too high a standard. But talented, yes, absolutely. And it's interesting, the number of things that we owe to Vannevar Bush. He did the first pre-electronic computer that was effective, a differential analyzer. He was really brilliant in terms of creating an organization in World War II, a national brain trust. He also had an essay called As We May Think, which is the basis, essentially, of a lot of the internet.

Gene (01:27:27):
Do you find it at all disheartening or dissatisfying that, if creating these generative cultures is what's required to get amazing outcomes for every industry, for our most important societal goals, it's dependent on that leader that we put on top? Do you find it at all disheartening that if these skills aren't learnable, that they're primarily ones you're born with, that seems like not the most hopeful-

Ron (01:27:49):
No, no. On the contrary, you'd have to be willing to search out these people when you need them for national tasks. I mean, if you read books on World War II, you will find out that there were a lot of technological maestros in World War II. Vannevar Bush was one of them. There were people working for Ford who were really geniuses. There were people who built ships.
Chrysler and so forth, they built ships and did all sorts of stuff with steel production. There were a lot of maestros, but you have to feel comfortable with them.

Gene (01:28:18):
Maybe the implication is that they're not so scarce, they're out there, and we just need to find them and acknowledge these gifts that they have?

Ron (01:28:26):
Absolutely.

Gene (01:28:27):
Yeah.

Ron (01:28:28):
So the thing is, if you don't have gifts, are you going to want somebody who's really smart working for you?

Gene (01:28:35):
Here's one of the big mysteries for me. So you can find your subordinates that have all these attributes, that are trustworthy, right? And hopefully they hire their subordinates, and so forth, but then you end up with this other problem of the information flow between all those nodes in a graph. So just maybe using my computer science terminology, right? You structure these organizations that are, let's say, putatively hierarchical in nature. It seems to me, in my experience, there seem to be two extremes of how information flows, right? In one, like in the book Team of Teams, in the before picture, right, everything required vast escalations. You have to go up eight, nine levels and go down three. And the picture that kind of emerges at the end of Team of Teams is you had the same general structure, right? Navy SEALs still report to the Secretary of the Navy. Army Rangers still report to the Secretary of the Army. But you had these kind of very intense, frequent information flows across the mid-level leadership of all of the service arms.

Ron (01:29:33):
Right. Absolutely. Yes.

Gene (01:29:33):
So how do you, as a leader, take all those things you've talked about, but then it doesn't end at N minus one; you have to go all the way down and make sure that those information flows exist. So what does that look like?

Ron (01:29:46):
What you do is you build a community of good judgment. A community of good judgment is when everybody in the community recognizes who's got the good judgment, and that's the person you turn to when you make a decision. It takes cultivation. This is one of the things I mention later on in the China Lake book: building a community of good judgment is what China Lake did. Everybody knew who was expert at what, and when you needed somebody who would have this expertise, you went to him. And some of them were, frankly, offensive people. One of the guys said to me, "I didn't mind being humiliated by such and such a person, because he really knew what he was doing." Okay?

Ron (01:30:27):
If you've got a community of good judgment, you're willing to turn to whoever has the expertise. And that person may be somewhat lower in the organization than you, and maybe in a different department, but you go to that person. This is why the Mayo Clinic, and this is going to be in my book on information flow, the Mayo Clinic is such an amazing place, because people understand that they don't have all the facts to solve whatever the problem is. They know who to turn to to get those facts. And Mayo has basically institutionalized a willingness to ask other people. I read that really thick history of the Mayo Clinic, but that's the deal: if you need information, you go to the person who's got it. And that's the typical thing about generative organizations, because there's no shame in asking for the information you don't have.

Gene (01:31:18):
And what does a leader do to reinforce that, from eight levels up?

Ron (01:31:23):
It doesn't matter how many levels there are.
What happens is people watch what the leader does, and they do the things that the leader rewards. If you reward expertise, if you reward competence, people will notice that. If you reward flunkies, I mean, I don't have to go very far in recent experience to come up with an example, which I'm not going to mention, but the thing is that if you want the best people, you'll get the best people. But you have to want the best people.

Gene (01:31:50):
Can you give me some examples of behaviors and ways that leaders reward expertise and competence?

Ron (01:31:56):
Well, first of all, the good leaders are honest. So if somebody does something, they get a reward for it. In organizations where the leader isn't honest, somebody else who didn't do it will get a reward for it, all right? That destroys your sense of organizational justice, when the wrong person gets recognized for something, and it shows that the people on top don't know what's going on.

Ron (01:32:16):
I worked for General Motors briefly, and I talked to people at Lotus, which was a subdivision of General Motors. And they said, "When we got to Detroit, we were really impressed because of the presentations they did. We were a lot less impressed when we figured out what was behind those presentations." And that's the deal: when people have the expertise, it is recognized by other people. And if you have a community of good judgment, expertise will be valued. It isn't a matter of how often you sort of brown-nose it with the boss. It's a function of, do you know the stuff? Well, if you don't, then you don't belong there.

Gene (01:32:52):
Yeah.

Ron (01:32:52):
And people are quick to find out, and quick to judge you on the basis of who you pick. I remember one organization where everybody was waiting for the new CEO to be chosen, and they thought, "Once we get a new CEO, everything will be wonderful." So I remember the first day that the new CEO came on board. His COO basically was Darth Vader, okay? And I just knew that basically my time in the organization was going to be cut short, because Darth Vader was not going to allow me to do it.

Gene (01:33:22):
Just a brief aside. You've mentioned the role of CNO a couple of times, and it seems like you're attaching a special quality or a stature to that. Can you just maybe talk a little bit about what it is that you're ascribing to CNOs? It certainly sounds like respect for the people who have been in that CNO position, or something like that.

Ron (01:33:40):
Well, good CNOs are basically technological maestros. They know how to manage a big organization. And you mentioned lots and lots of people. Well, hey, the CNO has got what?

Gene (01:33:50):
300,000. Yeah.

Ron (01:33:52):
Something like that, okay? But they don't lose track of what's going on, and they're willing to work 12 or 14 hours a day, because that's what it takes. And so when I talked to Admiral Moore, I didn't ask him how he managed to do it. What I found is even making an appointment with him was tricky, because he wanted to know where I was going to be before I talked to him, and where I was going to be after I talked to him.
You're dealing with a person who tends not to let things go flapping, and who looks at all the different issues. And so it's not surprising to me that when there was a fire on the Forrestal, and there were a couple of reports that came out of that, Admiral Moore took those reports and drilled them down into the Navy organization, and he asked for periodic reports about how things were coming along, learning the lessons that had been taught by the Forrestal fire.

Gene (01:34:38):
Can I ask what the nature of the work you were doing with Admiral Moore was?

Ron (01:34:41):
He was one of the project engineers for Sidewinder.

Gene (01:34:45):
No kidding.

Ron (01:34:47):
Yes. He was the experimental officer on the base and-

Gene (01:34:53):
Oh, that's so cool.

Ron (01:34:53):
Well, yeah. Those guys were different. I mean, they understood, and they understood that Bill McLean was very smart, even though he was not very good at conversation.

Gene (01:35:02):
Yeah.

Ron (01:35:02):
And often spoke elliptically and so forth. But he had abilities that they knew how to maximize.

Gene (01:35:09):
Yeah. So Steve Spear, he said something to me a couple of weeks ago that literally made my jaw drop. And he said, "You take a look at healthcare systems, hospitals, some of them can get 100% of the vaccines available into people's arms. Some of them can do like 30%."

Ron (01:35:25):
Right.

Gene (01:35:25):
If you take the school systems, once the decision's made to return to in-person learning, some can turn on a dime and execute immediately. Classes go in-person. Some require weeks, months, or even quarters. And he said both of those measures are probably an interesting proxy for the degree to which those organizations are able to learn, adapt, have these kind of generative-

Ron (01:35:46):
Absolutely.

Gene (01:35:47):
... capabilities to achieve the mission.

Ron (01:35:49):
Yup.

Gene (01:35:49):
Can you react to that? And if you believe that, why do you believe that?

Ron (01:35:54):
Well, because that's the truth, is that if you look at the variety of organizations, you've got... I mean, look at World War II. Building Liberty ships, okay? So they got to the point where... And I don't remember the exact numbers, so don't hold me to this, but basically they got to the point where they were building a ship a week. That's incredible if you know how ships are built, okay? So what happened is it didn't start out that they were building a ship a week, but they got better at it all the time. There's a book called Forge of Democracy.

Gene (01:36:22):
Oh, such a great book. Oh, I love it. Yeah.

Ron (01:36:24):
And there are lots of technological maestros in that book. The building of the plant in Detroit that turned out the B-24s, and so forth, that was an extraordinary example. It was originally the Consolidated Liberator, basically, just the Liberator. That was absolutely unbelievable. And the engineer from Ford, I can't think of his name, who did that, he was a genius. He really knew how to do something very complicated that a lot of people would not have been able to do. I mean, that building was, what, a mile long or something like that? Just being able to think of an assembly line that's that long, being able to get it so everybody worked on it, and so forth. He wasn't a sweetheart, but he was really very, very gifted when it came to doing these things.
And there are lots of people like that, but those people don't do well when they threaten the egos of a lot of people who are basically bureaucratic leaders.

Ron (01:37:17):
So it's sort of like one of my friends who was in the advertising business, who did a wonderful campaign for this organization. And so the head of the organization comes down to the advertising agency and said, "Well, I'm really impressed with the campaign that your guy did for us. Can I talk to him?" And they said, "No, he's been let go." So why did they let him go? And the answer was because the client would soon learn who it was, and he would hire that guy to do the job.

Gene (01:37:44):
Right. Right.

Ron (01:37:45):
So if you're a mediocre manager, you don't want to hire a whizzbang to come in and make you look bad. On the other hand, if you have the humility, the necessary humility, a servant spirit, then you will hire the guy who is really good, not the guy that makes you feel comfortable. And if the guy makes you feel uncomfortable because he's better than you, great. That's what you need. But I think that's an answer to your question.

Gene (01:38:09):
Yeah, actually, so connect the dots. Why do you believe that, especially in these hospital situations, of which you've seen plenty? What do you think is happening in those organizations that are able to get these 100% vaccination rates that is not happening in the ones that are mediocre?

Ron (01:38:25):
Well, the answer is very simple. You unleash creativity. So the first thing you do is you ask anybody in the organization that's got an idea what their idea is. Then you put a committee together to consider those ideas, and you take the best ideas. Now, I know a fair amount about corporate creativity, I could give a lecture on that, but the basic point is that you look for the people who actually have the ideas, and you empower those people to put them into effect.

Ron (01:38:53):
I mean, if you want a good example of this, look at something like the development of the Post-it Note. Here's an example where basically the answer was way down in the organization, but as soon as somebody recognized that that was an answer, other people then got onto that, and then they got the Post-it Note developed. The guy who did it was Art Fry. I talked to Art Fry one day. In fact, I had him over to General Motors to give a lecture and so forth. But at 3M, they had a variety of things to surface good ideas. A lot of organizations don't want to surface good ideas, because they'll interfere with the ideas you've already got. So are you willing to be superseded? Fine. Okay. But a lot of people aren't. They're not willing to have somebody who's smarter than they are.

Gene (01:39:35):
Right. And then paint what's happening in the organizations that are getting 30% vaccination rates.

Ron (01:39:41):
Oh, well, they try to do it the normal way, and the normal way isn't going to work. I mean, let me give you an example. When the personal computer came out, IBM had a heart attack, and they knew that their processes, the ordinary processes, would not result in a personal computer in any reasonable time. So they created a skunk works, and they took, I think, 20 engineers and sent them down to Georgia or something like that.

Gene (01:40:05):
Boca Raton, Florida, I think is famously what it was.

Ron (01:40:08):
That's right. So the thing is, there are times when you know you're not going to be able to do it the ordinary way, so you do it an extraordinary way.
The same is true of the invention of the P-80 Shooting Star at Lockheed. Basically, Kelly Johnson, who was a brilliant engineer, the military was... I can't get the numbers right and so forth, but they said, basically, "If you can do this in 180 days, we'll give you some additional benefit." He did it in 141 days. And what he did is, instead of using the factory, which had three shifts going, this was the middle of World War II, he built a tent. He got a circus tent and put it next to the Burbank branch of Lockheed. And with the circus tent, and it was in a very smelly place, that's why they called it the Skunk Works, in 141 days they got the job done. The Skunk Works then did a whole series of airplanes, like the U-2 and so forth, the stealth jet, and so forth.

Gene (01:41:03):
Right.

Ron (01:41:03):
All of which basically were done in the way that the Skunk Works had learned to do it. So the thing is, not everybody can work in an environment like that. It's too scary, but the people who can really love it. So Kelly Johnson at the beginning said, "We're going to have something called the Dinosaur Committee. And the Dinosaur Committee is going to get rid of all the red tape that's associated with the development process." And so that's what they did. So they shrunk the development process down to something where they could do it very fast, but you had to have something like a Dinosaur Committee to get rid of all these stupid rules.

Gene (01:41:38):
Yeah. I was so delighted that you cited this Concourse book in the Sidewinder book. That was so fun to see. So when something goes wrong in complex systems, what should ideally happen? In the VW emissions scandal, where there was large-scale fraud happening, an engineer was held responsible for the firmware changes and was imprisoned. In the Equifax breach, the former CEO of Equifax, Richard Smith, said it was a result of human error and a technical failure, and said it was a security engineer's fault.

Ron (01:42:10):
Baloney.

Gene (01:42:11):
Right. So there's something very dissatisfying about that.

Ron (01:42:14):
Look at Wells Fargo, okay? And so you've got a system where basically the people on top decided to put pressure on the people below. The people below were forced to do things that they didn't believe in, or they were fired. If you create a pathological environment like that, that's what's going to happen. You have to be willing to stand up for standards.

Gene (01:42:34):
When you look at who is being identified as the problem, right, it's a vast distance away from the leader of the organization. As an academic, if you were brought in as a consultant, what would be your advice to the people who mattered? In a just world, how ideally would you want those failures to be handled?

Ron (01:42:53):
Oh, I can tell you the answer to that, okay?

Gene (01:42:55):
Oh, great.

Ron (01:42:57):
It's not pretty. So you get hired to go do something for organization X. About 10 days after you get there, you discover it's the boss. The boss is the problem, but the boss is the one that signs your paycheck. So what do you do? Do you tell them the truth? I'll tell you an interesting story. I was approached by somebody from the Energy Department, and he said, "We would like you to come down to the Energy Department to give a lecture on your schema." I said, "Fine, I'd be delighted to do that." A week later, he calls back and he says, "We can't do that." And I said, "Why not?" And he said, "Well, because the people down here, once they look at your chart, they'll realize they're all pathological."
So that never happened.

Gene (01:43:36): And how would you directly answer the question? When these high-profile incidents occur and blame is assigned to an engineer, if you were on the board of that organization, or you're a consultant brought in to figure out how to fix the culture, in a just world, who is held accountable, and how is that done?

Ron (01:43:56): If it were a Japanese organization, it would be the head who would be held accountable. And I say this remembering that I got called up one day by the head of a Japanese nuclear reactor, and he said, "We obviously can't continue with the kind of culture we've got, because after Fukushima, we need to do things in a different way." I said, "You're right." So I sent him a paper, but I never heard from him again. But the interesting thing is that in Japan, it's the people at the high level who would think about [inaudible 01:44:26] as a result, to deal with the shame of having caused the problem.

Gene (01:44:30): Just to connect the dots, you're saying that in a more just world, the people who need to be held accountable are the people at the top, and not the people at the bottom?

Ron (01:44:37): Yes, absolutely. Yes. I mean, there is a question of how high you want to go, but believe me, these things don't happen in a vacuum. The whole idea of having a corporate culture is that you have a sense of responsibility. Eugene Kranz, the reason he took the Apollo 1 incident so seriously is that he personally felt responsible for having screwed up, and he was not the nearest person to the action. But he writes in his book that he wrote on the blackboard, "We will be accountable for everything from now on, without fail. We will not have this happen again."

Gene (01:45:14): Right.

Ron (01:45:14): And so when Apollo 13 came along, he took personal responsibility for seeing that those astronauts got back safely.

Gene (01:45:22): So good. I was rereading your paper, The Study of Information Flow: A Personal Journey. You ended with, I thought, a very remarkable statement. You said, "In summing up, culture is no longer neglected. Information flow is, of course, only one issue among many in safety culture, but I feel it is a royal road to understanding much else." And I just want to make explicit something that I believe with conviction, at moral-certainty levels: there's so much that's been talked about in the last 50 years, whether it's innovation cultures or safety culture, Dan Pink's work around autonomy and mastery, learning organizations, the MIT beer game, psychological safety, right?

Gene (01:46:04): All these things, I feel like they and your work are all pieces of a whole, and that is something very important as we look at the close of 100 years of management methods based on things like Taylorism, that this is really what's needed to succeed and win in a marketplace that increasingly needs innovation, adaptability, and so forth. To what extent does that resonate with you? One is, "Not at all. Have we even been having the same conversation?" Ten is, "No, no. That's kind of what you meant by 'a royal road to understanding much else.'"

Ron (01:46:38): Yeah. I think the truth is that, as I say in the paper, information flow is, first of all, part of the organization's nervous system. If it's not working, the organization's not going to work very well. But second of all, it's also an indicator of the kind of management that you've got.
You have trust and confidence in organizations where the manager is competent, cares about his or her people, and is doing a good job. And if you have a culture that is very sound, it doesn't guarantee success, but it is probably one of the things you're not going to have success without. So you need to have a culture that is supportive and is going to do the job. And I think we need to realize that culture may seem a sort of flimsy concept in terms of how we think about things, but that's because we don't understand how important it is. And as you say, if you look at hospitals, there are the ones that can turn on a dime, and the others that are lucky if they get half of their vaccines used.

Gene (01:47:45): This is so great. So I always end with: how do people reach you? And is there something that you're interested in having people reach you about?

Ron (01:47:54): I'm always interested in hearing from people who are trying to solve real problems in real organizations. You can contact me at AOL. It's ronwestrum@aol.com, and I would be delighted to respond to people to the extent that I can about this stuff. And obviously I've written a lot of books and things like that, and those are out there, but yeah, I'm very interested in hearing about this stuff. If you send it to Eastern Michigan University, at the moment it's shut down, so I won't get it. But if you send it to ronwestrum@aol.com, it'll get to me.

Gene (01:48:27): I want to close by saying just how elated I am that you were even willing to talk and that you were able to spend nearly two hours with me. I cannot tell you how much I've learned from this. I just want to say how grateful I am for everything that you've done in the past and for being able to spend this time with me. So thank you so much.

Ron (01:48:45): It is a great honor, Gene. Thank you.

Gene (01:48:48): Holy cow. I hope you learned as much as I did in this interview. In fact, I found myself with so many questions that I asked Dr. Ron Westrum if he'd be willing to do a second interview, and that is coming up next. I ask him why he thinks generative cultures are more important now than they were, say, 100 years ago; his incredible observations on the ever-increasing number of functional specialties, and just how long that's been going on; the challenges that arise from having matrix organizations and tools to overcome them; and more about the book he's currently working on, on information flow. See you then.

PART 4 OF 4 ENDS [01:49:33]



Gene Kim

Gene Kim is a best-selling author whose books have sold over 1 million copies. He authored the widely acclaimed book “The Unicorn Project,” which became a Wall Street Journal bestseller. Additionally, he co-authored several other influential works, including “The Phoenix Project,” “The DevOps Handbook,” and the award-winning “Accelerate,” which received the prestigious Shingo Publication Award. His latest book, “Wiring the Winning Organization,” co-authored with Dr. Steven Spear, was released in November 2023.
