The culture of an organization is important. It makes a difference in how everything comes out. In this excerpt from Episode 17 of The Idealcast, Gene Kim expounds on Ron Westrum’s question: How do you measure culture?
I just wanted to break in quickly and give some more information on what Dr. Westrum was just referring to, because believe it or not, this also came up in the workshop that I took at MIT with Dr. Steven Spear in 2014.
NASA: Experimental versus Compliance-Driven Culture
According to CBS News, two months after the Columbia disaster, “NASA official William Readdy, Associate Administrator for Space Flight and a former shuttle commander himself, told the Columbia Accident Investigation Board he did not consider asking for a spy satellite inspection of Columbia’s left wing during the doomed ship’s mission because the agency had already concluded the shuttle could land safely.”
The article goes on to say that the decision had been the subject of second-guessing and criticism in the wake of the disaster. In the days after the mishap, shuttle program manager Ron Dittemore said no request for satellite imagery was made because the resolution of the imagery, based on past NASA experience, would not be high enough to reveal damage to individual tiles, and because Boeing analysis had concluded Columbia could land safely.
During Columbia’s mission, however, Wayne Hale, a senior flight director then serving as launch integration manager at the Kennedy Space Center, made inquiries about the possibility of Air Force help inspecting Columbia. But those initial efforts were terminated by senior management.
“The space shuttle program did not want any data. And in fact, there was never a formal mission operations directorate request made from the flight dynamics officers or the flight director,” Steve Stich, a flight director himself, wrote in an email to a colleague.
So of course, just like in any post-mortem, we know the dangers of playing would have, could have, should have. With perfect knowledge of the past, we can always come up with better decisions, but that doesn’t usefully inform whether we’d make better decisions in the future. But the story does seem to affirm Dr. Westrum’s notion that there was a real cultural difference in NASA’s appetite for information between the Apollo era and the Space Shuttle era.
Westrum Organizational Typology Model
In the introduction to this episode, I gave an abbreviated version of the Westrum organizational typology model. There are three columns in this table: pathological, bureaucratic, and generative. In pathological organizations, information is hidden, messengers of bad news are shot, responsibilities are shirked, bridging between teams and functional groups is discouraged, failure is covered up, and new ideas are crushed.
Next is bureaucratic organizations, where information may be ignored, messengers of bad news are tolerated, responsibility is compartmentalized, bridging between teams and functional groups is allowed but discouraged, the organization is viewed as just and merciful, and new ideas create problems.
Whereas in generative organizations, which are mostly the high performers, information is actively sought, messengers are trained to tell bad news, responsibilities are shared, bridging between teams and functional groups is rewarded, failure prompts genuine inquiry, and new ideas are welcomed.
Clearly, I think we are hearing the norms of the bureaucratic culture of the Space Shuttle program versus the generative culture of the Apollo era.
United Airlines Flight 232
Earlier in the episode, I mentioned United Airlines Flight 232 in the context of crew resource management, which through training had transformed the dynamics and information flows within the airline cockpit. Flight 232 is interesting and memorable for so many reasons, but crew resource management is also credited with saving many lives in this particular disaster.
According to Wikipedia, “This was a regularly scheduled flight from Denver to Chicago on July 19th, 1989. The DC-10 serving the flight crash landed at Sioux City, Iowa, after suffering a catastrophic failure of its tail mounted engine, which led to the loss of many flight controls. Of the 296 passengers and crew on board, 112 died during the accident. Despite the deaths, the accident is considered a prime example of successful crew resource management because of the large number of survivors and the manner in which the flight crew handled the emergency and landed the plane without conventional control.”
I remember reading the cockpit transcript of this flight back in 1995, when I was a graduate student at the University of Arizona. And then I remember reading a lecture that the captain of this flight, Captain Al Haynes, gave at the NASA Ames Research Center on May 24th, 1991. Last night, I reread the transcript of that lecture, and it is as riveting as I remember it being when I first read it back in 1995, twenty-five years ago.
I want to look at two portions of his talk.
The first is the nature of the failure. He said, “On July 19th, Murphy’s Law caught up with us, and we did lose all three hydraulic systems. And as a result, we had no ailerons to bank the airplane, we had no rudder to turn it, no elevators to control the pitch, we had no leading edge flaps or slats to slow the plane down, no trailing edge flaps for landing, we had no spoilers on the wing to help get us down or help us slow down once we were on the ground. And on the ground, we had no steering nose wheel or tail and no brakes. So what we had…was the throttles on the number one and number three engine to control us.”
Later he talks about CRM (cockpit resource management). He says, “As for the crew, there was no training procedure for a complete hydraulic failure. We’ve all been through one failure or double failures, but never a complete hydraulic failure. But the preparation that paid off for the crew was something that United started in 1980 called cockpit resource management. All the other airlines are now using it. Up until 1980, we worked on the concept that the captain was the authority on the aircraft and what he or she said goes. And we lost airplanes because of that. Sometimes the captain is not as smart as we thought they were. And we would listen to him or her and do what they said, and maybe they didn’t know what they were talking about. On that day we had 103 years of flying experience there in the cockpit, trying to get that airplane on the ground. Not one minute of which we had actually practiced a complete hydraulic failure, not one of us. So why would I know more about getting that airplane on the ground under those conditions than any of the other three? So if I hadn’t used CRM, if we had not let everybody put their input in, it’s a cinch we wouldn’t have made it.”
He continues, “If you read the cockpit voice recorder transcript, there’s a lot of that going on. When are we going to put the gear down? I don’t know. How are we going to get the landing gear? Maybe one of two ways, let’s try it. So in short,” Captain Haynes concludes, “CRM really paid off.”
So one of the most famous and memorable aspects of this accident is that there just happened to be a DC-10 instructor in the cabin of the plane, Captain Fitch. Instructors have a reputation for not only having expertise, but also having lots of experience. So Captain Fitch joins the cockpit crew and stands in between Captain Haynes and the co-pilot to control the throttles, which were the only means of controlling the aircraft.
So here’s what Captain Haynes said about that during the lecture. “We were told that there was a DC-10 captain in the back who was an instructor. And we like to think that instructors know more than we do. So I figured, maybe Denny Fitch knew something that we didn’t. So we asked him to come up. Well, he took one look at the cockpit, and that’s his knowledge. It was sort of funny listening to and reading the transcript, because he’s about fifteen minutes behind us now and he’s trying to catch up. And everything he says to do we’ve already done. And after about five minutes, that’s twenty minutes into this operation, he says, ‘We’re in trouble.’ We thought, that’s an amazing observation, Denny.”
“And we kid him about it, but he’s just trying to catch up with our thinking. We were minutes ahead of him. When he found out that he didn’t have any new knowledge for us, he says, ‘Now what can I do to help?’ I said, ‘You can take these throttles,’ so he stood between us, not kneeling on the floor as the news media said. He took one throttle in each hand, and now he could manipulate the throttles together.”
“With the number two throttle frozen, we couldn’t get ahold of the throttles together. He could. And we said, ‘Give us the right bank. Bring the wing up. That’s too much bank. Try to stop the altitude.’ He tried to respond. And after a few minutes of doing this, everything we do with the yoke he could correspond with a throttle. So it was a synchronized thing between the three of us. With Dudley still being able to do all of his communications. So that’s how we operated the airplane, and that’s how we got it on the ground.”
So it is amazing to me that in the middle of this emergency, they decided to integrate a new member onto their team. They quickly assessed the skills that he had, which were not as revelatory as they might have hoped. But they integrated him into the effort to control the airplane, and they ended up landing on the ground. So clearly, this was not an individual effort, the realm of psychology. It was a collective and very dynamic effort from a group of individuals, the realm of sociology.
One last side note: in the lecture, Captain Haynes also mentions the interaction that they had very early in the emergency with the San Francisco Area Maintenance crew. Those are the maintenance experts sitting in San Francisco for each type of equipment that United flies. They have all the computers, all the logbook histories, the history of the aircraft, all the information that they can draw upon to help a crew that has a problem.
And as he says, “Unfortunately, in our case, there was nothing they could help us with. Every time they tried to find something that we could do, we had already done it or couldn’t do it because we had no hydraulics.”
So take that phenomenon of Captain Fitch being fifteen to twenty minutes behind the cockpit crew. For the maintenance crew, the lag was even worse.
In the last episode of The Idealcast, I talked about the slower, integrated problem-solving paths that occur when you go up and down the organizational hierarchy, and the much faster integrated problem-solving paths when it’s within a team or across teams along sanctioned interfaces.
Fast and slow here are a proxy for four characteristics: frequency, speed (or latency), granularity, and accuracy. Operations favors the first two, frequency and speed, whereas planning, preparation, and assessment favor granularity and accuracy, because we don’t want to make plans based on categorically false information.
I think the dynamics of the cockpit inside United Flight 232 present such great examples of these. Examples of the slower cognitive activities include establishing the CRM protocols and training and rolling them out across all of United Airlines, as well as setting up those maintenance crews at, say, the San Francisco airport.
Whereas the faster integrated problem-solving included figuring out how to keep the plane flying, assessing whether the maintenance team could actually help (in their case, it couldn’t), assessing whether and how Fitch could help, integrating him into the flight operations, and establishing the new roles and responsibilities.
And holy cow, reading the actual cockpit transcripts, it is amazing to see how lax the communications between the cockpit crew and air traffic control were. Often there were significant misunderstandings and lots of need for repetition. In the lecture, Captain Haynes described his frustrations at the time with communications. And he had delegated all external communications to his flight engineer, because his focus was very much on keeping that plane flying and figuring out how to get the plane landed on the ground.
Going back to the comparison of NASA during the Space Shuttle era versus the Apollo era. Before we go back to that interview, I just want to mention that one of my favorite scenes in Apollo 13 is the one where they have to figure out how to adapt the CO2 filter in order to save the astronauts. Gene Kranz, as played by Ed Harris, said to all the engineers, “Go figure out how to fit a square peg into a round hole.” In the next scene, the engineers are assembled with the mission of how to fit this, a big square filter, into that, a small round hole, using only that, a bunch of parts, including what look like space suits, tubing, and duct tape.
In this case, the Apollo 13 astronauts were able to leverage the slower, cognitive problem-solving capabilities of an enormous group of talented engineers to help them get safely back home. In contrast, on United Airlines Flight 232, because of the nature of the emergency, the only experts capable of diagnosing and flying a plane that had lost all three hydraulic systems were in that cockpit.
Okay. Back to the interview where Dr. Westrum talks about the continuation of his quest to understand the dynamics of generative organizations.