Organizational Silence Panel Discussion

Uploaded by NASAappel on 06.12.2012

>>MR. ROGERS: So our afternoon panel will continue discussing the topic
of the day and how we'll work towards getting this balance right in a complex organization.
I think sort of a side story there from Apollo
addresses some of this. We have this idea that everything was perfect in Apollo,
they got it all right, it was a miracle, this myth,
and it's kind of interesting what Andy said in the beginning
we have this idea and he closed with this quote
too, people are going to ask how did they do that years from now
kind of like we look at the pyramids how do they do that
we don't see anything lying around that shows how they did it, did aliens get involved, something like that,
so it's interesting to take time to look at it and how much it had to do with
how well the organization worked
particularly the first quote in Fortune that the whole
Apollo program was as much an organizational success
and miracle as it was a technological miracle and success,
to get that many people focused effectively, working together imperfectly,
it wasn't perfect but they got the job done and they had a lot of interesting meetings as I'm told.
So I'm going to introduce the panel, we're going to get started, those waiting in line, I
understand that you'll still be listening in while you're getting your book signed by Andy
that's perfectly fine
So our afternoon panel has four members, there's a good mix here as you'll see,
let me introduce them and I'm going to read them in the order I have them in my notes here and
so Michael Ryschkewitsch is here, I think most of you know him as chief engineer.
Dr. Ryschkewitsch is responsible for the overall review and technical readiness
of all NASA programs.
He is also responsible for supporting the program, project, and systems engineering capability
at NASA with policy and best practice guidance, knowledge management coordination
and training and certification.
Previously Dr. Ryschkewitsch served as the deputy director
of the NASA Goddard Space Flight Center and as director of our engineering directorate.
He joined the center in 1982 as a cryogenics engineer
to work on the cosmic background explorer mission
since then he has supported a wide range of missions in various roles
ranging from the first servicing mission of the Hubble telescope in '93
to the Mars science laboratory Curiosity,
many science missions and all shuttle launches from 2007 through the final mission
Dr. Ryschkewitsch earned his bachelor's degree in physics from the University of Florida,
his doctorate from Duke University
he has received numerous group achievement awards throughout his career
and was also awarded the NASA exceptional service medal,
NASA medal for outstanding leadership,
and the Robert Bowman award for contributions to mission success.
Bryan O'Connor who is sitting at the other end
began active duty with the United States Marine Corps in June 1968
he was selected for the astronaut program in May of 1980
when the Challenger and its crew were lost in January 1986,
O'Connor was given a number of safety and management assignments
over the next three years as the space agency recovered from the disaster
in the first days after the accident he organized the initial wreckage reassembly activities
at Cape Canaveral,
then he established and managed the operation of the NASA headquarters action team
the link between NASA and the presidential blue ribbon investigation panel,
more commonly known as the Rogers commission, no relation,
O'Connor left NASA in 1991 to become the commanding officer
of the marine aviation detachment at the Naval Air Test Center in Patuxent River,
but within a year he had returned to NASA headquarters in Washington
retiring from the Marine Corps
to become the deputy associate administrator for space flight.
He was immediately assigned the task of developing a comprehensive flight safety
improvement plan for the space shuttle,
working closely with Congress and the administration on funding of the major upgrade program.
O'Connor has flown over 5,000 hours in over 40 types of aircraft, that's interesting,
I counted one day I had visited 40 countries in the world
and you've flown in 40 types of aircraft,
interesting statistic.
O'Connor was a pilot on STS-61B in 1985;
at the time this flight carried the heaviest payload weight to orbit by the space shuttle
and was the first to deploy four satellites.
In 1991, O'Connor commanded STS-40,
the first space shuttle mission dedicated totally to life science studies
with these two missions he has 386 hours in space
covering nearly six million miles and 253 orbits of the Earth.
In June 2002 Mr. O'Connor rejoined NASA
as the associate administrator for the office of safety
and mission assurance;
in August of 2004 that role was changed to chief, safety and mission assurance,
and he had functional responsibility for the safety, reliability, maintainability,
and quality assurance of all NASA programs
he retired from NASA August 31st, 2011
with barely a year in retirement
as an independent aerospace consultant,
he was asked to rejoin NASA's aerospace safety advisory panel,
which he is set to do, I believe, in the next month.
Like Mike Ryschkewitsch, Bryan is a repeat participant here at the organizational silence workshop.
>>MR. O'CONNOR: You did it right.
>>MR. ROGERS: Yeah I got it right. So welcome back to both of you.
These two gentlemen are joined by two academics
who we worked closely with over the years
and let me introduce them, first of all,
Dr. Amy Edmondson.
Dr. Edmondson is the Novartis Professor of Leadership and Management
at Harvard Business School,
where she has taught for 16 years
the Novartis chair was established to aid in the study of human interactions
that lead to the creation of successful enterprises
for betterment of society. Sounds like a design for helping NASA by my read.
Her research has been reported in articles in academic journals
and her recent book Teaming: How Organizations Learn,
Innovate and Compete in the Knowledge Economy
was just published in April by Jossey-Bass.
Michael Useem, director of the leadership center at Wharton said,
Teaming is the book on how to lead and learn from innovative teams and dispersed networks
I have my own copy here
we unfortunately don't have a lot of copies of this but I'm going to go get mine signed.
Okay it's a business thing, right?
So Edmondson received her PhD in organizational behavior, an AM in psychology,
and an AB in engineering and design, all from Harvard University;
she teaches courses at Harvard business school in leadership, organizational learning,
and operations management in both the MBA and executive education programs,
you may also know her as an author of the Harvard Columbia case study
which is used around the world for teaching and learning management lessons.
I'd like to also welcome doctor and professor Robin Dillon-Merrill.
She is an associate professor in the McDonough School of Business here at Georgetown.
Professor Dillon-Merrill seeks to understand and explain
how and why people make the decisions that they do under conditions of uncertainty and risk
another sort of made for working with NASA you know topic.
This research specifically examines critical decisions
that people have made following near miss events in situations with severe outcomes
like hurricane evacuations, terrorism, and things like that.
She has received research funding from the National Science Foundation, NASA,
Department of Defense and Department of Homeland Security
through USC's national center for risk and economic analysis of terrorism events.
she has served as a risk analysis and project management expert
on several national academies committees, including the review of the New Orleans
regional hurricane protection projects
and the application of risk analysis techniques to securing the department
of energy special nuclear materials,
that's very nice to know.
Dr. Dillon has been working with us here at Goddard for over nine years,
researching and helping us understand how NASA learns
and has published a number of papers on the topic of her research
with NASA. We're glad to welcome you back
you're not an unfamiliar face here either.
So welcome to all four of you, thank you for being willing to come to be on this panel
for those of us participating here in the room, as well as Wallops
and IV&V and other centers where you may be listening,
these panelists have prepared no charts, they have no message to sort of deliver
for us other than this topic is as you heard very important to all of them by their experience,
their position, their research and their close work in the field.
So what I asked them to do and I'll ask them now as starters
is to just help us understand with a few minutes
why this topic is particularly important to you personally
from your own sort of experience
and work and then we'll get into some questions that I have as well as questions
that you have to try to address this topic of the day in a large sense.
So with that, who would care to go first? Amy, may I ask you?
>>MS. EDMONDSON: Sure I'm happy to go first. I like that question
I won't dwell too much on the personal,
but I was so moved this morning by Marsha Coleman-Adebayo's talk
and her book, which I have been reading the last few days
and it's wonderful and shocking and heart wrenching
and so I'm happy to have the opportunity to talk about the other side of voice
or the other side of speaking up
because I've been doing research on this topic for about 20 years
and I tend to do essentially no research on the categories of organizational activities
that require whistle blowing: wrongdoing, malfeasance,
highly abusive supervision of the kind we heard about today.
When those things happen, and they do happen,
and we really had to talk about that this morning,
in my categorization it's already a failure, it's a --
and bordering on the tragic and so I --
that's terribly important and got lots of well-deserved attention this morning.
What I feel we often spend less time thinking about are the much more ordinary
day-to-day silencing dynamics in organization
and these are instances of people in organizational hierarchies
who have something to say,
whether it's a question to ask, a concern to raise, a mistake to admit, an idea to share,
and choose silence over voice.
More often than not, my research suggests, they choose it almost without consciously
choosing it, it just happens and it happens even in the absence of scary bosses,
so it doesn't -- you don't have to have them, though I've studied in fact
situations where scary bosses are a major, major contributing factor to people
not speaking up with the ideas they have, with --
with the important implications for safety worker safety,
patient safety a particular interest of mine
but also with important implications for organizational innovation,
the good ideas that don't get shared, that don't get raised.
So I've spent a lot of time trying to understand exactly why,
and I'll talk more about this later but exactly why and what are the dynamics
and where do they come from evolutionarily speaking and otherwise,
that lead people to withhold rather than share
and the kind of almost taken for granted programming
that's at work for humans in organizational systems
it's a topic that is particularly personally meaningful to me
because I think we can all understand the pain, the very real pain and tragedy of the --
of the kinds of things we talked about this morning,
but there is another kind of pain that I've been studying,
that is the sort of the pain that people feel when they are not able to fully be themselves at work,
not able to fully bring themselves to the mission,
the mission of the organization,
and it's a -- it's a much more low-level pain
and it even I would characterize it as a very low intensity fear,
it's not a fear that most of us are even aware of having much of the time,
and yet there is countless lost human potential and lost meaning
that people could have at work and do not have
and certainly a lot is left on the table for the organizations, as well.
So that's the kind of other side of voice that I wanted to just put on the table
and talk about why it matters to me, as well.
>>MR. ROGERS: Okay, thank you. Mike?
>>MR. RYSCHKEWITSCH: I would like to follow up on that because I'm interested for some of the
same reasons. This morning's discussion was interesting
and important because frankly if the kinds of things we talked about this morning happened
here or even if people think they might happen
that's more than enough to poison the environment
and frankly getting that fixed is merely the price of entry,
it's necessary but not sufficient to -- to ensure mission success,
and I'll you know I'll -- my comments here will be mostly about mission success.
One of the ways this got driven home to me is I chaired the Genesis mishap investigation
a few years ago and in the process of doing that I read a whole bunch of mishap reports
because I wanted to, A, figure out what the hell a mishap report looks like,
and, you know, B, see what other folks did and, you know,
try not to step in the same potholes,
and two things immediately jumped out at me when you read these mishap reports
and these were all class A mishaps things where we lost hundreds of millions of dollars
worth of hardware or we killed somebody or seriously injured
somebody, and I've read a lot of them at this point, and two things jump out,
one of them is that in almost all of them, if you read the executive summary,
you could change a paragraph describing the mission
and the executive summaries would be about the same except for a few of the details,
so we clearly don't learn as much as we should from those;
the other one is that I've only actually come across two, and we may jump later
into the two that I can point to, where in all of the cases
but those two, somebody
within the team had a critical piece of knowledge
that, had it been revealed to the proper other person on the team,
the problem could have or should have been avoided.
And when you look at that and you start looking at the circumstances in genesis
in particular one of the things that jumped out at me is --
is the kind of the bottom line is we're only human.
And so you look at things and you say,
what does it take to get people to take that piece of information to somebody?
And frankly it's not an overt, or in this case at least it wasn't an overt, decision
by anybody to --
to hide information, nor you know was it really an objective decision
that said this isn't important.
It appeared to be more now this is of course after the fact
and because you know there was a long time in between,
but it was more the just the friction of life that got in the way,
little things. When you look at, you know, the people around the decisions
that were made and you say who had an opportunity perhaps
to see it, and then you look at people that were working their first job,
their first time at that level of job, so they were probably struggling a little harder,
pulling extra shifts as a test conductor and it happened over Christmas season
and probably what happens when they go home at night is they ain't worrying about this test
that was or wasn't run or should we run another test they're probably thinking about how am I
going to get Christmas shopping done and have Christmas for my family.
In other cases it's a friction of an organization
when we asked folks, who do you think was the primary customer
for the entry phase of that mission, the person who you
know had their be-all and their end-all, you know their number one job
was to say that went well, we couldn't get a clear answer, in fact we got five answers, and
that immediately tells you that if somebody's looking around and they've got this
nagging worry in the back of their head, not this burning thing that oh, my God,
we've got a problem, and they don't know who to go to, it takes
away their safety net of being able to walk down the hall into an office and say, is this something
that we should be worried about, so again
and again and again you see that just personal friction in --
in teams and organizations that gets in the way
and frankly when I read other mishap reports
I kept seeing things that looked an awful lot like that, as well,
sometimes it's you know sometimes it's stress, sometimes it's confusion
so with that we'll --
>>MR. ROGERS: Okay Dr. Dillon.
>>MS. DILLON: I actually don't look at people not speaking out of fear,
but not speaking because they don't recognize it as a problem,
studying more of the cognitive biases. We've
done it with some material we developed for NASA, some of you may recognize me: oh,
I think I filled out a survey for her when she was with Ed at some training I did;
we've also done it for hurricane evacuation decisions
where we showed time and time again that if you take chances
and get away with it and there was no obvious sign that there was a problem,
then it's likely that you're going to conclude that the system worked.
And so how I got motivated to do any of this research was in the early '90s
I saw the person who became my Ph.D. advisor and mentor, Elisabeth Paté-Cornell,
from Stanford; she came and gave a talk, and she did the risk analysis of the
tiles of the space shuttle in the early '90s, and
if you haven't seen any of her work she drew a map of the orbiter
and colored the sensitive part of the orbiter and she said debris is a problem
the debris coming off of the external tank is a problem, and if a big enough piece hits a sensitive part of the orbiter, it could be catastrophic.
She turned this in, she interviewed people at Johnson, she interviewed people at Kennedy
she turned her report in and really the next time she heard about anything
was the day after they lost Columbia
when all of the people from NASA were like whoa, you know the press will know
we did a study of this and we can't find a copy of the report anymore.
So they were calling her again to get another copy to figure out what that report actually said
because at the time in the early '90s they worried about it, they were worried enough
about it that they hired a professor to do a risk analysis and they collected the report.
But every time the shuttle came back with damage on it, and if you look at the
Columbia accident report they made that nice graph
after the fact that shows on the bars where there was damage, every time it came back
and there were no obvious signs that we almost lost this, and to Mike's
comment when you have a lot of other things going on, you have main engine problems
and bearing problems and everything else we quit worrying about this one
and a faculty member who is not on this panel right now,
but Diane Vaughan, who many of you are probably familiar with, actually came up with the term
normalization of deviance, and she used it to categorize NASA's behavior
in the past, that things that were not normal over time become --
come to feel more normal and so that's what I spent oh,
I don't even know the last ten years trying to prove
that it's not even that people are scared to speak up
or that people don't want to speak up but over time they just don't worry about it as much
or they're worrying about something else or they're worrying about their Christmas shopping
so they don't even recognize it as a potentially bad event
because it worked or it was successful and so what I've been encouraging
and with some help from Ed is also trying to encourage people to look more closely
at the successes so that you can actually learn from them.
And my one statistic that maybe I should save for later,
but I'll use it now anyway, is in the FAA one of the things they have done a great deal of is
collecting all of their near-miss events; they have incentivized everybody from pilots
to air traffic controllers, it's a non-attribution system
unless there's gross negligence,
and you need to tell people: if you're a pilot and
you accidentally taxi onto an active runway, as long as you tell somebody then you can't be punished later,
but if you don't tell somebody and they find out about it
they can punish you. So there is an incentive to report everything, and they studied this database
that they created from it and they've shown over the last decade
there's been an 83 percent drop in fatalities so they have measurable success
from studying these near-miss events,
but there's this whole thing that you have to get into place here
people tell you, you have to be able to collect the data, and you have a different problem
than the FAA and the commercial airline industry
because each of your problems is different,
but I still think there can be success found
and improvements made from studying your successes
and looking more closely at them rather than just be like it was successful
let's move on before anybody finds out those things almost didn't work out.
>>MR. ROGERS: Okay, thanks. Bryan?
>>MR. O'CONNOR: One of the things Ed asked us to think about is just a simple
question of why are we here today.
When I think about why I am interested in this topic, why am
I here, I'm thinking back on a time when I had kind of an awakening
about my own deficiencies when it comes to being part of a team that's doing
high risk stuff with high stakes, and that was the human space flight program;
it was after the Challenger accident that I realized,
that as I started reading some of this stuff that was coming in
and understanding some of the root cause things that get into communications failures
and such, organizational concepts
that cause barriers to communication;
a little bit of this center rivalry that you heard earlier is a very key component
of the Challenger story, but it really is about communications
and some people knowing there's a problem but that word not getting to the ones
that can do something about it because of the artificial, organizational barrier of some sort.
And as I start reading those things and thinking about them I realized
that I should feel guilty about this accident as --
as well, even though I wasn't personally involved in the operation or whatever,
I really wasn't assigned to that mission in any way,
but you know I lost a lot of good friends there,
and I realized that there were lots of times when I was in a meeting
or a review where I heard something that didn't sound right
or even worse, I would hear two people talking to one another
but not really; they were talking past one another without realizing it.
And I would sit there and let that happen and not do anything about it.
And in one way it was sort of a matter of my own confidence
that this is the time when I Bryan O'Connor should stand up
and say something or raise my hand
I didn't have the confidence that I held that sort of credibility in the organization,
and as I look back on it I really should have had
it at that time, there's no excuse for that and that's why I felt guilty,
there was really nothing that I could remember that applied directly
to that mishap that I might have had a say in
but there were lots of other things that I had been in meetings on
and hadn't said things.
One of the things that I think I allowed to get to me was I came from a different organization
where I did have relatively good confidence
that I was a team member, I had lots of authority, and --
and there was an accountability there that I had built up, and I was more --
more apt to speak up and say something.
But when I came over to NASA which was a different environment I was really quiet,
sort of listening, and that's okay for a few months
but after a while you have to realize that the reason they hired me here
is because of what I was doing in that other outfit,
that's why they brought me here,
they want people like me, they want lots of people
but one of the kinds of people they want is the kind that I am, and they brought me in here not to sit for five years and wait till I go fly, but to
help this organization deal with its problems.
So I had not enough confidence and I should have had a lot more.
And I've often thought about it in terms of my badge,
I don't know why I think about badge, but --
but there's another term that -- that Andy talked about earlier
as part of the success story of Apollo and that's accountability, when --
when you come to NASA and you get your badge whether it's a contractor badge
or civil service badge or whatever there are some things that go with that
and one of them is the accountability for safety.
Now your accountability it depends on things like your authority,
your resources and your responsibility but everybody's accountable
for safety to some degree when they put on this badge in this agency
the administrator has the biggest accountability
because it all comes up to him in his suite
but everybody that wears that badge has that accountability.
But there's another thing that you should understand
when you're wearing that badge you have a responsibility to speak up
and use your background and experience and so on at appropriate times.
Now that could be a study in itself right there how do you --
how do you not overdo that?
I won't talk about that right now because I've seen that happen as well
and that can make the credibility go the other way.
For myself, I felt so guilty about that, that from that point on,
in a technical meeting that had high stakes consequences,
and not all meetings do, I promised I would never
let two people talk past each other,
if I had a question about what I just heard there
and I don't think necessarily they were agreeing when they thought they were,
at the very least I can raise my hand: you know, I'm not sure I heard the same thing
from you two people that are agreeing with each other, can you two go over it again?
It doesn't take much to do something like that, and sometimes it saves the day,
but it's little things like that that came out of my road to Damascus light show there back in 1986.
>>MR. ROGERS: Thank you.
One of the points that's come up this morning, and I think Amy you mentioned
it in your book too, is this issue of people measuring day-to-day risks to their own image,
so everything I do and say I'm weighing constantly how is this going to play out for me,
how do we address the fact that everyone has some sort of contemplation
about their risk to their image, and it affects whether
they choose to speak or not on a day-to-day basis?
>>MS. EDMONDSON: That's something I've been really working on for the last couple of decades
and there's a couple of ways to think about it. The first is, it's not that we walk around
calculating these things consciously, day in and day out,
that would be absurd; in fact we're highly skilled at it,
but think about it this way, nobody no matter how much you love your job
or are committed to the mission of the organization
wakes up in the morning absolutely enthusiastic about going to work today
and looking ignorant, incompetent, negative or intrusive, right? We just don't.
And so at a very basic, but very real level we've learned how to manage those
what I just call interpersonal risks they're not big time risks, they're not shuttle destroying risks
until they are, and they're not you know related to real wrongdoing,
as we heard about this morning.
So given the fact that we don't think about these very much
but we manage them skillfully,
and the easy and skillful way to manage them is simple: don't want to look ignorant?
Don't ask questions.
Don't want to look intrusive? Don't do what Bryan just suggested
one should do when one observes one of those talking-past conversations, it would be --
it would be intrusive to do so.
So -- so the bottom line is first you recognize that this happens it happens skillfully,
relatively unconsciously, the second thing we do is sign up in our own minds
to not behave in normal ways at work.
And so everything I'm about to say has very little to do with the dinner party, all right?
If you started behaving this way at a dinner party you wouldn't be invited back
but at work we have a different kind of responsibility
and it can either seem intrusive or it can be done with you know extraordinary curiosity,
compassion, genuine interest, you know, wait a minute,
didn't -- I'm not tracking, I didn't --
I didn't follow that or I don't yet understand.
So we have a responsibility and I think the badge is a nice image here.
We have a responsibility with the badge to be far more skillful
in unraveling the things we're normally skillful about --
about avoiding so we have to almost remind ourselves
but more importantly as people who manage anyone or who are peers or team members
of anyone to help each other do this as well.
So we have to remind ourselves to be curious,
we have to remind ourselves to ask more questions than seem sort of socially normal to do,
to reveal more of our own kind of thought process
than might be socially normal to do but is mission critical,
and again this is not the conversation you have about where should we have lunch today,
it is what Brian called the high stakes situations.
And I'm quite stunned when I listen --
well actually not stunned, I mean, I see it all the time; it could be conversations
that were recorded at NASA or conversations at a midwestern
manufacturing company, it's the same stuff; when you listen to the conversation
and look carefully at the transcript, how --
how unthoughtful it is, or how little digging in there is,
how these moments when we could have stopped
and inquired sort of go by without anybody noticing; now
and then somebody's noticing, but more often than not
we're not noticing it. So I think the first thing is
we have to recognize that we're all managing those risks unconsciously
and learn how to manage different risks instead: the risk of failing to achieve
the organization's mission, the risk to life, and other kinds of real, more technical risks,
and get over ourselves with respect to the interpersonal risks; easier said than done.
>>MR. RYSCHKEWITSCH: If I could build on something that Andy said this morning
when you talk to the veterans that were around in that day, one of the things you hear consistently
is that the peer reviews
and technical arguments were far more combative than anything that we would --
would find acceptable today, and it's something I struggle with, because on the one hand
if you're not banging hard on ideas
and challenging facts you're not going to get at the truth,
because we're all subject to confirmation bias you need other people
to challenge you to get there and yet at the same time
we need to be able to preserve the ability to work together.
I won't suggest that we have any clue how to do that in the more general society,
but some combination of civility and the valuing of challenging facts
and ideas, and not challenging people, is something we have a lot to build on.
>>MS. EDMONDSON: Can I ask?
>>MR. ROGERS: Yeah you can ask questions, that is perfectly legal.
>>MS. EDMONDSON: This is so near and dear to my heart
And I have been talking a lot about -- I do talk a lot about speaking up, but there is something called speaking up effectively.
>>MR. ROGERS: We call that NASA watch (laughter).
>>MS. EDMONDSON: Yeah, right, but speaking up effectively means when I am speaking up,
I also am at least indicating to others that I recognize the very real possibility
that I may be missing something.
I am not speaking up as, here is some God's truth from up high
and you will hear it from me and I don't want to hear back.
So speaking up effectively means what Chris Argyris calls a balance of advocacy
and inquiry: I'll tell you what I think, but I'll also check in
and ask what you think, kind of implicitly or explicitly.
I talk about this: we can't have the situation where we're advising everyone
to just yell and scream at each other all the time. It really won't work;
it's emotional, it shuts down thinking, it doesn't open up thinking.
So I talk about manage self, manage conversations,
and manage relationships. Manage self is to learn -- all of us have to learn,
as Bryan I think nicely pointed out before, to be a little bit more self-aware of what our own
emotions are telling us, what our own cognitions are telling us,
and to think -- to get in the habit of being more skillful about that.
Manage conversations is: slow it down,
make sure we're understanding
what we're really saying. And manage relationships
is, when things aren't hot and we're not in the hot seat, not in the midst of a big conflict,
make sure to be reaching out and building solid, trusting relationships
with those colleagues whose connections matter most to the shared mission, offline.
>>MR. ROGERS: Are we overly sensitive about offending people sometimes
and too quiet? Bryan, you were talking about not saying something
because you weren't sure people would take it the right way.
>>MR. O'CONNOR: When I checked into NASA at the Johnson Space Center in 1980,
they gave us a couple of jobs to do. One of the first jobs I got was as a capsule communicator,
so I got to work in the mission operations control room,
worked on several of the early shuttle missions over the --
there was a plaque over the door that leads into the mission operations control room
at the Johnson Space Center, or what they call mission control,
and it says, "In God we trust; all others bring data."
Now that really was a powerful statement
and I read that every day as I walked in there,
and I realized that, you know, they don't want to hear my opinion on anything, do they?
I haven't been here long enough to -- I don't -- I haven't been here really long enough to -- to --
to expect anybody to pay any attention at all to an opinion that I might have,
but if I bring data they will listen to me, that's what that thing says.
Now I knew from experience that there were people who had been there long enough
to be able to express an opinion or a gut feel.
When I saw who those people were, I realized I have a long way to go.
So I think that was more the nature of my thing I'm not sure I have the -- the credibility, yet,
to actually speak up and not be tuned out
because I saw what happened with some people when they spoke up too often
or they spoke up without real data,
or they gave an opinion and no data behind it
they just got jumped on because these were old Apollo people still in that same mode
the language was more like the Navy than --
than -- than a convent.
And -- and you just didn't want to get piled on too early in your career
when you were trying to establish your reputation. So it was difficult to figure out
when it was okay to say something and when not.
I've often wondered about the people that are in these,
these real young groups up there in California putting together Googles
and all these various companies, little companies with very young people in them,
they tend to be just the opposite, they --
they bring people in from Stanford University right out of --
right out of the university and they want to hear their opinions.
In fact they don't want to hear their data, that is irrelevant; they just want to hear,
which way do you think we should go? You've been at Stanford, what do you think?
At NASA, at JSC, it was not that way, so it was a cultural thing you had to kind of overcome.
>>MR. ROGERS: Steve Jobs had an interesting quote;
he was quoted as saying we didn't hire smart people at Apple
so we could tell them what to do, we hired smart people
so they could tell us what to do that's why we were so successful
and it is interesting that it's not that easy to do that in organizations. You come in,
and people look around: what are the rules here, how do you not offend somebody, not step on
the wrong person's toes? It's interesting you use the word uncertainty
about whether to speak up or not.
>>MR. O'CONNOR: Yeah. If I had no doubt about my data -- is this solid information
that's relevant to this conversation? --
then I'd speak up, but if I had doubts about it I tended not to. It --
it was almost -- that's one of the reasons we had such long hours, because
people were really trying to do their homework so that
when it came to their time they would have credible data
that people could listen to and then ask them to come again
and that was kind of the game
we played and that was how you kind of made it in that environment.
>>MR. ROGERS: We certainly saw some of the roles of whether you do or do not have data
and how that plays out in both Challenger and Columbia, of course.
>>MR. O'CONNOR: There were a lot of words about that in the Columbia accident.
>>MR. ROGERS: Yeah yeah.
>>MR. O'CONNOR: Do you have data on this? And Challenger as well: let me see the data.
You remember the guy who does the how-to-do-presentations stuff?
>>MS. EDMONDSON: Tufte.
>>MR. O'CONNOR: The Tufte thing on the Challenger inquiry, and if they had maybe drawn the charts
differently they would have made the point better.
>>MS. EDMONDSON: But that means you have to be able to play with data together.
>>MR. O'CONNOR: Yeah.
>>MR. ROGERS: Yeah, share your ambiguity and be able to discuss it rather than you tell me what
the answer is and I'll say yes or no. That's interesting.
>>MR. O'CONNOR: I really think that Amy hit on an important part of this
when it's time to talk, a little bit of humility can actually help sort of grease
that uncertainty that you've got to navigate. If you say, look, I don't have all the data, you know --
in other words, I'm not coming down like -- like Moses
from the mountain here where I've got all ten of them; I've got 7 or 8 things here.
>>MR. ROGERS: 7 or 8 suggestions.
>>MR. O'CONNOR: And I think it might be -- yeah, suggestions, right --
and, you know, these might be worthwhile, if you have a little bit of humility.
And one of the guys that I thought did the best at that was John Young.
He was a very humble guy, he came across that way, and people always listened to him
because he was that way. He didn't come barging in and say, this is it, and plop things down
and fold his arms. He would say, you know, I know a little bit about this,
and what it is, and everybody turns and listens and there's not a peep in that room
when he would do that. So I think that's a little bit of -- if you drew an equation on
how to make all this work, a little bit of humility would be there in the numerator.
>>MR. ROGERS: Robin you've worked with a number of case studies
that we've done here at Goddard and I know one of the --
one of the challenges is getting people to talk about lessons from a mission
that actually succeeded, in fact, I was challenged by a project leader unspecified
who asked me literally why are you sitting in my office, my mission is working,
as in there's nothing to learn here, why are you here to write lessons learned
when it's working sort of this assumption behind this idea,
go to the people who failed and I certainly don't want to be in that category.
My mission's working. How do we get to the point you're talking about,
where we learn from things that actually worked? We didn't blow something up,
but if we can learn from those, hopefully we'll have fewer disasters.
>>MS. DILLON: Well I think, especially to tie in to what Bryan was most recently
saying with this quest for data, sometimes the data you have is just one or two points
and it appears to work. Some of my research has actually been in hurricane evacuations.
You know, those levees were just fine all the times up until Hurricane Katrina,
so if they're looking at all the data from all these past storms
while they're trying to decide whether they're going to have a problem
with Hurricane Katrina, that data is not really going to help them. So what people need to do is sometimes challenge the data
that you have. And I think that the most important thing is really the incentives,
that somehow the person who is saying, well, why are you in my office,
I'm a success -- if they could get past that and say, well, is there anything you think
that somebody could learn, or is there anything that you could do differently?
So it's about the incentives. An organization that's well known for doing this well is Pixar;
I'm sure everyone in the room has seen one Pixar movie,
and some people have seen a lot of them if they have small children,
but they arguably had a really good string of successful movies,
but rather than walking away from their movies and saying we know how to do this,
after every Pixar movie they say, what would we do differently? We need to come up with ten things
that we need to do differently the next time, rather than just saying Cars One, Cars Two, Toy Story Three, let's crank out the next one.
They still looked at their successes because they still can do better.
I would argue that's what helps contribute to them continuing to have a better than average
or better than lucky success ratio in the movie industry.
>>MR. ROGERS: So it takes leadership -- they decided to do that from the top, taking time
to stop and reflect on what they've actually learned even though it worked.
>>MS. DILLON: Even though it worked -- even though, that movie, Toy Story Three, perfect,
but what can we still do differently.
>>MR. ROGERS: Mm-hmm okay.
>>MS. DILLON: I think you also had a couple of people with their hands up.
>>MR. ROGERS: If you want to ask a question, please come up to the microphone and I'll be happy to take your question.
Does anybody at Wallops or IV&V have a question? Okay, go ahead.
>>AUDIENCE MEMBER: When Bryan was talking about data bring data,
as Mike had commented earlier in our side conversation
it's very difficult to bring data about human issues.
They're gut feelings, they're little innuendos, they're micro-inequities;
we don't have data so much, so when you're bringing an issue
about a personnel problem or a management problem
or a leadership problem, the uncertainty is huge.
So overcoming the fear factor of, I don't know if I even know what I think I know right now,
but I want to bring it to your attention -- that's
a lot of the issue of overcoming the fear, because you look like you're whining,
you look like you don't know what you're talking about,
because you're in the minutiae, but you have this feel, or you see things going on,
and you don't have hard data. How can we overcome that, to get over the fear factor?
>>MR. O'CONNOR: Well, that's obviously a tough question. When I went to --
I'll tell you why I say that. When I went to the safety school in 1972, it was a 10-week course
that the Navy put on, and out of that came what they call aviation safety officers.
the first six weeks was engineering stuff, what makes airplanes fail
what makes wings come off, what does a fatigue crack look like in a wreckage,
how do you investigate an accident all that kind of stuff,
then the last four weeks I thought was really tough,
it was how to look for precursors for accidents on the human side, you know, the --
they had a couple of books that we read, Games People Play and that sort of stuff,
so that you could try to understand that this is about people. Now, you know,
we just showed you the bad things that can happen when people aren't working well together
in one way or another because most aircraft accidents were human error
by the way one way or another, even if you go all the way back to design.
Almost everything had some human error in it so they had to cover that
because we were supposed to be going to root cause on these mishaps
and quite often root cause had to do with people not getting along together, people
afraid to talk to each other, people not taking the time to understand their organization
That was a piece of the Columbia accident, where there was a guy who got all fouled up on taking
data from his tiger team to the right people,
because he thought, well, I've got to take it over here to my engineering management.
Well, that's wrong, that tiger team was for the program that day. Why would you possibly --
are you confused about the matrix or what? Yeah, he was, and that was part of that story.
But if people don't get along they're not going to communicate
and I'll tell you one company that knows that pretty well is Southwest,
Southwest Airlines, they have arguably the best safety record in all of aviation,
if you count "sorties," the number of flights, rather than hours:
the lowest accident rate per number of flights. Qantas has it by flight hours.
I would argue that Southwest's is more impressive
because the most high risk parts of the flight are up and down.
And I asked a pilot, a Southwest pilot, I was sitting in the back of the airplane once--
People ask me why I do that because I went to safety school that's why, enough said, right?
Any way I was sitting in the back of the airplane and right next to the window
was a four striper, Southwest airline pilot, in his uniform studying his books
and I asked what are you doing if you don't mind me asking, he said
well, I've got to go take my every-six-months test. Some airlines only have to do it once a year,
but at Southwest they have to do it every six months for the captains.
I said I'm aware that you guys have a tremendous safety record
what do you as a senior captain attribute that to and he said, well, it's pretty simple,
he says five things,
number one communications,
number two, training, which I'm doing right here you can see that part,
number three, communications,
number four, fly the high risk parts of the flight manually.
Southwest never uses auto pilot until they get to altitude.
Other airlines use it all the time
and then the fifth one, communications,
and he said they like to repeat that in case people didn't hear it the first
and second time; it's that important. Therefore, the human element is incredibly important,
people getting along with each other is awfully important
and when I first came to NASA, I came from an environment
where there are basically two personality types --
these were Marine fighter pilots -- and when I came to NASA and I was --
was involved in the astronaut program, we had some training on personality profiles
and that sort of thing: look, you guys came from two personality types, you're now
in a group that has all six of them, okay? You don't know how to communicate with them.
I guarantee you haven't been doing it, because they were a bunch of test pilots coming in,
and we had training on that, what I think was sort of like cockpit resource management stuff:
how to get along, how to communicate, how to understand how they receive,
so you're not transmitting on FM when they're receiving on AM.
Stuff like that is very simple and can really prevent a lot of hassle and problems later on
if you're sensitive to it, but other than that I
don't know how to address the question you've asked.
>>MR. RYSCHKEWITSCH: So I have two things I want to add to that. One of them is building
your own personal credibility, because there are a lot of things that you can have data around.
What you want is the reputation of, when there's data to be had, when there's analysis to be done,
when it's possible to offer positive suggestions,
you're one of the people that brings them forward.
That creates a much better environment when you get into the
one that's hard to put your finger on, and frankly, even when
it's the one that's hard to put your finger on, there are usually at least anecdotal specifics that you can offer.
It is one thing to walk into somebody's office and say, I'm worried about so-and-so.
It is a different thing to say, I'm worried about so-and-so;
I saw this happen, I saw this happen, and I saw this happen; no one of these would bother me,
but that's not what this person normally does, and I'm worried they're overstressed,
or whatever it is. There is not a perfect answer to that, but I think there are tools you can put in your toolbox.
>>MR. ROGERS: Do we have another question?
>>AUDIENCE MEMBER: I have a question for Dr. Edmondson. As I was listening to you, you
were talking about people speaking up effectively, "I might be missing something,"
and what's very frustrating to me -- would you agree that with the distraction of technology today
and the interruption of technology at meetings, you're not really having a conversation,
and it's hard to build the relationships, because no one's listening to you anyway?
>>MS. EDMONDSON: Your question embeds the answer right in it.
I think it's actually a very real and growing, not shrinking concern.
So the kinds of really quite subtle and thoughtful observations such as Bryan described earlier --
if you happen to notice that two people are talking past each other, and you could call attention
to that, or you could inquire into it and try to help cross that bridge --
more often than not won't happen, not just because people think, or are worried,
it's not appropriate, I haven't been here long enough, or maybe that's not done around here,
but simply because they're not paying attention, because they're looking in their lap.
And so, I mean, I sometimes wish I were an expert in this particular topic,
because it's so fascinating and so -- so mushrooming, I think, out of control.
So it is hard enough to talk skillfully, it's hard enough to have thoughtful conversations
where people mutually learn and problem solve around high risk situations,
I would venture to say it's going to be nearly impossible if half the room is checking their e-mail while we do it.
>>MR. RYSCHKEWITSCH: You know, I may just be too old for this,
but I see several times a week I'm part of e-mail trails where I stop them and you know --
>>MS. EDMONDSON: Stop the trail.
>>MR. RYSCHKEWITSCH: Stop the trail, tell the people we're not going to be able to solve this
problem via e-mail unless you want to be here for another three days:
we've got 15 minutes at noon, everybody get in the room, let's work this out.
I would be willing to bet everybody in this room has seen some of those
and ask you, how many of you have had the courage to stop it, or say walk down the hall, or
pick up the phone? Maybe the people that are 7-year-olds today
will learn to do this differently but I sure as hell don't know how to do that.
E-mail is such a blunt weapon compared to a face-to-face conversation
that you know I don't know how to wield it skillfully enough and I don't see many people that do.
>>MS. EDMONDSON: When you're doing e-mail you're only half paying attention
to it anyway, which means in the face-to-face environment you're only half paying attention to that too.
>>MR. ROGERS: So you're half in meeting and you're sending an email,
neither one is any good, but we're communicating more, we're more efficient.
>>MR. RYSCHKEWITSCH: It is true even when I'm thoughtfully trying to craft an e-mail.
>>MS. EDMONDSON: Then you still need to get together.
>>MR. ROGERS: So one of the things that seems to come out with your comments
is this role of the leader or management in setting the whole tone.
I think someone commented to me in the break earlier,
the outcome of the morning is we need to do massive training of the whole agency,
of every single person, on communications and skills
and all these kinds of things before we conquer this -- that's impossible.
A big part of this falls to the manager or the leader, the supervisor if you will.
How do you model this kind of behavior we're talking about, that we want people to exhibit?
How do you model that so it's not just talk? It's like, do what I'm doing, and I think Mike just gave an example. Robin?
>>MS. DILLON: Well, specifically or concretely, around the e-mail and things
like that, I actually was asked to serve on a panel about government continuity, risks,
things like that, a few weeks ago, and they actually had it at a secure space.
And there was nothing about the meeting that was secret or classified or any of this stuff.
I'm convinced they had it at a secure space because everyone had to leave their e-mail devices outside.
You had to lock up your phones
and go into the meeting, and I think that's the only reason for it, because there was no discussion that was classified.
We weren't passing clearances or any of that kind of stuff. But I think that is on the leader:
If the leader is sitting there checking e-mails or anything like that, you know what,
that's the precedent you're setting for everybody else.
And legitimately people have serious concerns and people are working on mission critical
projects and things are time sensitive, but maybe for the first half hour of every meeting,
no e-mails, no blackberries, no iPhones, no smart phones, and then, you know, at quarter to the hour,
we'll all take a break and everyone can go do what they need to do and check their messages.
The leader has to set the culture and the culture has to be one that people can work with.
There's also pretty famous research in academia that studies productivity
in the workplace, and one of the most useful things they find is to give people uninterrupted time:
if you actually give them a couple of hours of uninterrupted time, the amount of work
they can actually get done is so much greater than if they keep getting interrupted.
The organization has to set a culture that we don't have meetings between 10:00 and 12:00
because that's your uninterrupted time to get work done.
Then meetings maybe start at noon, and you still can't bring your blackberries or smart phones
for the first half-hour or something like that, but set rules for how the organization plays.
>>MR. ROGERS: And they can help. They can help, that's interesting.
Any other experiences in how leaders model the behavior we're talking about, the behavior they want from people?
>>MS. EDMONDSON: Just picking up on the humility comment
I think leaders model the behavior in -- in -- in those kinds of behavioral ways as well.
And in terms of -- I think, I mean, I spend a lot of time thinking about psychological safety,
which I simply define as experiencing at work a comfort in offering one's ideas, experiences,
suggestions, and so forth, such that it would feel relatively more natural, relatively more expected,
relatively more okay to do that. And under what conditions will I, as --
as an employee in a hierarchical organization, feel that way? Well, it's greatly increased if the person
I report to, and possibly levels up as well, does some pretty simple things -- and this is
not rocket science, really not -- simple things like framing,
or calling or naming the encounter, the meeting, what we're doing here together, as problem
solving or as learning, as opposed to the implicit frame, which I think is --
the default frame is, we're here to get something done, to rush through a checklist,
to, you know, get our work done, to execute, as it were. But --
if we really need these great brains together in a room, it's to do something, and
quite often it's not simply to check in on something or to push something forward,
it's really to learn. And so framing the encounter as --
as a learning or problem-solving encounter; modeling in some way some humility,
or, if that's too big a stretch, just simply acknowledging limits, one's own and others':
so, I may be missing something, I need to hear from you. That's not so scary to say,
and by the way, it's true: I may be missing something.
Because the funny thing about that, as anyone who's a manager knows:
most people who are managers know that they're fallible, and other people know that they're fallible.
But the other people don't know that they know,
so just letting them know -- does that make sense? You know you're fallible,
they know you're fallible, they just don't know that you know.
>>MR. ROGERS: Which sounds like once in a while you have to admit you made a mistake.
>>MS. EDMONDSON: Right, and it's individual and it is also organizational, kind of continually,
especially at NASA, reminding people that we don't know everything, right.
There are limits, there are uncertainties, there are risks. We know they're there, but we don't name them often enough.
Then I think the third thing -- we talked a little this morning about the open door policy, and an open door policy is meaningless.
A door can be open, a door can be shut;
it doesn't mean anything unless you are actually standing by that door.
Do you see people go in and out? When does that happen? It only happens when the invitation is quite explicit:
it's not, hey, I'm here when you need me; it's, Mike, what's on your mind? Robin, what's on your mind?
So really, the voice calculus gets reversed.
As soon as I am asked for my thoughts, I look more foolish not speaking up than speaking up
so I will be likely to speak up.
So it's not enough to say I am open to input; you have to demand it, you have to go out and get it.
>>MR. ROGERS: One of the things I know that came up after Columbia
was this idea of making sure we hear people in reviews, you know, the comments from Wayne Hale,
taking time to go around the room at the end of the meeting. We can do that; it is a tactic.
Do we really have the confidence we're going to hear all the things we need to hear
or is it just a little bit of help?
>>MS. EDMONDSON: It's a very powerful tactic, because even Rodney Rocha,
in the same program you were interviewed in, said there was this moment
where Linda Ham looked around and it was like, it's okay to say something, but I just couldn't do it,
he says: I'm way down here, she's way up there. Now that is such a vivid description to me
of this phenomenon, where it's not enough to just pause;
just pausing doesn't say loud and clear, I need and want to hear from you.
It's got to be, hey, you -- and not just, does anyone have anything to say?
I really think you've got to sort of go around and ask for the concerns, especially,
what are we missing because then you're actually asking people for their creative side.
If I say what are we missing, my competitive juices kick in I want to be the one that figures out
what we're missing rather than the kind of pain in the neck person who is stopping us from ending this meeting on time.
>>MR. O'CONNOR: One of the stories I like is Napoleon's corporal,
where he had this guy who --
he was a corporal, and he would invite him to sort of sit off to the side
whenever he had his general staff in to make plans for the next battle.
And that was one of his ways of making sure that the communications were complete
and that everything had been said, because there's only so much the leader can do to elicit,
you know, and so this was -- this was a little technique he used.
Grant, by the way, did this with Ely Parker --
if you've ever heard the term Grant's Indian, same thing --
there's one trusted person on your staff, and you say, I want you to listen to everything in here,
and then when we're done you and I are going to talk, and you let me know if we missed something.
And the story on Napoleon's corporal was that he would -- he would ask him,
do you understand what we just said we're going to do in this battle?
Because they had just been over the whole thing: first we're going to do this and go over there,
and these people will come in, and the cavalry, blah, blah, blah. And after it was all done,
and everyone had questioned everything on the table, and it was quiet, they would all leave.
Napoleon's corporal would come in, and he would ask him, did you understand everything there?
And if the corporal said, no, I didn't get that second part, I got the first part and the third,
but the second I just didn't get, he would call them back.
It was just a sort of little technique he used to make sure that he had complete comm.
>>MR. RYSCHKEWITSCH: I would add that I think asking the questions in the big meetings
is important because it sets the table that says, I want to hear,
but oftentimes you've also got to create safe space elsewhere --
and maybe elsewhere is also earlier in time, because I think we all know the launch train phenomenon:
the closer you get to launch, the harder it is to stop anything,
and the threshold starts going up. So sometimes it may be, before the big meeting,
getting the folks that I know are a little closer to the front line together and saying,
what are you hearing, are you hearing anything? I know Bryan did that with his folks before every shuttle FRR.
I did it with the engineering directors; normally we went out to dinner.
[Inaudible], the Mars program manager, in the run-up to this launch,
scheduled lunch with a senior leader or a key person, 20 key people on the team, over a period
of three weeks, and he bought lunch every day, took them off site, and had lunch.
And just talked, because he wanted to look them in the eye in a quiet environment that was just one on one,
hear what they were thinking and what they were saying, and whether there was anything else that, you know, was just niggling
in the back of their brain.
So I think there are a lot of ways to create safe space for folks,
because it's pretty damn intimidating if there are 200 people and all the senior leadership in an agency in the room.
>>MR. ROGERS: Certainly can be. That brings up a question,
one thought I heard from folks here at NASA you get one shot.
You could bring -- you can raise the issue, pull the red chain, so to speak,
call the big havoc because I think we're going off the rails.
You get one shot to do that, but if you are wrong and actually nothing happened,
it was a false alarm so to speak, no one will listen to you anymore.
So everyone is sort of weighing that: is it worth it?
Before I go and speak, I don't want to take my one bullet, use it, and have it not turn out to be really a mission saver.
So it kind of begs the question of what do we do with people who speak up,
bring up an issue that's really legitimate from their point of view.
We have to go off and raise it, look into it, and it turns out there was nothing there,
or there was information that cleared it; then what do we do?
>>MR. O'CONNOR: We thank them.
>>MR. ROGERS: That's the difficult thing, because that is something in the back of people's minds: if I get it wrong --
>>MR. O'CONNOR: You really have to, though. Gerstenmaier was great at this.
We occasionally, Mike and I, would have reviews and meetings where somebody
wanted to appeal something to the next higher level,
and even when he and I did not agree with that person,
we gave them the chance, because we have a process in place at NASA
that says if someone wants to appeal, they have that right and they can go as high as they want.
So you ask them that: okay, we don't agree with you, do you want to take it higher?
Yes, I do. Okay. Gerstenmaier was happy to hear it, and at the end he would thank the person,
even though he didn't agree with them either, and I thought that was a pretty nifty way to handle things.
I hadn't seen that in earlier times; there was a lot of fear of the program manager being the king,
and you had to be careful what you said to them because they would never ask you to meetings again, and all that.
I did see this more recently with this leadership in the shuttle program;
I thought they ended up with really neat techniques for encouraging and not killing the messenger.
>>MR. RYSCHKEWITSCH: Gerstenmaier had his own way of being his own Napoleon's corporal,
because what he did, which I thought was really effective, is, at the end of the discussion,
when he thought he had heard people, when he had asked whether other people had things to bring forward,
he would replay the whole discussion that he heard and say, this is what I think you told me,
and this is the conclusion I would draw out of that.
And he would be looking around the room and he would ask people, did I hear what you intended to say?
And, you know, make sure that there's that reprise, and it was in the room, so you knew there was one last chance to do that.
>>MS. EDMONDSON: And I would advise going beyond yes or no, too, to: what am I missing?
>>MR. ROGERS: Do we have any questions --
>>MR. O'CONNOR: Can I introduce another term to the equation for good communication?
It's a small t, but it's the word tact.
I'm not saying that it's always really important -- I don't think they had any of it in the Apollo program --
but I think in most -- in most cases that is something you have to take a look at
and I'll show you an example or tell you an example I thought was pretty neat.
When Mike Griffin became the administrator, he went to his first flight readiness review
for shuttle and he was sitting in the front row, like he always does,
and he was looking at his BlackBerry during the flight readiness review.
Well, at that time there was a rule in the flight readiness review that you don't bring that stuff
in there; nobody sits in the flight readiness review looking at their BlackBerry.
If you want to look at your BlackBerry, leave the room, and that was the rule.
Well, Griffin didn't know that and who's going to tell him about it right?
So I happened to walk out at the break
and I hadn't even thought about it myself, so I'm not saying I was even considering telling him that, but someone had.
It was a young engineer that was sitting right behind him and she approached him out in the coffee area one-on-one.
And I just happened to be standing there and I listened to her and she said, you know boss
no offense, but you shouldn't be doing that.
We've got a rule that says nobody's allowed to be doing that and when you do it,
it really cuts into our discipline and people will think it's not a good rule
and you're setting a bad example when you do that.
And Griffin said, by golly thank you very much, you know I hadn't even thought of that.
He says, that's the last time I'll ever do it -- it wasn't, by far, but at least --
you can imagine that she was thinking, herself, while this is going on,
should I say something in front of a bunch of people here or should I do it one-on-one later,
and sometimes that very question can be very important in keeping the team working together
with good relationships, because a lot of people at NASA have some amount of pride --
and that pride can be dinged by an inappropriate public disagreement when it doesn't necessarily have to be.
Sometimes in a formal review, everything has to be that way,
but if it doesn't, sometimes it's better to collar somebody in the hall and say, you know, you said
this, I'm not sure I understand that, let's talk about it one-on-one.
Sometimes you get a lot better result out of it. So sometimes that little tact word can help.
>>MR. ROGERS: Did -- was there a question from Wallops or IV&V?
>>MR. O'CONNOR: By the way, he came in and told everybody in that room after the break,
this engineer right here told me that I screwed up, and I apologize to everybody
that saw me doing it. I'm not going to do it anymore, at least for a half hour anyway.
>>MR. RYSCHKEWITSCH: Mike Griffin was, I think, the only person I've met that really could multitask.
>>MS. EDMONDSON: Very few people can.
>>MR. RYSCHKEWITSCH: I've never met anybody else that actually could multitask; he was the only one that could.
Lest I sound like a Luddite in my anti-email stance: when I talk about safe space,
I violated that same rule because I had my laptop up,
but what I always had coming into one of those reviews is I had an e-mail group,
which was usually engineering directors
or chief engineers plus whoever I knew was important to the conversations
and we were having a conversation in the background while we would be hearing something in the
formal review, and they tended to be -- a lot of it was set piece,
but it was conversation of okay, I know so and so had some concerns
about this, have we raised them all, have we answered all the questions
are we done with this one or do we need to keep pulling on it,
because you know you can't really stop the whole review and poll everybody all the time.
On the other hand, you can create safe space in the back
so I do think electronics has its place. It's definitely not the answer to everything.
>>MR. O'CONNOR: It has worked itself into a reasonable thing. Back then it was considered
to be a distraction; they wanted everybody's eyes and ears on the speaker,
because people weren't using them that way -- they were really doing their own business
and it wasn't related.
>>MR. ROGERS: Let me just check in with Wallops and IV&V. Did we have a question from Wallops or IV&V?
>>WALLOPS: Yes we have one quick question.
>>MR. ROGERS: Please.
>>WALLOPS: The Columbia accident investigation board report
mentioned the idea that the positive outcome should not be an indicator
for future positive outcomes. A long time ago, on a safety stand-down day,
we talked about the idea that groupthink may have caused some problems
with that. How does NASA avoid groupthink on a regular basis in readiness reviews?
In other words do they invite outside parties that may not necessarily agree
with their conclusions to look at the reviews and look at the technical information?
That's probably a question for Dr. Dillon, all right.
>>MS. DILLON: Yes, so I'm going to try to repeat it back, and other people
on the panel can correct me if they don't think I understood the question correctly,
because it was a little fuzzy. But I think you were pointing out that even the accident reports
say that positive outcomes don't necessarily always predict future positive outcomes,
but there tends to be groupthink,
and what can we do to get past groupthink, I think, is the essence of the question --
and people are sort of nodding, so I think that's what I'm going to try to address.
And what I actually subscribe to and maybe this is really why Ed asked me to be on the panel
is because I'm a huge advocate for everything that Ed does here
and has been doing here for the last several years,
but I think that having events like pause and learn and knowledge sharing workshops,
where people can come and talk about things, and having case studies
where we look more in depth into things -- so I've been to enough of these kinds of events
where people are more self-critical: yes, I know my mission was successful.
And I've seen you lead case studies and seen Bryan up on the panels before,
where they are talking about different events and looking at them and questioning them. And by looking
at this kind of history you realize that success doesn't always mean success
and failure doesn't mean failure. Sometimes it is just Andy's little corner of bad luck. He had good luck on Apollo,
but sometimes some of your missions just have bad luck; they didn't do anything worse than anybody else,
but luck wasn't in their favor when it all lined up,
so I think by participating in all of these kinds of events that's where you get the culture
where we are going to be inquisitive and we are going to try to learn from these kind of events
to see well yeah it was successful and we're very happy we got the data
we got the science data we wanted and we understand better the
gravitational fields of the moon and things like that,
but what else can we learn from this and then what else can we instill and pass on to others either through a case or knowledge sharing
workshop, and I think it's these kinds of events
that really do instill that kind of inquisitive culture
that then helps you to look at the next case
and hopefully ask the right questions when you're in the meeting, to sort of say, well,
what are we missing here could we be doing something different.
>>MR. RYSCHKEWITSCH: Robin, I'll ask you a question. It's really easy when things went bad to
find the conversation that didn't happen. How do you --
how do you identify within a team the conversation that did happen?
I'll amplify when I look at Goddard's success chart there's this beautiful little chart
that lists all the successes over the years and they're mostly little green boxes
and it always scared the hell out of me, because I never was sure that I understood
why there were so many green boxes on there, because I knew damn well when I read the
mishap reports, everything I read on there, I saw a lot of places where we could have done
exactly the same thing just really easily. How do you get that mindset where people come on
and say, these are the things we did right this time and we've got to make sure we keep doing them?
>>MS. DILLON: I think you were talking about talking to the engineers and finding out what
their concerns are -- the person going to lunch the month before the launch and
talking to each person and saying, what are you worrying about. So now presumably,
if we're talking about the Mars lander that's going to go on Sunday if it's a glorious success,
which we all cross our fingers and hope that it is, that he doesn't forget all those
conversations: we worried a whole lot about things a month prior to this,
and you told me a lot of good things that we should be worrying about, and now that we're
successful, it doesn't mean we forget all those things. We need to remember those things and to capture those lessons learned, possibly
through a knowledge share or a case study or something like that, so that we don't get clouded
by success -- check it, move on -- and that we dig into some of those green boxes
to say, what are we missing by just declaring it a success
and everyone moving on to their next project?
>>MR. RYSCHKEWITSCH: If Andy's still here, there's your next book.
>>MR. ROGERS: Anybody else have a comment on that?
>>MR. O'CONNOR: I was going to say, the Duke University professor
Petroski wrote the book To Engineer Is Human, and
his theory is, if all we ever learned from was our successes,
we would never get past the state of the art. What he's pushing for is that we really need to
try to get to root cause of those things that are failures, but I think what Robin does is she says
you can learn from successes if you treat them like failures
because you probably had some close calls
and when you have something called a close call or a near miss even in NASA's accident
investigation policy, it gets the same treatment as if it were a failure.
And you can find stuff that almost bit you and learn just as much from that as you can from the other.
It is good stuff.
>>MS. EDMONDSON: I was going to add, it's not always clear cut that something is a success
or a failure. So in the case of Columbia: yes, there had been many foam strikes before;
no, there hadn't been one this large, although we didn't know that for sure, didn't know
how large it was before. But one of the dynamics
with Columbia was that it got coded as a success,
in the sense of, well, everybody came home safely. But really it was a failure
all along: having bits of debris fly off and potentially hit the orbiter
was a failure all along. So in some ways coding it as a success prevents the deep-dive root cause analysis to --
>>MR. ROGERS: We had a discussion of the Columbia last week
with Ed Hoffman and his group and people were asking this question
about why didn't they look into it, for these reasons,
and it was a simple way to think of it: if we had never had a foam strike, period,
and we had that foam strike, as ambiguous as it was,
they would have stopped to figure out what it was, because it was the first one.
>>MS. EDMONDSON: Right because it would have been novel.
>>MR. ROGERS: So the fact is the other foam strikes did in fact impact the way we treated that piece of data,
which was easy to illustrate.
No one disagreed with me when I asked that question.
>>MR. RYSCHKEWITSCH: Ask yourself when you're one car length behind someone on the Beltway.
>>MS. EDMONDSON: Is this a smart place to be? But I've done it before and I am still alive.
>>MR. ROGERS: We have a question here from Gale.
>>AUDIENCE MEMBER: I'm going to try to be cogent on this because it's a little complicated, muddled
in my mind. An assessment that I hold in this culture is there's a very large commitment
to achieving consensus in most contexts, and by that I mean defining consensus as everybody
wholeheartedly, or at least strongly, supporting whatever the decision
or conclusion or action is. And oftentimes I find that that means that people actually compromise
in order to get there, and sometimes that results in what I'll call, perhaps not as tactfully
as I should, a lower common denominator solution.
So I think about that when I also think about the concept
and the tendency sometimes to indulge in groupthink,
and I'm inviting a reaction: have any of you in your research, or particularly in any experiences
you've had in NASA, seen a connection
like I'm describing, or am I not even making any sense, which is possible?
>>MS. EDMONDSON: I think one of my favorite quotes from business history
comes from Alfred P. Sloan, who was the great leader who both created
and led General Motors for a very long time in its early history through decades
of success, and it's this great quote where he says: gentlemen --
you can tell it's old by that opening salvo -- gentlemen, I take it we are in complete agreement;
then I propose that we break up now and give ourselves time to develop some disagreement
and perhaps gain greater understanding of what this issue is all about.
Now the reason I love that quote is I think all of us,
all people who manage or who are in organizations would say that we are open to dissent,
but very few of us take agreement as a red flag, as a danger signal,
and I think that's what Sloan did so magnificently: given a high-stakes or deeply strategic issue,
or whatever it was, given an issue of some importance,
it's important enough that we ought to be seeing a bit more disagreement before we come to a close.
>>MR. RYSCHKEWITSCH: I'll jump in on that one. You know, I'm not sure that the right word is consensus -- or at least
what we should be striving for isn't really consensus.
What we're really striving for is: have we had the discussion in all of its fullness
to understand what's wrong with what we're actually doing, so we think we know
whether it's good enough to go on, whether that's to launch or something else.
And part and parcel, I mean, we all have to recognize we've never launched anything
that was perfect, and if we waited to launch until things were perfect there wouldn't be anything up
there right now. So there is always some difference between what we would like to do
and what we're able to do, and then the question is, is it good enough,
and have we had enough conversations to know whether that's good enough, and is any individual there
at their personal threshold to say it's good enough to go on.
And sometimes it's, you know, maybe I don't have all the information,
but I've put all my information out there and I believe that the people who have to make the decision
have heard it in enough fullness to consider it with all of the facts. So to me,
that's the kind of conversation we should be having, and when you get to the end game,
when people say, I'm go for launch or whatever it is,
that's the question that they're really answering: have we had that full discussion, has all of my information been heard --
am I personally, are we, good enough, that we're far enough above my threshold
that I don't feel that I need to raise this issue further, even to the next level of decision making, or that all the conversations have happened.
So, you know, that's a fairly high standard, and yeah, there's a real danger of taking the easy way
out and looking around the room and saying, nobody else is opening their mouth, you know --
are we done, do I want to personally get out there? And frankly I think that's the place where the
role of people in the decision making role and the leadership role is to make sure
that they're pulling that out -- is there something else out there -- and leading by example in the previous
meetings. When somebody brought something up, thanking them and saying, you helped
us here, because we know better where we are than where we were. And frequently it's making sure you
don't do the negative: you shoot one messenger and there will be a thousand people
out there that won't speak up. That's the huge danger, and sometimes
it's the not thanking, sometimes it's the not explaining why you're doing what you're doing
even though everybody else told you not to do it, and going on with that. So I don't know if that's helpful.
>>AUDIENCE MEMBER: It is absolutely helpful because I think sometimes we either don't think to
or don't take the time to make our thinking transparent to other people
and if we were to do that it can actually have a very powerful impact ultimately.
So thank you for that.
>>MR. ROGERS: Certainly in the dynamics of meetings,
which we know some of you have looked at, the longer you get into the meeting the dynamic changes.
So you have an hour meeting, and in the last five minutes,
if we haven't come to an agreement, there is tremendous pressure to relinquish your position or, as you said, come to a consensus,
which may not actually be the optimal decision of the group,
but the clock is telling us to do that before the bell rings, and it is easy to get caught in that. You see it in all kinds of situations.
>>MR. O'CONNOR: And it's hard to stop.
>>MR. ROGERS: Otherwise we have to meet again next week because we haven't solved the problem, so let's get some solution.
>>MR. RYSCHKEWITSCH: The other Ed's folks wrote a case study on STS-119, where it went to FRR three times
because we twice came to the conclusion that we didn't know enough yet, and that doesn't feel very good.
>>MR. ROGERS: And that gets us to another whole question we get in a lot of these cases,
which you've been involved with, as has Bryan,
where we try to get into the discussion of not just did we get the decision right or wrong --
we called the foam right or wrong -- but how did we actually go about making the decision,
because that will apply to our future decisions on a different case. That's a lot more difficult
to do than re-judging a decision someone else made in the past. Yes,
Robin -- we have a couple questions in a second -- go ahead.
>>MS. DILLON: So, an article that Ed referred to before: we actually wrote a Harvard Business
Review piece on recognizing near misses last spring, and on the chance
that it was going to come up in conversation I wrote down our seven recommendations.
I'm not going to talk about all of them, but our first recommendation was pay careful attention
to high pressure, and that was basically what Mike was saying: when you're getting really close
to the launch actually occurring. One of the things that we talk about in the article is
that NASA screen saver prior to Columbia that was counting down the seconds on
everybody's computer until the space station core was complete,
and that everybody was so focused on that. And the other anecdote
that we talk about in the article is the BP Deepwater Horizon, and that every day that they did
not have this well capped was costing them a million dollars in overrun fees,
and so, yes, they had an engineer on the rig that could have done the test to tell them
that there was a problem with the well, and they skipped the test -- he was even there, they didn't even have to get him there,
and they still skipped the test because they were trying to rush to get this done,
because every day was a million dollar overrun. And so I know I don't have good concrete things
except that my recommendations are always about awareness and being able to recognize it,
especially being able to recognize it when you do get under a lot of schedule and budget pressure,
and being able to really step back and say, would we do this differently if we had more time?
And that's the question that you want to ask: yes, we're supposed to launch next week, I know
that, but let's for a second say we didn't have to launch next week what would we actually do
if we had more time and then how is that different than what we're actually going to do.
>>MR. RYSCHKEWITSCH: I think that's an especially apropos comment for this community,
because frankly one of the luxuries that Goddard has always had is it never dealt with planetary launch
windows, and you have two missions sitting in front of you,
MAVEN and [Inaudible], where you have planetary launch windows, and they are a different animal.
And I think, you know, one of the things that does is it puts a lot of pressure
on forcing the discussions early and making sure that you have them
and do them, because you can't get to that last day of the window -- you're going to launch something, right?
And so there's lots of runway now; there won't be a lot of runway when you're sitting on top of a rocket.
>>MR. ROGERS: Ready for another question? Who -- Sarah, go ahead, Sarah.
>>AUDIENCE MEMBER: My question is -- I'm curious to know what the panel thinks about highly complex
organizations where certain chunks of highly skilled and very educated people are not
necessarily the folks that are doing the science and the engineering.
And then we've got the very technical scientists and engineers, and yet these two groups are
working together to bring about mission success, and the highly technical folks need the non-scientists
and non-engineers to do their jobs in coordination with the technical side of the house.
What I've seen in some of my work here at Goddard
is that at times there's a language barrier: sometimes the non-technical folks
don't really know the language and the acronyms of the scientists and engineers,
and so there's an issue there of not necessarily always speaking up.
Even people at a very high level who feel like, gee, I should really know what that acronym means,
sometimes won't speak up because they don't want to look like, hey, I don't know what's really going on.
So if certain folks are not involved in missions on a day-in/day-out basis,
it's very difficult to understand the work of a highly complex organization like what happens here at Goddard.
So I'm curious and we talked a lot about having conversations today
so how can we bring these kinds of conversations that I'm talking about
and have open dialogue and keep the folks that are not on the mission side aware enough to be
able to positively impact the day-in/day-out work of the organization, which they do, but it's very challenging.
>>MR. O'CONNOR: I'll start with that. I'm one of those folks that really doesn't think
there's a good excuse for people using jargon when nobody understands it. I think the onus is
on people to quit using jargon and acronyms and that sort of thing as soon
as they get out of the group that uses the acronyms and jargon, and get into speaking
English to people that understand English. And I'm not good at other languages -- English is my only
language, but I can understand how people in another country feel funny when you don't take
the time to learn their language. One person that was really good on this was Sally Ride, and you know,
we just lost her last week. One of the things that I thought was great about Sally Ride
was that she was a really good communicator, she was not what I call a blabber mouth
she was pretty quiet but when she spoke she spoke the right language to the right audience
and everyone felt like she was talking to them personally and never speaking in jargon
and over their head or whatever and yet she could turn around in another environment,
with other people who were very deep into some engineering thing, and if she was aware of their jargon she would use that.
I've used her as an example to engineers a lot of times at NASA
when I read some gobbledygook that comes across in an email or a memo, where an engineer doesn't feel it's important
to communicate very well in writing -- because if it was, they would have put English classes
in the engineering schools, right? Well, Sally was one of those people who went
and got her bachelor's degree in English literature,
and she studied Shakespeare, and she became quite a good communicator as she went on and did her Ph.D. in physics.
and I've often thought that that would really come in handy to a lot of our engineers to at least --
at least acknowledge that they're not very good communicators
and that people that they're talking to may not necessarily understand what they're saying
because they're not speaking in English and to work on that.
And I think that sometimes we leave that out: we do a lot of technical education
and not enough speaking and communicating type education in NASA.
>>MR. ROGERS: Anyone else?
>>MS. EDMONDSON: I will add one quick thing because I think Bryan further underlined
why this is very problematic and at times very dangerous and just the one thing I want to
underline is: make it discussible. I mean, this is clearly an issue, it's clearly an issue that people
recognize and understand; it's got to be one that we actually do talk about.
It's got to be not only okay but, you know, maybe even an object of some humor
and good-natured sort of jabbing when we get off in that wrong direction.
>>MS. DILLON: An example of that: I was once talking to a woman who was a former director
of Johnson Space Center who then left NASA and went to the Department of Energy --
obviously a very technical person, but she switched fields dramatically
and obviously switched acronyms. I believe she came in at the assistant secretary level, so a
lot of people reporting to her, and she basically put a cup on the desk and said that anyone who uses an acronym
had to throw a quarter in the cup, until they quit using the acronyms. Because she obviously
could handle all the technical information -- she was a very technical person --
but she had a whole new set of languages, and she was the boss and didn't want to keep
saying, I don't know what that is, I don't know what that is. So she basically said, every time
somebody uses an acronym you have to throw a quarter in the cup.
>>MR. RYSCHKEWITSCH: What she should have told them is you have to tell me what all the words in
the acronyms are, and if you can't, you have to throw five bucks in the cup. My wife
works at NIH, and she went to a seminar some months ago where the speaker was talking about autism -- the definition of autism has evolved
over the years -- but the speaker's comment was that if she were talking to an audience of scientists and engineers,
about 90 percent of them would be diagnosable with autism.
But if you're a non-technical person, a non-scientist or engineer: the almost universal trait in
scientists and engineers is they love what they do,
and if you ask them they'll explain it to you. It may be more than you wanted,
but they will explain it to you. There's no problem getting them to talk about what they do.
>>MR. ROGERS: Physics lessons from Sheldon.
>>MR. RYSCHKEWITSCH: Yes, yes, absolutely.
>>MR. ROGERS: A question over here.
>>AUDIENCE MEMBER: I think this is a question of mechanics for the NASA guys. I've seen the JSC
decision making process illustrated as a loop with feedback,
and in addition to the ability to kick your minority report upstairs, there's also the option
to write down your minority report if you're one of the dissenting opinions
on the decision that's made. So my question is: are these things
written down -- did people take you up on the option to write the minority report -- and if they are written down,
are they collected anywhere? Because these things could be more valuable than a whole bunch of lessons learned.
>>MR. O'CONNOR: Well, the policy for them is written down in Mike's policy documents
on how we do program management; the dissent process is pretty well laid out, and each program has to follow
that and apply it to their own approach. In the shuttle program everything was
recorded, so you know, for history, all those flight readiness reviews, including the dissenting
opinions -- people were asked to come to the microphone so they could get their stuff on tape.
So yeah, that's a program I had some experience with, and I thought they did a very good job of that.
>>MR. RYSCHKEWITSCH: The question is whether they can get access to all of them,
because things like the alternate opinions brought to a control board for a decision --
those will all be part of the documentation of the control board, but getting your hands on them
or doing some kind of search to find them is not going to be easy. The higher level ones, like the run-ups of the
shuttle process or the safety and mission success reviews, are all recorded
and they're accessible -- maybe not trivially accessible, but they are accessible.
>>MR. ROGERS: Thank you.
>>MR. O'CONNOR: And then there's the case where -- the example I used was the dissent
from someone way down there, four or five levels down; in one case it was a contractor.
These dissents found their way up. Sometimes it's a matter of the actual decision forum itself.
Somebody used the term consensus. When I looked up consensus, I was looking it up to
figure out what was the protocol of this board they were setting up -- someone was setting up a
decision board. The Navy folks had told me that we were misusing the term consensus, and that in
the Navy they had another term called unanimous consent, and there is a
decision board in the nuclear Navy that has a protocol of unanimous consent.
In other words, the decision maker must have everyone agree before the decision
maker can write down and accept that decision. At NASA we tend to have consensus,
and consensus literally means you try to get unanimity, but it's not necessary. The consensus
is the process, but it doesn't mean you have to have everybody voting okay.
There were a couple of cases in shuttle, and one on, oh, the Delta rocket we launched out to the asteroids,
where one or more of the members of the actual board itself said no -- actually voted no --
and that got recorded. Now, it turned out that the policy was that if that happens it has to go to
the next level; it doesn't mean you're done, it just means that if the
decision maker still wants to go and one of the technical authorities said no,
then he has to elevate it to the next level, and in those cases the administrator himself made the decision
to go. But on the record there are the no votes from those board members, and everybody can look it up.
>>MR. ROGERS: Do you have a question?
>>AUDIENCE MEMBER: Yes. Does civility have a role in all of this?
>>MR. RYSCHKEWITSCH: I think so. I mean, frankly, it sets the stage; it's the underpinning. You know,
okay, if you grew up in a culture in the early '60s and you were, you know, [Inaudible]
you're in those conversations, and the culture you
lived in says it's perfectly fine to tell people no way in hell we're doing that as your
opening salvo in a discussion, that's one thing. That's not the world we live in right now,
and frankly I think it's really, really important to set the stage that if I'm in an argument,
if I'm in a passionate discussion with Michael Hartman, I still respect Michael Hartman as a
person and I value what you do. If I've got an issue, I've got an issue with the idea that you're
putting on the table, because I don't think you substantiated it, or an issue with the facts you put on the table,
but when we walk out of the room I will still respect Michael Hartman as a person and as a colleague.
So I think that, in the world we live in today, is an absolutely essential underpinning of having that conversation, because
it creates the safe space in the conversation if you can build the trust that that's fundamentally
who we are, what we do, and how we respect and value each other.
>>MR. O'CONNOR: Well, a lot of times I tell the safety and mission assurance people that their job is to be skeptical.
By definition, when they put on their safety badge they must be a skeptic; they have to be the
people on that pyramid that are asking, what if it doesn't work -- that was a pretty key piece of that
pyramid, that's their job. Well, that is a skepticism which sometimes, over a period of time,
depending on pressures, personality, what's going on at home and so on -- there's this line between
skepticism and cynicism. Skepticism to me is, hey, I don't know that I trust
what I just heard you say there; cynicism is, hey, I don't think I trust you personally,
and as soon as you get from that skeptical outlook to a cynical outlook
you become poison to the team and I think that's a tough line to draw. It takes supervisors
watching for it, takes peers saying you know you were kind of getting in there in that meeting you
were kind of putting one leg on the skepticism and one on the cynicism on that you might want
to take a day off or something because it's a tough thing to do you can really blow your
credibility as a safety guy sceptic, when you start pointing your finger
in somebody's face like they're the problem rather than their idea.
>>MR. RYSCHKEWITSCH: I will tell you that I've actually had to remove somebody from some important conversations
because they had gone way over the line on the cynical side, and that was a very hard thing to do, to make it clear to everybody
that it was because they were disrespecting the people and
had gone to full cynic, and not because of the ideas, and
that's a subtlety that's not easy to communicate. But frankly I was in a position where the other conversations
couldn't happen in the room as long as that person was in the room. So it was a judgment
call, but it's one of those things you hit in real life sometimes.
>>MR. O'CONNOR: I always thought a good example of where that line gets drawn is some of the stories
you heard about Reagan and Tip O'Neill. Those guys could be pretty intense about policy
and yet they'd go have a whiskey afterwards, you know, with each other,
because they were not cynics, and that's the way we need to be.
>>MR. ROGERS: That's a nice point to end on; more whiskey would help things go smoother in Washington.
>>MR. O'CONNOR: What do you know, it's happy hour time.
>>MR. ROGERS: And it is also happy hour time. Great question there at the end, Michael,
I appreciate that. So with your permission, I'll offer a few closing comments and we'll wrap things
up here; we are approaching the end of our day, and I know you've been sitting for quite a long time. So, just a few points to summarize
the day. First, I think it's pretty clear that we really can't succeed if we don't accept
that there's this thing called organizational silence and do something about it -- if not perfectly
nail it down to the floor once and for all, then at least address it and be willing to accept
that it's here and it sometimes rears its ugly head. We also sort of heard
that we need to err on the side of hearing people out sometimes, and accept that we might actually
waste some time from time to time hearing things out that may not have saved the mission,
but we needed to hear them out
because we have to respect the process; we don't know that next time we may need to hear that person out
and it may save our mission. So there will be a little bit of waste; we're not going to be able to
shave it so close, so faster, better, cheaper, that we'll never waste any time
looking at an issue that isn't critical. Some issues will come up, and how to judge that is part of being a
good leader, figuring out how to get that balance right. Almost every major mishap, I guess according to Mike,
except two, has had some element of silence involved, where people knew things
or had indications, or somehow, for some reason, the message wasn't effectively communicated to the right people,
in the right way, at the right time to avoid the accident.
There are many -- many types of silence that we heard about today; there isn't just one type.
We've addressed some aspects to bring attention to the subject,
but they're all actually really important,
even the ones we didn't discuss today. For example, we didn't discuss much at all about scientific transparency
in research; clearly that's a big issue that's very important to NASA and our role in society. Equally
important as the other topics we discussed today, personal harassment and silence in the
workplace is clearly important, where people fear retaliation or retribution. We can't do our other
work if people are worried that their behavior will cause retaliation.
We're also, I think, not so naïve as to think none of this happens at Goddard or NASA
and won't happen again. In some regards we learned, if anything, that we need to be vigilant, and we need to be willing to talk
and willing to open the conversations, to have people feel free to bring up issues
that they may not otherwise bring up, including the topic of bringing up issues itself, making
it discussible here at our workplace. So I noted three things I would like to leave us with to think about
that came up somewhat naturally, I think, as we were talking in this panel.
Confusion: communication is often at the top of the list of lessons learned, and I've read most of them as
well, and communication is always there, and a lot of these failures in communication are simply
around confusion, misinterpreted signals, lack of follow-through; someone thought they heard or
didn't hear, or they thought they had sent a message that wasn't received. So as Bryan aptly
pointed out, when you see confusion in communication, you become the person who
knows how to fix that, because you're the one who observed it, and neither party may actually
be observing; they may be caught up in their emotions at the moment, or a conflict that's
going on. If you observe it, you become the one who has the option of speaking up to that issue at
the moment and resolving that confusion, instead of reading later in a mishap report
about how simple confusion may have led to something. There's a question of certainty that came up. People aren't certain whether the issue
they have or the data they have is worthy of bringing forward. That's a risk; it doesn't show up in any of our charts,
but it's a risk of the work that we do every day. We don't always know the right things to communicate,
and the lesson is really that if we try to cut it really close, we're going to miss signals
that we really want to hear.
So it's kind of back to the point that we have to be willing to accept a little more information than maybe we
can handle so efficiently in our time, but it's not worth the risk of mission failure,
and we have to be willing to accept some of that uncertainty, which kind of goes to one of the points I think
Robin made: at some point you need some slack. If we run with no slack personally -- you know, every
day on our half-day workdays, you know, only 12 hours a day -- you don't have any slack;
there's no time to process, no time to decide and weigh the uncertainty, and things just get
moved over and moved on from. And the third one is this clipping of information: there's just so much information that rolls up in an organization
like NASA to people at the top, sitting in, say, Mike's seat, who have to decide: engineering, are you go?
His simple answer represents thousands of voices that have rolled up,
and this information gets clipped in the natural flow of information. Two people each have a top-ten list,
and the next level up needs one top-ten list; ten items are going to disappear just by the nature of the mathematics.
So we have to be aware that information naturally gets clipped; if some of that information is yours
and you knew more about it than what was being talked about, then it becomes your issue. It
got clipped not because someone didn't like you or didn't like your issue or voted it to be less important;
it's part of the process. Part of these observations are that we invent processes to help us get through all
this stuff; sometimes we need to be aware of those processes and be careful that the process
doesn't lead us. We own the process; it's still about us. If something needs to be discussed, then we make
sure it actually does get discussed -- that accountability that we heard about both from Andy
as well as from this panel. It's getting this balance right that makes it difficult,
and we've heard about the problems: if you get it right, you know, that is wonderful,
but if we get it wrong, then it's easy to point fingers quickly rather than saying thank you
for respecting the process, because we know our successes are in part due to us following
this process, which is sometimes messy. And sometimes we've taken little detours
to be sure we're covering the ground, and we should point that out to people rather
than taking the easy road of saying that wasn't necessary, that person wasted our time,
which sends a very different signal that we don't want to send and which we'll probably pay a
terrible price for later. I just wanted to end with some words that were written by someone
much better at poetry than me. Chris referred to him earlier; it's a song that was written some 40
years ago, and it seemed very appropriate to -- to us today, and it goes like this, and you all
know these words and this song, I'm sure. He said, Paul Simon, 40 years ago:
"In the naked light I saw ten thousand people, maybe more, people talking without speaking,
people hearing without listening, people writing songs that voices never share,
and no one dared disturb the sound of silence."
Today we have dared to disturb the sound of silence. We have not fixed it,
we have not solved this once and for all, but we have dared to disturb the sound of silence.
Thank you for your attention and your participation; I hope you have gained much, and you will now be responsible
for using what you've learned in your own environment and be that much better equipped.
This concludes our Goddard 2012 Sound of Silence workshop on organizational silence.
I would like to thank all of you for participating,
and thank you especially to our panel for helping us pull it together at the end. Thank you.