The study of information flow: A personal journey
Ron Westrum
Department of Sociology, Anthropology and Criminology, Eastern Michigan University, Ypsilanti, MI 48197, USA
Society and Risk, University of Stavanger, Norway
Keywords: Information flow; Culture; Organizational quality of life; Employee empowerment; Hidden events

Abstract
Information flow has been shown to be a key variable in system safety. Not only is information flow vital to the organization's ‘‘nervous system,’’ but it is also a key indicator of the quality of the organization's functioning. The author describes how his personal trajectory took him from the study of social information about anomalous events to the role of information in causing or preventing technological accidents. The important features of good information flow are relevance, timeliness, and clarity. Generative environments are more likely to provide information with these characteristics, since they encourage a ‘‘level playing field’’ and respect for the needs of the information recipient. By contrast, pathological environments, caused by a leader's desire to see him/herself succeed, often create a ‘‘political’’ environment for information that interferes with good flow.
© 2014 Elsevier Ltd. All rights reserved.
1. Introduction
The role of information in making systems safe is profound. Not only is information flow a prime variable in creating safety, but it is also an indicator of organizational functioning. By examining
the culture of information flow, we can get an idea of how well
people in the organization are cooperating, and also, how effective
their work is likely to be in providing a safe operation.
My interest in information flow is the result of a lifetime of professional work and theorizing. My studies on information flow go back into my undergraduate years, and weave through my graduate work at Chicago and later the RAND Corporation. But my research wasn't initially focused on safety. In the early 1970s I was
studying the role that information played in scientists’ decisions
about anomalous events. This was the result of a fascination with
such unusual events as UFOs and meteorites (Westrum, 1978).
Why didn’t information about anomalies flow to the people who
needed it? In my studies, I discovered, early on, that scientists often were unaware of their own biases in making decisions about unusual events. For instance, they often assumed that they would be ‘‘the first to know’’ about anomalous events, but in fact they often remained ignorant. They were unaware that their own biases
interfered with search and with information flow. Thus, they often
suffered from the ‘‘fallacy of centrality’’ (my term), thinking that
they were critical in the flow of information about anomalies,
but in fact they were often left out.
Eventually I realized that anomalies were often what I learned to call ‘‘hidden events’’ (Westrum, 1982, 1986). Hidden events were things that, in the words of Raymond Moody, were ‘‘very widely experienced but very well hidden’’ (Moody, 1975). Hidden events might lie beneath the surface of social radar, because of the ‘‘pluralistic ignorance’’ (Katz and Allport, 1931) of those who experienced them. Pluralistic ignorance is the reluctance to report
something no one else seems to be reporting. So of course, if others
are not reporting, then you don’t want to report, either; no one
wants to speak up. Similar forces kept silent the many victims of sexual abuse by the British celebrity Jimmy Savile, each of whom thought he or she would be alone in speaking up, when in fact the victims numbered in the hundreds (Burns and Cowell, 2013; Burns and Castle, 2012). But these disparate pieces of data were not put together.
In 1978–1979, I had a sabbatical year at the Science Studies Unit
of the University of Edinburgh. During the 8 months or so I spent
there, I was doing two things. The first was the preparation of
my book on the sociology of hidden events. I was trying to put together everything that was known about how society makes decisions about anomalous events. I had started with UFOs, but soon branched off into other areas. For instance, while taking a course on ‘‘Forensic Medicine for Lawyers,’’ taught by Professor John Mason, an expert on air crashes, I first heard about the ‘‘battered child syndrome.’’ Aha, I thought, another ‘‘hidden event,’’ like meteorites and UFOs! The book on hidden events was not fated to be published, but it did stimulate further research.
While taking the Forensic Medicine course, I began to stop over at the Medical Library, and shortly encountered the periodical Aviation Psychology and Environmental Medicine. It was in this magazine that I found the first articles I had read on the social
psychology of the airliner cockpit. Some were written by NASA
psychologist John Lauber, with whom I was later to correspond.
These articles engaged me, but I was still researching the hidden
events dynamics, and would later sum up my research in a paper
on ‘‘Social Intelligence about hidden events: Its implications for
scientific research and social policy’’ (Westrum, 1982). But ‘‘hidden events’’
seemed to engage few besides myself, and my research did nothing
to advance my career. Other sociologists of science urged me to
switch to a less controversial topic.
And gradually, that happened. I first began to realize that the framework I had slowly erected to study hidden events might be useful for the study of technological accidents. And about this time I had another visiting professorship at the University of Hawaii. While shopping in its bookstore I noticed Charles Perrow's book Normal Accidents (1984). This broke like a phosphorus shell over my head—as I imagine it broke over the heads of others. It suggested that accidents were amenable to sociological analysis.
About the same time I also discovered the writings of James Reason
on human error and accidents. This seemed to be a promising area,
and I dove into it. Lauber, Perrow, and Reason all seemed to be onto
something important.
Then in 1986 there was a conference on aviation safety in Bad
Windsheim, Germany. I got invited through a funny turn of events.
I had sent some very half-baked preprints to Perrow, who was invited to this conference. But he couldn't go, so he suggested that the sponsors invite me. When I got there, I discovered I was to take Perrow's place! I certainly was not up to Perrow at that point, but at this conference were Jim Reason, and also Irving Janis, both giants, in my estimation (Wise et al., 1994). In 1986 I was a minnow. However, I rose to the challenge, and shortly began to be invited to other conferences. (Conferences are like pinball games:
You win one, you get to play another. You win that, you get invited
to yet another, and so on.)
And then I had a real breakthrough. In October 1988 the World Bank held a large conference on ‘‘Safety Control and Risk Management.’’ It seemed as if this was intended to be the be-all and end-all of conferences on human factors in system safety. Participants had to prepare papers for the conference, and I was working on mine one evening when I suddenly got a string of insights that were extremely helpful. First of all, I realized that there was a continuum of safety cultures that fell into three general categories: pathological, bureaucratic, and generative.
Pathological organizations are characterized by large amounts
of fear and threat. People often hoard information or withhold it
for political reasons, or distort it to make themselves look
better.
Bureaucratic organizations protect departments. Those in the
department want to maintain their ‘‘turf,’’ insist on their own
rules, and generally do things by the book—their book.
Generative organizations focus on the mission. How do we
accomplish our goal? Everything is subordinated to good performance, to doing what we are supposed to do.
This classification (and derivations of it) would become widely
used.
2. Elaboration of the three cultures model
James Reason and Patrick Hudson have suggested expanding my classification of cultures into a five-part version, essentially by placing ‘‘reactive’’ between ‘‘pathological’’ and ‘‘bureaucratic,’’ and ‘‘pro-active’’ between ‘‘bureaucratic’’ and ‘‘generative’’ (e.g. Hudson, 2007). Some have felt that this was a useful improvement. I believe that there might be two valid reasons for doing this. The first would be theoretical: that there might be some theoretical advantage to this more articulated scale. The second would be empirical: that there is some empirical stimulus for such an articulation. However, I have seen neither theoretical arguments nor empirical evidence for doing so.
When I originally created the concept, I thought of pathological,
bureaucratic, and generative as ideal types. However, over time I
have come to think of the three as forming points on a scale, what
one friend of mine, Russell Briggs, calls the ‘‘Westrum continuum.’’
Why ‘‘bureaucratic’’ should be in the middle might be questioned,
but there are many case studies that show that this is a distinct
state of affairs. However, no such case studies exist, I believe, to support the two proposed additional types. There certainly must be intermediate points between, e.g., pathological and generative; something can be more pathological or less pathological, more or less generative, etc. In specific empirical examples, it might be valuable to name the intermediate points, but you could also argue that this is a continuous scale, and numbering the points from, say, one to ten would work just as well.
It might seem small-minded to refuse the Reason/Hudson elaboration, rather like Marx saying ‘‘je ne suis pas un Marxiste’’ (‘‘I am not a Marxist’’), as he did in regard to some of his followers. I believe that the two scholars first mentioned it to me in the 1980s, and I didn't object. But time has passed, and the additional arguments and data have not come forth. Professor Dianne Parker of the University of Manchester has done some work using these five types (e.g. Parker et al., 2006), and asked a sample of persons in her business seminars to describe how they would put organizational cultures into the five categories. In looking over her materials, however, I discovered that the behaviors her survey participants imagined did not correspond to the behaviors I had found in the case studies I did. In particular, the ‘‘pathological’’ examples did not correspond to what I see as being genuinely pathological environments; real pathological organizations are much worse. I am not saying that Reason and Hudson are wrong; I am simply saying that I am unconvinced that this is a genuine improvement. I believe it is up to the aforementioned gentlemen to show the value they see in the classification they have. This is a task yet to be accomplished.
Related to my classification of culture was another idea. These cultural styles are associated with how the organization is likely to respond to information that things are not going well. We might see a set of ways in which an organization might respond to anomalous information. There were six ways:
(1) First of all, the organization might ‘‘shoot the messenger.’’
(2) Second, even if the ‘‘messenger’’ was not executed, his or her
information might be isolated.
(3) Third, even if the message got out, it could still be ‘‘put in
context’’ through a ‘‘public relations’’ strategy.
(4) Fourth, maybe more serious action could be forestalled if
one only fixed the immediately presenting event.
(5) Fifth, the organization might react through a ‘‘global fix’’
which would also look for other examples of the same thing.
(6) Finally, the organization might engage in profound inquiry,
to fix not only the presenting event, but also its underlying
causes. The scale of reactions might appear like this:
Suppression → Encapsulation → Public Relations → Local Fix → Global Fix → Inquiry
Each of these reactions was associated with the cultural types. Suppression seemed to be a type marker for pathological environments. Bureaucratic climates tended to use encapsulation, public relations and local fixes. Generative environments were more likely to engage in global fixes and inquiry. These ideas about reactions occurred to me over a few hours, though the work on culture and information flow had been longstanding. Yet bringing it together changed my perspective. Sometimes nothing is as valuable as a theory uniting disparate things.
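As a purely illustrative aid (mine, not part of the original paper), the reaction scale and its typical cultural associations can be written down as a small data structure. The Python names, the voting rule, and the tie-break below are assumptions introduced only to show how the mapping might be operationalized, not a validated instrument.

```python
# Illustrative sketch of the six reactions to anomalous information and their
# typical association with the three cultures, as described in the text.
from enum import IntEnum


class Culture(IntEnum):
    PATHOLOGICAL = 1
    BUREAUCRATIC = 2
    GENERATIVE = 3


# The six reactions, ordered from least to most constructive.
REACTIONS = [
    "suppression",
    "encapsulation",
    "public relations",
    "local fix",
    "global fix",
    "inquiry",
]

# Typical association of each reaction with a cultural type.
TYPICAL_CULTURE = {
    "suppression": Culture.PATHOLOGICAL,
    "encapsulation": Culture.BUREAUCRATIC,
    "public relations": Culture.BUREAUCRATIC,
    "local fix": Culture.BUREAUCRATIC,
    "global fix": Culture.GENERATIVE,
    "inquiry": Culture.GENERATIVE,
}


def likely_culture(observed_reactions):
    """Guess the dominant culture from a list of observed reactions to bad news."""
    votes = [TYPICAL_CULTURE[r] for r in observed_reactions if r in TYPICAL_CULTURE]
    if not votes:
        return None
    # Majority vote; ties resolve toward the lower (less generative) end.
    return min(set(votes), key=lambda c: (-votes.count(c), c))


if __name__ == "__main__":
    print(likely_culture(["local fix", "public relations", "encapsulation"]))
    # -> Culture.BUREAUCRATIC
```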
In great contrast to the tepid reactions to my work on hidden events, the ‘‘three cultures’’ idea caught on quickly, especially after Jim Reason cited it in his classic Human Error (1990). (Shortly after, I met a French colleague, Jean Paries, for the first time, and he asked me if I was the ‘‘famous’’ Ron Westrum!) And information flow seemed to bring together not only culture, but also management style. Later I would suggest that pathological environments are caused when the leader puts stress on his/her own advancement and perquisites (Westrum, 2004). By contrast, bureaucratic environments come about when leaders put departmental goals ahead of organizational ones. And generative environments focus on the organization's mission above all else.
But how, in fact, does a theory of ‘‘information flow’’ unite so
many disparate currents? Let us reflect for a few moments and
see how it works. Recall, there are really two critical reasons for
attending to information flow.
(1) First, when information does not flow, it imperils the safe
and proper functioning of the organization.
(2) Second, information flow is a powerful indicator of the orga-
nization’s overall functioning.
3. Information flow as a vital resource
Organizations function on information. Stop the information
and the organization stops, too. Better organizations require better
information. Worse information flow leads to worse functioning; better information flow leads to better functioning.
The words ‘‘information flow’’ suggest water moving through a
pipeline. Yet the mere presence of information tells us little about its character. More information is not necessarily better. Good information has these characteristics:
(1) It provides answers to the questions that the receiver needs
answered.
(2) It is timely.
(3) It is presented in such a way that it can be effectively used
by the receiver.
These sound like simple criteria, but they are in fact often very difficult to meet in practice. Take the first one, for instance, which is often violated. Getting the right information to the right person is absolutely critical. But the underlying issue is that the information should respond to the needs of the receiver, not the sender. Receiver-focused information is a powerful sign that the organization is engaging in teamwork. Yet, often, information may instead serve a political purpose: to protect the sender (‘‘covering one's posterior,’’ in common language), to provide the illusion of cooperation, or, in the worst case, to baffle or deceive the receiver.
Timeliness is another desideratum. The same information, presented a day later or a second later, may be useless. We want to be updated in a timely way; otherwise we are ‘‘out of the loop.’’ Information is used to make decisions. If it is late, it may put us in a bad way versus an adversary, it may blind us to a current danger, or it may lead to a wrong decision. Cf. Col. John Boyd's famous idea of staying ‘‘inside the loop’’ of an opponent, so that one's own decision cycle is faster (Coram, 2002). But more often, it is simply the needs of the system that drive timeliness. Whether driving a truck, building a building, or operating a search-and-rescue system, information need is driven by the reality that decisions are constantly being made, and they need to be informed by the best information.
And then there is the signal-to-noise ratio. The information we get consists not only of the facts we need to know, but many we do not need to know. Has the sender offered us a ‘‘bill of lading’’ telling us the value of the information we have been sent, i.e., why it has been sent, the reliability of the source, the relevance to current problems, etc.? (A ‘‘bill of lading’’ is an old maritime term for the list of cargo a merchant has consigned to a steamer for shipment; I find the analogy useful.) If not, we have to do the sifting, trying to figure out what is important and what is not. A useful analogy is an operation carried out by the Emergency Department in a hospital, where confusion, multi-tasking, and constant pressure exist. We need to get the information that answers our questions, we need to get it right now, and we need to avoid having the information fuzzed or confused by shifting attention and inappropriate prompts. To have a ‘‘mind like the moon,’’ one that shines serenely on everything, is demanding, but necessary (Westrum, 2009a,b).
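To make the three criteria concrete, here is a minimal sketch (again mine, not the paper's) of how one might screen an incoming message against relevance, timeliness, and clarity. The field names, the 24-hour default, and the all-or-nothing rule are illustrative assumptions.

```python
# Minimal sketch of the three criteria for good information described above:
# (1) it answers the receiver's questions, (2) it is timely, (3) it is usable
# by the receiver as presented. Names and thresholds are illustrative.
from dataclasses import dataclass


@dataclass
class Message:
    answers_receiver_questions: bool  # relevance, judged from the receiver's side
    delay_hours: float                # time between the event and delivery
    usable_as_presented: bool         # clarity / signal-to-noise for the receiver


def is_good_information(msg: Message, max_delay_hours: float = 24.0) -> bool:
    """Return True only if all three criteria are met (an assumed AND rule)."""
    return (
        msg.answers_receiver_questions
        and msg.delay_hours <= max_delay_hours
        and msg.usable_as_presented
    )


if __name__ == "__main__":
    report = Message(answers_receiver_questions=True, delay_hours=2.0,
                     usable_as_presented=False)
    print(is_good_information(report))  # False: relevant and timely, but buried in noise
```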
A particularly interesting example of this clarity was the communication of the information regarding the NASA lunar orbit decision. Engineer John Houbolt, then director of Dynamic Loads at the Langley Research Center, successfully convinced NASA that the Apollo program needed a lunar orbit solution. He was sure that the only right way was to use a ‘‘lunar orbit’’: a spaceship circling the moon would send down a ‘‘lander’’ to the moon's surface. Other methods being considered were a large ‘‘Nova’’ rocket that could land on the moon and then take off, and an ‘‘earth orbit’’ solution, in which an earth-circling space station would send a ship off to the moon. The immediate reception of Houbolt's idea inside NASA was largely negative. Maxime Faget, one of the ‘‘maestros’’ of NASA's technological culture, suggested that Houbolt's numbers would never work out. Von Braun still liked his ‘‘Nova’’ idea. Houbolt, though, went ‘‘back to the drawing board,’’ and then came back with somewhat different numbers. This proposal, too, was rejected. But Houbolt kept coming back with better proposals. Eventually, Wernher von Braun would say that clearly Houbolt was right. Everyone was ready to consider all the possibilities, because everyone wanted to fulfill Kennedy's promise to land on the moon before the end of the 1960s. It was this vision, to fulfill the Kennedy promise, that got people who were initially hesitant to agree finally that Houbolt's idea was the only one that would work in the time frame (Hansen, 1995).
What is important about the decision is that originally, Houbolt
wasn’t one of the major ‘‘players’’ in the moon shot decision. How-
ever, because of the value of his concept, and his persistence, he
was able to call attention to the work that he was doing, and he
had more than one chance to present it. Sometimes problems need
a champion like Houbolt, who will insist on others paying attention
to his idea.
In a similar (though of course much less important) instance, Col. Jack Broughton describes becoming a safety champion in regard to the problems of the F-106 ejection seat in 1964. Broughton, as an Air Force squadron commander, had had many of his pilots killed when the ejection seat on the F-106 failed. Even though the seat had killed over a dozen pilots, General Herbert Thatcher, the four-star commander of the Air Defense Command, did not want the F-106s grounded; they provided a needed service as bomber interceptors. But after 14 pilots had died, Broughton put his career on the line, confronted General Thatcher in a forceful way, and forced him to ground the defective fighter. Thatcher, of course, could have punished Broughton for insubordination, but he understood that the passion Broughton exhibited testified to a loyalty to the pilots rather than to simple rebellion. Hughes then fixed the seat (Broughton, 2008).
Summing all this up, we might look at the organization as a
whole. In regard to the information inside the organization, how
much of it can the organization actually access? Suppose we could
give some general parameter that might represent the ability of the
organization to use some piece of information it has, regardless of
who has it, why it has been acquired, and how it is labeled. Can we
use what we have got? Do we know what we have got? And the
answer is frequently ‘‘no, we can’t’’ or, ‘‘no, we don’t.’’ We can’t
use it because A doesn’t consider herself on the same team as B.
We can’t use it because it would reveal something embarrassing
about the maintenance department or admit some organizational fault bad for morale. We can't use it because information from that
particular source is low-status, so we don’t listen, etc.
Let’s call this parameter IF. Pathological organizations have low
IF, bureaucratic organizations have middling IF, and generative
organizations have high IF. That means that if you ask a pathological organization to use its information, it will have big problems
doing that. And it means that often generative organizations will
succeed where pathological organizations fail, because the former
are better at utilizing the information they have. An organization’s
ability to use the information it has is a most important feature.
And it also reflects the general quality of performance in the
organization.
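As a rough illustration only (mine, not the paper's), one could imagine scoring IF from answers to the kinds of questions posed above. The questions, the scoring rule, and the thresholds in the sketch below are assumptions for the sake of example, not a validated measure.

```python
# Illustrative sketch: a crude ordinal score for "IF", the organization's
# ability to access and use the information it already has. The questions and
# thresholds are assumptions, not an instrument from the paper.

DIAGNOSTIC_QUESTIONS = [
    "Do we know what information we have got?",
    "Can we use it even when it crosses team boundaries?",
    "Can we use it even when it embarrasses a department?",
    "Do we listen to it even when the source is low-status?",
]


def if_level(yes_answers: int, total: int = len(DIAGNOSTIC_QUESTIONS)) -> str:
    """Map the fraction of 'yes' answers onto the low / middling / high IF bands."""
    fraction = yes_answers / total
    if fraction >= 0.75:
        return "high IF (generative pattern)"
    if fraction >= 0.40:
        return "middling IF (bureaucratic pattern)"
    return "low IF (pathological pattern)"


if __name__ == "__main__":
    print(if_level(1))  # -> low IF (pathological pattern)
    print(if_level(3))  # -> high IF (generative pattern)
```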
4. Information flow as an indicator
If we can agree that the ability to use its information is important to an organization, it is also true that this ability says other things about the organization. This is less obvious, but it is no less true. Information flow reflects culture. So what people in organizations do with information reflects a number of other things.
The first thing it reflects is cooperation. Information tends to flow well when people cooperate, because cooperation and information flow both respond to trust. Where there is trust, there is more cooperation. And trust also means that people share information. By contrast, when there is no trust (e.g. in pathological environments) information becomes a political commodity. It is used to help friends, but also to harm enemies.
Trust is established when ‘‘the walk matches the talk,’’ one of
the best indicators of a generative environment. Leaders who are
honest and forthright create an environment in which people will
talk about their concerns. This has further impacts in getting people to report the ‘‘latent pathogens’’ (Reason, 1990; Westrum, 2009a,b) and increases the willingness to consult when puzzling events happen. When people consult more, their grasp of the technical issues increases, and they are less likely to do things they do not know are dangerous. A lack of such consulting was a major factor in the Tokai-Mura nuclear accident in Japan. In that event, workers engaged in work-arounds that were technically unwise, leading to a radiation accident (Westrum, 2000).
The second thing information flow reflects is the quality of decision-making. Good decisions are made on more complete information; but also, once a decision is made, it is more open to examination, because openness and transparency are both part of information flow. If people make bad decisions in an open organization, those decisions become more obvious—and in some cases can be reversed. By contrast, when decisions take place in secret, not only are they often worse, but they are harder to change. A bad decision in a closed environment is not only a mistake, it is a failing. The decision-maker can be attacked for making it. So it is often covered up or denied.
A perfect example of the latter is seen in New York Mayor Rudolph Giuliani's decision to fire Richard Murphy, an outstanding youth administrator who had created the ‘‘beacon schools’’ project. Giuliani had accused Murphy of favoritism and corruption. After those charges were proven false, Giuliani's response was not to acknowledge that he had made a mistake, but to suggest that the mistake was a minor matter:
Mr. Giuliani shrugged. ‘‘This happens all the time,’’ he said. ‘‘And you write about those things all the time. Sometimes they turn out to be true. And sometimes they turn out to be wrong.’’ (Dwyer, 2013)
But the accusation wasn’t a minor matter. Giuliani had said ‘‘My
immediate goal is to get rid of the stealing, to get rid of the corrup-
tion.’’ He ruined Murphy’s reputation, which never quite recovered,
though Murphy went onto do many other good things. (Martin and
Murphy, 2013; Dwyer, 2013) This kind of political assassination is,
unhappily, typical of pathological environments.
Third, information flow reflects the quality of life in that organization. An organization where information flows well is also one that, I believe, does a better job with its people. Certainly, a prime example of a generative organization would be Southwest Airlines, one of the most popular employers in the USA (Freiberg and Freiberg, 1998). It is also one of the best airlines in terms of its promotion of people. And it promotes everybody, not just the elite few. This is in striking contrast to, say, Microsoft or General Motors, where policies promote only those on the star track (GM) or the top, say, 10% (Microsoft) (see Eichenwald, 2012). It is also in striking contrast to American Airlines, which in many respects seems to exhibit a pathological culture (Gittell, 2002). A culture similar to
Southwest's was seen on the US destroyer Benfold during the tenure of Captain Michael Abrashoff, who took what was apparently a pathological culture and turned it into a generative one (Abrashoff, 2002). Information flow was one of the benefits of this generative environment. Not only were the ship's ‘‘latent pathogens’’ quickly identified and fixed, but the openness led to solving a technological communication problem that had long hobbled the Pacific Fleet.
What is it about generative culture that makes information flow? Actually, generative culture supports IF for three reasons. First of all, there is the trust issue, mentioned earlier. Second, generative culture emphasizes the mission, an emphasis that allows the people involved to put aside their personal issues and also the departmental issues that are so evident in bureaucratic organizations. The mission is primary. And third, generativity encourages a ‘‘level playing field,’’ in which hierarchy plays less of a role (e.g. Edmondson, 1996). Thus one can have a ‘‘boundaryless organization,’’ to use a phrase popular at General Electric (Ashkenas et al., 1995). Because hierarchy is de-emphasized, most of the problems that attend getting information to flow up hierarchies disappear. Information flow, then, is testimony to the quality of worklife, because a bad quality of worklife will interfere with flow.
Another feature of generative environments is getting information to flow across the internal boundaries of the organization. When every department competes with other departments, e.g. in a bureaucratic environment, sharing information with other departments is a problem. Yet often the ‘‘faint signals’’ that something is amiss are perceived by the ‘‘wrong person’’ organizationally. There is a very interesting anecdote about Charles Franklin (‘‘Boss’’) Kettering in Douglas McGregor's book (1966, pp. 118–119). He mentions that Kettering one day had gone down to the shipyard to see about the installation of one of his diesel engines. While there, he noticed a painter looking at the ship's propeller. He asked the painter why he was staring at the propeller, and the painter offered that he thought the propeller was too large by five inches. Concerned that something might be wrong, Kettering called the design office and asked them to come down and measure the propeller, which they claimed they had already done. When the propeller was re-measured, however, the painter turned out to be correct.
And then we have the famous ‘‘bottle of champagne’’ story about Wernher von Braun. Since this special issue concerns the ‘‘Foundations of Safety Science,’’ I am going to quote the original source of this story, von Braun himself.
‘‘One of our early Redstone missiles developed trouble in mid-
flight. The telemeter records indicated that the flight had been
flawless up to that instant, and permitted us to localize the
probable source of trouble. However, the suspected area had
been very carefully checked in numerous laboratory tests so
that all explanations sounded highly artificial.
Several theories were advanced. Finally one theory was
accepted as most likely and remedial action on it was initiated.
At this point an engineer who was a member of the firing group
called and said he wanted to see me. He came up to my office
and told me that during firing preparation he had tightened a
certain connection just to make sure that there would be good
contact.
While doing so, he had touched a contact with a screwdriver
and drawn a spark. Since the system checked out well after this
incident, he hadn’t paid any attention to the matter. But now
everybody was talking about a possible failure in that particular
apparatus, he just wanted to tell me the story, for whatever it
was worth. A quick study indicated that this was the answer.
Needless to say, the ‘‘remedial action’’ was called off and no
changes were made.
I sent the engineer a bottle of champagne because I wanted
everybody to know that honesty pays off, even if someone
may run the risk of incriminating himself. Absolute honesty is
something you simply cannot dispense with in a team effort
as difficult as that of missile development.’’ (von Braun, 1956,
p. 41)
And in fact von Braun was an exemplary leader when it came to
seeing that information flowed up, down, and across at Marshall
Space Flight Center (Tompkins, 1993). By contrast, in pathological organizations, information flow is problematic. First of all, an environment of fear and dread makes people cautious about what they say. People try to minimize risks. Second, the emphasis is on providing additional power and glory to the chiefs. Whatever does not serve this end is less important. And third, hierarchy provides strong incentives not to ‘‘speak up,’’ to avoid provoking retaliation.
For instance, the original reason for creating ‘‘cockpit resource
management’’ seemingly had to do with encouraging co-pilots to
speak up to their pilots. But ‘‘speaking up’’ is discouraged when
blame and punishment are a major element in the environment,
or when differences in power and position are great (Stein, 1967).
Problems with hierarchy can become much worse when the
manager is a bully, and is willing to seek severe sanctions against
those unwilling to ‘‘get in line.’’ The candidacy of John R. Bolton for American ambassador to the United Nations allowed a rare glimpse of a public figure who seems routinely to have used bullying to pressure unwilling subordinates and associates. Bolton
sought to punish, by threat and removal from office, those under
his management control (and even those not) who defied his
wishes. Several officials testified to Bolton’s threats and attempts
to have them dismissed when they had a different opinion than
his own and to his willingness to override their ideas in favor of
his own (Jehl, 2005).
Another special problem, when powerful people are involved, is
that certain subjects become ‘‘off-limits.’’ This is very common, for
instance, in sexual abuse cases. We have already mentioned the
case of Jimmy Savile at the BBC (see also Dintner, 2003). ‘‘Off-limits’’ was also an operative factor in the disturbing Penn State child abuse case, in which protecting an athletic team and its coach was
more important than the safety of young people (Belson, 2012).
And of course it was also involved in virtually all the child abuse
cases within the Catholic Church (Berry, 1992; Dintner, 2003), and in many other churches and religious organizations as well.
But power also operates to bottle up information in frankly political situations, for instance in authoritarian or totalitarian regimes, when certain facts are ‘‘inconvenient’’ (French, 2000). During the ‘‘Great Leap Forward’’ in China, a famine was raging, but information about it was encapsulated because it was politically incorrect. If local leaders came forward, they would be accused of being ‘‘right deviationists’’ or food hoarders, and would suffer brutal punishments or death. Lower-level leaders, eager to appear proper communists, lied about the harvest results, with the outcome that little about the reality appeared on the surface. In addition, the party cadres often had food when the peasants did not, decreasing their motivation to raise the issue. In the end, 38 million people would die during the famine. It would be years before the truth was generally known (Jisheng, 2012).
5. Conclusion
What I have tried to establish here is that the safety scientist must necessarily be concerned with information flow. Not only is information flow the lifeblood of the organization's nervous system, but it is also a powerful indicator of other processes within the organization. I must admit that, early on, I did not appreciate how valuable IF analysis was. But its value became clear as more and more fellow scholars, and their industries as well, adopted and used these concepts in evaluating organizations and operations, from the emergency department to the flight deck, from the nuclear power console to the oil platform. I remember Jim Reason asking me, in the late 1980s, ‘‘Why do you think information is so important?’’ I don't think anyone would ask that question now.
Information flow is deeply linked to the safety culture of the organization (cf. Curry et al., 2011). Indeed, many others have proposed and used other measures of safety culture, notably Pronovost (2010) in relation to infection control. Furthermore, others have modified—i.e. elaborated—my own three categories into more. As I have indicated, I think the value of this elaboration is shaky. No doubt there is a ‘‘Westrum continuum.’’ In fact one might say that, as one goes from pathological to generative, the organization becomes more mission-oriented and less ‘‘personal.’’ Identifying sub-categories might be helpful in particular contexts. I am not convinced that there is a theoretical reason to do so.
In summing up, culture is no longer neglected. Information flow is of course only one issue among many in safety culture, but I feel it is a royal road to understanding much else.
References
Abrashoff, M., 2002. It’s Your Ship: Management Lessons from the ‘‘Best Damn Ship
in the Navy’’. Warner Books, New York.
Ashkenas, R., Ulrich, D., Jick, T., Kerr, S., 1995. The Boundaryless Organization:
Breaking the Chains of Organizational Structure. Jossey-Bass, San Francisco.
Belson, K., 2012. Abuse Scandal Damns Paterno and Penn State. New York Times, 13
July, pp. A1, B13.
Berry, J., 1992. Lead Us Not Into Temptation: Catholic Priests and Sexual Abuse of
Children. Doubleday, New York.
Broughton, J., 2008. Rupert Red Two: A Fighter Pilot's Life from Thunderbolts to Thunderchiefs. Zenith Press, St. Paul.
Burns, J.F., Castle, S., 2012. BBC’s Leaders Faulted as Lax in Handling Sex Abuse
Crisis. New York Times, 20 December.
Burns, J.F., Cowell, A., 2013. Report Depicts Horrific Pattern of Child Sexual Abuse by
BBC Celebrity. New York Times, 12 January, p. A8.
Curry, L. et al., 2011. What distinguishes top-performing hospitals in acute myocardial infarction mortality rates? Ann. Intern. Med. 156 (6), 384–390.
Dintner, P.E., 2003. A Catholic Crisis, Bestowed from Above. New York Times, 1
January, A19.
Dwyer, J., 2013. In the End, He Stole Nothing and Gave Plenty. New York Times, 19
February.
Coram, R., 2002. Boyd: The Fighter Pilot who Changed the Art of War. Little Brown,
New York.
Edmondson, A.C., 1996. Learning from Mistakes Is Easier Said than Done. J. Appl.
Behav. Sci 32 (1), 5–28.
Eichenwald, K., 2012. Microsoft’s Lost Decade. Vanity Fair, August.
Gittell, J., 2002. The Southwest Airlines Way: Using the Power of Relationships to
Achieve High Performance. McGraw-Hill, New York.
Freiberg, K., Freiberg, J., 1998. Nuts! Southwest Airlines’ Crazy Recipe for Business
and Personal Success. Broadway Books, New York.
French, H.W., 2000. Japan Debates Culture of Covering Up. New York Times, 2 May, p. A12.
Hansen, J.R., 1995. Enchanted Rendezvous: John C. Houbolt and the Genesis of the Lunar Orbit Rendezvous Concept. Monographs in Aerospace History #4. NASA History Office, Washington, DC.
Hudson, P., 2007. Implementing a safety culture in a major multinational. Saf. Sci.
45, 697–722.
Jehl, D., 2005. 3 Ex-officials Describe Bullying by Bolton. New York Times, 3 May, p. A10.
Jisheng, Y., 2012. Tombstone: The Great Chinese Famine, 1958–1962. Farrar, Straus and Giroux, New York.
Katz, D., Allport, F., 1931. Student Attitudes. Craftsman Press, Syracuse, NY.
McGregor, D., 1966. Leadership and Motivation. MIT Press, Cambridge.
Martin, D., 2013. Richard Murphy, 68; Aided Disadvantaged Youths. New York Times, 16 February.
Moody, R., 1975. Life After Life. Bantam, New York.
Parker, D. et al., 2006. Framework for understanding the development of
organizational safety culture. Saf. Sci. 44, 551–562.
Perrow, C., 1986. Normal Accidents: Living with High Risk Technologies. Basic, New
York.
Pronovost, P., 2010. Safe Patients, Smart Hospitals. Penguin, New York.
Reason, J., 1990. Human Error. Cambridge University Press, New York.
Stein, L.I., 1967. The doctor-nurse game. Arch. Gen. Psychiatry 16 (6), 699–703.
Tompkins, P.K., 1993. Organizational Communication Imperatives: The Lessons of
the Space Program. Roxbury Publishing, Syracuse, NY.
von Braun, W., 1956. Teamwork: Key to Success in Guided Missiles. Missiles and
Rockets, October, pp. 38–42.
Westrum, R., 1986. The Blind Eye of Science. Whole Earth Review, Fall, pp. 36–41.
Westrum, R., 1978. Science and social intelligence about anomalies: the case of meteorites. Soc. Stud. Sci. 8, 461–493.
Westrum, R., 1982. Social intelligence about hidden events: its implications for
scientific research and social policy. Knowl.: Creat., Diffus., Util. 3 (3), 381–400.
Westrum, R., 2000. Safety planning and safety culture in the JCO criticality accident.
Cogn. Technol. Work 2, 240–241.
Westrum, R., 2004. A typology of organizational cultures. Qual. Safety Health Care
13 (Suppl. II), ii22–ii27.
Westrum, R., 2009a. Information flow and problem-solving. In: Cosby, K.,
Crosskerry, P., Wears, R. (Eds.), Handbook of Patient Safety in Emergency
Medicine. Lippincott Williams, Philadelphia.
Westrum, R., 2009b. Hearing Faint Signals. Presentation at the Petroleum Safety Authority, Stavanger, Norway, May 15, 2009.
Wise, J. et al. (Eds.), 1994. Verification and Validation of Complex Systems: Human
Factors. Springer, New York.