The study of information flow: A personal journey
Ron Westrum
Department of Sociology, Anthropology and Criminology, Eastern Michigan University, Ypsilanti, MI 48197, USA
Society and Risk, University of Stavanger, Norway
Keywords: Information flow; Culture; Organizational quality of life; Employee empowerment; Hidden events
Abstract
Information flow has been shown to be a key variable in system safety. Not only is information flow vital
to the organization’s ‘‘nervous system,’’ but it is also a key indicator of the quality of the organization’s
functioning. The author describes how his personal trajectory took him from the study of social informa-
tion about anomalous events to the role of information in causing or preventing technological accidents.
The important features of good information flow are relevance, timeliness, and clarity. Generative envi-
ronments are more likely to provide information with these characteristics, since they encourage a ‘‘level
playing field’’ and respect for the needs of the information recipient. By contrast, pathological environ-
ments, caused by a leader’s desire to see him/herself succeed, often create a ‘‘political’’ environment
for information that interferes with good flow.
© 2014 Elsevier Ltd. All rights reserved.
1. Introduction
The role of information in making systems safe is profound. Not
only is information flow a prime variable in creating safety, but
also it is an indicator of organizational functioning. By examining
the culture of information flow, we can get an idea of how well
people in the organization are cooperating, and also, how effective
their work is likely to be in providing a safe operation.
My interest in information flow is the result of a lifetime of pro-
fessional work and theorizing. My studies on information flow go
back into my undergraduate years, and weave through my gradu-
ate work at Chicago and later the RAND Corporation. But my re-
search wasn’t initially focused on safety. In the early 1970s I was
studying the role that information played in scientists’ decisions
about anomalous events. This was the result of a fascination with
such unusual events as UFOs and meteorites (Westrum, 1978).
Why didn’t information about anomalies flow to the people who
needed it? In my studies, I discovered, early on, that scientists of-
ten were unaware of their own biases in making decisions about
unusual events. For instance, they often assumed that they would
be ‘‘the first to know’’ about anomalous events, but in fact they of-
ten remained ignorant. They were unaware that their own biases
interfered with search and with information flow. Thus, they often
suffered from the ‘‘fallacy of centrality’’ (my term), thinking that
they were critical in the flow of information about anomalies,
but in fact they were often left out.
Eventually I realized that anomalies were often what I learned
to call ‘‘hidden events’’ (Westrum, 1982, 1986). Hidden events were
things that, in the words of Raymond Moody, were ‘‘very widely
experienced but very well hidden’’ (Moody, 1975). Hidden events
might lie beneath the social radar, because of the ‘‘plu-
ralistic ignorance’’ (Katz and Allport, 1931) of those who experi-
enced them. Pluralistic ignorance is the reluctance to report
something no one else seems to be reporting. So of course, if others
are not reporting, then you don’t want to report, either; no one
wants to speak up. Similar forces kept silent the many victims of
sexual abuse by the British celebrity Jimmy Savile, all of whom
thought they would be singled out, when in fact they numbered
in the hundreds (Burns and Cowell, 2013; Burns and Castle,
2012). But these disparate pieces of data were not put together.
In 1978–1979, I had a sabbatical year at the Science Studies Unit
of the University of Edinburgh. During the 8 months or so I spent
there, I was doing two things. The first was the preparation of
my book on the sociology of hidden events. I was trying to put to-
gether everything that was known on how society makes decisions
about anomalous events. I had started with UFOs, but soon
branched off into other areas. For instance, while taking a course
on ‘‘Forensic Medicine for Lawyers,’’ taught by Professor John Ma-
son, an expert on air crashes, I first heard about the ‘‘battered child
syndrome.’’ Aha, I thought, another ‘‘hidden event,’’ like meteorites
and UFOs! The book on hidden events was not fated to be pub-
lished, but did stimulate further research.
While taking the Forensic Medicine course, I began to stop
over at the Medical Library, and shortly encountered the periodical
Aviation Psychology and Environmental Medicine. It was in this
magazine that I found the first articles I had read on the social
psychology of the airliner cockpit. Some were written by NASA
psychologist John Lauber, with whom I was later to correspond.
These articles engaged me, but I was still researching the hidden
events dynamics, and would later sum up my research in a paper
on ‘‘Social intelligence about hidden events: its implications for
scientific research and social policy’’ (1982). But ‘‘hidden events’’
seemed to engage few besides myself, and my research did nothing
to advance my career. Other sociologists of science urged me to
switch to a less controversial topic.
And gradually, that happened. I first began to realize that the
framework I had slowly erected to study hidden events might be
useful for the study of technological accidents. And about this time
I had another visiting professorship at the University of Hawaii.
While shopping in its bookstore I noticed Charles Perrow’s book
Normal Accidents (1984). This broke like a phosphorus shell over
my head—as I imagine it broke over the heads of others. It sug-
gested that accidents were amenable to sociological analysis.
About the same time I also discovered the writings of James Reason
on human error and accidents. This seemed to be a promising area,
and I dove into it. Lauber, Perrow, and Reason all seemed to be onto
something important.
Then in 1986 there was a conference on aviation safety in Bad
Windsheim, Germany. I got invited through a funny turn of events.
I had sent some very half-baked preprints to Perrow, who was in-
vited to this conference. But he couldn’t go, so he suggested that
the sponsors invite me. When I got there, I discovered I was to take
Perrow’s place! I certainly was not up to Perrow at that point, but
at this conference were Jim Reason, and also Irving Janis, both
giants, in my estimation (Wise et al., 1994). In 1986 I was a min-
now. However, I rose to the challenge, and shortly began to be in-
vited to other conferences. (Conferences are like pinball games:
You win one, you get to play another. You win that, you get invited
to yet another, and so on.)
And then I had a real breakthrough. In October 1988 the World
Bank held a large conference on ‘‘Safety Control and
Risk Management.’’ It seemed as if this was intended to be the
be-all and end-all of conferences on human factors in system
safety. Participants had to prepare papers for the conference, and
I was working on mine one evening when I suddenly got a string
of insights that were extremely helpful. First of all I realized that
there was a continuum of safety cultures that fell into three gen-
eral categories: pathological, bureaucratic and generative.
Pathological organizations are characterized by large amounts
of fear and threat. People often hoard information or withhold it
for political reasons, or distort it to make themselves look
better.
Bureaucratic organizations protect departments. Those in the
department want to maintain their ‘‘turf,’’ insist on their own
rules, and generally do things by the book—their book.
Generative organizations focus on the mission. How do we
accomplish our goal? Everything is subordinated to good per-
formance, to doing what we are supposed to do.
This classification (and derivations of it) would become widely
used.
2. Elaboration of the three cultures model
James Reason and Patrick Hudson have suggested expanding
my classification of cultures into a five-part version, essentially
by defining ‘‘reactive’’ as between ‘‘pathological’’ and ‘‘bureau-
cratic,’’ and ‘‘pro-active’’ as between ‘‘bureaucratic’’ and ‘‘genera-
tive’’ (e.g. Hudson, 2007). And some people have felt that this
was a useful improvement. I believe that there might be two valid
reasons for doing this. The first would be theoretical, that there
might be some theoretical advantage for this more articulated
scale. The second would be that there is some empirical stimulus
for such an articulation. However, I have seen neither theoretical
arguments nor empirical evidence for doing so.
When I originally created the concept, I thought of pathological,
bureaucratic, and generative as ideal types. However, over time I
have come to think of the three as forming points on a scale, what
one friend of mine, Russell Briggs, calls the ‘‘Westrum continuum.’’
Why ‘‘bureaucratic’’ should be in the middle might be questioned,
but there are many case studies that show that this is a distinct
state of affairs. However, no such case studies exist, I believe, to
support the proposed two additional types. There certainly must
be intermediate points between, e.g., pathological and generative;
something can be more or less pathological, more or
less generative, and so on. In specific empirical examples, it might be
valuable to name the intermediate points, but you could also argue
that this is a continuous scale, and numbering the points from, say,
one to ten would work just as well.
It might seem small-minded to refuse the Reason/Hudson elab-
oration, rather like Marx saying ‘‘je ne suis pas un Marxiste’’ (‘‘I am
not a Marxist’’), as he did in regard to some of his followers. I believe that the two schol-
ars first mentioned it to me in the 1980s, and I didn’t object. But
time has passed, and the additional arguments and data have not
come forth. Professor Dianne Parker of the University of Manches-
ter has done some work using these five types (e.g. Parker et al.,
2006), and asked a sample of persons in her business seminars to
describe how they would put organizational cultures into the five
categories. In looking over her materials, however, I discovered
that the behaviors her survey participants imagined did not corre-
spond to the behaviors I had found in the case studies I did. In par-
ticular, the ‘‘pathological’’ examples did not correspond to what I
see as being genuinely pathological environments; real pathologi-
cal organizations are much worse. I am not saying that Reason and
Hudson are wrong, I am simply saying that I am unconvinced that
this is a genuine improvement. I believe it is up to the aforemen-
tioned gentlemen to show the value they see in the classification
they have. This is a task yet to be accomplished.
Related to my classification of culture was another idea. These
cultural styles are associated with how the organization is likely
to respond to information that things are not going well. We might
see a set of ways in which an organization might respond to anom-
alous information. There were six ways:
(1) First of all, the organization might ‘‘shoot the messenger.’’
(2) Second, even if the ‘‘messenger’’ was not executed, his or her
information might be isolated.
(3) Third, even if the message got out, it could still be ‘‘put in
context’’ through a ‘‘public relations’’ strategy.
(4) Fourth, maybe more serious action could be forestalled if
one only fixed the immediately presenting event.
(5) Fifth, the organization might react through a ‘‘global fix’’
which would also look for other examples of the same thing.
(6) Finally, the organization might engage in profound inquiry,
to fix not only the presenting event, but also its underlying
causes. The scale of reactions might appear like this:
Suppression -> Encapsulation -> Public Relations -> Local Fix -> Global Fix -> Inquiry
Each of these reactions was associated with the cultural types.
Suppression seemed to be a type marker for pathological environ-
ments. Bureaucratic climates tended to use encapsulation, public
relations and local fixes. Generative environments were more
likely to engage in global fixes and inquiry. These ideas about
reactions occurred to me over a few hours, though the work on
culture and information flow had been longstanding. Yet bringing
it together changed my perspective. Sometimes nothing is as
valuable as a theory uniting disparate things.
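To make the association concrete, a minimal sketch may help. The Python fragment below simply restates the groupings described above; the data structure and function are illustrative additions of my own, not part of the original argument.

```python
# Illustrative sketch only: the groupings restate the text above; the code
# structure itself is hypothetical and not from the original paper.

# Typical responses to anomalous information, by cultural type.
TYPICAL_REACTIONS = {
    "pathological": {"suppression"},
    "bureaucratic": {"encapsulation", "public relations", "local fix"},
    "generative": {"global fix", "inquiry"},
}

def likely_cultures(observed_reaction):
    """Return the cultural types for which a given reaction is a typical marker."""
    return [culture
            for culture, reactions in TYPICAL_REACTIONS.items()
            if observed_reaction in reactions]

if __name__ == "__main__":
    print(likely_cultures("suppression"))       # ['pathological']
    print(likely_cultures("public relations"))  # ['bureaucratic']
    print(likely_cultures("inquiry"))           # ['generative']
```

The point is only that each reaction serves as a rough marker for a region of the pathological-to-generative continuum.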
In great contrast to the tepid reactions to my work on hidden
events, the ‘‘three cultures’’ idea caught on quickly, especially after
Jim Reason cited it in his classic Human Error (1990). (Shortly after, I
met a French colleague, Jean Paries, for the first time, and he asked
me if I was the ‘‘famous’’ Ron Westrum!) And information flow
seemed to bring together not only culture, but also management
style. Later I would suggest that pathological environments arise
when the leader puts stress on his/her own advancement
and perquisites (Westrum, 2004). By contrast, bureaucratic envi-
ronments come when leaders put departmental goals ahead of
organizational ones. And generative environments focus on the
organization’s mission above all else.
But how, in fact, does a theory of ‘‘information flow’’ unite so
many disparate currents? Let us reflect for a few moments and
see how it works. Recall, there are really two critical reasons for
attending to information flow.
(1) First, when information does not flow, it imperils the safe
and proper functioning of the organization.
(2) Second, information flow is a powerful indicator of the orga-
nization’s overall functioning.
3. Information flow as a vital resource
Organizations function on information. Stop the information
and the organization stops, too. Better organizations require better
information. Worse information flow leads to worse functioning.
Better information flow leads to better functioning.
The words ‘‘information flow’’ suggest water moving through a
pipeline. Yet the mere fact of information itself tells us little about
the character of the information. More information is not necessar-
ily better. Good information has these characteristics:
(1) It provides answers to the questions that the receiver needs
answered.
(2) It is timely.
(3) It is presented in such a way that it can be effectively used
by the receiver.
These sound like simple criteria, but they are in fact often very dif-
ficult to meet in practice. Take the first one, for instance, which is
often violated. Getting the right information to the right person is
absolutely critical. But the underlying issue is that the information
should respond to the needs of the receiver, not the sender. Recei-
ver-focused information is a powerful sign that the organization is
engaging in teamwork. Yet, often, information may instead serve a
political purpose, to protect the sender (‘‘covering one’s posterior,’’
in common language), to provide the illusion of cooperation, or in
the worst case, to baffle or deceive the receiver.
Timeliness is another desideratum. The same information, pre-
sented a day later or a second later, may be useless. We want to
be updated in a timely way, otherwise we are ‘‘out of the loop.’’
Information is used to make decisions. If it is late, it may put us
in a bad way versus an adversary, it may blind us to a current dan-
ger, it may lead to a wrong decision. Cf. Col. John Boyd’s famous
idea to stay ‘‘inside the loop’’ of an opponent, so that one’s own
decision cycle is faster (Coram, 2002). But more often, it is simply
the needs of the system that drive timeliness. Whether driving a
truck, building a building, or operating a search-and-rescue sys-
tem, information need is driven by the reality that decisions are
constantly being made, and they need to be informed by the best
information.
And then there is the signal-to-noise ratio. The information we
get consists not only of the facts we need to know, but many we do
not need to know. Has the sender offered us a ‘‘bill of lading’’ tell-
ing us the value of the information we have been sent, i.e., why it
has been sent, the reliability of the source, the relevance to current
problems, etc.? (A ‘‘bill of lading’’ is an old maritime term for the
list of cargo a merchant has consigned to a steamer for shipment;
I find the analogy useful.) If not, we have to do the sifting, trying to figure
out what is important, and what is not. A useful analogy is an oper-
ation carried out by the Emergency Department in a hospital,
where confusion, multi-tasking, and constant pressure exist. We
need to get the information to answer our questions, we need to
get it right now, and we need to avoid the information being fuzzed
or confused by shifting attention and inappropriate prompts. To
have a ‘‘mind like the moon,’’ that shines serenely on everything,
is demanding, but necessary (Westrum, 2009a,b).
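To make the ‘‘bill of lading’’ analogy concrete, here is a minimal sketch of the kind of metadata a sender might attach to a message. The field names and example values are assumptions introduced purely for illustration; the text proposes only the general idea (why the information was sent, how reliable the source is, what it is relevant to, and when it was sent).

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class BillOfLading:
    """Hypothetical metadata a sender attaches so the receiver does not
    have to sift the message for its value (relevance, timeliness, clarity)."""
    subject: str             # what the message concerns
    why_sent: str            # the receiver's question it is meant to answer
    source: str              # where the information originated
    source_reliability: str  # e.g. "first-hand", "second-hand", "rumour"
    relevance: str           # which current problem or decision it bears on
    sent_at: datetime        # supports the timeliness criterion

# Example usage: the labelling is done by the sender, for the receiver's benefit.
note = BillOfLading(
    subject="Intermittent fault in the backup pump",
    why_sent="Answers the control room's question about last night's alarm",
    source="Night-shift maintenance technician",
    source_reliability="first-hand",
    relevance="Decision on whether to keep the unit in service today",
    sent_at=datetime(2014, 1, 15, 7, 45),
)
print(note.why_sent)
```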
A particularly interesting example of this clarity was the com-
munication of the information behind NASA’s lunar orbit rendez-
vous decision. Engineer John Houbolt, then director of Dynamic
Loads at the Langley Research Center, successfully convinced NASA
that the Apollo program needed a ‘‘lunar orbit rendezvous’’ solution.
This would involve a spaceship circling the moon sending down a ‘‘lander’’ to
the moon’s surface. Other methods that were being considered
were a large ‘‘Nova’’ rocket that could land on the moon and then
take off, and an ‘‘earth orbit’’ solution, in which an earth-circling
space station would send a ship off to the moon. The immediate
reception of Houbolt’s idea inside NASA was largely negative. Max-
ime Faget, one of the ‘‘maestros’’ of NASA’s technological culture,
suggested that Houbolt’s numbers would never work out. Von
Braun still liked his ‘‘Nova’’ idea. Houbolt, though, went ‘‘back to
the drawing board,’’ and then came back with somewhat different
numbers. This proposal, too, was rejected. But Houbolt kept com-
ing back with better proposals. Eventually, Wernher von Braun
would say that clearly Houbolt was right. Everyone was ready to
consider all the possibilities, because everyone wanted to fulfill
Kennedy’s promise to land on the moon before the end of the
1960s. It was this vision, to fulfill the Kennedy promise, that finally got
people who were initially hesitant to agree that Houbolt’s idea was
the only one that would work in the time frame (Hansen, 1995).
What is important about the decision is that originally, Houbolt
wasn’t one of the major ‘‘players’’ in the moon shot decision. How-
ever, because of the value of his concept, and his persistence, he
was able to call attention to the work that he was doing, and he
had more than one chance to present it. Sometimes problems need
a champion like Houbolt, who will insist on others paying attention
to his idea.
In a similar (though of course much less important) instance,
Col. Jack Broughton describes becoming a safety champion in re-
gard to the problems of the F-106 ejection seat in 1964. Broughton,
as an Air Force squadron commander, had had many of his pilots
killed when the ejection seat on the F-106 failed. After 14 pilots
had died, Broughton forced the four-star commanding general of
the Air Defense Command, Herbert Thatcher, to ground this defec-
tive fighter, the F-106. Even though the seat had killed over a
dozen pilots, General Thatcher had not wanted the F-106s grounded:
they provided a needed service as bomber interceptors. But by
putting his career on the line, and confronting General Thatcher
in a forceful way, Broughton managed to get him to change his
mind. Thatcher, of course, could have punished Broughton for
insubordination, but he understood that the passion Broughton
exhibited testified to a loyalty to the pilots rather than to simple
rebellion. Hughes then fixed the seat (Broughton, 2008).
Summing all this up, we might look at the organization as a
whole. In regard to the information inside the organization, how
much of it can the organization actually access? Suppose we could
60 R. Westrum / Safety Science 67 (2014) 58–63
give some general parameter that might represent the ability of the
organization to use some piece of information it has, regardless of
who has it, why it has been acquired, and how it is labeled. Can we
use what we have got? Do we know what we have got? And the
answer is frequently ‘‘no, we can’t’’ or, ‘‘no, we don’t.’’ We can’t
use it because A doesn’t consider herself on the same team as B.
We can’t use it because it would reveal something embarrassing
about the maintenance department or admit some organizational
fault bad for morale. We can’t use it because information from that
particular source is low-status, so we don’t listen, etc.
Let’s call this parameter IF. Pathological organizations have low
IF, bureaucratic organizations have middling IF, and generative
organizations have high IF. That means that if you ask a patholog-
ical organization to use its information, it will have big problems
doing that. And it means that often generative organizations will
succeed where pathological organizations fail, because the former
are better at utilizing the information they have. An organization’s
ability to use the information it has is a most important feature.
And it also reflects the general quality of performance in the
organization.
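As a toy illustration only, the parameter might be sketched as follows. The numeric values are invented for the example; the text claims only an ordering (pathological low, bureaucratic middling, generative high), not a quantitative scale.

```python
# Toy model of the "IF" parameter discussed above. The numbers are invented
# for illustration; only the ordering (pathological < bureaucratic < generative)
# comes from the text.

IF_BY_CULTURE = {
    "pathological": 0.2,  # information hoarded, distorted, or suppressed
    "bureaucratic": 0.5,  # information flows, but largely within departmental walls
    "generative": 0.9,    # information reaches whoever needs it for the mission
}

def usable_information(culture, pieces_available):
    """Rough expected number of available pieces of information the
    organization can actually put to use, given its culture."""
    return IF_BY_CULTURE[culture] * pieces_available

if __name__ == "__main__":
    for culture in ("pathological", "bureaucratic", "generative"):
        print(culture, usable_information(culture, 100))
```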
4. Information flow as an indicator
If we can agree that the ability to use its information is important to
an organization, it is also true that this ability says other things
about an organization. This is less obvious, but it is no less true.
Information flow reflects culture. So what people in organiza-
tions do with information reflects a number of other things.
The first thing it reflects is cooperation. Information tends to
flow well when people cooperate, because cooperation and infor-
mation flow both respond to trust. Where there is trust, there is
more cooperation. And trust also means that people share informa-
tion. By contrast, when there is no trust (e.g. in pathological envi-
ronments) information becomes a political commodity. It is used to
help friends, but also to harm enemies.
Trust is established when ‘‘the walk matches the talk,’’ one of
the best indicators of a generative environment. Leaders who are
honest and forthright create an environment in which people will
talk about their concerns. This has further impacts in getting peo-
ple to report the ‘‘latent pathogens’’ (Reason, 1990; Westrum,
2009a,b) and increases the willingness to consult when puzzling
events happen. When people consult more, their grasp of the tech-
nical issues increases, and they are less likely to do things they do
not know are dangerous. A lack of such consulting was a major fac-
tor in the Tokai-Mura nuclear accident in Japan. In that event,
workers engaged in work-arounds that were technically unwise,
leading to a radiation accident (Westrum, 2000).
The second thing information flow reflects is the quality of deci-
sion-making. Good decisions are made on more complete informa-
tion, but also, once the decision is made, the decision is more open
to examination, because openness and transparency are both part
of information flow. If people make bad decisions in an open orga-
nization, they become more obvious—and in some cases can be re-
versed. By contrast, when they take place in secret, not only are the
decisions often worse, but they are harder to change. A bad deci-
sion in a closed environment is not only a mistake, but it is a fail-
ing. The decision-maker can be attacked for making it. So it is often
covered up or denied.
A perfect example of the latter is seen in New York Mayor
Giuliani’s decision to fire Richard Murphy, an outstanding youth
administrator, who had created the ‘‘beacon schools’’ project.
Giuliani had accused Murphy of favoritism and corruption. After
those charges were proven false, Giuliani’s response was not to
acknowledge that he had made a mistake, but to suggest that the
mistake was a minor matter.
Mr. Giuliani shrugged. ‘‘This happens all the time,’’ he said.
‘‘And you write about those things all the time. Sometimes they
turn out to be true. And sometimes they turn out to be wrong.’’
(Dwyer, 2013).
But the accusation wasn’t a minor matter. Giuliani had said ‘‘My
immediate goal is to get rid of the stealing, to get rid of the corrup-
tion.’’ He ruined Murphy’s reputation, which never quite recovered,
though Murphy went on to do many other good things (Martin,
2013; Dwyer, 2013). This kind of character assassination is,
unhappily, typical of pathological environments.
Third, information flow reflects the quality of life in that organi-
zation. An organization where information flows well is also one
that, I believe, does a better job with its people. Certainly, a prime
example of a generative organization would be Southwest Airlines,
one of the most popular employers in the USA (Freiberg and Frei-
berg, 1998). It is also one of the best airlines in terms of its promo-
tion of people. And it promotes everybody, not just the elite few.
This is in striking contrast to, say, Microsoft or General Motors,
where policies promote only those on the star track (GM) or the
top, say, 10% (Microsoft) (see Eichenwald, 2012). It is also in strik-
ing contrast to American Airlines, which in many respects seems to
exhibit a pathological culture (Gittell, 2002). A culture similar to
Southwest’s was seen on the US destroyer Benfold during the ten-
ure of Captain Michael Abrashoff, who took what was appar-
ently a pathological culture, and turned it into a generative one
(Abrashoff, 2002). Information flow was one of the benefits of this
generative environment. Not only were the ship’s ‘‘latent patho-
gens’’ quickly identified and fixed, but the openness led to solving
a technological communication problem that had long hobbled the
Pacific Fleet.
What is it about generative culture that makes information
flow? Actually, generative culture supports IF for three reasons.
First of all, there is the trust issue, mentioned earlier. Second, gen-
erative culture emphasizes the mission, an emphasis that allows
people involved to put aside their personal issues and also the
departmental issues that are so evident in bureaucratic organiza-
tions. The mission is primary. And third, generativity encourages
a ‘‘level playing field,’’ in which hierarchy plays less of a role (e.g.
Edmondson, 1996). Thus one can have a ‘‘boundaryless organiza-
tion,’’ to use a phrase popular at General Electric (Ashkenas et al.,
1995). Because hierarchy is de-emphasized, most of the problems
that attend getting information to flow up hierarchies disappear.
Information flow, then, is testimony to quality of worklife, because
a bad quality of worklife will interfere with flow.
Another feature of generative environments is getting informa-
tion to flow across the internal boundaries of the organization.
When every department competes with other departments, e.g. in
a bureaucratic environment, sharing information with other
departments is a problem. Yet often the ‘‘faint signals’’ that some-
thing is amiss are perceived by the ‘‘wrong person’’ organization-
ally. There is a very interesting anecdote about Charles Franklin
(‘‘Boss’’) Kettering in Douglas McGregor’s book (1966, 118–119).
He mentions that Kettering one day had gone down to the shipyard
to see about the installation of one of his diesel engines. While
there, he noticed a painter looking at the ship’s propeller. He asked
the painter why he was staring at the propeller, and the painter of-
fered that he thought the propeller was too large by five inches.
Concerned that something might be wrong, Kettering called the de-
sign office, and asked them to come down and measure the propel-
ler, which they claimed they had already done. When the propeller
was re-measured, however, the painter turned out to be correct.
And then we have the famous ‘‘bottle of champagne’’ story
about Wernher von Braun. Since this special issue concerns
the ‘‘Foundations of Safety Science,’’ I am going to quote the ori-
ginal source of this story, von Braun himself.
‘‘One of our early Redstone missiles developed trouble in mid-
flight. The telemeter records indicated that the flight had been
flawless up to that instant, and permitted us to localize the
probable source of trouble. However, the suspected area had
been very carefully checked in numerous laboratory tests so
that all explanations sounded highly artificial.
Several theories were advanced. Finally one theory was
accepted as most likely and remedial action on it was initiated.
At this point an engineer who was a member of the firing group
called and said he wanted to see me. He came up to my office
and told me that during firing preparation he had tightened a
certain connection just to make sure that there would be good
contact.
While doing so, he had touched a contact with a screwdriver
and drawn a spark. Since the system checked out well after this
incident, he hadn’t paid any attention to the matter. But now
everybody was talking about a possible failure in that particular
apparatus, he just wanted to tell me the story, for whatever it
was worth. A quick study indicated that this was the answer.
Needless to say, the ‘‘remedial action’’ was called off and no
changes were made.
I sent the engineer a bottle of champagne because I wanted
everybody to know that honesty pays off, even if someone
may run the risk of incriminating himself. Absolute honesty is
something you simply cannot dispense with in a team effort
as difficult as that of missile development.’’ (von Braun, 1956,
p. 41)
And in fact von Braun was an exemplary leader when it came to
seeing that information flowed up, down, and across at Marshall
Space Flight Center (Tompkins, 1993). By contrast, in pathological
organizations, information flow is problematic. First of all, an envi-
ronment of fear and dread makes people cautious about what they
say. People try to minimize risks. Second, the emphasis is on pro-
viding additional power and glory to the chiefs. Whatever does
not serve this end is less important. And third, hierarchy provides
strong incentives not to ‘‘speak up,’’ to avoid provoking retaliation.
For instance, the original reason for creating ‘‘cockpit resource
management’’ seemingly had to do with encouraging co-pilots to
speak up to their pilots. But ‘‘speaking up’’ is discouraged when
blame and punishment are a major element in the environment,
or when differences in power and position are great (Stein, 1967).
Problems with hierarchy can become much worse when the
manager is a bully, and is willing to seek severe sanctions against
those unwilling to ‘‘get in line.’’ The candidacy of John R. Bolton
for American ambassador to the United Nations allowed a rare
glimpse of a public figure who seems routinely to have used bully-
ing to pressure unwilling subordinates and associates. Bolton
sought to punish, by threat and removal from office, those under
his management control (and even those not) who defied his
wishes. Several officials testified to Bolton’s threats and attempts
to have them dismissed when they had a different opinion than
his own and to his willingness to override their ideas in favor of
his own (Jehl, 2005).
Another special problem, when powerful people are involved, is
that certain subjects become ‘‘off-limits.’’ This is very common, for
instance, in sexual abuse cases. We have already mentioned the
case of Jimmy Savile at the BBC (see also Dintner, 2003). ‘‘Off-lim-
its’’ was also an operative factor in the disturbing Penn State child
abuse case, in which protecting an athletic team and its coach was
more important than the safety of young people (Belson, 2012).
And of course it was also involved in virtually all the child abuse
cases within the Catholic Church (Berry, 1992; Dintner, 2003),
and, furthermore, in many other churches and religious organizations
as well.
But power also operates to bottle up information in frankly
political situations, for instance in authoritarian or totalitarian re-
gimes, when certain facts are ‘‘inconvenient’’ (French, 2000). Dur-
ing the ‘‘Great Leap Forward’’ in China, a famine was raging, but
information about it was encapsulated because it was politically
incorrect. If local leaders came forward, they would be accused of
being ‘‘right deviationists’’ or food hoarders, and would suffer bru-
tal punishments or death. Lower level leaders, eager to appear
proper communists, lied about the harvest results, with the out-
come that little about the reality appeared on the surface. In addi-
tion, the party cadres often had food when the peasants did not,
decreasing their motivation to raise the issue. In the end, 38 mil-
lion people would die during the famine. It would be years before
the truth was generally known (Jisheng, 2012).
5. Conclusion
What I have tried to establish here is that the safety scientist
must necessarily be concerned with information flow. Not only is
information flow the life blood of the organization’s nervous sys-
tem, but it is also a powerful indicator of other processes within
the organization. I must admit that, early on, I did not appreciate
how valuable IF analysis was. But its value became clear as more
and more fellow scholars, and their industries as well, adopted
and used these concepts in evaluating organizations and opera-
tions, from the emergency department to the flight deck, from
the nuclear power console to the oil platform. I remember Jim Rea-
son asking me, in the late 1980s, ‘‘Why do you think information is so
important?’’ I don’t think anyone would ask that question now.
Information flow is deeply linked to the safety culture of the
organization (Cf. Curry et al., 2011). Indeed, many others have pro-
posed and used other measures of safety culture, notably Prono-
vost (2010) in relation to infection control. Furthermore, others
have modified—i.e. elaborated—my own three categories into
more. As I have indicated, I think the value of this elaboration is
shaky. No doubt there is a ‘‘Westrum continuum.’’ In fact one
might say, as one goes from pathological to generative, that the
organization becomes more mission-oriented, and less ‘‘personal.’’
Identifying sub-categories might be helpful in particular contexts. I
am not convinced that there is a theoretical reason to do so.
In summing up, culture is no longer neglected. Information flow
is of course only one issue among many in safety culture, but I feel
it is a royal road to understanding much else.
References
Abrashoff, M., 2002. It’s Your Ship: Management Lessons from the ‘‘Best Damn Ship
in the Navy’’. Warner Books, New York.
Ashkenas, R., Ulrich, D., Jick, T., Kerr, S., 1995. The Boundaryless Organization:
Breaking the Chains of Organizational Structure. Jossey-Bass, San Francisco.
Belson, K., 2012. Abuse Scandal Damns Paterno and Penn State. New York Times, 13
July, pp. A1, B13.
Berry, J., 1992. Lead Us Not Into Temptation: Catholic Priests and Sexual Abuse of
Children. Doubleday, New York.
Broughton, J., 2008. Rupert Red Two: A Fighter Pilot’s Life from Thunderbolts to
Thunderchiefs. Zenith Press, St. Paul.
Burns, J.F., Castle, S., 2012. BBC’s Leaders Faulted as Lax in Handling Sex Abuse
Crisis. New York Times, 20 December.
Burns, J.F., Cowell, A., 2013. Report Depicts Horrific Pattern of Child Sexual Abuse by
BBC Celebrity. New York Times, 12 January, p. A8.
Coram, R., 2002. Boyd: The Fighter Pilot Who Changed the Art of War. Little Brown,
New York.
Curry, L. et al., 2011. What distinguishes top-performing hospitals in acute
myocardial infarction mortality rates? Ann. Intern. Med. 156 (6), 384–390.
Dintner, P.E., 2003. A Catholic Crisis, Bestowed from Above. New York Times, 1
January, p. A19.
Dwyer, J., 2013. In the End, He Stole Nothing and Gave Plenty. New York Times, 19
February.
Edmondson, A.C., 1996. Learning from Mistakes Is Easier Said than Done. J. Appl.
Behav. Sci 32 (1), 5–28.
Eichenwald, K., 2012. Microsoft’s Lost Decade. Vanity Fair, August.
Freiberg, K., Freiberg, J., 1998. Nuts! Southwest Airlines’ Crazy Recipe for Business
and Personal Success. Broadway Books, New York.
French, H.W., 2000. Japan Debates Culture of Covering Up. New York Times, 2
May, p. A12.
Gittell, J., 2002. The Southwest Airlines Way: Using the Power of Relationships to
Achieve High Performance. McGraw-Hill, New York.
Hansen, J.R., 1995. Enchanted Rendezvous: John C. Houbolt and the Genesis of the
Lunar Orbit Rendezvous Concept. Monographs in Aerospace History #4.
NASA History Office, Washington, DC.
Hudson, P., 2007. Implementing a safety culture in a major multinational. Saf. Sci.
45, 697–722.
Jehl, D., 2005. 3 Ex-Officials Describe Bullying by Bolton. New York Times, 3 May,
p. A10.
Jisheng, Y., 2012. Tombstone: The Great Chinese Famine, 1958–1962. Farrar, Straus
and Giroux, New York.
Katz, D., Allport, F., 1931. Student Attitudes. Craftsman Press, Syracuse, NY.
Martin, D., 2013. Richard Murphy, 68; Aided Disadvantaged Youths. New York
Times, 16 February.
McGregor, D., 1966. Leadership and Motivation. MIT Press, Cambridge, MA.
Moody, R., 1975. Life After Life. Bantam, New York.
Parker, D. et al., 2006. Framework for understanding the development of
organizational safety culture. Saf. Sci. 44, 551–562.
Perrow, C., 1984. Normal Accidents: Living with High Risk Technologies. Basic
Books, New York.
Pronovost, P., 2010. Safe Patients, Smart Hospitals. Penguin, New York.
Reason, J., 1990. Human Error. Cambridge University Press, New York.
Stein, L.I., 1967. The doctor-nurse game. Arch. Gen. Psychiatry 16 (6), 699–703.
Tompkins, P.K., 1993. Organizational Communication Imperatives: The Lessons of
the Space Program. Roxbury Publishing, Syracuse, NY.
von Braun, W., 1956. Teamwork: Key to Success in Guided Missiles. Missiles and
Rockets, October, pp. 38–42.
Westrum, R., 1978. Science and social intelligence about anomalies: the case of
meteorites. Soc. Stud. Sci. 8, 461–493.
Westrum, R., 1982. Social intelligence about hidden events: its implications for
scientific research and social policy. Knowl.: Creat., Diffus., Util. 3 (3), 381–400.
Westrum, R., 1986. The Blind Eye of Science. Whole Earth Review, Fall, pp. 36–41.
Westrum, R., 2000. Safety planning and safety culture in the JCO criticality accident.
Cogn. Technol. Work 2, 240–241.
Westrum, R., 2004. A typology of organizational cultures. Qual. Safety Health Care
13 (Suppl. II), ii22–ii27.
Westrum, R., 2009a. Information flow and problem-solving. In: Cosby, K.,
Croskerry, P., Wears, R. (Eds.), Handbook of Patient Safety in Emergency
Medicine. Lippincott Williams, Philadelphia.
Westrum, R., 2009b. Hearing Faint Signals. Presentation at the Petroleum Safety
Authority, Stavanger, Norway, 15 May 2009.
Wise, J. et al. (Eds.), 1994. Verification and Validation of Complex Systems: Human
Factors. Springer, New York.