The study of information flow: A personal journey
Ron Westrum
Department of Sociology, Anthropology and Criminology, Eastern Michigan University, Ypsilanti, MI 48197, USA
Society and Risk, University of Stavanger, Norway
Article info
Keywords:
Information flow
Culture
Organizational quality of life
Employee empowerment
Hidden events
Abstract
Information flow has been shown to be a key variable in system safety. Not only is information flow vital to the organization's "nervous system," but it is also a key indicator of the quality of the organization's functioning. The author describes how his personal trajectory took him from the study of social information about anomalous events to the role of information in causing or preventing technological accidents. The important features of good information flow are relevance, timeliness, and clarity. Generative environments are more likely to provide information with these characteristics, since they encourage a "level playing field" and respect for the needs of the information recipient. By contrast, pathological environments, caused by a leader's desire to see him/herself succeed, often create a "political" environment for information that interferes with good flow.

© 2014 Elsevier Ltd. All rights reserved.
1. Introduction
The role of information in making systems safe is profound. Not only is information flow a prime variable in creating safety, but it is also an indicator of organizational functioning. By examining the culture of information flow, we can get an idea of how well people in the organization are cooperating, and also how effective their work is likely to be in providing a safe operation.
My interest in information flow is the result of a lifetime of professional work and theorizing. My studies on information flow go back to my undergraduate years, and weave through my graduate work at Chicago and later the RAND Corporation. But my research wasn't initially focused on safety. In the early 1970s I was studying the role that information played in scientists' decisions about anomalous events. This was the result of a fascination with such unusual events as UFOs and meteorites (Westrum, 1978). Why didn't information about anomalies flow to the people who needed it? In my studies I discovered, early on, that scientists often were unaware of their own biases in making decisions about unusual events. For instance, they often assumed that they would be "the first to know" about anomalous events, but in fact they often remained ignorant. They were unaware that their own biases interfered with search and with information flow. Thus, they often suffered from the "fallacy of centrality" (my term): thinking that they were critical in the flow of information about anomalies, when in fact they were often left out.
Eventually I realized that anomalies were often what I learned to call "hidden events" (Westrum, 1982, 1986). Hidden events were things that, in the words of Raymond Moody, were "very widely experienced but very well hidden" (Moody, 1975). Hidden events might lie beneath the surface of social radar because of the "pluralistic ignorance" (Katz and Allport, 1931) of those who experienced them. Pluralistic ignorance is the reluctance to report something no one else seems to be reporting. So of course, if others are not reporting, then you don't want to report, either; no one wants to speak up. Similar forces kept silent the many victims of sexual abuse by the British celebrity Jimmy Savile, all of whom thought they would be singled out, when in fact they numbered in the hundreds (Burns and Cowell, 2013; Burns and Castle, 2012). But these disparate pieces of data were not put together.
In 1978–1979, I had a sabbatical year at the Science Studies Unit of the University of Edinburgh. During the 8 months or so I spent there, I was doing two things. The first was the preparation of my book on the sociology of hidden events. I was trying to put together everything that was known on how society makes decisions about anomalous events. I had started with UFOs, but soon branched off into other areas. For instance, while taking a course on "Forensic Medicine for Lawyers," taught by Professor John Mason, an expert on air crashes, I first heard about the "battered child syndrome." Aha, I thought, another "hidden event," like meteorites and UFOs! The book on hidden events was not fated to be published, but it did stimulate further research.
While taking the Forensic Medicine course, I began to stop over at the Medical Library, and shortly encountered the periodical Aviation Psychology and Environmental Medicine. It was in this magazine that I found the first articles I had read on the social
psychology of the airliner cockpit. Some were written by NASA psychologist John Lauber, with whom I was later to correspond. These articles engaged me, but I was still researching the dynamics of hidden events, and would later sum up my research in a paper on "Social intelligence about hidden events: Its implications for scientific research and social policy" (1982). But "hidden events" seemed to engage few besides myself, and my research did nothing to advance my career. Other sociologists of science urged me to switch to a less controversial topic.
And gradually, that happened. I first began to realize that the framework I had slowly erected to study hidden events might be useful for the study of technological accidents. About this time I had another visiting professorship at the University of Hawaii. While shopping in its bookstore I noticed Charles Perrow's book Normal Accidents (1984). This broke like a phosphorus shell over my head, as I imagine it broke over the heads of others. It suggested that accidents were amenable to sociological analysis. About the same time I also discovered the writings of James Reason on human error and accidents. This seemed to be a promising area, and I dove into it. Lauber, Perrow, and Reason all seemed to be onto something important.
Then in 1986 there was a conference on aviation safety in Bad Windsheim, Germany. I got invited through a funny turn of events. I had sent some very half-baked preprints to Perrow, who was invited to this conference. But he couldn't go, so he suggested that the sponsors invite me. When I got there, I discovered I was to take Perrow's place! I certainly was not up to Perrow at that point, but at this conference were Jim Reason and also Irving Janis, both giants in my estimation (Wise et al., 1994). In 1986 I was a minnow. However, I rose to the challenge, and shortly began to be invited to other conferences. (Conferences are like pinball games: you win one, you get to play another. You win that, you get invited to yet another, and so on.)
And then I had a real breakthrough. In October 1988 the World Bank held a large conference on "Safety Control and Risk Management." It seemed as if this was intended to be the be-all and end-all of conferences on human factors in system safety. Participants had to prepare papers for the conference, and I was working on mine one evening when I suddenly got a string of insights that were extremely helpful. First of all I realized that there was a continuum of safety cultures that fell into three general categories: pathological, bureaucratic, and generative.

Pathological organizations are characterized by large amounts of fear and threat. People often hoard information or withhold it for political reasons, or distort it to make themselves look better.

Bureaucratic organizations protect departments. Those in the department want to maintain their "turf," insist on their own rules, and generally do things by the book—their book.

Generative organizations focus on the mission. How do we accomplish our goal? Everything is subordinated to good performance, to doing what we are supposed to do.

This classification (and derivations of it) would become widely used.
2. Elaboration of the three cultures model
James Reason and Patrick Hudson have suggested expanding my classification of cultures into a five-part version, essentially by defining "reactive" as between "pathological" and "bureaucratic," and "proactive" as between "bureaucratic" and "generative" (e.g. Hudson, 2007). And some persons have felt that this was a useful improvement. I believe that there might be two valid reasons for doing this. The first would be theoretical: that there might be some theoretical advantage to this more articulated scale. The second would be that there is some empirical stimulus for such an articulation. However, I have seen neither theoretical arguments nor empirical evidence for doing so.
When I originally created the concept, I thought of pathological, bureaucratic, and generative as ideal types. However, over time I have come to think of the three as forming points on a scale, what one friend of mine, Russell Briggs, calls the "Westrum continuum." Why "bureaucratic" should be in the middle might be questioned, but there are many case studies showing that this is a distinct state of affairs. However, no such case studies exist, I believe, to support the proposed two additional types. There certainly must be intermediate points between, e.g., pathological and generative: something can be more or less pathological, more or less generative, etc. In specific empirical examples, it might be valuable to name the intermediate points, but you could also argue that this is a continuous scale, and numbering the points from, say, one to ten would work just as well.
It might seem small-minded to refuse the Reason/Hudson elaboration, rather like Marx saying "I am not a Marxist," as he did in regard to some of his followers. I believe the two scholars first mentioned it to me in the 1980s, and I didn't object. But time has passed, and the additional arguments and data have not come forth. Professor Dianne Parker of the University of Manchester has done some work using these five types (e.g. Parker et al., 2006), and asked a sample of persons in her business seminars to describe how they would put organizational cultures into the five categories. In looking over her materials, however, I discovered that the behaviors her survey participants imagined did not correspond to the behaviors I had found in my case studies. In particular, the "pathological" examples did not correspond to what I see as genuinely pathological environments; real pathological organizations are much worse. I am not saying that Reason and Hudson are wrong; I am simply saying that I am unconvinced that this is a genuine improvement. I believe it is up to the aforementioned gentlemen to show the value they see in the classification they have. This is a task yet to be accomplished.
Related to my classification of culture was another idea: these cultural styles are associated with how the organization is likely to respond to information that things are not going well. We might see a set of ways in which an organization might respond to anomalous information. There were six ways:
(1) First of all, the organization might "shoot the messenger."
(2) Second, even if the "messenger" was not executed, his or her information might be isolated.
(3) Third, even if the message got out, it could still be "put in context" through a "public relations" strategy.
(4) Fourth, maybe more serious action could be forestalled if one only fixed the immediately presenting event.
(5) Fifth, the organization might react through a "global fix," which would also look for other examples of the same thing.
(6) Finally, the organization might engage in profound inquiry, to fix not only the presenting event but also its underlying causes.

The scale of reactions might appear like this:

Suppression - Encapsulation - Public Relations - Local Fix - Global Fix - Inquiry
Each of these reactions was associated with the cultural types. Suppression seemed to be a type marker for pathological environments. Bureaucratic climates tended to use encapsulation, public relations, and local fixes. Generative environments were more likely to engage in global fixes and inquiry. These ideas about
reactions occurred to me over a few hours, though the work on
culture and information flow had been longstanding. Yet bringing
it together changed my perspective. Sometimes nothing is as
valuable as a theory uniting disparate things.
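As an aside for the technically minded, the pairing of cultures and reactions can be written down as a simple lookup. The sketch below is purely illustrative (a Python rendering of the mapping just described; the names are invented for this example, not part of the original conference paper):

    from enum import Enum

    class Culture(Enum):
        PATHOLOGICAL = "pathological"
        BUREAUCRATIC = "bureaucratic"
        GENERATIVE = "generative"

    # The six reactions, ordered from most suppressive to most inquiring,
    # matching the scale shown above.
    REACTIONS = ["suppression", "encapsulation", "public relations",
                 "local fix", "global fix", "inquiry"]

    # Type markers, as described in the text above.
    TYPICAL_REACTIONS = {
        Culture.PATHOLOGICAL: ["suppression"],
        Culture.BUREAUCRATIC: ["encapsulation", "public relations", "local fix"],
        Culture.GENERATIVE: ["global fix", "inquiry"],
    }

    for culture in Culture:
        print(culture.value, "->", ", ".join(TYPICAL_REACTIONS[culture]))

The point of the mapping is only that suppression sits at one end of the scale and inquiry at the other, with bureaucratic responses clustered in the middle.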
In great contrast to the tepid reactions to my work on hidden events, the "three cultures" idea caught on quickly, especially after Jim Reason cited it in his classic Human Error (1990). (Shortly after, I met a French colleague, Jean Paries, for the first time, and he asked me if I was the "famous" Ron Westrum!) And information flow seemed to bring together not only culture, but also management style. Later I would suggest that pathological environments are created when the leader puts stress on his/her own advancement and perquisites (Westrum, 2004). By contrast, bureaucratic environments come when leaders put departmental goals ahead of organizational ones. And generative environments focus on the organization's mission above all else.
But how, in fact, does a theory of "information flow" unite so many disparate currents? Let us reflect for a few moments and see how it works. Recall that there are really two critical reasons for attending to information flow:
(1) First, when information does not flow, it imperils the safe
and proper functioning of the organization.
(2) Second, information flow is a powerful indicator of the organization's overall functioning.
3. Information flow as a vital resource
Organizations function on information. Stop the information and the organization stops, too. Better organizations require better information. Worse information flow leads to worse functioning. Better information flow leads to better functioning.
The words "information flow" suggest water moving through a pipeline. Yet the mere fact of information itself tells us little about the character of the information. More information is not necessarily better. Good information has these characteristics:
(1) It provides answers to the questions that the receiver needs
answered.
(2) It is timely.
(3) It is presented in such a way that it can be effectively used
by the receiver.
These sound like simple criteria, but they are in fact often very difficult to meet in practice. Take the first one, for instance, which is often violated. Getting the right information to the right person is absolutely critical. But the underlying issue is that the information should respond to the needs of the receiver, not the sender. Receiver-focused information is a powerful sign that the organization is engaging in teamwork. Yet, often, information may instead serve a political purpose: to protect the sender ("covering one's posterior," in common language), to provide the illusion of cooperation, or in the worst case, to baffle or deceive the receiver.
Timeliness is another desideratum. The same information, presented a day later or a second later, may be useless. We want to be updated in a timely way; otherwise we are "out of the loop." Information is used to make decisions. If it is late, it may put us in a bad way versus an adversary, it may blind us to a current danger, it may lead to a wrong decision. Cf. Col. John Boyd's famous idea of staying "inside the loop" of an opponent, so that one's own decision cycle is faster (Coram, 2002). But more often, it is simply the needs of the system that drive timeliness. Whether driving a truck, building a building, or operating a search-and-rescue system, information need is driven by the reality that decisions are constantly being made, and they need to be informed by the best information.
And then there is the signal-to-noise ratio. The information we get consists not only of the facts we need to know, but many we do not need to know. Has the sender offered us a "bill of lading" telling us the value of the information we have been sent, i.e., why it has been sent, the reliability of the source, the relevance to current problems, etc.? (A "bill of lading" is an old maritime term for the list of cargo a merchant has consigned to a steamer for shipment; I find the term useful.) If not, we have to do the sifting, trying to figure out what is important and what is not. A useful analogy is an operation carried out by the Emergency Department in a hospital, where confusion, multi-tasking, and constant pressure exist. We need to get the information to answer our questions, we need to get it right now, and we need to avoid having the information fuzzed or confused by shifting attention and inappropriate prompts. To have a "mind like the moon," one that shines serenely on everything, is demanding, but necessary (Westrum, 2009a,b).
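To make the three criteria concrete, here is a purely illustrative sketch (the field names are invented for this example, not an established format) of a message that carries its own "bill of lading":

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Message:
        content: str
        answers_receiver_question: bool   # relevance: does it answer what the receiver needs?
        age_seconds: float                # timeliness: how old is the information?
        bill_of_lading: Optional[str]     # clarity: why sent, source reliability, relevance

    def is_good_information(msg: Message, deadline_seconds: float) -> bool:
        # Crude check against the three criteria; the threshold is arbitrary.
        return (msg.answers_receiver_question
                and msg.age_seconds <= deadline_seconds
                and msg.bill_of_lading is not None)

    msg = Message("Valve 7 is leaking", True, 30.0,
                  "from line walkdown; reliable source; relevant to current job")
    print(is_good_information(msg, deadline_seconds=60.0))  # True

Without a bill of lading, the receiver must do the sifting; with one, the signal-to-noise problem is pushed back onto the sender.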
A particularly interesting example of this clarity was the communication of the information regarding NASA's lunar orbit rendezvous decision. Engineer John Houbolt successfully convinced NASA that the Apollo program needed a lunar orbit solution. Houbolt, then director of Dynamic Loads at NASA's Langley Research Center, was sure that the only right way was to use "lunar orbit rendezvous." This would involve a spaceship circling the moon and sending down a "lander" to the moon's surface. Other methods being considered were a large "Nova" rocket that could land on the moon and then take off, and an "earth orbit" solution, in which an earth-circling space station would send a ship off to the moon. The immediate reception of Houbolt's idea inside NASA was largely negative. Maxime Faget, one of the "maestros" of NASA's technological culture, suggested that Houbolt's numbers would never work out. Von Braun still liked his "Nova" idea. Houbolt, though, went "back to the drawing board," and then came back with somewhat different numbers. This proposal, too, was rejected. But Houbolt kept coming back with better proposals. Eventually, Wernher von Braun would say that clearly Houbolt was right. Everyone was ready to consider all the possibilities, because everyone wanted to fulfill Kennedy's promise to land on the moon before the end of the 1960s. It was this vision, fulfilling the Kennedy promise, that finally got people who were initially hesitant to agree that Houbolt's idea was the only one that would work in the time frame (Hansen, 1995).
What is important about the decision is that originally, Houbolt wasn't one of the major "players" in the moon shot decision. However, because of the value of his concept, and his persistence, he was able to call attention to the work that he was doing, and he had more than one chance to present it. Sometimes problems need a champion like Houbolt, who will insist that others pay attention to his idea.
In a similar (though of course much less important) instance, Col. Jack Broughton describes becoming a safety champion in regard to the problems of the F-106 ejection seat in 1964. Broughton, as an Air Force squadron commander, had had many of his pilots killed when the ejection seat on the F-106 failed. After 14 pilots had died, Broughton forced the commanding general (4 stars) of the Air Defense Command, Herbert Thatcher, to ground this defective fighter. Even though the seat had killed over a dozen pilots, General Thatcher had not wanted the F-106s grounded; they provided a needed service as bomber interceptors. But by putting his career on the line, and confronting General Thatcher in a forceful way, Broughton managed to get him to change his mind. Thatcher, of course, could have punished Broughton for insubordination, but he understood that the passion Broughton exhibited testified to a loyalty to the pilots rather than to simple rebellion. Hughes then fixed the seat (Broughton, 2008).
Summing all this up, we might look at the organization as a
whole. In regard to the information inside the organization, how
much of it can the organization actually access? Suppose we could
give some general parameter that might represent the ability of the organization to use some piece of information it has, regardless of who has it, why it has been acquired, and how it is labeled. Can we use what we have got? Do we know what we have got? And the answer is frequently "no, we can't" or "no, we don't." We can't use it because A doesn't consider herself on the same team as B. We can't use it because it would reveal something embarrassing about the maintenance department, or admit some organizational fault bad for morale. We can't use it because information from that particular source is low-status, so we don't listen, etc.
Let's call this parameter IF. Pathological organizations have low IF, bureaucratic organizations have middling IF, and generative organizations have high IF. That means that if you ask a pathological organization to use its information, it will have big problems doing that. And it means that often generative organizations will succeed where pathological organizations fail, because the former are better at utilizing the information they have. An organization's ability to use the information it has is a most important feature. And it also reflects the general quality of performance in the organization.
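For illustration only (the text proposes no formula for IF), the parameter could be operationalized as a crude audit score: the fraction of "yes" answers to questions like the ones just raised. The questions below are hypothetical paraphrases, invented for this sketch:

    def if_score(answers):
        # Fraction of 'yes' answers; high suggests generative, low pathological.
        return sum(answers) / len(answers)

    answers = [
        False,  # Do A and B consider themselves on the same team?
        False,  # Can embarrassing facts about maintenance be used openly?
        True,   # Is information from low-status sources listened to?
        True,   # Do we know what information we already hold?
    ]
    print(if_score(answers))  # 0.5: middling IF, roughly bureaucratic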
4. Information flow as an indicator
If we can agree that the ability to use its information is important to an organization, it is also true that this ability says other things about an organization. This is less obvious, but it is no less true.
Information flow reflects culture. So what people in organizations do with information reflects a number of other things.

The first thing it reflects is cooperation. Information tends to flow well when people cooperate, because cooperation and information flow both respond to trust. Where there is trust, there is more cooperation. And trust also means that people share information. By contrast, when there is no trust (e.g. in pathological environments), information becomes a political commodity. It is used to help friends, but also to harm enemies.
Trust is established when "the walk matches the talk," one of the best indicators of a generative environment. Leaders who are honest and forthright create an environment in which people will talk about their concerns. This has further impacts in getting people to report the "latent pathogens" (Reason, 1990; Westrum, 2009a,b) and increases the willingness to consult when puzzling events happen. When people consult more, their grasp of the technical issues increases, and they are less likely to do things they do not know are dangerous. A lack of such consulting was a major factor in the Tokai-Mura nuclear accident in Japan. In that event, workers engaged in work-arounds that were technically unwise, leading to a radiation accident (Westrum, 2000).
The second thing information flow reflects is the quality of decision-making. Good decisions are made on more complete information; but also, once a decision is made, it is more open to examination, because openness and transparency are both part of information flow. If people make bad decisions in an open organization, those decisions become more obvious—and in some cases can be reversed. By contrast, when they take place in secret, not only are the decisions often worse, but they are harder to change. A bad decision in a closed environment is not only a mistake, but a failing. The decision-maker can be attacked for making it. So it is often covered up or denied.
A perfect example of the latter is seen in New York Mayor Giuliani's decision to fire Richard Murphy, an outstanding youth administrator who had created the "beacon schools" project. Giuliani had accused Murphy of favoritism and corruption. After those charges were proven false, Giuliani's response was not to acknowledge that he had made a mistake, but to suggest that the mistake was a minor matter:

Mr. Giuliani shrugged. "This happens all the time," he said. "And you write about those things all the time. Sometimes they turn out to be true. And sometimes they turn out to be wrong." (Dwyer, 2013)
But the accusation wasn't a minor matter. Giuliani had said "My immediate goal is to get rid of the stealing, to get rid of the corruption." He ruined Murphy's reputation, which never quite recovered, though Murphy went on to do many other good things (Martin, 2013; Dwyer, 2013). This kind of political assassination is, unhappily, typical of pathological environments.
Third, information flow reflects the quality of life in that organization. An organization where information flows well is also one that, I believe, does a better job with its people. Certainly, a prime example of a generative organization would be Southwest Airlines, one of the most popular employers in the USA (Freiberg and Freiberg, 1998). It is also one of the best airlines in terms of its promotion of people. And it promotes everybody, not just the elite few. This is in striking contrast to, say, Microsoft or General Motors, where policies promote only those on the star track (GM) or the top, say, 10% (Microsoft) (see Eichenwald, 2012). It is also in striking contrast to American Airlines, which in many respects seems to exhibit a pathological culture (Gittell, 2002). A culture similar to Southwest's was seen on the US destroyer Benfold during the tenure of Captain Michael Abrashoff, who took what was apparently a pathological culture and turned it into a generative one (Abrashoff, 2002). Information flow was one of the benefits of this generative environment. Not only were the ship's "latent pathogens" quickly identified and fixed, but the openness led to solving a technological communication problem that had long hobbled the Pacific Fleet.
What is it about generative culture that makes information flow? Actually, generative culture supports IF for three reasons. First of all, there is the trust issue, mentioned earlier. Second, generative culture emphasizes the mission, an emphasis that allows people involved to put aside their personal issues and also the departmental issues that are so evident in bureaucratic organizations. The mission is primary. And third, generativity encourages a "level playing field," in which hierarchy plays less of a role (e.g. Edmondson, 1996). Thus one can have a "boundaryless organization," to use a phrase popular at General Electric (Ashkenas et al., 1995). Because hierarchy is de-emphasized, most of the problems that attend getting information to flow up hierarchies disappear. Information flow, then, is testimony to quality of worklife, because a bad quality of worklife will interfere with flow.
Another feature of generative environments is getting information to flow across the internal boundaries of the organization. When every department competes with other departments, e.g. in a bureaucratic environment, sharing information with other departments is a problem. Yet often the "faint signals" that something is amiss are perceived by the "wrong person" organizationally. There is a very interesting anecdote about Charles Franklin ("Boss") Kettering in Douglas McGregor's book (1966, pp. 118–119). He mentions that Kettering one day had gone down to the shipyard to see about the installation of one of his diesel engines. While there, he noticed a painter looking at the ship's propeller. He asked the painter why he was staring at the propeller, and the painter offered that he thought the propeller was too large by five inches. Concerned that something might be wrong, Kettering called the design office and asked them to come down and measure the propeller, which they claimed they had already done. When the propeller was re-measured, however, the painter turned out to be correct.
And then we have the famous "bottle of champagne" story about Wernher von Braun. Since this special issue concerns the "Foundations of Safety Science," I am going to quote the original source of this story, von Braun himself:
"One of our early Redstone missiles developed trouble in mid-flight. The telemeter records indicated that the flight had been flawless up to that instant, and permitted us to localize the probable source of trouble. However, the suspected area had been very carefully checked in numerous laboratory tests so that all explanations sounded highly artificial.

Several theories were advanced. Finally one theory was accepted as most likely and remedial action on it was initiated. At this point an engineer who was a member of the firing group called and said he wanted to see me. He came up to my office and told me that during firing preparation he had tightened a certain connection just to make sure that there would be good contact.

While doing so, he had touched a contact with a screwdriver and drawn a spark. Since the system checked out well after this incident, he hadn't paid any attention to the matter. But now that everybody was talking about a possible failure in that particular apparatus, he just wanted to tell me the story, for whatever it was worth. A quick study indicated that this was the answer. Needless to say, the "remedial action" was called off and no changes were made.

I sent the engineer a bottle of champagne because I wanted everybody to know that honesty pays off, even if someone may run the risk of incriminating himself. Absolute honesty is something you simply cannot dispense with in a team effort as difficult as that of missile development." (von Braun, 1956, p. 41)
And in fact von Braun was an exemplary leader when it came to seeing that information flowed up, down, and across at Marshall Space Flight Center (Tompkins, 1993). By contrast, in pathological organizations, information flow is problematic. First of all, an environment of fear and dread makes people cautious about what they say. People try to minimize risks. Second, the emphasis is on providing additional power and glory to the chiefs. Whatever does not serve this end is less important. And third, hierarchy provides strong incentives not to "speak up," to avoid provoking retaliation. For instance, the original reason for creating "cockpit resource management" seemingly had to do with encouraging co-pilots to speak up to their pilots. But "speaking up" is discouraged when blame and punishment are a major element in the environment, or when differences in power and position are great (Stein, 1967).
Problems with hierarchy can become much worse when the manager is a bully, willing to seek severe sanctions against those unwilling to "get in line." The candidacy of John R. Bolton for American ambassador to the United Nations allowed a rare glimpse of a public figure who seems routinely to have used bullying to pressure unwilling subordinates and associates. Bolton sought to punish, by threat and removal from office, those under his management control (and even those not) who defied his wishes. Several officials testified to Bolton's threats and attempts to have them dismissed when they held opinions different from his own, and to his willingness to override their ideas in favor of his own (Jehl, 2005).
Another special problem, when powerful people are involved, is that certain subjects become "off-limits." This is very common, for instance, in sexual abuse cases. We have already mentioned the case of Jimmy Savile at the BBC (see also Dintner, 2003). "Off-limits" was also an operative factor in the disturbing Penn State child abuse case, in which protecting an athletic team and its coach was more important than the safety of young people (Belson, 2012). And of course it was also involved in virtually all the child abuse cases within the Catholic Church (Berry, 1992; Dintner, 2003), and in many other churches and religious organizations as well.
But power also operates to bottle up information in frankly political situations, for instance in authoritarian or totalitarian regimes, when certain facts are "inconvenient" (French, 2000). During the "Great Leap Forward" in China, a famine was raging, but information about it was encapsulated because it was politically incorrect. If local leaders came forward, they would be accused of being "right deviationists" or food hoarders, and would suffer brutal punishments or death. Lower-level leaders, eager to appear proper communists, lied about the harvest results, with the outcome that little about the reality appeared on the surface. In addition, the party cadres often had food when the peasants did not, decreasing their motivation to raise the issue. In the end, 38 million people would die during the famine. It would be years before the truth was generally known (Jisheng, 2012).
5. Conclusion
What I have tried to establish here is that the safety scientist must necessarily be concerned with information flow. Not only is information flow the life blood of the organization's nervous system, but it is also a powerful indicator of other processes within the organization. I must admit that, early on, I did not appreciate how valuable IF analysis was. But its value became clear as more and more fellow scholars, and their industries as well, adopted and used these concepts in evaluating organizations and operations, from the emergency department to the flight deck, from the nuclear power console to the oil platform. I remember Jim Reason asking me, in the late 1980s, "Why do you think information is so important?" I don't think anyone would ask that question now.
Information flow is deeply linked to the safety culture of the organization (cf. Curry et al., 2011). Indeed, many others have proposed and used other measures of safety culture, notably Pronovost (2010) in relation to infection control. Furthermore, others have modified (i.e., elaborated) my own three categories into more. As I have indicated, I think the value of this elaboration is shaky. No doubt there is a "Westrum continuum." In fact one might say that, as one goes from pathological to generative, the organization becomes more mission-oriented and less "personal." Identifying sub-categories might be helpful in particular contexts. I am not convinced that there is a theoretical reason to do so.

In summing up, culture is no longer neglected. Information flow is of course only one issue among many in safety culture, but I feel it is a royal road to understanding much else.
References
Abrashoff, M., 2002. It's Your Ship: Management Lessons from the "Best Damn Ship in the Navy". Warner Books, New York.
Ashkenas, R., Ulrich, D., Jick, T., Kerr, S., 1995. The Boundaryless Organization:
Breaking the Chains of Organizational Structure. Jossey-Bass, San Francisco.
Belson, K., 2012. Abuse Scandal Damns Paterno and Penn State. New York Times, 13
July, pp. A1, B13.
Berry, J., 1992. Lead Us Not Into Temptation: Catholic Priests and Sexual Abuse of
Children. Doubleday, New York.
Broughton, J., 2008. Rupert Red Two: A Fighter Pilot's Life from Thunderbolts to Thunderchiefs. Zenith Press, St. Paul.
Burns, J.F., Castle, S., 2012. BBC’s Leaders Faulted as Lax in Handling Sex Abuse
Crisis. New York Times, 20 December.
Burns, J.F., Cowell, A., 2013. Report Depicts Horrific Pattern of Child Sexual Abuse by
BBC Celebrity. New York Times, 12 January, p. A8.
Curry, L. et al., 2011. What distinguishes top-performing hospitals in acute myocardial infarction mortality rates? Ann. Intern. Med. 156 (6), 384–390.
Coram, R., 2002. Boyd: The Fighter Pilot who Changed the Art of War. Little Brown, New York.
Dintner, P.E., 2003. A Catholic Crisis, Bestowed from Above. New York Times, 1 January, p. A19.
Dwyer, J., 2013. In the End, He Stole Nothing and Gave Plenty. New York Times, 19 February.
Edmondson, A.C., 1996. Learning from Mistakes Is Easier Said than Done. J. Appl.
Behav. Sci 32 (1), 5–28.
Eichenwald, K., 2012. Microsoft’s Lost Decade. Vanity Fair, August.
Freiberg, K., Freiberg, J., 1998. Nuts! Southwest Airlines' Crazy Recipe for Business and Personal Success. Broadway Books, New York.
French, H.W., 2000. Japan Debates Culture of Covering Up. New York Times, 2 May, p. A12.
Gittell, J., 2002. The Southwest Airlines Way: Using the Power of Relationships to Achieve High Performance. McGraw-Hill, New York.
Hansen, J.R., 1995. Enchanted Rendezvous: John C. Houbolt and the Genesis of the Lunar Orbit Rendezvous Concept. Monographs in Aerospace History #4. NASA History Office, Washington, DC.
Hudson, P., 2007. Implementing a safety culture in a major multinational. Saf. Sci.
45, 697–722.
Jehl, D., 2005. 3 Ex-officials Describe Bullying by Bolton. New York Times, 3 May, p. A10.
Jisheng, Y., 2012. Tombstone: The Great Chinese Famine, 1958–1962. Farrar, Straus and Giroux, New York.
Katz, D., Allport, F., 1931. Student Attitudes. Craftsman Press, Syracuse, NY.
McGregor, D., 1966. Leadership and Motivation. MIT Press, Cambridge, MA.
Martin, D., 2013. Richard Murphy, 68; Aided Disadvantaged Youths. New York Times, 16 February.
Moody, R., 1975. Life After Life. Bantam, New York.
Parker, D. et al., 2006. Framework for understanding the development of
organizational safety culture. Saf. Sci. 44, 551–562.
Perrow, C., 1984. Normal Accidents: Living with High Risk Technologies. Basic Books, New York.
Pronovost, P., 2010. Safe Patients, Smart Hospitals. Penguin, New York.
Reason, J., 1990. Human Error. Cambridge University Press, New York.
Stein, L.I., 1967. The doctor-nurse game. Arch. Gen. Psychiatry 16 (6), 699–703.
Tompkins, P.K., 1993. Organizational Communication Imperatives: The Lessons of
the Space Program. Roxbury Publishing, Syracuse, NY.
von Braun, W., 1956. Teamwork: Key to Success in Guided Missiles. Missiles and
Rockets, October, pp. 38–42.
Westrum, R., 1978. Science and social intelligence about anomalies: the case of meteorites. Soc. Stud. Sci. 8, 461–493.
Westrum, R., 1982. Social intelligence about hidden events: its implications for scientific research and social policy. Knowl.: Creat., Diffus., Util. 3 (3), 381–400.
Westrum, R., 1986. The Blind Eye of Science. Whole Earth Review, Fall, pp. 36–41.
Westrum, R., 2000. Safety planning and safety culture in the JCO criticality accident.
Cogn. Technol. Work 2, 240–241.
Westrum, R., 2004. A typology of organizational cultures. Qual. Safety Health Care
13 (Suppl. II), ii22–ii27.
Westrum, R., 2009a. Information flow and problem-solving. In: Cosby, K., Croskerry, P., Wears, R. (Eds.), Handbook of Patient Safety in Emergency Medicine. Lippincott Williams, Philadelphia.
Westrum, R., 2009b. Hearing Faint Signals. Presentation at the Petroleum Safety Authority, Stavanger, Norway, 15 May 2009.
Wise, J. et al. (Eds.), 1994. Verification and Validation of Complex Systems: Human
Factors. Springer, New York.
... We do not discount the benefits of technology, quite the contrary, but it is the nature of an Alliance's culture, which tends to be generative, that enables information to flow and be communicated (Westrum, 2014). Put simply, "by examining the culture of information flow, we can get an idea of how well people in the organisation are cooperating and also, how effective their work is likely to be" in ensuring quality (Westrum, 2014: p.58). ...
... Put simply, "by examining the culture of information flow, we can get an idea of how well people in the organisation are cooperating and also, how effective their work is likely to be" in ensuring quality (Westrum, 2014: p.58). So, when information does not flow, it can adversely affect the functioning of a project (Westrum, 2014). The blocking of communication actions with regard to poor quality is a common occurrence in projects that are not procured under collaborative delivery approaches, as organisations and individuals tend to be blamed and its costs charged to those who are deemed responsible (Love et al., 2018). ...
... The Alliancing TOC development process not only supports but requires the kind of open questioning of assumptions, exploration of options, consequences and development of issue solutions through genuine dialogue. The two case study projects clearly demonstrated extensive organisational support through the 'one-team' alliance concept, that may be considered generative rather than bureaucratic or pathological organisational cultures (Westrum, 2014), and adopting mentoring and facilitated workshops to confront (rather than hide) emerging issues within an intellectually safe workplace culture and environment was also demonstrated by these case studies. The case studies also demonstrated appreciation of the value (see Table 1) of dialogue, not only during the TOC development process, but throughout the project delivery process using toolbox meetings, BIM and re-work minimisation tools (see Figure 2), that were applied in a low/no power and information flow asymmetry environment to support sense-making and decision making. ...
Conference Paper
Full-text available
The popularity of projects being undertaken using Integrated Project Delivery (IPD) forms continues. Delivery platforms include the Integrated Form of Agreement (IFOA) in North America, Alliancing in Australia and New Zealand, Finland, and the Netherlands, and the New Engineering Contract (NEC3/4) in the United Kingdom. Findings from numerous surveys and case studies suggest that this form of delivery is more frequently successful in providing value to the project owner and enabling a range of ancillary benefits to project teams that are collaborative and unified. However, simply forming an IPD team is insufficient. What appears to be the essential ingredient is the effective collaboration that shapes the quality of cross-team and cross-disciplinary dialogue. This raises interesting questions about providing value through project delivery. Aside from the intrinsic value of the project infrastructure developed, other forms of value that escape attention may be created through effective IPD. Accepting the premise that effective dialogue is the key to positive and value generating integrated team collaboration, this paper explores the value that effective dialogue brings to the Alliance delivery platform, across the development phase to implications for the facility operation and its eventual disposal/reuse.
... The study applied Agricultural Innovations Systems (AIS) theory (Klerxk and Nettle, 2013) and theory of information flow (Ron, 2014) that provide illustrations and the insights on actors and how actors interact to provide and share technical information and other services. The cowpea in the context of this study is a product of an innovation developed by multiple actors including extension agents, agro input/output traders and researchers. ...
... In order for the interaction to be effective, it has to be frequent and intense for it to offer an opportunity for actors to share relevant and timely information/other services needed by the farmers in order to improve on production and market shares. Ron (2014) notes that based on the theory of information flow, information is best shared using information sharing channels like, on-farm demonstration, local radio, mobile phone SMS/phone calls, brochure, face-to-face and for information shared to be useful, it has to be provided at the time that a farmer needs and it should be able to address the existing problem and the farmer should be able to apply it in a bid to improve on production, market shares and profitability. ...
Article
Full-text available
With the increased use of technology in agriculture, new improved farming and varieties of seeds have been adopted. The study explored how the interaction among the actors helped in sharing vital information on market shares of improved cowpea and relative to improved beans. A cross-sectional research design was used, while quantitative and qualitative approaches were adopted. The respondents were purposively selected because of their expert knowledge in the study. The findings showed, that beans and cowpea enterprise had 0.23 (23%), and 0.14 (14%) interaction density, while 23 and 16 actors participated in the promotion of beans and cowpea enterprises respectively. The findings also show that as a cowpea entrepreneur a farmer is the most influential and dominant actor in providing and using information while in the bean's enterprise, extension service providers stood out as the most influential actor and the farmers remain the most dominant actor. Fellow farmers showed higher influence and dominance in providing and sharing information and other services regarding cowpea. Cowpea was a more profitable and viable enterprise. The study recommends that researchers and other actors should engage in the promotion of improved technologies. There should be increased interaction amongst actors and this creates product loyalty and promotes channel distribution. The actors should also provide and share relevant and timely information regarding agronomic, post-harvest practices, potential market opportunities, and other services.
... Thus, organisations need to control any potential damage as quickly as possible (including any likelihood of error cascades) and reduce the occurrence of future errors (i.e., secondary prevention). Therefore, errors provide learning opportunities as organisations can better position themselves to anticipate 'what might go wrong' and implement routines to deal with errors when they arise (Westrum, 2014;Love and Matthews, 2020). ...
... But, having an organisation structure, decision rights systems, measures, and incentives to help drive an error management culture provide the foundations to tackle this problem. If errors are to be responded to quickly, then a minimal management layer needs to be in place to ensure information is not filtered out-the flow of information influences cooperation and the functioning of an organisation (Westrum, 2014). A 'gain-share/pain-share' regime exists as an incentive within an alliance contract. ...
Article
The research we present in this paper addresses the following question: What type of error culture does the rank-and-file workforce experience during construction, and does it help mitigate rework? We undertake an exploratory case study of an alliance, which forms part of a transport mega-project. An error culture questionnaire is administered to the alliance's subcontractors' workforce across four projects. We find that an error management culture positively correlates with reductions in rework and holds a divergent relationship with an error aversion culture. We further reveal a negative association between an error aversion culture and the ability to reduce rework. Consequently, we question the contemporary wisdom that assumes that error prevention should be combined with error management to create an adaptive culture, aiming to minimise the negative and maximise positive error consequences. We finally discuss the study's limitations and implications for future research examining error culture in construction projects.
... Relatively, information flow within an organization can be said to be extremely important in determining the success or failure of the business in the long run. Westrum (2014) posited that information flow in organizations can be defined as timely, relevant and appropriate flow of information from a sender (transmitter) at point A to a receiver (recipient) at point B. Also, information flow may be defined as a maintained and updated stream of information from a source towards a destination. Similarly, Mahto and Davies (2012) stated that information can flow in four directions in an organization such as downward, upward, horizontally, and diagonally. ...
Article
Full-text available
Employee job performance may be considered as behavior and outcomes that employees undertake to contribute their quota to the organizational goals. An organization like Federal Inland Revenue Service has attracted different forms of criticism in regards to its performance as it said that they have been unable to deliver the goals and objective it was created for. Information flow has been identified as one of the solutions that can enhance the performance of any organization. Previous studies available to the researchers on the concept of Information flow were not carried out in Nigeria, which made the researchers carry out this study to evaluate information flow and employee's job performance in Federal Inland Revenue Service, Lagos State, Nigeria. This study used a survey design. A population of 630 was used for the study. A sample size of 245 was used using Taro Yamane's formula. The purposive sampling technique was used to select six (6) departments while proportionate stratified sampling was used to select the number of respondents that questionnaire was administered in each department. Data were analyzed using descriptive and inferential statistics. Investigating the influence of information flow reveals that information flow has a positive and significant influence on employee job 253 performance (β=0.226, p=0.05). The study concludes statistically that information flow has a positive and significant influence on employee job performance in Federal Inland Revenue Service, Lagos State. It further recommends that the culture of information flow should be maintained to help sustain the performance of the organization and its workforce.
... Relatively, information flow within an organization can be said to be extremely important in determining the success or failure of the business in the long run. Westrum (2014) posited that information flow in organizations can be defined as timely, relevant and appropriate flow of information from a sender (transmitter) at point A to a receiver (recipient) at point B. Also, information flow may be defined as a maintained and updated stream of information from a source towards a destination. Similarly, Mahto and Davies (2012) stated that information can flow in four directions in an organization such as downward, upward, horizontally, and diagonally. ...
Article
Full-text available
The quality of teaching being imparted to students is supposed to be of high quality and effective in this information superhighway age. Teaching effectiveness typified with teaching methodologies, classroom management, assessment procedures and content knowledge is of great unease to any tertiary education institution since it furthers the productivity of the institution. In tertiary institutions, like public polytechnics in Ekiti and Ondo states, teaching effectiveness seems to be of poor quality because of students’ low capacity to analytically think, poor lecturers proficiency. Educational mobile apps use can further the teaching effectiveness in any institution of higher learning. Therefore, this study investigated the influence of educational mobile apps use on teaching effectiveness of lecturers in public polytechnics in Ekiti and Ondo States, Nigeria. The study used survey research design. The population of the study consisted of 116 lecturers and 1,978 students in the three public polytechnics in Ekiti and Ondo states. The lecturers were all enumerated to participate, while Taro Yamane was used to select 333 students’ participants and multistage sampling was used to select the participants from the various faculties, departments and levels. A self-structured validated, and reliable questionnaire was used to gather data. The data collected were analyzed with the use of descriptive statistics, and linear regression. Findings showed that the most used educational mobile apps was Google Apps for education (GAFE) with a mean score of (x̅=1.64).The result also indicated that there was a weak positive but not significant influence of educational mobile apps use on the teaching effectiveness in public polytechnics in Ekiti and Ondo State, (β=0.027, t = 0.270, p-value>0.05). The study concluded that educational mobile apps use contributes to teaching effectiveness of lecturers in public polytechnics in Ekiti and Ondo states, Nigeria. It was therefore recommended that the government of Nigeria, through the ministry in charge of education, and National Board for Technical Examinations (NBTE) should carry out continuous awareness program and training for lecturers in public polytechnics on the use of educational mobile apps.
... due to employees being discouraged from voicing concerns), information on risk is less likely to be shared, which increases the likelihood of an accident due to decision-makers (e.g. supervisors, senior managers) being unaware of emerging hazards (Turner and Pidgeon 1997;Westrum 2014). Examples include the Columbia Space Shuttle disaster, where junior engineers remained silent about their concerns regarding damage on the aircraft prior to shuttle launch, or the Tenerife aircraft collision, where opportunities for a co-pilot to raise concerns to the pilot about the unsafe take-off were not taken (Edmondson 2018). ...
Article
Full-text available
Safety communication relates to the sharing of safety information within organizations in order to mitigate hazards and improve risk management. Although risk researchers have predominantly investigated employee safety communication behaviors (e.g. voice), a growing body of work (e.g. in healthcare, transport) indicates that public stakeholders also communicate safety information to organizations. To investigate the nature of stakeholder safety communication behaviors, and their possible contribution to organizational risk management, accounts from patients and families – recorded in a government public inquiry – about trying to report safety risks in an unsafe hospital were examined. Within the inquiry, 410 narrative accounts of patients and families engaging in safety communication behaviors (voicing concerns, writing complaints, and whistleblowing) were identified and analyzed. Typically, the aim of safety communication was to ensure hospital staff addressed safety risks that were apparent and impactful to patients and families (e.g. medication errors, clinical neglect), yet unnoticed or uncorrected by clinicians and administrators. However, the success of patient and family safety communication in ameliorating risk was variable, and problems in hospital safety culture (e.g. high workloads, downplaying safety problems) meant that information provided by patients and families was frequently not acted upon. Due to their distinct role as independent service-users, public stakeholders can potentially support organizational risk management through communicating on safety risks missed or not addressed by employees and managers. However, for this to happen, there must be capacity and openness within organizations for responding to safety communication from stakeholders.
... When the flow of information is impeded in an organisation, it can adversely affect its ability to function effectively (Westrum, 2014), and problems may become masked. A case in point is rework 2 performed during the construction of mega-transport projects where, on average, it has been shown to increase construction costs by 12% (Li and Taylor, 2014). ...
Article
Full-text available
Within construction, we have become increasingly accustomed to relying on the benefits of digital technologies, such as Building Information Modelling, to improve the performance and productivity of projects. We have, however, overlooked the problems that technology is unable to redress. One such problem is rework, which has become so embedded in practice that technology adoption alone cannot resolve the issue without fundamental changes in how information is managed for decision-making. Hence, the motivation of this paper is to bring to the fore the challenges of classifying and creating an ontology for rework that can be used to understand its patterns of occurrence and risks and provide a much-needed structure for decision-making in transport mega-projects. Using an exploratory case study approach, we examine ‘how’ rework information is currently being managed by an alliance that contributes significantly to delivering a multi-billion dollar mega-transport project. We reveal the challenges around location, format, structure, granularity and redundancy hindering the alliance’s ability to classify and manage rework data. We use the generative machine learning technique of Correlation Explanation to illustrate how we can make headway toward classifying and then creating an ontology for rework. We propose a theoretical framework utilising a smart data approach to generate an ontology that can effectively use business analytics (i.e., descriptive, predictive and prescriptive) to manage rework risks.
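Since the abstract names Correlation Explanation (CorEx) but gives no implementation detail, here is a minimal, hypothetical sketch of how free-text rework records could be grouped into latent topics using the open-source corextopic package. The record snippets, topic count, and parameters are all assumptions, not the paper's actual pipeline.

import scipy.sparse as ss
from sklearn.feature_extraction.text import CountVectorizer
from corextopic import corextopic as ct

# Hypothetical free-text rework records; the paper's data are not public.
records = [
    "pile cap reinforcement clashed with drainage, demolished and recast",
    "duct bank rerouted after setting-out error, trench re-excavated",
    "bearing plate out of tolerance, grout removed and replaced",
]

# CorEx expects a sparse binary document-word matrix.
vectorizer = CountVectorizer(binary=True, stop_words="english")
doc_word = ss.csr_matrix(vectorizer.fit_transform(records))
words = list(vectorizer.get_feature_names_out())

# Learn a small number of maximally informative latent topics.
model = ct.Corex(n_hidden=2, seed=0)
model.fit(doc_word, words=words)

for n, topic in enumerate(model.get_topics()):
    if topic:
        topic_words, _, _ = zip(*topic)
        print(f"topic {n}: {', '.join(topic_words)}")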
Article
There has been a wealth of research that has examined the nature of rework in construction. Progress toward addressing the rework problem has been limited; it still plagues practice, adversely impacting a project’s performance. Almost all rework studies have focused on determining its proximal or root causes and have therefore overlooked the conditions that result in its manifestation. In filling this void, this paper draws upon our previous empirical studies, amongst others, to provide a much-needed theoretical framing to better understand why rework occurs, its consequences, and how it can be mitigated during construction. The theoretical framing we derive from our review provides construction organizations and their projects with a realization that the journey to mitigating rework begins with creating an error mastery culture comprising authentic leadership, psychological safety, an error management orientation, and resilience. We suggest that once an error mastery culture is established within construction organizations and their projects, they will be better positioned to realize the benefits of techniques, tools, and technologies espoused to address rework, such as the Last Planner® and Building Information Modeling. We also provide directions for future research and identify implications for practice so that strides toward rework mitigation in construction can be made.
Article
Full-text available
Background: Establishing more substantial patient involvement in health care has become fundamental to Western health care services. Person-centred care (PCC) has been developed as a way of working that involves patients and family members. However, the implementation of PCC in clinical practice has proven challenging. The aim of this study was to explore the congruence of managers’ perceptions and understanding of various aspects of PCC across three organisational levels in one health care region in Sweden, in terms of coupling, decoupling and recoupling.
Methods: A policy on increased patient participation in health care was adopted in one health care region in Sweden. This policy was embodied in the form of PCC, and a support strategy for the implementation was put in place. Participants representing three organisational levels (senders: politicians, n = 3; messengers: senior management, n = 7; and receivers: middle- and frontline managers, n = 13) were interviewed and documents collected. A deductive qualitative content analysis was performed and findings from the three organisational levels compared.
Results: Descriptions of PCC at all three organisational levels included health care provided in partnership between provider and patient. However, messengers and receivers also included aspects of how work was organised as part of the concept. Representatives at all levels expected high-quality care at reduced health care cost as an outcome; messengers and receivers also anticipated improvements in the work environment and reduced staff turnover. Strategies to support implementation included continuing and enhancing existing routines that were considered person-centred and developing new ones. A need was described to make PCC less ‘fuzzy’ and ambiguous and instead communicate a more tangible care process. Some representatives among messengers and receivers also suggested that no action was needed because practice was already considered person-centred.
Conclusion: The findings indicated congruence between organisational levels in some aspects, suggesting coupling between policy and practice. However, incongruences were also identified, which might be due to the fuzziness of definitions and the application of PCC in practice, and to the difficulty of assessing the level of patient-centredness in clinical practice.
Article
Full-text available
This research explores how group- and organizational-level factors affect errors in administering drugs to hospitalized patients. Findings from patient care groups in two hospitals show systematic differences not just in the frequency of errors, but also in the likelihood that errors will be detected and learned from by group members. Implications for learning in and by work teams in general are discussed.
Article
Full-text available
In making decisions about the reality of alleged anomalous events, scientists are likely to weigh both the a priori plausibility of what is alleged and the credibility of the reports which reach them. The present paper is an attempt to examine the anomaly reporting processes which led to the scientific recognition of the reality of meteorites in the eighteenth century. It is shown that scientists fail to make realistic assumptions about anomaly reporting, and that this failure affects the accuracy of the decisions made about anomalies. The treatment of reports about alleged anomalous events is further shown to be related to the scientific community's concerns about protecting its internal processes from external interference. The recognition of meteorites took place only when the savants of the eighteenth century 1) found a way of evaluating the reports, 2) devised a theory to explain them, and 3) received unimpeachable eyewitness testimony of their occurrence.
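The weighing of a priori plausibility against report credibility that this abstract describes can be read as a Bayesian update. The sketch below is purely illustrative, with invented probabilities; it is not a model from the paper.

def posterior(prior: float, p_report_if_real: float,
              p_report_if_fictitious: float, n_reports: int = 1) -> float:
    """P(anomaly is real | n independent reports), by Bayes' rule."""
    like_real = p_report_if_real ** n_reports
    like_not = p_report_if_fictitious ** n_reports
    return prior * like_real / (prior * like_real + (1 - prior) * like_not)

# One report of stones falling from the sky barely moves a skeptical savant...
print(posterior(0.01, 0.8, 0.2))               # ~0.04
# ...but several independent, credible eyewitness reports can overturn
# even a very low prior plausibility.
print(posterior(0.01, 0.8, 0.2, n_reports=5))  # ~0.91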
Book
Despite its increasing importance, the verification and validation of the human-machine interface is perhaps the most overlooked aspect of system development. Although much has been written about the design and development process, very little organized information is available on how to verify and validate highly complex and highly coupled dynamic systems. Inability to evaluate such systems adequately may become the limiting factor in our ability to employ systems that our technology and knowledge allow us to design. This volume, based on a NATO Advanced Science Institute held in 1992, is designed to provide guidance for the verification and validation of all highly complex and coupled systems. Air traffic control is used as an example to ensure that the theory is described in terms that will allow its implementation, but the results can be applied to all complex and coupled systems. The volume presents the knowledge and theory in a format that will allow readers from a wide variety of backgrounds to apply it to the systems for which they are responsible. The emphasis is on domains where significant advances have been made in the methods of identifying potential problems and in new testing methods and tools. Also emphasized are techniques to identify the assumptions on which a system is built and to spot their weaknesses.
Article
Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them. The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.
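Perrow's two dimensions of risk lend themselves to a simple illustration. The sketch below encodes the complex/linear and tight/loose distinctions as a small data structure; the example placements echo Perrow's well-known chart, but treat them as illustrative rather than definitive.

from dataclasses import dataclass

@dataclass(frozen=True)
class SystemRisk:
    name: str
    complex_interactions: bool  # complex (True) vs. linear (False)
    tight_coupling: bool        # tight (True) vs. loose (False)

    @property
    def normal_accident_prone(self) -> bool:
        # Perrow's claim: "normal accidents" concentrate where complex
        # interactions meet tight coupling.
        return self.complex_interactions and self.tight_coupling

# Example placements are illustrative only.
for s in [
    SystemRisk("nuclear power plant", True, True),
    SystemRisk("dam", False, True),
    SystemRisk("university", True, False),
    SystemRisk("assembly line", False, False),
]:
    print(f"{s.name}: normal-accident prone = {s.normal_accident_prone}")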