When Doing Wrong Feels So Right: Normalization of Deviance
Mary R. Price, MSN, RN-C, NEA-BC and Teresa C. Williams, MSN, RN, NE-BC
Abstract: Normalization of deviance is a term first coined by sociologist
Diane Vaughan when reviewing the Challenger disaster. Vaughan noted
that the root cause of the Challenger disaster was related to the repeated
choice of NASA officials to fly the space shuttle despite a dangerous de-
sign flaw with the O-rings. Vaughan describes this phenomenon as occur-
ring when people within an organization become so insensitive to deviant
practice that it no longer feels wrong. Insensitivity occurs insidiously and
sometimes over years because disaster does not happen until other critical
factors line up. In clinical practice, failing to do time outs before proce-
dures, shutting off alarms, and breaches of infection control are deviances
from evidence-based practice. As in other industries, health care workers
do not make these choices intending to set into motion a cascade toward
disaster and harm. Deviation occurs because of barriers to using the correct
process or drivers such as time, cost, and peer pressure. As in other indus-
tries, operators will often adamantly defend their actions as necessary and
justified. Although many other high-risk industries have embraced the nor-
malization of deviance concept, it is relatively new to health care. It is ur-
gent that we explore the impact of this concept on patient harm. We can
borrow this concept from other industries, along with the steps those
high-risk organizations have taken to prevent it.
Key Words: normalization of deviance, high-reliability
organizations, preventable harm, patient safety, medical errors
(J Patient Saf 2018;14:1–2)
Normalization of deviance is a term first coined by sociologist
Diane Vaughan when reviewing the Challenger space shuttle
disaster. Vaughan noted that the root cause of the Challenger
disaster was related to the repeated choice of NASA officials to
fly the space shuttle despite a dangerous design flaw with the
O-rings that failed to seal critical joints. Vaughan's analysis
showed that this deviance from an established standard was
labeled as acceptable risk. Accepting this risk led to the Chal-
lenger exploding on a freezing morning when the cold-stiffened
O-rings allowed a critical flow of hot propellant gases to blow by
and ignite, killing 7 astronauts. Since then, normalization of devi-
ance has been found to be a factor in many other recent disasters,
including the Columbia space shuttle disaster, the deadly chemical
release in Bhopal, India, and nuclear-related accidents at
Chernobyl and Three Mile Island.
Vaughan describes normalization of deviance as occurring
when people within an organization become so insensitive to
deviant practice that it no longer feels wrong. Insensitivity occurs
imperceptibly and sometimes over years because disaster does not
happen until other critical factors line up. In clinical practice,
failing to do time outs before procedures, shutting off alarms,
and breaches of infection control policies are examples of devi-
ances from standard, accepted, evidence-based practices. Although
the concept is relatively new to health care, it has tremendous appli-
cability. How many seasoned health care professionals have heard
novices state, “Every preceptor I have shows me a different way
to do things.” Another common statement heard from staff is,
“That's not how we do it on our unit.”
As in other industries, health care workers do not make these
choices intending to set into motion a cascade toward disaster
and harm. Health care providers may not see the required practice
as safer than the status quo because it has not been proven to them
to be better at preventing an adverse event. Frequently, deviation
occurs because of barriers to using the correct process or drivers
such as time, cost, and peer pressure.
Indeed, as in other indus-
tries, the operators will often adamantly defend their actions as
necessary and justified.
Eventually, the deviation becomes the
new norm. This new norm may even be justified as helping to
accomplish other important organizational goals such as customer
service and budget constraints, without seeming to compromise safety.
It has been observed that Murphy's Law is not appli-
cable to normalization of deviance. Everything that can go wrong
usually does not, and so normalization occurs. Good people make
small changes to a process, nothing bad happens, and then the
conclusion is drawn that the deviation is acceptable. Another
incremental change is made and results in the same conclusion.
At some point, however, something that could go wrong does go
wrong, and by that time, all the accumulated deviations over time
have removed the barriers that would have prevented the error
from becoming disastrous.
James Reason's Swiss Cheese Model
is useful in under-
standing how normalization of deviance sets up the system
for possible failure and resulting harm. Usually, no single isolated
cause leads to an adverse event. Often, multiple factors occurring
together result in failure and harm. The holes in the Swiss cheese
represent a breach in a potential barrier to an active error or unsafe
act. An ideal process is like a piece of Swiss cheese where none
of the holes ever line up. A failing process could be visualized
as several pieces of Swiss cheese with aligning holes. The same
concept can be applied to health care; if the right circumstances
occur at the right time, the holes line up, increasing the risk for
an adverse event.
No health care provider wants to be part of patient harm, yet it
happens all too frequently. A recent study reported in the Journal
of Patient Safety
estimates that acts of commission, omission,
and other preventable events cause more than 400,000 patient
deaths per year in the United States. The number of patient deaths
is equivalent to 22 Boeing 777 jets with 350 passengers and crew
crashing every week with no survivors and is at least 4 times the
number used in the landmark 1999 Institute of Medicine report,
To Err Is Human.
Harvard's Lucian Leape, one of the authors of
the Institute of Medicine report, says that it is time to embrace this
new number because it represents the best methodology to date.
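The jumbo-jet analogy can be verified with simple arithmetic on the figures given above (400,000 deaths per year, 350 passengers and crew per aircraft):

```latex
% Weekly death toll implied by 400,000 preventable deaths per year
\frac{400{,}000\ \text{deaths/year}}{52\ \text{weeks/year}} \approx 7{,}700\ \text{deaths/week}
% Spread across aircraft carrying 350 people each
\frac{7{,}700\ \text{deaths/week}}{350\ \text{people/aircraft}} \approx 22\ \text{aircraft/week}
```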
These numbers are not meant to discourage creativity and
change toward improvement in patient care. To avoid normaliza-
tion of deviance, change should not be freelanced, undocumented,
and hidden. Change should be instituted with transparency using
the tools of performance improvement, evidence-based practice
change, and research. To those who disdain “cookbook medicine,”
the response should be, “Thank heavens we finally have a cookbook!”
Once deviation is entrenched, rooting it out is challenging.
Keeping it out adds another challenge. Other industries have suc-
cessfully turned deviance around, providing lessons learned available
for health care to borrow. High-risk industries such as airlines
now meet the standard of being high-reliability organizations.
Achieving this level of reliability required a culture change in
those industries, which is exactly what is going to be required in health care.
A shift in focus from individual guilt to systems and processes
is central to the culture change facing health care. When an error
occurs, instead of looking for a scapegoat in the form of the last
link in the chain of a faulty system or process, the investigation
should search deeper, into the system or process itself. The patient
safety literature is rich in evidence demonstrating that a focus on
system fixes rather than trying to make humans perfect is much
more productive in preventing errors.
Blaming an individual does
nothing to change the system that pushed the individual to deviate.
Other individuals also deviate to cope with the same imperfect
system. Our communication with that last link, that last caregiver,
should not be an accusation, but rather a question, “Why did you
choose to do what you did? Was there a barrier to doing it accord-
ing to the established procedure?” The very fact that a deviance
was used is a signal that something is wrong in the system/
process/work flow, and the person struggling with the imperfect
system may be able to lead us to the holes in the Swiss cheese.
A normalization of deviance, a near miss, an error—all are faint
signals that there are problems in the system
and should cue
practitioners to dig deeper.
A focus on systems first requires a paradigm shift in thinking.
We have a history in health care of holding ourselves to a standard
of perfection. Only when health care providers accept the impos-
sibility of perfection will we value and prioritize the work that is
necessary to fix the processes, equipment, and work flows that
set up the errors. It is imperative that leadership at all levels learn
how to think in terms of systems and to look at how systems inter-
act with one another. Leadership involvement is often necessary
to implement patient safety changes, and the “top-down” aspect
is one that cannot be ignored; however, there needs to be recogni-
tion that unintended downstream consequences of leadership
decisions could affect patient safety.
This is not to push the
blame upstream from the frontline caregiver to leadership, be-
cause drivers and barriers also exist at the leadership level.
System fixes may involve eliminating drivers to doing things
the wrong way or removing barriers to doing things the right
way. There are opportunities for system fixes in every report of
an error or near miss. Bar codes that do not scan properly are a
driver to skip the scanning process to administer the medication
on time. Complicated procedures and poorly designed work areas
create insidious barriers to doing things the right way because we
fail to see these for the hazards they are. For example, a workplace
design with inconveniently placed hand hygiene stations has been
shown to decrease hand hygiene compliance. Other fac-
tors in the system are also at play; for example, simple peer pres-
sure can drive a person to deviance, and our novice nurses are
particularly vulnerable to being easily influenced.
Increasing awareness of this phenomenon is very important for
nursing because the nurse has more direct patient contact than
anyone else on the patient care team and often stands as the last
barrier to patient harm. Deviant practices that have become normal-
ized may remove that last barrier. According to 1 study, nurses will
create a work-around 93% of the time when faced with a problem
versus only 7% of the time reporting the process problem to those
who could focus on the contributing factors.
Further research is
needed to determine whether this tendency is prevalent among other
health care providers.
Frontline health care workers have the opportunity to take a
lead role in preventing deviance from creeping into practice.
Health care providers can accomplish this by being aware of
the concept of normalization of deviance, by being taught how
to report process problems in a nonpunitive environment, and by
being empowered to speak up to colleagues. Health care workers
should be open to constructive feedback and thank a colleague
who questions a work-around; this question may save the care pro-
vider from inadvertently causing patient harm.
Because normalization of deviance is a relatively new concept
to health care, becoming aware of this phenomenon is fundamen-
tal to eliminating and preventing this dangerous and often incre-
mental acceptance of unacceptable risk.
All of health care owns
the problem, and all must be part of the solution. The next
400,000 patients at risk are waiting.
REFERENCES
1. Vaughan D. The Challenger Launch Decision: Risky Technology, Culture,
and Deviance at NASA. Chicago, IL: University of Chicago Press; 1996.
2. Dekker S. Patient Safety: A Human Factors Approach. Boca Raton, FL:
CRC Press; 2011:36–45.
3. Banja J. The normalization of deviance in healthcare delivery. Bus Horiz.
4. Dekker S. The Field Guide to Understanding Human Error. 2nd ed.
Burlington, VT: Ashgate Publishing Ltd; 2013:4–13.
5. Langewiesche W. Inside the Sky. New York, NY: Pantheon; 1998: 79.
6. Perneger T. The Swiss cheese model of safety incidents: are there holes in
the metaphor? BMC Health Serv Res. 2005;5:71.
7. James J. A new evidence-based estimate of patient harms associated with
hospital care. J Patient Saf. 2013;9:122–128.
8. Institute of Medicine. To Err Is Human: Building a Safer Health System.
Washington, DC: National Academy Press; 2001:1.
9. The Advisory Board Company Web site. Available at: http://www.advisory.
leading-cause-of-death. Accessed October 28, 2013.
10. Chassin MR, Loeb JM. High reliability health care: getting there from here.
Milbank Q. 2013;91:459–490.
11. Bagian JP, Lee C, Gosbee J, et al. Developing and deploying a patient safety
program in a large health care delivery system: you can't fix what you don't
know about. Jt Comm J Qual Improv. 2001;27:522–532.
12. Weick K, Sutcliffe K. Managing the Unexpected. 2nd ed. San Francisco,
CA: Jossey-Bass; 2007:32–35.
13. Henriksen K, Dayton E, Keyes M, et al. Understanding adverse events: a
human factors framework. In: Patient Safety and Quality: An
Evidence-Based Handbook for Nurses. Rockville, MD: U.S. Department
of Health and Human Services; 2008:67–85.
14. World Health Organization. WHO guidelines on hand hygiene in health
care: a summary. Available at: http://www.who.int/gpsc/5may/
tools/9789241597906/en/. Updated July 2009. Accessed June 20, 2014.
15. Tucker A, Edmondson A. Why hospitals don't learn from failures:
organizational and psychological dynamics that inhibit system change.
Calif Manage Rev. 2003;45:55–72.