HEALTHCARE

Benchmarking in Hospitals: More Than a Scorecard

by Victor E. Sower

Quality Progress, August 2007, www.asq.org

In 50 Words or Less

• Benchmarking should not just involve comparing your hospital with national averages; it should involve looking at best in class hospitals and finding out what they do.

• Hospitals shouldn't limit their benchmarking to just the healthcare industry; there's much to learn from the service industry, too.

If you look at many hospitals' websites and other resources they make publicly available, you will often see charts, graphs or tables showing hospital performance on some metric compared with a national standard.

An example is Table 1, which was taken from a hospital website and compares patient satisfaction in numerous areas with the national average. The hospital highlighted the areas in which it exceeded the national average.

TABLE 1  Benchmarking outcome measures

Item                           | Our hospital | National average
Overall patient satisfaction   | 84.9         | 82.5
Check-in                       | 82.0         | 79.0
Nurses                         | 79.3         | 84.4
Doctors                        | 84.1         | 85.4
Tests                          | 85.5         | 85.8
Family or friends              | 88.9         | 85.5
Waiting time                   | 83.6         | 87.5

Key (colors in the original): black, our hospital lower than national average; blue, our hospital higher than national average.

The term "benchmarking" is often mentioned in hospital quality literature, but the process of benchmarking is often misunderstood. True benchmarking is not simply comparing outcome measures with industry averages.

ASQ defines benchmarking as "a technique in which a company measures its performance against that of best in class companies, determines how those companies achieved their performance levels and uses the information to improve its own performance. Subjects that can be benchmarked include strategies, operations and processes."¹
Doing a simple comparison with a national average is more like a scoreboard showing who is winning. It only answers the question "Am I above or below average?" This doesn't tell the hospital how to improve operations. The approach might be of interest to the general public, government and accreditation agencies, but it is of limited value as input to a hospital's process of continuous quality improvement (CQI).
Ask yourself which would contribute more to your CQI program: knowing your hospital is slightly above the national average in controlling methicillin-resistant Staphylococcus aureus (MRSA), or understanding the processes the University of Virginia Hospital used to achieve best in class MRSA control?²
There is value in comparisons with national averages. Residents of the hospital's service area can judge the quality of the hospital compared with national averages. The hospital's quality director and quality improvement teams can use this information to determine which areas most need improvement. The improvement efforts' progress can be monitored over time to determine whether the actions taken are effective in closing the gaps between the hospital's performance and national averages.
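The gap analysis described above is simple to automate. The sketch below uses the figures from Table 1 to rank each patient-satisfaction area by its gap to the national average, so an improvement team can see at a glance where the hospital lags furthest behind; the dictionary layout is just one convenient representation.

```python
# Rank patient-satisfaction areas by their gap to the national average,
# using the figures from Table 1: (our hospital, national average).
scores = {
    "Overall patient satisfaction": (84.9, 82.5),
    "Check-in": (82.0, 79.0),
    "Nurses": (79.3, 84.4),
    "Doctors": (84.1, 85.4),
    "Tests": (85.5, 85.8),
    "Family or friends": (88.9, 85.5),
    "Waiting time": (83.6, 87.5),
}

# A negative gap means the hospital is below the national average,
# making that area a candidate for improvement effort.
gaps = {item: ours - natl for item, (ours, natl) in scores.items()}

# Print worst gaps first.
for item, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{item:30s} {gap:+.1f}")
```

Run against the Table 1 data, this puts "Nurses" at the top of the improvement list, which matches what a reader would pick out of the table by eye.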
However, for all its usefulness, comparison with national averages is insufficient. Meeting the national average does not equal excellence. It might not even equal sufficiency.
A Canadian study found that 7.5% of patients experienced at least one adverse event because of medical errors in 2000.³ If your hospital has a medical error rate of 7%, it is better than the national average. Is that sufficient? Wouldn't it be better to know what the error rate is at the best hospitals? Wouldn't it be even better to understand how those best in class hospitals achieved their benchmark medical error rates?
The Leapfrog Hospital Quality and Safety Survey found that 50% of hospitals do not have procedures to prevent bedsores.⁴ If your hospital has any such procedures, you are above the national average. Is that sufficient? Wouldn't it be better to know what the procedures are at the hospitals with the lowest incidence of bedsores?
Without information about the processes the best hospitals use, we must approach improvement by reinventing the wheel. We are doomed to make the same mistakes other hospitals have already made and learned from.
A further problem with national averages is that we don't know which hospitals are the best performers, and we don't know what best in class performance is. National averages provide no measure of variation in performance and no information about the level best in class performers achieve. Variation in performance can be a bigger problem than average performance.
The Nebraska Medical Center's interventional radiology department undertook a project to address major treatment delays, which were creating patient dissatisfaction and driving patients to seek treatment elsewhere.⁵ The department found it took an average of 1.4 calls to schedule an appointment. Further analysis revealed the standard deviation was 0.989 calls, with a maximum of seven calls.
After several improvement projects had been completed, the average was still 1.4 calls. However, the standard deviation had been reduced to 0.52 calls, with a maximum of three calls. Had the department relied on a comparison with national averages, this significant process improvement would not have been visible.
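The point that averages can mask variation is easy to demonstrate numerically. The sketch below uses hypothetical call counts, not the department's actual data, constructed so that the "before" and "after" samples share the same mean while differing sharply in spread and worst case:

```python
import statistics

# Hypothetical call counts needed to schedule an appointment, before and
# after improvement. These are illustrative values only (not the
# department's data), built so both samples average 1.4 calls.
before = [1] * 17 + [2] * 2 + [7]      # mean 1.4, max 7 calls
after = [1] * 14 + [2] * 4 + [3] * 2   # mean 1.4, max 3 calls

for label, calls in (("before", before), ("after", after)):
    print(f"{label}: mean={statistics.mean(calls):.1f}, "
          f"stdev={statistics.stdev(calls):.2f}, max={max(calls)}")
```

A comparison of means alone would report "no change," while the standard deviation and maximum show a substantially more predictable process.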
Benchmarking is an improvement process in which an organization measures its performance against that of best in class organizations within or outside its industry, determines how those organizations achieved their performance levels, and uses that information to improve its own performance. Benchmarking can be a valuable tool in moving beyond national average performance to best in class performance.
Best In Class
While it is useful to discuss improvement efforts with other hospitals that are nearby, easily accessible or otherwise convenient, to aspire to excellence you must compare yourself with excellent hospitals. One such best in class hospital is Robert Wood Johnson (RWJ) University Hospital in Hamilton, NJ.
RWJ received a 2004 Malcolm Baldrige National Quality Award. It had a quality program in place in 1999 based on five pillars of excellence: service, finance, quality, people and growth. But, looking for ways to better serve its customers, the hospital's management decided to use the Baldrige criteria as a "framework for leadership and acceleration of [its] quality journey."⁶
One of RWJ’s achievements is best in class ser-
vice in its emergency department (ED). Its 15/30
program guarantees every patient will see a nurse
within 15 minutes and a doctor within 30 minutes
of entering the ED. RWJ backs this program with
an extraordinary guarantee—if it fails to meet the
guarantee, the ED portion of the bill will be waived
upon patient request. The hospital’s payout is less
than 1%, indicating it has a process in place to
achieve the desired results. Patient satisfaction
with ED increased from 85% in 2001 to 90% in
2004. Because 70% of the hospital’s inpatients
enter through the ED, this program has con-
tributed to overall hospital success.
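A guarantee like 15/30 is only manageable if it is measured continuously. The sketch below shows one way a department might check visit records against such a standard; the record fields (arrival, nurse_seen, doctor_seen) are hypothetical, and a real system would pull them from the ED's patient-tracking data.

```python
from datetime import datetime, timedelta

# Thresholds for a 15/30-style guarantee: nurse within 15 minutes,
# doctor within 30 minutes of arrival.
GUARANTEE = {"nurse": timedelta(minutes=15), "doctor": timedelta(minutes=30)}

# Two illustrative visit records (field names are hypothetical).
visits = [
    {"arrival": datetime(2007, 8, 1, 9, 0),
     "nurse_seen": datetime(2007, 8, 1, 9, 10),
     "doctor_seen": datetime(2007, 8, 1, 9, 25)},   # both targets met
    {"arrival": datetime(2007, 8, 1, 9, 30),
     "nurse_seen": datetime(2007, 8, 1, 9, 50),     # nurse after 20 min
     "doctor_seen": datetime(2007, 8, 1, 10, 5)},   # doctor after 35 min
]

def meets_guarantee(v):
    """True if both the nurse and doctor targets were met for a visit."""
    return (v["nurse_seen"] - v["arrival"] <= GUARANTEE["nurse"]
            and v["doctor_seen"] - v["arrival"] <= GUARANTEE["doctor"])

missed = [v for v in visits if not meets_guarantee(v)]
print(f"guarantee missed on {len(missed)} of {len(visits)} visits "
      f"({100 * len(missed) / len(visits):.0f}%)")
```

Tracking the miss rate this way is what lets a hospital state, as RWJ can, that its payout rate is below 1% and back the claim with data.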
Another hospital reports an average time of 47 minutes from entering the ED to seeing a physician. The graph on its website shows this is better than the national norm of about 55 minutes. Clearly, this is an above average hospital. But it is not best in class. It should benchmark against RWJ's best in class performance, not the national norm.
Inside or Outside the Industry
While numerous hospitals have been recognized for excellence (four have received the Baldrige award since 2002), hospitals need not restrict their searches for benchmarking partners to other hospitals. Joseph Juran wrote, "As the health industry undertakes … change, it is well advised to take into account the experience of other industries in order to understand what has worked and what has not. The health industry is different … however, the decisive factors in what works and what does not are the managerial processes, which are alike for all industries."⁷
For example, hospitals share several processes with hotels. The Ritz-Carlton Hotel Co., which received a Baldrige award in 1992, has approaches to employee training, room service, custodial services, customer orientation and quality metrics that hospitals could learn from. Disney is well known for employee training and customer orientation, both important to hospitals. Both of these organizations were used as benchmark standards by Bronson Methodist Hospital in Kalamazoo, MI, also a Baldrige recipient.⁸
Benchmarking is not just copying what other successful organizations are doing. It involves not only understanding best in class organizations' goals and how they have achieved those goals through process and operations improvement, but also taking that information back to your own organization to determine how to achieve comparable results given your unique internal and external conditions. This process will make yours a better hospital.
REFERENCES

1. "The Quality Glossary," Quality Progress, June 2007.
2. Thomas G. Dolan, "Staph Infections—Stealthy Killers," Radiology Today, May 2006.
3. Anne McIlroy and Rod Mickleburgh, "Hospital Errors Kill Thousands in Canada, Study Estimates," The Toronto Globe and Mail, May 24, 2004.
4. Leapfrog Hospital Quality and Safety Survey, 2005, www.leapfroggroup.org.
5. Jennifer Volland, "Quality Intervenes at a Hospital," Quality Progress, February 2005.
6. Dave Nelsen, "Baldrige—Just What the Doctor Ordered," Quality Progress, October 2005.
7. Joseph Juran, foreword to Curing Health Care, by Donald Berwick, A. Blanton Godfrey and Jane Roessner, Jossey-Bass, 1990.
8. Michele Serbenski, executive director, corporate effectiveness and customer satisfaction, Bronson Healthcare Group, personal communication, Oct. 3, 2006.
VICTOR E. SOWER is a professor of management at Sam Houston State University in Huntsville, TX. He earned a doctorate in operations management from the University of North Texas in Denton. Sower is a senior member of ASQ, a member of the Health Care Management Division, a Quality Press reviewer and a certified quality engineer.