Practical Approaches to Quality Improvement for Radiologists
Continuous quality improvement is a fundamental attribute of high-performing health care systems. Quality improvement is an essential component of health care, with the current emphasis on adding value. It is also a regulatory requirement, with reimbursements increasingly being linked to practice performance metrics. Practice quality improvement efforts must be demonstrated for credentialing purposes and for certification of radiologists in practice. Continuous quality improvement must occur for radiologists to remain competitive in an increasingly diverse health care market. This review provides an introduction to the main approaches available to undertake practice quality improvement, which will be useful for busy radiologists. Quality improvement plays multiple roles in radiology services, including ensuring and improving patient safety, providing a framework for implementing and improving processes to increase efficiency and reduce waste, analyzing and depicting performance data, monitoring performance and implementing change, enabling personnel assessment and development through continued education, and optimizing customer service and patient outcomes. The quality improvement approaches and underlying principles overlap, which is not surprising given that they all align with good patient care. The application of these principles to radiology practices not only benefits patients but also enhances practice performance through promotion of teamwork and achievement of goals.

©RSNA, 2015 • radiographics.rsna.org
Aine Marie Kelly, MD, MS, MA
Paul Cronin, MD, MS

Abbreviations: ABR = American Board of Radiology, ACR = American College of Radiology, FMEA = failure mode and effect analysis, PDSA = plan, do, study, act

RadioGraphics 2015; 35:1630–1642
Published online 10.1148/rg.2015150057

From the Department of Radiology, Division of Cardiothoracic Radiology, University of Michigan Medical Center, 1500 E Medical Center Dr, Ann Arbor, MI 48109. Received March 9, 2015; revision requested April 27 and received May 27; accepted June 2. For this journal-based SA-CME activity, the authors, editor, and reviewers have disclosed no relevant relationships. Address correspondence to A.M.K. (e-mail: ainekell@umich.edu).

©RSNA, 2015
SA-CME LEARNING OBJECTIVES

After completing this journal-based SA-CME activity, participants will be able to:
Discuss the rationale behind and the need for departmental and institutional quality improvement programs.
Describe the most commonly used approaches to quality improvement that can be applied in radiology departments.
List the steps involved in designing, implementing, and continuing a departmental or institutional quality improvement program.
See www.rsna.org/education/search/RG.
Introduction
The current health care quality crisis is not new, with quality lapses reported over at least 100 years. The Institute of Medicine's report "To Err is Human" (1) highlighted nearly 100,000 deaths annually in the United States as a result of medical error. In a subsequent report, "Crossing the Quality Chasm," the Institute of Medicine proposed closing the performance gap by promoting efficiency, effectiveness, patient safety, patient centeredness, timeliness, and equity (2). The report also suggested 10 rules for redesign, as follows: care based on continuous healing relationships, care customized according to patient needs and values, care controlled by the patient, knowledge sharing and free flow of information, evidence-based decision making, safety as a system property, transparency as a necessity, anticipation of needs, continual decrease in waste, and cooperation among clinicians as a priority. Many of these aims and rules for patient care are reflected in quality improvement performance indicators and programs (3).
Health care reforms (and the great recession) have altered insurance plans, with a switch from fee per procedure to capitated payments for hospital services, preauthorization, reduced technical and professional fees, bundling, and less coverage for new services—all of which affect diagnostic imaging services. Accountable care organizations and shared savings programs will encourage health care systems to be more competitive. The Centers for Medicare and Medicaid Services will begin applying payment adjustments to eligible physicians who do not satisfactorily report data on quality measures for covered professional services later this year (4). All of these factors and public reporting of health performance data will result in a greater incentive to focus on quality, rather than quantity (5,6).

The Joint Commission requires that active managed quality programs comply with national patient safety goals, and American College of Radiology (ACR) accreditation requires that a peer review process be in place (7,8). For trainees, the Accreditation Council for Graduate Medical Education requires that residents be able to apply principles of quality improvement to processes (9). For practicing radiologists, practice quality improvement is part IV of the American Board of Radiology (ABR) requirement for maintenance of certification (10). A business case can also be made for quality, and this is highlighted by successful efforts of companies like General Electric, which uses quality improvement specialists and earns large returns on investments (11).

Quality improvement efforts in health care delivery are essential for ensuring and improving patient safety, implementing and continuously improving processes, analyzing and depicting data, monitoring performance and implementing change, enabling professional staff assessment and development through continued education, and optimizing customer service and patient outcomes (3).

TEACHING POINTS

Quality improvement efforts in health care delivery are essential for ensuring and improving patient safety, implementing and continuously improving processes, analyzing and depicting data, monitoring performance and implementing change, enabling professional staff assessment and development through continued education, and optimizing customer service and patient outcomes.

In radiology, the focus of quality improvement is to improve the performance and effectiveness of diagnostic and therapeutic procedure processes, the appropriateness of imaging and procedures, quality and safety, and the management of all imaging services.

With total quality management or continuous quality improvement, a uniform commitment to quality across the whole organization is required to focus on establishing collaboration among various departments within the organization to continuously improve processes associated with providing services that meet or exceed expectations.

Kaizen (and lean) tools add an additional "human element" to the traditional PDSA cycles in that potential stakeholders must agree on what adds value and what constitutes waste before implementing any changes. Kaizen events are well suited to addressing focused problems in single work areas that can be completed quickly and for which rapid results can be apparent. Selection of the initial kaizen project should focus on a conspicuous problem that most of the team find annoying.

To improve overall efficiency and produce sustained improvement, a gradual, continuous, and comprehensive "lean transformation" of work philosophy and workplace culture should take place by applying lean methods to all processes and areas within a department.

Basic Definitions

Quality control is carried out retrospectively to establish when measurements fall outside ranges of acceptability; when this occurs, action is taken (12,13). Quality control in radiology involves technical testing of imaging equipment and evaluation of imaging quality to ensure that they conform to standards. This includes evaluation of shielding around radiography facilities, measurement of processing parameters (eg, developer temperature, pH level, film base and fog, speed, and contrast), and assessment of radiation dose (eg, kilovolt peak, milliampere-seconds, imager output, and effective dose) and imaging parameters (image contrast, resolution, artifacts, appropriate labeling, and consistency).

Quality assurance is more comprehensive and aims to ensure excellent (better than acceptable) standards by means of systematic collection and evaluation of data. Although quality assurance involves quality control, it focuses on specific performance indicators to ensure that more services are of high quality and fewer fall below the acceptable range. Performance indicators in radiology include safety, processes or procedures, and professional or patient outcomes or satisfaction. Examples of indicators are access to services, appropriateness of utilization, patient selection parameters, timeliness of scheduling, the scheduling process, prescreening of patients, selection of imaging modality and protocol, waiting times, waiting room amenities, technical effectiveness and efficiency, patient safety, image interpretation, availability of previously obtained images, repeat rate, correlation with pathologic findings, missed diagnosis rate, timeliness of reporting, finalization of reports, reporting of critical findings, and patient or professional in-service education. By identifying the performance indicators that need the most improvement, quality assurance helps radiology practices make decisions about their clinical practice and operational functions.
Quality improvement and continuous quality improvement focus on proactively improving and continually enhancing the quality of care and services by combining professional knowledge with knowledge about making improvements (14,15). Their philosophy is focused on continuous improvement of processes associated with services that meet or exceed the expectations of the patient or referring clinician. For quality improvement and continuous quality improvement, clear standards should be identified for every activity or process in an imaging facility. These standards should be measurable to allow processes to be continually improved. In radiology, quality improvement and continuous quality improvement also include highlighting errors and addressing them after they occur and analyzing, understanding, and improving work processes. In radiology, the focus of quality improvement is to improve the performance and effectiveness of diagnostic and therapeutic procedure processes, the appropriateness of imaging and procedures, quality and safety, and the management of all imaging services (16).

Quality management and total quality management are other terms used to describe organizational processes that use strategies that focus on maintaining existing quality standards and making incremental improvements, and they are sometimes used interchangeably with continuous quality improvement. With total quality management or continuous quality improvement, a uniform commitment to quality across the whole organization is required to focus on establishing collaboration among various departments within the organization to continuously improve processes associated with providing services that meet or exceed expectations. In today's competitive economic environment, quality management processes are commonplace in hospitals and are proving to be successful for improving quality while controlling costs.

Quality Metrics

To assess for opportunities to improve quality, one must be able to measure performance or effectiveness before and after the quality improvement initiative. Many measures or metrics can be used, including safety, process performance and/or efficiency, patient and professional outcomes, and satisfaction (service). Safety is the foundation of any quality program, and safety metrics most relevant to a radiology department include radiology-generated infections, medication error rates, patient falls, contrast material–induced nephropathy, critical test result reporting, specimen labeling errors, hand hygiene, medication reconciliation, and correct image labeling (11).

Key process metrics that can be readily measured include access times, wait times, standardized protocols, and report finalization times (11). Professional and patient outcome metrics include peer review, chart review of reports compared with standards of reference, and procedural outcomes (success rate, complication rate, radiation dose, and procedural time). Metrics of particular importance to an organization are known as key performance indicators, and selection of these indicators will depend on the department's or institution's priorities for safety and quality (17). Another way to depict radiology quality metrics involves mapping a patient's journey through the radiology department with a quality value map (Fig 1), which reveals many opportunities for quality improvement in radiology (18).

Approaches and Graphical Tools Used to Identify and Depict Quality Issues

With any quality improvement effort, one should be familiar with and correctly use appropriate graphical tools and approaches because one can manage only that which one can accurately measure (16).

To determine what processes need improvement, all safety breaches or quality issues must be identified. These are then prioritized for quality improvement. There are many ways to gather and depict information about improvement opportunities, including surveys, safety and environment of care walkabouts, flowcharts or process maps for identifying bottlenecks, peer review and error reporting, using chart review to monitor compliance with national patient safety goals, brainstorming sessions, and strengths-weaknesses-opportunities-threats analysis (16).

All potential stakeholders (eg, patients and employees) should feel empowered to raise suggestions because this generates more ideas and ensures greater participation ("buy in") in the quality improvement effort. Any quality improvement initiative should align with departmental or institutional goals and have full leadership support. One must also engage the leadership of other departments that may be affected by the changes. Quality improvement efforts are not finite, and quality improvement is often an iterative, repetitive process, with small tweaks made along the way to ensure constant improvement (3,19).

There are many graphical tools available for depicting data regarding safety, process efficiency, and outcome measures (professional and patient). These tools include flowcharts, cause-and-effect (Ishikawa or fishbone) diagrams, Pareto charts, check sheets (tally sheets),
control charts (Shewhart or statistical process
charts), histograms, and scatter diagrams. These
are known collectively as the seven basic tools of
quality (12). Scorecards and dashboards placed
in a prominent location can be used to display
gathered data and monitor and track results in an
organization (11,20,21).
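As a small illustration of one of these graphical tools, the sketch below (Python) plots a basic Shewhart-style control chart for a hypothetical process metric, daily mean report-finalization time, and flags days that fall outside the control limits. The data, metric, and file names are invented for illustration, and the limits are estimated from the overall standard deviation as a simplification; a formal individuals chart would estimate process spread from the moving range.

```python
"""Minimal sketch of a Shewhart-style control chart for a radiology process
metric. The daily report-finalization times are simulated for illustration;
a real chart would use values pulled from the radiology information system."""
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical daily mean report-finalization times (hours) over 30 days.
rng = np.random.default_rng(seed=0)
turnaround_h = rng.normal(loc=6.0, scale=0.8, size=30)
turnaround_h[22] = 10.5  # one out-of-control day, eg, a PACS outage

center = turnaround_h.mean()
sigma = turnaround_h.std(ddof=1)  # simplification; see note in the lead-in
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # common 3-sigma limits

out_of_control = np.where((turnaround_h > ucl) | (turnaround_h < lcl))[0]

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(turnaround_h, marker="o", linewidth=1)
ax.axhline(center, linestyle="-", label=f"mean = {center:.1f} h")
ax.axhline(ucl, linestyle="--", label=f"UCL = {ucl:.1f} h")
ax.axhline(lcl, linestyle="--", label=f"LCL = {lcl:.1f} h")
ax.scatter(out_of_control, turnaround_h[out_of_control], zorder=3)
ax.set_xlabel("Day")
ax.set_ylabel("Mean report finalization time (h)")
ax.legend(loc="upper left", fontsize="small")
fig.tight_layout()
fig.savefig("report_turnaround_control_chart.png")
```

Points flagged beyond the control limits are the ones a quality team would investigate, in keeping with the principle that one can manage only what one can measure.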
Commonly Used
Approaches to Quality Improvement
Root Cause Analysis
Errors, sentinel events, and near misses occur
every day in hospitals and departments and can
present opportunities for improvement. The Joint
Commission requires that accredited hospitals
identify and respond appropriately to all sentinel
or major adverse events (22). Identification and
elimination of causes can establish circumstances
that will decrease the chances of recurrence (23).
Reactive identification of causes is the first step
to solving problems, and this is the basis of root
cause analysis—a retrospective process. After an
error occurs, root cause analysis can be performed
to investigate what, how, and why it occurred and
help determine actions to minimize or prevent its
recurrence (24). Root cause analysis should be
performed promptly after an event occurs, by a
team of four to 10 people with various roles and
backgrounds in the department. A useful graphical
tool is the fishbone diagram, and a useful method
to get from the obvious (proximate) cause to the
root cause is to use the “five whys” (25). For each
apparent cause, the team asks, “Why did this
happen?” Once the first answer is determined, the
question is repeated at least four more times. For
example, if a patient undergoes the wrong radio-
logic procedure, the first question asked is “Why
did the patient undergo the wrong procedure?”
The answer might be because the procedure was
entered incorrectly by the scheduler. The second
question is “Why did the scheduler enter the pro-
cedure incorrectly?” The answer might be because
it was prescribed incorrectly. The third question
becomes “Why was the incorrect examination
prescribed?” The answer might be that the radiolo-
gist could not read the handwriting on the request
form. The fourth question is “Why was the form
filled out in illegible handwriting?” The answer
may be that there is no electronic form (and be-
cause, in many cases, the doctor’s handwriting is
not legible). The final question is “Why don’t we
have an electronic request form for radiology pro-
cedures with a drop-down menu of options and
alerts in cases of allergies, other contraindications,
and appropriateness?”
The Joint Commission suggests 11 steps for
performing root cause analysis for sentinel events,
as follows: (a) organize a team, (b) define what
happened, (c) identify and define processes related
to the event, (d) identify proximate (closest to
the error) causes, (e) design and implement any
necessary “quick fix” interim changes, (f) identify
root causes, (g) identify potential risk-reduction
strategies, (h) formulate and evaluate proposed
improvement actions and identify measures of suc-
cess, (i) develop and implement an improvement
action plan, (j) evaluate and fine-tune improve-
ment efforts, and (k) communicate the results
(23,26). A cause-and-effect (fishbone) diagram for
root cause analysis of why an incorrect imaging
test was performed is illustrated in Figure 2.
Failure Mode and Effect Analysis
With failure mode and effect analysis (FMEA),
a prospective systematic approach is used to
identify and understand causes, contributing fac-
tors, and effects of potential failures on a process,
system, or practice. Proactive identification of po-
tential problems (latent and active predisposing
contributing factors) may prevent problems from
happening and is more effective than root cause
analysis (27). In addition, clinicians are more
likely to accept a prospective process like FMEA
because, unlike error reporting, there is no alloca-
tion of blame. Prospective analysis is a more posi-
tive approach to problems because it harnesses
people’s knowledge and competencies rather than
highlighting human weaknesses (27). Radiology
departments with equipment and many discrete
processes lend themselves to FMEA.
Improvement processes like FMEA are now
mandatory for hospitals that admit patients,
and Joint Commission standards require annual
FMEA and suggest five steps for performing it
(28). The steps (and substeps) involved in FMEA
depend on the complexity of the process and the
severity of the consequences of failure (27,29).
Fishbone diagrams are useful for depicting each
specific failure mode. First, a potential problem is
chosen and a team is assembled to map out and
evaluate the process leading up to the problem.
Each failure mode is assessed and ranked for
severity, probability of occurrence, and prob-
ability of detection to generate a criticality index
(risk priority number), a numeric score used to
prioritize and determine if any exceed acceptable
limits. An action plan is developed, and the team
follows up after the action plan is put in place to
reassess risk (Fig 3). Table 1 illustrates a worked
example of the FMEA process for patients re-
ferred for cardiac computed tomography (CT)
who did not undergo adequate workup, resulting
in missing information about contrast material
allergy, contraindications, or history, which can
lead to time delays.

Figure 2. Fishbone (cause-and-effect) diagram shows root cause analysis of an error involving use of the wrong imaging test.

The National Center for Patient Safety at the U.S. Department of Veterans Affairs has developed the health care FMEA
to proactively identify risks to patient safety and
reduce errors in health care (23). There are many
similarities to FMEA, but severity and probability
definitions are modified within a hazard scoring
matrix, and a decision tree is used in the health
care FMEA (30). Compared with the traditional FMEA risk priority number (which, with scores of 1 to 10 for each factor, can range from 1 to 1,000), the health care FMEA generates a simplified hazard score (from 1 to 16). The Joint
Commission and the National Center for Patient
Safety have produced tool kits for FMEA team
facilitators and other interested team members.
These tool kits, along with the FMEA Handbook,
are useful resources (28,31).
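The criticality index calculation underlying Table 1 is simple enough to script. The following sketch (Python) computes the risk priority number as the product of the detection, occurrence, and severity scores and ranks failure modes for prioritization. The listed failure modes and scores loosely follow the cardiac CT example, and the action threshold of 100 is an arbitrary illustrative cutoff, not a Joint Commission or ABR requirement.

```python
"""Minimal FMEA scoring sketch: compute risk priority numbers (RPNs) and rank
failure modes. Failure modes, scores, and the action threshold are illustrative."""
from dataclasses import dataclass


@dataclass
class FailureMode:
    description: str
    detection: int   # 1 (always detected) to 10 (never detected)
    occurrence: int  # 1 (rare) to 10 (frequent)
    severity: int    # 1 (negligible) to 10 (catastrophic)

    @property
    def rpn(self) -> int:
        # Risk priority number (criticality index) = detection x occurrence x severity.
        return self.detection * self.occurrence * self.severity


modes = [
    FailureMode("Insufficient or incorrect clinical data on request", 6, 4, 5),
    FailureMode("Patient has contrast material allergy or contraindication", 6, 2, 6),
    FailureMode("Delay while clinical information is sought", 2, 6, 3),
]

ACTION_THRESHOLD = 100  # illustrative cutoff for "exceeds acceptable limits"

for mode in sorted(modes, key=lambda m: m.rpn, reverse=True):
    flag = "ACT" if mode.rpn >= ACTION_THRESHOLD else "monitor"
    print(f"RPN {mode.rpn:4d} [{flag}]  {mode.description}")
```

Ranking by the criticality index is what lets the team decide which failure modes warrant an action plan and which can simply be monitored after the plan is in place.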
Plan, Do, Study, Act
The “plan, do, study, act” (PDSA) (Deming) cycle
involves a trial and learning approach in which a
hypothesis or solution is generated and then tested
on a small scale before making changes to the
whole system (12,25). First, a specific process or
path is identified, and work teams, including all
parties that interact with the system being stud-
ied, are assembled. A system flowchart (graphical
tool) is created, and observations are made about
problems that exist or areas to be improved. Next,
testing is initiated and a course of action to im-
prove the process is planned. Finally, the effects of
changes to the system are monitored or measured
and feedback is given. Small changes can be incor-
porated quickly at the end of each cycle, with addi-
tional changes added for subsequent cycles (Fig 4).
An example of a PDSA cycle in radiology would be to look at the appropriate use of CT for pulmonary embolism in the emergency department (Fig 5). In the first PDSA cycle, one would plan what data to collect with CT for pulmonary embolism (eg, age, sex, whether a Wells score was obtained, whether a d-dimer assay was performed, whether ultrasonography or ventilation-perfusion scanning was used) and then collect the data ("do").
Figure 1. Radiology quality value map. Diagram shows the patient's journey through the radiology department. (Adapted and reprinted, with permission, from references 11 and 18.)
Figure 3. Diagram illustrates the steps involved in FMEA.
Table 1: Example of FMEA Process for Patients Referred for Cardiac CT Who Did Not Undergo Adequate Workup

Process 1: examination requested
Request completed and faxed to radiology: insufficient or incorrect clinical data on request; incorrect cardiac CT requested (detection 6, occurrence 4, severity 5; risk priority number 120)
Protocol completed: incorrect type of cardiac CT prescribed; insufficient or incorrect clinical information available; requesting clinician's handwriting not legible (detection 3, occurrence 3, severity 5; risk priority number 45)

Process 2: cardiac CT is performed
Acquire images: delay in performing examination as clinical information is sought (detection 2, occurrence 6, severity 3; risk priority number 36)
Acquire images: wrong type of cardiac CT performed because clinical information is missing (detection 2, occurrence 3, severity 6; risk priority number 36)
Acquire images: examination is canceled because patient has a contraindication or allergy (detection 5, occurrence 2, severity 2; risk priority number 20)
Administer contrast material and other drugs: patient has contrast material allergy or contraindication (detection 6, occurrence 2, severity 6; risk priority number 72)
Administer contrast material and other drugs: patient has β-blocker allergy or contraindication (detection 6, occurrence 2, severity 6; risk priority number 72)
Administer contrast material and other drugs: patient has nitroglycerin allergy or contraindication (detection 6, occurrence 2, severity 6; risk priority number 72)

Process 3: notification of significant or urgent findings at examination
Call clinician to convey results: referring clinician is outside hospital system and cannot be located (detection 1, occurrence 4, severity 6; risk priority number 24)
Call clinician to convey results: referring clinician out of office and not contactable by any other means (detection 1, occurrence 4, severity 6; risk priority number 24)

Note.—Inadequate workup can result in missing information about contrast material allergy, contraindications, or history, which can lead to delays. The risk priority number or criticality index is the product of the detection, occurrence, and severity scores.
Figure 4. Diagram illustrates the steps involved in the PDSA cycle.

Figure 5. Diagram illustrates PDSA cycles with regard to appropriateness for CT pulmonary angiography through the emergency department. PE = pulmonary embolism, VQ = ventilation-perfusion.
The data are then
studied to assess the appropriateness rate. The final
step in the PDSA cycle is to act (eg, deliver an edu-
cational intervention). In the second cycle, follow-
ing the educational intervention, one would carry
out the same steps of the PDSA approach and
determine whether the educational intervention
improved appropriateness. If not, then one would
plan to survey referring clinicians to assess their
perceived indications for use of CT in pulmonary
embolism and their perceptions of its risks and
benefits. Cycle 3 would be a PDSA cycle carried out
after a revised targeted educational intervention on
the basis of the results of the survey.
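To make the measurement ("study") step of such a cycle concrete, the sketch below (Python) compares the proportion of CT pulmonary angiography requests judged appropriate at baseline and after the educational intervention. All counts are invented for illustration, and the simple pooled two-proportion z statistic is one reasonable way to judge whether the change is larger than chance; a real project would use the audit data gathered in the "do" step.

```python
"""Minimal sketch of the 'study' step of a PDSA cycle: compare the rate of
appropriate CT pulmonary angiography requests before and after an educational
intervention. Counts are invented for illustration."""
from math import sqrt


def appropriateness_rate(appropriate: int, total: int) -> float:
    return appropriate / total


def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    # Pooled two-proportion z statistic; adequate for a quick PDSA check.
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se


# Cycle 1 (baseline) and cycle 2 (after educational intervention), invented counts.
baseline = appropriateness_rate(62, 100)
follow_up = appropriateness_rate(78, 100)
z = two_proportion_z(baseline, 100, follow_up, 100)

print(f"Baseline appropriateness: {baseline:.0%}")
print(f"Post-intervention appropriateness: {follow_up:.0%}")
print(f"Two-proportion z = {z:.2f} (|z| > 1.96 suggests a real change)")
```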
Lean Management
Lean management, or simply “lean,” derives
from the Toyota Production System management
and manufacturing policies designed to allow
personnel and organizations to become more
efficient and eliminate waste (32). Continuous
incremental improvements in performance are
made, with the goal of adding value to services
provided while maintaining the highest possible
customer satisfaction. Expenditure of resources
for reasons other than creating value for the
customer is regarded as wasteful and targeted for
elimination. As waste is eliminated, production
times and costs will decrease. Lean management,
which emphasizes process analysis, is particularly
relevant to radiology departments, which depend
on a smooth flow of patients and uninterrupted
equipment function for efficient operation (33).
To improve overall efficiency and produce a sus-
tained improvement, a gradual, continuous, and
comprehensive “lean transformation” of work
philosophy and workplace culture should take
place by applying lean methods to all processes
and areas within a department. Lean transforma-
tion differs from other approaches in that it is
both a philosophy and an organizational way of
life that continuously transforms staff into lean
experts and keeps everyone on the path of con-
tinuous improvement (33).
The principles of lean management include
having equal involvement and respect for all staff,
observing and analyzing processes where they
occur (by “going to the gemba,” or “real place”),
eliminating all forms of waste or steps in the
process that do not add value, standardizing work
processes to minimize variation, improving flow
of all processes in the system, using visual cues
to communicate and update, adding value for the
customer, and using lean (graphical) tools.
Lean management starts with equal involve-
ment of all staff members, with the understand-
ing that frontline workers best understand any
problems and must be involved in quality im-
provement efforts at all stages of the process. The
opinions of all staff must be respected and val-
ued, and staff should move from compliance (do-
ing things because they have to) to commitment
(doing things because they believe it is adding
value) to ensure sustainable change (33). For any
lean initiative to achieve sustainable change, all
stakeholders (supervisors, managers, and front-
line staff members) must be engaged in dialogue
and brainstorming at all stages of the process.
Visiting the workplace (ie, going to the gemba)
is encouraged for senior staff to view the workflow
and process environment, identify safety hazards,
see the cause of complaints, and better understand
inefficiencies. Workers can demonstrate processes
and illustrate inefficiencies and then be allowed to
make suggestions for improvements in their area.
Eliminating waste or steps that do not add
value is an essential principle of lean manage-
ment. The eight wastes of the lean approach are
applicable to radiology and include overproduc-
tion (imaging more anatomy than needed), trans-
portation (unnecessarily transferring patients,
personnel, or equipment), inventory (stock occu-
pies physical space that costs money to rent), mo-
tion (unnecessary movement of personnel within
a work area), defects (field of view is too small),
overprocessing (too many reformatted images
are produced), waiting (imaging unit downtime),
and skills (underutilizing capabilities of faculty)
(33,34).
The standardization of work reduces varia-
tion and increases efficiency and, in radiology,
involves the use of flowcharts (eg, determination
of an appropriate imaging modality for patients
suspected of having pulmonary embolus) and
preprocedure checklists.
To identify variations in flow and order
(bottlenecks) in complex clinical environments,
process management tools like value stream
mapping can be used to map the process from
concept to product (35). Value stream maps differ
from traditional flow diagrams or process maps
in that they enable capture of both process and
material flow and allow clear identification of
waste and value-added steps. Mapping out an
entire process, such as an outpatient diagnostic
test encounter (from request initiation through
report finalization), involves a multidisciplinary
team of all involved in the process steps. Before
mapping, it is important to clearly understand
the customers’ (patient, referring clinician) needs
and expectations in relation to the product to be
mapped (35). There are six steps in value stream
mapping, but maps will vary in size and complex-
ity depending on the process.
The first step in value stream mapping is to
chart suppliers (personnel involved in starting
the process, including referring clinicians and
patients), inputs (information entered to start
the process, including request forms), processes
(including examination requests, prescreening,
scheduling, protocoling, patient registration,
patient preparation, and imaging procedure), out-
puts (effects of the test, including test results and
disposition), and customers (personnel affected
by the information, including patients, admit-
ting ward personnel, clinicians, and radiologists).
These steps are abbreviated as “SIPOC” (Table
2) (36).
Next, the process is observed in the work-
place and then mapped in at least four to eight
steps to identify queues and staging areas in
the process. Information systems (electronic or
manual) should be added to the map. Data and
time stamps should be identified (wait time, cycle
time, number of people in queue, delays), and
sources of waste should be searched for (informa-
tion, process, physical, environment, and people).
Finally, the map is completed by validating times
with baseline data (how long steps should take,
with use of benchmarks from the literature) (Fig
6) (37).
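As a small illustration of the data and time-stamp step, the sketch below (Python) sums value-added (cycle) time and waiting (queue) time along a mapped outpatient MR pathway and reports the lead time, value-added ratio, and longest wait that a value stream map would surface. The step names and durations are hypothetical.

```python
"""Minimal value-stream timing sketch for a hypothetical outpatient MR
encounter: separate value-added time from waiting time and compute lead time.
Step names and minutes are invented for illustration."""

# (step, minutes, value_added?) - waits between steps are modeled as steps too.
steps = [
    ("Order entry and protocoling", 10, True),
    ("Wait for scheduling callback", 45, False),
    ("Patient registration", 8, True),
    ("Wait in reception", 30, False),
    ("IV line placement", 12, True),
    ("Wait for imaging unit", 25, False),
    ("MR image acquisition", 35, True),
    ("Wait for report finalization", 240, False),
    ("Report finalization", 15, True),
]

lead_time = sum(minutes for _, minutes, _ in steps)
value_added = sum(minutes for _, minutes, is_va in steps if is_va)
waiting = lead_time - value_added

print(f"Lead time: {lead_time} min")
print(f"Value-added time: {value_added} min ({value_added / lead_time:.0%})")
print(f"Waiting (non-value-added) time: {waiting} min")
longest_wait = max((s for s in steps if not s[2]), key=lambda s: s[1])
print(f"Longest wait: {longest_wait[0]} ({longest_wait[1]} min)")
```

Identifying the single longest queue in this way is typically what points the team toward the first waste-reduction target.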
Other tools that can be used to improve order
include the five S’s (sorting, straightening, scrub-
bing, systematizing, and standardizing), produc-
tion leveling and/or level scheduling (keeping
small batches rather than bulk), “push” (making
required actions mandatory) and “pull” (mak-
ing required actions easy) systems, and mistake
proofing (33).
Visual tools can be used to communicate and
inform and include color coding of different
devices to enable easier stocking and retrieval and
the use of reminder (to restock) cards inserted
near the end of a batch of stock. An A3 report
(named after the A3 paper size) documents the
background (problem), current condition, and
goals on one side and depicts analysis, proposed
solutions (countermeasures), implementation,
and status reviews on the other (38). An A3
report posted in a prominent location documents learning, decision making, and planning steps undertaken to solve problems after a root cause analysis. Displaying the A3 report is useful to communicate lean project goals, foster effective and efficient dialogue in an organization or group, and communicate progress to staff (Fig 7).

In any part of the lean processes, one must not lose focus on the customers and should ensure that value is being added from their perspective. In radiology, customers include referring physicians and patients. One must solicit feedback from these groups because our perception of their values may not be accurate. Referring physicians rely heavily on imaging and often value timeliness of report finalization, whereas patients value aspects like ease and speed of procedure scheduling or comfort in the reception area.

Table 2: First Step in Value Stream Mapping: Charting Supplier, Input, Process, Output, and Customer
Suppliers: patient, referring clinician, scheduler, radiologist, lead technologist, radiology nurses, MR imaging unit
Inputs: patient information, history and physical, schedule, information systems, available imaging time, magnetic resonance (MR) imaging safety information, MR imaging request
Processes: request submitted, prescreening questionnaire, radiologist protocols, technologist reviews, patient scheduled, patient registers, perform MR imaging
Outputs: imaging results, patient disposition
Customers: patient, referring clinician, radiologist

Figure 6. (a) Value stream map. Diagram illustrates the patient pathway through the MR imaging department and reveals that the longest wait time was for the nurse to insert the intravenous line. IV = intravenous, MiChart = electronic medical record system, PACS = picture archiving and communication system, RIS = radiology information system. (b) Value stream mapping icons.

Kaizen

Kaizen, which means "good change" or "improvement," is a lean management tool; other tools include hoshin or balanced planning, balanced scorecards, annual operating plans, and management dashboards (33). A kaizen event (or blitz) involves the quick (short period, eg, a week) analysis of small manageable components of problems, with rapid implementation of solutions and continuous real-time reassessment (19). Kaizen (and lean) tools add an additional human element to the traditional PDSA cycles in that potential stakeholders must agree on what adds value and what constitutes waste before implementing any changes. Kaizen events are well suited to addressing focused problems in single work areas that can be completed quickly and for which rapid results can be apparent. Selection of the initial kaizen project should focus on a conspicuous problem that most members of the team find annoying. The problem should be defined and measured if possible. Next, the problem is analyzed, and a single waste problem is selected to focus on. A kaizen team should be assembled to include a sponsor (ideally, a knowledgeable individual, such as a lean or kaizen coach or a member of the quality
assurance team), a leader (this could be the person
who initially identified the problem), and mem-
bers (any involved or interested team members).
The team can hold formal (kaizen event or blitz)
or informal (applicable to problems that everyone
agrees have a simple quick solution to the waste)
meetings. Next, the process change is agreed on
and solutions implemented. Because kaizen is
often a local process that involves a single depart-
ment, this allows quick assessment or testing of the
effect of the change. The steps involved in kaizen
are as follows: define, measure, and analyze the
problem; select a single waste step to focus on;
assemble a kaizen team with sponsor, leader, and
members; hold a kaizen event to discuss strate-
gies to address waste step; implement the process
change; quickly test the effect of change; and iden-
tify the next problem and repeat the process.
Lean or kaizen efforts must be aligned with
institutional goals to decrease risk for changes
being reversed. All potential stakeholders must
be included, from institutional leaders and senior
administration, through all management levels, to
frontline workers. Stakeholders must feel continu-
ally engaged and be promoted and rewarded for
reporting events, including near misses. A “just
culture” of quality and safety should be promoted
by building consensus and momentum. Initially,
the lean or kaizen team will need help in assem-
bling and guidance on the selection of clearly
definable problems. Most health systems now have
lean or kaizen coaches available to guide teams
through the PDSA process. After a few cycles,
team leaders will evolve to become coaches to ap-
ply principles to optimize engagement and achieve
improvements. The team should meet regularly
and be committed to the PDSA cycles. The use
of visual tools can help keep meetings short and
focused (19). A surveillance system should be es-
tablished to monitor quality indicators, and a sys-
tematic process should be put in place to analyze
and manage reported events. Finally, processes
must be implemented to prevent errors, improve
safety, and manage customer relations, including
continued educational programs.
Six Sigma
Six Sigma has been around for about a century
and was popularized by Motorola in the 1980s.
In manufacturing, Six Sigma is a quality standard
based on reducing variability within processes or
products to move the mean toward a standard of
reference (39). Six Sigma indicates six standard
deviations from the arithmetic mean, allowing only
3.4 defects per million. By measuring the number
of defects, identifying the sources of error, and
systematically determining how to avoid them,
one aims to reach zero defects. Six Sigma involves
statistical methods and starts with identification of
the process and key customers, which requires a
team of individuals who understand the problem
and are familiar with the institution. The five steps
of the Six Sigma process are as follows: define cus-
tomer needs, measure performance, analyze data,
set priorities and launch improvements, and check
for change compared with baseline to control the
future process (define, measure, analyze, improve,
and control, or DMAIC) (13). The elements of Six
Sigma are similar to those of PDSA and other qual-
ity improvement tools and align with good medi-
cal practice (Fig 8). A perceived weakness of Six
Sigma is its complexity, with rigorous adherence to
problem solving potentially resulting in overworking
simple problems with obvious solutions (34).
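The familiar "3.4 defects per million" figure follows from the conventional Six Sigma calculation, which allows a 1.5-standard-deviation long-term drift of the process mean; that shift is a convention of the methodology and an assumption here, not something stated in the text above. The sketch below (Python with scipy) reproduces the arithmetic for several sigma levels.

```python
"""Minimal sketch of the defects-per-million calculation behind Six Sigma.
Uses the conventional 1.5-sigma long-term shift, under which a 6-sigma process
yields about 3.4 defects per million opportunities (DPMO). The shift is a
Six Sigma convention assumed for illustration."""
from scipy.stats import norm

LONG_TERM_SHIFT = 1.5  # conventional allowance for long-term process drift


def dpmo(sigma_level: float, shift: float = LONG_TERM_SHIFT) -> float:
    # One-sided defect probability beyond the nearer specification limit.
    return norm.sf(sigma_level - shift) * 1_000_000


for level in (3, 4, 5, 6):
    print(f"{level}-sigma process: {dpmo(level):,.1f} DPMO")
```

Running the sketch gives roughly 66,800 DPMO at 3 sigma and 3.4 DPMO at 6 sigma, which is the sense in which Six Sigma "allows only 3.4 defects per million."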
Figure 7. Sample A3 report for patients who arrive at the CT department without information about creatinine level and allergies.

Figure 8. Diagram illustrates steps involved in the Six Sigma approach.

Figure 9. Diagram illustrates the ABR three-phase model for practice quality improvement projects.

Figure 10. Diagram illustrates the Institute for Healthcare Improvement model for health care improvement.
The ABR
In 2007, the ABR established the practice qual-
ity improvement program as part of its main-
tenance of certification program, with the goal
“to improve the quality of healthcare through
diplomate-initiated learning and quality im-
provement” (40). A satisfactory practice quality
improvement project must be relevant to one’s
practice, be achievable in one’s clinical setting,
produce results suited for repeat measurements
during the maintenance of certification cycle,
and be reasonably expected to improve quality
(41). The ABR’s prescribed method is a version
of PDSA and has at least three phases: a base-
line PDSA cycle, implementation of an im-
provement plan, and a post-improvement-effort
PDSA cycle (42). This practice quality improve-
ment model is incorporated into the ABR’s
maintenance of certification for diplomates,
with time-limited ABR certificates (from 2003
on) being required to complete three practice
quality improvement projects within a 10-year
cycle and diplomates with continuous certifi-
cates (from 2012 on) required to perform at
least one practice quality improvement project
in the previous 3 years on audit. The ABR al-
lows individual, group, or institutional projects,
and groups can include radiation oncologists
and physicists. The ABR Web site gives useful
examples and information for carrying out proj-
ects (41). With this approach, changes are tested
and implemented one at a time with periods of
measurement in between (Fig 9).
Model for
Health Care Improvement
The Institute for Healthcare Improvement is
committed to redesigning health care into a
system without errors, waste, delay, and unsus-
tainable costs (43). Its model for health care
improvement is one of the most popular qual-
ity improvement methods used in health care
and an alternative to the ABR’s approach that
involves framing the proposed project with three
questions, namely “What are we trying to ac-
complish?” “How will we know that the change
is an improvement?” and “What changes can we
make that will result in improvement?” (Fig 10)
(42). Suggested process changes are then tested
to obtain knowledge about the process or the
potential effectiveness of proposed changes, re-
fined, and tested again (in an iterative process),
with changes implemented gradually. Each
round of testing is conducted by using PDSA
methods. Essentially, each test is a focused
scientific experiment, with a hypothesis, plan-
ning and conducting the test, analysis of results,
and actions taken on the basis of the results,
which could include making a change, modify-
ing the planned changes, retesting, or rejecting
the change (42). An advantage of the model
for health care improvement is that it enables
rapid iteration, with measures obtained continu-
ously, allowing an effect on outcome almost
immediately.
The ACR
ACR resources available to enable radiologists to
assess, monitor, and improve radiology quality
include the ACR physician consortium for perfor-
mance improvement and national committee for
quality assurance clinical performance measures,
which can be used as a basis for quality improve-
ment efforts (44). By registering with the ACR National Radiology Data Registry, radiology practices can benchmark outcomes and process-of-care measures and develop quality improvement programs that qualify as practice quality improvement projects for ABR maintenance of certification (45). Subscribing to ACR's RADPEER allows practices to carry out periodic peer review, which counts as a group practice quality improvement project for ABR maintenance of certification
(46). Finally, freely available ACR practice param-
eters and technical standards provide standards
for safe and effective diagnostic imaging, and the
ACR Appropriateness Criteria provide guidance
on appropriate imaging in specific patient clinical
scenarios (47,48).
Conclusion
Herein, we described practical approaches to
quality improvement for practicing radiologists
and outlined available resources. Continuous quality improvement programs will help ensure that radiology practice is optimized to deliver the highest quality patient care while satisfying regulatory and health care policy requirements and containing costs.
References
1. Institute of Medicine. To err is human: building a safer
health system. https://www.iom.edu/~/media/Files/Re
port%20Files/1999/To-Err-is-Human/To%20Err%20is%20
Human%201999%20%20report%20brief.pdf. Published
November 1999. Accessed February 22, 2015.
2. Institute of Medicine. Crossing the quality chasm: a new
health system for the 21st century. https://www.iom.edu
/Reports/2001/Crossing-the-Quality-Chasm-A-New-Health
-System-for-the-21st-Century.aspx. Published March 2001.
Accessed February 22, 2015.
3. Kruskal JB, Anderson S, Yam CS, Sosna J. Strategies for
establishing a comprehensive quality and performance im-
provement program in a radiology department. RadioGraphics
2009;29(2):315–329.
4. Centers for Medicare and Medicaid Services (CMS).
Physician quality reporting system (PQRS). http://www
.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment
-Instruments/PQRS/index.html?redirect=/PQRS/. Published
April 2014. Accessed February 22, 2015.
5. Seltzer SE, Lee TH. The transformation of diagnostic radiol-
ogy in the ACO era. JAMA 2014;312(3):227–228.
6. Toussaint JS, Berry LL. The promise of lean in health care.
Mayo Clin Proc 2013;88(1):74–82.
7. Joint Commission. National patient safety goals (NPSG).
http://www.jointcommission.org/standards_information
/npsgs.aspx. Published 2015. Accessed February 22, 2015.
8. American College of Radiology. Accreditation. http://www
.acr.org/quality-safety/accreditation. Accessed February
22, 2015.
9. Accreditation Council for Graduate Medical Education
(ACGME). Diagnostic radiology milestones. http://www
.acgme.org/acgmeweb/tabid/148/ProgramandInstitutional
Accreditation/Hospital-BasedSpecialties/DiagnosticRadiol
ogy.aspx. Published 2012. Accessed February 22, 2015.
10. American Board of Radiology. Maintenance of certifica-
tion part IV: practice quality improvement (PQI). http://
www.theabr.org/moc-dr-comp4. Published 2007. Accessed
February 22, 2015.
11. Johnson CD, Krecke KN, Miranda R, Roberts CC, Denham
C. Quality initiatives: developing a radiology quality and
safety program—a primer. RadioGraphics 2009;29(4):
951–959.
12. Abujudeh HH, Bruno MA. Basic definitions. In: Quality
and safety in radiology. Oxford, England: Oxford University
Press, 2012.
13. Erturk SM, Ondategui-Parra S, Ros PR. Quality manage-
ment in radiology: historical aspects and basic definitions. J
Am Coll Radiol 2005;2(12):985–991.
14. Applegate KE. Continuous quality improvement for radiolo-
gists. Acad Radiol 2004;11(2):155–161.
15. Amaral CS, Rozenfeld H, Costa JM, Magon MdeF, Mas-
carenhas YM. Improvement of radiology services based on
the process management approach. Eur J Radiol 2011;78(3):
377–383.
16. Kruskal JB, Eisenberg R, Sosna J, Yam CS, Kruskal JD,
Boiselle PM. Quality initiatives: quality improvement in
radiology—basic principles and tools required to achieve
success. RadioGraphics 2011;31(6):1499–1509.
17. Abujudeh HH, Kaewlai R, Asfaw BA, Thrall JH. Quality
initiatives: key performance indicators for measuring and
improving radiology department performance. RadioGraph-
ics 2010;30(3):571–580.
18. Swensen SJ, Johnson CD. Radiologic quality and safety: map-
ping value into radiology. J Am Coll Radiol 2005;2(12):992–
1000.
19. Knechtges P, Decker MC. Application of kaizen methodology
to foster departmental engagement in quality improvement.
J Am Coll Radiol 2014;11(12 Pt A):1126–1130.
20. Donnelly LF, Gessner KE, Dickerson JM, et al. Quality
initiatives: department scorecard—a tool to help drive imag-
ing care delivery performance. RadioGraphics 2010;30(7):
2029–2038.
21. Abujudeh HH, Bruno MA. Control charts and dashboards.
In: Quality and safety in radiology. Oxford, England: Oxford
University Press, 2012.
22. Joint Commission. Sentinel events. http://www.jointcom
mission.org/assets/1/6/CAMH_24_SE_all_CURRENT.pdf.
Updated January 2015. Accessed February 22, 2015.
23. Abujudeh HH, Bruno MA. Root cause analysis (RCA) and
health care failure and effect analysis (HFMEA). In: Quality
and safety in radiology. Oxford, England: Oxford University
Press, 2012.
24. Kruskal JB, Siewert B, Anderson SW, Eisenberg RL, Sosna J.
Managing an acute adverse event in a radiology department.
RadioGraphics 2008;28(5):1237–1250.
25. Bruno MA, Nagy P. Fundamentals of quality and safety
in diagnostic radiology. J Am Coll Radiol 2014;11(12 Pt
A):1115–1120.
26. Joint Commission. Framework for conducting a root cause
analysis and action plan. http://www.jointcommission
.org/Framework_for_Conducting_a_Root_Cause_Analy
sis_and_Action_Plan/. Updated March 2013. Accessed
February 22, 2015.
27. Thornton E, Brook OR, Mendiratta-Lala M, Hallett DT,
Kruskal JB. Application of failure mode and effect analysis in
a radiology department. RadioGraphics 2011;31(1):281–293.
28. Joint Commission. Failure mode and effect analysis (FMEA).
http://www.jointcommission.org/Failure_Mode_Effect_and_
Criticality_Analysis_FMECA_Worksheet/. Accessed July 24,
2015.
29. Abujudeh HH, Kaewlai R. Radiology failure mode and effect
analysis: what is it? Radiology 2009;252(2):544–550.
30. DeRosier J, Stalhandske E, Bagian JP, Nudell T. Using health
care failure mode and effect analysis: the VA National Center
for Patient Safety’s prospective risk analysis system. Jt Comm
J Qual Improv 2002;28(5):248–267, 209.
31. Dailey KW. The FMEA pocket handbook. New York, NY:
DW Publishing, 2004.
32. Toyota Production System. http://www.toyota-global.com
/company/vision_philosophy/toyota_production_system/.
Accessed July 24, 2015.
33. Kruskal JB, Reedy A, Pascal L, Rosen MP, Boiselle PM.
Quality initiatives: lean approach to improving performance
and efficiency in a radiology department. RadioGraphics
2012;32(2):573–587.
34. Vallejo B. Tools of the trade: lean and six sigma. J Healthc
Qual 2009;31(3):3–4.
35. Lee E, Grooms R, Mamidala S, Nagy P. Six easy steps on
how to create a lean sigma value stream map for a multidisci-
plinary clinical operation. J Am Coll Radiol 2014;11(12 Pt A):
1144–1149.
36. i Six Sigma. SIPOC diagram. http://www.isixsigma.com/tools
-templates/sipoc-copis/sipoc-diagram/. Accessed February
22, 2015.
37. i Six Sigma. Value stream mapping. http://www.isixsigma.com
/tools-templates/value-stream-mapping/. Accessed February
22, 2015.
38. A3 Thinking. http://a3thinking.com/index.html. Published
2013. Accessed February 22, 2015.
39. Abujudeh HH, Bruno MA. Six sigma and lean: opportunities
for health care to do more and better with less. In: Quality
and safety in radiology. Oxford, England: Oxford University
Press, 2012.
40. Strife JL, Kun LE, Becker GJ, Dunnick NR, Bosma J,
Hattery RR. American Board of Radiology perspective on
maintenance of certification. Part IV. Practice quality im-
provement for diagnostic radiology. RadioGraphics 2007;
27(3):769–774.
41. American Board of Radiology. Maintenance of certification:
PQI projects and templates. http://www.theabr.org/moc-dr
-pqi-projects. Accessed February 22, 2015.
42. Lee CS, Larson DB. Beginner’s guide to practice quality
improvement using the model for improvement. J Am Coll
Radiol 2014;11(12 Pt A):1131–1136.
43. Institute for Healthcare Improvement Web site. http://www
.ihi.org/Pages/default.aspx. Accessed February 22, 2015.
44. American College of Radiology. Performance measures.
http://www.acr.org/Quality-Safety/Quality-Measurement
/Performance-Measures. Updated 2015. Accessed February
22, 2015.
45. American College of Radiology. National radiology data reg-
istry. http://www.acr.org/Quality-Safety/National-Radiology-
Data-Registry. Updated 2015. Accessed February 22, 2015.
46. American College of Radiology. RADPEER. http://www.acr
.org/Quality-Safety/RADPEER. Accessed February 22, 2015.
47. American College of Radiology. Practice parameters and
technical standards. http://www.acr.org/Quality-Safety
/Standards-Guidelines. Updated 2015. Accessed February
22, 2015.
48. American College of Radiology. ACR appropriateness criteria.
http://www.acr.org/Quality-Safety/Appropriateness-Criteria.
Updated 2015. Accessed February 22, 2015.
... The next step is to apply the 5S (sort, simplify, sweep, standardize, and self -discipline). [21][22][23] Six Sigma and Lean have a complementary relationship with each other and can be combined as Lean Six Sigma. The synergetic adoption of these methods allows the creation of a continuous process flow that eliminates waste (Lean) and reduces process variation (Six Sigma), to achieve and maintain the best quality. ...
... IRL is about using the opportunities from reported actual or potential incidents and analyzing them to determine the systemic and human factors involved. [22,26,27] IRL is a reactive and retrospective look at a known error. ...
... Quality improvement is a central component of a highperforming healthcare system and essential to improving performance standards and optimizing delivery of care [1]. Recent efforts to promote improved healthcare delivery are motivated by several Institute of Medicine (IOM) reports which highlight the magnitude and scope of medical errors and estimate the role of diagnostic errors as contributing to 10% of patient deaths [2,3]. ...
Article
Full-text available
Objective The purpose of this study was to transition from a traditional score-based peer-review system to an education-oriented peer-learning program in our academic abdominal radiology practice. Material and methods This retrospective study compared our experience with a score-based peer-review model used prior to September 2020 and a peer-learning model implemented and used exclusively beginning in October of 2020. In peer review, a web-based peer-review tool randomly generated a list of cases, which were blindly reviewed in consensus. Comparison of the consensus interpretation with the original report was used to categorize each reviewed case and to calculate the rates of significant and minor discrepancies. Only cases with a discrepancy were considered to represent a learning opportunity. In peer learning, faculty prospectively identified and submitted cases for review in several categories, including case interpretations with a discrepancy from subsequent opinion or result, interpretations considered to represent a great call, and interesting or challenging cases meriting further discussion. The peer-learning coordinator showed each case to the group in a manner which blinded the group to both submitting and interpreting radiologist and invited discussion during various stages of the case. Results During peer review, a total of 172 cases were reviewed over 16 sessions occurring between April 2016 and September 2020. Only 3 cases (1.8%) yielded significant discrepancies whereas 13 (7.6%) yielded minor discrepancies, representing a total of 16 learning opportunities (3.6 per year). In peer learning, 64 cases were submitted and 52 reviewed over 7 sessions occurring between October 2020 and October 2021. 29 (56%) were submitted as an interesting or challenging case meriting further discussion, 18 (35%) were submitted for a discrepancy, and 5 (10%) were submitted for a great call. All 52 presented cases represented learning opportunities (48 per year). Conclusion An education-focused peer-learning program provided a platform for continuous quality improvement and yielded substantially more learning opportunities compared to score-based peer review. Graphical abstract
... Therefore, an evaluation of data transparency by radiologic technologist managers and error reporting by staff technologists might further explain why staff technologists' OPRS was lower in this study. Given that those closest to the problem often are most acutely aware of the issues at hand, managerial rounding, Gemba walks, 59 and active and open communication related to radiation safety challenges with staff technologists might support an improved understanding of radiation safety perception across radiologic technologists holding various roles. ...
Article
Purpose: The purpose of the study was to examine mean differences between intrapersonal and institutional variables and the overall perception of radiation safety (OPRS) among U.S. radiologic technologists. The study also sought to demonstrate the applicability of the socioecological model for radiation safety decision-making. Methods: A quantitative, cross-sectional design with the Radiation Actions and Dimensions of Radiation Safety survey instrument was used to collect data and guide hypothesis testing. The 425 research participants included radiologic technologists working in radiography, mammography, computed tomography, and radiology management. Categorical and descriptive data were calculated, and 1-way analysis of variance tests were used to analyze hypotheses. Results: Seven main effects demonstrated mean differences between groups for the OPRS, including age (F(5,419) = 2.55, P = .03), years of experience (F(5,419) = 4.27, P = .001), primary employed imaging modality (F(2,422) = 9.04, P < .001), primary role (F(2,422) = 4.58, P = .01), shift length (F(3,421) = 10.33, P < .001), primary practice facility (F(4,404) = 5.00, P = .001), and work shift (F(3,405) = 4.14, P = .007), with shift length having the largest effect. Level of education, employment status, number of imaging credentials, gender, patient population, and practice location were not significant at the level of P ≤ .05. Discussion: Radiation safety culture is a multidimensional topic that requires consideration of several intervening influences, making the socioecological model well aligned when considering radiation safety culture and radiation safety perception in medical imaging. Previous research on radiation safety perception among radiologic technologists demonstrated that leadership actions, teamwork across imaging stakeholders, organizational learning, and questioning behavior are drivers of OPRS. However, this study's findings demonstrate that radiologic technologist scheduling practices and primary employed imaging modalities also should be considered when seeking to improve OPRS. Conclusion: This study presents an extensive examination of intrapersonal and institutional variables on OPRS among U.S.-based radiologic technologists and provides findings to support radiation safety culture decision-making in medical imaging, particularly for shift length considerations.
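For readers unfamiliar with the 1-way analysis of variance used in this study, the sketch below shows how such a comparison of group means can be run in Python. The OPRS scores and the three shift-length groups are synthetic stand-ins, not the study's data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Synthetic overall-perception-of-radiation-safety (OPRS) scores for three
# hypothetical shift-length groups; the survey data are not reproduced here.
shift_8h = rng.normal(loc=4.2, scale=0.5, size=150)
shift_10h = rng.normal(loc=4.0, scale=0.5, size=150)
shift_12h = rng.normal(loc=3.8, scale=0.5, size=125)

f_stat, p_value = f_oneway(shift_8h, shift_10h, shift_12h)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```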
... Quality improvement is currently highly prioritised by most health services and various reports refer to different approaches that could be followed in diagnostic medical imaging [39,40]. Table 1 proposes a tool that imaging teams could use to reflect on PCC in their practice and on ways to improve it. ...
Article
Full-text available
Background There is emerging interest in person-centred care within a short-lived yet complex medical imaging encounter. This study explored this event from the viewpoint of patients referred for an imaging examination, with a focus on the person and their person-al space. Methods We used convenience sampling to conduct semi-structured interviews with 21 patients in a private medical imaging practice in Australia. The first phase of data analysis was conducted deductively, using the six elements of the person-centred, patient-journey framework of the Australian Commission on Safety and Quality in Healthcare: transition in; engagement; decisions; well-being; experience; and transition out. This was followed by inductive content analysis to identify overarching themes that span a patient’s journey into, through and out of an imaging encounter. Results The transition-in phase began with an appointment and the first point of contact with the imaging department at reception. Engagement focused on patient-radiographer interactions and explanations to the patient on what was going to happen. Decisions related primarily to radiographers’ decisions on how to conduct a particular examination and how to get patient cooperation. Participants’ well-being related to their appreciation of gentle treatment; they also referred to past negative experiences that had made a lasting impression. Transitioning out of the imaging encounter included the sending of the results to the referring medical practitioner. Person-al vulnerabilities emerged as a cross-cutting theme. Patients’ vulnerability, for which they needed reassurance, pertained to uncertainties about the investigation and the possible results. Healthcare professionals were vulnerable because of patient expectations of a certain demeanour and of pressure to perform optimal quality investigations. Lastly, patients’ personal lives, concerns and pressures – their person-al ‘baggage’ – shaped their experience of the imaging encounter. Conclusion To add value to the quality of the service they deliver, radiography practitioners should endeavour to create a person-al space for clients. Creating these spaces is complex as patients are not in a position to judge the procedures required by technical imaging protocols and the quality control of equipment. A reflective tool is proposed for radiographers to use in discussions with their team and its leaders on improving person-centred care and the quality of services in their practice.
... Quality assurance with respect to the clinical processes associated with MRI can be measured and tracked to provide objective data. These data potentially include several parameters: appropriateness of the imaging ordered, ready access to imaging and the scheduling process, wait times before scanning, timely study protocoling, patient safety in the scanner, image interpretation, timeliness of reporting, imaging repeat rates, communication of critical findings, and measurement of outcomes both clinically and through patient satisfaction surveys [7,14]. The authors present their experiences with institutional QA processes across a spectrum of pediatric radiology divisions in the next subsections. ...
Article
Full-text available
Quality in MR imaging is a comprehensive process that encompasses scanner performance, clinical processes for efficient scanning and reporting, and data-driven improvement involving measurement of key performance indicators. In this paper, the authors review this entire process and provide a framework for establishing a successful MR quality program. The collective experiences of the authors across a spectrum of pediatric hospitals are summarized here. © 2021, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.
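As a rough illustration of how key performance indicators such as wait times and report turnaround could be computed from examination timestamps, consider the sketch below. The Exam records and timestamp values are hypothetical; a real program would pull them from the RIS or PACS.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import median

@dataclass
class Exam:
    ordered: datetime
    scanned: datetime
    reported: datetime

def median_hours(deltas) -> float:
    """Median of a collection of timedeltas, expressed in hours."""
    return median(d.total_seconds() / 3600 for d in deltas)

# Hypothetical MR examinations with order, scan, and report timestamps.
exams = [
    Exam(datetime(2024, 1, 2, 8, 0), datetime(2024, 1, 3, 9, 30), datetime(2024, 1, 3, 15, 0)),
    Exam(datetime(2024, 1, 2, 9, 0), datetime(2024, 1, 4, 11, 0), datetime(2024, 1, 4, 20, 0)),
    Exam(datetime(2024, 1, 3, 7, 30), datetime(2024, 1, 3, 16, 0), datetime(2024, 1, 4, 8, 0)),
]

wait_to_scan = median_hours(e.scanned - e.ordered for e in exams)
report_tat = median_hours(e.reported - e.scanned for e in exams)
print(f"Median wait to scan: {wait_to_scan:.1f} h, "
      f"median report turnaround: {report_tat:.1f} h")
```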
Article
The Kaizen method is an approach to lean process improvement that is based on the idea that small ongoing positive changes can lead to major improvements in efficiency and reduction of waste. The hospital-based CT division at Mayo Clinic Arizona had been receiving numerous concerns about delays in the performance of examinations from inpatients, outpatients, and patients presenting to the emergency department. These concerns, along with a planned hospital expansion, provided the impetus to perform a process improvement project with the goal of reducing inpatient, emergency department, and outpatient turnaround times by 20%. Kaizen process improvement was chosen because of its emphasis on reduction of waste, standardization, and empowerment of frontline staff. The project was led by a process improvement coach who was trained in lean process improvement and A3 thinking. At the end of a weeklong Kaizen event, inpatient turnaround time decreased by 54%, emergency department turnaround time decreased by 29%, and outpatient turnaround time decreased by 45%. These results were achieved and sustained by establishing standardized work, developing frontline problem solvers, instituting visual management, aligning with relevant metrics, emphasizing patient and staff satisfaction, and reducing lead time and non-value-added work. When done properly, a Kaizen event can be an effective tool for process improvement in the health care setting. Online supplemental material is available for this article. ©RSNA, 2022.
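The turnaround-time reductions reported above are simple percentage changes from baseline. The sketch below checks such figures against the 20% goal; the raw minute values are hypothetical, chosen only so that the computed reductions roughly match the percentages in the abstract.

```python
def percent_reduction(before: float, after: float) -> float:
    """Percentage decrease from a baseline value."""
    return 100 * (before - after) / before

# Hypothetical baseline and post-Kaizen turnaround times in minutes; the
# abstract reports only the percentage reductions, not the raw values.
baseline = {"inpatient": 180, "emergency department": 70, "outpatient": 120}
post_event = {"inpatient": 83, "emergency department": 50, "outpatient": 66}

for setting, before in baseline.items():
    drop = percent_reduction(before, post_event[setting])
    print(f"{setting}: {drop:.0f}% reduction (goal of >=20% met: {drop >= 20})")
```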
Article
Full-text available
Background Medical ultrasound devices are increasingly widely used in hospitals, and their safety risk increases significantly when failures occur. However, there is a lack of quantitative risk assessment of the different types of failure modes of medical ultrasound devices. This study utilizes a failure mode, effect and criticality analysis (FMECA) approach for quantitative risk evaluation of different failure modes of ultrasound devices. Methods A total of 4216 medical ultrasound device failure records from various hospitals were investigated. A failure mode and effect analysis method was developed for the quantitative evaluation of the risks of different failure modes. Visual correlation analysis was conducted for all categories to identify the causes of the various failure modes. Based on the severity, occurrence, and detectability of the failure causes determined, the risk priority number (RPN) for each failure mode was back-calculated through the obtained tracking diagram. Results The failure modes of unclear images, inability to power on, and dark shadows on an image had the highest RPNs. One failure mode could be caused by various factors, and the failure location was not necessarily related to the cause of the failure. Conclusions This quantitative approach more accurately evaluated the risks of different failure modes, and the results of the corresponding analysis of failure modes and causes could support rapid determination of the causes of failures in clinical practice.
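The risk priority number used in FMECA is conventionally the product of severity, occurrence, and detectability ratings. The sketch below computes and ranks RPNs for the failure modes named in the abstract; the individual 1-10 ratings are hypothetical, since the published study derived its scores from the 4216 service records.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int       # 1 (negligible) to 10 (catastrophic)
    occurrence: int     # 1 (rare) to 10 (frequent)
    detectability: int  # 1 (always detected) to 10 (effectively undetectable)

    @property
    def rpn(self) -> int:
        """Risk priority number: the conventional FMECA product S x O x D."""
        return self.severity * self.occurrence * self.detectability

# Hypothetical ratings for the failure modes named in the abstract.
modes = [
    FailureMode("Unclear images", severity=7, occurrence=6, detectability=5),
    FailureMode("Unable to power on", severity=8, occurrence=4, detectability=2),
    FailureMode("Dark shadows on image", severity=6, occurrence=5, detectability=6),
]

for mode in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{mode.name}: RPN = {mode.rpn}")
```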
Article
Full-text available
Diagnostic errors in neuroradiology are inevitable, yet potentially avoidable. Through an extensive literature search, we present an up-to-date review of the psychology of human decision making and how this complex process can lead to radiologic errors. Our focus is on neuroradiology, so we augmented our review with multiple explanatory figures to show how different errors can manifest in real-life clinical practice. We propose a new thematic categorization of perceptual and cognitive biases in this article to simplify message delivery to our target audience: emergency/general radiologists and trainees. Additionally, we highlight individual and organizational remedy strategies to decrease the error rate and potential harm.
Article
Full-text available
In a context of constant economic uncertainty, there is a pressing need to improve the quality of health care delivery. To that end, it is essential to define institutionalized procedures and mechanisms for monitoring, evaluation, and correction, aimed at continuous improvement, as well as mechanisms for following up, evaluating, and developing quality management systems, which are essential to ensure that radiology practice is safe, accurate, and optimized and that it guarantees patient care of the highest quality, always satisfying the requirements of health care policy while containing costs. A case study was conducted using a questionnaire, with the aim of determining whether the radiographers of the general imaging and neurological imaging departments, which have an implemented quality management system, possess a quality culture. The radiographers' response rate to this questionnaire was 84.7% (83/98). The conclusion is that radiographers understand the importance of quality and are receptive to new ideas to increase it; however, they demonstrate a worrying lack of knowledge about the basic concepts of quality. The quality culture of these professionals shows gaps that may compromise the success of their departments' quality management system, one of the main causes being the limited availability of quality training offered to these professionals.
Article
Rationale Three years ago, the Accreditation Council for Graduate Medical Education (ACGME) introduced updated Common Program Requirements in recognition of the need to further promote resident and faculty member well-being and patient safety. The ACGME acknowledged that residencies would need time to comply with the new requirements. This grace period, however, concluded as of July 1, 2019, and programs now risk citations for failure to implement the new requirements. Methods and Results The authors, members of the Association of Program Directors in Radiology Common Program Requirements Ad Hoc committee, developed downloadable resources, provided in the Appendix, that delineate the 2019 Common Program Requirements and offer sample resources as compliant solutions. Conclusion The resources offer a national standardized approach to educating trainees in these essential skills and should be especially helpful to programs with access to fewer resources. In addition to achieving compliance, incorporation of these resources into residency training will ensure that the next generation of radiologists is equipped to add value while remaining physically and emotionally healthy.
Article
Full-text available
An urgent need in American health care is improving quality and efficiency while controlling costs. One promising management approach implemented by some leading health care institutions is Lean, a quality improvement philosophy and set of principles originated by the Toyota Motor Company. Health care cases reveal that Lean is as applicable in complex knowledge work as it is in assembly-line manufacturing. When well executed, Lean transforms how an organization works and creates an insatiable quest for improvement. In this article, we define Lean and present 6 principles that constitute the essential dynamic of Lean management: attitude of continuous improvement, value creation, unity of purpose, respect for front-line workers, visual tracking, and flexible regimentation. Health care case studies illustrate each principle. The goal of this article is to provide a template for health care leaders to use in considering the implementation of the Lean management system or in assessing the current state of implementation in their organizations.
Article
Full-text available
The health sector requires continuous investment to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment, and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend toward modern management in this sector (Gunderman et al, 2008) [4]. This article proposes a process model for diagnostic medical X-ray imaging, derives a primary reference model from it, and describes how this information leads to gains in quality and improvements.
Article
The Toyota Production System, also known as Lean, is a structured approach to continuous quality improvement that has been developed over the past 50 years to transform the automotive manufacturing process. In recent years, these techniques have been successfully applied to quality and safety improvement in the medical field. One of these techniques is kaizen, which is the Japanese word for "good change." The central tenet of kaizen is the quick analysis of the small, manageable components of a problem and the rapid implementation of a solution with ongoing, real-time reassessment. Kaizen adds an additional "human element": all stakeholders, not just management, must be involved in such change. Because of the small size of the changes involved in a kaizen event and the inherent focus on human factors and change management, a kaizen event can serve as a good introduction to continuous quality improvement for a radiology department. Copyright © 2014. Published by Elsevier Inc.
Article
The most fundamental aspects of quality and safety in radiology are reviewed, including a brief history of the quality and safety movement as applied to radiology, the overarching considerations of organizational culture, team building, choosing appropriate goals and metrics, and the radiologist's quality "tool kit." Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Article
Radiologists in the United States are required to complete the Practice Quality Improvement (PQI) program as part of their Maintenance of Certification by the ABR. The Institute for Healthcare Improvement's (IHI) Model for Improvement (MFI) offers an alternative to the 3-phase approach currently advocated by the ABR. The MFI implicitly assumes that many interventions will need to be tested and refined for any meaningful project, and it provides a project management approach that enables rapid assessment and improvement of performance. By collecting data continuously, rather than simply before and after interventions, more interventions can be tested simultaneously and projects can progress more rapidly. In this article, we describe the ABR's 3-phase approach and introduce the MFI and how it can be employed to effect positive changes. Using a radiology case study, we demonstrate how one can utilize the MFI to enable rapid quality improvement. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.
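A practical consequence of collecting data continuously, as the MFI encourages, is that a simple run chart can signal a sustained shift long before a formal before-and-after comparison. The sketch below applies a common run-chart rule (six consecutive points below the baseline median) to hypothetical weekly turnaround times; the values and the choice of rule are illustrative assumptions, not the article's method.

```python
from statistics import median

# Hypothetical weekly report-turnaround times (hours) collected continuously
# across successive improvement cycles, rather than only before and after.
weekly_tat = [26, 25, 27, 24, 26, 23, 21, 20, 19, 18, 18, 17]

baseline = median(weekly_tat[:6])  # median of the first six weeks as a baseline
run_below = 0
for week, value in enumerate(weekly_tat, start=1):
    run_below = run_below + 1 if value < baseline else 0
    if run_below == 6:
        print(f"Sustained shift below the baseline median ({baseline} h) "
              f"detected at week {week}")
```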
Article
Value stream mapping (VSM) is a very useful technique to visualize and quantify the complex workflows often seen in clinical environments. VSM brings together multidisciplinary teams to identify parts of processes, collect data, and develop interventional ideas. An example of VSM applied to pediatric MRI with general anesthesia is outlined. As the process progresses, the map shows a large delay between the fax referral and the date of the scheduled and registered appointment. Ideas for improved efficiency and metrics to measure improvement within a 6-month period were identified, and an intervention package was developed for the department. Copyright © 2014. Published by Elsevier Inc.
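Value stream maps are often summarized by comparing value-added processing time with total lead time. The sketch below does this for a made-up set of steps loosely modeled on the pediatric MRI example; the step names and minute values are hypothetical.

```python
# Hypothetical value-stream steps: (step name, processing minutes, waiting minutes).
steps = [
    ("Fax referral received", 5, 0),
    ("Protocol and triage", 10, 2880),          # ~2 days of waiting
    ("Scheduling and registration", 15, 4320),  # ~3 days of waiting
    ("Pre-anesthesia assessment", 30, 1440),
    ("MRI examination", 60, 45),
]

value_added = sum(process for _, process, _ in steps)
total_lead = sum(process + wait for _, process, wait in steps)
print(f"Value-added time: {value_added} min of {total_lead} min total lead time "
      f"({100 * value_added / total_lead:.1f}%)")
```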
Article
As health care undergoes fundamental redesign organized around increasing the value of care for populations, most of the discussion has focused on primary care and its role in managing the care of these populations. Relatively less attention has been given to specialty care population management, especially for hospital-based specialties such as diagnostic radiology. In radiology, an analysis of such a change is important and similar issues likely apply to other specialist activities as well.
Article
Many hospital radiology departments are adopting "lean" methods developed in automobile manufacturing to improve operational efficiency, eliminate waste, and optimize the value of their services. The lean approach, which emphasizes process analysis, has particular relevance to radiology departments, which depend on a smooth flow of patients and uninterrupted equipment function for efficient operation. However, the application of lean methods to isolated problems is not likely to improve overall efficiency or to produce a sustained improvement. Instead, the authors recommend a gradual but continuous and comprehensive "lean transformation" of work philosophy and workplace culture. Fundamental principles that must consistently be put into action to achieve such a transformation include equal involvement of and equal respect for all staff members, elimination of waste, standardization of work processes, improvement of flow in all processes, use of visual cues to communicate and inform, and use of specific tools to perform targeted data collection and analysis and to implement and guide change. Many categories of lean tools are available to facilitate these tasks: value stream mapping for visualizing the current state of a process and identifying activities that add no value; root cause analysis for determining the fundamental cause of a problem; team charters for planning, guiding, and communicating about change in a specific process; management dashboards for monitoring real-time developments; and a balanced scorecard for strategic oversight and planning in the areas of finance, customer service, internal operations, and staff development.
Article
All imaging departments are expected to establish and maintain effective quality, safety, and performance improvement programs. Essential components of such programs include adherence to the basic principles of quality management and appropriate utilization of quality tools. The initial step is the gathering of relevant information, followed by the collection and analysis of quality and performance data; analysis and ranking of causes that likely contributed to a process failure, error, or adverse event; and prioritization and local implementation of solutions, with careful monitoring of newly implemented processes and wider dissemination of the tools when a process proves to be successful. Quality improvement requires a careful, dedicated, and continuously planned effort by a number of skilled and committed team members, with the goal being to do the right thing in a timely fashion in every case. This process can be sustained by offering rewards and celebrating successes, with all lessons learned disseminated throughout the department or organization.
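The "analysis and ranking of causes" step described above is often carried out as a Pareto analysis, ranking contributing causes by frequency and tracking their cumulative share. A minimal sketch follows; the causes and counts are hypothetical examples, not data from any of the cited studies.

```python
from collections import Counter

# Hypothetical counts of contributing causes logged for delayed examinations;
# a real program would draw these from incident or workflow reports.
causes = Counter({
    "Patient arrived late": 42,
    "Protocol not finalized": 31,
    "Scanner downtime": 12,
    "IV access difficulties": 9,
    "Transport delay": 6,
})

total = sum(causes.values())
cumulative = 0
print("Cause ranking (Pareto):")
for cause, count in causes.most_common():
    cumulative += count
    print(f"  {cause}: {count} ({100 * cumulative / total:.0f}% cumulative)")
```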