Evaluation & the Health Professions / March 2006
Feifer et al. / TRANSLATING MEDICAL RESEARCH INTO PRACTICE
The gap between evidence-based guidelines and actual clinical
practice is discussed. Effective interventions are needed to
help health care providers reduce this gap.
Whereas the development of clinical practice
guidelines from biomedical and clinical
research is an example of Type 1 translation,
Type 2 translation involves successful implementation of guidelines
in clinical practice. This article describes a multimethod
intervention aimed at increasing adherence to clinical practice
guidelines in primary care practices that use a common
electronic medical record (EMR). Practice
performance reports, site visits, and network
meetings are intervention methods designed
to stimulate improvement in practices by
addressing personal and organizational fac-
tors. Theories and evidence supporting these
interventions are described and could prove
useful to others trying to translate medical
research into practice. Additional theory
development is needed to support translation
in medical offices.
Keywords: primary care; quality improvement; evaluation; translation
THE LOGIC BEHIND A MULTIMETHOD INTERVENTION TO
IMPROVE ADHERENCE TO CLINICAL PRACTICE
GUIDELINES IN A NATIONWIDE NETWORK
OF PRIMARY CARE PRACTICES
University of Southern California
STEVEN M. ORNSTEIN
RUTH G. JENKINS
Medical University of South Carolina
SARAH T. CORLEY
Practice Partner Research Network
LYNNE S. NEMETH
PAUL J. NIETERT
Medical University of South Carolina
EVALUATION & THE HEALTH PROFESSIONS, Vol. 29 No. 1, March 2006 65-88
© 2006 Sage Publications
AUTHORS’ NOTE: This work was supported by the Agency for Healthcare
Research and Quality, U.S. Department of Health and Human Services,
Public Health Service (Grant nos. U18 HS11132 and U18
Grimshaw, 1999; Shojania & Grimshaw, 2005). In 2001, the Institute
of Medicine (IOM) characterized the large gap in translating medical
research into practice as a quality chasm and made recommendations
to improve the health system: the development of effective teams,
redesign of care processes, effective use of information technology,
knowledge and skills management, and incorporation of performance
and outcome measures. Clinical practice guidelines are a tool
to assist with knowledge management and Type 1 translation of bio-
medical and clinical research. Committees of experts review medical
research and present summaries of the findings with corresponding
recommendations for evidence-based clinical practice. These
recommendations are vetted by physician and payer groups prior to
Electronic Medical Records (EMRs) can also facilitate the imple-
mentation of IOM recommendations. Electronic Medical Records
include tools such as disease management templates, messaging, and
recalls that provide reminders and decision support, quick access to
clinical information such as test results, opportunities for quick and
easy coordination in the practice, and automated outreach to patients.
Use of EMR tools can improve compliance with guidelines and is
associated with better prescribing and follow-up, fewer medical
errors, lower mortality, and better disease control (Buntinx, Winkens,
Grol, & Knottnerus, 1993; Evans, Pestotnik, Classen, Clemmer, &
Weaver, 1998; Griffin & Kinmonth, 2000; Hunt, Haynes, Hanna, &
Smith, 1998; Shea, DuMouchel, & Bahamonde, 1996; Spencer,
In real-world settings, trials of EMR tools have met with mixed success
(Flottorp, Havelsrud, & Oxman, 2003; Hulscher, Laurant, Wensing,
& Grol, 1999; Rousseau, McColl, Newton, Grimshaw, & Eccles,
Only in recent years have the products consistently included
disease management
prompts and reminders. Cost has been the primary barrier to wide-
spread adoption, along with the disruption they can cause to the
usual flow of work.
When a practice has successfully implemented an EMR, physicians
may not fully use the decision support tools that are available in the
software because of time constraints. These tools typically require
additional time that may not be reimbursed through the current
payment structures.
Systematic reviews have found mixed evidence for interventions to
improve guideline adherence, highlighting the need for more study
(Bero et al., 1998; Borbas, Morris,
McLaughlin, Asinger, & Gobel, 2000; Buntinx et al., 1993; Davis
et al., 1999; Oxman, Thomson, Davis, & Haynes, 1995; Shortell,
Bennett, & Byck, 1998; Thomson O’Brien, et al., 2000a, 2000b;
nation strategies). A criticism of much of the existing research is that
the “choices of particular interventions lack compelling theories pre-
dicting their success or informing specific features of their develop-
ment” (Shojania & Grimshaw, 2005, p. 148), or similarly, that the
evaluation is unclear about empirical or theoretical expectations
(Ovretveit & Gustafson, 2003; A. E. Walker et al., 2003).
More work is needed to learn how to guide change in the small-office
practices that make up much of our primary care system, and in
practices that use EMRs. The Practice
Partner Research Network (PPRNet) comprises more than 100 primary
care practices that use a common EMR (Practice Partner, Physician
Micro Systems, Inc., Seattle, WA).
PPRNet is currently engaged in studies of translating research into
practice by increasing adherence to clinical practice guidelines. To
support the need for theory to guide translation of medical research
into practice, this article describes PPRNet’s multimethod
intervention.
BACKGROUND ON THE PPRNet INTERVENTION
PPRNet is a network of primary care practices, their affiliated
researchers, and the vendor of their EMR system. Other practices
using the EMR can join the network by
electronically contributing data from their medical records. The
multimethod intervention described in this article was tested and
refined in a randomized controlled trial aimed at improving primary
and secondary prevention of cardiovascular disease in 20 practices
(Ornstein et al., 2004). The qualitative evaluation of this project
helped to identify differences in practice operations that might con-
tribute to differences in performance (the improvement model
described below). The intervention is currently being studied in a
demonstration project called Accelerating the Translation of
Research into Practice (A-TRIP). A-TRIP addresses eight disease
management and prevention areas (listed below) and has extended the
intervention to 101
practices in 37 states.
The first box in the upper left corner of Figure 1 represents the
intervention methods.
The intervention team uses practice-level performance reports, site
visits, and annual network meetings to motivate practices and assist
them with improvement plans. These intervention methods are the
focus of this article.
[Figure 1: Framework for Accelerating the Translation of Research
Into Practice. Example boxes from the figure: the site visitor at
Practice X discusses the report showing low levels of lipid testing
for diabetes and presents strategies other practices have used to
deal with this; Practice X prompts nurses with a message in the EMR
to check for lipid tests in patients with diabetes and to order one
if the last is a year or more old; the proportion of patients with
diabetes in Practice X with a lipid test in the past 12 months
increases from 45%. NOTE: EMR = Electronic Medical Record.]

The second box represents the improvement model. The improvement
model was developed using multiple sources of data, including
participant observation at practice site visits, field notes,
interviews, and site visit evaluations. The aim of the
analysis was to describe what the practices had done to improve care.
A grounded theory approach was used to analyze the qualitative data
(Glaser, 1998). The intervention presents the improvement model to
practices, which are encouraged to adopt strategies to prioritize
performance, involve all staff, redesign delivery systems, activate
patients, and use EMR tools (Feifer & Ornstein, 2004). To activate
patients, a practice might
explain to a patient what the guidelines are, why they are important,
how the patient’s value on a test compares to the goal for disease
control, and ways to achieve the goal. Other examples of items in the
model are training staff on clinical guidelines, establishing
standing orders for designated subgroups of patients without an
individual doctor order, and having clinicians send themselves
electronic reminders about patient care.
The last box represents the project’s expected outcomes. These are
practice-level indicators of guideline adherence, such as the
percentage of patients with a blood pressure measure less than
140/90 mmHg. The A-TRIP project
includes 84 quality-of-care indicators related to underuse, overuse,
and misuse of health care services in eight clinical areas, including
immunizations, nutrition, infectious disease, and safe prescribing for
the elderly. The indicators were selected based on the strength of evi-
dence behind each specific guideline recommendation and their rele-
vance to primary care practice. These guidelines additionally are
widely accepted among physicians as being the appropriate standard
of care. The project encourages improvement across this broad
spectrum, leaving the approach to each individual practice.
Practices can elect to specialize in one area, such as hypertension,
to focus improvements on one area at a time, or to attempt
improvements in multiple areas at once.
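To make the indicators concrete, a practice-level rate of the kind described above (e.g., the proportion of patients with an HgbA1c measure in the past 6 months) can be sketched as follows. The record fields, dates, and patient roster are hypothetical illustrations, not PPRNet’s actual extract schema.

```python
from datetime import date, timedelta

# Hypothetical extract rows, one per active patient with diabetes.
# Field names are illustrative and not PPRNet's actual schema.
patients = [
    {"id": 1, "last_hba1c": date(2005, 5, 1)},
    {"id": 2, "last_hba1c": date(2004, 9, 1)},
    {"id": 3, "last_hba1c": None},
]

def hba1c_indicator(patients, as_of, months=6):
    """Proportion of patients with an HgbA1c measured in the past `months`."""
    window = timedelta(days=months * 30)
    hits = sum(
        1
        for p in patients
        if p["last_hba1c"] is not None and as_of - p["last_hba1c"] <= window
    )
    return hits / len(patients)

rate = hba1c_indicator(patients, as_of=date(2005, 6, 30))
print(round(rate, 2))  # 0.33: one of three patients has a recent HgbA1c
```

Because the denominator is every active patient rather than a chart sample, a report built this way reflects the whole practice, which is the point made later about the believability of PPRNet’s reports.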
The logic behind the A-TRIP intervention is summarized in Table 1.
The first column lists the intervention methods, the third column
provides examples of our outcomes, and the middle column lists the
determinants of improved care.

TABLE 1
Logic for the Interventions

Practice reports—Audit and feedback
  Methods:
  • Get data on all patients with minimal disruption to provider
  • Provide practice reports with
    1. practice performance on indicators
    2. targets (based on network benchmark)
    3. national benchmarks, when known
    4. network comparisons (medians)
  Determinants of outcomes:
  1. Providers accept the data
  2. Providers recognize that their performance needs improvement
  3. Providers believe there are better levels of performance that are achievable
  Outcomes — improved care process, for example:
  • diabetes patients with HgbA1c measure in 6 months
  • diabetes patients with LDL measure in 1 year
  • diabetes patients with BP measure in 6 months

Site visits—Academic detailing
  Methods:
  • Provide information on guidelines
  • Review goals of A-TRIP
  • Review strategies for practice-based improvement (improvement model)
  • Build EMR skills
  Determinants of outcomes:
  1. Clinicians know and value the evidence-based guidelines
  2. Staff values the impact of their role on guideline adherence
  3. Practice understands project goals
  4. Clinicians and staff accept a participatory model for improvement
  5. Practice is willing to experiment and react with adjustments as needed
  6. Practice is aware of new strategies to improve care
  7. Practice uses more EMR tools
  Outcomes — improved disease control, for example:
  • diabetes patients with most recent HgbA1c < 7%
  • diabetes patients with most recent LDL < 100 mg/dl
  • diabetes patients with most recent BP < 130/80

Site visits—Audit & feedback
  Methods:
  • Review practice reports
  Determinants of outcomes:
  1. Practices raise questions about measures and their performance
  2. Practices accept that their performance needs improvement
  3. Practices believe there are better levels of performance that are achievable
  4. Practices are willing to consider improvement strategies from other practices that have achieved better levels of performance
  5. Practice is motivated to tackle improvement
  6. Comparisons stimulate competition as an additional source of motivation

Site visits—Participatory planning
  Methods:
  • Build practice communication
  • Facilitate discussion of plans for improvement
  • Facilitate discussion of barriers
  Determinants of outcomes:
  1. Practice buys into team approach and strengthens team
  2. Practice designates a leader for improvement efforts
  3. Staff are empowered and involved in problem solving
  4. Practice designs an implementation plan
  5. Practice agrees on targets for improvement efforts
  6. A critical mass of practice members accept the plans
  7. Challenges are recognized and addressed
  8. Practice takes action
  9. Achievements are reviewed and plans are revised as needed

Network meetings
  Methods:
  • Best practice examples
  • Peer group interaction
  • Research summaries
  Determinants of outcomes:
  1. Clinicians learn from each others’ best practices
  2. Clinicians are motivated to try specific new things
  3. Staff learn from each others’ best practices
  4. Staff are motivated to get more involved
  5. Network members develop a rewarding affiliation with something bigger than everyday practice
  6. Awards stimulate competition as an additional source of motivation
  7. Presenters of best practices feel pride in their work and strengthen their practice’s culture of innovation

NOTE: A-TRIP = Accelerating the Translation of Research into Practice; EMR = Electronic Medical Record; LDL = low-density lipoprotein.

We expect, for example, that a practice performance report will
help providers recognize areas with room for improvement. If they
agree to adjust the way their practice delivers care in one of those
areas, we expect the corresponding outcome measure to improve.
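As a concrete illustration of the last link in this chain, the EMR recall shown in Figure 1 (prompting for a lipid test when the last one is a year or more old) could be sketched as below. The roster and field names are hypothetical, not the Practice Partner software’s actual data model.

```python
from datetime import date, timedelta

def needs_lipid_recall(patient, as_of):
    """Flag patients with diabetes whose last lipid test is a year or
    more old (or missing), mirroring the EMR message in Figure 1."""
    if not patient.get("diabetes"):
        return False
    last = patient.get("last_lipid_test")
    return last is None or as_of - last >= timedelta(days=365)

# Hypothetical roster; field names are illustrative only.
roster = [
    {"id": 1, "diabetes": True, "last_lipid_test": date(2004, 3, 1)},
    {"id": 2, "diabetes": True, "last_lipid_test": date(2005, 5, 15)},
    {"id": 3, "diabetes": False, "last_lipid_test": None},
]

due = [p["id"] for p in roster if needs_lipid_recall(p, as_of=date(2005, 6, 30))]
print(due)  # [1]
```

If nurses act on such a prompt, the corresponding indicator (lipid test in the past 12 months) should rise on the next quarterly report, which is exactly the feedback loop the framework describes.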
OVERVIEW OF SUPPORTING THEORIES
There was no one unifying theory that could guide the intervention’s
development. It was clear that to get change in busy primary care
practices, the intervention would need to be guided by a set of
personal and organizational theories supporting different
methods and expected impacts. These theories were originally devel-
oped for other purposes, and the intervention tapped selected aspects
of them for ideas and justification.
The personal theories we draw on reflect three main assumptions.
First, we assume that motivation determines behavior, and motivation
is socially influenced. Our social
influence interventions are informed by social psychological theories
(Bandura, 1977) and social marketing theory (Kotler, 1984). Social
cognitive theory asserts behavior is predicted by an individual’s
perceived ability to perform it and the likelihood it results in
desirable outcomes. Social marketing theory asserts
that one can influence the motivations of a group with carefully
crafted and targeted messages.
Second, we assume that doctors and staff in primary care practices
are busy and will be interested in our information only if it appears to
be important for them. We draw on adult learning theory to frame our
interactions. Adult learning theory promotes learning based on
assumptions about adults: They are self-directed, draw on their
experience, learn when presented a new role,
and want immediate application (Knowles, 1970).
Third, we assume we will find individuals at different levels of
readiness to make practice changes. The transtheoretical model
describes a stepwise process for behavior change and suggests that
interventions will be more effective if they address the individual’s
stage-specific needs (Prochaska & DiClemente, 1984). Our interven-
tion applies the idea of readiness to individuals and practices.
The intervention also draws on four organizational theories. There
are many organizational change theorists who describe ways to culti-
vate change. Steps adapted in our intervention from Kotter (1999)
include (a) establishing a sense of urgency, (b) forming a powerful
guiding coalition, (c) creating and communicating a vision,
(d) empowering members to act on the vision, (e) planning for
short-term wins, and (f) consolidating improvements and institutionalizing
successful approaches. Learning organizations theory adds that feed-
back is important for improvement, saying that groups that learn
quickly and accurately about their environment and translate this
learning to their work are more successful (Senge, 1990). Complex
adaptive systems theory asserts that organizations, like organisms,
evolve in response to internal and external cues, often in unique,
unpredictable ways (Plsek, 2001). Small cues can result in big
changes, suggesting that organizations should experiment and see
which approach evolves as the best
solution. Diffusion of innovations theory describes distinct phases an
organization moves through from first becoming aware of an innova-
tion to finally adopting it (Rogers, 1962; Sussman, Valente,
Rohrbach, Skara, & Pentz, 2006). Like the transtheoretical model for
individual behavior change, this theory helps us target our
interventions to each practice’s readiness to adopt the
improvement strategies and the guidelines.
The connection between these theories and the intervention is pre-
sented along with available evidence in the next sections by interven-
tion method: practice reports, site visits, and network meetings.
Audit and feedback is a well-studied practice process (collecting
performance data and sharing the results with providers; Eisenberg,
1986; Greco & Eisenberg, 1993).
In some comparisons, this method was less effective than point-of-care reminders
(prompts integrated into the flow of care routines; McPhee, Bird,
Jenkins, & Fordham, 1989; Tierney, Hui, & McDonald, 1986). Sys-
tematic reviews found that audit and feedback was not always suc-
cessful, yet comparisons that might explain differing effects were not
available (e.g., doctors getting data for their own patients vs. aggre-
gated practice level data; Bero et al., 1998; Shojania & Grimshaw,
2005; Thomson O’Brien, et al., 2000a).
The PPRNet intervention includes audit and feedback through
practice reports. In return for the quarterly extract of their data, prac-
tices receive a report of practice-level performance on a set of indica-
tors (see Figure 2 for a sample report page). The information in
PPRNet’s quarterly reports is meant to be timely, guided by learning
theory, and presented using statistical process control methods with
trends over time to help practices evaluate performance and whether
improvement activities are making a difference (Western Electric
Company, 1984).
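One common statistical process control signal from the Western Electric handbook, a sustained run of points on one side of the center line, can be sketched as follows. The monthly rates are invented for illustration and are not PPRNet data.

```python
def sustained_shift(series, center, run=8):
    """Western Electric run rule: `run` consecutive points on the same
    side of the center line suggest a non-random shift."""
    side, streak = 0, 0
    for x in series:
        s = 1 if x > center else (-1 if x < center else 0)
        if s != 0 and s == side:
            streak += 1
        else:
            side, streak = s, (1 if s != 0 else 0)
        if streak >= run:
            return True
    return False

# Illustrative monthly rates: stable around a 45% center line, then a
# sustained rise after an improvement effort.
before = [0.44, 0.46, 0.43, 0.45]
after = [0.48, 0.49, 0.50, 0.52, 0.51, 0.53, 0.54, 0.55]
print(sustained_shift(before + after, center=0.45))  # True
print(sustained_shift(before, center=0.45))          # False
```

A rule of this kind is what distinguishes a real, sustained improvement from quarter-to-quarter noise in a trend chart.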
The motivation theories hypothesize that for audit and feedback to
work, physicians must believe their current practices need
improving. Whether feedback motivates improvement depends on several
factors: Doctors must believe the
data are accurate (Shojania & Grimshaw, 2005), recognize that the
performance level is low, and believe that higher levels of perfor-
mance are achievable. PPRNet’s practice reports are based on data
from all active patients in the practice, making them more complete and
believable than data from a sample of charts. A patient-level report is
also made available. The practice can match information from the
patient-level report to their EMR to identify each patient whose
treatment is not meeting guidelines. This patient-level review helps
prove there are opportunities for improvement. When the intervention
team develops
new measures, it works with network practices to adjust the way data
are recorded to ensure the most accurate representation of practice
performance.
The practice reports provide comparisons with other practices in
the network using medians, and with national levels when known.
Recognizing that one’s performance is below one’s peers can raise
[Figure 2: Sample Chart From a Practice Report: Practice Y, Second
Quarter, 2005. Practice Y’s performance exceeded the national,
median, and benchmark levels; more than 60% of the patients with
diabetes in this practice have LDL values less than 100. The “tests”
mark variations in the data over time that are not random; they are
used to identify where real improvements have taken place. The n
varies each month; it is the number of patients with an LDL measure
in the past 12 months.]
readiness for behavior change, according to the transtheoretical
model (Prochaska & DiClemente, 1984). Likewise, it can create
tension that helps the group prioritize and successfully attain this
change, according to organizational change theory (Kotter, 1999).
Targets are also presented in the reports. These are Achievable
Benchmarks of Care (Allison, Kiefe, & Weissman, 1999), roughly
equivalent to the 90th percentile among all practices in the
network, providing recognizable, clinically meaningful goals.
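A simplified sketch of how the network median and a roughly 90th-percentile target might be computed from practice-level rates is shown below. The rates are invented, and the published Achievable Benchmarks of Care method additionally adjusts for the number of eligible patients in each practice, which this sketch omits.

```python
import statistics

# Hypothetical practice-level rates on one indicator across the network.
rates = [0.41, 0.45, 0.52, 0.58, 0.61, 0.64, 0.70, 0.74, 0.81, 0.88]

network_median = statistics.median(rates)

# Rough stand-in for an achievable benchmark: the 90th percentile of
# practice performance across the network.
benchmark = statistics.quantiles(rates, n=10)[-1]

print(network_median)  # 0.625
print(round(benchmark, 3))
```

Presenting both numbers gives a practice two reference points: where the typical peer stands (the median) and what the best peers demonstrably achieve (the benchmark).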
The site visit methods in PPRNet’s intervention include academic
detailing (educational
outreach), audit and feedback, and participatory planning. The three
methods are discussed individually below.
Delivering focused education to providers at their worksite is
known as academic detailing. The method is based on social market-
ing theory and has been shown to change physician behaviors (Avorn
&amp; Soumerai, 1983; E. Walker et al., 1995). The first studies of
academic detailing emphasized the importance of the site visitor’s
credibility, presenting both sides of controversial issues and
unbiased sources of information, and stimulating
active participation in educational interactions that repeat key mes-
sages (Shojania, McDonald, Wachter, & Owens, 2004).
Social influence theory explains that the opinions of a respected
colleague can carry substantial weight in decision making (Mittman,
Tonesk, & Jacobson, 1992). Physicians have been found to rely on
colleagues’ recommendations when deciding whether to adopt new
techniques. PPRNet site visitors are primary care clinicians with
expertise in the medical areas of the project, which supports the
process of acquiring and integrating the new knowledge needed to translate
research into practice. PPRNet draws from adult learning theory to
include varied, active presentations of a message that is practical and
relevant to the learner. The site visitor might explain the findings of
recent studies or new medications relevant to performance measures
and discuss their implications for practice improvement efforts, or
they might present information about how the practice can organize
itself more efficiently for the improvements they are trying to make.
life saved to make this particularly meaningful to the practitioners.
Improvement strategies can require skills and acceptance not present
in the practice. Clinicians may be reluctant to use
a computer in front of patients or may not want their staff more
involved in care coordination. Site visitors borrow from the
transtheoretical model, tailoring recommendations to each practice’s
level of readiness to adopt
change. A practice that is not technically oriented might be shown
EMR tools that make processes easier but would learn most about
strategies that are consistent with the current culture and abilities. At
later visits, when members demonstrate readiness for change, the
site visitor can introduce additional strategies.
AUDIT AND FEEDBACK
For audit and feedback to work, physicians must be able to act on
the feedback (Greco & Eisenberg, 1993; Stevenson et al., 2001). The
PPRNet site visits complement reports as a way to provide roadmaps
and assistance with change to those who need it. In a group meeting
Sample Site Visit Agenda
Site Visit Two Agenda, Practice Z, November 2004
• Update on the progress of the A-TRIP project (10 minutes)
• Present new practice guideline information on diabetes and the treatment of hyperlipidemia
• Show features of Patient Records Version 8 relevant to A-TRIP (15 minutes)
• Discuss practice’s progress and barriers since first site visit: review each of the tasks they
had decided to work on, the changes they made to those plans, and the data on each of those
indicators to show the impact on performance (45-60 minutes)
• Review additional items in the practice report and make future plans (45-60 minutes)
NOTE: A-TRIP = Accelerating the Translation of Research into Practice.
during the site visit, staff and clinicians review and discuss the prac-
tice report together, transforming a passive source of feedback infor-
mation into an active one. Site visitors answer questions about the
measures and the practice’s performance, which in turn can increase
practice acceptance of the data and willingness to act on it. In the dis-
cussion, each practice identifies areas it feels need improvement and
are improvable. This shared vision, according to organizational
change theory, helps get members working together toward change.
Another part of the feedback during a site visit is the site visitor’s
explanation of the practice’s performance relative to other practices.
The site visitor can elaborate on differences between practices and
provide examples of improvement strategies used in higher perform-
ing or similar sites. This, according to the transtheoretical model,
can move a practice closer to action
(Prochaska &amp; DiClemente, 1984). Comparisons can also work as a
motivating force for improvement, through social influence, by stim-
ulating competition or the desire to keep up with others. In PPRNet, a
friendly level of competition among practices is encouraged.
Site visitors also engage the whole practice in planning change in a
process we refer to as participatory planning. Some case studies
support this approach (Stange, 1998; Eisenberg, 1986; Stevenson
et al., 2001); however, evidence from a review is mixed (Bero et
al., 1998). The evidence supporting multidisciplinary,
collaborative teams for clinical and
improvement efforts appears to be more consistent (Berlowitz et al.,
In one program, a team approach to improvement helped staff value
the impact of their role and was associated with high levels of
motivation. Involving staff in planning discussions ensures that the
entire practice understands the goals of the improvement project and
each person’s
individual role. Clear objectives and explicit, tangible, measurable
tasks have been associated with successful change (Lee & Steinberg,
PPRNet’s participatory planning draws on organizational change
theory to cultivate change at the practice level. After the
performance report is reviewed, the site visitor facilitates
discussions of the practice’s plans for new strategies to improve
performance, including the selection of leaders for the change
effort. Practice members develop a shared vision about goals and
process. The
practice plans a few changes per visit for short-term wins and institu-
tionalizes successful approaches (Kotter, 1999). The site visits
provide dedicated time to talk about issues that otherwise often get
lost in daily primary care practice.
Periodic site visits also assist practices to move through the
stages of adopting an innovation. In early visits, practices
acknowledge needs, evaluate attributes of strategies for
improvement, and decide which ones to adopt. They might also put
some of the strategies into play. Implementation continues after the
visit as practices try the strategies. Subsequent site visits assist
with the final step, confirmation,
where practices choose between continuing and discontinuing the
attempted strategies for improvement.
Characteristics of the planning process are drawn from additional
theory to promote a willingness to experiment and see which
approach emerges as the best solution (Plsek, 2001). A willingness to
experiment and to be flexible during implementation has been identi-
fied as an important component for successful hospital-based quality
improvement processes (Shortell et al., 1995). Practices are encour-
aged to borrow ideas from others with the advance knowledge that
adaptations may be needed. Experimenting can unsettle practice staff
who have been trained to follow protocols and may respond uneasily
to uncertainty. The site visitor validates that it may
seem to be a chaotic process but reassures the staff that these experi-
ments are a good way to find what works best for the practice.
Second, the combination of participatory planning with perfor-
mance feedback during periodic site visits is an application of
learning organization theory. Repeating cycles of improvement allow
each cycle to build on the preceding one (Berwick &amp; Nolan, 1998).
The practices plan and act between visits, with each visit
motivating the practice and providing a sense of accountability.
The intervention also recognizes the importance of helping a group
reach a critical mass of support for change (…, 2000). Site visitors
encourage the practice to identify representatives
of different subgroups (e.g., nursing, registration staff, providers) to
act as the improvement effort leaders. The leaders tap into innate
desires for group affiliation to influence attitudes and behaviors and
recruit the critical mass of members needed for change.
Site visitors encourage practices to involve staff in planning new
roles to help the practice adhere more systematically to guidelines.
Rather than formal team-building work (Taplin et al., 1998), PPRNet
addresses teams indirectly by
encouraging staff and clinicians to work together to select goals,
streamline communications, clarify roles, and devise new strategies
for guideline adherence. Two rationales for using this team approach
to practice change can be drawn from diffusion of innovations
theory. First, team members act as innovators; these people then
interact with others in their constituencies,
encouraging them to adopt the change. In comparison, a small group
of physician leaders acting as innovators has limited opportunities to
influence the adoption of their ideas by other practice members. Sec-
ond, the team is better able to design a solution that fits the local con-
text and is, therefore, more acceptable to the larger group and
subcultures. Changes are also more likely to succeed when they can be
inserted into existing routines (Shojania &amp; Grimshaw, 2005).
PPRNet’s participatory planning assumes a mixed team of clinicians
and staff will be more successful adjusting strategies to site-specific
context and workflow.
PPRNet’s participatory planning also draws on empirical evidence
of barriers to guideline adherence (Cabana et al., 1999; Lomas, 1994;
Woolf, 2000). Screening for barriers can help practices move more
efficiently through the change process (Logan & Graham, 1998;
Moulding, Silagy, &amp; Weller, 1999). The site visitor is aware of
common barriers, including some the practice
might not have vocalized. Some of these will be minor logistical
issues that can be solved easily. Other barriers, such as problems
with practice culture, are harder to address; the site visitor offers
suggestions on a case-by-case basis. Experts agree that changing
culture is more difficult than changing process (Bradley et al.,
2004) and that changing large organizations is more difficult than
changing small ones (Shortell et al., 1995; Solberg et al., 2000).
We can predict that practices with greater
needs for culture change and larger practices needing organizational
development might move more slowly and need more assistance.
There is a growing body of evidence supporting learning
collaboratives for health care improvement (Gordon & Chin, 2004;
Wagner et al., 2001). In a collaborative, teams from different
health care organizations work on a common problem. Participants
receive clinical, technical, and social support for
their improvement effort. The period between meetings is used by
teams to implement changes at home.
PPRNet’s annual network meetings bring together representatives of
each practice in the network. Provider and staff representatives
attend and receive additional training related to the project, best
practices are recognized, top performers describe their approach to
improvement, and sessions allow attendees with similar roles in
different practices to share ideas. Two expected results are that
the practices will share a larger pool of improvement strategies and
that presenters of best practices will feel pride in their work, in
turn strengthening their practice’s culture of innovation.
The network meeting was developed in response to diffusion of
innovations theory, which emphasizes contact between innovators,
early adopters, and others. Attendees are more likely to consider
adopting a new approach in this context because
they are given firsthand knowledge that it worked well with others
who tried it (Rogers, 1962). Meeting attendees may be socially
motivated out of a desire to compete or keep up, or because a peer
presenter made successful change seem possible. The presenters
function as opinion leaders. One
popular presentation showed, for example, how a practice success-
fully engaged staff in taking on tasks that had previously just been
done by physicians, to deliver better care and activate patients to
achieve goals. Clinicians and staff from the practice presented
positive views of their effort.
Multimethod interventions are appropriate when the aims of a
project are broad. A-TRIP seeks to help practices
improve indicators representing a broad spectrum of primary care
areas by encouraging them to steadily test and adopt new improve-
ment strategies in the context of daily practice. To succeed, the inter-
vention must address multiple factors (Counte & Meurer, 2001;
Solberg et al., 2000), particularly the practice’s individual and group
motivations and abilities to focus on new tasks in an already busy
workday. These aims logically result in a pluralistic intervention,
drawing from many paradigms and theories (Kirby, 2000; Van
The methods used in PPRNet’s intervention are not new and have
been described elsewhere, though they are sometimes called by other
names (… 2004; Solomon &amp; Powers, 2002; Stone et al., 2002).
Together, the
methods target individual behaviors and the function of a group. This
article presented the logic behind these methods based on evidence
and a set of personal and organizational theories: the transtheoretical
modelandsocialinfluence, motivation, adultlearning, learning orga-
nizations, organizational change, complex adaptive systems, and
diffusion of innovations theories.
Complexity theory asserts that one cannot change the behavior of
all of the providers all of the time. Combinations of the methods we
use have been tried elsewhere with
mixed success (de Fine Olivarius, Beck-Nielsen, Helms Andreasen,
82 Evaluation & the Health Professions / March 2006
Verstappen et al., 2003; Wells et al., 2000). With continued research,
work best for specific kinds of changes, as well as the amount of time
tion period could be a confounding factor in studies that focus on the
effectiveness of a particular method. Meanwhile, we need to deter-
mine whether our assumptions are valid. Does the PPRNet interven-
tion lead to the intermediary determinants we feel are necessary for
improved outcomes? Does the evidence support the relationship
between intermediary determinants and improved performance on
outcomes? Are there contextual factors that can be manipulated to
support the success of the methods? Can we achieve improvements on a larger scale? Large-scale studies of this type are beginning to be conducted (A. E. Walker
et al., 2003).
PPRNet practices have already undergone the major organizational transformation of "going electronic" and have elected to join a network specializing in quality improvement. If the intervention were offered to other practices, it might have to address participation issues that PPRNet does not face, and results might not generalize to others. The intervention presented has been designed to translate research into practice in smaller medical offices with EMRs. The majority of physicians in the United States are in practices of eight or fewer (Cunningham, 2004), and although EMR use rates are still low, many physicians are interested in using them (R. H. Miller, Sim, & Newman, 2003). To an extent, these small practices resemble the PPRNet practices. Research that is conducted within large health care organizations cannot be assumed applicable: these larger groups operate with multilayered management structures and resources unlike those in small office practice. Meanwhile, the borrowing of theories from other disciplines underscores the need for collaboration with basic scientists to develop our own implementation theory (Ginexi & Hilton, in press).
A logic linking intervention methods to personal and organizational influences has been proposed to improve the design and evaluation of interventions that address those influences. When we are explicit about the assumptions behind interventions, we can test these assumptions along with the overall effects. In this way, the theory behind these specific endeavors will evolve, promoting better understanding of how to translate medical research into practice.

REFERENCES
Allison, J., Kiefe, C. I., & Weissman, N. (1999). Can data-driven benchmarks be used to set the goals of Healthy People 2010? American Journal of Public Health, 89(1), 61-65.
Avorn, J., & Soumerai, S. (1983). Improving drug therapy decisions through educational outreach: A randomized controlled trial of academically based "detailing." New England Journal of Medicine, 308(24), 1457-1463.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84, 191-215.
Berlowitz, D. R., Young, G. J., Hickey, E. C., Saliba, D., Mittman, B. S., Czernowski, E., et al.
(2003). Quality improvement in nursing homes. Health Services Research, 38(1), 65-81.
Bero, L. A., Grilli, R., Grimshaw, J. M., Harvey, E., Oxman, A. D., & Thomson, M. A. (1998). Closing the gap between research and practice: An overview of systematic reviews of interventions to promote the implementation of research findings. British Medical Journal, 317, 465-468.
Annals of Internal Medicine, 128, 289-292.
Borbas, C., Morris, N., McLaughlin, B., Asinger, R., & Gobel, F. (2000). The role of clinician
opinion leaders in guideline implementation and quality improvement (review). Chest,
Bradley, E. H., Webster, T. R., Baker, D., Schlesinger, M., Inouye, S. K., Barth, M. C., et al.
(2004). Translating research into practice: Speeding the adoption of innovative health care
programs [Issue Brief]. New York: Commonwealth Fund.
Buntinx, F., Winkens, R., Grol, R., & Knottnerus, J. A. (1993). Influencing diagnostic and preventive performance in ambulatory care by feedback and reminders: A review. Family Practice, 10, 219-228.
Cabana, M. D., Rand, C. S., Powe, N. R., Wu, A. W., Wilson, M. H., Abboud, P. A. C., et al.
(1999). Why don’t clinicians follow practice guidelines? A framework for improvement.
Journal of the American Medical Association, 282(15), 1458-1465.
Christensen, S., & Westenholz, A. (2000). Collective decision making: Toward a relational perspective. American Behavioral Scientist, 43(8), 1301-1315.
Cohen, D., McDaniel, R. R., Crabtree, B., Ruhe, M. C., Weyer, S. M., Tallia, A., et al. (2004). A
practice change model for quality improvement in primary care practice. Journal of
Healthcare Management, 49(3), 155-168.
Counte, M. A., & Meurer, S. (2001). Issues in the assessment of continuous quality improvement implementation in health care organizations. International Journal for Quality in Health
Care, 13(3), 197-207.
Crabtree, B. F., Miller, W. L., Aita, V. A., Flocke, S. A., & Stange, K. C. (1998). Primary care
practice organization and preventive services delivery: A qualitative analysis. Journal of
Family Practice, 46, 403-409.
Cunningham, R. (2004). Professionalism reconsidered: Physician payment in a small-practice
environment. Health Affairs, 23(6), 36-47.
Davis, D., O'Brien, M. A. T., Freemantle, N., Wolf, F., Mazmanian, P., & Taylor-Vaisey, A. (1999). Impact of formal continuing medical education: Do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? Journal of the American Medical Association, 282(9), 867-874.
de Fine Olivarius, N., Beck-Nielsen, H., Helms Andreasen, A., Horder, M., & Pedersen, P. A.
(2001). Randomised controlled trial of structured personal care of type 2 diabetes mellitus.
British Medical Journal, 323, 1-9.
Donaldson, M. S., & Mohr, J. J. (2000). Exploring innovation and quality improvement in health care micro-systems: A cross-case analysis. Washington, DC: Institute of Medicine–Committee
on Quality of Health Care in America, National Academy Press.
Eisenberg, J. (1986). Doctors’decisions and the cost of medical care: The reasons for doctors’
practice patterns and ways to change them. Ann Arbor, MI: Health Administration Press.
Evans, R. S., Pestotnik, S. L., Classen, D. C., et al. (1998). A computer-assisted management program for antibiotics and other antiinfective agents. New England
Journal of Medicine, 338(4), 232-238.
Feifer, C., Fifield, J., Ornstein, S., Karson, A. S., Bates, D. W., Jones, K. R., et al. (2004). From research to daily clinical practice: What are the challenges in "translation"? Joint Commission Journal on Quality and Safety, 30(5), 235-245.
Feifer, C., & Ornstein, S. M. (2004). Strategies for increasing adherence to clinical guidelines and improving patient outcomes in small primary care practices. Joint Commission Journal on Quality and Safety, 30(8), 432-441.
Flottorp, S., Havelsrud, K., & Oxman, A. D. (2003). Process evaluation of a cluster randomized
trial of tailored interventions to implement guidelines in primary care—Why is it so hard to
change practice? Family Practice, 20(3), 333-339.
Gabbay, J., & le May, A. (2004). Evidence based guidelines or collectively constructed "mindlines?" Ethnographic study of knowledge management in primary care. British Medical Journal, 329, 1013.
Gilbody, S., Whitty, P., Grimshaw, J., & Thomas, R. (2003). Educational and organizational
interventions to improve the management of depression in primary care: A systematic
review. Journal of the American Medical Association, 289(23), 3145-3151.
Ginexi, E., & Hilton, T. (in press). Bridging the basic/applied science link: Back and forward
translation. Evaluation & the Health Professions.
Glaser, B. G. (1998). Doing grounded theory: Issues and discussions. Mill Valley, CA: Sociology Press.
lations: Case studies of redesign and change through a learning collaborative. New York:
Greco, P. J., & Eisenberg, J. M. (1993). Changing physicians' practices. New England Journal of Medicine, 329(17), 1271-1274.
Greer, A. L. (1988). The state of the art versus the state of the science: The diffusion of new medical technologies into practice. International Journal of Technology Assessment in Health
Care, 4, 5-26.
Griffin, S., & Kinmonth, A. (2000). Systems for routine surveillance for people with diabetes mellitus (The Cochrane Database of Systematic Reviews). Cochrane Library. Oxford: Update Software.
Grol, R., & Grimshaw, J. (1999). Evidence-based implementation of evidence-based medicine. Joint Commission Journal on Quality Improvement, 25(10), 503-513.
Gustafson, D. H., Sainfort, F., Eichler, M., Adams, L., Bisognano, M., & Steudel, H. (2003).
Developing and testing a model to predict outcomes of organizational change. Health Ser-
vices Research, 38(2), 751-756.
Hulscher, M., Laurant,M., Wensing,M., & Grol, R. (1999).Planning,monitoring,and describ-
ing interventions. In T. Thorsen & M. Makela (Eds.), Changing professional practice: The-
ory and practice of clinical guidelines implementation (pp. 133-152). Copenhagen, Den-
mark: Danish Institute for Health Services Research and Development.
Hunt, D. L., Haynes, R. B., Hanna, S. E., & Smith, K. (1998). Effects of computer-based clinical decision support systems on physician performance and patient outcomes: A systematic review. Journal of the American Medical Association, 280(15), 1339-1346.
Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academy Press.
Kirby, D. (2000). Logic models: A useful tool for designing, strengthening, and evaluating pro-
grams to reduce adolescent pregnancy. Scotts Valley, CA: ETR Associates.
Knowles, M. S. (1970). The modern practice of adult education. Andragogy versus pedagogy.
Englewood Cliffs, NJ: Prentice Hall.
A. Brehony (Eds.), Marketing health behaviors: Principles, techniques and applications
(pp. 23-39). New York: Plenum.
ary 16, 2002, from http://leadertoleader.org/leaderbooks/L2L/fall98/kotter.html
tems Management, 3(4), 19-25.
Logan, J., & Graham, I. D. (1998). Toward a comprehensive interdisciplinary model of health
care research use. Science Communication, 20(2), 227-246.
Medicine, 69(10), S95-S101.
Miller, W. L., McDaniel, R. R., Crabtree, B. F., & Stange, K. C. (2001). Practice jazz: Understanding variation in family practices using complexity science. Journal of Family Practice, 50(10), 872-878.
Miller, R. H., Sim, I., & Newman, J. (2003). Electronic medical records: Lessons from small physician practices. Oakland: California HealthCare Foundation.
Miller, R. H., West, C., Brown, T. M., Sim, I., & Ganchoff, C. (2005). The value of electronic
health records in solo or small group practices. Health Affairs, 24(5), 1127-1137.
Mittman, B., Tonesk, X., & Jacobson, P. (1992). Implementing clinical practice guidelines:
Social influence strategies and provider behavior change. Quality Review Bulletin, 18(12), 413-422.
Moulding, N. T., Silagy, C. A., & Weller, D. P. (1999). A framework for effective management of change in clinical practice: Dissemination and implementation of clinical practice guidelines. Quality in Health Care, 8(3), 177-183.
Ornstein, S., Jenkins, R. G., Nietert, P. J., et al. (2004). A multimethod quality improvement intervention to improve preventive cardiovascular care: A cluster randomized trial. Annals of Internal Medicine, 141(7), 523-532.
Ovretveit, J., & Gustafson, D. (2003). Improving the quality of health care: Using research to
inform quality programs. British Medical Journal, 326, 759-761.
Oxman, A. D., Thomson, M. A., Davis, D. A., & Haynes, R. B. (1995). No magic bullets: A systematic review of 102 trials of interventions to improve professional practice. Canadian
Medical Association Journal, 153(10), 1423-1431.
Plsek, P. (2001). Redesigning health care with insights from the science of complex adaptive systems. In Crossing the quality chasm: A new health system for the 21st century (pp. 322-335). Washington, DC: National Academy Press.
Prochaska, J., & DiClemente, C. (1984). The transtheoretical approach: Crossing traditional
boundaries of change. Homewood, IL: Dow Jones/Irwin.
Rogers, E. M. (1962). Diffusion of innovations. New York: Free Press.
Rousseau, N., McColl, E., Newton, J., Grimshaw, J., & Eccles, M. (2003). Practice based, longitudinal, qualitative interview study of computerised evidence based guidelines in primary
care. British Medical Journal, 326(7384), 314.
Senge, P. (1990). The fifth discipline: The art and practice of the learning organization. New
York: Currency Doubleday.
Shea, S., DuMouchel, W., & Bahamonde, L. (1996). A meta-analysis of 16 randomized controlled trials to evaluate computer-based clinical reminder systems for preventive care in the ambulatory setting. Journal of the American Medical Informatics Association, 3(6), 399-409.
Shojania, K. G., & Grimshaw, J. M. (2005). Evidence-based quality improvement: The state of
the science. Health Affairs, 24(1), 138-150.
Shojania, K. G., McDonald, K. M., Wachter, R. M., & Owens, D. K. (2004). Closing the quality gap: A critical analysis of quality improvement strategies, Volume 1–Series overview and
methodology (Technical Review 9 AHRQ Publication No. 04-0051-1). Rockville, MD:
Agency for Healthcare Research and Quality.
Shortell, S., Bennett, C., & Byck, G. (1998). Assessing the impact of continuous quality
improvement on clinical practice: What will it take to accelerate progress? Milbank Quar-
terly, 76(4), 1-37.
Shortell, S. M., O'Brien, J. L., Carman, J. M., et al. (1995). Assessing the impact of continuous quality improvement/total quality management: Concept versus implementation. Health Services Research, 30(2), 377-401.
Solberg, L., Brekke, M., Fazio, C., Fowles, J., Jacobsen, D., Kottke, T., et al. (2000). Lessons
from experienced guideline implementers: Attend to many factors and use multiple strate-
gies. Joint Commission Journal on Quality Improvement, 26, 171-188.
under pay for performance. Oakland: California HealthCare Foundation.
Spencer, E., Swanson, T., Hueston, W. J., & Edberg, D. L. (1999). Tools to improve documentation of smoking status: Continuous quality improvement and electronic medical records.
Archives of Family Medicine, 8(1), 18-22.
Stevenson, K., Baker, R., Farooqi, A., Sorrie, R., & Khunti, K. (2001). Features of primary health care teams associated with successful quality improvement of diabetes care: A qualitative
study. Family Practice, 18(1), 21-26.
Stone, E. G., Morton, S. C., Hulscher, M. E., Maglione, M. A., Roth, E. A., Grimshaw, J. M., et al. (2002). Interventions that increase use of adult immunization and cancer screening services: A meta-analysis. Annals of Internal Medicine, 136(9), 641-651.
Sussman, S., Valente, T. W., Rohrbach, L. A., Skara, S., & Pentz, M. A. (2006). Translation in the health professions: Converting science into action. Evaluation & the Health Professions, 29(1), 7-32.
Taplin, S., Galvin, M. S., Payne, T., Coole, D., & Wagner, E. (1998). Putting population-based care into practice: Real option or rhetoric? Journal of the American Board of Family Practice, 11(2), 116-126.
Thomson O'Brien, M. A., Oxman, A. D., Davis, D. A., Haynes, R. B., Freemantle, N., & Harvey, E. L. (2000a). Audit and feedback: Effects on professional practice and health care outcomes (The Cochrane Database of Systematic Reviews [computer file]). Cochrane Library. Oxford: Update Software.
Thomson O'Brien, M. A., Oxman, A. D., Davis, D. A., Haynes, R. B., Freemantle, N., & Harvey, E. L. (2000b). Educational outreach visits: Effects on professional practice and health care outcomes (The Cochrane Database of Systematic Reviews [computer file CD000409]). Cochrane Library. Oxford: Update Software.
Thomson O'Brien, M. A., Oxman, A. D., Haynes, R. B., Davis, D. A., Freemantle, N., & Harvey, E. L. (2000c). Local opinion leaders: Effects on professional practice and health care outcomes (The Cochrane Database of Systematic Reviews [computer file CD000125]). Cochrane Library. Oxford: Update Software.
Tierney, W. M., Hui, S. L., & McDonald, C. J. (1986). Delayed feedback of physician performance versus immediate reminders to perform preventive care: Effects on physician compliance. Medical Care, 24, 659-666.
van Bokhoven, M. A., Kok, G., & van der Weijden, T. (2003). Designing a quality improvement intervention: A systematic approach. Quality and Safety in Health Care, 12, 215-220.
sicians: A randomized trial. Journal of the American Medical Association,
Wagner, E. H. (1998). Chronic disease management: What will it take to improve care for
chronic illness? Effective Clinical Practice, 1(1), 2-4.
Wagner, E. H., Glasgow, R. E., Davis, C., Bonomi, A. E., Provost, L., McCulloch, D., et al.
(2001). Quality improvement in chronic illness care: A collaborative approach. Joint Com-
mission Journal on Quality Improvement, 27(2), 63-80.
Walker, A. E., Grimshaw, J., Johnston, M., Pitts, N., Steen, N., & Eccles, M. (2003). PRIME—PRocess modelling in ImplemEntation research: Selecting a theoretical basis for interventions to change clinical practice. BMC Health Services Research, 3(1), 22.
development to prevent diabetes complications: Assessment of barriers in an urban clinic.
Diabetes Care, 18(9), 1291-1293.
Wells, K. B., Sherbourne, C., Schoenbaum, M., et al. (2000). Impact of disseminating quality improvement programs for depression in managed primary care: A randomized controlled trial. Journal of the American Medical Association, 283(2), 212-220.
T Technologies Inc.
Journal of Family Practice, 49(2), 126-129.