Practitioner Expertise in Evidence-Based
Practice Decision Making
Stanley G. McCracken
Jeanne C. Marsh
University of Chicago
Evidence-based practice (EBP) is an orientation to practice that values evidence as a resource for clinical decision making while recognizing that evidence alone is never sufficient to make a clinical decision. Critics of EBP typically ignore, negate, or misrepresent the role of practitioner thinking processes and expertise in clinical settings. The authors believe that, far from being a mechanistic process that ignores practitioner expertise, reflection and critical thinking are essential to implementing EBP in real-world clinical practice. The purpose of this article is to provide guidance for how practitioners bring their expertise to bear when engaging in the process of EBP. The authors use a social work practice scenario to illustrate the application of practitioner expertise in each of the five steps of EBP.
Keywords: evidence-based practice; practitioner expertise; clinical decision making; research implementation;
social work practice; clinical research
Evidence-based practice (EBP) is a process of using research findings to aid clinical decision making. EBP promotes the collection, interpretation, and utilization of evidence that has been derived from client reports, clinician observations, and empirical research. EBP is an orientation to practice that values evidence as a resource for clinical decision making while recognizing that evidence alone is never sufficient to make a clinical decision (Berlin & Marsh, 1993; Guyatt & Rennie, 2002; Sackett et al., 1996; Weisz & Addis, 2006).
Practitioner expertise is an important element in EBP
decision making. Although there has been an increasing
interest in the functioning of experts and expertise in a
variety of settings (Dreyfus & Dreyfus, 2005; Ericsson
et al., 2006; Mieg, 2001), little of this work has focused
on social work practice or EBP in social work. Sheppard et al. (2000) provide an example of social work scholarship concerned with the role of expertise. They identify specific thinking processes related to social work decision making. Berlin and Marsh (1993) also examine the knowledge and information, as well as the thinking processes, required to draw inferences and make decisions in practice. They review the research on human information processing, perception, and problem solving as it relates to clinical decision making to understand how the practitioner shapes, organizes, and uses information in practice.
To make sense of the large quantities of information they confront daily, clinicians, like all human beings, rely heavily on judgmental heuristics, or rules of thumb. Clinical decision making, like human information processing more generally, is vulnerable to predictable sources of bias and is improved by both evidence and decision aids that guide and formalize the decision-making process.
Critics of EBP typically ignore, negate, or misrepresent the role of practitioner thinking processes and expertise in clinical settings. Rather than contributing to the routinized, automatic decision making that can characterize conventional approaches to practice, EBP requires reflection and critical thinking. Although the exigencies of everyday helping require some reliance on shortcuts or tried and true methods, EBP pushes the practitioner to improve the quality of decisions made by systematically reviewing information from rigorous data-gathering efforts instead of relying on customary practice or agency policy.
Authors’ Note: This manuscript was based in part on a presentation by the
second author at the conference titled What Works: An International
Conference, held at the University of Bielefeld, Germany, on November 10-
12, 2005. The authors would like to acknowledge the contribution of Bridget
Colaccio Wesley and Elisabeth Kinnel who initially presented the case study
used as the basis for the case discussed in this article. While the case material has been further modified and is an amalgamation of cases seen in their
program, the clinical question, search, and decisions reflect those made by
Ms. Colaccio Wesley with the assistance of Ms. Kinnel. The authors are
grateful for the comments of Chris Leiker on an earlier draft of this article.
This article was invited and accepted by the editor.
Research on Social Work Practice, Vol. 18 No. 4, July 2008 301-310
DOI: 10.1177/1049731507308143
© 2008 Sage Publications
We define practitioner expertise as a set of cognitive tools that aid in the interpretation and application of evidence. This set of tools for thinking develops over a period of extended practice, such that the individual with experience in a decision area is likely to respond very differently from the novice. We suggest that practitioner expertise includes three overlapping knowledge and skill sets: clinical, technical, and organizational. The clinical component includes knowledge, skills, and experience related to direct practice with client systems and includes diagnostic, assessment, and goal/problem formulation skills; engagement, relationship enhancing, and other knowledge and skills related to the communication of empathy, warmth, and genuineness; and knowledge of theory and mastery of skills related to specific models and interventions (Barlow, 2004; Lambert & Barley, 2001).
Technical knowledge and skills related to formulating questions, conducting an electronic search, and evaluating validity and reliability of findings are needed in order to use evidence to make real-time decisions about practice (Gibbs, 2003). Finally, practitioners who deliver services in a team or agency context need skills and knowledge related to teamwork, organizational design and development, and leadership, particularly when EBP is used to inform decision making in program development (McCracken & Corrigan, 2004).
The purpose of this article is to provide guidance for practitioners seeking to bring their expertise to bear when engaging in the process of EBP. Practitioners not only need to be able to identify the best evidence to address the client's concerns but also must use their expertise to interpret the evidence and apply it appropriately to the client's situation (Haynes, Devereaux, & Guyatt, 2002). We will use a practice scenario to illustrate how clinical, technical, and organizational expertise are used in each of the steps of EBP. We will begin each section with the practice scenario and then comment on the decisions and actions taken by the practitioner.
Ms. W is a social worker in a community agency that
provides in-home and agency-based services to parents
and children involved with the state department of child
and family services and the foster care system. She was
referred a 5-year-old, Mexican American girl, Marie,
who was living with foster parents and was in the
process of being adopted by them. Marie had been
removed from her mother a year earlier because of
physical abuse, neglect, and allegations of having been
fondled by her mother’s boyfriend. At the time she was
removed from her biological mother, she was malnourished and had bruises and burns on her body. Prior to
bringing Marie to live with them, the foster parents had
expressed an interest in adopting an infant. Marie was
referred to the agency because her foster parents (in
their early 20s) wanted to adopt her but were concerned
because she was engaging in sexualized play with her
toys. The client/family presenting problems include
Marie’s anxious and inappropriate sexual behavior at
home and school, foster parents’ fear and anxiety about
parenting Marie, foster parents’ lack of understanding of
child development and inappropriate expectations of
Marie, disrupted attachment to her biological mother,
and concerns about the lack of attachment between
Marie and her foster parents. In the course of her evaluation, Ms. W gathered data from Marie, her parents, and
her teachers; she observed Marie alone and interacting
with her parents in the office and at home; and she
reviewed department of child and family services
records. Clinical questions: How should the presenting
problems be prioritized? What intervention would most
effectively address these problems?
Decision-Making Steps in EBP
EBP can be defined in terms of five steps that serve
to structure decision making and ensure optimum use of
practitioner expertise (Sackett et al., 2000). In the
remainder of the article, we move through the steps, indicating the specific skill sets required and how practitioner expertise comes into play at each point in the decision-making process. The steps are as follows:
1. convert the need for information into an
answerable question;
2. track down with maximum efficiency the best
evidence with which to answer that question;
3. critically appraise that evidence for its validity
and usefulness;
4. integrate the critical appraisal with practitioner
clinical expertise and with the client values,
preferences, and clinical circumstances and
apply the results to practice; and
5. evaluate the outcome.
Step 1: Convert the Need for Information Into an Answerable Question

Practice Scenario
The first clinical decision for Ms. W was, “How
should Marie and her family’s presenting problems be
prioritized?” Specifically, she wondered whether the
problems could be prioritized and clustered in such a
way that one intervention could address several of the
most important problems simultaneously. She determined that her information needs included the clinical information about Marie, her foster parents, and their interaction; parent preferences for treatment; and department of child and family services expectations for treatment. Ms. W was able to form a collaborative working
relationship with Marie and her foster family during the
assessment. In addition to the other information gathered,
Ms. W learned that Marie’s biological mother had made
no attempt to contact the child or the foster family during
the ten months after Marie was removed from the home.
Based on the information gathered during assessment, she
determined that the clinical priorities were (a) monitoring
and strengthening the attachment between Marie and the
foster parents and helping Marie process the abuse trauma,
(b) teaching parenting skills and providing information to
the parents, and (c) modifying/regulating Marie's anxious and inappropriate sexual behavior at home and school. Based on a previous search for background information about child abuse, Ms. W's reasoning for prioritizing the problems in this order was that both the anxious and inappropriate sexual behavior and the problems with attachment were probably related to Marie's history of physical and sexual abuse and disrupted attachment to her biological mother. She also believed that it might be possible to identify activities that would address both the bond between Marie and her foster parents and the lack of parenting skills.
Agency expectations were that treatment would be
provided in the client’s home, initial treatment would
be short term, and ethical standards and legal responsibilities would be upheld.
The second clinical decision for Ms. W was, "What intervention would most effectively address the problems identified?" Ideally, this intervention should either include the parents or be taught to the parents so that they would increase their parenting skills and so that changes would be maintained. Her information needs were for external evidence about whether there is an effective (home-based) treatment (involving the parents) to help reduce the effects of trauma in Mexican American preschool children who have been physically or sexually abused and to increase bonding between the child and foster parents. Ms. W formulated the following initial question to guide her search of the literature: For a Mexican American preschool child in foster care who has been physically and sexually abused, what is the most effective home-based treatment involving the parents to increase bonding between the child and the parents and to decrease anxious and sexually inappropriate behaviors?
The clinical decision-making question at this step is, "Is there a need for additional information in order to identify and apply the most effective intervention for my client (system), and if so, what information do I need?" EBP begins with a well-formulated question based on a thorough evaluation of the client's condition. However, before that can occur, the therapist must engage the client and establish a collaborative working relationship using empathy, warmth, congruence, genuineness, and other factors that influence therapy outcome (Lambert & Barley, 2001). The therapist's approach to engaging and assessing the client may be influenced by background information about individual development, cognitive processing, and cultural influences on the experience and presentation of problems and on the development of interpersonal relationships (Comas-Diaz, 2006; Guyatt & Rennie, 2002).
Any clinical encounter may yield a number of potential practice questions, and expertise in assessment and diagnosis is needed to identify and help the individual prioritize problems and goals. Once these are identified, the practitioner draws upon expertise to consider whether evidence exists that would be useful to guide decision making and to formulate a question that is likely to provide relevant information to guide work with that particular client. One frequent complaint of practitioners engaged in EBP in community agencies is that there is a lack of research addressing populations seen in practice, for example, individuals with dual disorders and economically disadvantaged and minority clients (McCracken, Steffen, & Hutchins, 2007). A practitioner working with specific groups may need to use a multi-pronged approach to identify useful evidence, such as formulating both a problem-focused question (e.g., What sort of intervention is most likely to reduce suicidal thinking and parasuicidal behaviors in an adolescent with depression?) and a question about client characteristics (e.g., What sorts of approaches have been found to be effective with Hispanic adolescents?).
Step 2: Track Down the Best Evidence With Which to Answer the Question

Practice Scenario
Ms. W had used a wide variety of search engines in
her graduate training; however, she had access to none
of these search engines (e.g., OVID, PsycINFO, Social Work Abstracts) at her current place of employment.
She conducted her searches in PubMed and Google Scholar. Ms. W initially included in her search all of the terms from the original question, with synonyms as alternate search terms, in hopes of locating studies specific to Marie and her situation.
However, given her previous experience in searching literature on treatment of young children, she was not surprised that search terms from the question as initially formulated failed to yield any relevant studies. This led her to reduce the number of terms in her search. Given Marie's circumstances (e.g., the foster mother was second-generation Cuban American and the foster father was of non-Hispanic, White ancestry), Ms. W felt that the age
non-Hispanic, White ancestry), Ms. W felt that the age
of the child and her history of abuse were the most
important and most fruitful components of the portion
of the question addressing client characteristics and
problem. While the fact that the child was in foster care was an important clinical consideration, Ms. W felt that it was likely that an intervention that would improve bonding in parents also would increase bonding in foster parents. She decided that she would drop this from her question, though she would note whether samples included foster children and, if so, weight these more heavily in her assessment of the relevance of the literature to
her client. Finally, she felt that it would be wise to search
for the most effective intervention regardless of whether
or not the research was on home-based services and
whether or not it included parents. She decided that in-home delivery of services involving the parents could be addressed as implementation issues and should not restrict the range of studies examined. Her revised question was, therefore, "For a preschool child who has been physically or sexually abused, what intervention is most effective in increasing bonding between the parents and the child and in decreasing anxiety and inappropriate sexual behaviors?" Ms. W also added search terms that
would identify high-quality research (e.g., controlled,
random, double-blind, meta-analysis, systematic review).
When she searched terms related to the broader question,
she identified several studies that appeared appropriate.
This search also led her to the Web site for the National
Child Traumatic Stress Network, which allowed her to
access the full text of several useful articles.
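Ms. W's search strategy, ANDing together clinical concepts (each expanded with ORed synonyms) and a block of methodological filter terms, can be sketched as a small query-building routine. This is purely an illustrative sketch, not part of the original article; the function name and the term lists are hypothetical:

```python
def build_search_query(concepts, quality_filters):
    """Build a boolean search query: synonyms within a concept are ORed,
    concepts are ANDed, and methodological filter terms are ORed into a
    final block intended to flag high-quality research designs."""
    blocks = ["(" + " OR ".join(terms) + ")" for terms in concepts]
    blocks.append("(" + " OR ".join(quality_filters) + ")")
    return " AND ".join(blocks)

# Hypothetical terms echoing the revised question in the scenario.
query = build_search_query(
    concepts=[
        ["preschool child", "young child"],
        ["physical abuse", "sexual abuse"],
        ["attachment", "bonding"],
    ],
    quality_filters=["randomized controlled trial", "meta-analysis",
                     "systematic review"],
)
print(query)
```

Broadening the search, as Ms. W did when her initial question yielded nothing, then corresponds to dropping a concept block (e.g., the foster care terms) rather than rewriting the whole query.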
Once it is determined that additional information is needed and a practice question is formulated, the clinical decision-making issue is, "What is the best place to get this information, and how do I structure my search to efficiently get the information I need?" Both the effectiveness of the search (i.e., locating the evidence needed to inform the clinical decision) and efficiency in conducting the search are important. Practitioner expertise is necessary at this step to identify potentially useful search terms and to reject those that do not fit the client's condition. Experience and expertise also guide selection of the level of evidence most appropriate to answering the question (e.g., single studies, systematic reviews, synopses, or systems), as well as the research design, and thus the methodological search terms most appropriate to the question (Guyatt & Rennie, 2002; Heneghan & Badenoch, 2006). Expertise at this step also is important for conducting searches efficiently. The top priority for practitioners is providing services to clients, while the incentive system often rewards the quantity of care provided (Weisz & Addis, 2006). Efficiency is essential for EBP to be a realistic tool for clinical decision making in the real world. Expertise at this step helps avoid dead ends in the search and provides the practitioner with confidence that the results of the search are an accurate reflection of the state of the evidence on the question being asked.
Step 3: Critically Appraise the Evidence for Its Validity and Usefulness

Practice Scenario
Ms. W successfully located randomized controlled trials of interventions that addressed the treatment of children who have been abused or that targeted increasing the bonding/attachment between child and parent. She evaluated the quality of several of these studies using the Quality of Study Rating Form (Gibbs, 2003) and found promising research supporting two different interventions: cognitive behavioral treatment (CBT; e.g., Cohen et al., 2004) and child parent psychotherapy (CPP; Lieberman, Van Horn, & Ippen, 2005; Toth et al., 2002). She found that the external evidence supporting CBT for symptoms of posttraumatic stress disorder in children (several randomized clinical trials plus a meta-analysis) was much stronger than the evidence supporting CPP, which consisted of two randomized clinical trials, a six-month follow-up study, and a study examining different outcome measures. However, when she considered the relevance of the studies for her clients, she found several factors that suggested CPP might be a better choice. First, CPP has been studied with preschool children, while most of the literature on CBT has examined older children. Second,
CPP specifically focuses on the relationship between the mother and the child. Finally, the sample in the study by Lieberman and colleagues (2005) was multi-ethnic and drawn from mixed socioeconomic levels. On the other hand, CBT has been studied in children whose symptoms of posttraumatic stress disorder resulted from abuse of the children themselves, while the children in one of the studies of CPP experienced symptoms of posttraumatic stress disorder as a result of abuse to their mothers (Lieberman et al., 2005). Considering both the research evidence and these applicability factors, Ms. W gave particular weight to the age of the child and to the fact that healing within CPP is hypothesized to occur within the context of a safe and secure attachment between mother and child; she therefore decided that CPP would be the better intervention to use in this case.
This step includes two equally important questions for clinical decision making: "Did my search yield high-quality information? If so, does it apply to my client?" As was true of Step 2, practitioners need to be both accurate and efficient in rating the quality of the evidence located in their electronic search. Accuracy may be enhanced by training and review on research design and data interpretation and by use of user-friendly research rating forms (Gibbs, 2003; Moher, Schulz, & Altman, 2001). Efficiency requires regular practice, for example, through participation in a journal club in which quality ratings are included as part of the journal discussion. Even though evidence suggests that social workers infrequently read or use research as a source of practice knowledge (Mullen & Bacon, 2003), we have found practitioners to be quite receptive to training in research design and data interpretation if it is presented contextually, using research addressing questions generated from their own practice (McCracken et al., 2007).
Even when evidence from studies and syntheses identified in the electronic search is of high quality, it may not clearly identify one specific intervention to use with one's client. Practitioners must use their expertise when they consider the magnitude and precision of a study's effect, determine the likelihood of benefit, balance the potential benefit against harm, and assess the feasibility of the intervention both with the client and in one's practice setting (Guyatt & Rennie, 2002; Straus et al., 2005). If it appears that the results do apply to the client, the practitioner will draw upon knowledge of both the client and the relationship in considering whether the client is likely to find the intervention acceptable and, if so, whether the client is likely to
be able to follow through with the intervention. In some
cases, the evidence may address one clinical decision but raise another. The practitioner may find that the most effective intervention is not feasible in the practice setting. For example, while there is considerable evidence to support the delivery of community-based interventions, like supported employment, for individuals with severe and persistent mental illnesses, these services may not be available in a hospital-based, fee-for-service, outpatient psychiatry department (Bond & Jones, 2005). In this case, the practitioner faces a dilemma: Should the practitioner recommend that the client be referred to another agency that can offer the services supported by the literature, recommend an alternative intervention targeting a different problem (e.g., reduction of delusions instead of employment), or recommend parallel services in two agencies? In any case, the practitioner is ethically obligated to discuss the evidence and the options with the client (Gambrill, 2006).
Step 4: Integrate the Critical Appraisal With Practitioner Expertise and Client Values and Apply the Results to Practice

Practice Scenario
Ms. W found that integrating her critical appraisal of the evidence was more straightforward than the decision about which of the two interventions was likely to be more appropriate for Marie and her foster parents. The timing of this clinical decision was fortuitous; Ms. W's agency had recently received a grant that would provide training and supervision in CPP. So even though she had studied CBT during her graduate education, it was also feasible for her to provide CPP.
Similarly, she did not expect any problem using CPP in the context of her agency, since clinicians in her program were expected to deliver services in the client's home (compatible with CPP) and her agency was participating in the CPP project. The only concern that Ms. W had about using CPP was that it requires a substantial commitment of both time and effort on the part of the mother. Without this commitment, CPP would not be feasible, and Ms. W would need to consider an approach that required less of the parents. Given their clear commitment to making things work out with Marie, their eagerness for information about parenting and child care, and the fact that they had developed a collaborative and trusting relationship with her, Ms. W decided that
she would conduct a detailed discussion of the research
findings, the applicability factors, and the details of
implementing CPP. Their discussion included the
amount of time and effort needed from the foster mother
and the fact that Ms. W would be learning the approach
while she was delivering services to the family. The parents responded well to the amount and type of information provided and to the fact that they were included
in the decision about which intervention would be used.
They noted that this was very different from the sorts of
discussions that they were accustomed to having with
their health care and social service providers. In the end, the parents agreed that CPP was the better approach, and Marie's foster mother was quite willing to devote the time and effort it required.
There are a series of clinical decisions at this step: "How do I present this information to my client? Will the approach suggested by this information need to be modified, and how? What will I need to do to apply this approach with my client? What specific skills will I need to use or learn, and what is the specific implementation sequence?" Educating the client about his or her problem and the various options for intervention is one of the fundamental principles and skills of EBP (Gill & Pratt, 2005). However, while the principle of client involvement is clear, determining the role that the client would like to play in decision making, assessing the client's preferences, and providing the client with the information that he or she needs to make decisions all require considerable expertise, in spite of the development of clinical decision-analysis aids (Guyatt & Rennie, 2002; Straus et al., 2005). For example, some individuals
prefer detailed information about options, while others may be overwhelmed by too much information; some individuals seek equal partnership with the provider, while others prefer to see the provider as the professional expert who makes specific recommendations (Comas-Diaz, 2006). Some clients, such as individuals with chronic illnesses who have received services over many years from many providers, may be surprised that a provider is offering to discuss the evidence for their treatment, or even that different treatments exist. The practitioner may need to educate such clients about how to participate in the discussion.
The practitioner must also decide whether the intervention can be used as described in the literature or in a practice guideline or whether it needs to be adapted to one's client or practice environment and, if so, how. There is risk both in adapting and in failing to adapt an intervention (Proctor & Rosen, 2006). The risk of adapting is that the intervention may be less effective if one does not maintain fidelity to it. The risk of failing to adapt is that there may be elements of the intervention that make it inappropriate for a particular client or not feasible in a particular practice situation. Several factors suggest that an intervention may need to be adapted, though the presence of one or more of these factors does not necessarily mean that it must be modified. First, it may have been designed and tested with individuals whose problem configuration, personal characteristics, or situation is different from the client's. Second, the outcomes tested, while similar, may differ from those desired by the client. Third, the intervention may have been tested in a different setting or with different types of practitioners (Proctor & Rosen, 2006).
Interventions may be modified or augmented in a number of ways. For example, many interventions do not address how to establish or maintain a therapeutic relationship with the client, and the practitioner needs to supplement the guidelines with relationship expertise as described above. The manner of delivery of the intervention may need to be adapted to individuals with certain characteristics or for certain clinical situations. For example, homework assignments may need to be modified for individuals who are unable to read or write, skills may need to be taught in smaller steps to individuals with cognitive impairment, or provision for crises may need to be built into the guidelines. The frequency, intensity, duration, modality, or site of delivery may need to be modified for a particular client or practice context. For example, individual interventions may need to be adapted for use in groups if they are to be used in programs that rely heavily on this form of treatment. Modifications must be well reasoned, thoughtfully implemented, and carefully evaluated. The particular modifications will depend upon practitioner expertise and the reasons for making the modification (Proctor & Rosen, 2006).
Finally, when EBP is used for clinical decision mak-
ing in program development, practitioner organizational
expertise will be needed to implement the intervention
(Fixsen et al., 2005; McCracken & Corrigan, 2004). The
fact that an intervention is supported by evidence does
not mean that staff will want to use it. Staff buy-in and
ownership of the new intervention is essential to suc-
cessful adoption and implementation. Staff must be
adequately prepared before and supported during the
change process (Corrigan & McCracken, 1997). Before
adopting the intervention, staff must be made aware of
the intervention, must have sufficient information about
what it does and how to use it, and must be clear about
how the intervention will affect them personally. During adoption, staff must have sufficient training (preferably on-site) in how to use the intervention and in what effects to expect from it. Finally, successful adoption is more
likely if the established users are provided adequate
feedback about the consequences of adoption and have
the resources and support to adapt the intervention to
their particular situation (Hall & Hord, 1987). In addi-
tion to preparing the team for change, the practitioner
will need to decide how best to sequence the implemen-
tation of the intervention. This is important because
many interventions have multiple components that need
to be implemented and because implementing a new
intervention will have an effect on previously existing
elements of the program (Corrigan & McCracken,
1997). This entire process will be easier if the team and
other relevant stakeholders have been involved in all of
the steps from identifying the practice question to eval-
uating the evidence and if there has been administrative
support for the process. In other words, both top-down and bottom-up support are required for successful implementation (Corrigan & Boyle, 2003).
Practice Scenario
Ms. W used the Child Behavior Checklist (Achenbach,
1991), the Randolph Attachment Disorder Questionnaire
(Randolph, 1997), and behavior logs completed by the
teacher and the parents to evaluate the outcome of her work
with Marie and her foster mother. Her decision to use the Child Behavior Checklist was an easy one because it was used both in the CPP research and by a number of programs in the agency where she worked. The decision to
use the Randolph Attachment Disorder Questionnaire was
less clear and was made after she conducted a search of the
literature for a clinically useful, reliable, and valid measure
of parent-child attachment. Ms. W found several instru-
ments that had been used in dissertations and other
research, but she did not feel that any of them were feasible for her to use with her client. Although the Randolph Attachment Disorder Questionnaire focused on behaviors associated with reactive attachment disorder, it covered several behaviors of interest in Marie's case. Furthermore, the instrument was brief and sensitive to change. Finally,
Ms. W asked both the teacher and the foster parents to
keep a daily record of Marie’s sexualized play with toys
plus several anxious and fearful behaviors. Ms. W audio-
or videotaped treatment sessions and completed a thera-
pist fidelity checklist as part of her participation in the
grant-supported CPP training. It should be noted that
while Ms. W was primarily interested in the data from
the outcome measures to monitor and assess the effec-
tiveness of her intervention, the agency clinical direc-
tor and the director of her program used data from the
Child Behavior Checklist and the Randolph Attachment
Disorder Questionnaire for program development and
quality assurance.
Regardless of whether the intervention is used as
described in the literature or practice manual or whether it
is adapted to the client and situation, it should be evalu-
ated. The clinical decisions at this point are, What mea-
sures provide the best indicators of the implementation
and results of the intervention? How can this information
be gathered in a way that is both accurate and user-friendly? Ideally, both process and outcome should be evaluated; without some measure of process, it can be hard to interpret the meaning of outcomes. Corrigan and associates (1994) proposed evaluating process and outcome for both clients and staff. We would propose a third, organizational, level of evaluation, particularly when using EBP as an approach to program development.
At the client level, the process question is whether the
individual is participating in and adhering to the interven-
tion. The client outcome measure is what impact the inter-
vention has on the client’s life. For staff, the process issue has traditionally been related to the quantity of services, such as the number of sessions or contact hours. An equally
important staff process issue for EBP is the quality of ser-
vice delivery (e.g., treatment fidelity, the extent to which
the service has been delivered as intended). Staff outcome
measures address the impact of service delivery on the per-
son(s) delivering that intervention, for example, job satis-
faction, level of burnout, and job-related social support
(Corrigan et al., 1994). Staff outcome data are important
when using an EBP approach to program development
since staff reactions to the intervention influence the likeli-
hood that the intervention will be adopted and maintained
over time (Corrigan et al., 1998; McCracken & Corrigan,
2004). Organizational process measures address the cost of
developing and implementing the services and the degree to
which organizational factors either facilitate or impede the
delivery of services. Organizational outcome measures
include the impact of service delivery on fulfilling the mis-
sion and accomplishing the goals of the organization.
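For readers who track such measures in software, the three-level framework just described can be organized as a simple plan structure. The sketch below is illustrative only; the specific measure names are drawn loosely from the discussion above and are not prescribed by the sources cited.

```python
# Illustrative sketch of the three-level evaluation framework:
# process and outcome measures at the client, staff, and
# organizational levels. Measure names are hypothetical examples.
evaluation_plan = {
    "client": {
        "process": ["session attendance", "adherence to intervention"],
        "outcome": ["Child Behavior Checklist score", "daily behavior log counts"],
    },
    "staff": {
        "process": ["contact hours", "treatment fidelity checklist"],
        "outcome": ["job satisfaction", "burnout level", "job-related social support"],
    },
    "organization": {
        "process": ["implementation cost", "organizational facilitators and barriers"],
        "outcome": ["progress toward agency mission and goals"],
    },
}

def measures_by_type(plan, measure_type):
    """Collect all measures of one type ('process' or 'outcome'),
    keyed by evaluation level."""
    return {level: types[measure_type] for level, types in plan.items()}

process_measures = measures_by_type(evaluation_plan, "process")
```

A structure like this makes it easy to verify that every level has both a process and an outcome measure before the evaluation begins.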
Evaluating the intervention provides data that not only assess the effectiveness of the intervention with the client but also inform subsequent practice decisions.
Organizations should develop a mechanism for sharing
the data among practitioners in the program. As men-
tioned earlier, EBP is an iterative process. The best
available data are not always from the research litera-
ture; they may be from the practitioner’s own practice or
program (Gellis & Reid, 2004).
The case illustration of EBP decision making reveals the importance of both evidence and expertise. Moving through the five steps of a structured decision-making process for an actual clinical case vividly portrays how expertise enters into EBP. In general, the case indicates that three types of expertise come into play: clinical, technical, and organizational.
In the case scenario, the practitioner exercised signifi-
cant clinical expertise. The need for a positive, collabora-
tive working relationship in any clinical encounter is well
documented (Lambert & Barley, 2001). Clinicians seek to
communicate empathy, warmth, and genuineness to
engage the client in a productive relationship. As the case above illustrates, clinician expertise in relationship development derives from information and experience related to the client’s individual characteristics and cultural background, developmental stage, and relationship history. A positive, collaborative
relationship derives from the ongoing work of the clinician to understand and respect client preferences and values. In the case of Marie’s foster family, the clinician’s work to engage the family in prioritizing the problems and selecting the interventions exemplifies clinicians’ efforts to elicit and respect client preferences and values and, in the process, to develop the relationship.
Thus, clinician expertise comes into play in understanding
that a positive, collaborative working relationship is fun-
damental to any clinical encounter and, then, in develop-
ing the relationship throughout work with the client.
Another area where clinical expertise came into play
in the scenario was in problem formulation, goal setting,
and intervention selection. Throughout the therapeutic
encounter, clinicians and clients refine priorities and select
and evaluate interventions. This process of understanding
the client’s problems and identifying the resources available
to resolve them requires a high level of expertise. Research
on human information processing helps us to understand
that these types of decisions are based on somewhat pre-
dictable thinking processes and judgmental heuristics
(Berlin & Marsh, 1993; Hogarth, 1987; Nisbett & Ross,
1980). Clinical expertise derives from approaching decision
making with a critical stance toward the analytic processes
we use. In the case of Marie’s family, Ms. W sought to individualize her intervention and to avoid a customary, cookie-cutter approach by identifying evidence about effective
interventions for a preschool child who had been
physically or sexually abused. Thus, clinician expertise
comes into play throughout the process as the clinician
critically appraises the thinking processes used in inter-
preting and applying the evidence.
A major area where technical expertise was exercised
was in the appraisal of data and information. Once spe-
cific evidence was identified, the practitioner needed the
tools to evaluate the quality and relevance of the data
and the likelihood the approach would be beneficial
with a particular client in a particular practice setting.
The practitioner in the scenario used judgment and
expertise to determine whether a particular approach
could be adapted to a client’s environment and whether
it could be implemented in the context of the client-provider relationship. In the case of Marie’s family, Ms. W
ultimately selected CPP over CBT after appraising available
data, considering organizational context, and consulting
with her clients. Thus, we see that technical expertise
comes into play throughout the process as the clinician
critically appraises both the evidence and the processes
used in applying the evidence.
Finally, the use of organizational expertise was apparent in the scenario in several instances. The practitioner needed to consider the fit of the intervention with organizational values and practices and the organization’s capacity to implement the specific intervention. In selecting CPP over CBT,
the fact that the organization had received a grant for super-
vision and training with this model ensured the choice was
congruent with organizational practice. The existence of
this grant also ensured that staff training and supervision
would increase the likelihood that the intervention would
be implemented with fidelity. Finally, any evaluation of an
intervention is more likely to be conducted if the results are
of interest for quality assurance and program development.
In this scenario, the practitioner’s knowledge of the agency
and expertise in its functioning contributed to effective
decision making.
In conclusion, practitioner expertise is an essential ele-
ment in EBP decision making. Although we are only
beginning to understand how expertise operates in human
information processing generally and in social work prac-
tice decision making more specifically, a step-wise analy-
sis of EBP decision making reveals the thinking tools and
processes that support the use of research findings to aid
judgment in practice. Through every step in the decision-
making process, the application of clinical, technical, and
organizational expertise can be identified.
Notes
1. This scenario is provided with the permission of Bridget
Colaccio Wesley (therapist) and Elisabeth Kinnel (evidence-
based practice project coordinator), who initially presented the
case study used as the basis for the case discussed in this article.
Ms. Wesley and Ms. Kinnel work in a large family service
agency serving Chicago and surrounding suburbs. The case
material, including names, identifying characteristics, and spe-
cific details, has been changed to protect confidentiality and is
best considered an amalgamation of cases seen in their program.
However, the clinical question, search, and decisions reflect
those made by Ms. Colaccio Wesley with the assistance of Ms.
Kinnel. The clinical questions were initially formulated and
searched in 2005.
2. A note about language: Recipients of services may be
referred to in a number of ways, such as “patient,” “client,”
“consumer,” or “an individual” with a certain problem or cer-
tain characteristics. In this article, we use the term “client” to
refer to a recipient of services. While the term “client” is sin-
gular, we use this term to mean client system, which may
include one or many individuals and those around them.
3. Background questions provide basic information about
client characteristics and different kinds of problems and are
distinguished from foreground questions, which address the
specific clinical issue at hand (Guyatt & Rennie, 2002).
References
Achenbach, T. M. (1991). Manual for the Child Behavior
Checklist/4-18 and 1991 profile. Burlington: University of
Vermont Department of Psychiatry.
Barlow, D. H. (2004). Psychological treatments. American
Psychologist, 59, 869-878.
Berlin, S. B., & Marsh, J. C. (1993). Informing practice decisions.
New York: Macmillan.
Bond, G. R., & Jones, A. (2005). Supported employment. In R. E.
Drake, M. R. Merrens, & D. W. Lynde (Eds.), Evidence-based
mental health practice: A textbook (pp. 367-394). New York: Norton.
Cohen, J. A., Deblinger, E., Mannarino, A. P., & Steer, R. A. (2004).
A multisite, randomized controlled trial for children with sexual
abuse-related PTSD symptoms. Journal of the American
Academy of Child & Adolescent Psychiatry, 43, 393-402.
Comas-Diaz, L. (2006). Cultural variation in the therapeutic rela-
tionship. In C. D. Goodheart, A. E. Kazdin, & R. J. Sternberg
(Eds.), Evidence-based psychotherapy: Where practice and
research meet (pp. 81-105). Washington, DC: American
Psychological Association.
Corrigan, P. W., & Boyle, M. G. (2003). What works for mental
health system change: Evolution or revolution? Administration
and Policy in Mental Health, 30, 379-395.
Corrigan, P. W., Luchins, D. J., Malan, R. D., & Harris, J. (1994).
User-friendly continuous quality improvement for the mental
health team. Medical Interface, 7(12), 89-95.
Corrigan, P. W., & McCracken, S. G. (1997). Interactive staff train-
ing: Rehabilitation teams that work. New York: Plenum.
Corrigan, P. W., Williams, O. B., McCracken, S. G., Kommana, S.,
Edwards, M., & Brunner, J. (1998). Staff attitudes that impede
the implementation of behavioral treatment programs. Behavior
Modification, 22, 548-562.
Dreyfus, H. L., & Dreyfus, S. E. (2005). Expertise in real world con-
texts. Organization Studies, 26, 779-792.
Ericsson, K. A., Charness, N., Hoffman, R. R., & Feltovich, P. J.
(2006). The Cambridge handbook on expertise and expert per-
formance. Cambridge, UK: Cambridge University Press.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., &
Wallace, F. (2005). Implementation research: A synthesis of the
literature (FMHI publication no. 231). Tampa: University of
South Florida, Louis de la Parte Florida Mental Health Institute,
National Implementation Research Network.
Gambrill, E. (2006). Social work practice: A critical thinker’s guide
(2nd ed.). New York: Oxford.
Gellis, Z., & Reid, W. J. (2004). Strengthening evidence-based practice. Brief Treatment and Crisis Intervention, 4, 155-165.
Gill, K. J., & Pratt, C. W. (2005). Clinical decision making and the
evidence-based practitioner. In R. E. Drake, M. R. Merrens, &
D. W. Lynde (Eds.), Evidence-based mental health practice: A
textbook (pp. 95-122). New York: Norton.
Gibbs, L. E. (2003). Evidence-based practice for the helping profes-
sions. Pacific Grove, CA: Brooks/Cole.
Guyatt, G., & Rennie, D. (2002). Users’ guides to the medical liter-
ature: Essentials of evidence-based clinical practice. Chicago:
American Medical Association.
Hall, G. E., & Hord, S. M. (1987). Change in schools: Facilitating
the process. Albany: State University of New York.
Haynes, R. B., Devereaux, P. J., & Guyatt, G. H. (2002). Clinical
expertise in the era of evidence-based medicine and patient
choice. Evidence-Based Medicine, 7, 36-38. Retrieved November 24, 2006.
Heneghan, C., & Badenoch, D. (2006). Evidence-based medicine
toolkit (2nd ed.). Malden, MA: Blackwell.
Hogarth, R. M. (1987). Judgment and choice: The psychology of
decision. New York: John Wiley.
Lambert, M. J., & Barley, D. E. (2001). Research summary on the therapeutic relationship and psychotherapy outcome. Psychotherapy, 38, 357-361.
Lieberman, A. F., Van Horn, P., & Ippen, C. G. (2005). Toward
evidence-based treatment: Child-parent psychotherapy with
preschoolers exposed to marital violence. Journal of the
American Academy of Child & Adolescent Psychiatry, 44,
McCracken, S. G., & Corrigan, P. W. (2004). Staff development in
mental health. In H. E. Briggs & T. L. Rzepnicki (Eds.), Using
evidence in social work practice: Behavioral perspectives
(pp. 232-256). Chicago: Lyceum.
McCracken, S. G., Steffen, F., & Hutchins, E. (2007, January).
Implementing EBP in a large metropolitan family service agency:
Preparing the organization for change. Roundtable conducted at
the Society for Social Work and Research Annual Conference,
San Francisco.
Mieg, H. A. (2001). The social psychology of expertise: Case studies in
research, professional domains, and expert roles. Mahwah, NJ:
Lawrence Erlbaum.
Moher, D., Schulz, K. F., & Altman, D. (2001). The CONSORT
statement: Revised recommendations for improving the quality
of reports of parallel-group randomized trials. Journal of the
American Medical Association, 285, 1987-1991.
Mullen, E. J., & Bacon, W. F. (2003). Practitioner adoption and imple-
mentation of practice guidelines and issues of quality control. In
A. Rosen & E. K. Proctor (Eds.), Developing practice guidelines
for social work intervention: Issues, methods, and research
agenda (pp. 223-235). New York: Columbia University.
Nisbett, R. E., & Ross, L. (1980). Human inference: Strategies and short-
comings of social judgment. Englewood Cliffs, NJ: Prentice Hall.
Proctor, E. K., & Rosen, A. (2006). Concise standards for develop-
ing evidence-based practice guidelines. In A. R. Roberts & K. R.
Yeager (Eds.), Foundations of evidence-based social work prac-
tice (pp. 93-102). New York: Oxford.
Randolph, E. (1997). Manual for the Randolph Attachment Disorder
Questionnaire. Evergreen, CO: Attachment Center Press.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B.,
& Richardson, W. S. (1996). Evidence-based medicine: What it is
and what it isn’t (editorial). British Medical Journal, 312, 71-72.
Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W., &
Haynes, R. B. (2000). Evidence-based medicine: How to practice
and teach EBM (2nd ed.). Edinburgh, UK: Churchill Livingstone.
Sheppard, M., Newstead, S., Di Caccavo, A., & Ryan, K. (2000).
Reflexivity and the development of process knowledge in social
work: A classification and empirical study. British Journal of
Social Work, 30, 465-488.
Straus, S. E., Richardson, W. S., Glasziou, P., & Haynes, R. B.
(2005). Evidence-based medicine: How to practice and teach
EBM (3rd ed.). Edinburgh, UK: Elsevier.
Toth, S. L., Maughan, A., Manly, J. T., Spagnola, M., & Cicchetti, D.
(2002). The relative efficacy of two interventions in altering mal-
treated preschool children’s representational models: Implications
for attachment theory. Development and Psychopathology, 14,
Weisz, J. R., & Addis, M. E. (2006). The research-practice
tango and other choreographic challenges: Using and testing
evidence-based psychotherapies in clinical care settings. In
C. D. Goodheart, A. E. Kazdin, & R. J. Sternberg (Eds.),
Evidence-based psychotherapy: Where practice and research
meet (pp. 179-206). Washington, DC: American Psychological Association.