This article was published October 27, 2014.
Reginald O. York, Professor, University of North Carolina Wilmington
Kristin W. Bolton, Assistant Professor, University of North Carolina Wilmington
Robert G. Blundo, Professor, University of North Carolina Wilmington
Correspondence concerning this article should be addressed to Reginald O. York; e-mail: yorkr@uncw.edu
International Journal of Solution-Focused Practices
2014, Vol. 2, No. 2, 32-39
DOI: 10.14335/ijsfp.v2i2.20
ISSN 2001-5453 (Print) ISSN 2001-6980 (Online)
Copyright © 2014 by the Author. By virtue of publication in IJSFP,
this article is free to use with proper attribution in educational and
other non-commercial settings.
www.ijsfp.com
Practical Research for the Solution-Focused Practitioner:
A Follow-Up to Wheeler’s Article on Quick and Dirty Research
Reginald O. York, Kristin W. Bolton, and Robert G. Blundo
University of North Carolina at Wilmington
Abstract
In a recent article in this journal, Wheeler (2014) encouraged practitioners to engage in practical research
illustrated by some of his experiences. In addition, he encouraged a movement from evidence-based
practice to practice-based evidence. The purpose of this article is to amplify some of the points made in
that article with particular attention to (a) guidelines for practical research for the clinician, and (b) a
clarification of the distinctions between evidence-based practice and practice-based evidence. In the first
part of this article, practitioners are encouraged to engage in research that adheres to fundamental
principles of science without becoming obsessed with technical perfection. In this pursuit, the practitioner
can partner with experts who could be technical consultants while the process of study is under the
leadership of the practitioner. In the second part of this article, solution-focused practice is discussed in
regard to distinctions between practice-based evidence and two different ways of conceptualizing
evidence-based practice. The evidence on solution-focused practice makes its use consistent with both of
the views of evidence-based practice and with the perspective of practice-based evidence.
Keywords: solution-focused, practical research, evidence-based practice, practice-based evidence
The article by Wheeler (2014) in a recent edition of this
journal inspired a reply and extension of the thoughts
offered. Wheeler’s main purpose was to encourage more
clinically useful research, especially research done by
practitioners. In this endeavor, he discussed his own
experience doing what he referred to as “quick and dirty
research” (referencing Fain, 2013), which could also be
described as practical research.1 Wheeler also encouraged a
movement from evidence-based practice to practice-based
evidence.
This article seeks to build on Wheeler’s article in an
effort to (a) discuss how university-based research can be
improved in terms of its usefulness to the practitioner, (b)
provide basic guidelines for practical research by the
clinician, (c) discuss the rudiments of evidence-based
practice and practice-based evidence in terms of similarities
and differences, (d) provide conclusions regarding the
aforementioned themes as they relate to solution-focused
practices, and (e) provide an overview relative to the current
evidence supporting solution-focused brief therapy (SFBT)
as a method of treatment.
1 Practical research refers to the building of knowledge useful to practice that
adheres to the basic principles of scientific inquiry: clearly defined research
questions, valid measures of behavior, systematic collection and analysis of
data, and appropriate conclusions (i.e., not overstated). Practical research
could further be distinguished from basic research and theoretical research
(note by the authors).
How University-Based Research Can Be Improved
To understand how university-based research can be
improved, it is important to look at the nature of the PhD
program. From the beginning of the experience, it tends to
be the expectation that the PhD student will contribute
highly original research that meets rigorous standards of
methodology and theory. Whether this is of importance to
the everyday practice community is of less concern. How well
does the research meet the high standards of the experts in
the associated field? That is the question. So, a student
might develop a research question that is much too focused
on a narrow spectrum of knowledge to be of optimal use for
the practitioner. Generally, it is those who hold PhDs who teach
in universities. When the PhD graduate becomes a college
professor, the first task is to obtain tenure. How is this
achieved? Sometimes, it is through the continued pursuit of
a narrow path of research and the refinement of scientific
methods. Fortunately, there are academicians who pursue a
different path, as Wheeler (2014) described in discussing
the guidance he received.
Some professors focus so much attention on the pursuit
of technical perfection that some useful and scientifically
legitimate research is labeled fundamentally flawed. After
all, who is ever able to do the perfect study? The answer is,
no one. We believe excessive attention to technical
perfection has led some academicians to undertake
research that is not highly useful to the practitioner. What
then is the difference between a major flaw in research and a
limitation? A flaw would be exemplified by a study in which
the methodology is clearly biased in the direction of only
one research answer or in which the data analysis is
misleading, with conclusions inconsistent with the data. But
imperfections or limitations are everywhere and do not make
the study flawed; some limitations are inevitable in any study. It
is important to choose scientific methods that are well suited
to the particular question under study.
Finally, the refinement of theory is another source of
contention. The idea is that the academician should
contribute something highly original to the current
knowledge base. It is useful to keep in mind that a single
theory is seldom a useful guide to every aspect of
practice with a given client. Even the highly researched
therapies, such as cognitive-behavioral therapy, are very
often delivered with certain elements of the theory included
while others are not.
We hope this article will be helpful in closing the gap
between the researcher and the practitioner. Thus, a good
deal of what Wheeler called “quick and dirty research”
should be encouraged and communicated among
practitioners. A problem, however, lies in the fact that many
scholarly journals are focused on publishing highly
controlled research rather than a multitude of attempts to
research a given issue.2 We suggest that it would be more
useful to publish a wide array of research with methods of
varying levels of sophistication, and allow the reader to
assess the credibility and usefulness. In this environment,
one might find useful research that is somewhat weak in
methodology (but not flawed), but at the same time, very
useful for practice.
The Fundamentals of Practical Research
Wheeler encouraged the practitioner to engage in quick
and dirty research. In his examples, he employed a
consultant from academia and found that partnership useful.
It is not difficult to engage in simple studies that adhere to
the fundamentals of scientific research. In higher education,
students are introduced to the nature of scientific research
and are required to review such literature. Below is an
overview of a variety of research fundamentals:
The spirit of scientific inquiry is a process of discovery,
not justification. You do not engage in research for the
purpose of proving a point. Instead, you do it for the
purpose of finding out.
2 The International Journal of Solution-Focused Practices represents a
category of scholarly journals that, as the authors suggest, aims to publish a
wide array of research with methods of varying levels of sophistication. In
this context, a small and simple study can very well be more important than a
large and highly controlled study (note by the editor).
The purpose of the study must be clear and must fit with
the research methods. Wheeler made a number of
comments regarding how he was encouraged to slow
down and make things more clear as he engaged in his
experience with quick and dirty research.
Research methods should be transparent, revealing the
results to the consumer so that the credibility of the study
can be evaluated, and explained in a manner such that the
study could be replicated.
These are just a few research fundamentals, and it is
important to note that a consultant who knows research
methodology and statistics and accepts the assumptions of
practical research can aid research projects. A useful guide
is the framework of utilization-focused evaluation by Patton
(2008) where the research expert facilitates research for
intended users and also serves as technical consultant. In this
endeavor, it is the intended users (i.e., practitioners) who
determine what will be evaluated and how it will be
evaluated, starting with a clear picture of how the research
will be put to use.
A Few Guidelines for Practical Research
There are a number of guidelines for practical research
(York, 2009). These guidelines are designed to heighten
awareness of the types of research that can be done in
practice that are consistent with the nature of scientific
inquiry.
A. There are several levels of scientific inquiry that can be
useful for practice. However, it is important to note that such
levels are not restricted to those levels necessary for
publication.
1. At the first level, (i) you might construct a survey
for clients asking for their evaluations of service and their
suggestions, (ii) measure outcomes at intake and discharge,
or (iii) interview a group of clients or practitioners about
some issue of importance to practice. While this could be
conceived as the first level of research, it adheres to the
basic tenets of scientific inquiry, using transparent methods
that lend equal opportunity for the support of more than one
conclusion.
It is important not to fall into the category that Wheeler
called a “self-deluding feedback loop” (Wheeler, 2014,
p. 3). Two key terms here are method and transparency,
primarily because they distinguish scientific inquiry from
other forms of inquiry. In scientific inquiry, tools for
measurement should be reported, sampling methods should
be clear, and data should be fully reported. When using
qualitative methods for interview studies, it is necessary to
report protocols for the interviews as well as for the analysis
of data. These protocols should be credible if others are to
view the research as scientific. It would not be sufficient to
simply report, for example, the conclusions of 10 interviews
conducted with clients, even if the conclusions are endowed
with selected quotes. The latter is sufficient for the
newspaper reporter and may be useful for some
practitioners, but it is not sufficient to call such a study
scientific research, which requires more in the way of
transparent methods, a key principle in science.
When research is complete, the researcher should be able to
describe the methods in sufficient detail for others to be
able to replicate them.
This level of inquiry may or may not employ statistical
analysis of data. If a group of clients made a 30%
improvement from intake to discharge, the researcher can
draw conclusions on the clinical significance of these results
for this sample of persons without the benefit of statistical
analysis. If the researcher wishes to examine the issue of
chance (i.e., how often the data would occur by chance),
there are easy ways to do so, such as using a simple Excel
spreadsheet (York, 2009). If the researcher wishes to
generalize findings to the population, he or she will also
need to consider the way in which the sample was drawn.
Ultimately, examining the relevance of these findings to this
group of clients requires only that you have reason to
believe that the data are credible, but generalizing the data or
considering the issue of chance requires other steps.
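The chance question raised here (how often would results like these occur if the service made no difference?) can be examined with a sign test, which needs nothing more than a spreadsheet or a few lines of code. The sketch below is a minimal illustration of that idea, not the authors' own procedure, and the intake and discharge scores are invented.

```python
from math import comb

# Invented intake and discharge scores for 12 hypothetical clients
# (higher = better functioning); for illustration only.
intake = [40, 35, 50, 42, 38, 45, 41, 36, 48, 44, 39, 43]
discharge = [48, 41, 49, 50, 45, 52, 47, 40, 55, 51, 46, 50]

improved = sum(d > i for i, d in zip(intake, discharge))
declined = sum(d < i for i, d in zip(intake, discharge))
n = improved + declined  # ties are dropped in a sign test

# Two-tailed sign test: how often would a split at least this
# lopsided occur if improvement and decline were equally likely?
k = max(improved, declined)
p = min(1.0, sum(comb(n, j) for j in range(k, n + 1)) / 2 ** (n - 1))

print(f"{improved} of {n} clients improved, p = {p:.4f}")
# → 11 of 12 clients improved, p = 0.0063
```

A small p-value here suggests the pattern of improvement is unlikely to be chance alone; as noted above, generalizing beyond this group of clients still depends on how the sample was drawn.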
2. At the next level, the researcher might statistically
compare outcomes from clients with those on the waiting
list. The advantage of using a comparison group design is
that there is a basis for considering normal growth and
development over time as an alternative explanation for the
client’s gains. Additionally, the advantage of the statistical
analysis of data, as noted beforehand, is the ability to rule
out chance as a legitimate explanation of the data.
At this second level, using a different example, the
researcher might employ a recognized protocol for the
analysis of qualitative data in a study examining methods of
practice for clinicians or an examination of agency records
for information on practice policies. This type of analysis
entails going beyond the methods of the news reporter in
developing and reporting the methodology employed.
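For the waiting-list comparison described above, one simple way to rule out chance without distributional assumptions is a randomization (permutation) test: shuffle the group labels repeatedly and count how often the shuffled groups differ as much as the real ones did. The sketch below is one possibility under invented data, not a prescribed method.

```python
import random

# Invented change scores (discharge minus intake) for a treated
# group and a waiting-list comparison group; for illustration only.
treated = [8, 6, -1, 8, 7, 7, 6, 4, 7, 7]
waitlist = [2, 0, 3, -2, 1, 4, 0, 2]

observed = sum(treated) / len(treated) - sum(waitlist) / len(waitlist)

# Shuffle group labels many times and see how often a difference
# at least as large as the observed one arises by chance alone.
pooled = treated + waitlist
rng = random.Random(0)  # fixed seed so the sketch is reproducible
count = 0
trials = 10_000
for _ in range(trials):
    rng.shuffle(pooled)
    fake_t = pooled[:len(treated)]
    fake_w = pooled[len(treated):]
    diff = sum(fake_t) / len(fake_t) - sum(fake_w) / len(fake_w)
    if abs(diff) >= abs(observed):
        count += 1

p = count / trials
print(f"mean difference = {observed:.2f}, p ≈ {p:.3f}")
```

With these invented numbers the chance probability comes out very small, which is the kind of result that would support ruling out chance as an explanation for the clients' gains.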
3. At the third level, the researcher could engage in
research normally conducted by academicians and often
published in scholarly journals. This level is not the focus of
this article, however, so we will not go far down that road.
Suffice it to say, this level requires more sophisticated
methods of sampling, validity testing of measurement tools,
research design, and statistical analysis of data.
The point is that there is more than one legitimate level
of inquiry and we can employ one or the other depending on
the need and availability of resources.
B. The practitioner is the expert on what level of inquiry is
sufficient to generate useful research. However, the
practitioner cannot exercise that expertise without sufficient
knowledge of methods used by others (when reviewing that
research) or knowledge on how to construct research itself
(when the practitioner is the researcher). This knowledge
level is easy to develop, but might sometimes require the use
of a consultant. In this regard, consulting Patton’s (2008)
utilization-focused evaluation, which is described briefly in
York (2009, pp. 73-77), might be a useful alternative.
C. Below are some suggestions to the practical researcher
in making choices and constructing a viable study:
1. Consider the idea of face validity in the examination
of the validity of a measurement device rather than the more
sophisticated methods (e.g., test-retest reliability, content
validity, construct validity, etc.). This simply means that the
measurement tool has been reviewed by key individuals in
an agency (or elsewhere) and that the key individuals have
offered the opinion that the tool does a reasonable job of
measuring what it is intended to measure. Conducting a pilot
of the tool would also be useful. This means giving it to a
few persons and seeing how they respond.
2. Examine the data naturally collected from clients
using certain activities associated with solution-focused
practice. Review these data over time with regard to types of
clients and goals to see if there are areas of inquiry that
emerge from them. Next, test these inquiries more directly,
or suggest that others do so. Both outcome data and data on
client characteristics could be examined, as well as the
relationships between them using practical tools for data
analysis like those in York (2009).
3. If statistical analysis is one part of the study,
consider the use of a more liberal standard for statistical
significance (i.e., p < .10) rather than the more conservative
standard employed in the social sciences (i.e., p < .05). The
standard employed is arbitrary; there is no scientific basis
for either standard. The difference lies in whether it is more
important to avoid the error of concluding that a service is
effective when it is not (the conservative alternative) or
avoid the error of concluding that a service is not effective
when it actually is effective (the liberal alternative). The
former is the approach of the typical academic researcher,
who lives in a different world than the practitioner and
wants to be sure that the conclusions will be supported by
later research. The practitioner, on the other hand, must
make decisions with limited data and thus, needs to find
some guidance and move on quickly. The idea of improving
the odds that service is effective is the key.
Furthermore, there are simple statistical tests that will
work fine with the practical approach to research. It makes
more sense to use a statistical test that is understood than to
employ a more sophisticated statistic that is not understood.
Moving from a basic statistic to a more sophisticated one
usually does not make a difference in the basic conclusions
that are drawn from the data when considering the practical
implications of the intended research.
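The trade-off between the conservative and liberal standards can be made concrete with a small simulation. The sketch below uses invented assumptions (13 clients per study, a service that truly helps 75% of them) and counts how often a two-tailed sign test would detect the effect under each standard; it illustrates the point that the liberal standard reduces the risk of missing a genuinely effective service.

```python
import random
from math import comb

def sign_test_p(improved: int, n: int) -> float:
    """Two-tailed sign test p-value for `improved` successes out of n."""
    k = max(improved, n - improved)
    return min(1.0, sum(comb(n, j) for j in range(k, n + 1)) / 2 ** (n - 1))

rng = random.Random(1)  # fixed seed so the sketch is reproducible
n_clients, studies = 13, 5000

# Invented scenario: the service truly helps, so each client
# improves with probability 0.75 rather than the chance rate of 0.5.
detected_05 = detected_10 = 0
for _ in range(studies):
    improved = sum(rng.random() < 0.75 for _ in range(n_clients))
    p = sign_test_p(improved, n_clients)
    detected_05 += p < 0.05  # conservative standard
    detected_10 += p < 0.10  # liberal standard

print(f"detected at p < .05: {detected_05 / studies:.0%}")
print(f"detected at p < .10: {detected_10 / studies:.0%}")
```

Under these assumptions the liberal standard flags the truly effective service in roughly six simulated studies out of ten, the conservative one in roughly three out of ten; the price, not shown here, is a higher risk of endorsing an ineffective service.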
The use of scaling is commonplace in solution-focused
practice, so practitioners could consider collecting data from
various clients and prepare a report that addresses useful
themes for practice. Perhaps there is an apparent trend that
shows which techniques seem to work better with certain
clients. In another example, a practitioner could review
clinical notes using a basic protocol for content analysis of
qualitative data or conduct a survey with clients to gain
more insight into critical issues. Finally, the Session Rating
Scale and the Outcome Rating Scale offered by Miller,
Duncan, Brown, Sorrell, and Chalk (2006) could be
employed to track client progress and report trends. There
are a number of research initiatives a practitioner could
employ and this section offers merely a few. However,
while developing a research initiative, it could be useful to
develop a partnership with a technical consultant and/or seek
someone in academia who embraces the idea of practical
research.
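As one concrete possibility for the scaling report mentioned above, the ratings already being collected could be summarized session by session; the client labels and numbers below are invented for illustration.

```python
from statistics import mean

# Hypothetical 0-10 scaling ratings recorded at successive sessions;
# client labels and numbers are invented for illustration.
scaling = {
    "client A": [3, 4, 4, 6, 7],
    "client B": [5, 5, 6, 6],
    "client C": [2, 2, 3, 5, 6, 7],
}

for client, scores in scaling.items():
    # Average movement on the scale per session, first to latest.
    per_session = (scores[-1] - scores[0]) / (len(scores) - 1)
    print(f"{client}: first={scores[0]}, latest={scores[-1]}, "
          f"avg change per session={per_session:+.2f}")

overall = mean(s[-1] - s[0] for s in scaling.values())
print(f"average overall change: {overall:+.2f} points")
# → average overall change: +3.33 points
```

A report built from summaries like these could surface the kind of trends the authors describe, such as which techniques seem to work better with certain clients.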
How To Move From Evidence-Based Practice to
Practice-Based Evidence?
Wheeler (2014) noted some problems with evidence-
based practice and suggested a shift toward practice-based
evidence. His complaint was that evidence-based practice
places inordinate focus on the testing of theory, while the
solution-focused practitioner, following the lead of de
Shazer and Berg (1997), would be better advised to pursue
the “naturalist inquiry” approach. In order to understand
how to make this transition, it is important to clarify
research terms and define both evidence-based practice and
practice-based evidence.
Evidence-Based Practice Versus
Practice-Based Evidence
Evidence-based practice emerged from the field of
medicine and has been picked up by clinical psychology,
social work, and other helping professions. The early
definition of evidence-based practice involved the
employment of the conscientious, explicit use of current best
evidence in making decisions about the care of individuals
(Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996). A
more contemporary definition is the conscientious, explicit
use of the best available evidence, along with considerations
of client characteristics and treatment preferences, to inform
practice (Dozois et al., 2014).
It is not easy to argue with this definition unless one has
no belief in science. The biggest problem has emerged from
misunderstandings about what this perspective really is and
how it should be implemented. Unfortunately, some believe
that it means always using the treatment that has the best
score of evidence, regardless of other considerations, and
never deviating from strict adherence to that model at all
times with the client, regardless of how the client is
responding. No one who writes about evidence-based
practice has ever, to our knowledge, suggested anything that
is remotely close to this understanding of the nature of
evidence-based practice. An agency policy that says that all
clinicians will employ cognitive-behavioral therapy in the
treatment of depression is not operating according to the
principles of evidence-based practice because the best
available evidence for a given client and practitioner may
suggest another approach to treatment, especially if
considering client preferences and practitioner expertise. A
practitioner who employs a therapy strictly by the manual
and views lack of success as so-called “client resistance,”
without consideration of other alternatives, is not operating
according to the principles of evidence-based practice.
Unfortunately, these things have happened and have given
some people an impression that this is the way evidence-
based practice is supposed to be. If this is the main
contribution of the evidence-based practice movement, then
we must assert that this has not been a good movement.
This leads into the distinction between two ways to
characterize evidence-based practice. One is evidence-based
practice as process, and the other is evidence-based practice
as product (see, for example, Chapter 2 of Palinkas &
Soydan, 2011). The primary writers on evidence-based
practice prefer that it be viewed as a process where the
practitioner reviews the current available evidence regarding
the goal being pursued and treats it with respect in
determining the course of action for each individual client.
Evidence-based practice as product is the idea of using a
practice that is supported by evidence without regard for the
other critical conditions of practitioner expertise or client
preferences. Unfortunately, some agencies advertise that
they only use evidence-based practices based on the idea
that their practices have some supporting evidence without
regard to the goal of treatment or the other critical aspects of
practitioner expertise and client preferences.
While the proponents of evidence-based practice
advocate for the perspective of evidence-based practice as
process, the current reality is that the clinical dialogue seems
to favor the idea of evidence-based practice as product.
Human services agencies and practitioners often market
themselves as using evidence-based practices without clear
reference to evidence about the particular method of
treatment employed for the specific condition being treated.
Using a treatment with some evidence is better than using
one with none, unless, of course, practitioners and/or
researchers are engaging in a clinical experiment that has a
logical rationale and are seeking new knowledge for the
field. Furthermore, as long as data on client outcomes are
systematically collected, the practitioner is being
accountable without regard to the evidence in the literature.
Barkham and Mellor-Clark (2003) suggested that the
difference between evidence-based practice and practice-
based evidence is that the former focuses on very specific
research, optimally addressing the issue of causation
regarding whether a particular intervention has a specific,
measurable effect on a given condition. Practice-based
evidence, according to this view, focuses on research that is
more relevant to normal practice. The latter is concerned
with the effectiveness of treatments across broad
populations and service settings. That early
conceptualization, however, seems to have given way to a
debate on whether the practice-based research actually
suggests that one bona fide treatment is superior to another.
To illustrate what practice-based evidence and evidence-
based practice are, it is important to examine the basic
therapeutic process. First, the practitioner and the client
engage in a dialogue that leads to the determination of the
goals of treatment. Next, the practitioner and the client
collaborate on the decision of what the treatment will entail.
Then, the practitioner and the client engage in the service
process and collectively monitor client progress. The
approach of practice-based evidence follows this process,
encouraging the clinician to systematically collect data on
both service process and client outcome. This approach is
also compatible with evidence-based practice, which calls
upon the practitioner to use the best available evidence, in
conjunction with considerations of practitioner expertise and
client preferences. The clinician using evidence-based
practice will collaborate with the client on treatment goals,
search for evidence on what works, share that evidence with
the client, and collaborate on the treatment plan. So, what is
the difference between practice-based evidence and
evidence-based practice?
One key difference between these two perspectives lies
in how evidence guides the decision of what kind of
treatment to employ. Advocates for practice-based evidence
have concluded that the evidence about practice outcomes
fails to show that one bona fide treatment is superior to
another. However, the evidence is very clear on the
importance of the common factors in therapy (Flückiger,
Wampold, & Symonds, 2011; Horvath, Del Re, Flückiger,
& Symonds, 2011; Marsh, Angell, Andrews, & Curry, 2013;
Munder, Flückiger, Gerger, Wampold, & Barth, 2012). The
common factors are those things that promote good
outcomes regardless of the approach taken to treatment.
Prominent among those common factors are the therapeutic
alliance between client and practitioner, the skills of the
clinician, and variables related to the individual client, such
as hope. Practitioners should therefore strengthen the
common factors and employ a bona fide method of
treatment that is compatible with their skills and the clients’
interests. Culling through evidence on whether cognitive-
behavioral therapy is better than SFBT for the treatment of
anxiety or depression is basically a waste of time. Evidence
is available supporting both, and most likely the
preponderance of evidence fails to show superiority of one
approach to treatment over another. Thus, the focus should
be on practitioner skills and other common factors.
Solution-focused practice is both a bona fide treatment
and one that has been supported by a great deal of evidence
in regard to a number of target behaviors. If this practice fits
with a practitioner’s expertise and a client’s preference, it
would make sense to use it, according to the model of
evidence-based practice as process. This situation would
also fit with the perspective of practice-based evidence.
However, if a practitioner employs it for behaviors that have
not been tested in research, then he or she is drifting out of
the bounds of evidence-based practice as process. Therefore,
if a practitioner wants to claim that he or she is following the
guidelines of evidence-based practice, it is suggested that he
or she systematically monitor client outcomes and employ
these data in determining what works for that client.
Exploring and reading evidence of certain bona fide
treatments that are superior to others for a given behavior
leads to the daunting task of analyzing the evidence from
conflicting articles found in the literature. For example,
some reviews of evidence on the relative effectiveness of
cognitive-behavioral therapy in the treatment of depression
suggest it is superior to certain other treatments (Tolin,
2010), but some reviews suggest it is not (Baardseth et al.,
2013). The evidence picture is surely not clear about what
works best for which behavior. Perhaps it can help us to
avoid therapies that might cause harm and can encourage us
to try therapies with solid evidence, but it is clear that the
common factors are more important than the specific
ingredients of the treatment approach taken.
Solution-Focused Practice, Naturalistic Inquiry, and
Practice-Based Evidence
Steve de Shazer and Insoo Kim Berg (1997) understood the
roots of SFBT practice to lie in the “naturalist inquiry”
approach to knowledge building. Macdonald (2011) stated
that the development of solution-focused therapy emerged
from “feedback from the clients as to which elements of
therapy were effective in increasing goal attainment” (p. 88).
This observation and feedback formed the foundations of
solution-focused practice as people and families were
engaged in a conversation about what they wanted, as well
as what they have already done to move toward their own
expressed goals and desired outcomes. Although seeing
people for the most part in a particular setting (i.e., an
office) differs from merely engaging them in their own
settings, the basic principle of being curious about learning
from one’s clients still applies: asking what clients want to be
different in their lives does not impose any particular
perspective or theory on how they should be living, nor does
it depend on any preconceived way of getting them to
change for the better, as defined by the therapist’s
worldview. In contrast, Kirmayer (2012) commented on the
complexity of research-based practice in the context of
social and cultural values and made the point that “the
rhetoric of evidence base practice tends to obscure the
social, moral and political contexts that necessarily shape
both research and clinical practice” (p. 252).
Solution-focused practice, as a form of naturalistic
inquiry, enables people and families to reveal in their own
way and in their own words the nature of their challenges
and also recognize the times when they have been able to
make things better or cope with critical conditions. This
opens the conversation in terms set by the client and reveals
the client’s strengths, implicit capacities, possible alternative
lives, and the potential means for creating a better life for
themselves.
What Can the Practitioner Do?
If a practitioner embraces solution-focused practice as
well as practice-based evidence, he or she should engage the
client in a collaboration related to the goals of service and
service methods, collect data on service process and service
outcome, and then collaborate with the client on what to do
next. What could come next is a change in methods, referral
to other services, or a termination of service because it is no
longer needed.
This process is not very different from evidence-based
practice as product, provided that one’s methods are
consistent with solution-focused practice, a treatment that
has been found to be supported by evidence regarding many
different types of outcomes (Franklin, Trepper, Gingerich, &
McCollum, 2012). However, if a practitioner embraces
evidence-based practice as process, he or she would review
the evidence and potentially move in a different direction
than solution-focused practice if the evidence favors a
different approach with a given client and the practitioner is
either skilled in the alternative therapy or prepared to seek
training. If the evidence does not favor a different approach,
then evidence-based practice as product, evidence-based
practice as process, and practice-based evidence are not really
different.
One of the major hurdles that a solution-focused
practitioner may encounter if interested in using evidence-
based practice as process is that using it as process assumes
that different behaviors are better treated with different
therapeutic methods. The differences in behaviors tend to be
viewed in light of the idea of diagnosis, even though there is
nothing specific to evidence-based practice as process that
requires a diagnosis. The fact is that much of the evidence is
grounded in diagnosis. To the solution-focused practitioner,
the focus is on strengths and solutions rather than problems
and causes. It is important to note, however, that this hurdle
is not limited to solution-focused practitioners; it also
applies to practitioners who use other treatment models.
Practice-Based Evidence and Solution-Focused Practice
The majority of solution-focused practitioners use
scaling in sessions to access the client’s subjective
evaluation of his or her situation and feelings over time.
How clients evaluate their subjective experience, in terms
of better or worse, guides both the practitioner and the
client in understanding the behaviors and contexts that
would produce change, and in employing those actions that
make things better or bring the client closer to achieving
his or her stated goals.
The practice-based evidence perspective has placed a lot
of emphasis on the use of systematically collected data from
the client. One tool is the Session Rating Scale (Miller et al.,
2006), which asks the client to rate each session at the end
according to the presence of certain behaviors that promote
a therapeutic alliance. For example, did the session focus on
things of special concern to the client? Another tool is the
Outcome Rating Scale (Miller et al., 2006), which asks the
client at the beginning of each session to rate the extent to
which he or she has experienced certain general outcomes
since the previous session. For example, how has the client’s
life been socially?3 Thus, there is a subjective means of
sharing outcomes between the practitioner and the client
both during the process and at the end. Research has
demonstrated the efficacy of scaling by comparing it with
standard inventories for assessing outcomes (Dahl, Bathel,
& Carreon, 2000; Gostautas, Cepukiene, Pakrosnis, &
Fleming, 2005; Nelson & Kelly, 2001; Wiseman, 2003).
These studies have demonstrated that scaling is significantly
related to instrument-measured outcomes at the end of
treatment and after extended periods following treatment.
It might therefore be useful for practitioners to treat these
scores, and changes in them, as general indicators of real
and lasting change.
3 For more information see http://www.centerforclinicalexcellence.com
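For the practitioner who wants to keep such scaling data systematically, even a very simple session-by-session record can serve as practice-based evidence. The following is a minimal sketch in Python; the helper names and the first-two/last-two averaging rule are our own illustrative assumptions, not features of the Session Rating Scale, the Outcome Rating Scale, or any published tool:

```python
# Minimal sketch of logging 0-10 scaling scores across sessions and
# deriving a rough, descriptive indicator of change. All names and the
# averaging rule are illustrative assumptions by the authors.
from statistics import mean

def record_session(log, session_number, scaling_score):
    """Append one session's 0-10 scaling score to a client's log."""
    if not 0 <= scaling_score <= 10:
        raise ValueError("scaling scores run from 0 to 10")
    log.append((session_number, scaling_score))

def change_indicator(log):
    """Difference between the mean of the last two and the first two
    recorded scores: a descriptive indicator, not a statistical test."""
    scores = [score for _, score in log]
    if len(scores) < 4:
        return None  # too few sessions to describe a trend
    return mean(scores[-2:]) - mean(scores[:2])

# Example: a client who rates 3, 4, 4, 6, 7 across five sessions.
log = []
for n, score in enumerate([3, 4, 4, 6, 7], start=1):
    record_session(log, n, score)
print(change_indicator(log))  # prints 3.0
```

Such a descriptive indicator is only a conversation aid between practitioner and client; validated comparisons with standard inventories, as in the studies cited above, remain the basis for claims about real and lasting change.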
Beyond Scaling to the Current Evidence Base of
Solution-Focused Practice
Seeking empirical evidence to support practice can be a
daunting task. However, there are a number of resources
available to solution-focused practitioners allowing for more
efficient and accessible evidence to support SFBT as a
method of treatment. For example, Alasdair Macdonald
(2014) maintains a running list of SFBT evaluations. This
list can be accessed from the Solution-Focused Brief
Therapy Association research page. As of November 2013,
there were a total of 133 documented studies on SFBT,
including two meta-analyses, five systematic reviews of the
literature, and 125 follow-up research-based papers. The
follow-up research-based papers consist of 28
randomized/control group findings, 47 comparison studies,
and 50 naturalistic studies summarizing a variety of mental
health issues. In 2012, Franklin et al. published an edited
book titled Solution-Focused Brief Therapy: A Handbook
of Evidence-Based Practice, which provides
practitioners with an elaborate overview of SFBT with
diverse client populations. Gingerich and Peterson (2013)
presented a systematic review of evidence on solution-
focused therapy showing its effectiveness for a wide range
of behaviors. Finally, the continued growth of SFBT
research studies has led to recognition of SFBT as a
promising evidence-based practice by the Substance Abuse
and Mental Health Services Administration’s National
Registry of Evidence-based Programs and Practices
(http://www.nrepp.samhsa.gov/) and by the Australian
Psychological Society’s (2010) review of evidence-based
psychological interventions.
As the evidence supporting SFBT as a method of
treatment continues to grow, so will the recognition of SFBT
as an evidence-based practice by national government
organizations. Given the political realities of government
sponsors for research, the insurance industry, and the
pressure for professional organizations to join the mix with
other service providers, evidence-based practice is often
touted as the best practice to be used in a particular setting
and with particular clients. Therefore, it is advantageous for
SFBT to be recognized by government and appear on
national registries. In some cases, practitioners may find that
their agency has adopted one of these evidence-based
practices. In this situation, solution-focused practitioners
can point to the extensive research already done on
solution-focused practice under the label of evidence-based
practice, and the resources mentioned above are a useful
means of obtaining such information.
Conclusions
The practitioner interested in using research to inform
practice, or in conducting research that informs practice, can
follow our practical guidelines for research and seek
partnerships with academicians on a practical research
project. Using the basic principles of
utilization-focused evaluation, such a project makes the
practitioner and the academician collaborators. These
practical research projects can take a form that falls into any
of a number of categories, from research conducted on
current clients for one’s own development, to research
conducted for reports that might be shared with others, to
research that might be of a wider interest and published in
scholarly journals. In each case, there are fundamental
research methods to be implemented.
Those who adhere to the principles of practice-based
evidence may employ solution-focused practice if they are
skilled in this approach and can effectively implement the
common factors in therapy. Those who prefer to consider
themselves evidence-based practitioners can consider the
wide array of research supporting solution-focused practice
as a basis for their claim in using evidence-based practice as
product. To claim to be employing evidence-based practice
as process, however, the practitioner will need to spend
some time with the evidence for each client and consider
what the best approach is.
References
Australian Psychological Society. (2010). Evidence-
based psychological interventions: A literature review
(3rd ed.). Retrieved from
https://www.psychology.org.au/Assets/Files/Evidence-Based-Psychological-Interventions.pdf
Baardseth, T. P., Goldberg, S. B., Pace, B. T., Wislocki, A.
P., Frost, N. D., Siddiqui, J. R., . . . Wampold, B. E. (2013).
Cognitive-behavioral therapy versus other therapies:
Redux. Clinical Psychology Review, 33, 395–405.
doi:10.1016/j.cpr.2013.01.004
Barkham, M., & Mellor-Clark, J. (2003). Bridging evidence-
based practice and practice-based evidence: Developing
a rigorous and relevant knowledge for the psychological
therapies. Clinical Psychology and Psychotherapy, 10,
319–327. doi:10.1002/cpp.379
Dahl, R., Bathel, D., & Carreon, C. (2000). The use of
solution-focused therapy with an elderly population.
Journal of Systemic Therapies, 19, 45–55. Retrieved
from http://www.guilford.com/journals/Journal-of-Systemic-Therapies/Jim-Duvall/11954396
De Shazer, S., & Berg, I. K. (1997). What works? Remarks
on research aspects of solution-focused brief therapy.
Journal of Family Therapy, 19, 121–124.
doi:10.1111/1467-6427.00043
Dozois, D. J. A., Mikail, S. F., Alden, L. E., Bieling, P. J.,
Bourgon, G., Clark, D. A., . . . Johnston, C. (2014). The
CPA presidential task force on evidence-based practice
of psychological treatments. Canadian Psychology, 55(3),
153–160. doi:10.1037/a0035767
Fain, P. (2013, May 1). Quick and dirty research. Inside
Higher Ed. Retrieved from
http://www.insidehighered.com/news/2013/05/01/education-research-and-pace-innovation
Flückiger, C., Del Re, A. C., Wampold, B. E., & Symonds,
D. (2011). How central is the alliance in psychotherapy?
A multilevel longitudinal meta-analysis. Journal of
Counseling Psychology, 59, 10–17.
doi:10.1037/a0025749
Franklin, C., Trepper, T., Gingerich, W. J., & McCollum, E.
(Eds.). (2012). Solution-focused brief therapy: A
handbook of evidence-based practice. New York, NY:
Oxford University Press.
Gingerich, W. J., & Peterson, L. T. (2013). Effectiveness of
solution-focused brief therapy: A systematic qualitative
review of controlled outcome studies. Research on
Social Work Practice, 23, 266–283.
doi:10.1177/1049731512470859
Gostautas, A., Cepukiene, V., Pakrosnis, R., & Fleming, J.
S. (2005). The outcome of solution-focused brief
therapy for adolescents in foster care and health
institutions. Baltic Journal of Psychology, 6, 5–14.
Retrieved from
http://www.lu.lv/apgads/izdevumi/elektroniskie-izdevumi/zurnali-un-periodiskie-izdevumi/baltic-journal-of-psychology/
Horvath, A. O., Del Re, A. C., Flückiger, C., & Symonds, D.
(2011). Alliance in individual psychotherapy.
Psychotherapy, 48, 9–16. doi:10.1037/a0022186
Kirmayer, L. J. (2012). Cultural competence and evidence-
based practice in mental health: Epistemic communities
and the politics of pluralism. Social Science & Medicine,
75, 249–256. doi:10.1016/j.socscimed.2012.03.018
Macdonald, A. J. (2011). Solution-focused therapy: Theory,
research & practice. London, United Kingdom: Sage.
Macdonald, A. J. (2014). Solution-focused brief therapy
evaluation list. Retrieved from
http://www.sfbta.org/research.html
Marsh, J. C., Angell, B., Andrews, C. M., & Curry, A.
(2012). Client-provider relationships and treatment
outcomes: A systematic review of substance abuse, child
welfare, and mental health services research. Journal of
the Society for Social Work and Research, 3, 233–267.
doi:10.5243/jsswr.2012.15
Miller, S. D., Duncan, B. L., Brown, J., Sorrell, R., & Clark,
M. B. (2006). Using formal client feedback to improve
retention and outcome: Making ongoing, real-time
assessment feasible. Journal of Brief Therapy, 5, 5–22.
Retrieved from http://www.journalbrieftherapy.com/
Munder, T., Flückiger, C., Gerger, H., Wampold, B., &
Barth, J. (2012). Is the allegiance effect an
epiphenomenon of true efficacy differences between
treatments? A meta-analysis. Journal of Counseling
Psychology, 59, 631–637. doi:10.1037/a0029571
Nelson, T. S., & Kelly, L. (2001). Solution-focused couples
group. Journal of Systemic Therapies, 20, 47–66.
doi:10.1521/jsyt.20.4.47.23085
Palinkas, L. A., & Soydan, H. (2011). Translation and
implementation of evidence-based practice. Oxford,
United Kingdom: Oxford University Press.
Patton, M. Q. (2008). Utilization-focused evaluation (4th
ed.). Thousand Oaks, CA: Sage.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M.,
Haynes, R. B., & Richardson, W. S. (1996). Evidence
based medicine: What it is and what it isn’t. British
Medical Journal, 312, 71–72.
doi:10.1136/bmj.312.7023.71
Tolin, D. F. (2010). Is cognitive-behavioral therapy more
effective than other therapies? A meta-analytic review.
Clinical Psychology Review, 30, 710–720.
doi:10.1016/j.cpr.2010.05.003
Wheeler, J. (2014). Quick and dirty research: Opportunities
for people who are too busy to do research. International
Journal of Solution-Focused Practice, 2(1), 1–3.
doi:10.14335/ijsfp.v2i1.15
Wiseman, R. (2003). The luck factor: Change your luck and
change your life. London, United Kingdom: Century.
York, R. O. (2009). Evaluating human services: A practical
approach for the human service professional. Boston,
MA: Pearson.
ResearchGate has not been able to resolve any citations for this publication.
Article
Full-text available
As a trainer and supervisor I have heard of many examples of the use of a solution-focused approach bringing about significant improvements in the lives of service users. Too often, from my point of view however, these wonderful stories of the possibilities of change stay known only to a few. As a practitioner I also recall my concerns over what might be required for my witnessing of this evidence to be presented to a wider audience. In this paper I reflect on my own journey into research whilst still being in practice, the ideas from professional culture that may have influenced my thinking, the possibilities of moving from evidence-based practice to practice-based evidence and the opportunities that were available to me in a busy Child & Adolescent Mental Health Service. The paper proposes the possibilities of a “quick and dirty” approach for others in practice who would like to do research. Two examples are given of studies that took advantage of available opportunities. I then offer a series of questions, in the hope that this paper will encourage more practitioners to carry out research.
Article
Full-text available
In 2011, the Board of Directors of the Canadian Psychological Association (CPA) launched the Task Force on Evidence-Based Practice of Psychological Treatments to support and guide practice as well as to inform stakeholders. This article describes the work of this task force, outlining its raison d'etre, providing a comprehensive definition of evidence-based practice (EBP), and advancing a hierarchy of evidence that is respectful of diverse research methodologies, palatable to different groups, and yet comprehensive and compelling. The primary objective was to present an overarching methodology or approach to thinking about EBP so that psychologists can provide and implement the best possible psychological treatments. To this end, our intention for this document was to provide a set of guidelines and standards that will foster interest, encourage development, and promote effectiveness in EBP.
Article
This study examines the clinical efficacy and cost-effectiveness of treating seniors with solution-focused therapy (SFT) in an outpatient setting. Data was collected from 74 patients. Patients receiving outpatient behavioral services were referred to treatment for the following problems: depression, anxiety, marital problems, and stress related to chronic illness. The measures used were pre and post self-scaling scores, motivational scores, and pre and post Global Assessment of Functioning (GAF) scores. In addition, patient satisfaction questionnaires were begun during the second quarter of the study. Individual two-tailed, paired t tests were used to test the differences between time one and time two self-scaling scores and GAF scores, t tests showed statistically significant increases for both self-scaling and GAF scores (p < .05). Sixty-nine of the seventy-four participants filled out a satisfaction survey. Patients indicated high satisfaction with behavioral health services. Consistent with results reported from previous research studies, this study revealed the clinical effectiveness of SFT. This study went on to demonstrate what has not been well documented in the literature, namely SFT's effectiveness with elderly outpatients. In conclusion, preliminary implications for clinicians indicate that SFT can produce significant clinical outcomes with senior outpatient populations. This study examines the clinical efficacy and cost-effectiveness of treating seniors with solution-focused therapy (SFT) in an outpatient setting. Data was collected from 74 patients. Patients receiving outpatient behavioral services were referred to treatment for the following problems: depression, anxiety, marital problems, and stress related to chronic illness. The measures used were pre and post self-scaling scores, motivational scores, and pre and post Global Assessment of Functioning (GAF) scores. 
In addition, patient satisfaction questionnaires were begun during the second quarter of the study. Individual two-tailed, paired t tests were used to test the differences between time one and time two self-scaling scores and GAF scores, t tests showed statistically significant increases for both self-scaling and GAF scores (p < .05). Sixty-nine of the seventy-four participants filled out a satisfaction survey. Patients indicated high satisfaction with behavioral health services. Consistent with results reported from previous research studies, this study revealed the clinical effectiveness of SFT. This study went on to demonstrate what has not been well documented in the literature, namely SFT's effectiveness with elderly outpatients. In conclusion, preliminary implications for clinicians indicate that SFT can produce significant clinical outcomes with senior outpatient populations.
Book
This second edition of Solution-focused Therapy remains the most accessible yet comprehensive case-based introduction to the history, theory, research and practice of solution-focused therapy (SFT) within mental health care and beyond. Drawing on contemporary research and the author's own extensive experience, the fully revised and updated new edition includes: discussion of recent developments relevant to research and training; a new chapter on challenges to SFT and the integration of SFT with other therapeutic approaches; extended discussion on ethical issues; topical exploration of the application of SFT with patients with personality disorders and dementias; contemporary research on solution-focused coaching and approaches to organizational change; new case material This highly practical guide should be on the desk of every student or trainee studying this strongly supported, growing approach. It is also a useful resource for practitioners wanting to update their core skills and knowledge.
Book
Therapy is frequently miscast as requiring an enormous amount of time and financial commitment, but helpful, goal-oriented therapy can produce positive results after only a few sessions. Solution-focused brief therapy (SFBT) has been gaining momentum as a powerful therapeutic approach since its inception in the 1980s. By focusing on solutions instead of problems, it asks clients to set concrete goals and to draw upon strengths in their lives that can help bring about the desired change for a preferred future. Chapters review the current state of research on SFBT interventions and illustrate its applications-both proven and promising-with a diverse variety of populations, including domestic violence offenders, troubled and runaway youth, students, adults with substance abuse problems, and clients with schizophrenia. This text also includes a treatment manual, strengths-based and fidelity measures, and detailed descriptions on how to best apply SFBT to underscore the strengths, skills, and resources that clients may unknowingly possess.
Article
This book is about conducting research on the process and outcomes of the translation and implementation of evidence-based practices in social work. Its aims are to outline a strategy for conducting such research and to identify the infrastructure and resources necessary to support such research within the field of social work. Using the National Institutes of Health (NIH) Roadmap as a guide, the book describes the challenges of investigating the process and outcomes of efforts to translate and implement evidence-based social work practice. It begins with a general introduction to the topic of translation and implementation of evidence-based practice and its importance to the field of social work. It then moves to an examination of the methods for studying the effectiveness, dissemination, and implementation of evidence-based practices and the organizational context in which these activities occur in social work practice. It also describes the use of mixed-methods designs and community-based participatory research (CBPR) methods to address these challenges. It is unique in that it provides case studies of research on the translation and implementation in social work practice, identifies potential barriers to conducting such research, and offers recommendations and guidelines for addressing these barriers. The proposed strategy is founded on the principle and practice of cultural exchange between members of social worker-led interdisciplinary research teams and between researchers and practitioners. The outcome of such exchanges is the transformation of social work research and practice through the linkage between translational research and research translation.
Article
I was interested to read the paper on solution-focused brief therapy by Iveson (2002), and the commentary by Gopfert (2002). Solution-focused brief therapy is a valuable treatment approach within psychiatry, although the outcome research shows that other approaches are needed for some patients. G
Article
In the literature there is a lack of well-designed research on the effectiveness of Solution-focused brief therapy (SFBT) for adolescents having social or health difficulties. Besides, it remains unclear, whether the effectiveness of SFBT is similar across different subgroups of adolescents (e.g. foster care, clinical, general population etc.). The aim of the present study was to evaluate and compare the effectiveness of SFBT for adolescents in foster care and health care settings. The treatment manual, standardized and subjective measures, as well as a comparison group were used. The treatment group consisted of 81 adolescents, 44 of them were from foster care and 37 from health care settings. The comparison group (N = 52) was selected from the same two populations. The effectiveness of SFBT for adolescents was demonstrated by all employed measures. The comparison of adolescents from foster care and health care institutions showed no differences in most of the outcome measures with the exception of higher scores on the subjective client’s evaluation of therapeutic progress in foster care setting.
Article
Objective: We review all available controlled outcome studies of solution-focused brief therapy (SFBT) to evaluate evidence of its effectiveness. Method: Forty-three studies were located and key data abstracted on problem, setting, SFBT intervention, design characteristics, and outcomes. Results: Thirty-two (74%) of the studies reported significant positive benefit from SFBT; 10 (23%) reported positive trends. The strongest evidence of effectiveness came in the treatment of depression in adults where four separate studies found SFBT to be comparable to well-established alternative treatments. Three studies examined length of treatment and all found SFBT used fewer sessions than alternative therapies. Conclusion: The studies reviewed provide strong evidence that SFBT is an effective treatment for a wide variety of behavioral and psychological outcomes and, in addition, it may be briefer and therefore less costly than alternative approaches.