Advancing Corrections Journal: Edition #8-2019
STAFF DEVELOPMENT AND PRACTICE MODELS: WHEN WE GET BETTER OUR
CLIENTS GET BETTER
Thomas P. O’Connor, Ph.D., CEO, Transforming Corrections & Adjunct Professor,
Western Oregon University
Bradford Bogue, M.A., Director, Justice Systems Assessment and Training
Samantha Collins, M.A., LPC., Senior Consultant and Coach, Transforming
Corrections
Sorcha O’Connor, Undergraduate Student in Physics, University of Oregon
Abstract
This article describes how a team of juvenile probation officers in Yamhill, Oregon changed its culture and co-created a practice model. A practice model is a shared set of integrated evidence-based practices that an agency believes will result in better public safety outcomes. The practice-based evidence of staff was combined with a new integration of evidence-based practices called COVE. COVE stands for Coaching Options that are Versatile and Effective. The paper describes the COVE model. The Yamhill team used the National Implementation Research Network's three-part model of implementation to achieve implementation success. Early results indicate that staff and client wellbeing increased.
Introduction
This article tells the story of how a team of juvenile probation officers co-created a new context of wellbeing in their agency. The new context fostered individual and team growth and made it possible for the team to successfully implement a practice model. A practice model is a shared set of integrated evidence-based practices and principles that an agency believes will result in better public safety outcomes, when supported by the agency and followed with fidelity by its staff (Bogue & O'Connor, 2013). The practice model we describe combined the practice-based evidence of staff with a new and innovative integration of evidence-based practices called COVE. COVE stands for Coaching Options that are Versatile and Effective. Because the probation team used a three-box model of implementation to guide its work, we tell this story in three parts.
A farming analogy explains the three-box model of implementation. First, in order to implement something new you must till the soil or "enable the context". Every agency's current context is perfectly suited to producing what it already has. You must change the soil or context if you want something new to grow. Even the best seed will not grow in unprepared or rocky ground. Second, you must choose good seed, the right kinds of evidence-based practice that will work in your context. Third, to ensure a lasting result you must sow the seed in the right way with effective methods of implementation. Just throwing the seed on the ground will not work. When all three boxes (figure 1) are skillfully tended their contents multiply, resulting in tangible outcomes that benefit staff, clients, and the community (Cusumano & Preston, 2017). The National Implementation Research Network (NIRN) believes that agencies that make full use of implementation science and this model dramatically decrease implementation time and increase implementation success. Agencies go from implementing an average of 14% of an innovation after 17 years of work, to implementing 80% of the innovation after 3 years of work (Balas & Boren, 2000; D. L. Fixsen, Blase, Timbers, & Wolf, 2001; Green & Seifert, 2005; National Implementation Research Network, 2019e).
All agency sta members play an important role in our story, but we focus on John Lynch for
his central role in the narrative. John is the juvenile probation supervisor in the Yamhill County
Department of Community Justice in Oregon. John helps lead the department with his manager, Dana,
and director, Jessica. He supervises six probation ocers – Alfredo, Becky, Jackie, Jondee, Kati, and
Laurie - and works closely with the agency administrative sta – Vicki, Kamren, Lisa, and Sarah1. The
remaining characters in our story are Tom, Brad and Samantha, three of the authors of this paper
1 Yamhill County is one of 36 counties in the State of Oregon, which is in the Pacific Northwest of the United States. The county seat is McMinnville. The origin of the county name is probably from an explorer's name for a local Native American tribe, the Yamhill, who are part of the North Kalapuyan family of tribes. The full staff names are Dana Carelle, Jessica Beach, Alfredo Madrigal, Becky Neumann, Jackie Lee, Jondee Rivera, Kati Foster, Laurie Taylor, Vicki Wood (admin supervisor), Lisa Hanes, Kamren Weller, and Sarah Everett.
and the external coaches who assess, support, and challenge John and the Yamhill team2. The story
begins when John asks Tom about his work with Brad to help agencies create and implement their
own “practice model”.
Perhaps like John at the time, you are curious and asking yourself "What the heck is a 'practice model'?" Whether you are a probation or parole officer, or a correctional staff member working in a prison, you already have the basic elements of a practice model within you. Its roots can be found in your daily practices – the things you do to help your clients live well and crime free, satisfy your boss, judges, politicians, and members of the public, and prevent further crime. Your practice model emerges from the habitual ways you interact with fellow staff, complete paperwork, manage your caseload, and keep yourself motivated. It also comes out of the ways you build rapport with, assess, motivate, challenge, and sanction clients, or link clients to resources in the prison or community.
Even though you have all these practices, it is probably difficult for you to name which precise elements of your practice foster change. The complex nature of human beings means practices with people are never static. They evolve and shift depending on which colleagues you talk with, how you are feeling on a given day, the size of your caseload, which trainings you attend, failures and successes with your clients, and the constant flow of daily experience. You and your agency have all these practices, but you probably do not have a practice model. A practice model explicitly names the causal factors or elements that lead to client growth and it links these elements into a coherent whole. A practice model is shared, understood, agreed to, and practiced – with appropriate variation for role – by all the staff in an agency. A practice model is the common practice you believe has a direct impact on achieving your agency mission. It describes the practices that all staff should follow to prevent more crime and promote the social and human capital (rehabilitation) of people under supervision (Bogue & O'Connor, 2013). That is the kind of practice model John wanted to know more about.
Since 2012, Brad and Tom have been testing a method of helping correctional agencies develop and articulate a "home grown" practice model. Their method combines the "practice-based evidence" (PBE) of staff with the "evidence-based practice" (EBP) of research (figure 2). It is important for agencies to use EBP, but we also believe in the power of being a "reflective practitioner" (Schon, 1982) and helping staff to draw on their own PBE. Every day we interact with and try different things with our clients. This practice is influenced by EBP, but it is also unique to each of us and our context. The individual and collective practice in an agency gives us an ongoing and extremely relevant body of evidence to learn from.
2 Yamhill County hired Tom O'Connor, Brad Bogue and Samantha Collins, the first three authors of this paper, as consultants to help staff develop and implement a practice model. Tom and Brad are the originators of the COVE model explained in this paper. Given this, the authors have a personal, professional and financial investment in this paper.
For many reasons, we often fail to get feedback from or reflect on our own practice-based evidence. As a result, we do not learn from our practice. Think of your own typing skills. You probably type every day, but your typing skills have likely not increased much since you first learned how to type. Practice is not enough to grow in skill; it must be deliberate practice. When deliberate structures of feedback, reflection and coaching are built into our practice we improve in skill and our clients have better outcomes (Lambert, Whipple, & Kleinstäuber, 2018; Miller, Hubble, & Duncan, 2007). Brad and Tom draw on Bas Vogelvang's work in the Netherlands because his Building Blocks Method provides a way of including both PBE and EBP in the creation of a practice model. The Building Blocks Method enables staff to name, co-create, and test the elements of their practice model. The staff begin with a theoretical practice model that has different elements. The staff then use each element and gather their own data on whether its use was a "hit" (worked for the client) or a "miss" (did not work for the client)3. Over time the staff know if they should keep or drop that element from their model. This ensures each element of the practice model works in their unique context. It also helps those elements become shared elements of practice (Vogelvang, 2006, 2012; Vogelvang & Bogue, 2012).
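The hit/miss bookkeeping at the heart of the Building Blocks Method can be pictured as a simple tally per element. The sketch below is purely illustrative – the class, the element name, and the 50% keep-threshold are our assumptions, not part of Vogelvang's method:

```python
# Illustrative sketch of hit/miss tracking for practice-model elements.
# The element name and threshold are hypothetical examples.
from collections import defaultdict

class ElementTally:
    """Tally 'hit' / 'miss' outcomes for each practice-model element."""

    def __init__(self):
        # Each element starts with zero hits and zero misses.
        self.counts = defaultdict(lambda: {"hit": 0, "miss": 0})

    def record(self, element: str, outcome: str) -> None:
        if outcome not in ("hit", "miss"):
            raise ValueError("outcome must be 'hit' or 'miss'")
        self.counts[element][outcome] += 1

    def hit_rate(self, element: str) -> float:
        c = self.counts[element]
        total = c["hit"] + c["miss"]
        return c["hit"] / total if total else 0.0

    def keep(self, element: str, threshold: float = 0.5) -> bool:
        """Keep an element whose observed hit rate clears the threshold."""
        return self.hit_rate(element) >= threshold

# Example use: two hits and one miss for a hypothetical element.
tally = ElementTally()
tally.record("role_clarification", "hit")
tally.record("role_clarification", "hit")
tally.record("role_clarification", "miss")
```

In practice staff would also log the "roadblocks" and "finds" described in footnote 3; the point of the sketch is only that the method turns everyday practice into countable evidence for keeping or dropping each element.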
Importantly, the sta co-create their practice model in a “top-supported, bottom-up” approach. The
model is not built outside and then installed in the agency in a “top-down” manner. Implementing a
practice model is an adaptive change, it requires sta to develop new mindsets and new skills. Top
down methods do not work for adaptive challenges (Ronald A Heifetz, 1994). So, Brad and Tom act
less as the experts with the answer and more as coaches who help sta discover the right questions
and then their own answers. The sta are the experts in their own eld and context, and theirs is the
fundamental expertise that guides the project. Just as clients have autonomy over what they do, sta
have autonomy over what they do, and implementation must support that autonomy.
Box 1: Tilling the Soil – Enabling the Yamhill Context
The more John explored the idea with Tom, the more intrigued he was at the prospect of co-creating a practice model with his team of officers. At the same time, John was keenly aware that not everyone on his team would simply embrace this idea. John and the probation officers had a healthy skepticism of implementing EBP models. They had recently taken part in a largely failed attempt by all the county juvenile probation agencies in Oregon to implement the Effective Practices in Community Supervision (EPICS) model. EPICS is one of the "pre-packaged" models currently available in corrections (Labrecque, Smith, Schweitzer, & Thompson, 2013; Latessa, Smith, Schwietzer, & Labrecque, 2013). Another such model is the Strategic Training Initiative in Community Supervision known as STICS (Bonta et al., 2009;
3 Sta also track “roadblocks” and “nds”. A roadblock is something that prevents the use of an element. If this roadblock
cannot be overcome, then the element is useless in the model. Often, however, the roadblock can be overcome and then
the element can be tested to see if it is a “hit” or a “miss”. A nd is the use of an element that is not in the model but was a
“hit”. A process of testing the nd ensues to see if this element should be included in the practice model.
Enable the Context – Till the Soil
Set the right context by developing the kind of cultural story, dialogue, teamwork, values and assumptions that are necessary to support change. Create a holding environment that will safely allow people to engage in the hard work, anxiety, discord and loss that is part of the change process. Practice how to distinguish between adaptive and technical change. Develop yourself and your people, because systems develop to the extent that the people running the system develop. One must till the soil to get it ready (enabled) to receive the new seeds of change that come from combining practice-based evidence and evidence-based practice.
Bonta et al., 2010; Bonta et al., 2011). EBP models tend to be set before they reach staff and are thus designed for a top-down or expert installation method of implementation. The EPICS implementation process in Oregon had not done enough to "till the soil" in preparation for receiving the new seed. There were other implementation problems too. Many officers felt the model was mechanical and was not individualized to the unique needs and skills of either the officers or the clients. Despite having good content, the EPICS model was only partially implemented and has gradually faded. This and similar experiences of failed efforts to introduce something new had induced implementation fatigue and wariness in the team.
Furthermore, like many probation officers around the world, each officer in Yamhill tended to work in a "lone ranger" mode. Everyone's practice was largely independent. Staff got along in general, but they were not particularly effective at a team or collective level. Staff had points of unresolved conflict and role disagreement with each other, the management, and the administrative staff. On a personal level, John was struggling. He was a new supervisor who was used to being on an equal power level with his fellow officers. Now he felt caught between his new and old roles and the power dynamics of the office. As a committed member of the Yamhill community, it had been John's dream to work as a leader in his community. But now the internal and external discord in the office left him feeling discouraged, and he considered looking for work in another county. Clearly, before John could help much with meeting agency-level challenges, he had to meet his own. The challenge for John was to develop a more complex identity so he could be a more complex leader. He needed to grow internally and learn how to be more independent of what was going on in his surroundings.
Returning to our three-box metaphor, it was clear that the soil in Yamhill was not ready to receive a new seed no matter how good that seed might be or how expertly it might be planted. With Tom's coaching, John began to work on himself and with his team to prepare the soil. The challenge for the staff, as a collective, was to improve how the members communicated and resolved difficulties. The staff would have to develop a new collective story – one of greater trust, collaboration, feedback and reflection. The staff needed to create a new culture in their agency.
Creating a new cultural context presented an "adaptive" challenge for John and the team that could only be resolved through staff development (Helsing & Lahey, 2010). It was not a "technical" challenge that could be met by the usual technical means of problem solving. Technical challenges can be fixed by an expert or a top-down approach, because the problem to be solved is clear and the know-how for solving the problem already exists. Technical problems are external to us, so solving them does not require that we change. Yamhill's real problems were internal and were part of its "soil". Solving these
Adaptive change is "distressing for the people going through it. They need to take on new roles, new relationships, new values, new behaviors, and new approaches to work. Many employees are ambivalent about the efforts and sacrifices required of them. They often look to the senior executive to take problems off their shoulders. But those expectations have to be unlearned. Rather than fulfilling the expectation that they will provide answers, leaders have to ask tough questions. Rather than protecting people from outside threats, leaders should allow them to feel the pinch of reality in order to stimulate them to adapt. Instead of orienting people to their current roles, leaders must disorient them so that new relationships can develop. Instead of quelling conflict, leaders have to draw the issues out. Instead of maintaining norms, leaders have to challenge "the way we do business" and help others distinguish immutable values from historical practices that must go." (Ronald A. Heifetz & Laurie, 1997, p. 124)
adaptive problems depended on people changing and growing. John and the officers needed new mindsets and skills they did not already possess. John realized this and asked his superiors Dana and Jessica for their facilitative administrative support. Facilitative administration is a key implementation driver where the administration listens to identify emerging organizational needs and is willing to move things around to create hospitable environments that support new ways of working (National Implementation Research Network, 2019c). With that support John and the staff were well positioned to embark on a journey of development and adaptive change with Tom's guidance. In turn, Tom was guided by his colleagues Brad and Samantha.
Tom used his COVE developmental coaching model – described below – to assess, support and challenge John in his growth as a leader. The COVE process helped John to identify his "leading edge of development". He set a new goal for himself – "to have more mental and emotional bandwidth". He wanted to replace his anxiety-producing patterns with habits of internal space and freedom for clear thinking. Then John captured an image of what it would be like when he achieved his goal and began working on a plan to get there. John was a willing coachee and followed through on every aspect of the plan. Part of the plan was for John to work through a process called "Immunity to Change" (Kegan & Lahey, 2009). The Immunity to Change process proved to be pivotal. It enabled John to name, reflect on, and get new data that shifted the unconscious assumptions that had been guiding his old patterns of behavior. As a result, John made a significant change in his mindset and moved into a more complex level of personal development. This change dramatically increased his "mental and emotional bandwidth" and hence his capacity as a leader. John developed a more secure identity that was internally grounded and independent of how others saw or related to him. His identity was more "self-authoring" (Kegan & Lahey, 2009). John could now take a more nuanced approach to his leadership role and his relationships with others. The discouraging aspects of the agency's context remained, but John was no longer thrown off by them.
Inspired by John, the staff began working on their own development and the development of the team. Tom used the COVE coaching model to guide the team through a process of assessment, feedback, and new practice. Each staff member took the Kantor communication and team behavioral assessment. Tom facilitated team meetings to explore the results of this assessment and discern the "leading edge of development" for each person and the team. How could each person's and the team's unique pattern of strengths be further developed?
The Kantor assessment of structural dynamics (Kantor, 2012; The Kantor Institute, 2014) is a behavioral rather than a personality assessment. This means its findings can be used developmentally. It is difficult to change our personality; however, we can always change our behavior, especially with coaching and deliberate practice. The Kantor assessment helps people
Immunity to Change
The Immunity to Change coaching process, developed by Harvard School of Education researchers Bob Kegan and Lisa Lahey, is a profound method for helping people make the changes they desire but are unable to make. The process has been featured in the Harvard Business Review (Kegan & Lahey, 2001), the Oprah Winfrey Magazine (Brubach, 2009), and the New York Times (Singer, 2012). Peter Senge, author of The Fifth Discipline, says, "Immunity to Change is a wonderfully original approach to a familiar problem: why many crucial change efforts fail."
see the patterns in how they tend to interact, make meaning, and communicate with others at work and at home. It reveals the structures of communication that underlie team dynamics. Do you often give voice to ideas and thus bring momentum to a team? Or do you tend to listen more, support the best ideas, and bring them to completion? Perhaps your pattern is to respectfully oppose ideas on the table and help correct them? Or maybe you prefer to observe and help the team to see its own habitual patterns of behavior? The team part of the Kantor assessment reveals the actual patterns of interacting, making meaning, and communicating in use by the team. Often a team's patterns are limited and do not foster effective dialogue. Often too, the team patterns do not match up with the individual strengths of the team members. When this is happening, it means the cultural context is producing and reinforcing limiting patterns of dialogue. Seeing the existing patterns of dialogue allows for new patterns to be coached and practiced.
Dialogue is a way of thinking and talking collectively that fosters cultural change and growth (W. Isaacs, 1999). Its impact can be seen in the Virginia Department of Corrections. In 2010, when Harold Clarke became the director of the Virginia Department of Corrections (VDOC), he found a largely hierarchical agency with a somewhat punitive mindset. Harold believed VDOC needed to become a collaborative agency with a healing mindset. To achieve this cultural change Harold had his executive team trained and coached in the skills and practices of dialogue. VDOC then trained and coached its successive levels of staff until all the nearly 12,000 employees who run Virginia's prison, probation, and parole systems knew how to engage in dialogue. They gained a set of skills and a common language for thinking and talking together. These dialogue practices led to extensive cultural change and the department saw a steady realization of its goal to collaboratively create a healing environment. For three years running, Virginia has had the lowest recidivism rates across the 43 US states that report three-year reincarceration rates. In 2018, Virginia's rate of recidivism was 23%, the lowest point on a continuum that reached 64% in the state with the highest rates (Virginia Department of Corrections, 2018). Importantly, the cultural changes at the VDOC also resulted in steady improvement on all measures of staff engagement and wellbeing (Clarke & Williams, 2018).
People like Bill Isaacs, David Bohm, and David Kantor have helped us understand how a "dialogue" is different from other forms of conversation such as a debate or a discussion (Bohm, 1996; W. Isaacs, 1999; Kantor, 2012). To have a dialogue, everyone must have psychological safety, an equal voice, and be willing and able to suspend their assumptions. So, while dialogue is possible, it does not come easily. It takes training, feedback, coaching and deliberate practice.
Dialogue
"Dialogue is a skillful way of talking and thinking together that establishes a common meaning amongst a group of people. The spirit of Dialogue is to understand rather than to convince, and the process changes the ground out of which the various relationships arise. Professional Dialogue is a transparent way of learning together and humanising an organisation or community. Through Professional Dialogue organisations and communities are able to engage the collective intelligence of the participants to make better decisions, and to realise the creative opportunities inherent in any problem. The ongoing practice of Professional Dialogue at a systemic level generates sustainable strategic and operational change. In a Dialogic Organisation or a Dialogic Community this is underwritten by the ongoing regeneration of its culture, including shifting historically-stuck patterns of behaviour, for the constructive benefit of the whole." Peter Garret, Academy of Professional Dialogue
Tom and Samantha provided the training, feedback and coaching. The Yamhill team did the deliberate practice. The officers began using the skills of dialogue to think collectively and have more effective and productive meetings. Each staff member knew their own profile on the Kantor assessment, as well as how it compared to their colleagues'. Taking a strengths-based approach made it a fun, insightful, and challenging process for everyone. People gradually began to work safely through points of discord and reach a place of greater trust, vulnerability, and cohesion. Everyone had a common language in which to talk collectively about their own adaptive challenges and help clients with theirs. Yamhill was becoming a "dialogic organization" with "dialogic leadership" – an agency that is skillful at co-creating new meaning to address the ever-evolving and complex needs of the agency and its clients (W. N. Isaacs, 1999). The COVE coaching process (explained in full below), aligned with the Kantor process, enabled the officers to co-create a new cultural context that was open to new seeds and ready to take on the hard work of developing a practice model.
Box 2: The Right Seed for Yamhill
As with enabling the context, developing a practice model takes work. There are common elements to every practice model, however, that come from the evidence-based literature of our field. Agencies need to develop their own practice model, but it is not necessary or wise to leave them to their own devices to reinvent the wheel. Brad and Tom spent years collaborating with teams of probation officers and supervisors in San Diego County in California, and in Ramsey, Anoka, Dakota, DFO (Dodge, Filmore and Olmsted), and Isanti Counties as well as the Department of Corrections in Minnesota, to identify the elements field staff felt were necessary in a practice model (Bogue, Diebel, & O'Connor, 2008; Bogue & O'Connor, 2015). They then used their 60+ years of combined criminal justice experience to synthesize a set of core evidence-based elements into a coaching practice model that seeks to evoke rather than install change in the coachee. Brad and Tom called their coaching model COVE, which stands for Coaching Options that are Versatile and Effective. COVE integrates, sequences and blends a set of evidence-based practices from four existing literatures: 1) role clarification; 2) Motivational Interviewing; 3) cognitive-behavioral skill practice; 4) coaching and learning. The model is versatile and effective in any kind of coaching relationship such as manager to supervisor, supervisor to staff, staff to client, or peer to peer. We think the model works if you have five, fifteen, or fifty minutes to coach. We also believe the model could work in all human service contexts.
In terms of an agency's practice model, COVE is like the chassis of a car. Every car has a chassis or fundamental framework that provides the basic four-wheel structure of a car (figure 3). A Ferrari looks very different from a Ford, a Toyota, a Chevy or a Mercedes. Yet each of these cars has a similar chassis. Each agency's practice model should have a distinctive body and look that matches its unique context, resources and needs. At the same time, each agency's model will have a core underlying chassis or framework that is similar. The COVE or COVE-like part of an agency's model acts as a chassis and ensures the agency is following evidence-based practices. COVE saves an agency a lot of time because it gives it a framework on which to build its own unique practice model.
The literature on staff development suggests that coaching become an integral part of every probation, parole, and prison system (Joyce & Showers, 2002). We take coaching for granted in sports, because we know it is key to ongoing growth and success. Every top athlete or team has a coach. Few of us in corrections have a coach. Coaching is an evidence-based practice for leaders, but most leaders do not coach (Goleman, 2000). Coaching is also one of NIRN's eight "implementation drivers", and it is often the most underused driver (National Implementation Research Network, 2019b; National Implementation Research Network & Fixsen, 2004). A recent article usefully contrasts two metaphors to explain the role of a probation officer: the probation officer as referee versus the probation officer as coach (Lovins, Cullen, Latessa, & Jonson, 2018). We agree with the premise of this article. The probation/human service relationship is best characterized as a coach helping players to play instead of a referee blowing the whistle when the players fail to follow the rules. Making this shift requires us to develop a coaching mindset and set of coaching skills (Bungay Stanier, 2016; Keller, 2001). If coaching were a routine part of how we develop ourselves as leaders, our staff as change agents, and our clients as good citizens, we think it would have a significant impact on client recidivism and success rates.
COVE has four broad sections with 14 distinct elements (figure 4). Each element sets up and flows into the next element, so the 14 elements form a single structure. However, COVE is not a linear model. The coach can and should choose the right point of entry into the model depending on the client's context and how much time they have. Every conversation between a probation/prison officer and a client takes place in what we call a "discretionary space". The staff member and client have complete discretion to take their conversation in any direction. This discretion is necessary because of the complex nature of human interaction, the coaching relationship, and the variation involved in the desistance journey. Yet staff also need a way to bring structure to the space, for without some structure both the coach and coachee can easily become confused and lose the flow of meaning in the relationship. COVE structures and maintains the discretionary space. Once the flow of COVE is mastered, the coach knows where they are and what is happening in any conversation. The coach can thus more easily match the conversation to the client's path of rehabilitation or desistance.
Section 1: Role Clarification is based on Chris Trotter’s work in Australia (Trotter, 2006; Ward &
Trotter, 2012). There are two elements in this section. First, the coach articulates the specific context
of the conversation and what their role or job will be in the conversation. Second, the coach asks the
coachee to talk about how they see their role or job in the conversation. If done well, this section
helps the coach and coachee to establish a partnership. It signals to the coachee that they have an
active role and fosters agency. It also helps to distinguish the coaching conversation from other kinds
of conversation such as supervision, condition setting/monitoring, or sanctioning.
Section 2: Motivational Interviewing (MI) is based on Miller and Rollnick’s work. We divided this
section into the four processes of MI: 1) Engaging; 2) Focusing; 3) Evoking; and 4) Planning. We then
weave six elements into these four processes that come from Michael Bungay Stanier’s work on “The
Coaching Habit”. Later in the model we use a seventh element from Bungay Stanier (Bungay Stanier,
2016). Bungay Stanier wants people to adopt what he calls a “coaching mindset”. This allows coaching
to take place in the very busy lives of managers and staff who often only have five or ten minutes to
have a conversation. Bungay Stanier recommends coaches ask seven open-ended questions in service
of “being curious”. We discovered that six of the seven questions, when used in an MI adherent manner,
help staff to move through the four processes of MI. The six questions are a great fit with the spirit,
skills and processes of MI. Staff can put the questions in their own words provided they keep the intent
of each question and keep it open-ended.
The rst MI process, Engaging, is
fostered by asking the rst two
of Bungay Stanier’s questions: 1)
“What is on your mind?”; and 2)
“What else is on your mind?” These
two big open-ended questions
allow the coachee to express what
is on the top of their mind and then
go deeper. The open-ended nature of the questions allows the coachee to choose the direction of the
conversation. The answers let the coach know (assess) what is going on.
The second MI process, Focusing, begins when it is time to ask Bungay Stanier’s third question: 3)
“What is the real challenge here for you?” By engaging, the coach has just helped the coachee to
express everything that is on their mind. This next question helps the coachee to sift through and find
the real issue of concern and growth. In MI terms, question 3 helps the coach and coachee to identify
the right target behavior or mindset to focus on for change. The desistance journey often represents
a series of adaptive challenges for the client. In asking this question the coach is helping the client to
identify their adaptive challenge or leading edge of development.
The third MI process, Evoking, emerges when you ask the next two of Bungay Stanier’s questions:
4) “What do you want?” and 5) “How can I help?” Asking “What do you want?” helps the coachee to
imagine what it will be like when they have met their challenge. Another way of asking this question
is to say “What will it be like when you meet your challenge?” It is often difficult for people to answer
this question because they cannot imagine things being different. Once they get a glimpse of what it
would be like, it is very motivating. When the coachee has a vision of what they want, the next element
simply asks them “How can I help you get there?” This second evoking question helps to keep the
focus on the coachee doing the work while also giving the coach great feedback on how to play their
part in the relationship.
The fourth process, Planning, starts when it is appropriate to ask Bungay Stanier’s sixth question:
6) “If you are saying yes to this, what are you saying no to?” Most often we do not plan, we add.
We think that change is about adding something new to our plate. Substantial or adaptive change,
however, always involves a loss of some kind. In choosing to say yes to new mindsets and behaviors,
we are implicitly choosing to say no to old mindsets and behaviors that have worked for us in some
way. Bungay Stanier’s sixth question helps the coachee to delve into this key aspect of change and
enter real planning. The seventh of Stanier’s questions is asked at the end of COVE and we address it
in Section 4.
Section 3: Cognitive-Behavioral Skill Rehearsal. This helps the coachee to begin practicing the new
mindset or skills they want in their life. We move out of the purely intellectual process of change and
start to put our plan into action. This is the point where MI stops because the person has resolved
their ambivalence about change. We know we are at this point when we hear a lot of change talk
in the conversation. COVE uses cognitive-behavioral literature to identify four elements or steps to
support active change (Andrews, 1994; Burrell, 2008; Bush, Glick, & Taymans, 2002; Public Safety
Canada, 2012).
First, the coach discusses practicing a skill with the coachee that relates to their plan and then
describes that skill. The skill usually involves putting a new mindset into practice such as
collaborating, listening, having difficult conversations, or saying no to certain things. Second, the
coach models the skill in question. It is very important for the coach to model the skill because the
coachee gets to see the skill in action. This helps them form a mental image of the skill. Third, the
coachee practices using the skill and gives themselves some feedback on how they did. Then the
coach, with permission, can give the coachee additional feedback on what went well with the skill
and anything they may have missed. Fourth, the coach asks the coachee to reflect on how this skill
might transfer to other areas of their life. All four of these cognitive-behavioral steps in this order are
important. These steps can often be completed in five or ten minutes.
Section 4: Learn and Carry. This section has two elements. First, the coach asks Bungay Stanier’s
seventh and final question – 7) “What was most useful to you today?” This gives the coachee a
moment to reflect on the session. Learning research tells us that the act of recalling something helps
to move it from our short-term into our long-term memory. Recall makes it more likely we will retain
the learning that just took place (Brown, Roediger III, & McDaniel, 2014; Coakley, 1990; Oakley, 2014;
Oakley & Sejnowski, 2018). The answer to this question also gives the coach some important feedback
about the session. We have been surprised at the high percentage of people who say the skill practice
was the most helpful.
The second element in this section (carry) transitions to what will happen after the coaching session.
The coach asks the coachee to identify a simple next step that will keep the work they have just
done moving forward. People often respond with a full project, but here we are literally talking about
just the next step in the project. For example, if someone wants to be a better listener, their next
step might be to ask a friend to meet for coffee so they can practice listening. The next step is not to
practice listening, it is to ask a friend to go for coffee. Simple next steps keep the momentum going
because people feel they are doable. This is a key part of David Allen’s system for getting things done
(Allen, 2015).
This 14-step integration of a whole range of evidence-based practices was the seed the Yamhill staff
decided to plant into their newly prepared soil or context. The team was ready to work in the third
box. In doing so we had to be careful because a growing body of international research, often referred
to as Implementation Science, has found that most efforts to implement evidence-based practices
fail (Balas & Boren, 2000; D. L. Fixsen et al., 2001; Green & Seifert, 2005; National Implementation
Research Network, 2019e). This pattern persists across different fields such as health care, education,
business, criminal justice, childcare, etc. (D. Fixsen, Naoom, Blase, Friedman, & Wallace, 2005).
Well-meaning efforts at implementing EBP often fail because agencies do not take a systematic
approach to the implementation process. Rather, we tend to follow the “train and pray” method –
send a bunch of people to a training and pray that something will be different when they come back.
Yamhill decided to take a systematic approach to implementation. It did so by using Vogelvang’s
building blocks method and the five parts of NIRN’s “active implementation” framework (National
Implementation Research Network, 2019a)4.
Box 3: Sowing the Seed in the Right Way
By working on their own development and dialogue skills as a team the officers had created a new
context that opened fresh horizons and possibilities. Everyone had become curious about developing
a practice model. John and the officers were ready to use their practice-based evidence to gather data
on whether each of the 14 COVE elements should be a part of Yamhill’s practice model. Tom did some
training on how Yamhill could use COVE as a starting point and chassis for their model. The team
members decided to begin by trying out the different COVE elements to see if they were a “hit” or a
“miss” with their clients. They also began documenting any “roadblocks” that prevented them from
using any of the elements. They kept track when they had any “finds”. Finds were things staff did
that were a “hit” with clients but were not in COVE. The evidence from their practice convinced the
staff they were on to something. When they used an element with a client it was usually a hit. But
they also found that they were not using, and could not use, some of the elements. For example, they
struggled with the role clarification and cognitive-behavioral steps. It was also difficult to follow the
flow of the 14 elements. The positive response from clients, however, was motivating. It increased the
officers’ motivation and desire to learn more and grow in their skills. This spilled over into staff
meetings and became a collective desire to learn as a team. Because the team had created a level of
psychological safety everyone gradually became more willing to enter the discomfort, vulnerability
and struggle of real or adaptive learning.
Now it was time for another external coach, Samantha Collins, to get involved and take the
implementation process to an even more rigorous level. Samantha is a colleague of Tom’s at
4 The ve parts of NIRN’s active implementation framework are: 1) have “usable innovations”; 2) take a “stages of
implementation” approach; 3) use all eight of the “implementation drivers”; 4) develop an implementation team” that
understands and can use the science; and 5) complete “improvement cycles” based on the Plan, Do, Study, Act (PDSA)
process
Transforming Corrections and a skilled coach. She has experience in using implementation science
to help staff reach competency in EBP innovations. Samantha is also a member of the Motivational
Interviewing Network of Trainers (MINT). The goal was for Yamhill to have its own sustainable
practice model. This meant the Yamhill staff needed to become competent in using their model
independently of Tom, Brad, and Samantha. John stepped up again and committed to becoming
competent in using and coaching COVE. This would enable him to coach the staff to competency. The
six probation officers also stepped up. They wanted to learn to use COVE in an even more competent
way with their clients. There are eight “drivers of implementation” (National Implementation Research
Network, 2019d)5. One of the drivers is having a fidelity or performance measurement system that
lets you determine when staff are competent in using an agency practice model. Another driver is
having a decision-support data system to track staff progress toward competency. Samantha showed
Yamhill how to put both drivers in place.
As you have seen above, there are many moving parts in COVE (14 elements) and many advanced
mindsets and skills to master. For example, it is impossible to navigate the MI section embedded
in COVE unless you can move from a “fixing” mindset to a coaching mindset. You also must be
able to embody the four spirits of MI - entering into relationships with clients that are based on 1)
acceptance; 2) partnership; 3) evocation; and 4) compassion. You must also master the four MI skills
of open-ended questions, affirmations, reflections, and summaries (OARS). Learning all this is a
collective endeavor and necessitates an enabling context. It also calls for a lot of deliberate practice, a
community of practice, assessment and feedback on skills, and successive coaching of skills according
to measurable criteria of fidelity. Samantha worked on all these fronts with John and the officers
Laurie, Jackie, Katie, Alfredo, Becky, and Jondee. Competency began to emerge when the team saw
further positive results with their youth. There were deeper conversations and greater vulnerability,
trust, transparency, and feedback with clients. Both staff and clients had a sense of progress. For
example, Becky got this text from a young man, age 19, after one of her sessions: “Hey I just wanted
to say thank you for the last meeting we had. I felt like I really got a lot out of it just sitting there and
finding ways to help myself, so I just wanted to say thank you for that.”
At rst, John was nervous about having Samantha sit in or listen to tapes of him coaching the sta
using COVE. The sta were also nervous about having John sit in or listen to tapes of them coaching
clients. Everyone was nervous about doing role plays with each other or sharing tapes of themselves
with clients in their regular community of practice meetings. During one community of practice, Kati
overcame a personal barrier and was the rst to practice her skills in front of all her colleagues. The
dam of vulnerability burst, and others followed suit. The sta have now become comfortable with
sharing tapes of their conversations with clients, doing role plays with each other and having John
and/or Samantha sit in with them in client sessions. They have created a skilled learning community
where it was safe and therefore possible to learn. Although the ocers used the same chassis, their
unique approaches and individuality became more pronounced and celebrated. This allowed for more
responsivity in matching clients to sta.
There have been many staff outcomes from this collective process of dialogue, assessment, coaching
5 The eight drivers of implementation are: 1) leadership that knows the difference between leading adaptive and technical
change; 2) staff selection; 3) staff training; 4) staff coaching; 5) systems intervention; 6) facilitative administration; 7)
decision-support data systems; and 8) a fidelity or performance measurement system.
and practice. Figure 5 shows the progress of staff in learning to use all 14 COVE elements in a session
with a client. We set the threshold of competency as being able to demonstrate the use of 75% of the
14 COVE elements on two occasions with clients. Each officer took a different amount of time before
they were ready to submit a tape. John (shown with the red line) was the first person to send in a tape.
He was also the first to achieve competency. The chart makes it look like Becky (orange line) was the
first to achieve competency, but she started at a later point in time than John.
Figure 5 shows that every staff member begins their journey at a different level and progresses in
a unique way. It took John four improvement cycles to reach competency for the first time. It took
Katie six cycles, Alfredo six cycles, Jackie four cycles, and Becky three cycles. Laurie is growing in her
skills, and Jondee, the newest officer, is getting ready to submit her first tape. Overall, the average
trend clearly depicts a growth in mastery. To enable this growth Samantha listened to John’s tapes
and gave him written feedback according to a set of fidelity criteria for each of the 14 elements. These
fidelity criteria were created and refined by Brad, Tom, and Samantha with the help of coaches from
Brad’s company6. After John read his feedback, he would meet with Samantha to identify his growing
edge and practice the COVE skills he needed to develop. John followed the same process in reviewing
recorded staff sessions, as Samantha coached him on how to do so with fidelity. John and the officers
found the coaching particularly helpful because the sessions were individualized to each person’s
unique strengths and challenges.
Figure 6 shows staff progress in one of the measures we used to track MI competency. In MI, it is
important to have more reflections than questions to convey accurate understanding and empathy,
and to reflect change talk. These skills are critical in helping people change. As we explained, COVE
contains the seven open-ended
6 Brad Bogue is the director of Justice Systems Assessment and Training (J-Sat).
questions articulated by Bungay Stanier. To meet MI standards, staff must learn to blend in a lot of
reflections with those questions to help the coachee go deeper into their answers and find internal
motivation. The international standard for competency in MI is a ratio of two reflections for
every one question. Because COVE contains other elements in addition to MI, many of which are
open-ended questions, we set the COVE-MI competency level at a ratio of one reflection for every one
question. Once again, you can see that the overall trend for all staff is toward MI-COVE competency
over time. Again, each staff member starts their journey at a different level and takes a unique path
forward.
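The two ratio thresholds described above can be expressed as a simple check on a coded session tape. The 2:1 MI standard and the 1:1 COVE-MI level come from the text; the function names themselves are our own illustration.

```python
# Sketch of the reflection-to-question ratio check, assuming a session tape
# has been coded into counts of reflections and questions.
MI_STANDARD = 2.0       # international MI standard: 2 reflections per question
COVE_MI_STANDARD = 1.0  # COVE level, relaxed for its built-in open questions

def reflections_per_question(reflections: int, questions: int) -> float:
    """Ratio of reflections to questions coded from a session tape."""
    return reflections / questions

def meets_cove_mi_standard(reflections: int, questions: int) -> bool:
    """True when the session reaches the 1:1 COVE-MI competency ratio."""
    return reflections_per_question(reflections, questions) >= COVE_MI_STANDARD

print(meets_cove_mi_standard(12, 10))  # True: 1.2 reflections per question
```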
The nal competency
measure - Figure 7 -
shows sta gains in a
self-assessment of three
behaviors key to good
coaching. The Coaching
Behaviors Inventory (CBI)
is a validated instrument
that allows people to
self-assess their use of
assessing, challenging and
supporting behaviors with
their clients (Noer, Leupold,
& Valle, 2007; Van Dyke &
Naoom, 2011). We have CBI
data for four of the ocers. The threshold for adequate use of these three behaviors is a score of
70%. You can see that at the beginning of this project the sta average fell below the threshold on all
three measures. Now, the sta average is above on all three measures. Before the project, sta were
strongest in supporting behaviors, and now their assessing and challenging behaviors have largely
caught up.
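As a hypothetical illustration of the CBI threshold check: the 70% cut-off and the three behavior names come from the text, but the scores below are invented example data, not the Yamhill results (the article reports only that the staff averages moved above the threshold).

```python
# Invented example data: the article does not publish raw CBI scores, so the
# numbers below are placeholders for four officers; only the 70% threshold
# and the three behavior names come from the text.
CBI_THRESHOLD = 70.0

def behavior_averages(scores: dict[str, list[float]]) -> dict[str, float]:
    """Average each coaching behavior across the officers with CBI data."""
    return {behavior: sum(vals) / len(vals) for behavior, vals in scores.items()}

def all_above_threshold(averages: dict[str, float]) -> bool:
    """True when every behavior's staff average meets the 70% threshold."""
    return all(avg >= CBI_THRESHOLD for avg in averages.values())

example_scores = {
    "assessing":   [72, 75, 68, 80],
    "challenging": [71, 74, 70, 77],
    "supporting":  [82, 85, 79, 88],
}
print(all_above_threshold(behavior_averages(example_scores)))  # True
```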
Box 3 on how to sow the seed reveals the complexity of the implementation process.
Implementing evidence-based practices takes a great deal of work. This is the reason most
implementation efforts fail despite the best of intentions. Take heart, however, from knowing it is
possible to till the soil, and that the seeds and sowing practices are available. One sowing tool we use
to help track all the moving parts of an implementation project is an online tool called the Stages of
Implementation Completion (SIC). Dr. Lisa Saldana and her colleagues at the Oregon Social Learning
Center developed the SIC. The SIC lays out eight implementation stages with a total of 46 universal
steps that need to be accomplished in any implementation project. Agencies can dramatically increase
the likelihood of successful implementation by completing most of the 46 universal implementation
activities (Chamberlain, Brown, & Saldana, 2011; Saldana, 2014; Saldana, Chamberlain, Wang, & Brown,
2012).
We programmed the SIC to map its 46 activities to equivalent activities for implementing COVE.
Whenever Yamhill completed any of the 46 steps, we entered that date into the SIC and checked that
activity off the list of things to be done. So far Yamhill has completed 80% of the SIC activities. This
tells us Yamhill has a 91% probability of a full and sustainable implementation. We base this 91%
probability on data from about 1,000 other EBP implementation sites that are using the SIC. It took
Yamhill over a year to till its soil and commit to implementing COVE. From that point it only took
Yamhill four months to have its first COVE session with a youth under supervision. From there it took
John eight months to reach competency in COVE and it took the first probation officer 11 months.
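The date-tracking step above can be sketched as a small completion tracker. The 46 universal steps and the 80% figure come from the text; the step names and dates below are placeholders. We deliberately do not compute the 91% probability here, because it comes from the SIC's own data across roughly 1,000 implementation sites, not from a formula in the article.

```python
# A minimal sketch of SIC completion tracking, assuming each completed step
# is recorded as a step name mapped to the date it was finished.
SIC_UNIVERSAL_STEPS = 46

def completion_percentage(completed: dict[str, str]) -> float:
    """Percent of the 46 universal SIC steps with a completion date entered."""
    return 100 * len(completed) / SIC_UNIVERSAL_STEPS

# 37 dated steps out of 46 is roughly the 80% Yamhill had reached
dated_steps = {f"step_{i}": "2019-01-01" for i in range(1, 38)}
print(round(completion_percentage(dated_steps)))  # 80
```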
Learn and Carry: Summary and Next Steps
The three-box model of implementation posits that if you skillfully till the soil, choose the right
seed for your context, and sow the seed in the right way you will multiply benefits for staff, clients
and communities. This paper attempts to demonstrate skillful work in all three of these boxes and
introduces COVE as an innovative new integration of evidence-based practices. COVE is valuable
because it helps to simplify, structure and enhance the complex discretionary space between staff and
people under criminal justice supervision. Clearly this paper draws on a multiplicity of moving parts. It
is a story that requires complexity. Our goal is to articulate the meaning inherent in that complexity.
As Oliver Wendell Holmes is reputed to have said, “For the simplicity on this side of complexity, I
wouldn’t give you a fig. But for the simplicity on the other side of complexity, for that I would give you
anything I have.”
One next step in our story is to evaluate whether this work will result in positive impacts for staff,
clients and the community. In this paper we have seen the beginnings of a positive impact on staff
and clients. However, we know we need to do much more to measure this work’s concrete effects.
Has Yamhill’s work resulted in more positive school, family, attitude, peer, and non-substance use
outcomes for its youth under supervision? Has it reduced recidivism and fostered the desistance
process? We cannot yet say. We are currently working with the research unit at the Oregon Youth
Authority to develop an evaluation strategy to study the return on investment from the Yamhill
project. We would like to examine the overall outcomes and recidivism trends in Yamhill over the three
years prior to, and since, we began. Our hypothesis is that the cultural change achieved by the Yamhill
staff, and the skills they have developed, will have an impact on aggregate outcomes for the agency.
The Yamhill staff, however, report great satisfaction with the outcomes of the process. They feel
they have benefited personally and professionally, and their story continues to evolve. The officers
show no signs of slowing down. Instead, they have asked John to coach them on their “leading edge
of development” or adaptive challenges as leaders. This coaching is in addition to coaching for COVE
competency. Inspired by Yamhill, two other juvenile probation agencies in Oregon - Malheur and
Tillamook County - are in the early phases of co-creating their own practice model with the help of
dialogue coaching and COVE. Several counties in Minnesota are using COVE elements as part of their
chassis, as are mental health and probation staff in Hawaii and California. We hope the epilogue to this
story will be the scaling up of these implementation practices and COVE in multiple jurisdictions to
benefit staff, clients and communities.
LIST OF REFERENCES
Allen, D. (2015). Getting Things Done: The Art of Stress-Free Productivity. New York: Penguin Books.
Andrews, D. A. (1994). A Social Learning and Cognitive Approach to Crime and Corrections: Core
Elements of Evidence-Based Correctional Intervention. Training Protocol. Carleton University -
Department of Psychology. Ottawa.
Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care improvement. In J.
Bemmel & A. T. McCray (Eds.), Yearbook of Medical Informatics 2000: Patient-Centered Systems
(pp. 65-70). Stuttgart, Germany: Schattauer Verlagsgesellschaft.
Bogue, B., Diebel, J., & O’Connor, T. (2008). Combining Officer Supervision Skills: A New Model
for Increasing Success in Community Corrections. Perspectives: the Journal of the American
Probation and Parole Association.
Bogue, B., & O’Connor, T. (2013). A New Practice Model for Probation and Parole. Justice Systems
Assessment and Training. Boulder, Colorado.
Bogue, B., & O’Connor, T. (2015). The San Diego Risk and Resiliency Check Up II. Justice Systems
Assessment and Training. Boulder, Colorado.
Bohm, D. (1996). On Dialogue. New York: Routledge Classics.
Bonta, J., Bourgon, G., Rugge, T., Scott, T.-L., Yessine, A., & Hons, J. (2009). Strategic Training Initiative in
Community Supervision (STICS): Audiotape Coding Instruction Manual. Corrections Research Unit
Public Safety Canada.
Bonta, J., Bourgon, G., Rugge, T., Scott, T.-L., Yessine, A. K., Gutierrez, L. K., & Li, J. (2010). The Strategic
Training Initiative in Community Supervision: Risk-Need-Responsivity in the Real World 2010-01.
Public Safety Canada.
Bonta, J., Bourgon, G., Rugge, T., Scotte, T.-L., Yessine, A., Team, S. D., & Canada, P. S. (2011). STICS
(Strategic Training Initiative in Community Supervision): Participant Training Guide. Ottawa,
CANADA: Public Safety Canada.
Brown, P. C., Roediger III, H. L., & McDaniel, M. A. (2014). Make It Stick: The Science of Successful
Learning.
Brubach, H. (2009). You Don’t Need More Willpower. O. The Oprah Magazine, January.
Bungay Stanier, M. (2016). The Coaching Habit: Say Less, Ask More & Change the Way You Lead
Forever. Toronto: Box of Crayons Press.
Burrell, W. D. (2008). Cognitive Behavioral Tactics: The Next Phase for Evidence-Based Practices.
Community Corrections Report, 15(2), 17.
Bush, J., Glick, B., & Taymans, J. (2002). Thinking for a Change: Integrated Cognitive Behavior Change
Program Longmont, Colorado: National Institute of Corrections, NIC Academy.
Chamberlain, P., Brown, H. C., & Saldana, L. (2011). Observational measure of implementation in
community-based settings: The Stages of implementation completion. Implementation Science,
6, 116.
Clarke, H., & Williams, S. (2018). Dialogue and a Healing Environment in the Virginia Department of
Corrections. Paper presented at the The World Needs Dialogue: Academy of Professional Dialogue
Conference, London.
Coakley, C. (1990). Creativity in prison. The Yearbook of Correctional Education (1990), 105-111.
Cusumano, D., & Preston, A. (2017). Thoughtful and Purposeful Implementation: Break Out Session.
Retrieved from http://ceelo.org/wp-content/uploads/2017/06/IMPLEMENTATION_breakout.pdf
Fixsen, D., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A
Synthesis of the Literature. University of South Florida, Louis de la Parte Florida Mental Health
Institute, The National Implementation Research Network (FMHI Publication #231). Tampa, FL.
Retrieved from https://nirn.fpg.unc.edu/resources/implementation-research-synthesis-literature
Fixsen, D. L., Blase, K. A., Timbers, G. D., & Wolf, M. M. (2001). In Search of Program Implementation:
792 Replications of the Teaching-Family Model. In G. A. Bernfeld, D.P. Farrington and A.W. Leschied
(Eds.), Offender Rehabilitation in Practice (pp. 149-166): John Wiley & Sons, Ltd.
Goleman, D. (2000). Leadership that Gets Results. Harvard Business Review, 78-90.
Green, L. A., & Seifert, C. M. (2005). Translation of research into practice: why we can’t “just do it”.
Journal of American Board of Family Practice, 18(6), 541-545.
Heifetz, R. A. (1994). Leadership Without Easy Answers. Boston: President and Fellows of Harvard
College.
Heifetz, R. A., & Laurie, D. L. (1997). The Work of Leadership. Harvard Business Review, 75(1), 12.
Helsing, D., & Lahey, L. L. (2010). Unlocking Leadership Potential: Overcoming Immunities to Change. In
K. A. Bunker, D. T. Hall, & K. E. Kram (Eds.), Extraordinary Leadership: Addressing the Gaps in Senior
Executive Development. San Francisco, CA: Jossey-Bass.
Isaacs, W. (1999). Dialogue and the Art of Thinking Together: A Pioneering Approach to
Communicating in Business and in Life. New York: Currency.
Isaacs, W. N. (1999). Dialogic Leadership. The Systems Thinker, 10(1), 1-5.
Joyce, B., & Showers, B. (2002). Student Achievement Through Staff Development (3rd ed.). Alexandria,
VA: Association for Supervision and Curriculum Development.
Kantor, D. (2012). Reading the Room: Group Dynamics for Coaches and Leaders. San Francisco, Ca:
Jossey-Bass.
Kegan, R., & Lahey, L. (2009). Immunity to Change: How to Overcome It and Unlock the Potential in
Yourself and Your Organization. Boston, MA: Harvard Business School Publishing Corporation.
Kegan, R., & Lahey, L. L. (2001). The Real Reason People Won’t Change. Harvard Business Review
Onpoint(November), 81-93.
Keller, V. F. (2001). Star Performance: A Coaching Perspective. Keller & Company. Norwalk,
Connecticut.
Labrecque, R., Smith, P., Schweitzer, M., & Thompson, C. (2013). Targeting antisocial attitudes in
community supervision using the EPICS Model: An examination of change scores on the Criminal
Sentiments Scale. Federal Probation, 77(1), 1-9.
Lambert, M. J., Whipple, J. L., & Kleinstäuber, M. (2018). Collecting and delivering progress feedback: A
meta-analysis of routine outcome monitoring. Psychotherapy, 55(4), 520-537.
Latessa, E. J., Smith, P., Schweitzer, M., & Labrecque, R. M. (2013). Evaluation of the effective practices
in community supervision model (EPICS) in Ohio. School of Criminal Justice, University of
Cincinnati.
Lovins, B. K., Cullen, F. T., Latessa, E. J., & Jonson, C. L. (2018). Probation Officer as a Coach: Building a
New Professional Identity. Federal Probation, 82(1), 15-19.
Miller, S., Hubble, M., & Duncan, B. (2007). Supershrinks: What’s the Secret of Their Success?
Psychotherapy In Australia, 14(4), 14-22.
National Implementation Research Network. (2019a). Active Implementation Frameworks Retrieved
from https://nirn.fpg.unc.edu/module-1
National Implementation Research Network. (2019b). Competency Drivers. Retrieved from
https://nirn.fpg.unc.edu/module-2/competency-drivers
National Implementation Research Network. (2019c). Facilitative Administration. Retrieved from
https://nirn.fpg.unc.edu/module-2/facilitative-administration
National Implementation Research Network. (2019d). Framework 3: Implementation Drivers. Retrieved
from https://nirn.fpg.unc.edu/module-1/implementation-drivers
National Implementation Research Network. (2019e). Topic 2: Research and Rationales - Why are
Implementation Teams Important?
National Implementation Research Network, & Fixsen, D. L. (2004). Additional Evidence for
Consultation & Coaching. In T. N. I. R. N. (NIRN) (Ed.), The National Implementation Research
Network (NIRN) (pp. 1-3). Chapel Hill, NC: The National Implementation Research Network (NIRN).
Noer, D., Leupold, C. R., & Valle, M. (2007). An analysis of Saudi Arabian and U.S. Managerial Coaching
Behaviors. Journal of Managerial Issues, 19(2), 271-287.
Oakley, B. (2014). A Mind for Numbers: How to Excel at Math and Science (Even If You Flunked Algebra)
(1st ed.). New York: Penguin Random House.
Oakley, B., & Sejnowski, T. (2018). Learning How to Learn: How to Succeed in School Without Spending
All Your Time Studying; A Guide for Kids and Teens. New York: Penguin Random House.
Public Safety Canada. (2012). Cognitive-Behavioural Interventions in Community Supervision. Research
Summary, 17(5), 1-2.
Saldana, L. (2014). The stages of implementation completion for evidence-based practice: protocol for
a mixed methods study. Implementation Science, 9, 43.
Saldana, L., Chamberlain, P., Wang, W., & Brown, C. H. (2012). Predicting Program Start-Up Using the
Stages of Implementation Measure. Administration and Policy in Mental Health and Mental Health
Services Research, 39(6), 419-425.
Schon, D. A. (1982). The Reflective Practitioner: How Professionals Think in Action. New York: Basic
Books Incorporated.
Singer, N. (2012, March 17). Helping Managers Find, and Fix, Their Flaws. The New York Times.
Retrieved from https://www.nytimes.com/2012/03/18/business/minds-at-work-helps-managers-
find-and-fix-their-flaws.html
The Kantor Institute. (2014). Structural Dynamics: Origin of the Theory. Retrieved from http://www.
kantorinstitute.com/fullwidth.html
Trotter, C. (2006). Working With Involuntary Clients: A Guide to Practice (2nd ed.). New South Wales,
Australia: Allen & Unwin.
Van Dyke, M., & Naoom, S. (2011). Setting the Stage: Active Implementation Frameworks to
Integrate the Science and Practice of Implementation. In D. Fixsen & K. A. Blase (Eds.), Global
Implementation Pre-Conference, Washington, DC: National Implementation Research Network.
Virginia Department of Corrections. (2018). State Recidivism Comparison. Evaluation Unit, Virginia
Department of Corrections.
Vogelvang, B. (2006). Development of the Rescue: The Building Blocks Method. In V. Broninembock
(Ed.), Zicht op Effectiviteit, Deel (pp. 71-86): NIZW/Praktikon/Ministerie van VWS.
Vogelvang, B. (2012). A Communication Model for Offender Supervision. Avans University of Applied
Sciences.
Vogelvang, B., & Bogue, B. (2012). The Building Blocks Model: Implementing EBP in your Organization.
Paper presented at the International Community Corrections Association, Orlando, FL. https://
www.slideshare.net/basov1/icca-2012-15140238
Ward, T., & Trotter, C. (2012). Involuntary Clients, Pro-social Modelling and Ethics. Ethics and Social
Welfare, 7(1).
About the Authors
Dr. Tom O’Connor is internationally recognized for his communication, facilitation, and cultural-change
skills in the specialty area of criminal justice. A native of Ireland who now holds dual citizenship in
the U.S., Tom has earned degrees in law, philosophy, theology, counseling and religion. Tom’s cutting-
Article 1: Sta Development and Practice Models: When We Get Better Our
Clients Get Better
29
edge work on facilitating whole system change takes him to many states in the U.S. and to countries
such as New Zealand, Canada, Australia, England, Ireland, France and Estonia. Tom’s company,
Transforming Corrections, fosters change in criminal justice agencies, staffs, and clients. Tom is a life-
long learner with a unique perspective on the human condition. He grew up in a working-class area of
Dublin and lived as a friar (a wandering monk) for nine years with a Catholic religious order called the
Carmelites. Tom lives in Oregon with Aislinn, his wife, and Sorcha, his daughter.
Contact Information: oconnortom@aol.com; 1420 Court St. NE., Salem, Oregon 97301; (503)-559-5752
Bradford Bogue has worked in corrections, mental health, and alcohol/drug addiction since 1971.
He has published many journal articles and two books on a broad range of treatment and case
management issues. Brad has conducted over 70 program evaluations ranging from restorative
justice to addictions treatment interventions. Mr. Bogue has an MA in Sociology from the University
of Colorado; he has been a member of the Motivational Interviewing Network of Trainers (MINT)
for 25 years. He is also a certied trainer in a wide range of interventions and assessment tools. Mr.
Bogue owned and managed a licensed addictions treatment program for adults and juveniles, Center
for Change, from 1999 to 2006. Brad is the Director of Justice System Assessment & Training (J-Sat),
a national U.S. consulting company operating since 1997. J-Sat specializes in the implementation of
evidence-based practices and facilitating deeper dialogues to bring about social justice.
Contact Information: brad@j-sat.com; 1521 Norwood Avenue #3, Boulder, CO 80304; (303)-887-4094
Samantha Collins, M.A., LPC, NCC, MAC, CACII has a background as a licensed counselor and certified
addictions counselor. She holds a bachelor’s degree in psychology from the University of Pittsburgh
and a master’s degree in forensic psychology from the University of Denver. She has worked with
local, state, and federal entities directing eorts focused on the delivery of an evidence-based practice
to facilitate the integration of primary health and substance use services across the state of South
Carolina. She also managed a large statewide initiative, using implementation science, to develop and
rene the skills of addiction counselors. In her work at Transforming Corrections, Samantha continues
to assist agencies to fully engage and reach competency in evidence-based practices by using
implementation science to bring about real change for individuals, families, communities, and human
service practitioners. She is a member of the Motivational Interviewing Network of Trainers (MINT).
Contact Information: P.O. Box 353, Cascade Locks, OR 97014; samantha_collins@hotmail.com
Sorcha O’Connor is pursuing a B.S. in Physics at the University of Oregon. She has a passion for
writing and the communication of ideas. Sorcha helps Transforming Corrections test the applicability
of Motivational Interviewing and COVE to university students. She also works as a part-time editor and
writer for Transforming Corrections.
Contact Information: sorchao@uoregon.edu; 959 Franklin Blvd., Apt 809., Eugene, OR 97403; (503)-
602-7618