Article

Supporting Implementation: The Role of Community Development Teams to Build Infrastructure.

Center for Research to Practice, 12 Shelton McMurphey Blvd, Eugene, OR 97401, USA.
American Journal of Community Psychology 03/2012; DOI: 10.1007/s10464-012-9503-0

ABSTRACT: Evidence-based methods for assisting consumers, such as counties, in successfully implementing practices are lacking in the field of implementation science. To fill this gap, the Community Development Teams (CDT) approach was developed to assist counties in developing peer networks focused on problem-solving and resource sharing to enhance their likelihood of successful implementation. The CDT is an interactive, solution-focused approach that shares many elements of the Interactive Systems Framework (ISF) for Dissemination and Implementation. An ongoing randomized implementation trial of Multidimensional Treatment Foster Care (MTFC) was designed to test the hypothesis that such interactive implementation methods are more successful at helping counties achieve successful and sustainable MTFC programs than standard individualized implementation methods. Using the Stages of Implementation Completion measure, developed for this study, the potential benefit of these interactive methods is examined at different stages of the implementation process, ranging from initial engagement to program competency.
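The abstract describes the Stages of Implementation Completion (SIC) as tracking counties through implementation stages that range from initial engagement to program competency. As a purely illustrative sketch (not the published instrument), the Python snippet below shows one way such stage-completion data could be recorded and summarized; the intermediate stage labels and the CountySIC class are assumptions introduced for the example.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical, simplified stage list: only the endpoints (engagement and
# program competency) are named in the abstract; the intermediate labels
# here are placeholders, not the published SIC items.
SIC_STAGES = [
    "Engagement",
    "Feasibility consideration",
    "Readiness planning",
    "Staff hired and trained",
    "Fidelity monitoring in place",
    "Services begin",
    "Ongoing services and consultation",
    "Competency",
]

@dataclass
class CountySIC:
    """Tracks which implementation stages a county has completed and when."""
    county: str
    completed: dict = field(default_factory=dict)  # stage name -> completion date

    def complete(self, stage: str, on: date) -> None:
        if stage not in SIC_STAGES:
            raise ValueError(f"Unknown stage: {stage}")
        self.completed[stage] = on

    def proportion_completed(self) -> float:
        """Share of the listed stages this county has finished."""
        return len(self.completed) / len(SIC_STAGES)

    def final_stage(self) -> str | None:
        """Latest stage (in list order) the county has completed, if any."""
        reached = [s for s in SIC_STAGES if s in self.completed]
        return reached[-1] if reached else None

# Example: record early progress for one hypothetical county.
county = CountySIC("County A")
county.complete("Engagement", date(2010, 1, 15))
county.complete("Feasibility consideration", date(2010, 3, 2))
print(county.final_stage(), round(county.proportion_completed(), 2))
```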

  • ABSTRACT: Background: Much is to be learned about which implementation strategies are the most beneficial to communities attempting to adopt evidence-based practices. This paper presents outcomes from a randomized implementation trial of Multidimensional Treatment Foster Care (MTFC) in child public service systems in California and Ohio, including child welfare, juvenile justice, and mental health. Methods: Fifty-one counties were assigned randomly to one of two implementation strategies (Community Development Teams (CDT) or an independent county implementation strategy (IND)) across four cohorts after being matched on county characteristics. We compared these two strategies on implementation process, quality, and milestone achievements using the Stages of Implementation Completion (SIC) (Implement Sci 6(1):1–8, 2011). Results: A composite score for each county, combining the final implementation stage attained, the number of families served, and the quality of implementation, was used as the primary outcome. No significant difference between CDT and IND was found on the composite measure. Additional analyses showed no evidence that CDT increased the proportion of counties that started up programs (i.e., placed at least one family in MTFC). Among counties that did implement MTFC, those in the CDT condition served over twice as many youth during the study period as those in IND. Of the counties that successfully achieved program start-up, those in the CDT condition completed the implementation process more thoroughly, as measured by the SIC. We found no significant differences by implementation condition in the time to first placement, achievement of competency, or number of stages completed. Conclusions: This trial did not lead to higher rates of implementation or faster implementation, but it did provide evidence for more robust implementation in the CDT condition compared to IND once the first family received MTFC services. The trial was successful from a design perspective in that no counties dropped out, even though the study took place during an economic recession. We believe that this measurement approach using the SIC, which comprises the three dimensions of quality, quantity, and timing, is appropriate for a wide range of implementation and translational studies. Trial registration: Trial ID NCT00880126 (ClinicalTrials.gov).
    Implementation Science 10/2014; 9(1):134. (A hypothetical composite-score sketch follows this list.)
  • ABSTRACT: Objective: To examine the extent to which child welfare agencies adopt new practices and to determine the barriers to and facilitators of adopting new practices. Methods: Data came from telephone interviews with the directors of the 92 public child welfare agencies that constituted the probability sample for the first National Survey of Child and Adolescent Well-being (NSCAW I). In a semi-structured 40-minute interview administered by a trained research associate, agency directors were asked about agency demographics, knowledge of evidence-based practices, use of technical assistance, and actual use of evidence-based practices. Of the 92 agencies, 83 (90%) agreed to be interviewed. Results: Agencies reported that the majority of staff had a BA degree (53.45%) and that they either paid for (52.6%) or provided (80.7%) continuing education. Although agencies routinely collect standardized child outcomes (90%), they much less frequently collect measures of child functioning (30.9%). Almost all agencies (94%) had started a new program or practice, but only 24.8% of these were evidence-based, and the strategies used to explore new programs or practices usually involved local or state contacts. Factors associated with program success included internal support for the innovation (27.3%) and an existing evidence base (23.5%). Conclusions: Directors of child welfare agencies frequently institute new programs or practices, but these are not often evidence-based. Because virtually all agencies provide some continuing education, adding discussions of evidence-based programs/practices may spur adoption. Reliance on local and state colleagues to explore new programs and practices suggests that developing well-informed social networks may be a way to increase the spread of evidence-based practices.
    Children and Youth Services Review 04/2014; 39:147–152.
  • ABSTRACT: Background: There is currently a lack of scientifically designed and tested implementation strategies. Such strategies are particularly important for highly complex interventions that require coordination between multiple parts to be successful. This paper presents a protocol for the development and testing of an implementation strategy for a complex intervention known as the Housing First model (HFM). Housing First is an evidence-based practice for chronically homeless individuals demonstrated to significantly improve a number of outcomes. Methods/design: Drawing on practices demonstrated to be useful in implementation and e-learning theory, our team is currently adapting a face-to-face implementation strategy so that it can be delivered over a distance. Research activities will be divided between Chicago and Central Indiana, two areas with significantly different barriers to HFM implementation. Ten housing providers (five from Chicago and five from Indiana) will be recruited to conduct an alpha test of each of four e-learning modules as they are developed. Providers will be asked to keep a detailed log of their experience completing the modules and to participate in one of two focus groups. After refining the modules based on alpha test results, we will test the strategy with a sample of four housing organizations (two from Chicago and two from Indiana). We will collect and analyze both qualitative and quantitative data from administration and staff. Measures of interest include causal factors affecting implementation, training outcomes, and implementation outcomes. Discussion: This project is an important first step in the development of an evidence-based implementation strategy to increase the scalability and impact of the HFM. The project also has strong potential to increase the limited scientific knowledge regarding implementation strategies in general.
    Implementation Science 10/2014; 9(1):138.
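The first related publication above reports a composite outcome per county that combines the final implementation stage attained, the number of families served, and the quality of implementation. The abstract does not specify how these components are combined, so the Python sketch below is only a hypothetical illustration: it rescales each component to [0, 1] and averages them, with all scaling constants assumed rather than taken from the study.

```python
# Hypothetical composite score for the MTFC trial abstract above. The study
# combines final stage attained, families served, and implementation quality
# into one outcome, but the abstract gives no weighting, so this sketch simply
# rescales each component to [0, 1] and averages them. All constants here
# (n_stages, max_families, max_quality) are assumptions for the example.

def composite_score(final_stage: int,
                    families_served: int,
                    quality: float,
                    n_stages: int = 8,
                    max_families: int = 50,
                    max_quality: float = 1.0) -> float:
    """Unweighted mean of three rescaled components (illustrative only)."""
    stage_part = final_stage / n_stages
    families_part = min(families_served, max_families) / max_families
    quality_part = min(quality, max_quality) / max_quality
    return (stage_part + families_part + quality_part) / 3

# Example: a county that reached stage 6 of 8, served 12 families,
# and scored 0.7 on a 0-1 quality scale.
print(round(composite_score(6, 12, 0.7), 3))
```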
