Outcome-Based Workforce Development and Education in Public Health
United States Public Health Service, Office of Workforce and Career Development, Centers for Disease Control and Prevention, Atlanta, Georgia 30333, USA.
Annual Review of Public Health 12/2009; 31(1):253-69; 1 p. following 269. DOI: 10.1146/annurev.publhealth.012809.103705 · 6.47 Impact Factor
The broad scope of the public health mission leads to an increasingly diverse workforce. Given the range of feeder disciplines and the reality that much of the workforce does not have formal training in public health science and practice, a pressing need exists for training and education throughout the workforce. Just as we in public health take a rigorous approach to our science, so too should we take a rigorous, evidence-driven approach to workforce development. In this review, we recommend a framework for workforce education in public health, integrating three critical conceptual approaches: (a) adult learning theory; (b) competency-based education; and (c) the expanded Dreyfus model in public health, an addition to the Dreyfus model of professional skills progression. We illustrate the application of this framework in practice, using the field of applied epidemiology. This framework provides a context for designing and developing high-quality, outcome-based workforce development efforts and evaluating their impact, with implications for academic and public health practice efforts to educate the public health workforce.
- "Methods We used Kirkpatrick's training evaluation framework (Kirkpatrick, 1996) to evaluate outcomes and needs among HIA trainees, measuring their reaction, learning, behavior and results. We modified the framework to incorporate additional frameworks for adult learning, competency-based education, and skills progression in workforce development (Hughes and Kemp, 2007; Koo and Miner, 2010). Using this framework (Table 1), we developed a semi-structured interview guide asking about the trainee's background, pre-training motivation and propensity, effectiveness of the training, and post-training transfer and implementation."
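The excerpt above maps semi-structured interview topics onto Kirkpatrick's four evaluation levels. A minimal sketch of that mapping, for illustration only: the level names are from Kirkpatrick (1996), but the assignment of each interview topic to a level is an assumption, not stated in the excerpt.

```python
# Hypothetical mapping of interview topics (from the excerpt) to
# Kirkpatrick's four evaluation levels. The topic-to-level assignment
# is an assumption for illustration, not the authors' actual coding.
kirkpatrick_guide = {
    "reaction": ["effectiveness of the training"],
    "learning": ["trainee's background", "pre-training motivation and propensity"],
    "behavior": ["post-training transfer"],
    "results":  ["post-training implementation"],
}

for level, topics in kirkpatrick_guide.items():
    print(f"{level}: {'; '.join(topics)}")
```

An interview guide organized this way lets each transcript segment be coded against one of the four levels, which is what makes the modified framework usable for the outcome spectrum ("reaction, learning, behavior and results") described in the abstract below.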
ABSTRACT: Background: Despite the continued growth of Health Impact Assessment (HIA) in the US, there is little research on HIA capacity-building. A comprehensive study of longer-term training outcomes may reveal opportunities for improving capacity building activities and HIA practice. Methods: We conducted in-depth interviews with HIA trainees in the United States to assess their outcomes and needs. Using a training evaluation framework, we measured outcomes across a spectrum of reaction, learning, behavior and results. Results: From 2006 to 2012, four organizations trained over 2200 people in at least 75 in-person HIA trainings in 29 states. We interviewed 48 trainees, selected both randomly and purposefully. The mean duration between training and interview was 3.4 years. Trainees reported that their training objectives were met, especially when relevant case studies were used. They established new collaborations at the trainings and maintained them. Training appeared to catalyze more holistic thinking and practice, including a range of HIA-related activities. Many trainees disseminated what they learned and engaged in components of HIA, even without dedicated funding. Going forward, trainees need assistance with quantitative methods, project management, community engagement, framing recommendations, and evaluation. Conclusions: The research revealed opportunities for a range of HIA stakeholders to refine and coordinate training resources, apply a competency framework and leverage complementary workforce development efforts, and sensitize and build the capacity of communities. Published by Elsevier Inc.
Environmental Impact Assessment Review 10/2014; 50. DOI:10.1016/j.eiar.2014.10.002 · 2.60 Impact Factor
- "The EBPH curriculum consists of nine modules (see next section for a list of modules and learning objectives) and adheres to adult learning principles (i.e., learning through problem solving and active involvement, integrating the experiences of faculty and participants into course discussions) [14,23]. Seven of the nine modules (excluding Modules 1 and 6) include interactive exercises in which participants work in small groups (e.g., using local data to develop a concise problem statement, searching PubMed for literature on a specific topic, developing an action plan based on a logic model). "
There are few studies describing how to scale up effective capacity-building approaches for public health practitioners. This study tested local-level evidence-based decision making (EBDM) capacity-building efforts in four U.S. states (Michigan, North Carolina, Ohio, and Washington) with a quasi-experimental design.
Methods: Partners within the four states delivered a previously established Evidence-Based Public Health (EBPH) training curriculum to local health department (LHD) staff. They worked with the research team to modify the curriculum with local data and examples while remaining attentive to course fidelity. Pre- and post-assessments of course participants (n = 82) and an external control group (n = 214) measured importance, availability (i.e., how available a skill is when needed, either within the skillset of the respondent or among others in the agency), and gaps in ten EBDM competencies. Simple and multiple linear regression models assessed the differences between pre- and post-assessment scores. Course participants also assessed the impact of the course on their work.
Results: Course participants reported greater increases in the availability, and decreases in the gaps, in EBDM competencies at post-test, relative to the control group. In adjusted models, significant differences (p < 0.05) were found in 'action planning,' 'evaluation design,' 'communicating research to policymakers,' 'quantifying issues (using descriptive epidemiology),' and 'economic evaluation.' Nearly 45% of participants indicated that EBDM increased within their agency since the training. Course benefits included becoming better leaders and making scientifically informed decisions.
Conclusions: This study demonstrates the potential for improving EBDM capacity among LHD practitioners using a train-the-trainer approach involving diverse partners. This approach allowed for local tailoring of strategies and extended the reach of the EBPH course.
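The pre/post comparison described above contrasts changes in competency gaps (importance minus availability) for course participants against an external control group. A minimal sketch of that logic, using invented ratings on a hypothetical 1-10 scale (the study itself used simple and multiple linear regression on ten EBDM competencies; this simplified difference-in-differences version is for illustration only):

```python
# Illustrative sketch, not the study's actual analysis or data.
# A competency "gap" is how important a skill is minus how available it is;
# training should shrink gaps for participants more than for controls.

def gap(importance, availability):
    return importance - availability

def mean(xs):
    return sum(xs) / len(xs)

def mean_gap(pairs):
    return mean([gap(i, a) for i, a in pairs])

# Invented (importance, availability) ratings for three competencies.
participants_pre  = [(9, 4), (8, 5), (9, 3)]
participants_post = [(9, 7), (8, 7), (9, 6)]
control_pre  = [(8, 5), (9, 4), (8, 5)]
control_post = [(8, 5), (9, 4), (8, 6)]

# Difference-in-differences: change in mean gap for trainees
# relative to the change for the untrained control group.
did = (mean_gap(participants_post) - mean_gap(participants_pre)) \
    - (mean_gap(control_post) - mean_gap(control_pre))
print(round(did, 2))  # negative value = gaps shrank more for trainees
```

In the study's regression framing, this contrast corresponds to the coefficient on a group-by-time interaction term, adjusted for covariates; the sketch strips that down to raw means to show the direction of the comparison.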
Implementation Science 09/2014; 9(1):124. DOI:10.1186/s13012-014-0124-x · 4.12 Impact Factor
- "In so doing, our intention is to further improve and clarify the method. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation (Gotway Crawford et al., 2009; Koo & Miner, 2010; Tilson & Gebbie, 2004). The example chosen in this article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. 2. Logic analysis: what it is and how it differs from similar trends Logic analysis is a type of evaluation that fits within the broader stream of program theory evaluation, or theory-based evaluation (Brousselle & Champagne, 2011). "
ABSTRACT: Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method.
Evaluation and program planning 06/2013; 40C:64-73. DOI:10.1016/j.evalprogplan.2013.05.004 · 0.89 Impact Factor