Article

Educational epidemiology: applying population-based design and analytic approaches to study medical education.

Department of Community and Family Medicine, Dartmouth Medical School, Hanover and Lebanon, NH 03756, USA.
JAMA: The Journal of the American Medical Association (Impact Factor: 29.98). 10/2004; 292(9):1044-50. DOI: 10.1001/jama.292.9.1044
Source: PubMed

ABSTRACT: Conducting educational research in medical schools is challenging partly because interventional controlled research designs are difficult to apply. In addition, strict accreditation requirements and student/faculty concerns about educational inequality reduce the flexibility needed to plan and execute educational experiments. Consequently, there is a paucity of rigorous and generalizable educational research to provide an evidence-guided foundation to support educational effectiveness. "Educational epidemiology," ie, the application across the physician education continuum of observational designs (eg, cross-sectional, longitudinal, cohort, and case-control studies) and randomized experimental designs (eg, randomized controlled trials, randomized crossover designs), could revolutionize the conduct of research in medical education. Furthermore, the creation of a comprehensive national network of educational epidemiologists could enhance collaboration and the development of a strong educational research foundation.
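
As a purely illustrative sketch of the randomized designs named above, the following Python snippet simulates a two-arm educational trial and compares post-test scores between arms with a two-sample t test. The scores, effect size, sample sizes, and variable names are all invented for this example and are not drawn from the article.

```python
# Hypothetical sketch: analysis of a simulated two-arm randomized educational
# trial (new curriculum vs. standard teaching). All values are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated post-test scores on a 0-100 scale for each arm.
standard = rng.normal(loc=72, scale=10, size=120)        # control arm
new_curriculum = rng.normal(loc=76, scale=10, size=120)  # intervention arm

# Two-sample t test: do mean post-test scores differ between arms?
t_stat, p_value = stats.ttest_ind(new_curriculum, standard)
mean_diff = new_curriculum.mean() - standard.mean()

print(f"Mean difference: {mean_diff:.1f} points, t = {t_stat:.2f}, p = {p_value:.4f}")
```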

  • ABSTRACT: As educators seek confirmation of successful trainee achievement, medical education must move toward a more evidence-based approach to teaching and evaluation. Although medical training often provides physicians with a general background in biostatistics, many are not prepared to apply these skills, which can hinder clinician-educators who wish to develop, analyze, and disseminate their scholarly work. This paper is intended to be a concise educational tool and guide for choosing and interpreting statistical tests for medical education assessment. It includes guidelines and examples that clinician-educators can use when analyzing and interpreting studies and when writing the methods and results sections of reports.
    Journal of General Internal Medicine 09/2006; 21(9). · 3.42 Impact Factor
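    (An illustrative test-selection sketch, using invented data, appears after this list.)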
  • ABSTRACT: In-service training is a key strategic approach to addressing the severe shortage of health care workers in many countries. However, there is a lack of evidence linking these health care worker trainings to improved health outcomes. In response, the United States President's Emergency Plan for AIDS Relief's Human Resources for Health Technical Working Group initiated a project to develop an outcome-focused training evaluation framework. This paper presents the methods and results of that project. A general inductive methodology was used for the conceptualization and development of the framework. Fifteen key informant interviews were conducted to explore contextual factors, perceived needs, barriers, and facilitators affecting the evaluation of training outcomes. In addition, a thematic analysis of 70 published articles reporting health care worker training outcomes identified key themes and categories. These were integrated, synthesized, and compared with several existing training evaluation models, forming an overall typology that was used to draft a new framework. Finally, the framework was refined and validated through an iterative process of feedback, pilot testing, and revision. The inductive process resulted in the identification of themes and categories, as well as relationships among several levels and types of outcomes. The resulting framework includes nine distinct types of outcomes that can be evaluated, organized within three nested levels: individual, organizational, and health system/population. The outcome types are: (1) individual knowledge, attitudes and skills; (2) individual performance; (3) individual patient health; (4) organizational systems; (5) organizational performance; (6) organizational-level patient health; (7) health systems; (8) population-level performance; and (9) population-level health. The framework also addresses contextual factors that may influence the outcomes of training, as well as the ability of evaluators to determine training outcomes. In addition, a group of user-friendly resources, the Training Evaluation Framework and Tools (TEFT), was created to help evaluators and stakeholders understand and apply the framework. Feedback from pilot users suggests that using the framework and accompanying tools may support outcome evaluation planning. Further assessment will assist in strengthening guidelines and tools for operationalization.
    Human Resources for Health 10/2013; 11(1):50. · 1.83 Impact Factor
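    (A sketch of this nested outcome structure appears after this list.)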
  • ABSTRACT: Research in medical education has received increasing attention; however, the quality of research reporting has not been comprehensively appraised. This study evaluated the methodological and reporting quality of published Iranian medical education articles. Articles describing medical students, residents, fellows, or program evaluation were included; articles related to continuing medical education or faculty development, review articles and reports, and studies considering both medical and nonmedical students were excluded. We searched MEDLINE through PubMed in addition to major Iranian medical education search engines and databases, including the Scientific Information Database (SID), from March 2003 to March 2008. The Medical Education Research Study Quality Instrument (MERSQI) and the Consolidated Standards of Reporting Trials (CONSORT 2001) statement were used for experimental studies, and the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement was used for observational studies. Ninety-five articles related to medical education research in Iran were identified, including 16 (16.8%) experimental studies. Total MERSQI scores ranged between 3.82 and 13.09, with a mean of 8.39 points. Mean domain scores were highest for data analysis (1.85) and lowest for validity (0.61). The most frequently reported item was background (96%), and the least reported was study limitations (16%). The quality of published medical education research in Iran seems to be suboptimal.
    Medical Journal of the Islamic Republic of Iran 01/2014; 28:79.
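
Relating to the Journal of General Internal Medicine abstract above: the sketch below illustrates one common decision point in education assessment, paired pre/post test scores, where a paired t test or a Wilcoxon signed-rank test might be chosen depending on whether the paired differences look approximately normal. The data, sample size, and normality threshold are invented assumptions, not content from that paper.

```python
# Hypothetical sketch: choosing between a paired t test and a Wilcoxon
# signed-rank test for simulated pre/post assessment scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(60, 12, size=40)        # pre-course scores (invented)
post = pre + rng.normal(5, 8, size=40)   # post-course scores (invented)
diffs = post - pre

# A Shapiro-Wilk test on the paired differences guides the choice of test.
_, normality_p = stats.shapiro(diffs)
if normality_p > 0.05:
    stat, p = stats.ttest_rel(post, pre)   # parametric: paired t test
    chosen = "paired t test"
else:
    stat, p = stats.wilcoxon(post, pre)    # non-parametric alternative
    chosen = "Wilcoxon signed-rank test"

print(f"{chosen}: statistic = {stat:.2f}, p = {p:.4f}")
```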
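
Relating to the Human Resources for Health abstract above: the nine outcome types nested within three levels form a simple hierarchy, and the sketch below is one hypothetical way to encode it for evaluation planning. The class and field names are illustrative and are not taken from the framework or the TEFT tools.

```python
# Hypothetical sketch of the nested outcome typology described above:
# nine outcome types grouped into three levels. Names are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class OutcomeType:
    number: int
    level: str
    name: str

FRAMEWORK = [
    OutcomeType(1, "individual", "knowledge, attitudes and skills"),
    OutcomeType(2, "individual", "performance"),
    OutcomeType(3, "individual", "patient health"),
    OutcomeType(4, "organizational", "systems"),
    OutcomeType(5, "organizational", "performance"),
    OutcomeType(6, "organizational", "patient health"),
    OutcomeType(7, "health system/population", "health systems"),
    OutcomeType(8, "health system/population", "performance"),
    OutcomeType(9, "health system/population", "health"),
]

# Group the outcome types by level, mirroring the framework's nesting.
by_level = {}
for outcome in FRAMEWORK:
    by_level.setdefault(outcome.level, []).append(outcome.name)

for level, names in by_level.items():
    print(f"{level}: {', '.join(names)}")
```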