The Assessment of Emergency Physicians by a Regulatory Authority

Department of Community Health Sciences, Faculty of Medicine, University of Calgary, Calgary, Alberta, Canada.
Academic Emergency Medicine 12/2006; 13(12):1296-1303. DOI: 10.1197/j.aem.2006.07.030


Objectives: To determine whether it is possible to develop a feasible, valid, and reliable multisource feedback program (360-degree evaluation) for emergency physicians.
Methods: Surveys with 16, 20, 30, and 31 items were developed to assess emergency physicians by 25 patients, eight coworkers, eight medical colleagues, and self, respectively, using five-point scales along with an "unable to assess" category. Items addressed key competencies related to communication skills, professionalism, collegiality, and self-management.
Results: Data from 187 physicians who identified themselves as emergency physicians were available. The mean number of respondents per physician was 21.6 (SD ± 3.87) (93%) for patients, 7.6 (SD ± 0.89) (96%) for coworkers, and 7.7 (SD ± 0.61) (95%) for medical colleagues, suggesting the instruments were feasible to administer. Only the patient survey had four items with "unable to assess" percentages ≥ 15%. The factor analysis indicated there were two factors on the patient questionnaire (communication/professionalism and patient education), two on the coworker survey (communication/collegiality and professionalism), and four on the medical colleague questionnaire (clinical performance, professionalism, self-management, and record management) that accounted for 80.0%, 62.5%, and 71.9% of the variance on the surveys, respectively. The factors were consistent with the intent of the instruments, providing empirical evidence of validity. Reliability was established for the instruments (Cronbach's alpha > 0.94) and for each physician (generalizability coefficients were 0.68 for patients, 0.85 for coworkers, and 0.84 for medical colleagues).
Conclusions: The psychometric examination of the data suggests that the instruments developed to assess emergency physicians are feasible and provide evidence of validity and reliability.
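
To make the reported reliability figures concrete, the short Python sketch below (hypothetical data and helper names; not the study's analysis code) simulates five-point ratings for one physician, treats "unable to assess" responses as missing, reports the per-item "unable to assess" percentage, and computes Cronbach's alpha over complete cases.

import numpy as np

def unable_to_assess_pct(ratings: np.ndarray) -> np.ndarray:
    """Percentage of missing ("unable to assess") responses for each item (column)."""
    return np.isnan(ratings).mean(axis=0) * 100

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    computed over respondents who rated every item."""
    complete = ratings[~np.isnan(ratings).any(axis=1)]
    k = complete.shape[1]
    item_vars = complete.var(axis=0, ddof=1)
    total_var = complete.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 25 hypothetical patient respondents rating 16 items on a 1-5 scale.
    # Each respondent's overall impression drives all item scores, so items correlate.
    impression = rng.normal(4.0, 0.6, size=(25, 1))
    ratings = np.clip(np.round(impression + rng.normal(0.0, 0.3, size=(25, 16))), 1, 5)
    # Mark roughly 5% of responses as "unable to assess" (stored as NaN).
    ratings[rng.random(ratings.shape) < 0.05] = np.nan
    print("Unable-to-assess % per item:", np.round(unable_to_assess_pct(ratings), 1))
    print("Cronbach's alpha:", round(cronbach_alpha(ratings), 2))

Because each simulated respondent's item scores share a common overall impression, the resulting alpha is high, mirroring the internal consistency (alpha > 0.94) reported for the actual instruments.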

  • ABSTRACT: Background: The assessment, maintenance of competence, and recertification of surgeons have recently received increased attention from many health organizations. Assessment of physicians' competencies with multisource feedback (MSF) has become widespread in recent years. The aim of the present study was to further investigate the use of MSF for assessing surgical practice by conducting a systematic review of the published research. Methods: A systematic literature review was conducted to identify uses of MSF in surgical settings. The search covered the electronic databases EMBASE, PsycINFO, MEDLINE, PubMed, and CINAHL for articles in English up to August 2012. Studies were included if they reported information about at least one of feasibility, reliability, generalizability, or validity of the MSF. Results: A total of 780 articles were identified in the initial search, and 772 were excluded based on the exclusion criteria. Eight studies met the inclusion criteria for this systematic review. Reliability (Cronbach α ≥ 0.90) was reported in 4 studies and generalizability (Ep2 ≥ 0.70) was reported in 4 studies. Evidence for content, criterion-related, and construct validity was reported in all 8 studies. Conclusion: MSF is a feasible, reliable, and valid method to assess surgical practice, particularly for nontechnical competencies such as communication skills, interpersonal skills, collegiality, humanism, and professionalism. Procedural competence, however, needs to be assessed by other methods. Further implementation of MSF is desirable.
    Journal of Surgical Education 07/2013; 70(4):475-486. DOI: 10.1016/j.jsurg.2013.02.002
  • ABSTRACT: Objective: The aim of this paper is to inform College Fellows, trainees, and other stakeholders about the structure, principles, and functioning of the new Board of Education. Conclusion: The educational activities of the College are likely to evolve and be developed over the next 5 years through a process that takes account of the views of key stakeholders. In the short term, there will be no changes to training or examination processes that would disadvantage trainees.
    Australasian Psychiatry 05/2008; 16(2):74-79. DOI: 10.1080/10398560701874317