Influence of clerkship experiences on clinical competence

Erasmus Universiteit Rotterdam, Rotterdam, South Holland, Netherlands
Medical Education (Impact Factor: 3.2). 06/2006; 40(5):450-8. DOI: 10.1111/j.1365-2929.2006.02447.x
Source: PubMed


Clerkship experiences are considered crucial for the development of clinical competence. Yet whether there is a direct relationship between the nature and volume of patient encounters and learning outcomes is far from clear. Some evidence in the literature points towards the importance of clinical supervision on student learning, but the relationship between clinical supervision, patient encounters and student competence remains unclear.
This study aimed, firstly, to determine the variation in students' clinical experiences within and across sites; secondly, to identify the causes of this variation; and, thirdly, to investigate the consequences of this variation for students' competence.
Clerkship students at 12 hospital sites recorded their patient encounters in logbooks. Site characteristics that might influence the variation in patient encounters were collected. Student competence was determined by 3 independent indicators: a practical end-of-clerkship examination; a theoretical end-of-clerkship examination, and an evaluation of professional performance. A model was developed to test the available clerkship data using structural equation modelling (SEM) software.
Analysis of the logbooks revealed a large variation in the number of patients encountered by students. The average length of patient stay, number of patients admitted, and quality of supervision accounted partly for this variation. An increased number of patient encounters did not directly lead to improved competence. Quality of supervision turned out to be crucially important because it directly impacted student learning and also positively influenced the number of patient encounters.
Monitoring the effectiveness of clerkships by merely asking students to keep a tally of the problems and diseases they encounter, without paying attention to the quality of supervision, does not contribute to improving student learning.

    • "Learning in the clinical setting is a complex process and could be influenced by many factors, such as the quality of the supervision, exposure to a variety of clinical experiences, quality of feedback and the length of time spent with patients [1-7]. The impact of these factors on the clinical learning of undergraduate medical students is variable [2]. However, students’ performance on clinical examinations was found to be positively associated with exposure to a large variety of clinical cases and the provision of feedback from the supervisors [3]. "
    ABSTRACT: Teaching and learning of clinical skills for undergraduate medical students usually take place during the clinical clerkship. It is therefore of vital importance to ensure the effectiveness of the rotations within this clerkship. The aims of this study were to develop an instrument that measures the effectiveness of the clinical learning environment, to determine its factor structure, and to find first evidence for the reliability and validity of the total scale and the different factors. The Clinical Learning Evaluation Questionnaire (CLEQ) is an instrument consisting of 40 items, developed after consideration of the results of a qualitative study that investigated the important factors influencing clinical learning from the perspective of both students and teachers. Results of relevant literature investigating this issue were also incorporated in the CLEQ. The instrument was administered to a sample of students (N = 182) from three medical colleges in Riyadh city, the capital of Saudi Arabia. The factor structure of the CLEQ (principal component analysis, Oblimin rotation) and the reliability of the factor scales (Cronbach's alpha) were determined. Hypotheses concerning the correlations between the different factors were tested to investigate their convergent and divergent validity. One hundred and nine questionnaires were returned. The factor analysis yielded six factors: F1 Cases (8 items), F2 Authenticity of clinical experience (8 items), F3 Supervision (8 items), F4 Organization of the doctor-patient encounter (4 items), F5 Motivation to learn (5 items), and F6 Self-awareness (4 items). The overall internal consistency (alpha) of the CLEQ was 0.88, and the reliabilities (Cronbach's alpha) of the six factors varied from 0.60 to 0.86. Hypotheses concerning the correlations between the different factors were partly confirmed, which supported the convergent validity of the factors but not their divergent validity. Significant differences were found between the scores of the students of the three different schools on the factors Supervision and Organization of the doctor-patient encounter. The results of this study demonstrated that the CLEQ is a multidimensional and reliable instrument. It can be utilized as an evaluation tool for clinical teaching activities by both educators and students. Further research is needed into the validity of the CLEQ.
    Full-text · Article · Mar 2014 · BMC Medical Education
    • "Eight studies compared the patient mix of training sites and their contribution to learning. In three of these studies, similar sites were compared (Chatenay et al. 1996; Wimmers et al. 2006a; Yu et al. 2011), three others compared academic vs. nonacademic sites (Schwiebert et al. 1993; McLeod et al. 1997; Nomura et al. 2008) and two compared inpatients and outpatients (Jacobson et al. 1998; Duke et al. 2011). Four studies evaluated the learning effects of an intervention: the introduction of a rotation (Gruppen et al. 1993), a skill-training programme (Boots et al. 2008), identification of 10 preselected complaints (Lampe et al. 2008) and a new internship (Nomura et al. 2008). "
    ABSTRACT: Background: Clinical workplace-based learning has been the means to becoming a medical professional for many years. The importance of an adequate patient mix, as defined by the number of patients and the types of medical problems, for an optimal learning process is based on educational theory and recognised by national and international accreditation standards. The relationship between patient mix and learning in work-based curricula as yet remains unclear. Aim: To review research addressing the relationship between patient mix and learning in work-based clinical settings. Method: The search was conducted across Medline, Embase, Web of Science, ERIC and the Cochrane Library from the start date of each database to July 2011. Original quantitative studies on the relationship between patient mix and learning, for learners at any level of formal medical training, were included. Methodological quality was assessed, and results were extracted by two reviewers using pre-specified forms. Results: A total of 10,420 studies were screened on title and abstract. Of these, 298 articles were included for full-text analysis, which resulted in the inclusion of 22 papers. The quality of the included studies, scored with the Medical Education Research Study Quality Instrument (MERSQI), ranged from 8.0 to 14.5 (of 18 points). A positive relationship was found between patient mix and self-reported outcomes evaluating the progress in competence as experienced by the trainee, such as self-confidence and comfort level. Patient mix was also found to correlate positively with self-reported outcomes evaluating the quality of the learning period, such as self-reported learning benefit, experienced effectiveness of the rotation, or the instructional quality. Variables such as supervision and learning style might mediate this relationship. A relationship between patient mix and formal assessment has never been demonstrated. Conclusion: Patient mix is positively related to self-reported learning outcome, most evidently the experienced quality of the learning programme.
    Full-text · Article · Jun 2013 · Medical Teacher
    • "In both curricula clinical assessment was identical: each clerkship grade was based on several mini-CEX scores. Mini-CEX scores are sufficiently reliable to estimate clinical competence [33]. In both curricula, grades were given on a 10-point scale. "
    ABSTRACT: Background: Little is known about the gains and losses associated with the implementation of undergraduate competency-based medical education. Therefore, we compared knowledge acquisition, clinical performance and perceived preparedness for practice of students from a competency-based active learning (CBAL) curriculum and a prior active learning (AL) curriculum. Methods: We included two cohorts of both the AL curriculum (n = 453) and the CBAL curriculum (n = 372). Knowledge acquisition was determined by benchmarking each cohort on 24 interuniversity progress tests against parallel cohorts of two other medical schools. Differences in knowledge acquisition were determined by comparing the number of times CBAL and AL cohorts scored significantly higher or lower on progress tests. Clinical performance was operationalized as students' mean clerkship grade. Perceived preparedness for practice was assessed using a survey. Results: The CBAL cohorts demonstrated relatively lower knowledge acquisition than the AL cohorts during the first study years, but not at the end of their studies. We found no significant differences in clinical performance. Concerning perceived preparedness for practice, we found no significant differences except that students from the CBAL curriculum felt better prepared for 'putting a patient problem in a broad context of political, sociological, cultural and economic factors' than students from the AL curriculum. Conclusions: Our data do not support the assumption that competency-based education results in graduates who are better prepared for medical practice. More research is needed before we can draw generalizable conclusions on the potential of undergraduate competency-based medical education.
    Full-text · Article · May 2013 · BMC Medical Education