Skills coaches as part of the educational team: a randomized controlled trial of teaching of a basic surgical skill in the laboratory setting.
ABSTRACT: The aim of this study was to compare laboratory teaching of a basic technical skill by a nonphysician skills coach with teaching by a faculty surgeon.
Medical students were randomized to instruction in skin suturing in the skills laboratory by either a faculty surgeon or a nonphysician skills coach. Performance was tested at 3 time points. Other faculty surgeons, blinded to identities and training groups, rated performance.
Forty-nine students participated. Baseline fourth-year student mean scores showed no significant difference between training groups. Third-year and fourth-year student performance showed no difference between training groups on postintervention testing. Delayed testing also showed no difference in third-year student scores.
Training by either a nonsurgeon skills coach or a faculty surgeon resulted in no difference in performance on a basic surgical skill. This was true for students with and without prior experience and was also true after subsequent clinical experiences. Nonphysician coaches may ease the teaching burden of surgical faculty members while providing similar quality of instruction for trainees.
ABSTRACT: Surgical skills laboratories have become an important venue for early skill acquisition. The principles that govern training in this novel educational environment remain largely unknown; the most common method of training, especially for continuing medical education (CME), is a single multihour event. This study addresses the impact of an alternative method in which learning is distributed over a number of training sessions. The acquisition and transfer of a new skill to a life-like model were assessed. Thirty-eight junior surgical residents, randomly assigned to either massed (1 day) or distributed (weekly) practice regimens, were taught a new skill (microvascular anastomosis). Each group spent the same amount of time in practice. Performance was assessed pretraining, immediately post-training, and 1 month post-training. The ultimate test of anastomotic skill was a transfer test to a live, anesthetized rat. Previously validated computer-based and expert-based outcome measures were used, and clinically relevant outcomes were also assessed. Both groups showed immediate improvement in performance, but the distributed group performed significantly better on the retention test in most outcome measures (time, number of hand movements, and expert global ratings; all P values <0.05). The distributed group also outperformed the massed group on the live rat anastomosis in all expert-based measures (global ratings, checklist score, final product analysis, competency for the OR; all P values <0.05). Our current model of training surgical skills using short courses (for both CME and structured residency curricula) may be suboptimal. Residents retain and transfer skills better when taught in a distributed manner. Despite the greater logistical challenge, we need to restructure training schedules to allow for distributed practice.
Annals of Surgery 10/2006; 244(3):400-409.
ABSTRACT: Educational, medicolegal, and financial constraints have pushed surgical residency programs to find alternatives to operating room teaching for surgical skills training. Several studies have demonstrated that the use of skills laboratories is effective and enhances performance; however, little is known about the facilities available to residents. A survey was distributed to 40 general surgery program directors who, in an earlier questionnaire, indicated that they had skills laboratory facilities at their institutions. The survey included the following sections: demographics, facilities, administrative infrastructure, curriculum, learners, and opinions of program directors. Of the 34 program directors who completed the survey, 76% are from a university program. The average facility is 1400 square feet, and most skills laboratories are located in the hospital. Nearly all skills facilities have dry laboratories (90%), and the most common equipment is box trainers (90%). Average start-up costs were $450,000. Sixty-two percent of programs have a skills curriculum for residents. Respondents agreed that skills laboratories have high value and should be part of residency curricula. The results of this survey provide a preliminary view of skills laboratories. There is variation in the size, location, and availability of simulators in skills laboratory facilities. Variation also exists in curriculum formats, in the subspecialties that make use of the laboratory, and in some administrative approaches. There is strong agreement among respondents that skills laboratories are a necessary and valuable component of residency education. Respondents also indicated concerns about recruiting faculty to teach in the skills laboratory, securing ongoing funding, and implementing a skills laboratory curriculum.
Journal of Surgical Education 01/2007; 64(5):260-265.
ABSTRACT: Previous reports have detailed the efficacy of simulated patients as instructors giving immediate feedback on genital/rectal examination techniques. This prospective study compares the long-term retention of technical and interpersonal skills learned from simulated patients versus traditional methods. The members of a sophomore medical school class were randomly assigned to one of two instructional programs during their Introduction to Clinical Medicine course. A random sample from each group was evaluated shortly afterward. Significant differences were found for 22 of 27 items rated. A similar evaluation was repeated when the students became seniors; the two groups still differed in 13 areas. The results demonstrate the superiority of simulated patient training for long-term retention of instructional material. Even 18 months of intervening clinical exposure could not compensate for the initial differences.
Evaluation & the Health Professions 01/1985; 8(1):69-81.