Assessment of BLS skills: Optimizing use of instructor and manikin measures

University of Dundee School of Medicine, Dundee, Tayside Centre for General Practice, Mackenzie Building, Kirsty Semple Way, Dundee DD2 4BF, United Kingdom.
Resuscitation (Impact Factor: 4.17). 03/2008; 76(2):233-43. DOI: 10.1016/j.resuscitation.2007.07.018
Source: PubMed


The primary objective of layperson CPR training is to ensure that learners achieve minimal competence to provide aid that improves the odds of survival of victims of out-of-hospital sudden cardiac arrest. During CPR courses, pronouncement of a learner's competence typically depends entirely on judgments made by an instructor; yet previous research strongly suggests that these judgments - particularly of chest compressions - are not sufficiently precise or accurate to ensure valid assessments. Comparisons of instructors' subjective assessments with objective data from recording manikins provide one means of understanding the magnitude and type of instructor errors in assessment.
Eight hundred and twenty-six laypersons between 40 and 70 years old participated in CPR training. Performance of five discrete skills was tested in a scenario format immediately afterward: assessing responsiveness, calling the emergency telephone number 911, delivering ventilations of adequate volume, demonstrating correct hand placement for compressions, and delivering compressions of adequate depth. Thirteen AHA-certified instructors assessed these five skills and rendered a global performance rating; sensor-equipped Resusci Anne manikins with SkillReporting software recorded ventilation and compression data.
Instructors' ratings of ventilation skills were highly accurate, and ratings of compressions were correct about 83% of the time overall; however, inadequate compression depth was rated as adequate 55% of the time, and incorrect hand placement was rated as adequate 49% of the time.
Instructors' judgments alone are not sufficient to determine learners' competence in performing compressions. Assessment, technology, and guidelines must be better aligned so that learners can receive accurate feedback.
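The core comparison in the study is between an instructor's subjective pass/fail judgment and the objective criterion recorded by the manikin. A minimal sketch of that comparison is below; it is illustrative only, not the study's analysis code, and the field names and the 38 mm minimum-depth threshold are assumptions rather than the study's actual criteria.

```python
# Illustrative sketch: how often does an instructor rate an objectively
# inadequate performance as adequate (a "false pass")?
# Field names and the 38 mm depth threshold are assumed for illustration.

def manikin_pass(mean_depth_mm, min_depth_mm=38.0):
    """Objective criterion: mean compression depth meets the assumed minimum."""
    return mean_depth_mm >= min_depth_mm

def false_pass_rate(records):
    """Fraction of objectively inadequate performances the instructor rated adequate."""
    inadequate = [r for r in records if not manikin_pass(r["depth_mm"])]
    if not inadequate:
        return 0.0
    return sum(r["instructor_pass"] for r in inadequate) / len(inadequate)

records = [
    {"depth_mm": 42.0, "instructor_pass": True},   # truly adequate, rated adequate
    {"depth_mm": 30.0, "instructor_pass": True},   # inadequate, yet rated adequate
    {"depth_mm": 28.0, "instructor_pass": False},  # inadequate, correctly failed
]
print(false_pass_rate(records))  # 0.5: half the inadequate performances passed
```

In the study's terms, a false-pass rate of 0.55 for compression depth corresponds to inadequate depth being rated adequate 55% of the time.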



Available from: Bonnie Lynch, May 08, 2015
    • "Since BLS skills mastery rapidly decays and should not be assumed to persist for pre-defined time periods, regular skill assessment should be established to determine the need for refresher training [6]. Current BLS testing methods require the presence of an instructor, making testing time-consuming with a risk of instructor bias [7]. Acquiring objective data from recording manikins provides more accurate information about skills mastery than instructor judgement. "
    ABSTRACT: Current methods of assessing Basic Life Support (BLS) skills (chest compressions and ventilations) require the presence of an instructor. This is time-consuming and introduces instructor bias. Since BLS skills testing is a routine activity, it is potentially suitable for automation. We developed a fully automated BLS testing station, without an instructor, using innovative software linked to a training manikin. The goal of our study was to investigate the feasibility of adequate testing (effectiveness) within the shortest period of time (efficiency). As part of a randomised controlled trial investigating different compression depth training strategies, 184 medical students received an individual appointment for a retention test six months after training. An interactive Flash (Adobe Systems Inc., USA) user interface was developed to guide the students through the testing procedure after login, while Skills Station software (Laerdal Medical, Norway) automatically recorded compressions and ventilations and their duration ("time on task"). In a subgroup of 29 students, the room entrance and exit times were registered to assess efficiency. To obtain qualitative insight into effectiveness, students' perceptions of the instructional organisation and the usability of the fully automated testing station were surveyed. During testing, data registration was incomplete for two students, and one student performed compressions only. The average time on task for the remaining 181 students was three minutes (SD 0.5). In the subgroup, the average overall time spent in the testing station was 7.5 minutes (SD 1.4). Mean scores were 5.3/6 (SD 0.5, range 4.0-6.0) for instructional organisation and 5.0/6 (SD 0.61, range 3.1-6.0) for usability. Students highly appreciated the automated testing procedure. Our automated testing station was an effective and efficient method of assessing BLS skills in medical students. Instructional organisation and usability were judged to be very good. This method enables future formative assessment and certification procedures to be carried out without instructor involvement. Trial registration: B67020097543.
    BMC Medical Education 07/2012; 12(1):58. DOI:10.1186/1472-6920-12-58 · 1.22 Impact Factor
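An automated testing station of the kind described above ultimately reduces to checking recorded performance values against guideline ranges. The sketch below shows one way such a check could look; the thresholds and field names are assumptions for illustration (roughly in line with 2005-era adult guideline ranges), not the criteria or software used in the study.

```python
# Hypothetical automated BLS pass/fail check against assumed guideline ranges.
# The bounds below are illustrative, not the study's actual scoring criteria.

GUIDELINES = {
    "depth_mm": (38.0, 51.0),        # assumed adequate compression depth range
    "rate_per_min": (90.0, 110.0),   # assumed acceptable compression rate band
    "ventilation_ml": (500.0, 600.0) # assumed adequate ventilation volume range
}

def within(value, bounds):
    lo, hi = bounds
    return lo <= value <= hi

def assess(performance):
    """Return a per-criterion verdict plus an overall pass/fail."""
    verdict = {k: within(performance[k], b) for k, b in GUIDELINES.items()}
    verdict["overall"] = all(verdict.values())
    return verdict

result = assess({"depth_mm": 45.0, "rate_per_min": 100.0, "ventilation_ml": 550.0})
print(result["overall"])  # True: all recorded values fall inside the assumed ranges
```

Because every criterion is checked mechanically, a station like this removes the instructor-bias and precision problems that the parent paper documents for subjective compression assessment.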
    • "Ideally, the design of the educational study should include a pre-test. However, in this study, a pre-test to examine if different conditions would change the team performance of D-CPR could possibly have influenced the performance of teams in Group A. To strengthen reliability, mannequin-based data of CPR performance should be used in addition to observational data [39]. The video observations in this study demonstrate that the raters assessed some aspects of D-CPR performance in different ways, indicated by some large inter-observer differences. "
    ABSTRACT: Although nurses must be able to respond quickly and effectively to cardiac arrest, numerous studies have demonstrated poor performance. Simulation is a promising learning tool for resuscitation team training, but few studies examine simulation for training defibrillation and cardiopulmonary resuscitation (D-CPR) in teams from the nursing education perspective. The aim of this study was to investigate the extent to which nursing student teams follow the D-CPR algorithm in a simulated cardiac arrest, and whether observing a simulated cardiac arrest scenario and participating in the post-simulation debriefing would improve team performance. We studied video-recorded simulations of D-CPR performance in 28 nursing student teams. Besides describing the overall performance of D-CPR, we compared D-CPR performance in two groups. Group A (n = 14) performed D-CPR in a simulated cardiac arrest scenario, while Group B (n = 14) performed D-CPR after first observing the performance of Group A and participating in the debriefing. We developed a D-CPR checklist to assess team performance. Overall, there were large variations in how accurately the nursing student teams performed the specific parts of the D-CPR algorithm. While few teams performed opening the airway and examination of breathing correctly, all teams used a 30:2 compression:ventilation ratio. We found no difference between Group A and Group B in D-CPR performance, either in total points on the checklist or in time variables. We found that none of the nursing student teams achieved top scores on the D-CPR checklist. Observing the training of other teams did not increase subsequent performance. We think all this indicates that more time must be assigned to repetitive practice and reflection. Moreover, the most important aspects of D-CPR, such as early defibrillation and hands-off time in relation to shock, must be highlighted in team training of nursing students.
    Scandinavian Journal of Trauma Resuscitation and Emergency Medicine 04/2012; 20(1):23. DOI:10.1186/1757-7241-20-23 · 2.03 Impact Factor
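The quoted passage above notes large inter-observer differences when raters score the same checklist from video. A simple way to quantify that is percent agreement between two raters on binary checklist items, sketched below; this is illustrative only, and the item names are invented rather than taken from the study's D-CPR checklist.

```python
# Illustrative sketch: percent agreement between two raters scoring the
# same binary checklist items from video. Item names are invented.

def percent_agreement(rater_a, rater_b):
    """Fraction of checklist items on which both raters gave the same verdict."""
    items = rater_a.keys()
    agree = sum(rater_a[i] == rater_b[i] for i in items)
    return agree / len(items)

rater_a = {"opens_airway": True,  "checks_breathing": False, "ratio_30_2": True, "early_shock": True}
rater_b = {"opens_airway": False, "checks_breathing": False, "ratio_30_2": True, "early_shock": True}
print(percent_agreement(rater_a, rater_b))  # 0.75: raters disagree on one of four items
```

Percent agreement is the crudest such measure; chance-corrected statistics (e.g. Cohen's kappa) are usually preferred when raters' marginal rates differ, which is one reason supplementing video ratings with manikin-recorded data, as the quote suggests, strengthens reliability.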
  • ABSTRACT: This article reports a secondary data analysis of a year-long study with 606 nursing students involving brief monthly CPR practice with voice-activated manikins versus no practice. Findings indicate that even with monthly practice and accurate voice-activated manikin feedback, some students could not perform CPR correctly. Implications of these findings for staff educators are discussed.
    Journal for nurses in staff development: JNSD: official journal of the National Nursing Staff Development Organization 01/2012; 28(1):9-15. DOI:10.1097/NND.0b013e318240a6ad