Preparing First-Year Radiology Residents and Assessing Their Readiness for On-Call Responsibilities: Results Over 5 Years
Department of Radiology, Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA 02215, USA.
American Journal of Roentgenology (Impact Factor: 2.73). 02/2009; 192(2):539-44. DOI: 10.2214/AJR.08.1631
The objective of our study was to evaluate the preparedness of postgraduate year (PGY)-2 residents for independent call responsibilities and the impact of the radiology residency training program on call preparedness using an objective DICOM-based simulation module over a 5-year period.
A month-long emergency radiology lecture series, conducted over 5 consecutive years, was designed and given to radiology residents at all levels. A DICOM-based, interactive, computer-based testing module with actual emergency department cases was developed and administered at the end of the lecture series. First-year and upper-level resident test scores were compared using a Student's t test, generalized estimating equations, and individual fixed effects to determine PGY-2 residents' before-call preparedness and the effectiveness of the simulation module in assessing call preparedness. Residents' scores on the simulation module were also plotted as a function of progression through the residency program to evaluate the impact of the training program on call preparedness.
Over 5 years, 45 PGY-2, 34 PGY-3, 32 PGY-4, and 35 PGY-5 residents attended the lecture series and completed the computer-based testing module. PGY-2 residents scored an average of 71% +/- 15% (SD), PGY-3 residents scored 79% +/- 11%, PGY-4 residents scored 84% +/- 10%, and PGY-5 residents scored 86% +/- 11% of the total points possible. A statistically significant (p < 0.05) difference in scoring on the simulation module was identified between the PGY-2 residents and each upper-level class over the 5-year period and during 4 of 5 examination years analyzed separately. A trend toward higher average scores for each cohort of residents as they progressed through residency training was identified.
Over a 5-year period, first-year radiology residents scored significantly lower than upper-level colleagues on an emergency radiology simulation module, suggesting a significant improvement in the ability of residents to interpret typical on-call imaging studies after the PGY-2 year.
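As an illustration of the between-class score comparison described above, here is a minimal sketch computing a t statistic from only the group summary statistics reported in the abstract (PGY-2: 71% ± 15%, n = 45; PGY-5: 86% ± 11%, n = 35). The Welch unequal-variance form and the helper name are our choices; the abstract does not specify which t-test variant was used.

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic from group means, SDs, and sizes."""
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)  # standard error of the difference
    return (m1 - m2) / se

# Summary statistics reported in the abstract:
# PGY-2: mean 71%, SD 15%, n = 45; PGY-5: mean 86%, SD 11%, n = 35
t = welch_t(71, 15, 45, 86, 11, 35)
print(round(t, 2))  # large negative t: PGY-2 scores well below PGY-5
```

With |t| far above 2, the difference is consistent with the p < 0.05 result reported above.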
Available from: Hsin-Kai Wu
- "Transaction data, the most popular type of OR items, were collected by IMI in 10 assessments, entailing detailed information about test-takers' problem solving processes and solutions in real-world problems (Bennett et al., 2010; Bersky, 1994; Chung et al., 2002; Fitzgerald et al., 1994; Quellmalz et al., 2012; Schacter et al., 1999; Shaw et al., 1997; Stevens et al., 1999; Vendlinski & Stevens, 2002). The second type used in four identified CBSAs called for examinees' free-text answers to the findings, diagnoses, or recommendations for each depicted case (Bowen, 1998; Ganguli, Camacho, Yam & Pedrosa, 2009; Mina et al., 2011; Richardson et al., 2002). The last type of OR items took the form of electronic portfolios in two assessments where a compendium of reports, papers, and other materials documenting learning experiences, performances, attitudes, and professionalism in practice provided a holistic view for evaluation (Cook, Kase, Middelton & Monsen, 2003; Duque, 2003). "
ABSTRACT: Drawing upon an integrated model proposed by Bennett and Bejar (1998), this review examined how 66 computer-based science assessments (CBSAs) in basic science and medicine took advantage of advanced technologies. The model regarded a CBSA as an integrated system, included several assessment components (e.g., assessment purpose, measured construct, test and task design, examinee interface, and scoring procedure), and emphasized the interplay among these components. Accordingly, this study systematically analyzed the item presentations of interactive multimedia, the constructs measured, the response formats in formative and summative assessments, the scoring procedures, the adaptive test activities administered based on algorithms related to item response theory (IRT) and on rules beyond IRT, and the strategies for the automatic provision of informative hints and feedback in the CBSAs. Our analysis revealed that although only 19 of the 66 assessments took advantage of dynamic and interactive media for item presentation, CBSAs with these media could measure integrated understanding of science phenomena and complex problem-solving skills. However, we also found that limitations in automated scoring may lead to infrequent use of automatically provided hints and feedback with open-ended and extended responses. These findings suggest the interrelatedness of the assessment components, and thus we argue that designers should repeatedly consider the relationships between components of CBSAs to ensure the validity of the assessments. Finally, we indicate issues for future research in computer-based assessments.
Computers & Education 07/2013; 68:388-403. DOI:10.1016/j.compedu.2013.06.002 · 2.56 Impact Factor
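The abstract above mentions adaptive test activities driven by item response theory. As hedged background (the reviewed assessments' specific models are not stated here), the standard two-parameter logistic (2PL) item response function that underlies many IRT-based adaptive tests can be sketched as:

```python
import math

def p_correct(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that an
    examinee with ability theta answers correctly an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An adaptive test typically selects the next item whose difficulty b is
# near the current ability estimate theta, where p_correct is about 0.5.
print(round(p_correct(0.0, 1.0, 0.0), 2))  # 0.5 when theta equals b
```

Items matched to ability this way are maximally informative, which is why adaptive algorithms converge on an examinee's ability with fewer items than a fixed test.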
ABSTRACT: The start of call is a stressful time for radiology residents. Traditional teaching methods are not ideal for call preparation because they differ radically from the task performed on call. The purpose of this study was to determine whether a computer-based radiology simulator would affect residents' confidence levels or diagnostic abilities.
A simulator was created to mimic the picture archive and communication system (PACS) at our hospital. Typical call-level cases were selected, anonymized, and entered into the database. The first-year residents were randomly split into a control group and a study group that used the simulator. Each resident took a survey 1 month before and after beginning call to measure his or her subjective feeling of preparedness and nervousness. Objective measures were also obtained through the use of discordance levels from on-call cases.
Seventy-one cases were entered into the simulator. Of the 12 residents in the first-year class, 7 were placed in the study group and 5 in the control group. Residents in both groups reported feeling more prepared and less nervous 1 month after starting call. The differences in survey responses between the groups were not statistically significant, although the residents in the study group trended toward feeling more prepared and less nervous. There was no statistical difference in the discordance rates for on-call cases between the two groups.
Although statistical significance was not reached between users of the radiology simulator and the control group, residents subjectively felt that the simulator was useful for call preparation and as an interactive learning tool. A larger study group might reach statistical significance.
Academic Radiology 11/2007; 14(10):1271-83. DOI:10.1016/j.acra.2007.06.011 · 1.75 Impact Factor
Available from: Alan Schwartz
01/2009; Report to the Accreditation Council for Graduate Medical Education.