Evaluation of an audit with feedback continuing medical education program for radiation oncologists.
ABSTRACT: Meta-analyses demonstrate that audit with feedback (AWF) is an effective form of continuing medical education (CME). However, efficacy varies between specialties, and little radiation oncologist (RO)-specific evidence has been published. We evaluated an AWF CME intervention for ROs, determining its efficacy, cost-effectiveness, and participant satisfaction.
CME program: The program incorporated a fortnightly audit of randomly selected patient charts, with management adequacy scored via a checklist. Scores were presented at a same-day institutional meeting, where case management was discussed. Senior peers provided individualized educational feedback.
Changes in behavior and performance were evaluated by chart review of new patients seen by ROs in the 2 months before commencement of AWF (T0) and at months 13-14 of the program (T1), using a validated, reproducible, 19-item instrument. Criteria for each audited case comprised 10 targeted and 3 nontargeted behavior items and 6 performance items, each scored 1 point if deemed adequate (maximum score 19). Cost-effectiveness was reported as the cost to the institution per item point gained. Program perception was evaluated as the mean score (out of 5) on a 14-item questionnaire.
A total of 113 and 118 charts were evaluated at T0 and T1, respectively. The mean score for targeted behavior improved between T0 and T1 (from 8.7 to 9.2 out of 10, P = .0001), with no significant improvement in nontargeted behavior or performance items. The annual cost was US $7,897, and the cost per point gained was US $15. Participant satisfaction was positive and increased after distribution of the efficacy results (P = .0001).
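The cost-effectiveness metric above is a simple ratio: annual program cost divided by total item points gained. As a minimal sketch, the reported figures imply roughly 7,897 / 15 ≈ 526 item points gained over the year; that 526 is back-calculated here for illustration, not a figure reported in the abstract:

```python
def cost_per_point(annual_cost_usd: float, total_points_gained: float) -> float:
    """Cost to the institution per item point gained (the abstract's metric)."""
    return annual_cost_usd / total_points_gained

# $7,897 annual cost is reported; ~526 points is a back-calculated illustration.
print(round(cost_per_point(7897, 526)))  # -> 15
```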
Audit with comparative, individualized, educational feedback is a cost-effective and positively perceived form of CME that significantly improves targeted RO behavior. Further research into the design and evaluation of oncologists' CME is required.
- International Journal of Radiation Oncology · Biology · Physics 08/1996; 35(4):821-6. · 4.52 Impact Factor
ABSTRACT: The process of interpreting the results of clinical studies and translating them into clinical practice is being debated. Here we examine the role of p values and confidence intervals in clinical decision-making, and draw attention to confusion in their interpretation. To improve result reporting, we propose the use of confidence levels and the plotting of clinical significance curves and risk-benefit contours. These curves and contours provide degrees of probability of both the potential benefit of treatment and the detriment due to toxicity. They also provide clinicians with a mechanism for translating study results into treatment for individual patients, thus improving the clinical decision-making process. We illustrate the application of these curves and contours by reference to published studies. Confidence levels, clinical significance curves, and risk-benefit contours can be easily calculated with a hand calculator or standard statistical packages. We advocate their incorporation into the published results of clinical studies.
- The Lancet 05/2001; 357(9265):1349-53. · 39.06 Impact Factor
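The confidence intervals the abstract advocates are indeed easy to compute with a hand calculator or a few lines of code. As a minimal sketch (not the authors' method; the standard normal-approximation interval for a difference of two proportions, with hypothetical trial counts for illustration):

```python
import math

def diff_ci(x1: int, n1: int, x2: int, n2: int, z: float = 1.96):
    """Normal-approximation confidence interval for a difference of two
    proportions (e.g. response rates in two treatment arms).
    z = 1.96 gives a 95% interval; varying z traces out other confidence levels."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff - z * se, diff + z * se

# Hypothetical example: 30/100 responders vs 20/100 responders.
lo, hi = diff_ci(30, 100, 20, 100)
print(lo, hi)  # -> approximately (-0.019, 0.219)
```

Because the interval spans zero, this hypothetical difference would not reach significance at the 5% level, which is exactly the kind of interpretation (interval, not bare p value) the abstract argues should reach clinicians.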
ABSTRACT: Our primary aim was to design a new, internationally accredited, comprehensive radiation oncology (RO) training program for Singaporean residents that satisfied stakeholder needs and incorporated published evidence. The evidence-based method included a Medline literature review and a broad-based training needs assessment. The literature review revealed few studies describing or evaluating RO resident training programs. Our program was designed by incorporating the available published research and the stakeholder views determined by the training needs assessment. The program includes novel evidence-based educational methods, including individually negotiated learning contracts, a mentor program, logbooks, task-based learning, tutorials, and formative plus summative assessments. The content and structure are consistent with most United States, United Kingdom, and Royal Australian and New Zealand College of Radiologists (RANZCR) guidelines, with resident evaluation via RANZCR examinations. The RANZCR accredited the program in January 2002. We recommend that institutions or countries introducing or revising RO resident training programs use an evidence-based approach, addressing stakeholder needs (determined by a comprehensive training needs assessment) and incorporating published research. Novel educational methods may be considered in RO training. This new Singapore program is the first to achieve international accreditation by the RANZCR. Additional research into the design and evaluation of RO resident training programs is required.
- International Journal of Radiation Oncology · Biology · Physics 08/2004; 59(4):1157-62. · 4.52 Impact Factor