Evaluation of an Audit With Feedback Continuing Medical Education Program for Radiation Oncologists
The Cancer Institute, Singapore.
Journal of Cancer Education 02/2005; 20(4):216-221. DOI: 10.1207/s15430154jce2004_9
Meta-analyses demonstrate that audit with feedback (AWF) is effective continuing medical education (CME). However, efficacy varies between specialties, and there is little published radiation oncologist (RO)-specific evidence. We evaluated an AWF CME intervention for ROs, assessing efficacy, cost-effectiveness, and participant satisfaction.
CME program: The program incorporated a fortnightly random patient chart audit, scoring the adequacy of management via a checklist. Scores were presented at a same-day institutional meeting, where case management was discussed. Senior peers provided individualized, educational feedback.
Changes in behavior and performance were evaluated by chart review of new patients seen by ROs in the 2 months before commencement of AWF (T0) and at months 13-14 of the program (T1). Behavior and performance were evaluated with a validated, reproducible, 19-item instrument. Criteria for each case audited included 10 targeted and 3 nontargeted behavior items and 6 performance items; each was scored 1 point if deemed adequate (maximum score 19). Cost-effectiveness was reported as the cost to the institution per item point gained. Program perception was evaluated via the mean score (out of 5) on a 14-item questionnaire.
A total of 113 and 118 charts were evaluated at T0 and T1, respectively. The mean targeted behavior score improved between T0 and T1 (from 8.7 to 9.2 out of 10, P = .0001), with no significant improvement in nontargeted behavior or performance items. The annual cost was US$7,897, and the cost per point gained was US$15. Participant satisfaction was positive and increased after distribution of the efficacy results (P = .0001).
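The cost-effectiveness metric above is simply the annual program cost divided by the total item points gained across audited charts. A minimal sketch of that arithmetic follows; the chart count and per-chart gain are hypothetical placeholders for illustration, not the study's actual denominators.

```python
# Illustrative sketch of "cost per item point gained":
#   annual program cost / total item points gained across audited charts.
# The inputs in the example are hypothetical, not the study's figures.

def cost_per_point_gained(annual_cost_usd: float,
                          charts_audited: int,
                          mean_point_gain_per_chart: float) -> float:
    """Annual cost divided by total item points gained across all charts."""
    total_points_gained = charts_audited * mean_point_gain_per_chart
    return annual_cost_usd / total_points_gained

# Hypothetical example: US$7,897 annual cost, 500 charts audited over the
# year, each gaining about 1 point on the 19-item instrument.
print(round(cost_per_point_gained(7897, 500, 1.0), 2))  # → 15.79
```

With assumed denominators of this order, the metric lands near the US$15-per-point figure reported, which is why it is a useful single-number summary for comparing CME interventions.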
Audit with comparative, individualized, educational feedback is cost-effective, positively perceived CME that significantly improves targeted RO behavior. Further research into the design and evaluation of oncologists' CME is required.
- "In 1999, the Royal Australian and New Zealand College of Radiologists (RANZCR) designed an instrument to check and record the quality of radiotherapy notes and prescriptions. This audit tool has been shown to be cost-effective, to improve targeted performance, and to be positively received by participants. The RANZCR audit tool mandates feedback to treating clinicians as an educational function as well as to provide an additional tier of safety assessment."
The Royal Australian and New Zealand College of Radiologists (RANZCR) initiated a unique instrument to audit the quality of patient notes and radiotherapy prescriptions. We present our experience from ten years' use of the RANZCR audit instrument.
In this study, we assessed data collected prospectively through the audit instrument from January 1999 to June 2009. Radiotherapy chart rounds were held weekly in the uro-oncology tumour stream, and real-time feedback was provided. Electronic medical records were retrospectively reviewed in September 2009 to determine whether any omissions had subsequently been corrected.
In total, 2,597 patients were audited. One hundred and thirty-seven (5%) patients had 199 omissions in documentation or radiotherapy prescription. No omissions were found in 79% of chart rounds, one omission was found in 12%, and two or more omissions were found in 9%. Of the 199 omissions, 95% concerned record keeping and 2% were omissions in the treatment prescription. Of all omissions, 152 (76%) were unfiled investigation results, of which 77 (51%) were subsequently corrected.
Real-time audit with feedback is an effective tool in assessing the standards of radiotherapy documentation in our department, and also probably contributed to the high level of attentiveness. A large proportion of omissions were investigation results, which highlights the need for an improved system of retrieval of investigation results in the radiation oncology department.
BMC Health Services Research 04/2013; 13(1):148. DOI: 10.1186/1472-6963-13-148
The external audit of oncologist clinical practice is increasingly important because of the incorporation of audits into national maintenance of certification (MOC) programs. However, there are few reports of external audits of oncology practice or decision making. Our institution (The Cancer Institute, Singapore) was asked to externally audit an oncology department in a developing Asian nation, providing a unique opportunity to explore the feasibility of such a process.
We audited 100 randomly selected patients simulated for radiotherapy in 2003, using a previously reported audit instrument assessing clinical documentation/quality assurance and medical decision making.
Clinical documentation/quality assurance, decision making, and overall performance criteria were adequate 74.4%, 88.3%, and 80.2% of the time, respectively. Overall, 52.0% of cases received suboptimal management. Multivariate analysis revealed that palliative intent was associated with improved documentation/clinical quality assurance (p = 0.07), decision making (p = 0.007), overall performance (p = 0.003), and optimal treatment rates (p = 0.07); non-small-cell lung cancer or central nervous system primary sites were associated with better decision making (p = 0.001), overall performance (p = 0.03), and optimal treatment rates (p = 0.002).
Despite the poor results, the external audit had several benefits. It identified learning needs for future targeting, and the auditor provided facilitating feedback to address systematic errors identified. Our experience was also helpful in refining our national revalidation audit instrument. The feasibility of the external audit supports the consideration of including audit in national MOC programs.
International Journal of Radiation Oncology·Biology·Physics 04/2006; 64(3):941-947. DOI: 10.1016/j.ijrobp.2005.08.027
There has been little radiation oncologist (RO)-specific research in continuing medical education (CME) or quality improvement (QI) program efficacy. Our aim was to evaluate a CME/QI program for changes in RO behavior, performance, and adherence to department protocols/studies over the first 12 months of the program.
The CME/QI program combined chart audit with feedback (C-AWF), simulation review audit with feedback (SR-AWF), reminder checklists, and targeted CME tutorials. Between April 2003 and March 2004, the management of 75 patients was evaluated by C-AWF and that of 178 patients by SR-AWF, using a validated instrument. Scores were presented, and case management was discussed with individualized educational feedback. RO behavior and performance were compared over the first year of the program.
Comparing the first and second 6 months, there were significant improvements in mean behavior (from 12.7 to 13.6 out of 14, p = 0.0005) and RO performance (from 7.6 to 7.9 out of 8, p = 0.018) scores. Protocol/study adherence improved significantly, from 90.3% to 96.6% (p = 0.005). A total of 50 actions were generated, including the identification of learning needs to direct CME tutorials, the systematic change of suboptimal RO practice, and the alteration of deficient management in 3% of patients audited during the program.
An integrated CME/QI program combining C-AWF, SR-AWF, QI reminders, and targeted CME tutorials effectively improved targeted RO behavior and performance over a 12-month period. There was a corresponding increase in departmental protocol and study adherence.
International Journal of Radiation Oncology·Biology·Physics 01/2007; 66(5):1457-1460. DOI: 10.1016/j.ijrobp.2006.07.018