Article

A comparison of RANZCR and Singapore-designed radiation oncology practice audit instruments: how does reproducibility affect future approaches to revalidation?

Radiotherapy Centre, The Cancer Institute, National University Hospital, Singapore.
Australasian Radiology (Impact Factor: 0.51). 06/2004; 48(2):195-203. DOI: 10.1111/j.1440-1673.2004.01296.x
Source: PubMed

ABSTRACT: Physician competency assessment requires the use of validated methods and instruments. The Royal Australian and New Zealand College of Radiologists (RANZCR) developed a draft audit form to be evaluated as a competency assessment instrument for radiation oncologists (ROs) in Australasia. We evaluated the reliability of the RANZCR instrument, as well as that of a separate instrument designed by The Cancer Institute (TCI), Singapore, by having two ROs perform an independent chart review of 80 randomly selected patients seen at TCI, Singapore. Both the RANZCR and TCI Singapore instruments were used to score each chart. Inter- and intra-observer reliability for both audit instruments were compared using misclassification rates as the primary end-point. Overall, for inter-observer reproducibility, 2.3% of TCI Singapore items were misclassified compared with 22.3% of RANZCR items (P < 0.0001; 100.00% confidence that the TCI instrument has less inter-observer misclassification). For intra-observer reproducibility, 2.4% of TCI Singapore items were misclassified compared with 13.6% of RANZCR items (P < 0.0001; 100.00% confidence that the TCI instrument has less intra-observer misclassification). The proposed RANZCR RO revalidation audit instrument requires further refinement to improve validity. Several items require modification or removal because of a lack of reliability, while other important and reproducible items can be incorporated, as demonstrated by the TCI Singapore instrument. The TCI Singapore instrument also has the advantage of incorporating a simple scoring system and criticality index to allow discrimination between ROs and comparisons against future College standards.
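The comparison above rests on contrasting two misclassification proportions. As a minimal sketch of how such a contrast can be tested, the Python snippet below applies Fisher's exact test to hypothetical item counts; the abstract reports only the percentages (2.3% vs 22.3%) and the P-value, not the item denominators or the exact statistical method used in the study.

```python
# Illustrative only: comparing inter-observer misclassification rates between
# the two audit instruments with Fisher's exact test. The item denominators
# below are hypothetical placeholders, not figures from the study.
from scipy.stats import fisher_exact

tci_misclassified, tci_items = 28, 1200         # ~2.3% of an assumed 1200 scored items
ranzcr_misclassified, ranzcr_items = 268, 1200  # ~22.3% of an assumed 1200 scored items

table = [
    [tci_misclassified, tci_items - tci_misclassified],
    [ranzcr_misclassified, ranzcr_items - ranzcr_misclassified],
]
odds_ratio, p_value = fisher_exact(table)

print(f"TCI Singapore misclassification rate: {tci_misclassified / tci_items:.1%}")
print(f"RANZCR misclassification rate:        {ranzcr_misclassified / ranzcr_items:.1%}")
print(f"Fisher's exact P-value:               {p_value:.1e}")
```

With any realistic denominators of this size, the gap between 2.3% and 22.3% yields a P-value far below 0.0001, consistent with the result reported in the abstract.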

  • ABSTRACT: The National Radiotherapy Single Machine Unit Trial was a joint Australian and Victorian Government initiative to establish single machine radiotherapy services in regional areas. The trial arose in response to the need for decentralised radiotherapy services to improve access to treatment for rural patients. Key aims of the trial included assessing whether single machine radiotherapy services could be successfully established and operated in regional areas, what impact they would have on patient access and radiotherapy utilisation, and whether they could provide radiotherapy of equivalent safety and reliability to metropolitan services. An evaluation of the Single Machine Unit Trial was undertaken by the Victorian Department of Human Services to assess how well the trial aims were met. The trial successfully demonstrated that single machine radiotherapy departments lead to more appropriate radiotherapy utilisation rates for rural cancer services, while providing quality of care comparable to larger metropolitan centres.
    Cancer Forum 07/2007; 31(2).
  • ABSTRACT: In September 2006, the Royal Australian and New Zealand College of Radiologists (RANZCR) endorsed the modified Peer Review Audit Tool (PRAT). We aimed to assess the feasibility of using this tool in a busy radiation oncology department using an electronic medical record (EMR) system, identify areas of compliance and assess the impact of the audit process on patient management. Fortnightly random clinical audit was undertaken using the revised RANZCR PRAT in the departments of radiation oncology at Liverpool and Macarthur Cancer Therapy Centres (LCTC and MCTC). Following audit of the EMR, treatment plans were audited by peer review. Data were collected prospectively from June 2007 to June 2008. Audits were carried out on 208 patients. Behaviour criteria were well documented in the EMR, but scanning of histology and medical imaging reports did not occur in up to a third of cases. With electronic prescriptions, treatment prescription errors were rare. In total, 8 (3.8%) out of 208 patients had a change to management recommended (an illustrative confidence interval for this rate is sketched after this list). Variability in interpretation of the PRAT 'protocol/study' criteria was identified. We found that real-time audit is feasible and effective in detecting both issues with documentation in the EMR and a small number of patients in whom a change to management is recommended. Recommendations have been made to further improve the audit process, including documenting any changes recommended and whether the recommended change occurred.
    Journal of Medical Imaging and Radiation Oncology 09/2009; 53(4):405-11.
  • ABSTRACT: The Royal Australian and New Zealand College of Radiologists (RANZCR) continuing professional development programme incorporates audit with feedback as one important activity. The 2004 audit tool improves radiation oncologist practice quality; however, the instrument is designed to be regularly refined. To refine the 2004 audit tool and present the new instrument, we incorporated comments and suggestions from: (i) the auditor and radiation oncologist from the single machine unit trial; (ii) members of the RANZCR Post-Fellowship Education Committee; (iii) the New South Wales Department of Health mandatory prescription requirements; and (iv) the International Atomic Energy Agency audit tool. In July 2006, the revised instrument was designed and then endorsed by the Post-Fellowship Education Committee. Important changes include: (i) combining criteria that separately scored documentation and correctness for similar items; (ii) scoring the treatment schedule more explicitly; (iii) separating target volume coverage and critical structure dose; (iv) altering performance criteria scoring to be sensitive to peer review when no consensus can be reached; and (v) strengthening the instructions for use and notes to improve comprehension and acceptance. The refined 2006 instrument should be more user-friendly while increasing its usefulness.
    Journal of Medical Imaging and Radiation Oncology 09/2008; 52(4):403-13.
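The Liverpool and Macarthur audit above reports that a change to management was recommended for 8 of 208 audited patients (3.8%). As a minimal sketch, assuming a standard Wilson score interval (the published audit does not report one), the snippet below shows how a 95% confidence interval could be placed around that rate.

```python
# Illustrative only: a 95% Wilson score interval around the reported
# change-to-management rate of 8/208 (~3.8%). The interval is computed
# here for context and is not taken from the published audit.
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half_width = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return centre - half_width, centre + half_width

rate = 8 / 208
low, high = wilson_ci(8, 208)
print(f"Change-to-management rate: {rate:.1%} (95% CI {low:.1%} to {high:.1%})")
```

Under these assumptions the interval spans roughly 2% to 7%, which gives a sense of the uncertainty around a low event rate in a sample of this size.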