Article

Personal Computer versus Workstation Display: Observer Performance in Detection of Wrist Fractures on Digital Radiographs

Middlemore Hospital, Auckland, New Zealand
Radiology (Impact Factor: 6.87). 01/2006; 237(3):872-7. DOI: 10.1148/radiol.2373041439
Source: PubMed

ABSTRACT

Purpose: To retrospectively compare the accuracy of observer performance with a personal computer (PC) display versus a dedicated picture archiving and communication system (PACS) workstation display in the detection of wrist fractures on computed radiographs.
Materials and methods: This study was conducted according to the principles of the Declaration of Helsinki (2002 version) of the World Medical Association. The institutional clinical board approved the study; informed consent was not required. Seven observers independently assessed randomized, anonymized digital radiographs of the wrist from 259 subjects: 146 had fractures and 113 were healthy control subjects (151 male and 108 female subjects; average age, 33 years). Follow-up radiographs and/or computed tomographic scans were used as the reference standard for patients with fractures; follow-up radiographs and/or clinical history data were used as the reference standard for controls. The PC was a standard hospital machine with a 17-inch (43-cm) color monitor running Web-browser display software. The PACS workstation had two portrait 21-inch (53-cm) monochrome monitors displaying 2300 lines. The observers assigned scores to the radiographs on a scale of 1 (no fracture) to 5 (definite fracture). Receiver operating characteristic (ROC) curves, sensitivity, specificity, and accuracy were compared.
Results: The areas under the ROC curves were almost identical for the PC and the workstation (0.910 vs 0.918, respectively; difference, 0.008; 95% confidence interval: -0.029, 0.013). The average sensitivity with the PC was almost identical to that with the workstation (85% vs 84%, respectively), as was the average specificity (82% vs 81%, respectively). The average accuracy (83%) was the same for both.
Conclusion: The results of this study showed no difference in the accuracy of observer performance for detection of wrist fractures with a PC compared with a PACS workstation.
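For readers unfamiliar with the summary measures reported above, the following sketch shows how sensitivity, specificity, and accuracy are defined. The counts are hypothetical, chosen only to land near the reported figures; they are not the study's raw data.

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Return (sensitivity, specificity, accuracy) as fractions.

    tp/fn: fracture cases correctly detected / missed
    tn/fp: controls correctly cleared / falsely called fractured
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Hypothetical counts for 259 subjects (146 fractures, 113 controls):
sens, spec, acc = diagnostic_metrics(tp=124, fn=22, tn=93, fp=20)
```

With these made-up counts, sensitivity is about 0.85, specificity about 0.82, and accuracy about 0.84, close to the averages the abstract reports.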

Available from: James Le Fevre, Jan 05, 2016
  • Source
    • "There has been relatively little published work regarding the influence of display technology on medical image reproduction; the work that exists examines the technical ability of particular display types and their limitations in displaying the stored information [4]. These studies have generally used geometric test patterns and objective measurement to assess image fidelity directly; no studies have made a direct comparison of the diagnostic accuracy of a projector and a diagnostic monitor in the context of orthopaedic fractures. Some studies have shown important differences when evaluating non-orthopaedic images on smartphones, for example [5], or when studying fractures on different types of computer monitor [6]; a study using more recent monitor technology did not show any significant difference between high-resolution diagnostic monitors and standard PC displays [7]. Garden I intracapsular neck of femur fractures were chosen as the subject of study because this injury frequently necessitates treatment and was felt to be potentially subtle enough to challenge the various technical limitations of the displays being compared."
    ABSTRACT: Introduction: Since the introduction of digital X-rays, many orthopaedic departments have used digital projection systems to display diagnostic images during discussion; no published work has directly compared the sensitivity of high-resolution diagnostic monitors with that of standard digital projection systems in the context of orthopaedic injuries. Materials and methods: Participants were asked to review AP pelvic radiographs of non-displaced hip fractures on the department's digital projector and again on a diagnostic monitor; results were compared to determine whether a true difference in sensitivity between the display modalities existed. Results: A significant difference was found between the sensitivity of the diagnostic monitor and that of the meeting-room projector: 0.85 vs 0.55, respectively (95% CI 0.78-0.89 vs 0.47-0.63); absolute difference 0.3 (95% CI 0.28-0.32; p ≤ 0.001). Inter-observer agreement was moderate. Discussion: A difference in sensitivity was demonstrated with a high level of statistical power, and a positive result on either modality was highly likely to represent a true fracture; however, a fracture cannot be confidently excluded by examining a single image on the digital projector alone. The study was limited to a single view of one particular fracture type and may not be generalisable to all types of subtle fracture; in addition, the retrospective nature of the image review means that the sensitivity figures cannot be applied to a presenting patient population. Conclusions: This study demonstrates a significant difference in sensitivity between the two display types, which may lead to delays and unnecessary further imaging if clinicians are not aware of the projector's limitation. Clinicians who clinically suspect a fracture should always seek to review the images on a validated PACS display device if the fracture is not seen on a non-validated device. Departments should evaluate their current equipment, consider what is available, determine the most suitable equipment for the environment in which it is used, and weigh the potential implications for patient care.
    Preview · Article · Nov 2015 · Injury
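The headline comparison in the abstract above (sensitivity 0.85 vs 0.55, absolute difference 0.3) is a difference between two proportions. A minimal sketch of the standard Wald interval for that difference, assuming independent samples and illustrative reading counts (`n1` and `n2` below are hypothetical; the paper's tighter interval presumably reflects its paired design):

```python
import math

def diff_in_proportions_ci(p1, n1, p2, n2, z=1.96):
    """Point estimate and Wald 95% CI for the difference p1 - p2."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Illustrative: 0.85 vs 0.55 sensitivity over hypothetical 200 readings each.
diff, lo, hi = diff_in_proportions_ci(0.85, 200, 0.55, 200)
```

An interval that excludes zero, as here, is consistent with the significant difference the authors report.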
  • Source
    • "According to various other medical studies that examined brain CT [25], radiography of wrist fractures [26, 27], computed radiographs of the hands in early rheumatoid arthritis [28], and chest radiographs in interstitial lung disease [29] on different displays, no significant difference was detected between the monitors, concurring with our study."
    ABSTRACT: Vertical root fracture (VRF) is a complication that is chiefly diagnosed radiographically. Recently, film-based radiography has been supplanted by digital radiography, and a wide range of monitors is available for viewing digital images. The present study aimed to compare the diagnostic accuracy, sensitivity, and specificity of medical-grade and conventional monitors in the detection of vertical root fractures. In this in vitro study, 228 extracted single-rooted human teeth were endodontically treated, and vertical root fractures were induced in 114 samples. The teeth were imaged with a digital charge-coupled device radiography system using the paralleling technique. The images were evaluated twice by a radiologist and an endodontist on a medical-grade and a conventional liquid-crystal display (LCD) monitor. The z-test was used to analyze the sensitivity, accuracy, and specificity for each monitor, with the significance level set at 0.05; inter- and intra-observer agreement was calculated with Cohen's kappa. Accuracy, specificity, and sensitivity for the conventional monitor were 67.5%, 72%, and 62.5%, respectively; for the medical-grade monitor, they were 67.5%, 66.5%, and 68%, respectively. Statistical analysis showed no significant difference in detecting VRF between the two monitors. Inter-observer agreement was 0.47 for the conventional monitor and 0.55 for the medical monitor (moderate); intra-observer agreement was 0.78 for the medical monitor and 0.87 for the conventional one (substantial). The type of monitor does not influence the diagnosis of vertical root fractures.
    Full-text · Article · Jan 2013 · Iranian Endodontic Journal
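The inter- and intra-observer agreement figures above are Cohen's kappa values. A self-contained sketch of the statistic follows; the rating lists are illustrative, not the study's data.

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels on the same items."""
    assert len(a) == len(b) and a, "need paired, non-empty ratings"
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n  # observed agreement P(o)
    categories = set(a) | set(b)
    # Chance agreement P(e): product of each rater's marginal frequencies.
    expected = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Illustrative: two raters labelling 8 roots as fractured (1) or intact (0).
kappa = cohens_kappa([1, 1, 1, 0, 0, 0, 1, 0], [1, 1, 0, 0, 0, 0, 1, 1])
```

Here the two raters agree on 6 of 8 items with chance agreement 0.5, giving kappa = 0.5, which falls in the "moderate" band conventionally cited.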
  • Source
    • "The detectability was significantly poorer in the 800-speed CR image than in the 200- and 400-speed images; no significant difference was found between the latter two. Another recent ROC experiment compared observer performance in detection of wrist fractures with a common PC display and a dedicated diagnostic display (Doyle et al. 2005): surprisingly, no differences in

    Preview · Article ·