Personal Computer versus Workstation Display: Observer Performance in Detection of Wrist Fractures on Digital Radiographs1

Middlemore Hospital, Auckland, New Zealand
Radiology (Impact Factor: 6.21). 01/2006; 237(3):872-7. DOI: 10.1148/radiol.2373041439
Source: PubMed

ABSTRACT To retrospectively compare the accuracy of observer performance with a personal computer (PC) display versus a dedicated picture archiving and communication system (PACS) workstation display in the detection of wrist fractures on computed radiographs.
This study was conducted according to the principles of the Declaration of Helsinki (2002 version) of the World Medical Association. The institutional clinical board approved the study; informed consent was not required. Seven observers independently assessed randomized, anonymized digital radiographs of the wrist from 259 subjects: 146 had fractures, and 113 were healthy control subjects (151 male and 108 female subjects; average age, 33 years). Follow-up radiographs and/or computed tomographic scans served as the reference standard for patients with fractures; follow-up radiographs and/or clinical history data served as the reference standard for controls. The PC was a standard hospital machine with a 17-inch (43-cm) color monitor running Web browser display software. The PACS workstation had two portrait 21-inch (53-cm) monochrome monitors that displayed 2300 lines. The observers scored each radiograph on a scale of 1 (no fracture) to 5 (definite fracture). Receiver operating characteristic (ROC) curves, sensitivity, specificity, and accuracy were compared.
The areas under the ROC curves were almost identical for the PC and workstation (0.910 vs 0.918, respectively; difference, 0.008; 95% confidence interval: -0.029, 0.013). The average sensitivity with the PC was almost identical to that with the workstation (85% vs 84%, respectively), as was the average specificity (82% vs 81%, respectively). The average accuracy (83%) was the same for both.
The results of this study showed no difference in the accuracy of observer performance for detection of wrist fractures between a PC and a PACS workstation.
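The evaluation described above follows standard ROC methodology: each 5-point confidence score is thresholded to yield sensitivity/specificity pairs, and the area under the resulting curve summarizes observer performance across all thresholds. A minimal sketch of that calculation, using made-up illustrative scores (not the study's data):

```python
# Sketch of the ROC-style evaluation described in the abstract.
# Scores run from 1 (no fracture) to 5 (definite fracture); truth is
# 1 = fracture, 0 = healthy control. The data below are illustrative only.

def confusion_at(scores, truth, threshold):
    """Call 'fracture' when score >= threshold; return (sensitivity, specificity)."""
    tp = sum(1 for s, t in zip(scores, truth) if t == 1 and s >= threshold)
    fn = sum(1 for s, t in zip(scores, truth) if t == 1 and s < threshold)
    tn = sum(1 for s, t in zip(scores, truth) if t == 0 and s < threshold)
    fp = sum(1 for s, t in zip(scores, truth) if t == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

def empirical_auc(scores, truth):
    """Trapezoidal area under the empirical ROC curve."""
    points = []
    for thr in range(1, 7):  # thr=1 gives (FPR, TPR) = (1, 1); thr=6 gives (0, 0)
        sens, spec = confusion_at(scores, truth, thr)
        points.append((1 - spec, sens))  # (false-positive rate, true-positive rate)
    points.sort()
    # Sum trapezoid areas between successive ROC points.
    return sum((x1 - x0) * (y0 + y1) / 2
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

# Illustrative data: 5 fractures, 5 controls.
scores = [5, 4, 5, 3, 2, 1, 2, 1, 3, 2]
truth  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
sens, spec = confusion_at(scores, truth, 3)   # → 0.8, 0.8
auc = empirical_auc(scores, truth)            # → 0.90
```

The study's reported areas (0.910 and 0.918) are per-observer averages of exactly this kind of summary, which is why a difference of 0.008 with a confidence interval spanning zero supports the "no difference" conclusion.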

  • Source
    • "The detectability was significantly poorer in the 800-speed CR image than in the 200-and 400-speed images; no significant difference was found between the latter two. Another recent ROC experiment compared observer performance in detection of wrist fractures with a common PC display and a dedicated diagnostic display (Doyle et al. 2005): surprisingly, no differences in "
  • Source
    ABSTRACT: To compare the diagnostic value of low-cost computer monitors and a Picture Archiving and Communication System (PACS) workstation for the evaluation of cervical spine fractures in the emergency room. Two groups of readers blinded to the diagnoses (2 radiologists and 3 orthopaedic surgeons) independently assessed digital radiographs of the cervical spine (anterior-posterior, oblique and trans-oral dens views). The radiographs of 57 patients who presented consecutively to the emergency room in 2004 with clinical suspicion of a cervical spine injury were evaluated. The diagnostic value of each radiograph was scored on a 3-point scale (1 = diagnosis not possible/bad image quality, 2 = diagnosis uncertain, 3 = clear diagnosis of fracture or no fracture) on a PACS workstation and on two different liquid crystal display (LCD) personal computer monitors. The images were randomised to avoid memory effects. We used logistic mixed-effects models to determine the possible effects of monitor type on the evaluation of x-ray images. To determine the overall effects of monitor type, this variable was used as a fixed effect, and the image number and reader group (radiologist or orthopaedic surgeon) were used as random effects on display quality. Group-specific effects were examined, with the reader group and additional fixed effects as terms. A significance level of 0.05 was established for assessing the contribution of each fixed effect to the model. Overall, the diagnostic score did not differ significantly between standard personal computer monitors and the PACS workstation (both p values were 0.78). Low-cost LCD personal computer monitors may be useful in establishing a diagnosis of cervical spine fractures in the emergency room.
    Emergency Medicine Journal 12/2006; 23(11):850-3. DOI:10.1136/emj.2006.036822 · 1.78 Impact Factor