Article

Development of a model of the coronary arterial tree for the 4D XCAT phantom.

Department of Radiology, Johns Hopkins University, Baltimore, MD, USA.
Physics in Medicine and Biology (Impact Factor: 2.92). 09/2011; 56(17):5651-63. DOI: 10.1088/0031-9155/56/17/012
Source: PubMed

ABSTRACT: A detailed three-dimensional (3D) model of the coronary artery tree with cardiac motion has great potential for applications in a wide variety of medical imaging research areas. In this work, we first developed a computer-generated 3D model of the coronary arterial tree for the heart in the extended cardiac-torso (XCAT) phantom, thereby creating a realistic computer model of the human anatomy. The coronary arterial tree model was based on two datasets: (1) a gated cardiac dual-source computed tomography (CT) angiographic dataset obtained from a normal human subject and (2) statistical morphometric data of porcine hearts. The initial proximal segments of the vasculature and the anatomical details of the boundaries of the ventricles were defined by segmenting the CT data. An iterative rule-based generation method was developed and applied to extend the coronary arterial tree beyond the initial proximal segments. The algorithm was governed by three factors: (1) statistical morphometric measurements of the connectivity, lengths and diameters of the arterial segments; (2) avoidance forces from other vessel segments and the boundaries of the myocardium; and (3) optimality principles which minimize the drag force at the bifurcations of the generated tree. Using this algorithm, a 3D computational model of the largest six orders of the coronary arterial tree was generated, spreading across the myocardium of the left and right ventricles. The 3D coronary arterial tree model was then extended to 4D to simulate different cardiac phases by deforming the original 3D model according to the motion vector map of the 4D cardiac model of the XCAT phantom at the corresponding phases. As a result, a detailed and realistic 4D model of the coronary arterial tree was developed for the XCAT phantom by imposing anatomical and physiological constraints of the coronary vasculature. This new 4D coronary artery tree model provides a unique simulation tool that can be used in the development and evaluation of instrumentation and methods for imaging normal and pathological hearts with myocardial perfusion defects.
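
The growth algorithm outlined above is iterative and rule based: each new segment is drawn from morphometric statistics, pushed away from existing vessels and the myocardial boundaries by avoidance forces, and placed so as to keep drag at the bifurcations low. The Python sketch below illustrates what one such growth step might look like; all function names, sampling distributions, and cost weights are hypothetical stand-ins for illustration only, not the authors' actual algorithm or the porcine morphometric data.

```python
# Illustrative sketch of a single rule-based growth step, NOT the paper's
# algorithm: lengths, diameters, weights, and the cost model are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def poiseuille_resistance(length, diameter, viscosity=3.5e-3):
    """Laminar (Poiseuille) flow resistance of a cylindrical segment."""
    return 128.0 * viscosity * length / (np.pi * diameter ** 4)

def avoidance_penalty(tip, other_points):
    """Inverse-square crowding penalty from points on existing vessels."""
    return sum(1.0 / (np.linalg.norm(tip - p) ** 2 + 1e-9) for p in other_points)

def grow_daughter(parent_tip, parent_dir, parent_diam, other_points,
                  n_candidates=50):
    """Propose candidate daughter segments and keep the one that best trades
    a viscous-drag proxy against crowding of already-placed vessels."""
    diameter = 0.8 * parent_diam                 # crude Murray-like taper
    best_tip, best_cost = None, np.inf
    for _ in range(n_candidates):
        length = rng.normal(8.0, 1.0)            # mm; placeholder statistics
        new_dir = parent_dir + 0.5 * rng.normal(size=3)
        new_dir /= np.linalg.norm(new_dir)
        tip = parent_tip + length * new_dir
        cost = (poiseuille_resistance(length, diameter)
                + 10.0 * avoidance_penalty(tip, other_points))  # illustrative weight
        if cost < best_cost:
            best_tip, best_cost = tip, cost
    return best_tip, diameter

# Example: extend one terminal segment near two existing vessel points.
existing = [np.array([5.0, 0.0, 0.0]), np.array([0.0, 5.0, 0.0])]
tip, diam = grow_daughter(np.zeros(3), np.array([0.0, 0.0, 1.0]), 3.0, existing)
```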

  • ABSTRACT: Many scientists in the field of x-ray imaging rely on the simulation of x-ray images. As the phantom models become more and more realistic, their projection requires high computational effort. Since x-ray images are based on transmission, many standard graphics acceleration algorithms cannot be applied to this task. However, if adapted properly, the simulation speed can be increased dramatically using state-of-the-art graphics hardware. A custom graphics pipeline that simulates transmission projections for tomographic reconstruction was implemented based on moving spline surface models. All steps, from tessellation of the splines to projection onto the detector and drawing, are implemented in OpenCL. To increase performance, we introduced a special append buffer that stores each ray's intersections with the scene. Intersections are then sorted and resolved to materials. Lastly, an absorption model is evaluated to yield an absorption value for each projection pixel. Projection of a moving spline structure is fast and accurate. Projections of size 640 × 480 can be generated within 254 ms. Reconstructions using the projections show errors below 1 HU with a sharp reconstruction kernel. Traditional GPU-based acceleration schemes are not suitable for our reconstruction task: even in the absence of noise, they result in errors up to 9 HU on average, although the projection images appear correct under visual examination. Projections generated with our new method are suitable for the validation of novel CT reconstruction algorithms. For complex simulations, such as the evaluation of motion-compensated reconstruction algorithms, this kind of x-ray simulation will reduce the computation time dramatically.
    Physics in Medicine and Biology 09/2012; 57(19):6193-210. DOI:10.1088/0031-9155/57/19/6193 · 2.92 Impact Factor
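
    The pipeline above stores each ray's scene intersections in an append buffer, sorts them, resolves them to materials, and only then evaluates an absorption model per detector pixel. The sketch below shows that last step on the CPU under a standard Beer-Lambert model; the material names, attenuation coefficients, and intersection depths are made-up placeholders, and the code is not the paper's OpenCL implementation.

    ```python
    # Minimal Beer-Lambert evaluation for one detector pixel, given the ray's
    # sorted intersections already resolved to materials (placeholder values).
    import math

    def ray_transmission(crossings, mu):
        """crossings: sorted (depth_mm, material_entered) pairs; the last entry
        uses None to mark the exit. mu: attenuation coefficient per material, 1/mm.
        Returns the transmitted fraction I/I0 along the ray."""
        line_integral = 0.0
        for (d0, material), (d1, _) in zip(crossings, crossings[1:]):
            if material is not None:              # segment lies inside 'material'
                line_integral += mu[material] * (d1 - d0)
        return math.exp(-line_integral)

    # Example ray: enters tissue at 10 mm, bone at 40 mm, back to tissue at 55 mm,
    # exits the body at 120 mm (depths and coefficients are illustrative).
    mu = {"tissue": 0.020, "bone": 0.048}         # 1/mm
    hits = [(10.0, "tissue"), (40.0, "bone"), (55.0, "tissue"), (120.0, None)]
    print(ray_transmission(hits, mu))
    ```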
  • ABSTRACT: Simultaneous dual-radionuclide myocardial perfusion/innervation SPECT imaging can provide important information about the mismatch between scar tissue and denervated regions. The Siemens IQ-SPECT system developed for cardiac imaging uses a multifocal SMARTZOOM collimator to achieve a four-fold sensitivity gain for the cardiac region compared to a typical parallel-hole low-energy high-resolution collimator, but without the data truncation that can result with conventional converging-beam collimators. The increased sensitivity allows shorter image acquisition times or reduced patient dose, making IQ-SPECT ideal for simultaneous dual-radionuclide SPECT, where reduced administered activity is desirable in order to reduce patient radiation exposure. However, crosstalk is a major factor affecting image quality in dual-radionuclide imaging. In this work we developed a model-based method that can estimate and compensate for the crosstalk in IQ-SPECT data. The crosstalk model takes into account interactions in the object and the collimator-detector system. Scatter in the object was modeled using the effective source scatter estimation (ESSE) technique, previously developed to model scatter with parallel-hole collimators. The geometric collimator-detector response was analytically modeled in the IQ-SPECT projector. The estimated crosstalk was then compensated for in an iterative reconstruction process. The new method was validated with data from both Monte Carlo simulations and physical phantom experiments. The results showed that the estimated crosstalk was in good agreement with simulated and measured results. After model-based compensation, images from simultaneous dual-radionuclide acquisitions were similar in quality to those from single-radionuclide acquisitions without crosstalk contamination. The proposed model-based method can be used to improve simultaneous dual-radionuclide images acquired using IQ-SPECT. This work also demonstrates that ESSE scatter modeling can be applied to non-parallel-beam projection geometries.
    Physics in Medicine and Biology 05/2014; 59(11):2813. DOI:10.1088/0031-9155/59/11/2813 · 2.92 Impact Factor
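
    The compensation described above folds a model-based crosstalk estimate into iterative reconstruction. A common way to do this is to add the estimated crosstalk counts to the forward projection inside an MLEM update, as in the minimal sketch below; the system matrix, projection data, and crosstalk term are random placeholders, not IQ-SPECT geometry or ESSE output.

    ```python
    # MLEM with an additive crosstalk estimate in the forward model; a generic
    # sketch with random placeholder data, not the IQ-SPECT system model.
    import numpy as np

    def mlem_with_crosstalk(A, y, crosstalk, n_iter=20):
        """A: system matrix (bins x voxels), y: measured projections,
        crosstalk: estimated crosstalk counts per bin, added when forward projecting."""
        x = np.ones(A.shape[1])
        sensitivity = A.sum(axis=0)
        for _ in range(n_iter):
            y_est = A @ x + crosstalk                   # forward project + crosstalk
            ratio = y / np.maximum(y_est, 1e-12)
            x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
        return x

    # Tiny synthetic example (purely illustrative geometry and counts).
    rng = np.random.default_rng(1)
    A = rng.random((64, 16))
    x_true = rng.random(16)
    crosstalk = 0.1 * rng.random(64)
    y = rng.poisson(A @ x_true + crosstalk).astype(float)
    x_hat = mlem_with_crosstalk(A, y, crosstalk)
    ```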
  • ABSTRACT: Radiation dose calculation using models of the human anatomy has been a subject of great interest to radiation protection, medical imaging, and radiotherapy. However, early pioneers of this field did not foresee the exponential growth of research activity observed today. This review article walks the reader through the history of research and development in this field of study, which started some 50 years ago. The review identifies a clear progression of computational phantom complexity which can be grouped into three distinct generations. The first generation of stylized phantoms, a group of fewer than a dozen models, was initially developed in the 1960s at Oak Ridge National Laboratory to calculate internal doses from nuclear medicine procedures. Despite their anatomical simplicity, these computational phantoms were the best tools available at the time for internal/external dosimetry, image evaluation, and treatment dose evaluations. A second generation of a large number of voxelized phantoms arose rapidly in the late 1980s as a result of the increased availability of tomographic medical imaging and computers. Surprisingly, the last decade saw the emergence of a third generation of phantoms based on advanced geometries called boundary representation (BREP), in the form of Non-Uniform Rational B-Splines (NURBS) or polygonal meshes. This new class of phantoms now consists of over 287 models, including those used for non-ionizing radiation applications. This review article aims to provide the reader with a general understanding of how the field of computational phantoms came about and the technical challenges it faced at different times. This goal is achieved by defining basic geometry modeling techniques and by analyzing selected phantoms in terms of geometrical features and the dosimetric problems to be solved. The rich historical information is summarized in four tables, aided by highlights in the text on how some of the most well-known phantoms were developed and used in practice. Some of the information covered in this review has not been previously reported, for example, the CAM and CAF phantoms developed in the 1970s for space radiation applications. The author also clarifies confusion between 'population-average' prospective dosimetry needed for radiological protection under the current ICRP radiation protection system and 'individualized' retrospective dosimetry often performed for medical physics studies. To illustrate the impact of computational phantoms, a section of this article is devoted to examples from the author's own research group. Finally, the author explains an unexpected finding made during the preparation of this article: the phantoms from the past 50 years followed a pattern of exponential growth. The review ends with a brief discussion of future research needs (a supplementary file, '3DPhantoms.pdf', accompanying figure 15 is available for download and allows the reader to interactively visualize the phantoms in 3D).
    Physics in Medicine and Biology 08/2014; 59(18):R233-R302. DOI:10.1088/0031-9155/59/18/R233 · 2.92 Impact Factor
