Three-dimensional reconstruction and quantification of cervical carcinoma invasion fronts from histological serial sections.

Interdisciplinary Center for Bioinformatics, University Leipzig, Leipzig, Germany.
IEEE Transactions on Medical Imaging (Impact Factor: 3.39). 11/2005; 24(10):1286-307.
Source: PubMed


The analysis of the three-dimensional (3-D) structure of tumoral invasion fronts of carcinoma of the uterine cervix is a prerequisite for understanding their architectural-functional relationship. The invasion patterns known so far range from a smooth tumor-host boundary surface to diffusely spreading patterns, all of which are supposed to have different prognostic relevance. A decisive limitation of previous studies is that all morphological assessments could only be made verbally, with reference to single histological sections. The intention of this paper is therefore to obtain an objective quantification of tumor invasion based on 3-D reconstructed tumoral tissue data. The image processing chain introduced here is capable of reconstructing selected parts of tumor invasion fronts from histological serial sections of remarkable extent (90-500 slices). While potentially offering good accuracy and reasonably high resolution, microtome cutting of large serial sections may induce severe artifacts such as distortions, folds, fissures, or gaps. Starting from stacks of digitized transmitted-light color images, a total of three registration steps form the main parts of the presented algorithm. With this approach, we achieved the most detailed 3-D reconstruction of the invasion of solid tumors so far. Once reconstructed, the invasion front of the segmented tumor is quantified using discrete compactness.
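The abstract's closing step, quantification by discrete compactness, is commonly computed from the number of shared voxel faces in the segmented 3-D volume. A minimal numpy sketch following Bribiesca's contact-surface formulation (an assumption here; the paper may normalize differently, and the function name is illustrative):

```python
import numpy as np

def discrete_compactness(vol):
    """Discrete compactness of a 3-D binary volume (Bribiesca-style).

    C_d = A_c / A_c_max, where A_c is the number of shared voxel faces
    and A_c_max = 3 * (n - n**(2/3)) is the maximum possible for n voxels
    (attained by a solid cube). C_d approaches 1 for compact invasion
    fronts and 0 for diffusely spreading ones.
    """
    v = np.asarray(vol, dtype=bool)
    n = int(v.sum())
    if n < 2:
        return 0.0
    # count face-adjacent voxel pairs along each of the three axes
    contacts = sum(
        np.logical_and(v.take(range(v.shape[ax] - 1), axis=ax),
                       v.take(range(1, v.shape[ax]), axis=ax)).sum()
        for ax in range(3))
    max_contacts = 3.0 * (n - n ** (2.0 / 3.0))
    return float(contacts / max_contacts)
```

A solid cube of voxels yields a value of 1, while a one-voxel-thick filament scores much lower, which is what makes the measure useful for grading how diffusely a tumor front spreads.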



Available from: Jens Einenkel, Jan 23, 2014
    • "Their values vary between 0 and 1 for a perfect sphere. These parameters are popular for evaluating morphological changes in biological entities [17]. "
    ABSTRACT: Studying how individual cells spatially and temporally organize within the embryo is a fundamental issue in modern developmental biology to better understand the first stages of embryogenesis. In order to perform high-throughput analyses in three-dimensional microscopic images, it is essential to be able to automatically segment, classify and track cell nuclei. Many 3D/4D segmentation and tracking algorithms have been reported in the literature. Most of them are specific to particular models or acquisition systems and often require the fine tuning of parameters. We present a new automatic algorithm to segment and simultaneously classify cell nuclei in 3D/4D images. Segmentation relies on training samples that are interactively provided by the user and on an iterative thresholding process. This algorithm can correctly segment nuclei even when they are touching, and remains effective under temporal and spatial intensity variations. The segmentation is coupled to a classification of nuclei according to cell cycle phases, allowing biologists to quantify the effect of genetic perturbations and drug treatments. Robust 3D geometrical shape descriptors are used as training features for classification. Segmentation and classification results of three complete datasets are presented. In our working dataset of the Caenorhabditis elegans embryo, only 21 nuclei out of 3,585 were not detected, the overall F-score for segmentation reached 0.99, and more than 95% of the nuclei were classified in the correct cell cycle phase. No merging of nuclei was found. We developed a novel generic algorithm for segmentation and classification in 3D images. The method, referred to as Adaptive Generic Iterative Thresholding Algorithm (AGITA), is freely available as an ImageJ plug-in.
    BMC Bioinformatics 01/2014; 15(1):9. DOI:10.1186/1471-2105-15-9 · 2.58 Impact Factor
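The quoted snippet above refers to shape parameters that equal 1 for a perfect sphere; the classic example is sphericity, psi = pi**(1/3) * (6V)**(2/3) / A. A voxel-based sketch (function name and exposed-face surface estimate are illustrative assumptions, not the cited paper's descriptors):

```python
import numpy as np

def voxel_sphericity(vol, spacing=1.0):
    """Approximate sphericity psi = pi**(1/3) * (6V)**(2/3) / A of a 3-D
    binary volume, where V is the total voxel volume and A the exposed-face
    surface area. psi = 1 for an ideal sphere; the staircase surface of a
    voxelized shape biases the estimate downward.
    """
    v = np.asarray(vol, dtype=bool)
    n = int(v.sum())
    if n == 0:
        return 0.0
    V = n * spacing ** 3
    padded = np.pad(v, 1)  # zero border so rolls cannot wrap around
    # face contacts between 6-neighbors along each axis
    contacts = sum(np.logical_and(padded, np.roll(padded, 1, axis=ax)).sum()
                   for ax in range(3))
    A = (6 * n - 2 * int(contacts)) * spacing ** 2
    return float(np.pi ** (1 / 3) * (6 * V) ** (2 / 3) / A)
```

A digitized ball scores well above an elongated rod of voxels, which is the discriminating behavior such descriptors are used for.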
    • "An important consideration is that histopathological features of tumor malignancy are usually diagnosed from two-dimensional (2D) microscopic findings, whereas tumor progression actually occurs within a three dimensional (3D) microenvironment. Recent developments in computing power and software sophistication have facilitated the 3D reconstruction of anatomical and pathological structures [21–24]. We previously described the development of technology to reconstruct the tumor-host environment using serial histological sections of archival paraffin-embedded human cancer specimens [25]. "
    ABSTRACT: We conducted three-dimensional (3D) reconstruction of oral tongue squamous cell carcinoma (OTSCC) using serial histological sections to visualize the architecture of invasive tumors. Fourteen OTSCC cases were collected from archival paraffin-embedded specimens. Based on a pathodiagnostic survey of whole cancer lesions, a core tissue specimen (3 mm in diameter) was dissected out from the deep invasion front using a paraffin tissue microarray. Serial sections (4 μm thick) were double immunostained with pan-cytokeratin and Ki67 antibodies and digitized images were acquired using virtual microscopy. For 3D reconstruction, image registration and RGB color segmentation were automated using ImageJ software to avoid operator-dependent subjective errors. Based on the 3D tumor architecture, we classified the mode of invasion into four types: pushing and bulky architecture; trabecular architecture; diffuse spreading; and special forms. Direct visualization and quantitative assessment of the parenchymal-stromal border provide a new dimension in our understanding of OTSCC architecture. These 3D morphometric analyses also ascertained that cell invasion (individually and collectively) occurs at the deep invasive front of the OTSCC. These results demonstrate the advantages of histology-based 3D reconstruction for evaluating tumor architecture and its potential for a wide range of applications.
    International Journal of Dentistry 10/2013; 2013(2):482765. DOI:10.1155/2013/482765
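The "RGB color segmentation" step in the abstract above can be illustrated with a crude nearest-reference-color classifier (the function, label names, and reference colors are assumptions for illustration, not the authors' ImageJ workflow):

```python
import numpy as np

def rgb_segment(img, refs):
    """Assign each pixel to the nearest reference color (Euclidean in RGB).

    img:  (H, W, 3) uint8 image.
    refs: dict mapping a stain label to its reference (r, g, b) color,
          e.g. one color per immunostain.
    Returns an (H, W) integer label map indexing into list(refs).
    """
    labels = list(refs)
    ref_arr = np.array([refs[k] for k in labels], dtype=float)  # (K, 3)
    # broadcast to (H, W, K, 3), then reduce to per-pixel distances (H, W, K)
    d = np.linalg.norm(img[..., None, :].astype(float) - ref_arr, axis=-1)
    return d.argmin(axis=-1)
```

In practice, stain separation on brightfield histology is usually done with color deconvolution in optical-density space rather than raw RGB distances; the sketch only conveys the idea of pixel-wise color classification.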
    • "Automated registration of histological sections (stained with the same modality) has been attempted on cervical carcinoma by Braumann et al. [9], while automated registration of multimodal microscopy with application to PCa is considered in a recent paper by Kwak et al. [10]. Their aim was to register pairs of images, from light microscopy and infrared spectroscopy, in order to extract morphological features for use in the classification of cancer versus non-cancer cases. "
    ABSTRACT: Prostate cancer is one of the leading causes of cancer-related deaths. For diagnosis, predicting the outcome of the disease, and for assessing potential new biomarkers, pathologists and researchers routinely analyze histological samples. Morphological and molecular information may be integrated by aligning microscopic histological images in a multiplex fashion. This process is usually time-consuming and results in intra- and inter-user variability. The aim of this study is to investigate the feasibility of using modern image analysis methods for automated alignment of microscopic images from differently stained adjacent paraffin sections from prostatic tissue specimens. Tissue samples, obtained from biopsy or radical prostatectomy, were sectioned and stained with either hematoxylin & eosin (H&E), immunohistochemistry for p63 and AMACR, or time-resolved fluorescence (TRF) for the androgen receptor (AR). Image pairs were aligned allowing for translation, rotation, and scaling. The registration was performed automatically by first detecting landmarks in both images using the scale-invariant feature transform (SIFT), followed by the well-known RANSAC protocol for finding point correspondences, and finally aligned by Procrustes fit. The registration results were evaluated using both visual and quantitative criteria as defined in the text. Three experiments were carried out. First, images of consecutive tissue sections stained with H&E and p63/AMACR were successfully aligned in 85 of 88 cases (96.6%). The failures occurred in 3 out of 13 cores with highly aggressive cancer (Gleason score >= 8). Second, TRF and H&E image pairs were aligned correctly in 103 out of 106 cases (97%). The third experiment considered the alignment of image pairs with the same staining (H&E) coming from a stack of 4 sections. The success rate for alignment dropped from 93.8% in adjacent sections to 22% for the sections furthest away.
The proposed method is both reliable and fast and therefore well suited for automatic segmentation and analysis of specific areas of interest, combining morphological information with protein expression data from three consecutive tissue sections. Finally, the performance of the algorithm seems to be largely unaffected by the Gleason grade of the prostate tissue samples examined, at least up to Gleason score 7.
    BMC Cancer 09/2013; 13(1):408. DOI:10.1186/1471-2407-13-408 · 3.36 Impact Factor
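The final step of the pipeline described above, the Procrustes fit of a similarity transform (translation, rotation, scaling) to matched landmarks, has a well-known closed form (Umeyama's). A self-contained numpy sketch, assuming the point correspondences have already been filtered by RANSAC and with the function name chosen for illustration:

```python
import numpy as np

def procrustes_fit(src, dst):
    """Least-squares similarity transform mapping src -> dst.

    src, dst: (N, 2) arrays of matched landmark coordinates
    (e.g. RANSAC-filtered SIFT correspondences between two sections).
    Returns (s, R, t) with scale s, 2x2 rotation R, and translation t
    such that dst ~= s * src @ R.T + t (Umeyama's closed-form solution).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    xs, xd = src - mu_s, dst - mu_d
    cov = xd.T @ xs / len(src)               # cross-covariance (2, 2)
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))       # guard against a reflection
    D = np.diag([1.0, d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / xs.var(axis=0).sum()
    t = mu_d - s * (R @ mu_s)
    return s, R, t
```

Restricting the model to a similarity transform (rather than a full affine or deformable one) matches the abstract's "translation, rotation and scaling" and keeps the fit robust when only a handful of correspondences survive RANSAC.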