Conference Paper

2D Time-Difference Electrical Impedance Tomography Image Reconstruction in a Head Model with Regularization by Denoising


Abstract

Time-Difference Electrical Impedance Tomography (TDEIT) is an imaging technique for visualizing resistivity changes over time in a region of interest. Because TDEIT is an ill-posed inverse problem, regularization is necessary. In this work, we use Regularization by Denoising (RED) with four different denoisers to reconstruct brain images in a simplified 2D head model, and compare the RED results to two traditional reconstruction methods, generalized Tikhonov regularization and total variation regularization. In both the noiseless and the noisy scenarios, RED with non-local means as the denoiser achieved the best results with respect to figures of merit such as ringing, resolution, and shape deformation.
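The generalized Tikhonov baseline mentioned in the abstract can be sketched for linearized time-difference imaging as follows. This is a minimal illustration, not the authors' implementation: a random matrix stands in for the real FEM-derived sensitivity (Jacobian) matrix, and the problem sizes are arbitrary.

```python
import numpy as np

# Linearized TDEIT: the measured voltage change dv ≈ J @ dx, where J is the
# Jacobian (sensitivity matrix) of the forward model and dx the resistivity
# change between two time instants. A random J stands in for a real one here.
rng = np.random.default_rng(0)
n_meas, n_pix = 208, 64                     # illustrative sizes only
J = rng.standard_normal((n_meas, n_pix))
dx_true = np.zeros(n_pix)
dx_true[10:14] = 1.0                        # a small localized change
dv = J @ dx_true + 0.01 * rng.standard_normal(n_meas)   # noisy measurements

# Generalized Tikhonov: minimize ||J dx - dv||^2 + lam * ||L dx||^2.
# L = I gives standard zeroth-order Tikhonov; a difference operator
# gives other generalized variants.
lam = 1.0
L = np.eye(n_pix)
dx_hat = np.linalg.solve(J.T @ J + lam * (L.T @ L), J.T @ dv)
```

The normal-equations solve above is the closed form of the regularized least-squares problem; RED and TV replace the quadratic penalty with a denoiser-induced or edge-preserving one.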


References
Article
Full-text available
A constrained optimization type of numerical algorithm for removing noise from images is presented. The total variation of the image is minimized subject to constraints involving the statistics of the noise. The constraints are imposed using Lagrange multipliers. The solution is obtained using the gradient-projection method. This amounts to solving a time-dependent partial differential equation on a manifold determined by the constraints. As t → ∞ the solution converges to a steady state which is the denoised image. The numerical algorithm is simple and relatively fast. The results appear to be state-of-the-art for very noisy images. The method is noninvasive, yielding sharp edges in the image. The technique could be interpreted as a first step of moving each level set of the image normal to itself with velocity equal to the curvature of the level set divided by the magnitude of the gradient of the image, and a second step which projects the image back onto the constraint set.
Article
Full-text available
Electrical impedance tomography (EIT) is an attractive method for clinically monitoring patients during mechanical ventilation, because it can provide a non-invasive continuous image of pulmonary impedance which indicates the distribution of ventilation. However, most clinical and physiological research in lung EIT is done using older and proprietary algorithms; this is an obstacle to interpretation of EIT images because the reconstructed images are not well characterized. To address this issue, we develop a consensus linear reconstruction algorithm for lung EIT, called GREIT (Graz consensus Reconstruction algorithm for EIT). This paper describes the unified approach to linear image reconstruction developed for GREIT. The framework for the linear reconstruction algorithm consists of (1) detailed finite element models of a representative adult and neonatal thorax, (2) consensus on the performance figures of merit for EIT image reconstruction and (3) a systematic approach to optimize a linear reconstruction matrix to desired performance measures. Consensus figures of merit, in order of importance, are (a) uniform amplitude response, (b) small and uniform position error, (c) small ringing artefacts, (d) uniform resolution, (e) limited shape deformation and (f) high resolution. Such figures of merit must be attained while maintaining small noise amplification and small sensitivity to electrode and boundary movement. This approach represents the consensus of a large and representative group of experts in EIT algorithm design and clinical applications for pulmonary monitoring. All software and data to implement and test the algorithm have been made available under an open source license which allows free research and commercial use.
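A few of the GREIT-style figures of merit listed above can be computed from a reconstructed 2D image roughly as follows. This is a simplified sketch using the quarter-amplitude set; the exact GREIT definitions (e.g., amplitude response as a ratio to the target amplitude) differ in normalization.

```python
import numpy as np

def figures_of_merit(img, target_xy):
    """Simplified GREIT-style figures of merit for a single target:
    the quarter-amplitude set q is the region above 25% of the image max."""
    q = img >= 0.25 * img.max()
    ys, xs = np.nonzero(q)
    amplitude = img.sum()                    # unnormalized amplitude response
    cx, cy = xs.mean(), ys.mean()            # centre of gravity of q
    position_error = np.hypot(cx - target_xy[0], cy - target_xy[1])
    resolution = np.sqrt(q.sum() / q.size)   # sqrt of fractional area of q
    return amplitude, position_error, resolution

# Example: a Gaussian blob reconstructed at (x, y) = (20, 10) in a 32x32 image.
yy, xx = np.mgrid[0:32, 0:32]
img = np.exp(-((xx - 20) ** 2 + (yy - 10) ** 2) / (2 * 2.0 ** 2))
amp, pe, res = figures_of_merit(img, target_xy=(20, 10))
```

In the GREIT framework such measures are evaluated over many target positions to assess uniformity of the reconstruction matrix.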
Article
Full-text available
This paper develops a mathematical model for the physical properties of electrodes suitable for use in electric current computed tomography (ECCT). The model includes the effects of discretization, shunt, and contact impedance. The complete model was validated by experiment. Bath resistivities of 284.0, 139.7, 62.3, and 29.5 Ω·cm were studied. Values of "effective" contact impedance ζ used in the numerical approximations were 58.0, 35.0, 15.0, and 7.5 Ω·cm², respectively. Agreement between the calculated and experimentally measured values was excellent throughout the range of bath conductivities studied. It is desirable in electrical impedance imaging systems to model the observed voltages to the same precision as they are measured in order to be able to make the highest resolution reconstructions of the internal conductivity that the measurement precision allows. The complete electrode model, which includes the effects of discretization of the current pattern, the shunt effect due to the highly conductive electrode material, and the effect of an "effective" contact impedance, allows calculation of the voltages due to any current pattern applied to a homogeneous resistivity field.
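The complete electrode model described above is usually written as the following boundary-value problem; this is the formulation standard in later EIT literature, not necessarily the paper's exact notation. Here u is the electric potential, σ the conductivity, e_ℓ the ℓ-th electrode, U_ℓ and I_ℓ the electrode potential and injected current, and ζ_ℓ the effective contact impedance:

```latex
\begin{align}
  \nabla \cdot (\sigma \nabla u) &= 0
    && \text{in } \Omega, \\
  u + \zeta_\ell\, \sigma \frac{\partial u}{\partial n} &= U_\ell
    && \text{on } e_\ell,\ \ell = 1, \dots, L, \\
  \int_{e_\ell} \sigma \frac{\partial u}{\partial n}\, \mathrm{d}S &= I_\ell
    && \ell = 1, \dots, L, \\
  \sigma \frac{\partial u}{\partial n} &= 0
    && \text{on } \partial\Omega \setminus \textstyle\bigcup_\ell e_\ell,
\end{align}
```

together with conservation of charge, $\sum_\ell I_\ell = 0$, and a ground condition such as $\sum_\ell U_\ell = 0$ to fix uniqueness. The second condition is where the "effective" contact impedance ζ of the abstract enters.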
Conference Paper
Full-text available
We propose a new measure, the method noise, to evaluate and compare the performance of digital image denoising methods. We first compute and analyze this method noise for a wide class of denoising algorithms, namely the local smoothing filters. Second, we propose a new algorithm, the nonlocal means (NL-means), based on a nonlocal averaging of all pixels in the image. Finally, we present some experiments comparing the NL-means algorithm and the local smoothing filters.
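The NL-means algorithm described above can be sketched naively as follows: each pixel is replaced by a weighted average over a search window, with weights given by the similarity of surrounding patches. Patch size, search window, and the filtering parameter h are illustrative choices, and this plain version omits the usual noise-variance offset in the patch distance.

```python
import numpy as np

def nl_means(img, patch=3, search=7, h=0.15):
    """Naive non-local means: weights decay with the mean squared
    difference between patches, so only similar structures are averaged."""
    p, s = patch // 2, search // 2
    pad = np.pad(img, p + s, mode="reflect")
    out = np.zeros_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + p + s, j + p + s          # centre in padded image
            ref = pad[ci - p:ci + p + 1, cj - p:cj + p + 1]
            weights, values = [], []
            for di in range(-s, s + 1):
                for dj in range(-s, s + 1):
                    cand = pad[ci + di - p:ci + di + p + 1,
                               cj + dj - p:cj + dj + p + 1]
                    d2 = np.mean((ref - cand) ** 2)    # patch distance
                    weights.append(np.exp(-d2 / h**2))
                    values.append(pad[ci + di, cj + dj])
            w = np.asarray(weights)
            out[i, j] = w @ np.asarray(values) / w.sum()
    return out

rng = np.random.default_rng(2)
clean = np.zeros((24, 24))
clean[8:16, 8:16] = 1.0
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
denoised = nl_means(noisy)
```

Because patches that straddle the square's edge are dissimilar to patches inside it, their weights are nearly zero, which is how NL-means avoids blurring edges.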
Article
Owing to its importance in many military and civilian applications, hyperspectral anomaly detection has attracted remarkable interest. Low-rank representation (LRR)-based anomaly detectors use the low-rank property to represent background pixels, and pixels that cannot be well represented are detected as anomalies. The ability of an LRR-based detector to separate background pixels and anomalous pixels depends on the dictionary representation ability, which usually can be enhanced by designing a proper prior for dictionary representation coefficients and constructing a better dictionary. However, it is not easy to handcraft effective and meaningful regularizers for dictionary coefficients. In this article, we propose a novel anomaly detection algorithm that uses a plug-and-play prior for representation coefficients and constructs a new dictionary based on clustering. Instead of cumbersomely handcrafting a regularizer for representation coefficients, we propose solving the anomaly detection problem using the plug-and-play framework, which enables us to plug in state-of-the-art priors for representation coefficients. An effective convolutional neural network (CNN) denoiser is plugged into our framework to fully exploit the spatial correlation of representation coefficients. We also propose a modified background dictionary construction method, which carefully includes background pixels and excludes anomalous pixels from clustering results. We refer to the proposed anomaly detection method as the plug-and-play denoising CNN regularized anomaly detection (DeCNN-AD) method. Extensive experiments were performed on five data sets in a comparison with eight state-of-the-art anomaly detection methods. The experimental results suggest that the proposed method is effective in anomaly detection and can produce better anomaly detection results than those of the comparison methods. The codes of this work will be available at https://github.com/FxyPd for the sake of reproducibility.
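The plug-and-play framework mentioned above can be sketched generically with ADMM on a toy linear inverse problem. This is a minimal illustration, not the DeCNN-AD method: a simple binomial smoothing filter stands in for the paper's CNN denoiser, and the problem sizes and parameters are arbitrary.

```python
import numpy as np

def smooth_denoiser(x):
    """Binomial smoothing filter standing in for a learned (CNN) denoiser."""
    return np.convolve(np.pad(x, 1, mode="edge"),
                       [0.25, 0.5, 0.25], mode="valid")

def pnp_admm(A, y, rho=1.0, n_iter=50):
    """Plug-and-play ADMM for min ||Ax - y||^2 with an implicit denoiser
    prior: the x-step is regularized least squares, and the v-step is
    simply a call to the plugged-in denoiser."""
    n = A.shape[1]
    x = np.zeros(n)
    v = np.zeros(n)
    u = np.zeros(n)                          # scaled dual variable
    AtA, Aty = A.T @ A, A.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(AtA + rho * np.eye(n), Aty + rho * (v - u))
        v = smooth_denoiser(x + u)           # "prior" step: denoise
        u = u + x - v                        # dual update
    return x

rng = np.random.default_rng(5)
A = rng.standard_normal((80, 40))
x_true = np.zeros(40)
x_true[15:25] = 1.0
y = A @ x_true + 0.05 * rng.standard_normal(80)
x_hat = pnp_admm(A, y)
```

The point of the framework, as the abstract notes, is that the v-step can use any state-of-the-art denoiser without having to write down the regularizer it implies.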
Article
Removal of noise from an image is an extensively studied problem in image processing. Indeed, the recent advent of sophisticated and highly effective denoising algorithms has led some to believe that existing methods are touching the ceiling in terms of noise removal performance. Can we leverage this impressive achievement to treat other tasks in image processing? Recent work has answered this question positively, in the form of the Plug-and-Play Prior (P³) method, showing that any inverse problem can be handled by sequentially applying image denoising steps. This relies heavily on the ADMM optimization technique in order to obtain this chained denoising interpretation. Is this the only way in which tasks in image processing can exploit the image denoising engine? In this paper we provide an alternative, more powerful and more flexible framework for achieving the same goal. As opposed to the P³ method, we offer Regularization by Denoising (RED): using the denoising engine in defining the regularization of the inverse problem. We propose an explicit image-adaptive Laplacian-based regularization functional, making the overall objective functional clearer and better defined. With complete flexibility to choose the iterative optimization procedure for minimizing the above functional, RED can incorporate any image denoising algorithm, treats general inverse problems very effectively, and is guaranteed to converge to the globally optimal result. We test this approach and demonstrate state-of-the-art results in the image deblurring and super-resolution problems.
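The RED objective described above can be minimized by plain gradient descent, since under RED's assumptions the gradient of the regularizer x^T (x - D(x))/2 reduces to x - D(x). The sketch below assumes a toy linear forward model and a simple binomial smoother standing in for the denoiser D; it illustrates the framework, not the head-model reconstruction of the main paper.

```python
import numpy as np

def denoiser(x):
    """Binomial smoothing filter standing in for a generic denoiser D(x)."""
    xp = np.pad(x, 1, mode="edge")
    return 0.25 * xp[:-2] + 0.5 * xp[1:-1] + 0.25 * xp[2:]

def red_gradient_descent(A, y, lam=0.5, n_iter=200):
    """RED: minimize 0.5*||Ax - y||^2 + (lam/2) * x^T (x - D(x)).
    The gradient of the RED regularizer is lam * (x - D(x))."""
    n = A.shape[1]
    mu = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)    # step below 1/L
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y) + lam * (x - denoiser(x))
        x = x - mu * grad
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((80, 40))
x_true = np.zeros(40)
x_true[15:25] = 1.0
y = A @ x_true + 0.05 * rng.standard_normal(80)
x_hat = red_gradient_descent(A, y)
```

Swapping `denoiser` for NL-means, bilateral filtering, or a CNN denoiser gives the four RED variants compared in the main paper's experiments.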
Article
Electrical Impedance Tomography (EIT) is a non-invasive image reconstruction technique in which current is injected and electric potential is measured through electrodes attached to the boundary. In some electronic hardware implementations, only two electrodes inject current simultaneously, a strategy denominated pair-wise current injection. Several possibilities of pair-wise current injection (electric current patterns) and electric potential measurement (single-ended and differential) have been addressed in the literature. Considering pair-wise current injection, the skip-m current pattern can be defined as a pair-wise injection strategy in which the number of non-current-injecting electrodes enclosed between the two injection electrodes is m. Single-ended electric potential measurements consist of measurements with a common potential reference. Differential electric potential measurements consist of pair-wise measurements between two electrodes. A theoretical analysis based on control theory is presented to show that some current and measurement pattern strategies convey less information than others. This hypothesis is verified by the analysis of the matrix containing possible measurement vectors, with respect to its rank, condition number and singular values. Additionally, a novel approach is proposed to analyse current and measurement patterns based on uncertainty estimation of difference images by the correlation matrix linearization of the reconstructed impedance matrix. The results show that single-ended potential measurements are usually better when compared to differential electric potential measurements. A conclusion supported by both points of view is that the cross current pattern (diametral) is the least informative for these symmetrical domains.
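The skip-m definition above can be made concrete with a few lines of code; the 16-electrode setting is an illustrative assumption, not taken from the paper.

```python
def skip_m_pairs(n_elec, m):
    """Skip-m pair-wise injection: m non-injecting electrodes lie between
    the two injecting ones, i.e. electrodes (k, k + m + 1) modulo n_elec."""
    return [(k, (k + m + 1) % n_elec) for k in range(n_elec)]

adjacent = skip_m_pairs(16, 0)    # skip-0: neighbouring electrodes
cross = skip_m_pairs(16, 7)       # skip-7 on 16 electrodes: diametral pairs
```

Note that for the diametral (cross) pattern every pair appears twice with the roles of the electrodes swapped, e.g. (0, 8) and (8, 0), so half the injections excite the same current distribution up to sign; this redundancy is consistent with the paper's conclusion that the cross pattern is the least informative on symmetrical domains.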
Article
We show that electrical impedance tomography (EIT) image reconstruction algorithms with regularization based on the total variation (TV) functional are suitable for in vivo imaging of physiological data. This reconstruction approach helps to preserve discontinuities in reconstructed profiles, such as step changes in electrical properties at interorgan boundaries, which are typically smoothed by traditional reconstruction algorithms. The use of the TV functional for regularization leads to the minimization of a nondifferentiable objective function in the inverse formulation. This cannot be efficiently solved with traditional optimization techniques such as the Newton method. We explore two implementation methods for regularization with the TV functional: the lagged diffusivity method and the primal-dual interior point method (PD-IPM). First we clarify the implementation details of these algorithms for EIT reconstruction. Next, we analyze the performance of these algorithms on noisy simulated data. Finally, we show reconstructed EIT images of in vivo data for ventilation and gastric emptying studies. In comparison to traditional quadratic regularization, TV regularization shows improved ability to reconstruct sharp contrasts.
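The lagged diffusivity method mentioned above sidesteps the nondifferentiability of TV by freezing the TV weights at the previous iterate, so each step is a linear solve. The following 1D sketch with a random forward matrix is a minimal illustration under stated assumptions, not the paper's EIT implementation.

```python
import numpy as np

def lagged_diffusivity(A, y, beta=1.0, eps=1e-4, n_iter=20):
    """Lagged-diffusivity (IRLS) iteration for 1D TV-regularized least
    squares, min 0.5*||Ax - y||^2 + beta * sum sqrt((Dx)^2 + eps):
    each step solves a linear system with the TV weights 'lagged'
    at the previous iterate."""
    n = A.shape[1]
    D = np.diff(np.eye(n), axis=0)               # forward-difference operator
    AtA, Aty = A.T @ A, A.T @ y
    x = np.linalg.solve(AtA + beta * D.T @ D, Aty)     # smooth warm start
    for _ in range(n_iter):
        w = 1.0 / np.sqrt((D @ x) ** 2 + eps)          # lagged TV weights
        x = np.linalg.solve(AtA + beta * D.T @ (w[:, None] * D), Aty)
    return x

rng = np.random.default_rng(6)
A = rng.standard_normal((80, 40))
x_true = np.zeros(40)
x_true[15:25] = 1.0                 # piecewise-constant "organ boundary"
y = A @ x_true + 0.05 * rng.standard_normal(80)
x_hat = lagged_diffusivity(A, y)
```

Small gradients receive large weights and are flattened, while large jumps receive small weights and survive, which is exactly the edge-preserving behaviour the abstract describes.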
Conference Paper
Bilateral filtering smooths images while preserving edges, by means of a nonlinear combination of nearby image values. The method is noniterative, local, and simple. It combines gray levels or colors based on both their geometric closeness and their photometric similarity, and prefers near values to distant values in both domain and range. In contrast with filters that operate on the three bands of a color image separately, a bilateral filter can enforce the perceptual metric underlying the CIE-Lab color space, and smooth colors and preserve edges in a way that is tuned to human perception. Also, in contrast with standard filtering, bilateral filtering produces no phantom colors along edges in color images, and reduces phantom colors where they appear in the original image.
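The domain/range combination described above can be sketched for a single-channel image as follows; the kernel widths and window radius are illustrative assumptions.

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.2, radius=3):
    """Naive gray-level bilateral filter: each output pixel is an average
    weighted by both spatial (domain) and intensity (range) closeness."""
    H, W = img.shape
    pad = np.pad(img, radius, mode="reflect")
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xx**2 + yy**2) / (2 * sigma_s**2))   # domain weights
    out = np.empty_like(img)
    for i in range(H):
        for j in range(W):
            window = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            range_w = np.exp(-(window - img[i, j]) ** 2 / (2 * sigma_r**2))
            w = spatial * range_w                           # combined weight
            out[i, j] = np.sum(w * window) / np.sum(w)
    return out

rng = np.random.default_rng(4)
clean = np.zeros((24, 24))
clean[:, 12:] = 1.0
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
smoothed = bilateral_filter(noisy)
```

Pixels across an intensity edge receive near-zero range weight, so the edge is preserved while noise within each flat region is averaged away.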
Regularization by denoising applied to non-linear traveltime tomography
  • A. Vargas
Nonlinear state estimation via the unscented Kalman filter in electrical impedance tomography (original title in Portuguese: "Estimação não linear de estado através do unscented Kalman filter na tomografia por impedância elétrica")
  • F. S. Moura