Cédric Vonesch

École Polytechnique Fédérale de Lausanne, Lausanne, Vaud, Switzerland

Publications (36) · 52.2 total impact


  • ABSTRACT: We propose a framework for the detection of junctions in images. Although the detection of edges and key points is a well-examined area, the multiscale detection of junction centers, especially for odd orders, poses a challenge in pattern analysis. The goal of our paper is to build optimal junction detectors based on 2D steerable wavelets that are polar-separable in the Fourier domain. The approaches we develop are general and can be used for the detection of arbitrary symmetric and asymmetric junctions. The backbone of our construction is a multiscale pyramid with a radial wavelet function, where the directional components are represented by circular harmonics and encoded in a shaping matrix. We are able to detect M-fold junctions at different scales and orientations. We provide experimental results on both simulated and real data to demonstrate the effectiveness of the algorithm.
    No preview · Article · Dec 2015 · IEEE Transactions on Image Processing
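
    A numerical caricature of the construction in the Dec 2015 entry above: polar-separable filters built in the Fourier domain as a radial band-pass window times circular-harmonic phase factors exp(jnθ); the magnitude of the order-M response then peaks at M-fold junction centers. The radial profile and all parameters below are illustrative assumptions, not the paper's wavelet design.

```python
import numpy as np

def circular_harmonic_responses(img, orders, sigma=2.0):
    """Filter `img` with polar-separable filters: a radial band-pass window
    times exp(1j*n*theta) in the Fourier domain. The Gaussian-weighted
    radial profile is an illustrative stand-in for the paper's wavelet."""
    H, W = img.shape
    fy = np.fft.fftfreq(H)[:, None]
    fx = np.fft.fftfreq(W)[None, :]
    rho = np.sqrt(fx**2 + fy**2)
    theta = np.arctan2(fy, fx)
    radial = rho * np.exp(-0.5 * (2 * np.pi * sigma * rho) ** 2)  # band-pass
    F = np.fft.fft2(img)
    return {n: np.fft.ifft2(F * radial * np.exp(1j * n * theta)) for n in orders}

# Example: the magnitude of the order-3 response peaks near a 3-fold junction.
img = np.zeros((128, 128))
for ang in (0.0, 2 * np.pi / 3, 4 * np.pi / 3):  # synthetic Y-junction at (64, 64)
    t = np.arange(40)
    img[(64 + t * np.sin(ang)).astype(int), (64 + t * np.cos(ang)).astype(int)] = 1.0
energy = np.abs(circular_harmonic_responses(img, orders=[3])[3])
print("peak response at", np.unravel_index(energy.argmax(), energy.shape))
```
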
  • Cédric Vonesch · Frédéric Stauber · Michael Unser
    ABSTRACT: In this paper, we propose a continuous-domain version of principal-component analysis, with the constraint that the underlying family of templates appears at arbitrary orientations. We show that the corresponding principal components are steerable. Our method can be used for designing steerable filters so that they best approximate a given collection of reference templates. We apply this framework to the detection and classification of micrometer-sized particles that are used in a microfluidic diagnostics system. This is done in two steps. First, we decompose the particles into a small number of templates and compute their steerable principal components. Then we use these principal components to automatically estimate the orientation and the class of each particle.
    No preview · Article · Sep 2015 · SIAM Journal on Imaging Sciences
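
    A discrete sketch of the Sep 2015 entry's idea: extend each template with its rotated copies and run ordinary PCA; the leading eigen-images then approximate the steerable principal components. The template, rotation count, and interpolation order are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import rotate

def steerable_pca(templates, n_angles=36, n_components=4):
    """Augment the template set with all rotations, then PCA: the leading
    components are (approximately) steerable. `templates` is a list of
    equally sized 2-D arrays."""
    stack = []
    for T in templates:
        for a in np.linspace(0, 360, n_angles, endpoint=False):
            stack.append(rotate(T, a, reshape=False, order=1).ravel())
    X = np.asarray(stack)
    X -= X.mean(axis=0)                       # center before PCA
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    shape = templates[0].shape
    return [Vt[k].reshape(shape) for k in range(n_components)]

# usage with a hypothetical bar-shaped template
T = np.zeros((32, 32)); T[14:18, 4:28] = 1.0
pcs = steerable_pca([T])
print(len(pcs), pcs[0].shape)
```
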
  • Masih Nilchian · John Paul Ward · Cedric Vonesch · Michael Unser
    ABSTRACT: Kaiser-Bessel window functions are frequently used to discretize tomographic problems because they have two desirable properties: 1) their short support leads to a low computational cost, and 2) their rotational symmetry makes their imaging transform independent of direction. In this paper, we aim to optimize the parameters of these basis functions. We present a formalism based on approximation theory and point out the importance of the partition-of-unity condition. While we prove that, for compactly supported functions, this condition is incompatible with isotropy, we show that minimizing the deviation from the partition of unity is highly beneficial. The numerical results confirm that the proposed tuning of the Kaiser-Bessel window functions yields the best performance.
    No preview · Article · Jul 2015 · IEEE Transactions on Image Processing
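
    The partition-of-unity criterion from the Jul 2015 entry is easy to probe numerically. Below is a 1-D sketch that sums integer shifts of a Kaiser-Bessel window and measures the deviation of that sum from a constant as the shape parameter varies; the window profile, support, and parameter values are assumptions, and the paper treats the rotationally symmetric multidimensional case.

```python
import numpy as np
from scipy.special import i0

def kb_window(x, a=2.0, alpha=10.0):
    """1-D Kaiser-Bessel window supported on [-a, a] (illustrative profile)."""
    r = np.abs(x) / a
    out = np.zeros_like(x, dtype=float)
    inside = r <= 1
    out[inside] = i0(alpha * np.sqrt(1 - r[inside] ** 2)) / i0(alpha)
    return out

def pou_deviation(alpha, a=2.0):
    """Relative deviation of the integer-shifted sum from a constant: the
    quantity the paper proposes to minimize when tuning the window."""
    x = np.linspace(0, 1, 513)
    s = sum(kb_window(x - k, a, alpha) for k in range(-4, 5))
    return np.ptp(s) / s.mean()

for alpha in (2.0, 7.0, 10.5, 14.0):
    print(f"alpha={alpha:5.1f}  relative PoU deviation={pou_deviation(alpha):.4f}")
```
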
  • ABSTRACT: Optical tomography has been widely investigated for biomedical imaging applications. In recent years, optical tomography has been combined with digital holography and employed to produce high-quality images of phase objects such as cells. In this paper, we describe a method for imaging 3D phase objects in a tomographic configuration, implemented by training an artificial neural network to reproduce the complex amplitude of the experimentally measured scattered light. The network is designed such that the voxel values of the refractive index of the 3D object are the variables that are adapted during the training process. We demonstrate the method experimentally by forming images of the 3D refractive-index distribution of HeLa cells.
    Full-text · Article · Feb 2015 · Optica
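
    The training loop in the Feb 2015 Optica entry can be caricatured in a few lines: the unknown voxel values are the trainable variables, and gradient descent adapts them until a forward model reproduces the measured complex amplitude. The random linear operator below is a stand-in for the paper's optical propagation model; all sizes and the learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_true = rng.random(64)                       # flattened "volume" of voxel values
A = rng.standard_normal((128, 64)) + 1j * rng.standard_normal((128, 64))
y = A @ n_true                                # "measured" complex amplitude

n = np.zeros(64)                              # trainable voxel values
lr = 1e-3
for _ in range(500):
    r = A @ n - y                             # residual against the measurement
    grad = np.real(A.conj().T @ r)            # gradient of 0.5*||r||^2 w.r.t. real n
    n -= lr * grad
print("relative error:", np.linalg.norm(n - n_true) / np.linalg.norm(n_true))
```
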
  • ABSTRACT: Super-resolution microscopy methods such as STORM and (F)PALM are now well established for biological studies at the nanometer scale. However, conventional imaging schemes based on sparse activation of photo-switchable fluorescent probes have inherently slow temporal resolution, which is a serious limitation when investigating live-cell dynamics. Here, we present an algorithm for high-density super-resolution microscopy that combines a sparsity-promoting formulation with a Taylor-series approximation of the PSF. Our algorithm is designed to provide unbiased localization in continuous space and high recall rates for high-density imaging, with run times that are orders of magnitude shorter than those of previous high-density algorithms. We validated our algorithm on both simulated and experimental data and demonstrated live-cell imaging with a temporal resolution of 2.5 seconds by recovering fast ER dynamics.
    Full-text · Article · Apr 2014 · Scientific Reports
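
    The Taylor-series device from the Apr 2014 entry can be checked in one dimension: a PSF shifted off the sampling grid is approximated by the on-grid PSF plus its derivative scaled by the sub-pixel offset, which turns the offset into a linear unknown. The Gaussian shape, width, and offset below are illustrative.

```python
import numpy as np

x = np.arange(-6, 7, dtype=float)    # pixel grid
s, d = 1.3, 0.3                      # PSF width and true sub-pixel offset
G = np.exp(-x**2 / (2 * s**2))       # on-grid PSF samples
G1 = -x / s**2 * G                   # analytic derivative dG/dx
exact = np.exp(-(x - d)**2 / (2 * s**2))
taylor1 = G - d * G1                 # G(x - d) ~= G(x) - d*G'(x)
print("max PSF value:", G.max())
print("max 1st-order Taylor error:", np.abs(exact - taylor1).max())
```
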
  • ABSTRACT: In this paper, we introduce a new reconstruction algorithm for X-ray differential phase-contrast imaging (DPCI). Our approach is based on 1) a variational formulation with a weighted data term and 2) a variable-splitting scheme that allows for fast convergence while reducing reconstruction artifacts. To improve the quality of the reconstruction, we take advantage of higher-order total-variation regularization. In addition, prior information on the support and positivity of the refractive index is incorporated, which yields significant improvement. We test our method in two reconstruction experiments involving real data; our results demonstrate its potential for in vivo and medical imaging.
    Full-text · Article · Dec 2013 · Optics Express
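
    The support-and-positivity prior in the Dec 2013 Optics Express entry amounts to a simple projection that can be slotted into a splitting or projected-gradient scheme. In the toy below, a random linear model and a weighted least-squares data term stand in for the DPCI operator; only the projection step reflects the paper directly.

```python
import numpy as np

def project_constraints(x, support):
    """Enforce the prior in one step: non-negativity of the refractive-index
    decrement and a known support mask (zero outside)."""
    return np.where(support, np.maximum(x, 0.0), 0.0)

rng = np.random.default_rng(1)
A = rng.standard_normal((80, 40))            # stand-in forward model
support = np.arange(40) < 30                 # hypothetical support mask
x_true = project_constraints(rng.standard_normal(40), support)
y = A @ x_true
w = rng.uniform(0.5, 1.5, 80)                # per-measurement weights
x = np.zeros(40)
step = 1.0 / np.linalg.norm(w[:, None] * A, 2) ** 2
for _ in range(300):
    grad = A.T @ (w**2 * (A @ x - y))        # gradient of 0.5*||w*(Ax - y)||^2
    x = project_constraints(x - step * grad, support)
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```
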
  • Philip R. Thompson · Gary T. Mitchum · Cedric Vonesch · Jianke Li
    ABSTRACT: Interannual to multidecadal variability of winter storminess in the eastern United States was studied using water level measurements from coastal tide gauges. The proximity to the coast of the primary winter storm track in the region allows the use of tide gauges to study temporal modulations in the frequency of these storms. Storms were identified in high-passed, detided sea level anomalies at 20 gauges from all coasts of North America to assess variability in winter storminess along particular storm tracks. The primary result is a significant multidecadal increase in the number of storms affecting the southeastern United States from the early to late twentieth century. The authors propose that this change is due to an increased tendency for the jet stream to meander south over the eastern United States since the 1950s. This mechanism is supported by long-term changes in the large-scale sea level pressure pattern over North America. The nature of the multidecadal change in storm frequency is unclear, because limited tide gauge record lengths prevent distinguishing between a trend and an oscillation.
    No preview · Article · Dec 2013 · Journal of Climate
  • ABSTRACT: Differential phase contrast is a recent technique in the context of X-ray imaging. In order to reduce the specimen's exposure time, we propose a new iterative algorithm that can achieve the same quality as FBP-type methods while using substantially fewer angular views. Our approach is based on 1) a novel spline-based discretization of the forward model and 2) an iterative reconstruction algorithm using the alternating direction method of multipliers. Our experimental results on real data suggest that the method can reduce the number of required views by at least a factor of four.
    Full-text · Article · Mar 2013 · Optics Express
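
    A generic sketch of the variable split behind the Mar 2013 entry: introduce z = Dx for a sparsifying operator D and alternate a quadratic x-update, a soft-thresholding z-update, and a dual update (ADMM). The dense matrix solve and the finite-difference D are illustrative simplifications; the paper exploits spline structure at tomographic scale.

```python
import numpy as np

def admm_l1_analysis(A, D, y, lam=0.1, rho=1.0, n_iter=200):
    """ADMM for  min_x 0.5*||Ax - y||^2 + lam*||Dx||_1  via the split z = Dx."""
    x = np.zeros(A.shape[1]); z = np.zeros(D.shape[0]); u = np.zeros(D.shape[0])
    Q = np.linalg.inv(A.T @ A + rho * D.T @ D)          # fine for small demos
    for _ in range(n_iter):
        x = Q @ (A.T @ y + rho * D.T @ (z - u))         # quadratic x-update
        v = D @ x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0)  # soft-threshold
        u += D @ x - z                                  # dual update
    return x

# usage: D as 1-D finite differences (a TV-like penalty) on a piecewise-constant signal
rng = np.random.default_rng(2)
n = 50
x_true = np.concatenate([np.zeros(20), np.ones(15), 0.5 * np.ones(15)])
A = rng.standard_normal((80, n))
y = A @ x_true + 0.01 * rng.standard_normal(80)
D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]
x = admm_l1_analysis(A, D, y, lam=0.5)
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```
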
  • D. Sage · H. Kirshner · C. Vonesch · S. Lefkimmiatis · M. Unser
    ABSTRACT: As the field of bioimage informatics matures, the validation of image-reconstruction algorithms and the definition of proper performance criteria become more pressing issues. In this work, we discuss benchmarking aspects of quantitative fluorescence-microscopy tools. We point out the importance of generating realistic datasets and describe our approach to this task. Drawing on our experience, we present arguments in favor of 3D continuous-domain models of biological structures for simulating bioimaging datasets. We also present physically realistic models of image formation that are reasonably efficient to implement.
    No preview · Article · Jan 2013
  • ABSTRACT: X-ray differential phase-contrast tomography is a recently developed modality for the imaging of low-contrast biological samples. Its mathematical model is based on the first derivative of the Radon transform, and in practice the images are reconstructed using a variant of filtered back-projection (FBP). In this paper, we develop an alternative reconstruction algorithm with the aim of reducing the number of required views while maintaining image quality. To that end, we discretize the forward model using polynomial B-spline functions. We then formulate the reconstruction as a regularized weighted-norm optimization problem with a penalty on the total variation (TV) of the solution. This leads to a novel iterative algorithm that alternates gradient updates (FBP step) and shrinkage-thresholding, within the framework of the fast iterative shrinkage-thresholding algorithm (FISTA). Experiments with real data suggest that the proposed method significantly improves upon FBP: it can handle a drastic reduction in the number of projections without noticeable quality degradation relative to the standard procedure.
    No preview · Conference Paper · Jan 2013
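
    The alternation described in the conference entry above, a gradient (FBP-like) update followed by shrinkage with FISTA acceleration, looks as follows for the simplest synthesis-form ℓ1 penalty; the paper penalizes total variation instead, which would replace the thresholding step with a TV proximal step.

```python
import numpy as np

def fista_l1(A, y, lam, n_iter=200):
    """Minimal FISTA for  min_x 0.5*||Ax - y||^2 + lam*||x||_1 :
    gradient step, soft-thresholding, Nesterov momentum."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1]); v = x.copy(); t = 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ v - y)              # gradient update
        w = v - g / L
        x_new = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0)  # shrinkage
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        v = x_new + (t - 1) / t_new * (x_new - x)                # momentum
        x, t = x_new, t_new
    return x

# toy usage on a sparse recovery problem
rng = np.random.default_rng(7)
A = rng.standard_normal((60, 100))
x0 = np.zeros(100); x0[[5, 37, 80]] = (1.0, -2.0, 1.5)
x_hat = fista_l1(A, A @ x0, lam=0.05)
print("support found:", np.nonzero(np.abs(x_hat) > 0.1)[0])
```
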
  • Z. Puspoki · C. Vonesch · M. Unser
    ABSTRACT: We present a method for designing steerable wavelets that can detect local centers of symmetry in images. Based on this design, we then propose an algorithm for estimating the locations and orientations of M-fold-symmetric junctions in biological micrographs. The analysis with 2-D steerable wavelets allows detections at different scales and arbitrary orientations. Owing to the steering property of our wavelets, the detection is fast and accurate. We provide experimental results on both synthetic images and biological micrographs to demonstrate the performance of the algorithm.
    No preview · Conference Paper · Jan 2013
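
    The steering property invoked in the entry above, in code: once the circular-harmonic responses of an image are available, the detector response at any orientation is a phase-weighted sum, so orientations can be scanned without re-filtering. The weights and responses below are random stand-ins.

```python
import numpy as np

def steered_response(harmonics, weights, theta):
    """Detector response at orientation `theta` from circular-harmonic
    responses alone: steering reduces to a phase-weighted sum."""
    return sum(weights[n] * np.exp(-1j * n * theta) * q
               for n, q in harmonics.items())

# Illustrative use with random stand-in responses; in practice the q_n come
# from filtering the image (cf. the circular-harmonic sketch further above),
# and the weights would be one row of the paper's shaping matrix.
rng = np.random.default_rng(6)
harmonics = {n: rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
             for n in (1, 2, 3)}
weights = {1: 0.2, 2: 0.5, 3: 1.0}
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
best = max(angles, key=lambda t: np.abs(steered_response(harmonics, weights, t)).max())
print(f"orientation with strongest response: {best:.2f} rad")
```
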
  • H. Kirshner · C. Vonesch · M. Unser
    ABSTRACT: We introduce a general and computationally efficient approach to 3-D localization microscopy. The main idea is to construct a continuous-domain representation of the PSF by expanding it in a polynomial B-spline basis. This allows us to fit the PSF to the data with sub-pixel accuracy. Since the basis functions are compactly supported, the evaluation of the PSF is computationally efficient. Furthermore, our approach can accommodate any 3-D PSF design, and it does not require a calibration curve for the axial position. We further introduce a computationally efficient implementation of the least-squares criterion and demonstrate its potential for fast and accurate reconstruction of super-resolution data.
    No preview · Conference Paper · Jan 2013
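
    A 1-D sketch of the fitting idea in the localization entry above: sample a calibration PSF, expand it in a cubic B-spline basis (here via SciPy's splrep), and fit a continuously shifted, scaled copy to data by least squares. The Gaussian PSF, noise level, and scalar optimizer are stand-ins for the paper's 3-D machinery.

```python
import numpy as np
from scipy.interpolate import splrep, splev
from scipy.optimize import minimize_scalar

grid = np.arange(-8, 9, dtype=float)
psf_samples = np.exp(-grid**2 / (2 * 1.5**2))       # calibration samples
tck = splrep(grid, psf_samples)                     # cubic B-spline coefficients

true_center, amp = 0.37, 5.0
rng = np.random.default_rng(3)
data = amp * np.exp(-(grid - true_center)**2 / (2 * 1.5**2)) \
       + 0.05 * rng.standard_normal(grid.size)

def cost(c):
    model = splev(grid - c, tck)                    # continuous sub-pixel shift
    a = model @ data / (model @ model)              # closed-form amplitude
    return np.sum((data - a * model) ** 2)

res = minimize_scalar(cost, bounds=(-2, 2), method="bounded")
print(f"estimated center: {res.x:.3f}  (true {true_center})")
```
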
  • C. Vonesch · F. Stauber · M. Unser
    ABSTRACT: This paper presents two contributions. We first introduce a continuous-domain version of Principal-Component Analysis (PCA) for designing steerable filters so that they best approximate a given set of image templates. We exploit the fact that steerability does not need to be enforced explicitly if one extends the set of templates by incorporating all their rotations. Our results extend previous work by Perona to multiple templates. We then apply our framework to the automatic detection and classification of micro-particles that carry biochemical probes for molecular diagnostics. Our continuous-domain PCA formalism is particularly well adapted in this context because the geometry of the carriers is known analytically. In addition, the steerable structure of our filters allows for a fast FFT-based recognition of the type of probe.
    No preview · Conference Paper · Jan 2013
  • ABSTRACT: Super-resolution localization microscopy relies on sparse activation of photo-switchable probes. Such activation, however, limits temporal resolution. High-density imaging overcomes this limitation by allowing several neighboring probes to be activated simultaneously. In this work, we propose an algorithm that incorporates a continuous-domain sparsity prior into the high-density localization problem. We use a Taylor approximation of the PSF and rely on a fast proximal-gradient optimization procedure. Unlike currently available methods that use discrete-domain sparsity priors, our approach does not restrict the estimated locations to a pre-defined sampling grid. Experimental results on simulated and real data demonstrate significant improvement over these methods in terms of accuracy, molecular identification and computational complexity.
    No preview · Conference Paper · Jan 2013
  • Cédric Vonesch · Lanhui Wang · Yoel Shkolnisky · Amit Singer
    ABSTRACT: This paper presents a novel algorithm for the 3D tomographic inversion problem that arises in single-particle electron cryo-microscopy (Cryo-EM). It is based on two key components: 1) a variational formulation that promotes sparsity in the wavelet domain and 2) the Toeplitz structure of the combined projection/back-projection operator. The first idea has proven to be very effective for the recovery of piecewise-smooth signals, which is confirmed by our numerical experiments. The second idea allows for a computationally efficient implementation of the reconstruction procedure, using only one circulant convolution per iteration.
    Preview · Article · Jun 2011 · Proceedings of the IEEE International Symposium on Biomedical Imaging: From Nano to Macro
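
    The Toeplitz-to-circulant trick from the Cryo-EM entry above, in its 1-D form: embed the 2n−1 Toeplitz diagonals in a length-2n circulant so that one FFT-based circular convolution applies the operator; the paper uses the multidimensional analogue for the combined projection/back-projection.

```python
import numpy as np

def toeplitz_apply_fft(kernel, x):
    """Apply a Toeplitz operator via one circulant convolution. `kernel`
    holds the 2n-1 diagonals t_{-(n-1)} ... t_{n-1}; we embed them in a
    circulant of size 2n and use the FFT."""
    n = x.size
    c = np.zeros(2 * n)                     # first column of the circulant
    c[:n] = kernel[n - 1:]                  # t_0 ... t_{n-1}
    c[n + 1:] = kernel[:n - 1]              # t_{-(n-1)} ... t_{-1}
    X = np.zeros(2 * n); X[:n] = x          # zero-padded input
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(X)).real
    return y[:n]

# check against the dense Toeplitz matrix
rng = np.random.default_rng(4)
n = 6
kernel = rng.standard_normal(2 * n - 1)
T = np.array([[kernel[n - 1 + i - j] for j in range(n)] for i in range(n)])
x = rng.standard_normal(n)
print(np.allclose(T @ x, toeplitz_apply_fft(kernel, x)))
```
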
  • ABSTRACT: We propose a class of spherical wavelet bases for the analysis of geophysical models and for the tomographic inversion of global seismic data. Its multiresolution character allows for modeling with an effective spatial resolution that varies with position within the Earth. Our procedure is numerically efficient and can be implemented with parallel computing. We discuss two possible types of discrete wavelet transforms in the angular dimension of the cubed sphere, compare their benefits and drawbacks, and apply them to analyze the information present in two published seismic wavespeed models of the mantle, examining the statistics and power of wavelet coefficients across scales. The localization and sparsity properties of wavelet bases allow a sparse solution of inverse problems to be found by iterative minimization of a combination of the $\ell_2$ norm of the data fit and the $\ell_1$ norm of the wavelet coefficients. Through validation with realistic synthetic experiments, we illustrate the likely gains of our new approach in future inversions of finite-frequency seismic data and show its readiness for global seismic tomography.
    Full-text · Article · Apr 2011 · Geophysical Journal International
  • ABSTRACT: The computation of finite-frequency kernels using dynamic ray tracing, as originally proposed by Dahlen et al. (GJI, 2000), is very efficient, but the method has several disadvantages: it is prone to ambiguities and errors whenever the radius of curvature of a ray is smaller than the effective width of a kernel, and it breaks down completely for rays, such as PP or PKP, that may reach as far as the antipode. Motivated by the development of a wavelet parameterization in a 'cubed sphere', we have developed a new method for the fast computation of travel times and geometrical spreading factors. First, we confine our calculations to a finite set of discrete radii. The goal is then to find the travel time and spreading factors for an arbitrary location along one of these radii. We use a traditional method to compute a fine fan of rays through a spherically symmetric model. For each ray we obtain the travel time, its second derivative with respect to distance from the ray in the ray plane, and the geometrical spreading, at a series of nodes typically spaced 15 km apart. These values are then interpolated to obtain data along each of the radii, resulting in a finite, closely sampled set of data for each radius. Different arrivals are recognized by non-monotonic behaviour of the travel time as a function of distance. Travel times and geometrical spreading at arbitrary locations can be computed from this grid in two ways: either one interpolates from the times and spreading stored on the grid points, or one extrapolates a small distance from the nearest ray using dynamic ray tracing. We compare the two methods for efficiency and accuracy.
    Full-text · Conference Paper · May 2010
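
    One concrete step from the May 2010 entry, recognizing distinct arrivals from non-monotonic travel time versus distance, can be sketched by splitting sampled t(d) into monotonic branches; the synthetic curve below is purely illustrative.

```python
import numpy as np

d = np.linspace(0, 10, 200)
t = np.sin(0.5 * d) + 0.3 * d            # synthetic, non-monotonic travel-time curve
# break the samples wherever the sign of dt/dd flips
branch_breaks = np.where(np.diff(np.sign(np.diff(t))) != 0)[0] + 1
branches = np.split(np.arange(d.size), branch_breaks)
print(f"{len(branches)} monotonic branches (candidate arrivals)")
for b in branches[:3]:
    print(f"  d in [{d[b[0]]:.2f}, {d[b[-1]]:.2f}]")
```
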
  • ABSTRACT: Global seismic wavespeed models are routinely parameterized in terms of spherical harmonics, networks of tetrahedral nodes, rectangular voxels, or spherical splines. Up to now, Earth-model parameterizations by wavelets on the three-dimensional ball have remained uncommon. Here we propose such a procedure with the following three goals in mind: (1) the multiresolution character of a wavelet basis allows the models to be represented with an effective spatial resolution that varies as a function of position within the Earth; (2) this property can be used to great advantage in the regularization of seismic inversion schemes by seeking the most sparse solution vector, in wavelet space, through iterative minimization of a combination of the ℓ2 norm (to fit the data) and the ℓ1 norm (to promote sparsity in wavelet space); (3) with the continuing increase in high-quality seismic data, our focus is also on numerical efficiency and the ability to use parallel computing in reconstructing the model. In this presentation we propose a new wavelet basis that takes advantage of these three properties. To form the numerical grid we begin with a surface tessellation known as the 'cubed sphere', a construction popular in fluid dynamics and computational seismology, coupled with a semi-regular radial subdivision that honors the major seismic discontinuities between the core-mantle boundary and the surface. This mapping first divides the volume of the mantle into six portions. In each 'chunk', two angular variables and one radial variable are used for parameterization. In the new variables, standard 'Cartesian' algorithms can more easily be used to perform the wavelet transform (or other common transforms). Edges between chunks are handled by special boundary filters. We highlight the benefits of this construction and use it to analyze the information present in several published seismic compressional-wavespeed models of the mantle, paying special attention to the statistics of wavelet and scaling coefficients across scales. We also focus on the likely gains of future inversions of finite-frequency seismic data using a sparsity-promoting penalty in combination with our new wavelet approach.
    Full-text · Conference Paper · Apr 2010
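
    The surface grid underlying the Apr 2010 entry, the 'cubed sphere', maps a regular (u, v) grid on each cube face onto the sphere. A minimal single-face sketch follows; the face bookkeeping and the exact mapping variant are assumptions.

```python
import numpy as np

def cubed_sphere_face(u, v):
    """Map face coordinates (u, v) in [-1, 1]^2 of the +x cube face to unit
    vectors on the sphere by normalizing the cube points (gnomonic-style);
    the other five faces follow by permuting/negating axes."""
    p = np.stack([np.ones_like(u), u, v], axis=-1)
    return p / np.linalg.norm(p, axis=-1, keepdims=True)

# a regular (u, v) grid on one 'chunk'; in the paper this angular grid is
# combined with a radial subdivision honoring the seismic discontinuities
u, v = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
pts = cubed_sphere_face(u, v)
print(pts.shape, np.allclose(np.linalg.norm(pts, axis=-1), 1.0))
```
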
  • Florian Luisier · Cédric Vonesch · Thierry Blu · Michael Unser
    ABSTRACT: We present a fast algorithm for image restoration in the presence of Poisson noise. Our approach is based on (1) the minimization of an unbiased estimate of the MSE for Poisson noise, (2) a linear parametrization of the denoising process and (3) the preservation of Poisson statistics across scales within the Haar DWT. The minimization of the MSE estimate is performed independently in each wavelet subband, but this is equivalent to a global image-domain MSE minimization, thanks to the orthogonality of the Haar wavelets. This is an important difference from standard Poisson noise-removal methods, in particular those that rely on a non-linear preprocessing of the data to stabilize the variance. Our non-redundant interscale wavelet thresholding outperforms standard variance-stabilizing schemes, even when the latter are applied in a translation-invariant setting (cycle spinning). It also achieves a quality similar to a state-of-the-art multiscale method that was specially developed for Poisson data. Considering that the computational complexity of our method is orders of magnitude lower, it is a very competitive alternative. The proposed approach is particularly promising in the context of low signal intensities and/or large data sets. This is illustrated experimentally with the denoising of low-count fluorescence micrographs of a biological sample.
    Full-text · Article · Feb 2010 · Signal Processing
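
    The third ingredient of the Feb 2010 entry, preservation of Poisson statistics across scales, rests on the fact that sums of independent Poisson variables are Poisson, so unnormalized Haar scaling coefficients remain Poisson at every scale. A quick numerical check follows; the PURE-LET thresholding itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.poisson(lam=3.0, size=(10000, 2))   # pairs of independent Poisson counts
s = x.sum(axis=1)            # unnormalized Haar scaling coefficient (next coarser scale)
d = x[:, 0] - x[:, 1]        # unnormalized Haar detail (wavelet) coefficient
print("scaling coeffs: mean %.3f  var %.3f  (Poisson => equal)" % (s.mean(), s.var()))
print("detail coeffs:  mean %.3f  var %.3f" % (d.mean(), d.var()))
```
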