Philippe Thévenaz

École Polytechnique Fédérale de Lausanne, Lausanne, Vaud, Switzerland

Publications (80) · 129.74 total impact

  • Arash Amini · Philippe Thevenaz · John Paul Ward · Michael Unser
    ABSTRACT: Bayesian estimation problems involving Gaussian distributions often result in linear estimation techniques. Nevertheless, there are no general statements as to whether the linearity of the Bayesian estimator is restricted to the Gaussian case. The two common strategies for non-Gaussian models are either finding the best linear estimator or numerically evaluating the Bayesian estimator by Monte Carlo methods. In this paper, we focus on Bayesian interpolation of non-Gaussian first-order autoregressive (AR) processes where the driving innovation can admit any symmetric infinitely divisible distribution characterized by the Lévy-Khintchine representation theorem. We redefine the Bayesian estimation problem in the Fourier domain with the help of characteristic forms. By providing analytic expressions, we show that the optimal interpolator is linear for all symmetric α-stable distributions. The Bayesian interpolator can be expressed in a convolutive form where the kernel is described in terms of exponential splines. We also show that the limiting case of Lévy-type AR(1) processes, the system of which has a pole at the origin, always corresponds to a linear Bayesian interpolator made of a piecewise linear spline, irrespective of the innovation distribution. Finally, we show that these two cases are the only ones within the family for which the Bayesian interpolator is linear.
    Article · Aug 2013 · IEEE Transactions on Information Theory
  • Philippe Thévenaz · Daniel Sage · Michael Unser
    ABSTRACT: Edge-preserving smoothers need not be taxed by a severe computational cost. We present, in this paper, a lean algorithm that is inspired by the bi-exponential filter and preserves its structure: a pair of one-tap recursions. By a careful but simple local adaptation of the filter weights to the data, we are able to design an edge-preserving smoother that has a very low memory and computational footprint while requiring a trivial coding effort. We demonstrate that our filter (a bi-exponential edge-preserving smoother, or BEEPS) has formal links with the traditional bilateral filter. On the practical side, we observe that the BEEPS also produces images that are similar to those that would result from the bilateral filter, but at a much-reduced computational cost. The cost per pixel is constant and depends neither on the data nor on the filter parameters, not even on the degree of smoothing.
    Article · May 2012 · IEEE Transactions on Image Processing
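The pair of one-tap recursions that defines the BEEPS is simple enough to sketch. Below is a minimal 1-D illustration, assuming a Gaussian range weight and a simplified averaging of the two sweeps (the published filter compensates the doubly counted center sample more carefully); the parameter names `sigma` and `lam` are ours, not the paper's.

```python
import numpy as np

def beeps_1d(x, sigma=0.1, lam=0.8):
    """Bi-exponential edge-preserving smoother, 1-D sketch.

    sigma: photometric (range) scale; lam: base smoothing weight.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)

    def sweep(seq):
        # One-tap recursion whose weight adapts to the local contrast.
        p = np.empty(n)
        p[0] = seq[0]
        for k in range(1, n):
            r = np.exp(-0.5 * ((seq[k] - p[k - 1]) / sigma) ** 2)  # range weight
            p[k] = (1.0 - lam * r) * seq[k] + lam * r * p[k - 1]
        return p

    phi = sweep(x)              # causal (left-to-right) pass
    psi = sweep(x[::-1])[::-1]  # anticausal (right-to-left) pass
    # Simplified combination; the published BEEPS instead subtracts
    # the doubly counted center contribution.
    return 0.5 * (phi + psi)
```

On a step edge much larger than `sigma`, the range weight collapses to zero and the edge is transmitted unchanged, while low-contrast regions are smoothed with the bi-exponential kernel.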
  • ABSTRACT: We present a new class of continuously defined parametric snakes using a special kind of exponential splines as basis functions. We have enforced our bases to have the shortest possible support subject to some design constraints to maximize efficiency. While the resulting snakes are versatile enough to provide a good approximation of any closed curve in the plane, their most important feature is the fact that they admit ellipses within their span. Thus, they can perfectly generate circular and elliptical shapes. These features are appropriate to delineate cross sections of cylindrical-like conduits and to outline blob-like objects. We address the implementation details and illustrate the capabilities of our snake with synthetic and real data.
    Article · Apr 2012 · IEEE Transactions on Image Processing
  • Ricard Delgado-Gonzalo · Philippe Thévenaz · Michael Unser
    ABSTRACT: Our interest is to characterize the spline-like integer-shift-invariant bases capable of reproducing exponential polynomial curves. We prove that any compact-support function that reproduces a subspace of the exponential polynomials can be expressed as the convolution of an exponential B-spline with a compact-support distribution. As a direct consequence of this factorization theorem, we show that the minimal-support basis functions of that subspace are linear combinations of derivatives of exponential B-splines. These minimal-support basis functions form a natural multiscale hierarchy, which we utilize to design fast multiresolution algorithms and subdivision schemes for the representation of closed geometric curves. This makes them attractive from a computational point of view. Finally, we illustrate our scheme by constructing minimal-support bases that reproduce ellipses and higher-order harmonic curves.
    Article · Feb 2012 · Computer Aided Geometric Design
  • Daniel Ruijters · Philippe Thévenaz
    ABSTRACT: Achieving accurate interpolation is an important requirement for many signal-processing applications. While nearest-neighbor and linear interpolation methods are popular due to their native GPU support, they unfortunately result in severe undesirable artifacts. Better interpolation methods are known but lack native GPU support; a particularly attractive one is prefiltered cubic-spline interpolation. The signal it reconstructs from discrete samples has a much higher fidelity to the original data than what is achievable with nearest-neighbor and linear interpolation. At the same time, its computational load is moderate, provided a sequence of two operations is applied: first, prefilter the samples; then, reconstruct the signal with the help of a B-spline basis. It has already been established in the literature that the reconstruction step can be implemented efficiently on a GPU. This article focuses on an efficient GPU implementation of the prefilter, on how to apply it to multidimensional samples (e.g., RGB color images), and on its performance aspects.
    Article · Jan 2012 · The Computer Journal
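The two-step scheme (prefilter, then B-spline reconstruction) can be sketched on the CPU. Below is a minimal 1-D Python version of the standard recursive cubic-spline prefilter (single pole z = √3 − 2, overall gain 6), not the article's GPU code; the mirror boundary is handled by a truncated initialization. A quick consistency check: convolving the returned coefficients with the discrete cubic B-spline kernel (1, 4, 1)/6 must reproduce the original samples, which is exactly the interpolation condition the prefilter enforces.

```python
import numpy as np

def cubic_prefilter(s, horizon=30):
    """Recursive prefilter mapping samples to cubic B-spline coefficients."""
    s = np.asarray(s, dtype=float)
    n = len(s)
    z = np.sqrt(3.0) - 2.0  # the single pole of the cubic-spline prefilter

    # Causal pass: c+[k] = s[k] + z * c+[k-1], with a truncated mirror init.
    cp = np.empty(n)
    cp[0] = s[0] + sum(s[k] * z ** k for k in range(1, min(n, horizon)))
    for k in range(1, n):
        cp[k] = s[k] + z * cp[k - 1]

    # Anticausal pass: c-[k] = z * (c-[k+1] - c+[k]).
    cm = np.empty(n)
    cm[-1] = (z / (z * z - 1.0)) * (cp[-1] + z * cp[-2])
    for k in range(n - 2, -1, -1):
        cm[k] = z * (cm[k + 1] - cp[k])

    return 6.0 * cm  # apply the gain once at the end (linearity)
```

In the multidimensional case that the article addresses, this 1-D recursion is simply applied separably along each axis.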
  • Philippe Thévenaz · Ricard Delgado-Gonzalo · Michael Unser
    ABSTRACT: We propose an active contour (a.k.a. snake) that takes the shape of an ellipse. Its evolution is driven by surface terms made of two contributions: the integral of the data over an inner ellipse, counterbalanced by the integral of the data over an outer elliptical shell. We iteratively adapt the active contour to maximize the contrast between the two domains, which results in a snake that seeks elliptical bright blobs. We provide analytic expressions for the gradient of the snake with respect to its defining parameters, which allows for the use of efficient optimizers. An important contribution here is the parameterization of the ellipse, which we define in such a way that all parameters have equal importance; this creates a favorable landscape for the optimizer. We validate our construct with synthetic data and illustrate its use on real data as well.
    Article · Feb 2011 · IEEE Transactions on Pattern Analysis and Machine Intelligence
  • ABSTRACT: We present a novel algorithm for the registration of 2D image sequences that combines the principles of multiresolution B-spline-based elastic registration and those of bidirectional consistent registration. In our method, consecutive triples of images are iteratively registered to gradually extend the information through the set of images of the entire sequence. The intermediate results are reused for the registration of the following triple. We choose to interpolate the images and model the deformation fields using B-spline multiresolution pyramids. Novel boundary conditions are introduced to better characterize the deformations at the boundaries. In the experimental section, we quantitatively show that our method recovers from barrel/pincushion and fish-eye deformations with subpixel error. Moreover, it is more robust against outliers--occasional strong noise and large rotations--than the state-of-the-art methods. Finally, we show that our method can be used to realign series of histological serial sections, which are often heavily distorted due to folding and tearing of the tissues.
    Article · Sep 2010 · Physics in Medicine and Biology
  • P Thévenaz · A S G Singh · E Bertseva · J Lekki · A J Kulik · M Unser
    ABSTRACT: We propose a system to characterize the 3-D diffusion properties of the probing bead trapped by a photonic-force microscope. We follow a model-based approach, where the model of the dynamics of the bead is given by the Langevin equation. Our procedure combines software and analog hardware to measure the corresponding stiffness matrix. We are able to estimate all its elements in real time, including off-diagonal terms. To achieve our goal, we have built a simple analog computer that performs a continuous preprocessing of the data, which can be subsequently digitized at a much lower rate than is otherwise required. We also provide an effective numerical algorithm for compensating the correlation bias introduced by a quadrant photodiode detector in the microscope. We validate our approach using simulated data and show that our bias-compensation scheme effectively improves the accuracy of the system. Moreover, we perform experiments with the real system and demonstrate real-time capabilities. Finally, we suggest a simple addition that would allow one to determine the mass matrix as well.
    Article · Mar 2010 · IEEE Transactions on NanoBioscience
  • Sathish Ramani · Philippe Thévenaz · Michael Unser
    ABSTRACT: Interpolation is the means by which a continuously defined model is fit to discrete data samples. When the data samples are free of noise, it seems desirable to build the model by fitting them exactly. In medical imaging, where quality is of paramount importance, this ideal situation unfortunately does not occur. In this paper, we propose a scheme that improves the quality by specifying a tradeoff between fidelity to the data and robustness to the noise. We resort to variational principles, which allow us to impose smoothness constraints on the model for tackling noisy data. Based on shift-, rotation-, and scale-invariant requirements on the model, we show that the L(p)-norm of an appropriate vector derivative is the most suitable choice of regularization for this purpose. In addition to Tikhonov-like quadratic regularization, this includes edge-preserving total-variation-like (TV) regularization. We give algorithms to recover the continuously defined model from noisy samples and also provide a data-driven scheme to determine the optimal amount of regularization. We validate our method with numerical examples where we demonstrate its superiority over an exact fit as well as the benefit of TV-like nonquadratic regularization over Tikhonov-like quadratic regularization.
    Article · Feb 2010
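As a toy illustration of the fidelity-regularity tradeoff discussed above, here is a Tikhonov-like (quadratic) fit in Python that penalizes second differences of the coefficients. This is a discrete sketch under our own assumptions, not the authors' continuous-domain variational formulation, and `lam` is a hypothetical parameter name.

```python
import numpy as np

def regularized_fit(s, lam):
    """Quadratic (Tikhonov-like) regularized fit of noisy samples.

    Minimizes ||s - c||^2 + lam * ||D c||^2, where D takes second
    differences; solved in closed form via the normal equations.
    """
    s = np.asarray(s, dtype=float)
    n = len(s)
    # Second-difference operator: (D c)[k] = c[k] - 2 c[k+1] + c[k+2]
    D = np.zeros((n - 2, n))
    for k in range(n - 2):
        D[k, k], D[k, k + 1], D[k, k + 2] = 1.0, -2.0, 1.0
    # Normal equations: (I + lam * D^T D) c = s
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), s)
```

With `lam = 0` the fit is exact; increasing `lam` trades fidelity for smoothness, and data that are already affine (zero second differences) are reproduced exactly for any `lam`.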
  • Uwe Meyer-Baese · Philippe Thévenaz · Thierry Blu
    ABSTRACT: In this paper, design options for implementing programmable sampling-rate converters are discussed. The theory and performance of the classical converters and of new converters based on maximum-order minimum-support (MOMS) splines are presented. FPGA designs are compared in terms of resources used, latency, and quality of results. Second-order (i.e., linear or triangular) and sine-based benchmarks are described and evaluated; they show the advantages of the new MOMS-based sampling-rate converters. The FPGA synthesis results of seven different converters are presented as proof of concept.
    Article · Dec 2009 · Frequenz
  • ABSTRACT: A traditional photonic-force microscope (PFM) produces huge sets of data, which require tedious numerical analysis. In this paper, we propose instead an analog signal processor to attain real-time capabilities while retaining the richness of the traditional PFM data. Our system is devoted to intracellular measurements and is fully interactive through the use of a haptic joystick. Using our specialized analog hardware along with a dedicated algorithm, we can extract the full 3D stiffness matrix of the optical trap in real time, including the off-diagonal cross-terms. Our system is also capable of simultaneously recording data for subsequent offline analysis. This allows us to check that a good correlation exists between the classical analysis of stiffness and our real-time measurements. We monitor the PFM beads using an optical microscope. The force-feedback mechanism of the haptic joystick helps us in interactively guiding the bead inside living cells and collecting information from its (possibly anisotropic) environment. The instantaneous stiffness measurements are also displayed in real time on a graphical user interface. The whole system has been built and is operational; here we present early results that confirm the consistency of the real-time measurements with offline computations.
    Article · Aug 2009 · Nanotechnology
  • Olivier Bernard · Denis Friboulet · Philippe Thévenaz · Michael Unser
    ABSTRACT: In the field of image segmentation, most level-set-based active-contour approaches take advantage of a discrete representation of the associated implicit function. We present in this paper a different formulation where the implicit function is modeled as a continuous parametric function expressed on a B-spline basis. Starting from the active-contour energy functional, we show that this formulation allows us to compute the solution as a restriction of the variational problem on the space spanned by the B-splines. As a consequence, the minimization of the functional is directly obtained in terms of the B-spline coefficients. We also show that each step of this minimization may be expressed through a convolution operation. Because the B-spline functions are separable, this convolution may in turn be performed as a sequence of simple 1-D convolutions, which yields an efficient algorithm. As a further consequence, each step of the level-set evolution may be interpreted as a filtering operation with a B-spline kernel. Such filtering induces an intrinsic smoothing in the algorithm, which can be controlled explicitly via the degree and the scale of the chosen B-spline kernel. We illustrate the behavior of this approach on simulated as well as experimental images from various fields.
    Article · May 2009 · IEEE Transactions on Image Processing
  • A Beuchat · P Thévenaz · M Unser · T Ebner · A Senn · F Urner · M Germond · C O S Sorzano
    ABSTRACT: Identification of embryos with high implantation potential remains a challenge in in vitro fertilization (IVF). Subjective pronuclear (PN) zygote scoring systems have been developed for that purpose. The aim of this work was to provide a software tool that enables objective measurement of morphological characteristics of the human PN zygote. A computer program was created to analyse zygote images semi-automatically, providing precise morphological measurements. The accuracy of this approach was first validated by comparing zygotes from two different IVF centres using either computer-assisted measurements or subjective scoring. Computer-assisted measurement and subjective scoring were then compared for their ability to classify zygotes with high and low implantation probability by using a linear discriminant analysis. Zygote images coming from the two IVF centres were analysed with the software, resulting in a series of precise measurements of 24 variables. Using subjective scoring, the cytoplasmic halo was the only feature which was significantly different between the two IVF centres. Computer-assisted measurements revealed significant differences between centres in PN centring, PN proximity, cytoplasmic halo and features related to the distribution of nucleolar precursor bodies. The zygote classification error achieved with the computer-assisted measurements (0.363) was slightly lower than that of the subjective ones (0.393). A precise and objective characterization of the morphology of human PN zygotes can be achieved by the use of an advanced image analysis tool. This computer-assisted analysis allows for a better morphological characterization of human zygotes and can be used for classification.
    Article · Jul 2008 · Human Reproduction
  • Olivier Bernard · Denis Friboulet · Philippe Thévenaz · Michael Unser
    ABSTRACT: In the field of image segmentation, most level-set-based active-contour approaches are based on a discrete representation of the associated implicit function. We present in this paper a different formulation where the level set is modelled as a continuous parametric function expressed on a B-spline basis. Starting from the Mumford-Shah energy functional, we show that this formulation allows computing the solution as a restriction of the variational problem on the space spanned by the B-splines. As a consequence, the minimization of the functional is directly obtained in terms of the B-spline parameters. We also show that each step of this minimization may be expressed through a convolution operation. Because the B-spline functions are separable, this convolution may in turn be performed as a sequence of simple 1D convolutions, which yields a very efficient algorithm. The behaviour of this approach is illustrated on biomedical images from various fields.
    Conference Paper · May 2008
  • Philippe Thévenaz · Michael Unser
    ABSTRACT: A snakuscule (a minuscule snake) is the simplest active contour that we were able to design while keeping the quintessence of traditional snakes: an energy term governed by the data, and a regularization term. Our construction is an area-based snake, as opposed to curve-based snakes. It is parameterized by just two points, thus further easing requirements on the optimizer. Despite their ultimate simplicity, snakuscules retain enough versatility to be employed for solving various problems such as cell counting and segmentation of approximately circular features. In this paper, we detail the design process of a snakuscule and illustrate its usefulness through practical examples. We claim that our didactic intentions are well served by the simplicity of snakuscules.
    Article · May 2008 · IEEE Transactions on Image Processing
  • Philippe Thévenaz · Thierry Blu · Michael Unser
    ABSTRACT: An interpolation model is a necessary ingredient of intensity-based registration methods. The properties of such a model depend entirely on its basis function, which has been traditionally characterized by features such as its order of approximation and its support. However, as has been recently shown, these features are blind to the amount of registration bias created by the interpolation process alone; an additional requirement that has been named constant-variance interpolation is needed to remove this bias. In this paper, we present a theoretical investigation of the role of the interpolation basis in a registration context. Contrary to published analyses, ours is deterministic; it nevertheless leads to the same conclusion, which is that constant-variance interpolation is beneficial to image registration. In addition, we propose a novel family of interpolation bases that can have any desired order of approximation while maintaining the constant-variance property. Our family includes every constant-variance basis we know of. It is described by an explicit formula that contains two free functional terms: an arbitrary 1-periodic binary function that takes values from {−1, 1}, and another arbitrary function that must satisfy the partition of unity. These degrees of freedom can be harnessed to build many family members for a given order of approximation and a fixed support. We provide the example of a symmetric basis with two orders of approximation that is supported over [−3/2, 3/2]; this support is one unit shorter than a basis of identical order that had been previously published.
    Article · Apr 2008 · Proceedings of SPIE - The International Society for Optical Engineering
  • ABSTRACT: We construct parametric active contours (snakes) for outlining cells in histology images. These snakes are defined in terms of cubic B-spline basis functions. We use a steerable ridge detector to obtain a reliable map of the cell boundaries. Using the contour information, we compute a distance map and specify it as one of the snake energies. We also introduce a regularization term that favors smooth contours. A convex combination of the two cost functions results in smooth contours that lock onto edges efficiently and consistently. Experimental results on real histology images show that the snake algorithm is robust to imperfections in the images, such as broken edges.
    Article · Jan 2008
  • Aurélien Bourquard · Philippe Thévenaz · Katarina Balac · Michael Unser
    ABSTRACT: Because more output data must be created than is available from the input, magnification is an ill-posed problem. Traditional magnification relies on resampling an interpolation model at the appropriate rate; unfortunately, this simple solution is blind to the presence of the analog filter that was implicitly present when the samples of the function to be magnified were acquired. Consistent resampling has been introduced to take this into account, but it turns out that this solution is still under-constrained. In this paper, we propose regularization as a way to devise a deterministic magnification method that fully satisfies consistency constraints in the absence of noise, while at the same time producing an output that best fulfills a wide class of criteria for regularity. Contrary to many other methods, ours has been designed without ever leaving the continuous domain. We conduct experiments that show the benefit of our approach.
    Conference Paper · Jan 2008
  • ABSTRACT: One of the main applications of electrophoretic 2-D gels is the analysis of differential responses between different conditions. For this reason, specific spots are present in one of the images, but not in the other. On other occasions, the same experiment is repeated between 2 and 12 times in order to increase statistical significance. In both situations, one of the major difficulties of these analyses is that 2-D gels are affected by spatial distortions due to run-time differences and dye-front deformations, resulting in images that are significantly dissimilar not only in their content, but also in their geometry. In this technical brief, we show how to use free, state-of-the-art image registration and fusion algorithms that we developed to solve the problem of comparing differential expression profiles, or to compute an "average" image from a series of virtually identical gels.
    Article · Jan 2008 · PROTEOMICS
  • Sathish Ramani · Philippe Thévenaz · Michael Unser
    ABSTRACT: Interpolation is a vital tool in biomedical signal processing. Although there exists a substantial literature dedicated to noise-free conditions, much less is known in the presence of noise. Here, we document the breakdown of standard interpolation for noisy data and study the performance improvement due to regularized interpolation. In particular, we numerically investigate the Tikhonov (quadratic) regularization. On top of that, we explore non-quadratic regularization and show that this yields further improvements. We derive a novel bounded regularization approach to determine the optimal solution. We justify our claims with experimental results.
    Conference Paper · Apr 2007