About
305 Publications
31,895 Reads
11,836 Citations
Introduction
Current institution
Additional affiliations
September 2017 - August 2022
September 2019 - present
Artificial and Natural Intelligence Toulouse Institute (ANITI)
Position: Chair
September 2016 - present
Education
October 2011 - October 2012
October 2004 - September 2007
September 2003 - September 2004
Publications (305)
Multichannel blind source separation (MBSS), which focuses on separating signals of interest from mixed observations, has been extensively studied in acoustic and speech processing. Existing MBSS algorithms, such as independent low-rank matrix analysis (ILRMA) and multichannel nonnegative matrix factorization (MNMF), utilize the low-rank structure...
Incomplete multiview clustering (IMVC) has gained significant attention for its effectiveness in handling missing sample challenges across various views in real-world multiview clustering applications. Most IMVC approaches tackle this problem by either learning consensus representations from available views or reconstructing missing samples using t...
Blind source separation (BSS) refers to the process of recovering multiple source signals from observations recorded by an array of sensors. Common approaches to BSS, including independent vector analysis (IVA) and independent low-rank matrix analysis (ILRMA), typically rely on second-order models to capture the statistical independence of source...
Recent works on smart scanning techniques in Raman micro-imaging demonstrate the possibility of greatly reducing acquisition time. In particular, Gilet et al. [Optics Express 32, 932 (2024), doi:10.1364/OE.509736] proposed a protocol combining compression in both the spectral and spatial domains by focusing on essential information. This protocol consists of...
This paper introduces a Bayesian framework for image inversion by deriving a probabilistic counterpart to the regularization-by-denoising (RED) paradigm. It additionally implements a Monte Carlo algorithm specifically tailored for sampling from the resulting posterior distribution, based on an asymptotically exact data augmentation (AXDA). The prop...
Normalizing flows (NF) use a continuous generator to map a simple latent (e.g., Gaussian) distribution toward an empirical target distribution associated with a training data set. Once trained by minimizing a variational objective, the learnt map provides an approximate generative model of the target distribution. Since standard NF implement diffe...
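The abstract above relies on the change-of-variables rule that underpins normalizing flows. As a minimal sketch (not the paper's model), a single invertible affine map `x = a * z + b` already illustrates how the pushforward density is computed; the parameters `a`, `b` and the helper `log_prob_x` are hypothetical names for illustration only.

```python
import numpy as np

# Minimal sketch of the change-of-variables rule behind normalizing flows,
# using one invertible affine map x = a * z + b (names are illustrative).
a, b = 2.0, 1.0  # generator parameters

def log_prob_x(x):
    """Log-density of x under the pushforward of a standard Gaussian."""
    z = (x - b) / a                      # inverse map
    log_pz = -0.5 * (z ** 2 + np.log(2 * np.pi))
    log_det = -np.log(abs(a))            # log |dz/dx|
    return log_pz + log_det

# Sanity check: pushing N(0,1) through x = 2z + 1 yields N(1, 4).
x = 3.0
expected = -0.5 * ((x - 1) ** 2 / 4 + np.log(2 * np.pi * 4))
assert np.isclose(log_prob_x(x), expected)
```

Deep flows chain many such invertible maps and sum their log-Jacobian terms, but each layer follows exactly this pattern.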
This work introduces an on-the-fly (i.e., online) linear unmixing method which is able to sequentially analyze spectral data acquired on a spectrum-by-spectrum basis. After deriving a sequential counterpart of the conventional linear mixing model, the proposed approach recasts the linear unmixing problem into a linear state-space estimation framewo...
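To make the sequential setting concrete, here is a hedged sketch (not the paper's algorithm): each incoming spectrum is unmixed on a known endmember matrix `M` as it arrives, via per-spectrum nonnegative least squares. The matrix `M`, the generator `unmix_stream`, and all sizes are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical sketch of on-the-fly linear unmixing: each spectrum y_t is
# decomposed on a known endmember matrix M as it arrives, instead of
# batch-processing the whole data cube.
rng = np.random.default_rng(0)
M = np.abs(rng.normal(size=(50, 3)))   # 50 bands, 3 endmembers (illustrative)

def unmix_stream(spectra):
    """Yield nonnegative abundance estimates spectrum by spectrum."""
    for y in spectra:
        a, _ = nnls(M, y)              # per-spectrum nonnegative least squares
        yield a

a_true = np.array([0.6, 0.3, 0.1])
stream = [M @ a_true + 0.001 * rng.normal(size=50) for _ in range(5)]
for a_hat in unmix_stream(stream):
    assert np.allclose(a_hat, a_true, atol=0.05)
```

The cited work goes further by recasting the problem as state-space estimation, which allows information to be carried across spectra rather than solving each one independently.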
This paper introduces a stochastic plug-and-play (PnP) sampling algorithm that leverages variable splitting to efficiently sample from a posterior distribution. The algorithm based on split Gibbs sampling (SGS) draws inspiration from the half quadratic splitting method (HQS) and the alternating direction method of multipliers (ADMM). It divides the...
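The splitting idea described above can be sketched on a toy scalar model (this is not the paper's PnP algorithm, which replaces one conditional with a denoiser): the posterior over `x` is augmented with a splitting variable `z`, and the two resulting Gaussian conditionals are sampled alternately. All parameters below are illustrative.

```python
import numpy as np

# Toy sketch of split Gibbs sampling (SGS): alternate between the two
# Gaussian conditionals of an augmented scalar model. Illustrative only.
rng = np.random.default_rng(0)
y, sigma2, tau2, rho2 = 2.0, 1.0, 1.0, 0.25  # data, likelihood, prior, coupling

x, z, xs = 0.0, 0.0, []
for _ in range(20000):
    # x | z: combines the likelihood (y, sigma2) and the coupling (z, rho2)
    prec = 1 / sigma2 + 1 / rho2
    x = rng.normal((y / sigma2 + z / rho2) / prec, np.sqrt(1 / prec))
    # z | x: combines the prior (tau2) and the coupling (x, rho2)
    prec = 1 / tau2 + 1 / rho2
    z = rng.normal((x / rho2) / prec, np.sqrt(1 / prec))
    xs.append(x)

# The x-marginal of the augmented target is Gaussian with this mean:
mean_th = (y / sigma2) / (1 / sigma2 + 1 / (rho2 + tau2))
assert abs(np.mean(xs[1000:]) - mean_th) < 0.1
```

In the PnP version, the `z | x` step is replaced by a stochastic denoising step, which is what connects the sampler to HQS and ADMM.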
Spectral unmixing has been extensively studied with a variety of methods and used in many applications. Recently, data-driven techniques based on deep learning have attracted great attention in spectral unmixing for their superior ability to automatically learn structural information. In particular, autoencoder-based architectures are...
In the context of spectral unmixing, essential information corresponds to the most linearly dissimilar rows and/or columns of a two-way data matrix which are indispensable to reproduce the full data matrix in a convex linear way. Essential information has recently been shown accessible on-the-fly via a decomposition of the measured spectra in the F...
It took several decades of interaction and dialogue between the human sciences and the environmental sciences, as well as a series of methodological advances, for high-altitude areas to be regarded as anything more than immutable and featureless. Subjected to integrated interdisciplinary inquiry, they reveal not one but many histories and...
When adopting a model-based formulation, solving inverse problems encountered in multiband imaging requires defining spatial and spectral regularizations. In most works in the literature, spectral information is extracted directly from the observations to derive data-driven spectral priors. Conversely, the choice of the spatial regularizati...
In the context of multivariate curve resolution (MCR) and spectral unmixing, essential information (EI) corresponds to the most linearly dissimilar rows and/or columns of a two-way data matrix. In recent works, the assessment of EI has proven to be a very useful practical tool to select the most relevant spectral information before MCR analy...
In the context of multivariate curve resolution (MCR) and spectral unmixing, essential information (EI) corresponds to the most linearly dissimilar rows and/or columns of a two-way data matrix. In recent works, the assessment of EI has proven to be a very useful practical tool to select the most relevant spectral information before MCR analy...
This paper introduces a stochastic plug-and-play (PnP) sampling algorithm that leverages variable splitting to efficiently sample from a posterior distribution. The algorithm based on split Gibbs sampling (SGS) draws inspiration from the alternating direction method of multipliers (ADMM). It divides the challenging task of posterior sampling into t...
In the context of multivariate curve resolution (MCR) and spectral unmixing, essential information (EI) corresponds to the most linearly dissimilar rows and/or columns of a two-way data matrix. These rows/columns are called essential because they are indispensable to reproduce the full data matrix in a convex linear way. The selection of EI is driv...
Optimal transport (OT) provides effective tools for comparing and mapping probability measures. We propose to leverage the flexibility of neural networks to learn an approximate optimal transport map. More precisely, we present a new and original method to address the problem of transporting a finite set of samples associated with a first underlyin...
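Rather than the neural-network map proposed in the paper, the following sketch shows the underlying discrete OT problem solved with Sinkhorn iterations (entropic regularization); the function name `sinkhorn`, the regularization level, and all sizes are assumptions for illustration.

```python
import numpy as np

# Entropic OT between two discrete histograms via Sinkhorn iterations.
def sinkhorn(a, b, C, eps=0.5, n_iter=200):
    """Return an approximate OT plan between histograms a and b for cost C."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # scale to match column marginals
        u = a / (K @ v)                  # scale to match row marginals
    return u[:, None] * K * v[None, :]

x = np.linspace(0, 1, 5)[:, None]
C = (x - x.T) ** 2                       # squared-distance cost
a = np.full(5, 0.2)
b = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
P = sinkhorn(a, b, C)
assert np.allclose(P.sum(axis=1), a, atol=1e-6)  # row marginals match a
assert np.allclose(P.sum(axis=0), b, atol=1e-6)  # column marginals match b
```

A neural OT map, as in the paper, amortizes this computation: instead of a plan between fixed samples, a network learns a map that generalizes to new samples.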
In this paper we consider the problem of linear unmixing hidden random variables defined over the simplex with additive Gaussian noise, also known as probabilistic simplex component analysis (PRISM). Previous solutions to tackle this challenging problem were based on geometrical approaches or computationally intensive variational methods. In contra...
This letter proposes a fast yet efficient method to solve the hyperspectral unmixing problem in the challenging unsupervised context, i.e., when the endmember spectral signatures are unknown. First, a coarse approximation of the hyperspectral image is computed by spatially averaging neighboring pixels, which significantly reduces the amount of pixe...
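The first step described above, spatially averaging neighboring pixels to build a coarse image, can be sketched as a block-mean over non-overlapping windows; the helper `coarsen` and the shapes are illustrative, not the letter's implementation.

```python
import numpy as np

# Illustrative coarse approximation: average non-overlapping f x f spatial
# blocks of an (H, W, B) hyperspectral cube, reducing the number of pixels.
def coarsen(img, f=2):
    """Average f x f spatial blocks of an (H, W, B) cube (H, W divisible by f)."""
    h, w, b = img.shape
    return img.reshape(h // f, f, w // f, f, b).mean(axis=(1, 3))

img = np.arange(4 * 4 * 3, dtype=float).reshape(4, 4, 3)
coarse = coarsen(img)
assert coarse.shape == (2, 2, 3)
# Top-left coarse pixel = mean of the four top-left original pixels.
assert np.allclose(coarse[0, 0], img[:2, :2].reshape(4, 3).mean(axis=0))
```

With a factor of 2 the unmixing step then runs on a quarter of the pixels, which is where the announced speed-up comes from.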
We propose a simple yet efficient sparse unmixing method for hyperspectral images. It exploits the spatial and spectral properties of hyperspectral images by designing a new regularization informed by multiscale analysis. The proposed approach consists of two steps. First, a sparse unmixing is conducted on a coarse hyperspectral image resulting fro...
When no arterial input function is available, quantification of dynamic PET images requires a previous step devoted to the extraction of a reference time-activity curve (TAC). Factor analysis is often applied for this purpose. This paper introduces a novel approach that conducts a new kind of nonlinear factor analysis relying on a compartment model...
This letter proposes a simple, fast yet efficient sparse hyperspectral unmixing algorithm. The proposed method consists of three main steps. First, a coarse approximation of the hyperspectral image is built using an off-the-shelf segmentation algorithm. Then, a low-resolution approximation of the abundance map is estimated by solving a weighted ℓ...
Despite their advantages, normalizing flows generally suffer from several shortcomings including their tendency to generate unrealistic data (e.g., images) and their failing to detect out-of-distribution data. One reason for these deficiencies lies in the training strategy which traditionally exploits a maximum likelihood principle only. This paper...
In the context of Earth observation, the detection of changes is performed from multitemporal images acquired by sensors with possibly different characteristics and modalities. Even when restricting to the optical modality, this task has proved to be challenging as soon as the sensors provide images of different spatial and/or spectral resolutions....
Hyperspectral unmixing plays an important role in hyperspectral image processing and analysis. It aims to decompose mixed pixels into pure spectral signatures and their associated abundances. The hyperspectral image contains spatial information in neighborhood regions, and spectral signatures existing in the region also have a high correlation. How...
This letter proposes a weighted residual nonnegative matrix factorization (NMF) with spatial regularization to unmix hyperspectral (HS) data. NMF decomposes a matrix into the product of two nonnegative matrices. However, NMF is known to be generally sensitive to noise, which makes it difficult to retrieve the global minimum of the underlying object...
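As a baseline for the weighted, spatially regularized variant described above, here is plain NMF with the classical Lee–Seung multiplicative updates (no weighting or regularizer; all sizes are illustrative).

```python
import numpy as np

# Plain NMF, V ≈ W H with W, H >= 0, via multiplicative updates.
rng = np.random.default_rng(0)
V = np.abs(rng.normal(size=(30, 20)))    # data, e.g. bands x pixels
W = np.abs(rng.normal(size=(30, 4)))     # endmember-like factor
H = np.abs(rng.normal(size=(4, 20)))     # abundance-like factor

eps = 1e-12                              # avoids division by zero
err0 = np.linalg.norm(V - W @ H)
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)  # multiplicative update for H
    W *= (V @ H.T) / (W @ H @ H.T + eps)  # multiplicative update for W

assert np.linalg.norm(V - W @ H) < err0   # Frobenius error decreased
assert (W >= 0).all() and (H >= 0).all()  # nonnegativity preserved
```

The multiplicative form preserves nonnegativity by construction; the letter's contribution is to reweight the residual and add a spatial penalty on top of this scheme to gain robustness to noise.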
Efficient sampling from a high-dimensional Gaussian distribution is an old but high-stakes issue. Vanilla Cholesky samplers imply a computational cost and memory requirements which can rapidly become prohibitive in high dimension. To tackle these issues, multiple methods have been proposed from different communities ranging from iterative numerical...
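The vanilla Cholesky sampler mentioned above works as follows: to draw x ~ N(mu, Sigma), factor Sigma = L Lᵀ and set x = mu + L z with z ~ N(0, I). The O(n³) factorization and O(n²) storage are what become prohibitive in high dimension; the toy covariance below is illustrative.

```python
import numpy as np

# Cholesky sampling from N(mu, Sigma): x = mu + L z, z ~ N(0, I).
rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n))
Sigma = A @ A.T + n * np.eye(n)          # a symmetric positive-definite covariance
mu = np.zeros(n)

L = np.linalg.cholesky(Sigma)            # O(n^3) factorization
samples = mu + (L @ rng.normal(size=(n, 100000))).T

# Empirical covariance of the samples should match Sigma.
emp = np.cov(samples.T)
assert np.allclose(emp, Sigma, atol=0.2 * np.abs(Sigma).max())
```

The alternatives surveyed in these works (iterative linear algebra, MCMC) avoid forming `L` explicitly, trading the exact factorization for matrix-vector products.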
In this work, we tackle the problem of hyperspectral (HS) unmixing by departing from the usual linear model and focusing on a Linear-Quadratic (LQ) one. The proposed algorithm, referred to as Successive Nonnegative Projection Algorithm for Linear Quadratic mixtures (SNPALQ), extends the Successive Nonnegative Projection Algorithm (SNPA), designed t...
In this work, we consider the problem of blind source separation (BSS) by departing from the usual linear model and focusing on the linear-quadratic (LQ) model. We propose two provably robust and computationally tractable algorithms to tackle this problem under separability assumptions which require the sources to appear as samples in the data set....
Efficient sampling from a high-dimensional Gaussian distribution is an old but high-stakes issue. In past years, multiple methods have been proposed from different communities to tackle this difficult sampling task, ranging from iterative numerical linear algebra to Markov chain Monte Carlo (MCMC) approaches. Surprisingly, no complete review and comp...
Hyperspectral imaging has become a significant source of valuable data for astronomers over the past decades. Current instrumental and observing time constraints allow direct acquisition of multispectral images, with high spatial but low spectral resolution, and hyperspectral images, with low spatial but high spectral resolution. To enhance scienti...
Accounting for endmember variability is a challenging issue when unmixing hyperspectral data. This paper models the variability that is associated with each endmember as a conical hull defined by extremal pixels from the data set. These extremal pixels are considered as so-called prototypal endmember spectra that have meaningful physical interpreta...
The James Webb Space Telescope (JWST) will provide multispectral and hyperspectral infrared images of a large number of astrophysical scenes. Multispectral images will have the highest angular resolution, while hyperspectral images (e.g., with integral field unit spectrometers) will provide the best spectral resolution. This paper aims at providing...
This paper discusses the reconstruction of partially sampled spectrum-images to accelerate the acquisition in scanning transmission electron microscopy (STEM). The problem of image reconstruction has been widely considered in the literature for many imaging modalities, but only a few attempts handled 3D data such as spectral images acquired by STEM...
Hyperspectral unmixing aims at identifying a set of elementary spectra and the corresponding mixture coefficients for each pixel of an image. As the elementary spectra correspond to the reflectance spectra of real materials, they are often very correlated, thus yielding an ill-conditioned problem. To enrich the model and reduce ambiguity due to the...
This paper aims at providing a comprehensive framework to generate an astrophysical scene and to simulate realistic hyperspectral and multispectral data acquired by two JWST instruments, namely the NIRCam Imager and the NIRSpec IFU. We want to show that this simulation framework can be used to assess the benefits of fusing these images to recover an im...
The spatial pixel resolution of common multispectral and hyperspectral sensors is generally not sufficient to prevent multiple elementary materials from contributing to the observed spectrum of a single pixel. To alleviate this limitation, spectral unmixing is a by-pass procedure which consists in decomposing the observed spectra associated with these...
Archetypal scenarios for change detection generally consider two images acquired through sensors of the same modality. However, in some specific cases such as emergency situations, the only images available may be those acquired through sensors of different modalities. This paper addresses the problem of unsupervisedly detecting changes between two...
Jointly segmenting a collection of images with shared classes is expected to yield better results than single-image based methods, due to the use of the shared statistical information across different images. This paper proposes a Bayesian approach for tackling this problem. As a first contribution, the proposed method relies on a new prior distrib...
Hyperspectral unmixing aims at identifying a set of elementary spectra and the corresponding mixture coefficients for each pixel of an image. As the elementary spectra correspond to the reflectance spectra of real materials, they are often very correlated, yielding an ill-conditioned problem. To enrich the model and to reduce ambiguity due to the hi...
Markov chain Monte Carlo (MCMC) methods are an important class of computation techniques to solve Bayesian inference problems. Much recent research has been dedicated to scale these algorithms in high-dimensional settings by relying on powerful optimization tools such as gradient information or proximity operators. In a similar vein, this paper pro...
In recent years, much research has been devoted to the restoration of Poissonian images using optimization-based methods. On the other hand, the derivation of efficient and general fully Bayesian approaches is still an active area of research, especially when standard regularization functions are used, e.g., the total variation (TV) norm. This pape...
Factor analysis has proven to be a relevant tool for extracting tissue time-activity curves (TACs) in dynamic PET images, since it allows for an unsupervised analysis of the data. Reliable and interpretable results are possible only if the analysis is conducted with respect to suitable noise statistics. However, the noise in reconstructed dynamic PET images is ve...
Supervised classification and representation learning are two widely used methods to analyze multivariate images. Although complementary, these two classes of methods have been scarcely considered jointly. In this paper, a method coupling these two approaches is designed using a matrix cofactorization formulation. Each task is modeled as a factoriz...
Spectral variability is one of the major issues when conducting hyperspectral unmixing. Within a given image composed of some elementary materials (herein referred to as endmember classes), the spectral signatures characterizing these classes may spatially vary due to intrinsic component fluctuations or external factors (illumination). These redund...
Data augmentation, by the introduction of auxiliary variables, has become a ubiquitous technique to improve mixing/convergence properties, simplify the implementation or reduce the computational time of inference methods such as Markov chain Monte Carlo. Nonetheless, introducing appropriate auxiliary variables while preserving the initial target p...
To analyze dynamic positron emission tomography (PET) images, various generic multivariate data analysis techniques have been considered in the literature, such as clustering, principal component analysis (PCA), independent component analysis (ICA) and non-negative matrix factorization (NMF). Nevertheless, these conventional approaches generally fa...
Factor analysis has proven to be a relevant tool for extracting tissue time-activity curves (TACs) in dynamic PET images, since it allows for an unsupervised analysis of the data. To provide reliable and interpretable outputs, it must be conducted with respect to suitable noise statistics. However, the noise in reconstructed dynamic PET im...
Only a few research works consider LiDAR data while conducting hyperspectral image unmixing. However, the digital surface model derived from LiDAR can provide meaningful information, in particular when spatially regularizing the inverse problem underlain by spectral unmixing. This paper proposes a general framework for spectral unmixing that incorp...
Archetypal scenarios for change detection generally consider two images acquired through sensors of the same modality. However, in some specific cases such as emergency situations, the only images available may be those acquired through sensors with different characteristics. This paper addresses the problem of unsupervisedly detecting changes betw...
Logistic regression has been extensively used to perform classification in machine learning and signal/image processing. Bayesian formulations of this model with sparsity-inducing priors are particularly relevant when one is interested in drawing credibility intervals with few active coefficients. Along these lines, the derivation of efficient simu...
Spectral variability is one of the major issues when conducting hyperspectral unmixing. Within a given image composed of some elementary materials (herein referred to as endmember classes), the spectral signatures characterizing these classes may spatially vary due to intrinsic component fluctuations or external factors (illumination). These redundan...
Recently, a new class of Markov chain Monte Carlo (MCMC) algorithms took advantage of convex optimization to build efficient and fast sampling schemes from high-dimensional distributions. Variable splitting methods have become classical in optimization to divide difficult problems into simpler ones and have proven their efficiency in solving high-dim...
This paper derives two new optimization-driven Monte Carlo algorithms inspired by variable splitting and data augmentation. In particular, the formulation of one of the proposed approaches is closely related to the main steps of the alternating direction method of multipliers (ADMM). The proposed framework makes it possible to derive faster and more efficient sam...
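The deterministic ADMM iteration that such samplers mirror can be sketched on a small lasso problem, min_x 0.5‖Ax − y‖² + λ‖x‖₁ (this is the classical optimization scheme, not the paper's sampler; sizes and parameters are illustrative).

```python
import numpy as np

# ADMM for lasso: alternate a quadratic x-update, a soft-threshold z-update,
# and a dual ascent on u. Splitting variable z carries the l1 penalty.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
x_true = np.array([1.5, 0.0, -2.0, 0.0, 0.0])
y = A @ x_true + 0.01 * rng.normal(size=20)
lam, rho = 0.1, 1.0

z = np.zeros(5)
u = np.zeros(5)
AtA, Aty = A.T @ A, A.T @ y
Q = np.linalg.inv(AtA + rho * np.eye(5))   # cached solve for the x-update
for _ in range(200):
    x = Q @ (Aty + rho * (z - u))          # quadratic x-update
    z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0)  # shrinkage
    u = u + x - z                          # dual update

assert np.allclose(z, x_true, atol=0.1)    # sparse solution recovered
```

In the Monte Carlo counterpart, each of these deterministic updates is replaced by sampling from a corresponding conditional distribution, which is what makes the formulation "closely related to the ADMM main steps".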
Supervised classification and spectral unmixing are two methods to extract information from hyperspectral images. However, despite their complementarity, they have been scarcely considered jointly. This paper presents a new hierarchical Bayesian model to perform both analyses simultaneously, in order to ensure that they benefit from each other. A li...
Unsupervised change detection techniques are generally constrained to two multi-band optical images acquired at different times through sensors sharing the same spatial and spectral resolution. This scenario is suitable for a direct comparison of homologous pixels, such as pixel-wise differencing. However, in some specific cases such as emergency...
Remote sensing data are often degraded by many issues that may include the failure of onboard hardware, signal downlink, atmospheric conditions, and overall quality/age of the sensors (for example, in terms of signal-to-noise ratio or sharpness).
Electron microscopy has proven to be a very powerful tool to map the chemical nature of samples at various scales, down to atomic resolution. However, many samples cannot be analyzed with an acceptable signal-to-noise ratio because of the radiation damage induced by the electron beam. This is particularly crucial for electron energy loss spectroscop...
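The dose-reduction strategy behind this line of work is partial sampling: only a random fraction of spatial positions is acquired, and reconstruction fills in the rest. The sketch below only generates such an acquisition mask; the sampling rate and array shapes are hypothetical.

```python
import numpy as np

# Illustrative partial sampling of a spectrum-image: the beam dwells on a
# random 20% of spatial positions; the rest is left missing (NaN).
rng = np.random.default_rng(0)
h, w, bands = 32, 32, 100
rate = 0.2                                   # fraction of probed pixels

mask = rng.random((h, w)) < rate             # True where a spectrum is acquired
cube = rng.normal(size=(h, w, bands))        # stand-in for the full data
acquired = np.where(mask[..., None], cube, np.nan)  # unvisited pixels missing

frac = mask.mean()
assert 0.1 < frac < 0.3                      # roughly the requested rate
assert np.isnan(acquired[~mask]).all()       # unvisited pixels are empty
```

The electron dose scales with `rate`, so a 20% mask delivers roughly a fivefold dose reduction; the reconstruction problem is then to estimate the NaN entries from the acquired spectra.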
Spectral unmixing methods incorporating spatial regularizations have attracted increasing interest. Although spatial regularizers which promote smoothness of the abundance maps have been widely used, they may overly smooth these maps and, in particular, may not preserve edges present in the hyperspectral image. Existing unmixing methods usually...
Within a supervised classification framework, labeled data are used to learn classifier parameters. Prior to that, it is generally required to perform dimensionality reduction via feature extraction. These preprocessing steps have motivated numerous research works aiming at recovering latent variables in an unsupervised context. This paper proposes...