Conference Paper (PDF available)

A Non-negative Tensor Factorization Approach to Feature Extraction for Image Analysis


Abstract

In this paper, a decomposition method is proposed for Separable Non-negative Tensor Factorization (SNTF), which yields a structure similar to the PARATUCK2 model for the decomposition of non-negative tensors. Among the many possible approaches to tensor factorization, we develop a specific procedure for SNTF that decomposes a multi-way dataset, expressed in the form of a tensor, into low-rank components that extract the dominant features of the data. The SNTF method is evaluated on real image data, and the results show that the proposed SNTF is superior to other NTF methods in terms of error performance and computational efficiency.
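The paper's specific SNTF procedure is not reproduced in this preview. For orientation only, the following is a minimal sketch of a generic nonnegative 3-way CP (PARAFAC) factorization fitted with Lee-Seung-style multiplicative updates; it is not the proposed SNTF, and all function names and sizes are illustrative:

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product: column r is kron(U[:, r], V[:, r])."""
    r = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, r)

def ntf_cp(X, rank, n_iter=200, eps=1e-9, seed=0):
    """Nonnegative CP model X[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r], fitted with
    multiplicative updates that preserve nonnegativity of the factors."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.random((n, rank)) + 0.1 for n in (I, J, K))
    X1 = X.reshape(I, J * K)                      # mode-1 unfolding (C order)
    X2 = X.transpose(1, 0, 2).reshape(J, I * K)   # mode-2 unfolding
    X3 = X.transpose(2, 0, 1).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(n_iter):
        Z = khatri_rao(B, C); A *= (X1 @ Z) / (A @ (Z.T @ Z) + eps)
        Z = khatri_rao(A, C); B *= (X2 @ Z) / (B @ (Z.T @ Z) + eps)
        Z = khatri_rao(A, B); C *= (X3 @ Z) / (C @ (Z.T @ Z) + eps)
    return A, B, C
```

Each unfolding pairs with the matching Khatri-Rao product so that, at the model, X1 = A (B ⊙ C)^T and similarly for the other modes.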
... A generalization of the separability assumption to higher order tensors is however not straightforward. Recently, an attempt was made to define "pure slices" [32], but another possible generalization is obtained by supposing the columns of factor B are contained among the KM columns of the unfolding matrix T_2. The drawback of this model is the possibly very large number of correlated atoms in the obtained dictionary T_2. ...
Preprint
To ensure interpretability of the sources extracted in tensor decomposition, we introduce in this paper a dictionary-based tensor canonical polyadic decomposition which constrains one factor to belong exactly to a known dictionary. A new formulation of sparse coding is proposed which enables dictionary-based canonical polyadic decomposition of high-dimensional tensors. The benefits of using a dictionary in tensor decomposition models are explored in terms of both parameter identifiability and estimation accuracy. The performance of the proposed algorithms is evaluated on the decomposition of simulated data and the unmixing of hyperspectral images.
Article
Full-text available
In this paper, we study the nonnegative matrix factorization problem under the separability assumption (that is, a small subset of the columns of the input nonnegative data matrix spans a cone containing all of the columns), which is equivalent to the hyperspectral unmixing problem under the linear mixing model and the pure-pixel assumption. We present a family of fast recursive algorithms and prove that they are robust under any small perturbation of the input data matrix. This family generalizes several existing hyperspectral unmixing algorithms and hence provides, for the first time, a theoretical justification of their better practical performance.
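A well-known member of this family of recursive algorithms is the successive projection algorithm (SPA): repeatedly select the column of largest norm, then project the data onto the orthogonal complement of that column. A minimal sketch under the noiseless separable model (the matrix `M` and rank `r` are illustrative):

```python
import numpy as np

def spa(M, r):
    """Successive Projection Algorithm: under separability (pure-pixel
    assumption), returns indices of r columns whose cone contains all columns."""
    R = M.astype(float).copy()
    picked = []
    for _ in range(r):
        j = int(np.argmax((R * R).sum(axis=0)))  # column of largest norm
        u = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(u, u @ R)                  # project out that direction
        picked.append(j)
    return picked
```

Since the norm of a convex combination never exceeds the largest column norm, each greedy pick is a pure column, and the projection preserves this property for the residuals; robustness of such schemes to perturbations is what the paper above establishes.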
Article
Full-text available
We present a generative appearance-based method for recognizing human faces under variation in lighting and viewpoint. Our method exploits the fact that the set of images of an object in fixed pose, but under all possible illumination conditions, is a convex cone in the space of images. Using a small number of training images of each face taken with different lighting directions, the shape and albedo of the face can be reconstructed. In turn, this reconstruction serves as a generative model that can be used to render, or synthesize, images of the face under novel poses and illumination conditions. The pose space is then sampled, and for each pose the corresponding illumination cone is approximated by a low-dimensional linear subspace whose basis vectors are estimated using the generative model. Our recognition algorithm assigns to a test image the identity of the closest approximated illumination cone (based on Euclidean distance within the image space). We test our face recognition method on 4050 images from the Yale Face Database B; these images contain 405 viewing conditions (9 poses × 45 illumination conditions) for 10 individuals. The method performs almost without error, except on the most extreme lighting directions, and significantly outperforms popular recognition methods that do not use a generative model.
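The final classification step, assigning a test image to the identity whose low-dimensional subspace (approximating an illumination cone) is closest in Euclidean distance, can be sketched generically; the training matrices, subspace dimension, and names below are illustrative, not the paper's implementation:

```python
import numpy as np

def subspace_basis(images, dim):
    """Orthonormal basis of the dominant dim-dimensional subspace spanned by a
    set of training images (one vectorized image per column), via truncated SVD."""
    U, _, _ = np.linalg.svd(images, full_matrices=False)
    return U[:, :dim]

def classify(test_image, bases):
    """Return the index of the subspace with the smallest Euclidean residual."""
    residuals = [np.linalg.norm(test_image - B @ (B.T @ test_image))
                 for B in bases]
    return int(np.argmin(residuals))
```

The residual ||x - B B^T x|| is the distance from the test image to the subspace spanned by the columns of B, so the classifier picks the nearest approximated cone.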
Book
- Standard ALS Algorithm
- Methods for Improving Performance and Convergence Speed of ALS Algorithms
- ALS Algorithm with Flexible and Generalized Regularization Terms
- Combined Generalized Regularized ALS Algorithms
- Wang-Hancewicz Modified ALS Algorithm
- Implementation of Regularized ALS Algorithms for NMF
- HALS Algorithm and its Extensions
- Simulation Results
- Discussion and Conclusions
- Appendix 4.A: MATLAB Source Code for ALS Algorithm
- Appendix 4.B: MATLAB Source Code for Regularized ALS Algorithms
- Appendix 4.C: MATLAB Source Code for Mixed ALS-HALS Algorithms
- Appendix 4.D: MATLAB Source Code for HALS CS Algorithm
- Appendix 4.E: Additional MATLAB Functions
- References
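The HALS variants listed above differ from standard ALS by updating one factor column at a time in closed form, clipping at zero instead of solving a full nonnegative least-squares problem. A minimal NMF-style sketch of the idea (matrix names and sizes are illustrative, and this simplifies the book's regularized variants):

```python
import numpy as np

def hals_nmf(X, rank, n_iter=100, eps=1e-12, seed=0):
    """NMF X ~ W H via Hierarchical ALS: cyclic closed-form updates of single
    columns of W and single rows of H, clipped to stay nonnegative."""
    m, n = X.shape
    rng = np.random.default_rng(seed)
    W, H = rng.random((m, rank)), rng.random((rank, n))
    for _ in range(n_iter):
        XHt, HHt = X @ H.T, H @ H.T
        for r in range(rank):   # exact minimizer over W[:, r], then clip
            W[:, r] = np.maximum(
                eps, W[:, r] + (XHt[:, r] - W @ HHt[:, r]) / (HHt[r, r] + eps))
        WtX, WtW = W.T @ X, W.T @ W
        for r in range(rank):   # exact minimizer over H[r, :], then clip
            H[r, :] = np.maximum(
                eps, H[r, :] + (WtX[r, :] - WtW[r, :] @ H) / (WtW[r, r] + eps))
    return W, H
```

Because each single-column subproblem has a closed-form solution, HALS typically converges in far fewer iterations than multiplicative updates.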
Article
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.
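The CP model described here, a tensor written as a sum of rank-one terms, is easy to state concretely; the factor matrices below are arbitrary examples:

```python
import numpy as np

# Factor matrices of a rank-2 CP model for a 3 x 4 x 5 tensor.
rng = np.random.default_rng(0)
A, B, C = rng.random((3, 2)), rng.random((4, 2)), rng.random((5, 2))

# CP reconstruction: X[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r].
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Equivalently, an explicit sum of rank-one (outer-product) tensors.
X_sum = sum(np.multiply.outer(np.multiply.outer(A[:, r], B[:, r]), C[:, r])
            for r in range(2))
assert np.allclose(X, X_sum)
```

The Tucker decomposition generalizes this by replacing the implicit diagonal core with a dense core tensor mixing all triples of factor columns.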