Conference Paper

Proximal Methods for Sparse Hierarchical Dictionary Learning

Conference: Proceedings of the 27th International Conference on Machine Learning (ICML-10), June 21-24, 2010, Haifa, Israel
Source: DBLP

ABSTRACT: We propose to combine two approaches for modeling data admitting sparse representations: on the one hand, dictionary learning has proven effective for various signal processing tasks. On the other hand, recent work on structured sparsity provides a natural framework for modeling dependencies between dictionary elements. We thus consider a tree-structured sparse regularization to learn dictionaries embedded in a hierarchy. The involved proximal operator is computable exactly via a primal-dual method, allowing the use of accelerated gradient techniques. Experiments show that for natural image patches, learned dictionary elements organize themselves in such a hierarchical structure, leading to improved performance on restoration tasks. When applied to text documents, our method learns hierarchies of topics, thus providing a competitive alternative to probabilistic topic models.
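
To make the abstract's central computation concrete, here is a minimal NumPy sketch (not the authors' code) of the proximal operator of a tree-structured ℓ2 regularizer: when the groups form a tree and are visited with children before parents, composing the elementary group soft-thresholding operators once each yields the exact proximal operator. The function names, the (indices, weight) group encoding, and the toy tree are illustrative assumptions.

    import numpy as np

    def prox_l2(v, threshold):
        # Block soft-thresholding: prox of threshold * ||.||_2 at v.
        norm = np.linalg.norm(v)
        if norm <= threshold:
            return np.zeros_like(v)
        return (1.0 - threshold / norm) * v

    def prox_tree(u, groups, lam):
        # Prox of lam * sum_g omega_g * ||w_g||_2 for tree-structured
        # groups: apply each elementary group prox once, ordered from
        # the leaves of the tree up to the root (children before parents).
        # `groups` is a list of (index_array, omega_g) pairs in that order.
        w = u.copy()
        for idx, omega in groups:
            w[idx] = prox_l2(w[idx], lam * omega)
        return w

    # Toy hierarchy: root variable 0 with leaf children 1 and 2; each
    # group holds a node plus its descendants, leaf groups listed first.
    u = np.array([0.5, 1.5, -0.2])
    groups = [(np.array([1]), 1.0),
              (np.array([2]), 1.0),
              (np.array([0, 1, 2]), 1.0)]
    print(prox_tree(u, groups, lam=0.3))

Having this operator available in closed form is what lets the method plug into accelerated first-order solvers, as the abstract notes.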

Related publications:
    ABSTRACT: We consider the problem of learning overcomplete dictionaries in the context of sparse coding, where each sample selects a sparse subset of dictionary elements. Our method consists of two stages, \viz initial estimation of the dictionary, and a clean-up phase involving estimation of the coefficient matrix, and re-estimation of the dictionary. We prove that our method exactly recovers both the dictionary and the coefficient matrix under a set of sufficient conditions.
    09/2013;
  • ABSTRACT: In many machine learning and pattern analysis applications, grouping features during model development and selecting a small number of relevant groups can improve the interpretability of the learned parameters. Although this problem has received significant attention lately, most approaches require manual tuning of one or more hyper-parameters. To overcome this drawback, this work presents a novel hierarchical Bayesian formulation of a generalized linear model and estimates the posterior distribution of the parameters and hyper-parameters within a completely Bayesian paradigm based on variational inference. All the required computations are analytically tractable. The performance and applicability of the proposed framework are demonstrated on synthetic and real-world examples.
    International Journal of Machine Learning and Cybernetics. 12/2013;
  • ABSTRACT: Recent studies have demonstrated the advantages of information fusion based on sparsity models for multimodal classification. Among several sparsity models, tree-structured sparsity provides a flexible framework for extracting cross-correlated information from different sources and for enforcing group sparsity at multiple granularities. However, the existing algorithm only solves an approximated version of the cost functional, and the resulting solution is not necessarily sparse at the group level. This paper reformulates the tree-structured sparse model for the multimodal classification task. An accelerated proximal algorithm is proposed to solve the optimization problem (a generic sketch of such a scheme appears after this list); it is an efficient tool for feature-level fusion among homogeneous or heterogeneous sources of information. In addition, a (fuzzy-set-theoretic) possibilistic scheme is proposed to weight the available modalities, based on their respective reliability, in a joint optimization problem for finding the sparsity codes. This approach provides a general framework for quality-based fusion that adds robustness to several sparsity-based multimodal classification algorithms. To demonstrate their efficacy, the proposed methods are evaluated on three applications: multiview face recognition, multimodal face recognition, and target classification.
    03/2014;
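
Both the paper above and this last related work rely on accelerated proximal gradient techniques. As a point of reference (a generic sketch, not any of these authors' implementations), the FISTA scheme of Beck and Teboulle alternates a gradient step on the smooth loss with the proximal operator of the regularizer, plus Nesterov-style momentum; `grad_f`, `prox`, and the Lipschitz constant `L` are problem-specific inputs assumed here.

    import numpy as np

    def fista(grad_f, prox, L, x0, n_iter=200):
        # Accelerated proximal gradient (FISTA) for min_x f(x) + Omega(x):
        # a gradient step on the smooth term f, then the prox of Omega
        # with step size 1/L, with Nesterov momentum on the iterates.
        x, y, t = x0.copy(), x0.copy(), 1.0
        for _ in range(n_iter):
            x_next = prox(y - grad_f(y) / L, 1.0 / L)
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)
            x, t = x_next, t_next
        return x

For the sparse coding step of hierarchical dictionary learning, f(w) = 0.5 * ||x - D w||^2 gives grad_f(w) = D.T @ (D @ w - x), L can be taken as the largest eigenvalue of D.T @ D, and the regularizer's prox is the tree-structured operator sketched earlier, adapted to the step size via, e.g., prox = lambda v, step: prox_tree(v, groups, lam * step).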
