Independent Component Analysis by Entropy Bound Minimization

Dept. of CSEE, UMBC, Baltimore, MD, USA
IEEE Transactions on Signal Processing, 11/2010; 58(10):5151-5164. DOI: 10.1109/TSP.2010.2055859
Source: IEEE Xplore


A novel (differential) entropy estimator is introduced in which the maximum entropy bound is used to approximate the entropy given the observations, and is computed using a numerical procedure, resulting in accurate entropy estimates. We show that such an estimator exists for a wide class of measuring functions, and we provide a number of design examples to demonstrate its flexible nature. We then derive a novel independent component analysis (ICA) algorithm that uses the entropy estimate thus obtained: ICA by entropy bound minimization (ICA-EBM). The algorithm adopts a line search procedure and initially uses updates that constrain the demixing matrix to be orthogonal for robust performance. We demonstrate the superior performance of ICA-EBM and its ability to match sources that come from a wide range of distributions using simulated and real-world data.
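As a rough illustration of the entropy-bound idea, the Python sketch below assigns each measuring function an approximate entropy value for a zero-mean, unit-variance source estimate and keeps the smallest one. The corrections use Hyvarinen-style negentropy approximations with illustrative constants as a stand-in for the numerically computed maximum-entropy bounds of the paper, and the measuring functions shown are example choices rather than the authors' exact set.

import numpy as np

# Sketch: Gaussian entropy minus a negentropy-style correction per measuring
# function; the smallest (tightest) value is kept as the entropy estimate.
# Constants and functions are illustrative, not the paper's numerical bounds.

H_GAUSS = 0.5 * np.log(2 * np.pi * np.e)   # entropy of a unit-variance Gaussian

# (measuring function G, E[G(v)] for standard normal v, illustrative weight)
MEASURING_FUNCS = [
    (lambda y: y ** 3,             0.0,    1.0 / 12),   # skewness-type (odd)
    (lambda y: y ** 4,             3.0,    1.0 / 48),   # kurtosis-type (even)
    (lambda y: np.log(np.cosh(y)), 0.3746, 1.0),        # robust even function
]

def entropy_bound_estimate(y):
    """Approximate entropy of a 1-D sample after zero-mean, unit-variance scaling."""
    y = (y - y.mean()) / y.std()
    estimates = []
    for G, gauss_moment, weight in MEASURING_FUNCS:
        delta = np.mean(G(y)) - gauss_moment
        estimates.append(H_GAUSS - weight * delta ** 2)
    return min(estimates)   # keep the tightest value

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print(entropy_bound_estimate(rng.normal(size=50_000)))    # close to H_GAUSS (~1.419)
    print(entropy_bound_estimate(rng.laplace(size=50_000)))   # noticeably smaller

In ICA-EBM, an entropy estimate of this kind is what the demixing-matrix updates (orthogonal at first, then unconstrained, via a line search) seek to minimize for each estimated source.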

    • "The estimated bound of entropy is minimized to find the ICAs. The algorithm adopts a line search procedure initially constraining the demixing matrix to be orthogonal (Li and Adali, 2010b). "
    ABSTRACT: Independent component analysis (ICA) techniques offer a data-driven possibility to analyze brain functional MRI data in real-time. Typical ICA methods used in functional magnetic resonance imaging (fMRI), however, have been until now mostly developed and optimized for the off-line case in which all data is available. Real-time experiments are ill-posed for ICA in that several constraints are added: limited data, limited analysis time and dynamic changes in the data and computational speed. Previous studies have shown that particular choices of ICA parameters can be used to monitor real-time fMRI (rt-fMRI) brain activation, but it is unknown how other choices would perform. In this rt-fMRI simulation study we investigate and compare the performance of 14 different publicly available ICA algorithms systematically sampling different growing window lengths (WLs), model order (MO) as well as a priori conditions (none, spatial or temporal). Performance is evaluated by computing the spatial and temporal correlation to a target component as well as computation time. Four algorithms are identified as best performing (constrained ICA, fastICA, amuse, and evd), with their corresponding parameter choices. Both spatial and temporal priors are found to provide equal or improved performances in similarity to the target compared with their off-line counterpart, with greatly reduced computation costs. This study suggests parameter choices that can be further investigated in a sliding-window approach for a rt-fMRI experiment.
    Frontiers in Human Neuroscience, 02/2013; 7:19. DOI: 10.3389/fnhum.2013.00019
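A minimal sketch of the growing-window evaluation summarized in the abstract above, assuming a simulated data matrix data (time x voxels), a target spatial map target_map, and a target time course target_tc, all hypothetical inputs; scikit-learn's FastICA stands in for any of the 14 compared algorithms, and the target component is matched by maximum absolute spatial correlation.

import time
import numpy as np
from sklearn.decomposition import FastICA

def growing_window_eval(data, target_map, target_tc, window_lengths, model_order=20):
    """data: (time, voxels); target_map: (voxels,); target_tc: (time,)."""
    results = []
    for wl in window_lengths:
        X = data[:wl]                                  # growing window: first wl volumes
        t0 = time.perf_counter()
        ica = FastICA(n_components=model_order, max_iter=500, random_state=0)
        time_courses = ica.fit_transform(X)            # (wl, model_order) temporal components
        spatial_maps = ica.mixing_                     # (voxels, model_order) spatial patterns
        elapsed = time.perf_counter() - t0
        # Match the target by maximum absolute spatial correlation, then score
        # the matched component's time course against the target time course.
        s_corrs = [abs(np.corrcoef(spatial_maps[:, k], target_map)[0, 1])
                   for k in range(model_order)]
        k = int(np.argmax(s_corrs))
        t_corr = abs(np.corrcoef(time_courses[:, k], target_tc[:wl])[0, 1])
        results.append({"wl": wl, "spatial_r": s_corrs[k],
                        "temporal_r": t_corr, "seconds": elapsed})
    return results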
    • "led to ICA algorithms such as EFICA [23] and adaptive complex maximization of non-Gaussianity [26]. ICA by entropy bound minimization (ICA-EBM) [21] use an efficient entropy estimator based on the bounding of the entropy estimates, and with only a few measuring functions, can approximate the pdf of a wide range of densities including sub-or super-Gaussian, unimodal or multimodal, symmetric or skewed distributions. A more powerful class takes both sample correlation and non- Gaussianity into account. "
    ABSTRACT: Since the discovery of functional connectivity in fMRI data (i.e., temporal correlations between spatially distinct regions of the brain), there has been a considerable amount of work in this field. One important focus has been on the analysis of brain connectivity using the concept of networks instead of regions. Approximately ten years ago, two important research areas grew out of this concept. First, Raichle proposed "a default mode of brain function," since dubbed the default mode network. Second, multisubject or group independent component analysis (ICA) provided a data-driven approach to study properties of brain networks, including the default mode network. In this paper, we provide a focused review of how ICA has contributed to the study of intrinsic networks. We discuss some methodological considerations for group ICA and highlight multiple analytic approaches for studying brain networks. We also show examples of some of the differences observed in the default mode and resting networks in the diseased brain. In summary, we are in exciting times and still just beginning to reap the benefits of the richness of functional brain networks as well as available analytic approaches.
    IEEE Reviews in Biomedical Engineering, 12/2012; 5:60-73. DOI: 10.1109/RBME.2012.2211076
    • "Another approach to density matching is entropy estimation through the use of a number of suitably chosen measuring functions [70]. Such an approach can provide robust performance even with a small set of chosen measuring functions [70]. Since the demixing matrix is not constrained in the maximum likelihood approach, exact density matching for each Fig. 13. "
    ABSTRACT: Complex-valued signals occur in many areas of science and engineering and are thus of fundamental interest. In the past, it has often been assumed, usually implicitly, that complex random signals are proper or circular. A proper complex random variable is uncorrelated with its complex conjugate, and a circular complex random variable has a probability distribution that is invariant under rotation in the complex plane. While these assumptions are convenient because they simplify computations, there are many cases where proper and circular random signals are very poor models of the underlying physics. When taking impropriety and noncircularity into account, the right type of processing can provide significant performance gains. There are two key ingredients in the statistical signal processing of complex-valued data: 1) utilizing the complete statistical characterization of complex-valued random signals; and 2) the optimization of real-valued cost functions with respect to complex parameters. In this overview article, we review the necessary tools, among which are widely linear transformations, augmented statistical descriptions, and Wirtinger calculus. We also present some selected recent developments in the field of complex-valued signal processing, addressing the topics of model selection, filtering, and source separation.
    IEEE Transactions on Signal Processing, 12/2011; 59(11):5101-5125. DOI: 10.1109/TSP.2011.2162954
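A minimal sketch of the augmented statistical description mentioned in the abstract above: the covariance C = E[z z^H] and the complementary (pseudo-)covariance P = E[z z^T] together form the augmented covariance of the stacked vector [z; z*], and a nonzero P signals an improper (noncircular) signal. The test signal below is an illustrative construction, not data from the paper.

import numpy as np

def augmented_covariance(Z):
    """Z: (n_samples, n_channels), complex, zero-mean samples in rows."""
    n = Z.shape[0]
    C = Z.T @ Z.conj() / n                      # Hermitian covariance E[z z^H]
    P = Z.T @ Z / n                             # complementary covariance E[z z^T]
    R_aug = np.block([[C, P], [P.conj(), C.conj()]])
    return C, P, R_aug

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(100_000, 2))
    # Improper signal: real and imaginary parts are correlated and have
    # unequal power, so E[z z^T] does not vanish.
    z = 2.0 * x + 1j * (0.5 * x + 0.1 * rng.normal(size=x.shape))
    z -= z.mean(axis=0)
    C, P, R_aug = augmented_covariance(z)
    print("impropriety measure ||P||/||C|| =", np.linalg.norm(P) / np.linalg.norm(C))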

