Conference Paper

A New Information Theoretic Measure of PDF Symmetry

CNEL, University of Florida, FL
DOI: 10.1109/ICASSP.2007.366289 · Conference: IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007), Volume 2
Source: IEEE Xplore


In this paper, a new quantity called symmetric information potential (SIP) is proposed to measure the reflection symmetry of probability density functions and to estimate their location parameter. SIP is defined as an inner product in the probability density function space and has a close relation to information theoretic learning. A simple nonparametric estimator can be computed directly from the data. Experiments demonstrate that this concept can be very useful in dealing with impulsive data distributions, in particular, alpha-stable distributions.
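The abstract describes SIP as an inner product of a density with its reflection about a candidate center m, estimated nonparametrically from samples. A minimal sketch of that idea follows, assuming a plug-in estimator that averages a Gaussian kernel over the pairwise sums x_i + x_j - 2m (the exact estimator form and kernel bandwidth in the paper may differ; both are illustrative assumptions here). Maximizing the estimate over m yields a location estimate:

```python
import math
import random

def gaussian_kernel(u, sigma):
    # Gaussian kernel G_sigma(u), used by the pairwise plug-in estimator below
    return math.exp(-u * u / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

def sip(samples, m, sigma=0.5):
    """Plug-in estimate of SIP(m) = <f(x), f(2m - x)>: average the kernel
    over x_i + x_j - 2m for all sample pairs (assumed estimator form;
    the paper gives the precise expression)."""
    n = len(samples)
    total = 0.0
    for xi in samples:
        for xj in samples:
            total += gaussian_kernel(xi + xj - 2.0 * m, sigma)
    return total / (n * n)

random.seed(0)
data = [random.gauss(3.0, 1.0) for _ in range(100)]  # symmetric about 3

# Location estimate: the m that maximizes the SIP estimate over a coarse grid
grid = [i * 0.05 for i in range(0, 121)]  # candidate centers 0.0 .. 6.0
m_hat = max(grid, key=lambda m: sip(data, m))
print(m_hat)  # should land near the true symmetry center, 3.0
```

Because the kernel depends only on x_i + x_j - 2m, the estimate is largest when the sample is balanced around m, which is what makes it usable both as a symmetry measure and as a robust location estimator for heavy-tailed data.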

ABSTRACT: Information theoretic learning (ITL) is a signal processing technique that goes far beyond the traditional techniques based on second-order statistics, which rely heavily on linearity and Gaussianity assumptions. Information potential (IP) and symmetric information potential (SIP) are very important concepts in ITL used for system adaptation and data inference. In this paper, a mathematical analysis of the bias and the variance of their estimators is presented. Our results show that the variances decrease as the sample size N increases at the speed of O(N<sup>-1</sup>) and a bound exists for the biases. A simple numerical simulation is presented to support our analysis.
IEEE Workshop on Machine Learning for Signal Processing, September 2007
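The O(N⁻¹) variance decay claimed for the IP estimator can be checked empirically. The sketch below assumes the standard pairwise plug-in form of the information potential, V̂ = (1/N²) Σᵢⱼ G_σ(xᵢ - xⱼ), and measures the spread of the estimate across repeated draws at two sample sizes (kernel width and trial counts are illustrative choices, not values from the paper):

```python
import math
import random

def ip(samples, sigma=1.0):
    # Plug-in information potential: mean Gaussian kernel over all sample pairs
    n = len(samples)
    c = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    total = 0.0
    for xi in samples:
        for xj in samples:
            total += c * math.exp(-((xi - xj) ** 2) / (2.0 * sigma * sigma))
    return total / (n * n)

def estimator_variance(n, trials=150, seed=1):
    # Empirical variance of the IP estimate over repeated samples of size n
    random.seed(seed)
    vals = [ip([random.gauss(0.0, 1.0) for _ in range(n)]) for _ in range(trials)]
    mean = sum(vals) / trials
    return sum((v - mean) ** 2 for v in vals) / trials

v_small = estimator_variance(25)
v_large = estimator_variance(100)
print(v_small, v_large)  # variance shrinks as N grows
```

With a fourfold increase in N, the analysis above predicts roughly a fourfold drop in variance, which the simulation reproduces up to sampling noise.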

