A New Information Theoretic Measure of PDF Symmetry
CNEL, University of Florida, FL
DOI: 10.1109/ICASSP.2007.366289
Conference: 2007 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007), Volume 2
Source: IEEE Xplore
In this paper, a new quantity called the symmetric information potential (SIP) is proposed to measure the reflection symmetry and to estimate the location parameter of probability density functions. SIP is defined as an inner product in the space of probability density functions and is closely related to information theoretic learning. A simple nonparametric estimator can be computed directly from the data. Experiments demonstrate that this concept is very useful when dealing with impulsive data distributions, in particular alpha-stable distributions.
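The abstract does not reproduce the estimator's closed form. The sketch below assumes the usual Parzen plug-in construction from information theoretic learning, in which the information potential is estimated as (1/N²) Σ G_{σ√2}(x_i − x_j) with a Gaussian kernel G, and the SIP analogue replaces the pairwise difference with the sum x_i + x_j − 2c, comparing the density with its reflection about a candidate centre c. Treat the exact kernel and form as assumptions, not the paper's definitive estimator.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """Gaussian kernel G_sigma(u) used by the Parzen plug-in estimators."""
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def information_potential(x, sigma=1.0):
    """Plug-in IP estimate: (1/N^2) sum_ij G_{sqrt(2)*sigma}(x_i - x_j)."""
    d = x[:, None] - x[None, :]
    return gaussian_kernel(d, np.sqrt(2) * sigma).mean()

def symmetric_information_potential(x, c, sigma=1.0):
    """Plug-in estimate of the inner product between f(x) and its
    reflection f(2c - x): (1/N^2) sum_ij G_{sqrt(2)*sigma}(x_i + x_j - 2c)."""
    s = x[:, None] + x[None, :] - 2 * c
    return gaussian_kernel(s, np.sqrt(2) * sigma).mean()

rng = np.random.default_rng(0)
# Impulsive sample: standard Cauchy (alpha-stable with alpha = 1),
# symmetric about 0, with no finite mean or variance.
x = rng.standard_cauchy(2000)

# Scanning c and maximizing SIP gives a location estimate; for a symmetric
# density the maximizer should sit near the true symmetry centre (0 here).
cs = np.linspace(-2, 2, 81)
sips = [symmetric_information_potential(x, c) for c in cs]
c_hat = cs[int(np.argmax(sips))]
```

Maximizing SIP over c is where the location-parameter use comes from: unlike the sample mean, which diverges for alpha-stable data, the kernel inner product stays well behaved under heavy tails.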
ABSTRACT: Information theoretic learning (ITL) is a signal processing framework that goes beyond traditional techniques based on second-order statistics, which rely heavily on linearity and Gaussianity assumptions. The information potential (IP) and the symmetric information potential (SIP) are central concepts in ITL, used for system adaptation and data inference. In this paper, a mathematical analysis of the bias and the variance of their estimators is presented. Our results show that the variances decrease at the rate O(N<sup>-1</sup>) as the sample size N increases, and that the biases are bounded. A simple numerical simulation is presented to support our analysis.
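The O(N⁻¹) variance rate can be checked with a minimal Monte Carlo experiment. The sketch below assumes the standard Parzen plug-in IP estimator (the paper's exact estimator and constants are not reproduced here); if the variance scales as 1/N, doubling the sample size should roughly halve the empirical variance of the estimate.

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen plug-in IP estimate: (1/N^2) sum_ij G_{sqrt(2)*sigma}(x_i - x_j)."""
    d = x[:, None] - x[None, :]
    s = np.sqrt(2) * sigma
    return (np.exp(-d**2 / (2 * s**2)) / (np.sqrt(2 * np.pi) * s)).mean()

rng = np.random.default_rng(1)
sizes = [100, 200, 400, 800]
trials = 200

# Empirical variance of the IP estimate at each sample size N,
# over independent standard-normal draws.
variances = []
for n in sizes:
    est = [information_potential(rng.standard_normal(n)) for _ in range(trials)]
    variances.append(np.var(est))

# Under var ~ O(1/N), each ratio var(N) / var(2N) should be close to 2.
ratios = [variances[i] / variances[i + 1] for i in range(len(sizes) - 1)]
```

Plotting log(variance) against log(N) and reading off a slope near −1 is the same check in graphical form.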