A Modified Method for Blind Source Separation
SHIH-LIN LIN, PI-CHENG TUNG
National Central University
Department of Mechanical Engineering
No.300, Jhongda Rd., Jhongli City, Taoyuan County 32001,
TAIWAN
t331166@ncu.edu.tw (P.C. Tung)
Abstract: - Blind source separation is an important but highly challenging technology in astronomy, physics, chemistry, the life sciences, medicine, earth science, and the applied sciences. Independent Component Analysis (ICA) applies techniques from applied computer science to blind source separation: given mixtures recorded by multiple sensors, it can approximately estimate the underlying signals. This study proposes a modified ICA algorithm that estimates the actual phase and amplitude of the sources, restoring the signals separated by blind source separation to their original state. The method has great potential for application in many different fields.
Key-Words: - Computer Science; Blind Source Separation; Independent Component Analysis
1 Introduction
Independent Component Analysis is a technique that combines statistics, computer science, and digital signal processing. It estimates the original signals from the measured mixtures. Separating mixed signals from diverse sources is a difficult task, and ICA is one of the main approaches to this signal separation problem. The early work on blind source separation is represented by Herault et al. [1], who proposed a novel neural network learning algorithm [2]. Based on a feedback neural network, this learning algorithm separates mixed independent signals and accomplishes blind source separation by selecting an odd nonlinear function to establish Hebbian training. The paper immediately drew attention from researchers in neural networks and signal processing. In 1994, Comon introduced the concept of independent component analysis and proposed cost functions, the indeterminacies of signal recovery, and so forth [3]. In 1995, Bell and Sejnowski published another landmark study [4],[5] with three major contributions. First, it was the first time a neural network with a sigmoid nonlinear function was used to cancel the higher-order statistical correlations in the measured signals. Second, the study constructed a contrast function from the principle of information maximization, thereby connecting ICA with information theory. Third, it developed an iterative learning algorithm (the Infomax algorithm), which successfully separated ten mixed voice signals. However, the algorithm requires matrix inversion, its convergence is slow, its effectiveness depends on how the original signals were mixed, and it can only separate super-Gaussian signals. Despite these disadvantages, the study led to increasing research interest in ICA. In 1999, Hyvärinen proposed a fast iterative algorithm called FastICA, with greatly increased convergence speed [6]-[9]. The major implementations of ICA may be divided into two types: Infomax ICA, presented by Lee in 1998 [4],[5], and FastICA, proposed by Hyvärinen in 1999. The latter is derived from an artificial neural network learning rule and can be carried out as a fixed-point algorithm with a faster convergence speed.
In recent years, ICA has found broad application in diverse fields [10]-[24]. In addition to the processing of acoustic and image signals, ICA is used in feature extraction, financial data analysis, and telecommunications. One of its major applications in biomedical research is the analysis of the electroencephalogram (EEG). Makeig et al. in 2002 analyzed EEG signals with ICA to study the correlation between brain activity and finger movement [24]. EEG records the electrical potentials of brain activity by means of sensors placed at various positions on the scalp. The recorded potential is composed of basic components of brain activity together with noise [2],[23]. If these components are independent, ICA is able to extract the specific information about brain activity that interests us.
However, ICA is not without disadvantages. During estimation, aliasing and errors may occur: the estimated components may have opposite phase and unequal amplitude, so the retrieved signals are distorted versions of the originals. This study proposes a modified ICA that estimates the actual phase and amplitude, allowing precise retrieval of the original signals.
2 Cocktail-party problem
2.1 Traditional ICA
The cocktail-party problem is the most famous example of ICA application [4]-[6],[9]. At a cocktail party, as shown in Figure 1, there are four different positions s_1, s_2, s_3, and s_4, and four original signals: s_1(t) is the sound of a police car; s_2(t), the rock music; s_3(t), the classical music; and s_4(t), the speaking voices. All the original signals are assumed to be statistically independent. In traditional ICA, the signals are assumed to follow non-Gaussian probability distributions, with at most one Gaussian component allowed. The modified ICA can handle either Gaussian or non-Gaussian distributed signals and therefore has broader application, as many signals in daily life are Gaussian distributed. When the four sound sources occur at the same time, four microphones at different positions in the room are used to record the sounds.
Fig. 1. Cocktail-party problem.
Since the sounds are all mixed, it is not possible to distinguish each individual source from the signals received. The effect of the distance between each microphone and each original source may be represented by a 4-by-4 matrix A, called the mixing matrix, which is assumed to be of full rank. The source vector s(t) = [s_1(t), s_2(t), s_3(t), s_4(t)]^T passes through the matrix A; the four signals interfere and mix with one another, producing a mixed vector x(t) containing the four signals x_1(t), x_2(t), x_3(t), and x_4(t). The signals received by the microphones may be represented as follows:
x_1(t) = a_{11} s_1(t) + a_{12} s_2(t) + a_{13} s_3(t) + a_{14} s_4(t)
x_2(t) = a_{21} s_1(t) + a_{22} s_2(t) + a_{23} s_3(t) + a_{24} s_4(t)
x_3(t) = a_{31} s_1(t) + a_{32} s_2(t) + a_{33} s_3(t) + a_{34} s_4(t)
x_4(t) = a_{41} s_1(t) + a_{42} s_2(t) + a_{43} s_3(t) + a_{44} s_4(t)    (1)
In the above equation, A is unknown, which makes the retrieval of s(t) = (s_1(t), s_2(t), s_3(t), s_4(t)) difficult. However, ICA can transform the related signals x_1(t), x_2(t), x_3(t), and x_4(t) into statistically independent signals. The estimated matrix W is called the unmixing matrix; ICA aims to estimate W ≈ A^{-1} and thereby recover the statistically independent signals. The equation for signal retrieval may be written as follows:
u_1(t) = w_{11} x_1(t) + w_{12} x_2(t) + w_{13} x_3(t) + w_{14} x_4(t)
u_2(t) = w_{21} x_1(t) + w_{22} x_2(t) + w_{23} x_3(t) + w_{24} x_4(t)
u_3(t) = w_{31} x_1(t) + w_{32} x_2(t) + w_{33} x_3(t) + w_{34} x_4(t)
u_4(t) = w_{41} x_1(t) + w_{42} x_2(t) + w_{43} x_3(t) + w_{44} x_4(t)    (2)
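As a concrete illustration of the mixing model in Eq. (1) and the unmixing step in Eq. (2), the following sketch builds a synthetic four-source example. The stand-in sources, the random mixing matrix A, and the sampling are assumptions made only for this example, and the oracle W = A^{-1} is used merely to show that Eq. (2) recovers the sources when W is known.

    # Sketch of the 4-by-4 linear mixing model of Eq. (1) and the unmixing step of
    # Eq. (2), using synthetic stand-in sources and an arbitrary full-rank mixing
    # matrix A (both assumptions for illustration only).
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 6, 6000)                      # 6 s of samples, as in the figures

    s = np.vstack([
        np.sign(np.sin(2 * np.pi * 3 * t)),          # s1: stand-in for the police-car siren
        np.sin(2 * np.pi * 7 * t),                   # s2: stand-in for the rock music
        np.sin(2 * np.pi * 1.5 * t + 1.0),           # s3: stand-in for the classical music
        rng.standard_normal(t.size),                 # s4: noise-like stand-in for speech
    ])

    A = rng.uniform(0.2, 1.0, size=(4, 4))           # unknown full-rank mixing matrix
    x = A @ s                                        # Eq. (1): x(t) = A s(t)

    W = np.linalg.inv(A)                             # oracle unmixing matrix W = A^-1
    u = W @ x                                        # Eq. (2): u(t) = W x(t)
    print(np.allclose(u, s))                         # True: the sources are recovered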
Statistical independence can be measured by entropy. Entropy is a basic concept of information theory [3],[7],[9]; it indicates the degree of uncertainty of a random variable. In other words, the more unpredictable a variable is, the greater its entropy. Entropy H is first defined for discrete random variables and can be extended to continuous random variables and random vectors. For a random vector y with density p(y),

H(y) = −∫ p(y) log p(y) dy.    (3)
Given the entropies of the outputs, their statistical dependence can be measured by the mutual information, defined as
I(y_1, y_2, y_3, y_4) = H(y_1) + H(y_2) + H(y_3) + H(y_4) − H(y_1, y_2, y_3, y_4).    (4)
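The quantities in Eqs. (3) and (4) can be estimated numerically from samples. The sketch below uses simple histogram (plug-in) estimates with an illustrative bin count and a two-signal test; the same construction extends to the four outputs y_1, ..., y_4 of Eq. (4).

    # Histogram (plug-in) sketch of the entropy in Eq. (3) and the mutual
    # information in Eq. (4); the bin count and the two-signal test below are
    # illustrative choices, and the construction extends to four outputs.
    import numpy as np

    def entropy(samples, bins=32):
        # Discrete plug-in entropy (in nats) of one binned signal.
        counts, _ = np.histogram(samples, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def mutual_information(a, b, bins=32):
        # I(a, b) = H(a) + H(b) - H(a, b), estimated from a 2-D histogram.
        joint, _, _ = np.histogram2d(a, b, bins=bins)
        pj = joint / joint.sum()
        pj = pj[pj > 0]
        h_joint = -np.sum(pj * np.log(pj))
        return entropy(a, bins) + entropy(b, bins) - h_joint

    rng = np.random.default_rng(1)
    y1 = rng.standard_normal(10000)
    y2 = rng.standard_normal(10000)                  # independent of y1
    y3 = y1 + 0.1 * rng.standard_normal(10000)       # strongly dependent on y1
    print(mutual_information(y1, y2))                # near 0 for independent outputs
    print(mutual_information(y1, y3))                # clearly positive for dependent outputs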
When the joint output entropy H(y_1, y_2, y_3, y_4) reaches its greatest value, the mutual information I(y_1, y_2, y_3, y_4) between the outputs is smallest. When I(y_1, y_2, y_3, y_4) = 0, the outputs y_1, y_2, y_3, and y_4 are statistically independent. The relation between this criterion and the matrix W, established in the literature, may be written as
ΔW ∝ ∂H(y)/∂W = (W^T)^{-1} + (p′(u)/p(u)) x^T    (5)
where ΔW is the update applied to W and u = Wx is the vector of separated outputs. However, this learning rule is computationally demanding because it involves a matrix inversion. Following the studies of Amari et al. and of Cardoso and Laheld in 1996 [23], the learning rule can be rescaled by multiplying it by W^T W. As a consequence, the learning rule becomes

ΔW ∝ [∂H(y)/∂W] W^T W = [I + φ(u) u^T] W    (6)
where φ(u) is defined as p′(u)/p(u). The maximum-information method uses a gradient ascent algorithm, adjusting W by continued iteration until H(y) reaches its greatest value. The adjustment is
W_{p+1} = W_p + l ΔW    (7)

where p is the iteration index and l is the learning rate. W is updated continuously according to the partial derivative of the objective function H(y) with respect to W, and H(y) increases until its greatest value is reached. In this way the best W may be estimated, enabling the separation of the original signals from the mixed signals.
However, the estimated original signals may be
distorted due to opposite phase and unequal
amplitude. Figure 2 shows the estimation of the
unmixing matrix W in a cocktail-party problem.
Fig. 2. Estimation of the unmixing matrix W in a
cocktail-party problem.
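As an illustration of the update in Eqs. (6) and (7), the following sketch implements the natural-gradient Infomax iteration for super-Gaussian sources. The logistic nonlinearity (so that φ(u) = 1 − 2g(u)), the learning rate, the iteration count, and the batch averaging are assumptions made for this example rather than details taken from the paper.

    # Minimal sketch of the natural-gradient Infomax update of Eqs. (6)-(7),
    # assuming a logistic nonlinearity so that phi(u) = 1 - 2*g(u); learning
    # rate, iteration count, and batch averaging are illustrative assumptions.
    # Intended for (roughly zero-mean) mixtures of super-Gaussian sources.
    import numpy as np

    def infomax_unmix(x, lr=0.01, iters=500, seed=0):
        # x: mixtures with shape (n_signals, n_samples); returns the estimated W.
        n, m = x.shape
        rng = np.random.default_rng(seed)
        W = np.eye(n) + 0.1 * rng.standard_normal((n, n))
        for _ in range(iters):
            u = W @ x                                  # current separated outputs
            phi = 1.0 - 2.0 / (1.0 + np.exp(-u))       # score function for logistic g
            dW = (np.eye(n) + (phi @ u.T) / m) @ W     # Eq. (6), averaged over samples
            W = W + lr * dW                            # Eq. (7): W_{p+1} = W_p + l * dW
        return W

    # usage: W_hat = infomax_unmix(x); u_hat = W_hat @ x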
2.2 Modified ICA
Since traditional ICA suffers from opposite phase and unequal amplitude, this study uses a gradient estimation algorithm to adjust gains and solve these problems: the sign of each gain corrects an opposite phase, and its magnitude corrects an unequal amplitude. The proposed method adds automatically adjustable gains to traditional ICA. One received signal is employed as the main input, while the signals separated by traditional ICA serve as the reference inputs. Each reference input is multiplied by its gain and the products are summed; the sum is compared with the main input signal, and when the correct gains are selected the two signals are identical. The gains are adjusted with a gradient estimation algorithm, one of the available methods for identifying optimal parameters. The proposed method is presented in Figure 3.
Fig. 3. The structure of the modified ICA.
In Figure 3, u(n) denotes the signals separated by ICA and x_1(n) is the main input. θ(n) represents the weight vector adjusted by the gradient estimation
algorithm and therefore serves as the gain. y(n) is the output of the modified algorithm, and e(n) is the error. The vectors are

u(n) = [u_0(n), u_1(n), ..., u_{L-1}(n)]^T,
θ(n) = [θ_0(n), θ_1(n), ..., θ_{L-1}(n)]^T,
where L indicates the number of processed signals.
The output of the gradient estimation algorithm is

y(n) = θ_1(n) u_1(n) + θ_2(n) u_2(n) + θ_3(n) u_3(n) + θ_4(n) u_4(n)    (8)
where u_1(n) is the estimate of the police-car sound, u_2(n) the estimate of the rock music, u_3(n) the estimate of the classical music, and u_4(n) the estimate of the speaking voices. θ_1(n), θ_2(n), θ_3(n), and θ_4(n) are the parameters at time n.
The error of the gradient estimation algorithm is

e(n) = x_1(n) − y(n).    (9)
Substituting Eq. (8) into Eq. (9), we may compute the updated parameter vector θ(n+1) using the simple recursive relations

θ_1(n+1) = θ_1(n) + μ e(n) u_1(n),    (10)
θ_2(n+1) = θ_2(n) + μ e(n) u_2(n),    (11)
θ_3(n+1) = θ_3(n) + μ e(n) u_3(n),    (12)
θ_4(n+1) = θ_4(n) + μ e(n) u_4(n),    (13)

where μ is the step size.
The algorithm converges to the result with the smallest average error. Its advantage is that it uses only additions and multiplications. In this gradient algorithm, the step-size parameter μ mainly determines the stability and convergence speed of the system.
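A minimal sketch of this gain-adjustment step is given below, under the assumption that the ICA outputs are available as an array and with an illustrative step size μ; the function name adjust_gains and its arguments are introduced here only for illustration.

    # Minimal sketch of the gain adjustment of Eqs. (8)-(13): each ICA output
    # u_i(n) is weighted by a gain theta_i(n) updated from the error against the
    # main input x_1(n). The function name and the step size mu are assumptions.
    import numpy as np

    def adjust_gains(u, x1, mu=0.05):
        # u: ICA outputs, shape (L, N); x1: main input, shape (N,); returns theta.
        L, N = u.shape
        theta = np.zeros(L)
        for n in range(N):
            y = theta @ u[:, n]                  # Eq. (8):  y(n) = sum_i theta_i(n) u_i(n)
            e = x1[n] - y                        # Eq. (9):  e(n) = x_1(n) - y(n)
            theta = theta + mu * e * u[:, n]     # Eqs. (10)-(13): theta_i(n+1) = theta_i(n) + mu e(n) u_i(n)
        return theta

    # usage: theta = adjust_gains(u, x[0]); restored = u * theta[:, None]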
3 Results and Discussion
This study was run on an industrial computer with a Pentium 4 CPU at 2.0 GHz using licensed MATLAB 6.5 software. The cocktail-party problem includes four simultaneous sound sources at four different positions, s_1, s_2, s_3, and s_4, representing the individual original signals. When the sounds are mixed, the individual sources cannot be distinguished. Figure 4 shows the time-domain waveforms of the signals received by the microphones. In this system, if the mixing matrix A were known, recovering s_1, s_2, s_3, and s_4 would simply be a matter of solving linear equations. However, the mixing matrix A is usually unknown. Since the sources of the signals are often uncertain, it is not possible to determine the mixing matrix produced by the distances or to measure s_1, s_2, s_3, and s_4
individually. ICA reorganizes a set of complex data into independent components by means of a statistical algorithm. As long as a sufficient amount of measured data x is available and independent components are actually present, it can estimate u, which is close to the original independent components; the mixing matrix A is not required in this analysis. ICA separates the independent components u_1, u_2, u_3, and u_4 as shown in Figure 6. However, the amplitudes of the separated signals become smaller, showing distortion of the original signals. We propose a modified ICA to solve this problem.
Using the gradient estimation algorithm, the proposed method can estimate the original signals: after the separation by ICA, the automatically adjustable gains are applied, and the modified ICA finally restores the amplitude. Its time-domain waveforms are shown in Figure 8. The original signals s_1, s_2, s_3, and s_4 are presented in Figure 10, and Figure 12 shows the Gaussian distributions of the original signals. It can be observed that the retrieved signals are very close to the originals, and so are the retrieved amplitudes. In the frequency domain, the retrieved signals are also very similar to the originals. Unlike traditional ICA, which allows only one Gaussian-distributed signal, all the signals in the modified method are Gaussian distributed.
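The experiments reported here were carried out in MATLAB. Purely as an illustration of an equivalent pipeline, the sketch below mixes four synthetic stand-in sources, separates them with scikit-learn's FastICA, and then applies the gain adjustment of Eqs. (8)-(13); the sources, mixing matrix, and step size are assumptions for the example and do not reproduce the paper's recordings or results.

    # Illustrative end-to-end sketch (not the MATLAB code used for the paper's
    # experiments): mix four synthetic stand-in sources, separate them with
    # scikit-learn's FastICA, then rescale each component against the first
    # mixture with the LMS gain update of Eqs. (8)-(13).
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 6, 6000)
    s = np.vstack([np.sign(np.sin(2 * np.pi * 3 * t)),     # stand-in sources
                   np.sin(2 * np.pi * 7 * t),
                   np.sin(2 * np.pi * 1.5 * t + 1.0),
                   rng.laplace(size=t.size)])
    A = rng.uniform(0.2, 1.0, size=(4, 4))                  # unknown mixing matrix
    x = A @ s                                               # simulated microphone signals

    u = FastICA(n_components=4, random_state=0).fit_transform(x.T).T  # separated outputs

    theta, mu = np.zeros(4), 0.05                           # gains and assumed step size
    for n in range(x.shape[1]):
        e = x[0, n] - theta @ u[:, n]                       # error against the main input
        theta = theta + mu * e * u[:, n]
    restored = u * theta[:, None]                           # amplitude/phase-corrected components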
Fig. 4. Time domain of signals received by microphone.
Fig. 6. Time domain of signals separated by ICA.
Fig. 8. Time domain of signals separated by modified ICA.
Fig. 10. Time domain of original signals.
Fig. 12. Gaussian distribution of the original signals (panels A-D).
4 Conclusion
One of the limitations of ICA is that the original signals should be non-Gaussian, with at most one Gaussian signal allowed; otherwise aliasing and errors may occur during estimation. Opposite phase and unequal amplitude in the estimates lead to distortion of the microphone signals during retrieval. This study proposed a modified ICA that allows more accurate estimation of phase and amplitude, even for Gaussian signals, and ensures effective retrieval of the original signals. The method has promising potential for application in many fields.
Acknowledgments
This project was supported by the National Science
Council in Taiwan, Republic of China, under Project
Number NSC 94-2218-E-008-006.
References:
[1] J. Herault & C. Jutten, Space or Time
Adaptive Signal Processing By Neural
Network Models, Neural Networks for
Computing, AIP Conference Proceedings
Vol.151, pp.207-211, 1986.
[2] P. Comon, Independent component
analysis-a new concept? Signal Process.
Vol.36, pp.287-314, 1994.
[3] A.J. Bell and T.J. Sejnowski, An
information maximisation approach to blind
separation and blind deconvolution, Neural
Comput. Vol.7, No.6, pp.1129-1159, 1995.
[4] T.W. Lee, Independent Component
Analysis: Theory and Applications, Kluwer
Academic Publishers, Boston, 1998.
[5] A. Hyvärinen, Sparse code shrinkage:
Denoising of non-Gaussian data by
maximum likelihood estimation,
Neurocomputing Vol.11, No.7, pp.
1739-1768, 1999.
[6] A. Hyvärinen and E. Oja, Independent
Component Analysis: Algorithms and
Applications, Neural Netw. Vol.13, No.4-5,
pp.411-430, 2000.
[7] A. Hyvärinen, Fast and robust fixed-point
algorithms for independent component
analysis, IEEE Trans. Neural Netw. Vol.10,
No.3, pp. 626-634, 1999b.
[8] A. Hyvärinen and E. Oja, A fast fixed-point
algorithm for independent component
analysis, Neural Comput. Vol.9, No.7, pp.
1483-1492, 1997.
[9] A. Sharma, K.K. Paliwal, Subspace
independent component analysis using
vector kurtosis, Pattern Recognition, Vol.39,
No.11, NOV 2006, pp. 2227-2232.
[10] J. Karhunen. Neural approaches to
independent component analysis and source
separation. In Proc. 4th European Symp.
Artificial Neural Networks, ESANN’96,
Bruges, Belgium, pp.249-266, Apr. 1996.
[11] H. Sahlin, H. Broman, Separation Of
Real-World Signals, Signal Processing,
Vol.64, No.2, pp.103-113, 1998.
[12] G.J. Erickson, J.T. Rychert and C.R. Smith.
Difficulties applying recent blind source
separation techniques to EEG and MEG,
Maximum Entropy and Bayesian Methods,
Boise, Idaho, pp.209-222, 1997.
[13] J.K. Galbraith, G. Hauer, L. Helbig, Z.
Wang, M.J. Marchello, L.A.
Goonewardene, Nutrient profiles in retail
cuts of bison meat, Meat Science, Vol.74,
No.4, 2006,pp. 648-654.
[14] K.H. Ting , P.C.W. Fung , C.Q. Chang,
F.H.Y. Chan, Automatic correction of
artifact from single-trial event-related
potentials by blind source separation using
second order statistics only, Medical
Engineering & Physics, Vol.28, No.8, 2006,
pp.780-794.
[15] J. Marco-Pallares , L. Fuentecilla , T.
Muente , G. Grau, Gating mechanisms for
the auditory N1-evoked potential:
Independent component analysis
tomography and time-frequency analysis,
Journal of Psychophysiology, Vol. 20, No.3,
2006, pp.240-241.
[16] T.W. Boonstra, A. Daffertshofer, C.E.
Peper, P. Beek, Amplitude and phase
dynamics associated with acoustically paced
finger tapping, Brain Research, Vol.1109,
2006, pp.60-69.
[17] A. Singer, Spectral independent component
analysis, Applied and Computational
Harmonic Analysis, Vol.21, No.1, JUL
2006, pp.135-144.
[18] J.M. Lee, S.J. Qin,I.B. Lee, Fault detection
and diagnosis based on modified
independent component analysis, Aiche
Journal, Vol.52, No.10, OCT 2006,
pp.3501-3514.
[19] A. Frigyesi , S. Veerla , D. Lindgren , M.
Hoglund, Independent component analysis
reveals new and biologically significant
structures in micro array data, BMC
Bioinformatics, Vol.7, JUN 2006, No. 290.
[20] D.S. Huang, C.H. Zheng, Independent
component analysis-based penalized
discriminant method for tumor classification
using gene expression data, Bioinformatics,
Vol.22, No.15, AUG 2006, pp.1855-1862.
[21] J. Karhunen, A. Hyvärinen, Application Of
Neural Blind Separation To Signal And
Image Processing, In Proc ICASSP.
Germany, Munich, pp.131-134, 1997.
[22] S. Makeig, T.P. Jung, A.J. Bell, Blind
Separation Of auditory Event-related Brain
Responses Into Independent Components, In
Proc Natl Acad Sci, Vol.94,
pp.10979-10984, 1997.
[23] S. Amari, A. Cichocki, H. H. Yang, A New
Learning Algorithm for Blind Signal
Separation, Advances in Neural Information
Processing Systems Vol.8, pp.757-763,
1996.
[24] S. Makeig, M. Westerfield, T-P. Jung, S.
Enghoff, J. Townsend, E. Courchesne & T.
J. Sejnowski, Dynamic brain sources of
visual evoked responses, Science, Vol.295,
pp.690-694, 2002.