# What is the physical significance of eigenvalues and eigenvectors? Please try to explain in very simple language.

If possible, with respect to signal processing.


- I generally think of eigenvectors as a natural basis (orthogonal for Hermitian operators) of the input and output spaces. The eigenvalues can be thought of as (complex) "gains" for the eigenvectors.
- In data analysis, one usually computes the eigenvectors of a covariance (or correlation) matrix. The eigenvectors are the set of basis functions that describe the data variability most efficiently. They also define the coordinate system in which the covariance matrix becomes diagonal, so that the new variables referenced to this coordinate system are uncorrelated. Each eigenvalue measures the data variance explained by the corresponding new coordinate axis. These properties are used to reduce the dimension of large data sets by keeping only the few modes with significant eigenvalues, and to find new variables that are uncorrelated; this is very helpful for least-squares regressions of badly conditioned systems. It should be noted that the link between these statistical modes and the true dynamical modes of a system is not always straightforward because of sampling problems. cheers, arthur
- If we consider a matrix as a transformation, then in simple terms an eigenvalue is the strength of that transformation in a particular direction, and that direction is the eigenvector.
- Eigenvectors and eigenvalues are used widely in science and engineering. They have many applications, particularly in physics. Consider rigid physical bodies. Rigid bodies have preferred directions of rotation, about which they can rotate freely. For example, if someone were to throw a football, it would spin around its long axis while flying through the air. If someone were to hit the ball mid-flight, it would likely tumble in a much less orderly way. Although this may seem like common sense, even rigid bodies with more complicated shapes have preferred directions of rotation. These are called axes of inertia, and they are found by computing the eigenvectors of a matrix called the inertia tensor. The corresponding eigenvalues, also important, are the moments of inertia.
- Consider a rubber band and stretch it at the two ends: the directions along which the material deforms are the eigenvectors, and the amount of stretch along each of those directions is the corresponding eigenvalue.
- With respect to signal processing, eigendecompositions are usually applied to covariance matrices. In this case, the eigenvectors are vectors that point in the direction of the largest variance, while the eigenvalues define the magnitude of this variance.
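The "complex gains" picture from the first answer can be made concrete in signal processing: circular convolution with a filter is a circulant matrix, the DFT complex exponentials are its eigenvectors, and the eigenvalues are the filter's frequency response. A minimal numpy sketch (the 3-tap filter `h` and the bin index `k` are illustrative assumptions):

```python
import numpy as np

N = 8
h = np.array([0.5, 0.3, 0.2])           # assumed example impulse response
c = np.zeros(N)
c[:len(h)] = h                          # first column of the circulant matrix

# Circular convolution with h, written as a matrix: C[i, j] = c[(i - j) mod N]
C = np.column_stack([np.roll(c, j) for j in range(N)])

# A complex exponential at DFT bin k is an eigenvector of C,
# and its eigenvalue is the frequency response H[k] (the "gain").
k = 2
f_k = np.exp(2j * np.pi * k * np.arange(N) / N)
H = np.fft.fft(c)                       # frequency response of the filter

assert np.allclose(C @ f_k, H[k] * f_k)  # C f_k = H[k] f_k
```

So the filter never mixes these basis signals; it only scales each one by a (complex) gain, which is exactly the eigenvalue/eigenvector statement.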

This can be understood intuitively by noting that a covariance matrix is essentially the "square" of a linear transformation (the transformation times its transpose), i.e. a rotation and a scaling of the data. The rotation is defined by the eigenvectors, while the scaling is defined by the eigenvalues.

A nice explanation of this concept can be found here: http://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/
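This rotation-plus-scaling view can be checked numerically: build 2-D data by rotating and scaling white noise, then recover the rotation and the scales from the eigendecomposition of the sample covariance. A small numpy sketch (the angle and the scale factors are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Construct correlated data as rotation + scaling of white noise
theta = np.pi / 6                                   # assumed rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])     # rotation
S = np.diag([3.0, 0.5])                             # assumed scales (std devs)
X = rng.standard_normal((5000, 2)) @ (R @ S).T      # samples, one per row

cov = np.cov(X, rowvar=False)
evals, evecs = np.linalg.eigh(cov)                  # eigenvalues in ascending order

print(evals[-1])     # largest eigenvalue: close to the largest variance, 3.0**2 = 9
print(evecs[:, -1])  # its eigenvector: close to +/-(cos theta, sin theta)
```

The top eigenvector points along the direction into which the data was stretched the most, and the eigenvalue is the variance in that direction, matching the description above.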

