What’s an eigenvalue? What about an eigenvector?

An eigenvector of a linear transformation is a direction that the transformation only scales: vectors along it may be stretched, compressed, or flipped, but not rotated off that direction. The corresponding eigenvalue is the factor by which the eigenvector is scaled. Formally, for a matrix A, an eigenvector v and its eigenvalue λ satisfy Av = λv. Together they are used to understand the geometry of linear transformations.
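As a minimal sketch of this defining property using NumPy (the matrix here is arbitrary, chosen only for illustration):

```python
import numpy as np

# An arbitrary 2x2 matrix, chosen only for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (unit-norm) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]
    # Verify the defining property Av = λv (up to floating-point error).
    assert np.allclose(A @ v, lam * v)
    print(f"eigenvalue {lam:.2f}, eigenvector {v}")
```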

For example, eigendecomposition makes the covariance matrix of a dataset easier to interpret: the eigenvectors identify the directions along which the data varies, and the eigenvalues express how much variance lies along each of those directions.
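A sketch of that idea on synthetic 2-D data (the data and its covariance are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic correlated 2-D data: 500 samples, features as columns.
X = rng.multivariate_normal(mean=[0.0, 0.0],
                            cov=[[3.0, 1.5],
                                 [1.5, 1.0]],
                            size=500)

# Covariance matrix of the data.
C = np.cov(X, rowvar=False)

# eigh is the right routine for symmetric matrices such as a
# covariance matrix; it returns eigenvalues in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(C)

# The largest eigenvalue pairs with the direction of greatest variance.
order = np.argsort(eigenvalues)[::-1]
for i in order:
    print(f"variance {eigenvalues[i]:.2f} along direction {eigenvectors[:, i]}")
```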

Eigenvalues and eigenvectors are critical to many computer vision and ML applications. The most popular of these is principal component analysis (PCA) for dimensionality reduction (e.g., eigenfaces for face recognition), which projects the data onto the eigenvectors of its covariance matrix that have the largest eigenvalues.
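Tying the two ideas together, here is a minimal PCA sketch via eigendecomposition of the covariance matrix (synthetic data again; in practice a library routine such as sklearn.decomposition.PCA would normally be used):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: 200 samples with 5 features.
X = rng.normal(size=(200, 5))

# 1. Center the data so covariance measures variance around the mean.
X_centered = X - X.mean(axis=0)

# 2. Eigendecomposition of the covariance matrix.
eigenvalues, eigenvectors = np.linalg.eigh(np.cov(X_centered, rowvar=False))

# 3. Keep the eigenvectors with the k largest eigenvalues
#    (the principal components) and project the data onto them.
k = 2
top = np.argsort(eigenvalues)[::-1][:k]
components = eigenvectors[:, top]          # shape (5, k)
X_reduced = X_centered @ components        # shape (200, k)

print(X_reduced.shape)  # (200, 2)
```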