Spectral Theorem
Every real symmetric matrix \( S \) can be expressed as \[ S = Q \Lambda Q^T \] where \( Q \) is an orthogonal matrix whose columns are eigenvectors of \( S \), and \( \Lambda \) is a diagonal matrix of the corresponding real eigenvalues.
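As a numerical check (a sketch using NumPy; the matrix below is an arbitrary example), `numpy.linalg.eigh` computes exactly this factorization for symmetric matrices:

```python
import numpy as np

# An arbitrary real symmetric matrix, chosen for illustration.
S = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])

# eigh is specialized for symmetric/Hermitian matrices: it returns
# real eigenvalues and an orthonormal set of eigenvectors.
eigenvalues, Q = np.linalg.eigh(S)
Lam = np.diag(eigenvalues)

print(np.allclose(Q.T @ Q, np.eye(3)))   # Q is orthogonal: Q^T Q = I
print(np.allclose(Q @ Lam @ Q.T, S))     # S = Q Λ Q^T reconstructs S
```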
Two arguments below motivate this statement. Note that neither argument establishes the existence of the eigenvectors; both take it as given.
Motivation 1: eigenvectors are in the column space and row space
Consider two eigenvector-value pairs \( (v_1, \lambda_1) \) and \( (v_2, \lambda_2) \) with \( \lambda_1 \neq \lambda_2 \).
First, observe that for the eigenvector-value pair \( (v_1, \lambda_1) \), the defining statement \[ S v_1 = \lambda_1 v_1 \]
also gives us the property: \[ (S - \alpha I)v_1 = (\lambda_1 - \alpha)v_1 \]
This implies that \( v_1 \) is in the column space of \( S - \alpha I \) whenever \( \lambda_1 \neq \alpha \), since \( v_1 \) is then a nonzero scalar multiple of \( (S - \alpha I) v_1 \). When \( \lambda_1 = \alpha \), the eigenvector is instead in the null space of \( S - \alpha I \), which means it is orthogonal to every row of \( S - \alpha I \).
Now take \( \alpha = \lambda_1 \): \( v_1 \) is in the null space of \( S - \lambda_1 I \), while \( v_2 \) (since \( \lambda_2 \neq \lambda_1 \)) is in its column space. Because \( S - \lambda_1 I \) is symmetric, its column space equals its row space, so \( v_2 \) is also in the row space of \( S - \lambda_1 I \). The null space is orthogonal to the row space, so \( v_1 \) is orthogonal to \( v_2 \).
This argument extends to more eigenvectors, and we can conclude that a symmetric matrix can be diagonalized by orthogonal eigenvectors.
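The argument can be checked numerically on a small example (a sketch; the \( 2 \times 2 \) matrix and its eigenpairs below are hand-picked for illustration):

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
# Eigenpairs of S, easy to verify by hand.
lam1, v1 = 1.0, np.array([1.0, -1.0])
lam2, v2 = 3.0, np.array([1.0, 1.0])

M = S - lam1 * np.eye(2)  # shift S by lam1

# v1 is in the null space of S - lam1*I ...
print(np.allclose(M @ v1, 0))
# ... while v2 is in its column space: (S - lam1*I) v2 = (lam2 - lam1) v2.
print(np.allclose(M @ v2, (lam2 - lam1) * v2))
# M is symmetric, so column space = row space; the null space is
# orthogonal to the row space, hence v1 ⊥ v2.
print(np.isclose(v1 @ v2, 0.0))
```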
Motivation 2: SVD of a symmetric matrix
Assume the symmetric matrix \( S \) has an SVD \[ S = U \Sigma V^T \] where \( U \) and \( V \) are orthogonal matrices with the same dimensions as \( S \), and \( \Sigma \) is a diagonal matrix of nonnegative singular values.
Then, taking the transpose and using \( S^T = S \) (and \( \Sigma^T = \Sigma \), since \( \Sigma \) is diagonal): \[ S = S^T = (U \Sigma V^T)^T = V \Sigma U^T \]
Equating these two expressions, we get: \[ U \Sigma V^T = V \Sigma U^T \]
To justify the jump from this equality to \( U = V \): when the singular values are distinct, the SVD is unique up to simultaneous sign flips of paired columns, so \( v_i = \pm u_i \) for every \( i \). Absorbing each sign into the corresponding diagonal entry replaces \( \Sigma \) with a diagonal matrix \( \Lambda \) whose entries may be negative; when \( S \) is positive semidefinite, all signs are positive and \( U = V \) exactly. Either way, the SVD collapses into an eigenvector decomposition: \[ S = U \Lambda U^T \] with the columns of \( U \) as eigenvectors of \( S \) and the diagonal entries of \( \Lambda \) as the corresponding eigenvalues.
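This can be seen numerically in the positive-definite case (a sketch; the matrix below is an arbitrary random example, and the \( U = V \) check assumes its eigenvalues are distinct, which holds generically):

```python
import numpy as np

rng = np.random.default_rng(0)

# A symmetric positive-definite matrix (all eigenvalues > 0).
A = rng.standard_normal((3, 3))
S = A @ A.T + 3.0 * np.eye(3)

U, sigma, Vt = np.linalg.svd(S)
eigenvalues, Q = np.linalg.eigh(S)

# Singular values equal the eigenvalues (svd sorts descending,
# eigh ascending).
print(np.allclose(sigma, eigenvalues[::-1]))
# For positive-definite S, U and V coincide: the SVD *is* the
# eigendecomposition S = U diag(sigma) U^T.
print(np.allclose(U, Vt.T))
print(np.allclose(U @ np.diag(sigma) @ U.T, S))
```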