Product of eigenvalues and sum of eigenvalues
Let \( A \) be an \( N \times N \) matrix. Then the following two statements are true.
The product of the N eigenvalues, counted with multiplicity, equals the determinant of \( A \).
The sum of the N eigenvalues equals the trace of \( A \).
Product of eigenvalues equals the determinant.
First the intuition, then the angle of attack for a more formal justification.
Intuition
Consider the 2D case, and suppose \( A \) has two independent eigenvectors (so the plane is spanned by them). Consider the parallelogram formed by these two vectors. Applying \( A \) stretches areas by a factor of \( \det(A) \), so the parallelogram grows in area by a factor of \( \det(A) \). The two sides defining this parallelogram don't rotate or skew under the \( A \) transformation, as they are eigenvectors; they only scale. And so the factor by which the parallelogram grows must decompose into the two scaling factors, \( \lambda_1 \) and \( \lambda_2 \), by which the eigenvectors grow.
Proof idea
A more formal justification is to factor \( \det(A - \lambda I) \) in terms of its N polynomial roots (always possible over the complex numbers): \( \det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda) \cdots (\lambda_N - \lambda) \). Setting \( \lambda = 0 \), we are left with \( \det(A) = \lambda_1 \lambda_2 \cdots \lambda_N \).
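A quick numerical sanity check of both claims (a sketch, not a proof; the random matrix is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # arbitrary 4x4 real matrix

# The N (possibly complex) eigenvalues, with multiplicity.
eigenvalues = np.linalg.eigvals(A)

# For a real matrix, complex eigenvalues come in conjugate pairs,
# so the imaginary parts cancel in the product and the sum.
assert np.isclose(np.prod(eigenvalues).real, np.linalg.det(A))
assert np.isclose(np.sum(eigenvalues).real, np.trace(A))
```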
The trace equals the sum of eigenvalues.
For the 2D case, this can be shown directly through the quadratic formula.
Let:
\[ A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}, \qquad \det(A - \lambda I) = \lambda^2 - (a + d)\lambda + (ad - bc) = 0 \]
and then observe that:
\[ \lambda_{1,2} = \frac{(a + d) \pm \sqrt{(a + d)^2 - 4(ad - bc)}}{2} \]
The sum of these two eigenvalues will be the trace, \( a + d \), since the \( \pm \) square-root terms cancel.
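The quadratic-formula argument can be checked numerically; the matrix entries below are an arbitrary example:

```python
import cmath

a, b, c, d = 3.0, 2.0, 1.0, 4.0  # arbitrary entries of a 2x2 matrix

# Characteristic polynomial: lambda^2 - (a + d) lambda + (ad - bc) = 0.
# cmath handles the case where the discriminant is negative.
disc = cmath.sqrt((a + d) ** 2 - 4 * (a * d - b * c))
lam1 = ((a + d) + disc) / 2
lam2 = ((a + d) - disc) / 2

assert abs((lam1 + lam2) - (a + d)) < 1e-12          # sum = trace
assert abs((lam1 * lam2) - (a * d - b * c)) < 1e-12  # product = determinant
```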
Some easy examples of the trace = sum of eigenvalues proposition.
A few forms of 2D matrix make it easy to see that the trace is the sum of the eigenvalues. Some examples follow.
General observation
An observation is that adding a multiple of \( \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \) only affects the diagonal entries \( \begin{bmatrix} x & . \\ . & y \end{bmatrix} \); the off-diagonal entries are unchanged.
A zero entry
When there is a zero entry, there are two ways to make the matrix singular by adding or subtracting multiples of \( I \):
- the "axis" column vector (the one with only one non-zero value) falls to the origin, \( [0, 0] \).
- the other column vector falls onto the axis of the "axis" column vector, making the two columns dependent.
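Both ways can be seen on a small example; the matrix below is a hypothetical choice with a zero entry in its first column:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [0.0, 5.0]])  # "axis" column [2, 0] lies on the x-axis

# Subtracting 2I sends the axis column [2, 0] to the origin [0, 0].
assert np.isclose(np.linalg.det(A - 2 * np.eye(2)), 0.0)

# Subtracting 5I sends the other column [3, 5] to [3, 0], onto the
# x-axis shared with the axis column, so the columns become dependent.
assert np.isclose(np.linalg.det(A - 5 * np.eye(2)), 0.0)

# The two shifts are the eigenvalues, and their sum is the trace.
assert 2 + 5 == np.trace(A)
```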
Symmetric matrices with equal diagonal
Symmetric matrices with the additional property that all diagonal entries are equal have easy-to-read eigenvalues, as adding or subtracting a multiple of \( I \) acts identically on every column. For \( \begin{bmatrix} a & b \\ b & a \end{bmatrix} \), subtracting \( (a - b)I \) or \( (a + b)I \) makes the two columns dependent, so the eigenvalues are \( a \pm b \) and their sum is the trace, \( 2a \).
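A sketch of this last case, with arbitrary values for \( a \) and \( b \):

```python
import numpy as np

a, b = 4.0, 1.5  # arbitrary choice of equal-diagonal symmetric matrix
A = np.array([[a, b],
              [b, a]])

# eigvalsh returns the (real) eigenvalues of a symmetric matrix,
# sorted ascending; they should be a - b and a + b.
eigenvalues = np.linalg.eigvalsh(A)
assert np.allclose(eigenvalues, [a - b, a + b])

# Their sum is the trace, 2a.
assert np.isclose(eigenvalues.sum(), np.trace(A))
```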