Math and science::Algebra

Effect of adding a multiple of the identity on eigenvectors

Let \( A \) be a matrix that admits the eigendecomposition:

\[ A = X\Lambda X^{-1} \]

Then it's also true that:

\[ A + \beta I = X(\Lambda + \beta I)X^{-1} \]

Or, expressed differently: if \( C = A + \beta I \), what are the eigenvectors and eigenvalues of \( C \)?


\( C \) has the same eigenvectors as \( A \), namely the columns of \( X \), and each eigenvalue is shifted: \( \lambda_i \) becomes \( \lambda_i + \beta \). In other words, the eigenvectors of a matrix don't change when any multiple of \( I \) is added to it; only the eigenvalues change, each by that same multiple.
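This follows from a single line of algebra, writing \( \beta I = \beta XX^{-1} \):

\[ A + \beta I = X\Lambda X^{-1} + \beta XX^{-1} = X(\Lambda + \beta I)X^{-1} \]

So \( C = A + \beta I \) is diagonalized by the same \( X \), with diagonal part \( \Lambda + \beta I \).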

Intuition

To see this in 2D, picture the two columns of \( A \) as vectors. Subtracting \( \lambda I \) slides the first column along the x-axis and the second along the y-axis; at certain values of \( \lambda \) (the eigenvalues), the columns become linearly dependent and the matrix is singular. The eigenvectors are read off from the relative proportions of the columns in such a singular configuration. Adding or subtracting multiples of \( I \) doesn't alter these singular configurations themselves; it only changes how many multiples of \( I \) must be subtracted to reach them. That is why the eigenvectors stay fixed while every eigenvalue shifts by the amount added.
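As a quick numerical check, here is a minimal NumPy sketch; the particular matrix and the shift \( \beta = 5 \) are arbitrary illustrative choices:

```python
import numpy as np

# A diagonalizable matrix and an arbitrary shift, both chosen for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
beta = 5.0
C = A + beta * np.eye(2)

wA, VA = np.linalg.eig(A)
wC, VC = np.linalg.eig(C)

# Sort both spectra so corresponding eigenpairs line up
# (np.linalg.eig makes no ordering guarantee).
iA, iC = np.argsort(wA), np.argsort(wC)

# Eigenvalues shift by exactly beta.
assert np.allclose(wC[iC], wA[iA] + beta)

# Eigenvectors are unchanged up to sign/scale: eig returns unit vectors,
# so |v_A . v_C| == 1 means they span the same line.
for a, c in zip(iA, iC):
    assert np.isclose(abs(VA[:, a] @ VC[:, c]), 1.0)

print("eigenvectors unchanged; eigenvalues shifted by", beta)
```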