\( \newcommand{\matr}[1] {\mathbf{#1}} \newcommand{\vertbar} {\rule[-1ex]{0.5pt}{2.5ex}} \newcommand{\horzbar} {\rule[.5ex]{2.5ex}{0.5pt}} \newcommand{\E} {\mathrm{E}} \)
\( \newcommand{\cat}[1] {\mathrm{#1}} \newcommand{\catobj}[1] {\operatorname{Obj}(\mathrm{#1})} \newcommand{\cathom}[1] {\operatorname{Hom}_{\cat{#1}}} \newcommand{\multiBetaReduction}[0] {\twoheadrightarrow_{\beta}} \newcommand{\betaReduction}[0] {\rightarrow_{\beta}} \newcommand{\betaEq}[0] {=_{\beta}} \newcommand{\string}[1] {\texttt{"}\mathtt{#1}\texttt{"}} \newcommand{\symbolq}[1] {\texttt{`}\mathtt{#1}\texttt{'}} \newcommand{\groupMul}[1] { \cdot_{\small{#1}}} \newcommand{\groupAdd}[1] { +_{\small{#1}}} \newcommand{\inv}[1] {#1^{-1} } \newcommand{\bm}[1] { \boldsymbol{#1} } \require{physics} \require{ams} \require{mathtools} \)
Math and science::Algebra

Eigenvector decomposition. Equivalent forms.

The following holds for a diagonalizable matrix \( A \), where \( S \) has the eigenvectors of \( A \) as its columns and \( \Lambda \) is the diagonal matrix of the corresponding eigenvalues:

\[ S^{-1} A S = \Lambda \]

The above is equivalent to the more commonly seen:

{{\[ A = S \Lambda S^{-1} \]}}

And another equivalent is:

\[ AS = S \Lambda \]
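These equivalences are easy to check numerically. A minimal sketch with NumPy, using a hypothetical \( 2 \times 2 \) matrix chosen to be diagonalizable (distinct eigenvalues):

```python
import numpy as np

# A hypothetical diagonalizable example matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix S whose
# columns are the corresponding eigenvectors.
eigvals, S = np.linalg.eig(A)
Lam = np.diag(eigvals)
S_inv = np.linalg.inv(S)

# The three equivalent forms:
assert np.allclose(S_inv @ A @ S, Lam)   # S^{-1} A S = Λ
assert np.allclose(A, S @ Lam @ S_inv)   # A = S Λ S^{-1}
assert np.allclose(A @ S, S @ Lam)       # A S = S Λ
```

All three assertions pass for any diagonalizable \( A \); only the matrix above is an arbitrary choice for illustration.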

Adding brackets to emphasize the order of multiplication can make the expression more intuitive:

\[ S^{-1} (A S) = \Lambda \]

This involves taking \( A \) times its eigenvectors, which produces scaled versions of those eigenvectors, and then expressing these scaled columns in the basis of the eigenvectors, which of course is just a single number (an eigenvalue) per column.
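This two-step reading can be traced directly in code. The sketch below (again using a hypothetical example matrix) computes \( AS \) first, confirms each column is a scaled eigenvector, and then changes basis with \( S^{-1} \):

```python
import numpy as np

# Hypothetical example matrix for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, S = np.linalg.eig(A)

# Step 1: A times its eigenvectors. Column i of AS is λ_i times
# the i-th eigenvector.
AS = A @ S
for i, lam in enumerate(eigvals):
    assert np.allclose(AS[:, i], lam * S[:, i])

# Step 2: re-express the scaled columns in the eigenvector basis.
# Each column collapses to a single number: the eigenvalue on the
# diagonal of Λ.
result = np.linalg.inv(S) @ AS
assert np.allclose(result, np.diag(eigvals))
```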

The expression:

\[ A S = S \Lambda \]

is simply \( A v = \lambda v \), but for all eigenvectors at once. Because \( \Lambda \) is diagonal, multiplying \( S \) on the right by \( \Lambda \) scales the \( i \)-th column of \( S \) by \( \lambda_i \), so column by column the equation reads

\[ A s_i = \lambda_i s_i , \]

which is exactly the form \( A v = \lambda v \). Note that \( \Lambda \) must sit on the right of \( S \): \( \Lambda S \) scales the rows of \( S \) instead, and in general \( S \Lambda \neq \Lambda S \).