Math and science::Algebra

Projection matrix

We wish to project the vector \( b \) onto the vector \( a \). The projection matrix \( P \) achieves this via the product \( Pb \), and is given by:

\[ P = \frac{aa^T}{a^Ta} \]

We expect that a projection matrix \( P \):

  • Is singular.
  • Has columns that are multiples of \( a \).
  • Is symmetric.
  • Has eigenvalue 1 for vectors along \( a \) and eigenvalue 0 for vectors perpendicular to \( a \).
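
As a quick numerical check of these expectations, here is a minimal NumPy sketch (the example vector \( a \) is an arbitrary choice, not from the note):

    import numpy as np

    a = np.array([1.0, 2.0, 2.0])               # arbitrary example vector to project onto
    P = np.outer(a, a) / (a @ a)                 # P = a a^T / (a^T a)

    print(np.linalg.matrix_rank(P))              # 1 -> rank one, hence singular
    print(np.allclose(P[:, 1], (a[1] / (a @ a)) * a))        # True -> columns are multiples of a
    print(np.allclose(P, P.T))                   # True -> symmetric
    print(np.allclose(P @ a, a))                 # True -> eigenvalue 1 for a
    print(np.allclose(P @ np.array([2.0, -1.0, 0.0]), 0.0))  # True -> eigenvalue 0 perpendicular to a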

Dot product approach

The intuitive approach is to normalize \( a \) to a unit vector, take its dot product with \( b \) to produce a scalar, and then scale the normalized \( a \) by this scalar.
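
A minimal sketch of this approach in NumPy (the example vectors are arbitrary choices, not from the note):

    import numpy as np

    a = np.array([1.0, 2.0, 2.0])
    b = np.array([3.0, 0.0, 3.0])

    a_hat = a / np.linalg.norm(a)     # normalize a to a unit vector
    scalar = a_hat @ b                # dot product with b gives a scalar
    proj = scalar * a_hat             # scale the normalized a by that scalar
    print(proj)                       # [1. 2. 2.] for this example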

Matrix approach

The matrix approach preloads \( a \) into each column of a matrix \( P \), and scales the columns by an amount that both:

  • normalizes \( a \)
  • partially computes the dot product

The multiplication \( Pb \) then amounts to scaling \( \frac{a}{a^Ta} \) by \( a_1 b_1 \) (the first column, already scaled by \( a_1 \), is scaled again by \( b_1 \)), and so on for \( b_2, b_3 \) etc., and summing these intermediate vectors.
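
Continuing the same example, a sketch of this column-by-column view in NumPy (again just an illustrative snippet):

    import numpy as np

    a = np.array([1.0, 2.0, 2.0])
    b = np.array([3.0, 0.0, 3.0])
    P = np.outer(a, a) / (a @ a)      # column j of P is (a_j / a^T a) * a

    # Pb as a sum of columns, each scaled by the matching entry of b.
    proj = sum(b[j] * P[:, j] for j in range(len(b)))
    print(np.allclose(proj, P @ b))   # True: identical to the direct product
    print(proj)                       # [1. 2. 2.], matching the dot product approach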

Compute

It's interesting that the intuitive dot-product approach is cheaper to compute than the matrix approach.
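
As a rough operation count (just a sketch, counting multiplications only and assuming \( a, b \in \mathbb{R}^n \) with \( P \) formed explicitly), the dot product approach takes a few length-\( n \) passes, while the matrix approach takes on the order of \( n^2 \) work:

\[ \underbrace{\hat{a} = a / \lVert a \rVert}_{\approx 2n} \;+\; \underbrace{\hat{a}^T b}_{n} \;+\; \underbrace{(\hat{a}^T b)\, \hat{a}}_{n} \;\approx\; 4n \qquad \text{versus} \qquad \underbrace{P = \frac{aa^T}{a^Ta}}_{\approx n^2} \;+\; \underbrace{Pb}_{\approx n^2} \;\approx\; 2n^2 . \]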