\( \newcommand{\matr}[1] {\mathbf{#1}} \newcommand{\vertbar} {\rule[-1ex]{0.5pt}{2.5ex}} \newcommand{\horzbar} {\rule[.5ex]{2.5ex}{0.5pt}} \newcommand{\E} {\mathrm{E}} \)
\( \newcommand{\cat}[1] {\mathrm{#1}} \newcommand{\catobj}[1] {\operatorname{Obj}(\mathrm{#1})} \newcommand{\cathom}[1] {\operatorname{Hom}_{\cat{#1}}} \newcommand{\multiBetaReduction}[0] {\twoheadrightarrow_{\beta}} \newcommand{\betaReduction}[0] {\rightarrow_{\beta}} \newcommand{\betaEq}[0] {=_{\beta}} \newcommand{\string}[1] {\texttt{"}\mathtt{#1}\texttt{"}} \newcommand{\symbolq}[1] {\texttt{`}\mathtt{#1}\texttt{'}} \newcommand{\groupMul}[1] { \cdot_{\small{#1}}} \newcommand{\groupAdd}[1] { +_{\small{#1}}} \newcommand{\inv}[1] {#1^{-1} } \newcommand{\bm}[1] { \boldsymbol{#1} } \require{physics} \require{ams} \require{mathtools} \)
Math and science::Algebra

Covariance of a linearly transformed random vector

Linear transformation of a random column vector: covariance

Let \( z \) be a random column vector with mean vector \( m \) and covariance matrix \( \operatorname{cov}(z) \). If \( a \) is the random vector derived by the linear transformation \( a = Wz \), then:

  1. the mean of \( a \) is \( \operatorname{E}[a] = Wm \)
  2. the covariance matrix of \( a \) is \( \operatorname{cov}(a) = W \operatorname{cov}(z) W^T \)

\( z \) can be a column of random variables, each of which comes from some distribution that we don't need to know. All we need are each variable's mean and the covariance matrix that relates them.
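The two results can be checked numerically by Monte Carlo. In this sketch the particular values of \( m \), \( \operatorname{cov}(z) \), and \( W \) are made up for illustration (they are not from the card), and a Gaussian is used for sampling purely for convenience; the identities themselves are distribution-free.

```python
import numpy as np

# Hypothetical example values: a 3-dim z mapped to a 2-dim a.
rng = np.random.default_rng(0)
m = np.array([1.0, -2.0, 0.5])                 # mean vector of z
S = np.array([[2.0, 0.3, 0.0],
              [0.3, 1.0, 0.4],
              [0.0, 0.4, 1.5]])                # cov(z), symmetric positive definite
W = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])               # maps 3-dim z to 2-dim a

# Draw many samples of z and transform each one: a = W z.
z = rng.multivariate_normal(m, S, size=200_000)  # shape (N, 3)
a = z @ W.T                                      # shape (N, 2)

# The empirical moments should approach the closed-form expressions.
assert np.allclose(a.mean(axis=0), W @ m, atol=0.1)     # E[a] = W m
assert np.allclose(np.cov(a.T), W @ S @ W.T, atol=0.2)  # cov(a) = W cov(z) W^T
```

The tolerances are loose because the sample moments only converge at rate \( O(1/\sqrt{N}) \).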


Proof for the covariance matrix result

The mean result follows directly from the linearity of expectation: \( \E[Wz] = W\E[z] = Wm \). The proof for the covariance matrix is below.

\[ \begin{align} \operatorname{cov}(a) &= \operatorname{cov}(Wz) \\ &= \E[(Wz-\E[Wz])(Wz-\E[Wz])^T]\\ &= \E[(Wz-Wm)(Wz-Wm)^T]\\ &= \E[W(z-m)(z-m)^TW^T]\\ &= W\,\E[(z-m)(z-m)^T]\,W^T\\ &= W\operatorname{cov}(z)W^T \end{align} \]
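A useful special case (not stated on the card): when \( W \) is a single row \( w^T \), the transformed vector is the scalar \( a = w^Tz \), and the result collapses to the familiar formula for the variance of a linear combination of random variables:

\[ \operatorname{var}(w^Tz) = w^T \operatorname{cov}(z)\, w = \sum_{i,j} w_i w_j \operatorname{cov}(z_i, z_j). \]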

Further motivation