\(
\newcommand{\cat}[1] {\mathrm{#1}}
\newcommand{\catobj}[1] {\operatorname{Obj}(\mathrm{#1})}
\newcommand{\cathom}[1] {\operatorname{Hom}_{\cat{#1}}}
\newcommand{\multiBetaReduction}[0] {\twoheadrightarrow_{\beta}}
\newcommand{\betaReduction}[0] {\rightarrow_{\beta}}
\newcommand{\betaEq}[0] {=_{\beta}}
\newcommand{\string}[1] {\texttt{"}\mathtt{#1}\texttt{"}}
\newcommand{\symbolq}[1] {\texttt{`}\mathtt{#1}\texttt{'}}
\newcommand{\groupMul}[1] { \cdot_{\small{#1}}}
\newcommand{\groupAdd}[1] { +_{\small{#1}}}
\newcommand{\inv}[1] {#1^{-1} }
\newcommand{\bm}[1] { \boldsymbol{#1} }
\newcommand{\qed} { {\scriptstyle \Box} }
\require{physics}
\require{ams}
\require{mathtools}
\)
Math and science::INF ML AI
\( Ax = b \), in the steepest descent method
When searching for an \( x \) such that \( Ax = b \), a fundamental derivative
identity used in the optimization is:
[\[
\frac{\partial}{\partial x} \left( \frac{1}{2} x^T A x \right) = \frac{1}{2} \left( A + A^T \right) x
\]]
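
A quick sanity check, writing the quadratic form out componentwise (for a general, not necessarily symmetric, \( A \)):

\[
\frac{\partial}{\partial x_k} \left( \frac{1}{2} \sum_{i,j} x_i A_{ij} x_j \right)
= \frac{1}{2} \left( \sum_{j} A_{kj} x_j + \sum_{i} x_i A_{ik} \right)
= \frac{1}{2} \left[ \left( A + A^T \right) x \right]_k .
\]

In the steepest descent method \( A \) is assumed symmetric (and positive-definite), so the derivative reduces to \( Ax \). The gradient of \( f(x) = \frac{1}{2} x^T A x - b^T x \) is then \( Ax - b \), which vanishes exactly when \( Ax = b \): minimizing \( f \) solves the linear system.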