Math and science::INF ML AI
Jacobian and Hessian
A Jacobian matrix collects all of the first partial derivatives of a function with multiple inputs and multiple outputs. Each column of the Jacobian holds the input constant (one column per input variable), and each row holds the output constant (one row per output component). So, a Jacobian matrix that is a column vector must belong to a function with a single input.
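A minimal sketch of this shape rule, assuming JAX is available (the note itself contains no code): for f: R^3 -> R^2 the Jacobian is 2x3, and a function with a single input yields a column vector.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Two outputs, three inputs: Jacobian should be 2 x 3.
    return jnp.array([x[0] * x[1], jnp.sin(x[2])])

J = jax.jacobian(f)(jnp.array([1.0, 2.0, 3.0]))
print(J.shape)  # (2, 3): rows index outputs, columns index inputs

def g(x):
    # One input, two outputs: Jacobian collapses to a 2 x 1 column vector.
    return jnp.array([x[0] ** 2, jnp.exp(x[0])])

Jg = jax.jacobian(g)(jnp.array([1.0]))
print(Jg.shape)  # (2, 1)
```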
Extending the idea of the Jacobian to second partial derivatives would give us a 3D tensor. A special case is when a function has only one output: in that case, all of the second partial derivatives fit in a 2D matrix. This is what is called the Hessian.
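A short sketch of the same point, again assuming JAX: a scalar-output function gives an n x n Hessian matrix, while a vector-output function gives the 3D tensor of second partials.

```python
import jax
import jax.numpy as jnp

def scalar_f(x):
    # Single output from three inputs: Hessian is 3 x 3.
    return x[0] ** 2 * x[1] + jnp.sin(x[2])

def vector_f(x):
    # Two outputs from three inputs: second partials form a 2 x 3 x 3 tensor.
    return jnp.array([x[0] * x[1], jnp.cos(x[2])])

x = jnp.array([1.0, 2.0, 3.0])
H = jax.hessian(scalar_f)(x)
print(H.shape)  # (3, 3): entry (i, j) is d^2 f / (dx_i dx_j)

T = jax.hessian(vector_f)(x)
print(T.shape)  # (2, 3, 3): one Hessian-like slice per output
```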