Math and science::INF ML AI
Negative log likelihood loss. A perspective.
Negative log likelihood loss is normally calculated as the negated mean log likelihood of the data under the model q. This is:

\text{NLL} = -\frac{1}{N} \sum_{i=1}^{N} \log(q(x_i))
As this mean is taken over many samples drawn from the data distribution p, it approximates an expectation—an expectation over log probabilities under the model. Sound familiar? This is a Monte Carlo approximation to the cross-entropy H(p, q) between the data distribution p and the model distribution q, which reduces to the entropy of p when q = p:
\text{entropy} = \sum_{i=1}^{N} -p(x_i) \log(p(x_i))
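A quick numerical sanity check of this connection, using only the standard library. The four-outcome distribution p below is an arbitrary illustrative choice; the "model" is taken to be p itself, so the mean negative log likelihood should converge to the entropy of p.

```python
import math
import random

# Illustrative categorical data distribution p over 4 outcomes.
p = [0.1, 0.2, 0.3, 0.4]

# Exact entropy: H(p) = sum_i -p_i * log(p_i)
entropy = sum(-pi * math.log(pi) for pi in p)

# Draw many samples from p and average the negative log likelihood
# under the same distribution (model == data distribution here).
random.seed(0)
n = 200_000
samples = random.choices(range(len(p)), weights=p, k=n)
nll = sum(-math.log(p[x]) for x in samples) / n

print(f"entropy = {entropy:.4f}, mean NLL = {nll:.4f}")
```

With 200,000 samples the two numbers agree to a few decimal places: the mean NLL is a Monte Carlo estimate of the entropy when the model matches the data, and of the cross-entropy H(p, q) in the general case where it does not.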