
Negative log likelihood loss. A perspective.

Negative log likelihood loss is normally calculated as the negated mean log likelihood of the data under the model. That is:

\text{loss} = -\frac{1}{N} \sum_{i=0}^{N} \log P(\text{data}_i \mid \text{model\_out}_i)

As this mean is taken over many samples, it approximates an expectation: the expected negative log probability of the data under the model. Sound familiar? This is an approximation to [what?].
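A minimal sketch of the calculation, assuming the model has already assigned a probability to each observed data point (the numbers below are illustrative, not from any real model):

```python
import math

# Hypothetical probabilities the model assigns to the observed data
# points, one per sample.
probs = [0.9, 0.7, 0.8, 0.95]

# Negative log likelihood: negate the mean of the log probabilities,
# turning the (negative) mean log likelihood into a positive loss.
nll = -sum(math.log(p) for p in probs) / len(probs)
```

Each factor of the likelihood becomes a summand in log space, which is why the mean over samples behaves like a Monte Carlo estimate of an expectation.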

\text{entropy} = \sum_{i=0}^{N} -p(x_i) \log(p(x_i))
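The connection can be checked numerically: averaging $-\log p(x)$ over samples drawn from $p$ converges to the entropy sum above. A small sketch with a made-up three-outcome distribution:

```python
import math
import random

# Toy distribution p over three outcomes (an illustrative assumption).
p = {"a": 0.5, "b": 0.3, "c": 0.2}

# Exact entropy: sum over outcomes of -p(x) * log p(x).
entropy = sum(-px * math.log(px) for px in p.values())

# Monte Carlo estimate: draw samples from p, then average -log p(x).
# This mirrors how the NLL loss averages -log P(data | model_out)
# over samples drawn from the data distribution.
random.seed(0)
outcomes, weights = zip(*p.items())
samples = random.choices(outcomes, weights=weights, k=100_000)
estimate = sum(-math.log(p[x]) for x in samples) / len(samples)
```

With enough samples the estimate lands close to the exact sum, which is the sense in which the averaged loss approximates an expectation over log probabilities.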