\( \newcommand{\matr}[1] {\mathbf{#1}} \newcommand{\vertbar} {\rule[-1ex]{0.5pt}{2.5ex}} \newcommand{\horzbar} {\rule[.5ex]{2.5ex}{0.5pt}} \newcommand{\E} {\mathrm{E}} \)
Math and science::INF ML AI

Summand of Entropy

Consider a term of the form \( x_i \log(\frac{1}{x_i}) \), one summand of the sum that defines entropy. Viewed as a function in its own right, \( f(x) = x \log(\frac{1}{x}) \) has the shape shown below:

and for a log of any base \( b \) the maximum is reached at \( x = \frac{1}{e} \): changing the base only rescales \( f \) by the constant \( \frac{1}{\ln b} \), so it does not move the maximizer, only the maximum value, which is \( \frac{\log_b e}{e} = \frac{1}{e \ln b} \).


So with a natural logarithm, the maximum is at \( x = \frac{1}{e} \approx 0.3679 \), and the maximum value \( f(\frac{1}{e}) = \frac{1}{e} \ln e = \frac{1}{e} \) is also around 0.3679.
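A quick numerical check of this claim (a small sketch; the function name `f` is chosen here for illustration, using the natural logarithm):

```python
import math

def f(x):
    """Entropy summand with natural log: f(x) = x * ln(1/x)."""
    return x * math.log(1.0 / x)

x_star = 1.0 / math.e                 # claimed maximizer, approximately 0.36788
print(round(x_star, 5))               # 0.36788
print(round(f(x_star), 5))            # 0.36788 -- the max value also equals 1/e

# f is smaller on either side of 1/e, consistent with a maximum there:
print(f(x_star) > f(0.2) and f(x_star) > f(0.6))  # True
```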

Proof

With the natural logarithm, the derivative is \( f'(x) = \log(\frac{1}{x}) - 1 \), and setting this to zero gives \( \log(\frac{1}{x}) = 1 \), i.e. \( x = \frac{1}{e} \).
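The steps can be written out in full (natural log throughout):

```latex
\begin{align}
f(x) &= x \log\frac{1}{x} = -x \log x \\
f'(x) &= -\log x - 1 = \log\frac{1}{x} - 1 \\
f'(x) = 0 &\iff \log\frac{1}{x} = 1 \iff x = \frac{1}{e} \\
f\!\left(\frac{1}{e}\right) &= \frac{1}{e}\log e = \frac{1}{e} \approx 0.3679
\end{align}
```

Since \( f''(x) = -\frac{1}{x} < 0 \) on \( (0, 1) \), this critical point is indeed a maximum.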