Math and science::INF ML AI
Summand on Entropy
Consider a term of the form \( x_i \log(\frac{1}{x_i}) \), one summand of the sum that defines entropy. As a function in its own right, \( f(x) = x \log(\frac{1}{x}) \) on \( (0, 1] \) starts at \( 0 \) (as \( x \to 0^+ \)), rises to a single maximum, and falls back to \( 0 \) at \( x = 1 \).
For a logarithm of base \( b \), the maximum is still reached at \( x = \frac{1}{e} \): changing the base only rescales \( f \) by the constant \( \log_b e \), so the location of the peak does not move. The maximum value is \( \frac{\log_b e}{e} \); with a natural logarithm it is \( \frac{1}{e} \approx 0.3679 \).
Proof
With the natural logarithm, the derivative is \( f'(x) = \log(\frac{1}{x}) - 1 \); setting this to zero gives \( \log(\frac{1}{x}) = 1 \), i.e. \( x = \frac{1}{e} \). For base \( b \), the derivative is \( \log_b(\frac{1}{x}) - \log_b e \), which vanishes at the same point.
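The claims above can be checked numerically. The sketch below (the function name `summand` is mine, not from the note) grid-searches \( f(x) = x \log_b(\frac{1}{x}) \) on \( (0, 1) \) and confirms that the maximizer sits at \( x \approx \frac{1}{e} \) for both the natural-log and base-2 cases:

```python
import math

def summand(x: float, base: float = math.e) -> float:
    """One term of the entropy sum: f(x) = x * log_base(1/x), with f(0) := 0."""
    if x == 0.0:
        return 0.0  # limit of x*log(1/x) as x -> 0+ is 0
    return x * math.log(1.0 / x, base)

# Grid-search the maximum on (0, 1).
xs = [i / 100_000 for i in range(1, 100_000)]

x_star_e = max(xs, key=summand)                          # natural log
x_star_2 = max(xs, key=lambda x: summand(x, base=2))     # base 2

# The maximizer is 1/e in both cases; only the peak height changes.
print(f"maximizer (ln):    {x_star_e:.4f}  vs 1/e = {1/math.e:.4f}")
print(f"maximizer (log2):  {x_star_2:.4f}  vs 1/e = {1/math.e:.4f}")
print(f"max value  (ln):   {summand(x_star_e):.4f}")
```

Note that with base 2 the peak value is \( \frac{\log_2 e}{e} \approx 0.5307 \), not \( \frac{1}{e} \), even though the peak location is unchanged.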