Tuesday, July 29, 2014

Shannon's entropy | planetmath.org

Let $X$ be a discrete random variable on a finite set $\mathcal{X}=\{x_1,\ldots,x_n\}$, with probability distribution function $p(x)=\Pr(X=x)$. The entropy $H(X)$ of $X$ is defined as

$$H(X) = -\sum_{x\in\mathcal{X}} p(x)\,\log_b p(x). \qquad (1)$$

The convention $0\log 0=0$ is adopted in the definition. The logarithm is usually taken to the base 2, in which case the entropy is measured in "bits," or to the base $e$, in which case $H(X)$ is measured in "nats."
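As a concrete illustration of definition (1), here is a minimal Python sketch; the function name `entropy` and its argument conventions are choices of this example, not part of the PlanetMath article.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum_x p(x) log_b p(x), with 0*log(0) treated as 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin has entropy 1 bit; a biased coin has less.
print(entropy([0.5, 0.5]))               # 1.0 bit
print(entropy([0.9, 0.1]))               # about 0.469 bits
print(entropy([0.5, 0.5], base=math.e))  # about 0.693 nats
```

Skipping terms with $p=0$ in the sum implements the convention $0\log 0=0$ directly.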

If $X$ and $Y$ are random variables on $\mathcal{X}$ and $\mathcal{Y}$ respectively, the joint entropy of $X$ and $Y$ is

$$H(X,Y) = -\sum_{(x,y)\in\mathcal{X}\times\mathcal{Y}} p(x,y)\,\log_b p(x,y),$$

where $p(x,y)$ denotes the joint distribution of $X$ and $Y$.
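The same computation extends to the joint case. Below is a short sketch under the assumption that the joint distribution is given as a dict mapping $(x,y)$ pairs to probabilities; that representation and the name `joint_entropy` are illustrative choices, not part of the original article.

```python
import math

def joint_entropy(joint_probs, base=2):
    """Joint entropy H(X,Y) = -sum over (x,y) of p(x,y) log_b p(x,y).

    joint_probs: dict mapping (x, y) pairs to probabilities (assumed representation).
    """
    return -sum(p * math.log(p, base) for p in joint_probs.values() if p > 0)

# Two independent fair coin flips: H(X,Y) = H(X) + H(Y) = 2 bits.
p_xy = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}
print(joint_entropy(p_xy))  # 2.0 bits
```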


Read full article from Shannon's entropy | planetmath.org
