Differential Entropy



Joint, Conditional, and Relative Differential Entropy

More basic definitions:

 \begin{displaymath}
h(X_1,\ldots,X_n) = -\int f(x_1,\ldots,x_n)\log f(x_1,\ldots,x_n)
dx_1 \cdots dx_n
\end{displaymath}

is the joint differential entropy.

 \begin{displaymath}
h(X|Y) = -\int f(x,y) \log f(x|y) dx dy
\end{displaymath}

is the conditional differential entropy.
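
These expectations are easy to check numerically. The sketch below is not part of the original notes; the covariance matrix $K$ and the sample size are illustrative choices. It estimates $h(X,Y)$ and $h(X|Y)$ for a bivariate normal by Monte Carlo, using $h = -E[\log f]$ together with the chain rule $h(X|Y) = h(X,Y) - h(Y)$, which follows directly from the two definitions above. Natural logarithms are used, so the results are in nats.

\begin{verbatim}
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(0)
K = np.array([[1.0, 0.6],
              [0.6, 2.0]])        # illustrative covariance, zero mean
xy = rng.multivariate_normal([0.0, 0.0], K, size=200_000)

# h(X,Y) = -E[log f(X,Y)], estimated by averaging over the samples (nats)
h_joint = -np.mean(multivariate_normal([0.0, 0.0], K).logpdf(xy))

# h(Y) = -E[log f(Y)] for the Gaussian marginal of Y
h_y = -np.mean(norm(0.0, np.sqrt(K[1, 1])).logpdf(xy[:, 1]))

# Chain rule h(X|Y) = h(X,Y) - h(Y), which follows from the definitions
print("h(X,Y) ~", h_joint, "nats")
print("h(X|Y) ~", h_joint - h_y, "nats")
\end{verbatim}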

An important special case is the following:
\begin{theorem}
Let $X_1,\ldots,X_n$ have a multivariate normal distribution with mean $\mu$ and covariance matrix $K$. Then
\begin{displaymath}
h(X_1,\ldots,X_n) = \frac{1}{2}\log(2\pi e)^n \vert K\vert \text{ bits}.
\end{displaymath}
\end{theorem}
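
As a quick sanity check (not from the notes; the covariance matrix below is an arbitrary positive definite example), the closed form can be compared against a Monte Carlo estimate of $-E[\log_2 f(X)]$:

\begin{verbatim}
import numpy as np
from scipy.stats import multivariate_normal

K = np.array([[2.0, 0.3, 0.0],
              [0.3, 1.0, 0.5],
              [0.0, 0.5, 1.5]])   # illustrative positive definite covariance
n = K.shape[0]

# Closed form from the theorem: (1/2) log2( (2*pi*e)^n * |K| ) bits
closed_form_bits = 0.5 * np.log2((2 * np.pi * np.e) ** n * np.linalg.det(K))

# Monte Carlo estimate of -E[log2 f(X)] using samples from N(0, K)
rng = np.random.default_rng(1)
x = rng.multivariate_normal(np.zeros(n), K, size=200_000)
mc_bits = -np.mean(multivariate_normal(np.zeros(n), K).logpdf(x)) / np.log(2)

print("closed form :", closed_form_bits, "bits")
print("Monte Carlo :", mc_bits, "bits")
\end{verbatim}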

The relative entropy is

 \begin{displaymath}
D(f\|g) = \int f \log \frac{f}{g}.
\end{displaymath}
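
For a concrete instance (a standard computation, added here for illustration), let $f$ and $g$ be the densities of $\mathcal{N}(\mu_1,\sigma^2)$ and $\mathcal{N}(\mu_2,\sigma^2)$. The normalizing constants cancel in the log-ratio, leaving a quadratic whose expectation under $f$ gives

\begin{displaymath}
D(f\|g) = \int f \,\frac{(x-\mu_2)^2-(x-\mu_1)^2}{2\sigma^2}\, dx
= \frac{(\mu_1-\mu_2)^2}{2\sigma^2} \text{ nats}.
\end{displaymath}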

The mutual information is

 \begin{displaymath}
I(X;Y) = \int f(x,y) \log \frac{f(x,y)}{f(x)f(y)} \, dx dy =
D(f(x,y) \| f(x)f(y))
\end{displaymath}
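
As an example (again a standard computation, using the Gaussian entropy formula from the theorem above), let $(X,Y)$ be jointly normal with correlation coefficient $\rho$. Writing $I(X;Y) = h(X) + h(Y) - h(X,Y)$, which follows from the definition, and using $\vert K\vert = \sigma_X^2\sigma_Y^2(1-\rho^2)$,

\begin{displaymath}
I(X;Y) = \frac{1}{2}\log(2\pi e\,\sigma_X^2) + \frac{1}{2}\log(2\pi e\,\sigma_Y^2)
- \frac{1}{2}\log\left[(2\pi e)^2\sigma_X^2\sigma_Y^2(1-\rho^2)\right]
= -\frac{1}{2}\log(1-\rho^2).
\end{displaymath}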

Some properties:

\begin{itemize}
\item $h(X|Y) \le h(X)$, with equality if and only if $X$ and $Y$ are independent (conditioning reduces entropy).
\item Chain rule: $h(X_1,\ldots,X_n) = \sum_{i=1}^n h(X_i|X_1,\ldots,X_{i-1})$.
\item $D(f\|g) \ge 0$, with equality if and only if $f = g$ almost everywhere.
\item $I(X;Y) \ge 0$.
\end{itemize}

An important result is the following:

\begin{theorem}
Let the random vector $\Xbf = (X_1,\ldots,X_n)$ have zero mean and covariance matrix $K$. Then
\begin{displaymath}
h(\Xbf) \le \frac{1}{2}\log(2\pi e)^n \vert K\vert,
\end{displaymath}
with equality if and only if $\Xbf$ is multivariate normal, $\Xbf \sim \mathcal{N}(0,K)$.
\end{theorem}

That is, for a given covariance, the normal (Gaussian) distribution is the one which maximizes the differential entropy.
\begin{proof}
Let $g(\Xbf)$ be a density with the same covariance matrix $K$ (and zero mean), and let $\phi_K(\Xbf)$ denote the multivariate normal density with covariance $K$. Then
\begin{displaymath}
0 \le D(g\|\phi_K) = \int g \log\frac{g}{\phi_K}
= -h(g) - \int g \log \phi_K
= -h(g) - \int \phi_K \log \phi_K
= -h(g) + h(\phi_K),
\end{displaymath}
where we used the fact that $\log\phi_K$ is a quadratic function of its argument (plus a constant) and that both $g$ and $\phi_K$ have the same second moments, so $\int g\log\phi_K = \int \phi_K\log\phi_K$. Hence $h(g) \le h(\phi_K)$.
\end{proof}
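
The theorem is easy to illustrate numerically. The sketch below is not part of the original notes, and the width $a$ is an arbitrary choice; it compares a $\mathrm{Uniform}(0,a)$ density with the Gaussian of equal variance $a^2/12$ and shows that the Gaussian has the larger differential entropy:

\begin{verbatim}
import numpy as np

a = 2.0                         # illustrative width of the uniform density
var = a ** 2 / 12.0             # both densities share this variance

h_uniform = np.log(a)                             # h(Uniform(0, a)) = log a
h_gauss = 0.5 * np.log(2 * np.pi * np.e * var)    # h(N(0, var))

print("h(uniform)  =", h_uniform, "nats")
print("h(gaussian) =", h_gauss, "nats")   # larger, as the theorem guarantees
\end{verbatim}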
