
Differential Entropy


Differential entropy

A continuous random variable (for the purposes of this class) is one whose distribution function $F(x)$ is continuous: there are no jumps, and hence no outcomes with nonzero point mass.


\begin{definition}
The {\bf differential entropy} $h(X)$ of a continuous random variable $X$ with density $f(x)$ and support set $S = \{x : f(x) > 0\}$ is
\begin{displaymath}h(X) = -\int_{S} f(x)\log f(x)\, dx.
\end{displaymath}\end{definition}

\begin{example}
Let $X \sim \Uc(0,a)$. (Uniform). Then
\begin{displaymath}h(X) = -\int_0^a \frac{1}{a}\log\frac{1}{a}\, dx = \log a.
\end{displaymath}
Note that for $a < 1$ we get $h(X) = \log a < 0$: unlike discrete entropy, differential entropy can be negative. This is one reason for the qualifier {\em differential} entropy, since entropy
should always be positive.
\end{example}
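The uniform case is easy to check numerically. The sketch below (an assumption of this write-up, not part of the original notes) approximates $h(X) = -\int f \log f\, dx$ with a Riemann sum; for the uniform density the sum recovers $\log a$, and choosing $a < 1$ exhibits the negative value discussed above.

```python
import math

def diff_entropy_uniform(a, n=100_000):
    """Riemann-sum approximation of h(X) = -int_0^a f(x) ln f(x) dx
    for X ~ U(0, a), where f(x) = 1/a on (0, a). Exact answer: ln(a)."""
    f = 1.0 / a                          # constant density on (0, a)
    dx = a / n
    return -sum(f * math.log(f) * dx for _ in range(n))

print(diff_entropy_uniform(2.0))   # ~ ln 2  (about  0.693 nats)
print(diff_entropy_uniform(0.5))   # ~ ln(1/2): negative differential entropy
```

Because the integrand is constant here, the sum is exact up to floating-point error; the same loop works as a rough quadrature for other densities with bounded support.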

\begin{example}
Normal: $X \sim \phi(x) = \frac{1}{\sigma\sqrt{2\pi}}e^{-x^2/2\sigma^2}$. Computing in nats,
\begin{displaymath}\begin{aligned}
h(X) &= -\int \phi(x) \ln \phi(x)\, dx \\
&= -\int \phi(x)\left[-\frac{x^2}{2\sigma^2} - \ln \sigma\sqrt{2\pi}\right] dx \\
&= \frac{E X^2}{2\sigma^2} + \frac{1}{2}\ln 2\pi\sigma^2 \\
&= \frac{1}{2} + \frac{1}{2}\ln 2\pi\sigma^2 \\
&= \frac{1}{2} \ln 2\pi e \sigma^2\text{ nats}.
\end{aligned}\end{displaymath}\end{example}

Having defined the differential entropy, we can now define continuous analogs of all the quantities we developed in the discrete case.

Copyright 2008, Todd Moon. (2006, May 15). Differential Entropy. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture10.htm. This work is licensed under a Creative Commons License.