
Differential Entropy



Discretization

When the range of a continuous r.v. $X$ with density $f(x)$ is divided into bins of width $\Delta$, then by the mean value theorem there is a value $x_i$ within each bin such that



 \begin{displaymath}
f(x_i)\Delta = \int_{i\Delta}^{(i+1)\Delta} f(x)dx
\end{displaymath}

Let $X^\Delta$ be the quantized r.v. defined by


 \begin{displaymath}
X^\Delta = x_i \text{ with probability } p_i
=\int_{i\Delta}^{(i+1)\Delta} f(x)dx = f(x_i)\Delta.
\end{displaymath}
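
As a concrete illustration (this example is not part of the original notes): if $X$ is uniform on $[0,1]$ and $\Delta = 2^{-n}$, then $f(x) = 1$ on the support and each of the $2^n$ bins receives equal mass,

 \begin{displaymath}
p_i = \int_{i\Delta}^{(i+1)\Delta} 1\,dx = \Delta = 2^{-n}.
\end{displaymath}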

The entropy of the quantized r.v. is

 \begin{displaymath}
H(X^\Delta) = -\sum f(x_i)\Delta \log (f(x_i)\Delta) = -\sum\Delta
f(x_i) \log f(x_i) - \log \Delta,
\end{displaymath}

where the second equality uses $\sum_i f(x_i)\Delta = \int f(x)\,dx = 1$.

As $\Delta \rightarrow 0$, the first sum is a Riemann sum for $-\int f(x)\log f(x)\,dx$, so (provided $f \log f$ is Riemann integrable)

 \begin{displaymath}
H(X^\Delta) + \log \Delta \rightarrow h(f),
\end{displaymath}

where $h(f) = -\int f(x)\log f(x)\,dx$ is the differential entropy of $X$.
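
Continuing the uniform illustration above, the relation holds with equality for every $\Delta$, not just in the limit: with $f(x_i) = 1$ and $2^n$ bins of mass $\Delta$ each,

 \begin{displaymath}
H(X^\Delta) = -\sum_{i=0}^{2^n-1} \Delta \log \Delta = -\log \Delta,
\qquad\text{so}\qquad
H(X^\Delta) + \log \Delta = 0 = h(f).
\end{displaymath}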

Thus the entropy of an $n$-bit quantization of a continuous r.v. increases with $n$: taking $\Delta = 2^{-n}$ and logarithms base 2 gives $H(X^\Delta) \approx h(f) - \log \Delta = h(f) + n$ bits.
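
A minimal numerical sketch of this convergence, assuming a standard normal density with natural logarithms (the function name, the midpoint choice of $x_i$, and the truncation of the support to $[-10, 10]$ are choices made here, not part of the original notes): it quantizes with bin width $\Delta = 2^{-n}$ and checks that $H(X^\Delta) + \log\Delta$ approaches $h(f) = \frac{1}{2}\log(2\pi e) \approx 1.4189$ nats.

    import math

    def quantized_entropy(delta, lo=-10.0, hi=10.0):
        """Entropy (in nats) of the bin-width-delta quantization of N(0,1)."""
        f = lambda x: math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
        h = 0.0
        x = lo
        while x < hi:
            # p_i ~ f(x_i) * delta, with x_i taken at the bin midpoint
            # rather than the exact mean-value point (an approximation)
            p = f(x + delta / 2.0) * delta
            if p > 0.0:
                h -= p * math.log(p)
            x += delta
        return h

    h_f = 0.5 * math.log(2.0 * math.pi * math.e)  # h(f) for N(0,1), in nats
    for n in range(1, 8):
        delta = 2.0 ** (-n)  # n-bit quantization: bin width 2^-n
        H = quantized_entropy(delta)
        print(n, round(H, 4), round(H + math.log(delta), 4), round(h_f, 4))

As $n$ grows, the printed value of $H(X^\Delta) + \log\Delta$ settles near $1.4189$ while $H(X^\Delta)$ itself keeps growing by about $\log 2$ per extra bit, matching the statement above.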
