
Differential Entropy




When the range of a continuous r.v. X with density f is partitioned into bins of width $\Delta$, then by the mean value theorem there is a value $x_i$ in each bin such that

f(x_i)\Delta = \int_{i\Delta}^{(i+1)\Delta} f(x)\,dx.

Let $X^\Delta$ be the quantized r.v. defined by

X^\Delta = x_i \text{ with probability } p_i
= \int_{i\Delta}^{(i+1)\Delta} f(x)\,dx = f(x_i)\Delta.
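As a concrete illustration (not part of the original notes), the quantization can be sketched in Python. The standard exponential density $f(x) = e^{-x}$ and the truncation point `x_max` are illustrative assumptions; the CDF gives the bin probabilities $p_i$ exactly:

```python
import math

# Sketch: quantize a standard exponential r.v. (f(x) = e^{-x}, x >= 0)
# into bins [i*Delta, (i+1)*Delta). The distribution and the truncation
# point x_max are illustrative choices, not from the notes.

def exp_cdf(x):
    """CDF of the standard exponential distribution."""
    return 1.0 - math.exp(-x)

def bin_probs(delta, x_max=50.0):
    """p_i = integral of f over the i-th bin, computed via the CDF."""
    n_bins = int(x_max / delta)
    return [exp_cdf((i + 1) * delta) - exp_cdf(i * delta)
            for i in range(n_bins)]

probs = bin_probs(delta=0.1)
print(sum(probs))  # close to 1: the bins capture almost all the mass
```

Truncating at `x_max = 50` discards only tail mass $e^{-50}$, so the $p_i$ sum to 1 up to numerical precision.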

The entropy of the quantized r.v. is

H(X^\Delta) = -\sum_i f(x_i)\Delta \log (f(x_i)\Delta)
= -\sum_i \Delta f(x_i) \log f(x_i) - \sum_i f(x_i)\Delta \log \Delta
= -\sum_i \Delta f(x_i) \log f(x_i) - \log \Delta,

since $\sum_i f(x_i)\Delta = \int f(x)\,dx = 1$.

If $f(x)\log f(x)$ is Riemann integrable, then in the limit as $\Delta \rightarrow 0$ the first sum above is a Riemann sum converging to an integral, and

H(X^\Delta) + \log \Delta \rightarrow h(f) = -\int f(x)\log f(x)\,dx,

the differential entropy of X.

Thus the entropy of an n-bit quantization of a continuous r.v. is approximately $h(f) + n$ bits: taking $\Delta = 2^{-n}$ gives $H(X^\Delta) \approx h(f) - \log_2 \Delta = h(f) + n$, which increases with n.
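A quick numerical check of the limit can be sketched as follows. The standard exponential is again an illustrative assumption; its differential entropy is exactly 1 nat, so $H(X^\Delta) + \ln \Delta$ (in nats) should approach 1 as $\Delta$ shrinks:

```python
import math

# Sketch: verify H(X^Delta) + log(Delta) -> h(f) for f(x) = e^{-x},
# whose differential entropy is exactly 1 nat. The distribution and
# truncation point x_max are illustrative choices, not from the notes.

def quantized_entropy(delta, x_max=50.0):
    """H(X^Delta) in nats, with bin probabilities taken from the CDF."""
    n_bins = int(x_max / delta)
    h = 0.0
    for i in range(n_bins):
        # p_i = exact probability mass in [i*Delta, (i+1)*Delta)
        p = math.exp(-i * delta) - math.exp(-(i + 1) * delta)
        if p > 0:
            h -= p * math.log(p)
    return h

for delta in (0.5, 0.1, 0.01):
    print(delta, quantized_entropy(delta) + math.log(delta))
# The printed values approach 1.0 (= h(f) in nats) as Delta shrinks
```

Note that $H(X^\Delta)$ itself diverges like $-\log \Delta$; only the shifted quantity converges, which is exactly the point of the derivation above.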

Copyright 2008, by the Contributing Authors. admin. (2006, May 19). Differential Entropy. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare. This work is licensed under a Creative Commons License.