
Differential Entropy


AEP


\begin{theorem}
Let $X_1, \ldots, X_n$ be a sequence of i.i.d.\ random variables drawn according to the density $f(x)$. Then
\begin{displaymath}
-\frac{1}{n} \log f(X_1, X_2, \ldots, X_n) \rightarrow E[-\log f(X)] = h(X),
\end{displaymath}
where the convergence is in probability.
\end{theorem}
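The convergence in the theorem can be observed numerically. The following sketch (my own illustration, not part of the original notes) draws $n$ samples from a standard normal, whose differential entropy is $h(X) = \frac{1}{2}\log_2(2\pi e) \approx 2.05$ bits, and compares the empirical average $-\frac{1}{n}\sum_i \log_2 f(x_i)$ against it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)

# log2 of the N(0,1) density evaluated at each sample:
# log2 f(x) = -0.5*log2(2*pi) - x^2 / (2 ln 2)
log2_f = -0.5 * np.log2(2 * np.pi) - x**2 / (2 * np.log(2))

# Empirical differential entropy: -(1/n) sum log2 f(X_i)
empirical = -np.mean(log2_f)

# True differential entropy of N(0,1), in bits
h = 0.5 * np.log2(2 * np.pi * np.e)

print(f"empirical: {empirical:.4f} bits, h(X): {h:.4f} bits")
```

For large $n$ the two values agree to a few hundredths of a bit, consistent with convergence in probability.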

\begin{definition}
{\bf Typical sets} For any $\epsilon > 0$ and any $n$, the typical set $A_\epsilon^{(n)}$ with respect to $f(x)$ is
\begin{displaymath}
A_\epsilon^{(n)} = \left\{ (x_1, \ldots, x_n) \in S^n : \left\vert -\frac{1}{n} \log f(x_1, \ldots, x_n) - h(X)\right\vert \leq \epsilon \right\},
\end{displaymath}
where $f(x_1, \ldots, x_n) = \prod_{i=1}^n f(x_i)$.
\end{definition}
That is, the typical set consists of the sequences whose empirical differential entropy is within $\epsilon$ of the true differential entropy.

For discrete random variables, we talked about the number of elements in the typical set. For continuous random variables, the analogous concept is the volume of the typical set.
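To make the analogy concrete (this is the standard development, e.g.\ as in Cover and Thomas, and is added here for reference), the volume of a set $A \subseteq \mathbb{R}^n$ is
\begin{displaymath}
\mathrm{Vol}(A) = \int_A dx_1 \, dx_2 \cdots dx_n,
\end{displaymath}
and, with logarithms taken base 2, the typical set satisfies, for $n$ sufficiently large,
\begin{displaymath}
(1 - \epsilon)\, 2^{n(h(X) - \epsilon)} \leq \mathrm{Vol}\left(A_\epsilon^{(n)}\right) \leq 2^{n(h(X) + \epsilon)},
\end{displaymath}
mirroring the $2^{n(H(X) \pm \epsilon)}$ bounds on the number of elements in the discrete typical set.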

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). Differential Entropy. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture10_1.htm. This work is licensed under a Creative Commons License.