Definitions and Basic Facts


Joint entropy

Often we are interested in the entropy of a pair of random variables $(X,Y)$. Another way to think of this is as the entropy of a vector of random variables.

If $X$ and $Y$ are jointly distributed according to $p(x,y)$, then the {\bf joint entropy} is
\begin{displaymath}H(X,Y) = -E\log p(X,Y) = -\sum_{x\in \Xc}\sum_{y\in \Yc} p(x,y) \log p(x,y).
\end{displaymath}
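As a quick numeric sketch of this definition (the pmf values below are my own illustrative choice, not from the notes), the joint entropy is just the entropy of the joint pmf treated as a single distribution:

```python
import math

# A hypothetical joint pmf for (X, Y) on {0,1} x {0,1}; the numbers
# are illustrative assumptions, not from the course notes.
p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

def joint_entropy(pmf):
    """H(X,Y) = -E[log2 p(X,Y)] = -sum_{x,y} p(x,y) log2 p(x,y), in bits."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

print(joint_entropy(p_xy))  # 1.75 bits for this pmf
```

Because all the probabilities here are powers of 2, the result comes out exactly: $0.5\cdot 1 + 0.25\cdot 2 + 2\cdot 0.125\cdot 3 = 1.75$ bits.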

If $(X,Y) \sim p(x,y)$, then the {\bf conditional entropy} $H(Y\vert X)$ is defined as
\begin{displaymath}H(Y\vert X) = \sum_{x\in \Xc} p(x) H(Y\vert X=x) = -E\log p(Y\vert X).
\end{displaymath}

The {\em chain rule} for entropy:
\begin{displaymath}H(X,Y) = H(X) + H(Y\vert X).
\end{displaymath}

Interpretation: the uncertainty (entropy) about both $X$ and $Y$ equals the uncertainty (entropy) we have about $X$, plus whatever uncertainty remains about $Y$ given that we know $X$.
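The chain rule can be checked numerically. A minimal sketch (the joint pmf is again an illustrative assumption): compute $H(X,Y)$ directly, then compute $H(X)$ from the marginal and $H(Y\vert X)$ from its definition as an average of conditional entropies, and compare.

```python
import math
from collections import defaultdict

# A hypothetical joint pmf (illustrative assumption, not from the notes).
p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginal p(x), obtained by summing out y.
p_x = defaultdict(float)
for (x, y), p in p_xy.items():
    p_x[x] += p

# H(Y|X) = sum_x p(x) H(Y | X = x)
h_y_given_x = 0.0
for x0 in p_x:
    cond = [p / p_x[x0] for (x, y), p in p_xy.items() if x == x0]
    h_y_given_x += p_x[x0] * H(cond)

lhs = H(p_xy.values())               # H(X,Y)
rhs = H(p_x.values()) + h_y_given_x  # H(X) + H(Y|X)
print(abs(lhs - rhs) < 1e-12)        # True: the chain rule holds
```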

This proof is very typical of the proofs in this class: write $p(X,Y) = p(X)\,p(Y\vert X)$, take the logarithm,
\begin{displaymath}\log p(X,Y) = \log p(X) + \log p(Y\vert X),
\end{displaymath}and take the expectation of both sides.

We can also have a joint entropy with a conditioning on it, as shown in the following corollary:

\begin{displaymath}H(X,Y\vert Z) = H(X\vert Z) + H(Y\vert X,Z).
\end{displaymath}

The proof is similar to the one above. (This is a good one to work on your own.)
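The conditioned chain rule can also be checked numerically. A short sketch, where the three-variable pmf is a randomly generated illustrative assumption; each conditional entropy is computed straight from its definition, $H(A\vert B) = -E\log p(A\vert B)$:

```python
import math
import random
from collections import defaultdict

# Build a random (hypothetical) joint pmf for (X, Y, Z) on {0,1}^3.
random.seed(0)
points = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
weights = [random.random() for _ in points]
total = sum(weights)
p_xyz = {pt: w / total for pt, w in zip(points, weights)}

def marginal(pmf, keep):
    """Sum out the coordinates not listed in `keep`."""
    out = defaultdict(float)
    for key, p in pmf.items():
        out[tuple(key[i] for i in keep)] += p
    return out

p_z = marginal(p_xyz, (2,))
p_xz = marginal(p_xyz, (0, 2))

# Conditional entropies straight from the definitions (in bits):
# H(X,Y|Z)  uses p(x,y|z)  = p(x,y,z)/p(z)
# H(X|Z)    uses p(x|z)    = p(x,z)/p(z)
# H(Y|X,Z)  uses p(y|x,z)  = p(x,y,z)/p(x,z)
h_xy_given_z = -sum(p * math.log2(p / p_z[(z,)]) for (x, y, z), p in p_xyz.items())
h_x_given_z = -sum(p * math.log2(p_xz[(x, z)] / p_z[(z,)]) for (x, y, z), p in p_xyz.items())
h_y_given_xz = -sum(p * math.log2(p / p_xz[(x, z)]) for (x, y, z), p in p_xyz.items())

print(abs(h_xy_given_z - (h_x_given_z + h_y_given_xz)) < 1e-12)  # True
```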

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). Definitions and Basic Facts. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare Web site. This work is licensed under a Creative Commons License.