Definitions and Basic Facts


Joint entropy

Often we are interested in the entropy of a pair of random variables $(X,Y)$. Another way of thinking of the pair is as a single vector-valued random variable.

\begin{definition}
If $X$ and $Y$ are jointly distributed according to $p(X,Y)$, then the {\bf joint entropy} $H(X,Y)$ is
\begin{displaymath}H(X,Y) = -\sum_{x\in \Xc}\sum_{y\in \Yc} p(x,y) \log p(x,y)
\end{displaymath}or
\begin{displaymath}H(X,Y) = -E\log p(X,Y)
\end{displaymath}\end{definition}
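
To make the definition concrete, here is a small worked example (all logarithms base 2, with the usual convention $0\log 0 = 0$). Let $\Xc = \Yc = \{1,2\}$ with $p(1,1) = \tfrac{1}{2}$, $p(1,2) = \tfrac{1}{4}$, $p(2,1) = 0$, and $p(2,2) = \tfrac{1}{4}$. Then
\begin{displaymath}H(X,Y) = -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{4}\log\tfrac{1}{4} - \tfrac{1}{4}\log\tfrac{1}{4} = \tfrac{1}{2} + \tfrac{1}{2} + \tfrac{1}{2} = 1.5 \mbox{ bits}.
\end{displaymath}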


\begin{definition}
If $(X,Y) \sim p(x,y)$, then the {\bf conditional entropy} $H(Y\vert X)$ is defined as
\begin{displaymath}\begin{aligned}
H(Y\vert X) &= -\sum_{x\in \Xc}\sum_{y\in \Yc} p(x,y) \log p(y\vert x) \\
&= -E\log p(Y\vert X) \\
&= \sum_{x\in \Xc} p(x) H(Y\vert X=x)
\end{aligned}\end{displaymath}\end{definition}
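
Continuing the example above: the marginal of $X$ is $p_X(1) = \tfrac{3}{4}$ and $p_X(2) = \tfrac{1}{4}$. Given $X=1$, $Y$ equals $1$ or $2$ with probabilities $\tfrac{2}{3}$ and $\tfrac{1}{3}$; given $X=2$, $Y=2$ with probability $1$, so $H(Y\vert X=2) = 0$. Therefore
\begin{displaymath}H(Y\vert X) = \tfrac{3}{4}\,H(Y\vert X=1) + \tfrac{1}{4}\,H(Y\vert X=2) = \tfrac{3}{4}\left(\tfrac{2}{3}\log\tfrac{3}{2} + \tfrac{1}{3}\log 3\right) \approx 0.689 \mbox{ bits}.
\end{displaymath}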


\begin{theorem}
({\em chain rule})
\begin{displaymath}H(X,Y) = H(X) + H(Y\vert X)
\end{displaymath}\end{theorem}

Interpretation: The uncertainty (entropy) about the pair $(X,Y)$ equals the uncertainty we have about $X$, plus whatever uncertainty remains about $Y$ once we know $X$.


\begin{proof}
This proof is very typical of the proofs in this class: it consists of writing out the definitions and manipulating logarithms. Since $p(X,Y) = p(X)p(Y\vert X)$, write
\begin{displaymath}\log p(X,Y) = \log p(X) + \log p(Y\vert X)
\end{displaymath}and take the expectation of both sides; negating gives $H(X,Y) = H(X) + H(Y\vert X)$.
\end{proof}
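
A quick numerical check of the chain rule on the worked example above (a minimal sketch in Python/NumPy; the array layout and the helper function name are just illustrative choices):

\begin{verbatim}
import numpy as np

def entropy(p):
    # Entropy in bits of a probability array; 0 log 0 is treated as 0.
    p = np.asarray(p, dtype=float).ravel()
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

# Joint pmf from the worked example: rows index x, columns index y.
p_xy = np.array([[0.5, 0.25],
                 [0.0, 0.25]])

H_XY = entropy(p_xy)            # H(X,Y) = 1.5 bits
p_x = p_xy.sum(axis=1)          # marginal of X
H_X = entropy(p_x)              # H(X), about 0.811 bits

# H(Y|X) = sum_x p(x) H(Y | X = x), computed row by row
H_Y_given_X = sum(px * entropy(row / px)
                  for px, row in zip(p_x, p_xy) if px > 0)

assert np.isclose(H_XY, H_X + H_Y_given_X)   # chain rule holds
\end{verbatim}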

We can also condition a joint entropy on another random variable, as shown in the following corollary:

\begin{corollary}
\begin{displaymath}H(X,Y\vert Z) = H(X\vert Z) + H(Y\vert X,Z)
\end{displaymath}\end{corollary}

The proof is similar to the one above. (This is a good one to work on your own.)
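
For example, combining the chain rule with this corollary gives the three-variable expansion
\begin{displaymath}H(X,Y,Z) = H(Z) + H(X\vert Z) + H(Y\vert X,Z).
\end{displaymath}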

Copyright 2008, by the Contributing Authors. Source: USU OpenCourseWare, http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture2_2.htm. This work is licensed under a Creative Commons License.