
# Definitions and Basic Facts


## Joint entropy

Often we are interested in the entropy of a pair of random variables ( X , Y ). Equivalently, we can think of ( X , Y ) as a vector of random variables.
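The equation image from the original page did not survive extraction. The standard definition of joint entropy, consistent with the surrounding discussion, is:

```latex
H(X, Y) = -\sum_{x}\sum_{y} p(x, y) \log p(x, y)
```

where p(x, y) is the joint probability mass function of X and Y, and the sum runs over all pairs with p(x, y) > 0.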

Interpretation: The uncertainty (entropy) about both X and Y equals the uncertainty (entropy) we have about X , plus whatever uncertainty remains about Y once X is known.
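The formula this interpretation describes is the chain rule for entropy (the original equation image is missing; this is its standard statement):

```latex
H(X, Y) = H(X) + H(Y \mid X),
\qquad\text{where}\qquad
H(Y \mid X) = \sum_{x} p(x)\, H(Y \mid X = x)
```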

We can also condition a joint entropy on a third variable, as shown in the following corollary:
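The corollary's equation image is also missing; the standard conditional form of the chain rule is:

```latex
H(X, Y \mid Z) = H(X \mid Z) + H(Y \mid X, Z)
```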

The proof is similar to the one above. (This is a good exercise to work through on your own.)
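As a quick numerical sanity check of the chain rule H(X, Y) = H(X) + H(Y | X), here is a short sketch. The joint distribution below is made up purely for illustration:

```python
import math

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}
# (values chosen only for illustration).
p_xy = {
    (0, 0): 0.5,
    (0, 1): 0.25,
    (1, 0): 0.125,
    (1, 1): 0.125,
}

def entropy(probs):
    """Shannon entropy, in bits, of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint entropy H(X, Y): entropy of the joint pmf.
H_XY = entropy(p_xy.values())

# Marginal p(x), obtained by summing the joint pmf over y.
p_x = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
H_X = entropy(p_x.values())

# Conditional entropy H(Y | X) = sum over x of p(x) * H(Y | X = x).
H_Y_given_X = 0.0
for x, px in p_x.items():
    conditional = [p / px for (xx, y), p in p_xy.items() if xx == x]
    H_Y_given_X += px * entropy(conditional)

# Chain rule: the two sides should agree.
print(f"H(X,Y)          = {H_XY:.4f} bits")
print(f"H(X) + H(Y|X)   = {H_X + H_Y_given_X:.4f} bits")
```

For this distribution both sides come out to 1.75 bits, as the chain rule requires.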

Copyright 2008, by the Contributing Authors. Cite/attribute resource: admin. (2006, May 17). Definitions and Basic Facts. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture2_2.htm. This work is licensed under a Creative Commons License.