# Definitions and Basic Facts


## Joint entropy

Often we are interested in the entropy of a pair of random variables
(*X*, *Y*). Another way of thinking of this is as a vector of random
variables. The joint entropy is defined as

$$H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y).$$

The joint entropy satisfies the chain rule,

$$H(X, Y) = H(X) + H(Y \mid X).$$

Interpretation: The uncertainty (entropy) about both *X* and *Y* is
equal to the uncertainty (entropy) we have about *X*, plus whatever
uncertainty we have about *Y*, given that we know *X*.
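As a quick numerical check, here is a short sketch that computes the joint, marginal, and conditional entropies for a small distribution and verifies the chain rule. The distribution `p_xy` is an arbitrary illustrative choice, not one from the text.

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over binary X and Y
# (rows index x, columns index y); values chosen for illustration.
p_xy = np.array([[0.5, 0.25],
                 [0.125, 0.125]])

def entropy(p):
    """Shannon entropy in bits of a probability array, ignoring zeros."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint entropy H(X, Y)
H_xy = entropy(p_xy)

# Marginal entropy H(X), from the row sums p(x)
p_x = p_xy.sum(axis=1)
H_x = entropy(p_x)

# Conditional entropy H(Y | X) = sum_x p(x) * H(Y | X = x)
H_y_given_x = sum(px * entropy(row / px) for px, row in zip(p_x, p_xy))

# Chain rule: H(X, Y) = H(X) + H(Y | X)
assert np.isclose(H_xy, H_x + H_y_given_x)
print(H_xy)  # 1.75 bits for this (dyadic) distribution
```

Because every probability here is a power of 1/2, the joint entropy comes out to exactly 1.75 bits, which makes the identity easy to verify by hand as well.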

We can also have a joint entropy with a conditioning on it, as shown
in the following corollary:

**Corollary.** $H(X, Y \mid Z) = H(X \mid Z) + H(Y \mid X, Z)$.

The proof is similar to the one above. (This is a good one to work
through on your own.)
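The conditioned chain rule can also be checked numerically. The sketch below draws an arbitrary joint distribution over three binary variables (a hypothetical example, not from the text) and verifies that conditioning on *Z* preserves the chain rule, using the identity $H(A \mid B) = H(A, B) - H(B)$.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array, ignoring zeros."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution p(x, y, z) over binary X, Y, Z;
# the values are an arbitrary (seeded) illustrative choice.
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()

p_z = p.sum(axis=(0, 1))   # marginal p(z)
p_xz = p.sum(axis=1)       # marginal p(x, z)

# Each conditional entropy via H(A | B) = H(A, B) - H(B):
H_xy_given_z = entropy(p) - entropy(p_z)      # H(X, Y | Z)
H_x_given_z = entropy(p_xz) - entropy(p_z)    # H(X | Z)
H_y_given_xz = entropy(p) - entropy(p_xz)     # H(Y | X, Z)

# Conditioned chain rule: H(X, Y | Z) = H(X | Z) + H(Y | X, Z)
assert np.isclose(H_xy_given_z, H_x_given_z + H_y_given_xz)
```

Note that with these definitions the identity holds exactly (up to floating point) for *any* joint distribution, which mirrors the fact that the proof is the same calculation as in the unconditioned case.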