# Some More Bounds

Log Sum Inequality :: Data Processing Inequality :: Fano's Inequality

## Fano's Inequality

A fundamental operation in communications is estimating a value based on some measurement. That is, suppose a value *X* is sent through a channel (where it is corrupted) and a value *Y* is received. Based on that received value we want to determine an estimate of *X* by performing some function on the observed value *Y*. Denote the estimate of *X* by *X̂* = *g*(*Y*), where *g* is the estimator.
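This setup can be sketched in code. The following is a minimal simulation assuming a binary symmetric channel with a hypothetical crossover probability and the identity estimator; these specific choices are illustrative assumptions, not from the text:

```python
import random

random.seed(0)

# Illustrative setup (not from the text): X is a uniform bit sent through
# a binary symmetric channel with crossover probability p, Y is received,
# and the estimator g produces X-hat from Y.
p = 0.1  # hypothetical crossover probability

def channel(x, p):
    """Flip the transmitted bit with probability p."""
    return x ^ 1 if random.random() < p else x

def g(y):
    """Estimator: for a BSC with p < 1/2, the best guess is Y itself."""
    return y

n = 100_000
errors = 0
for _ in range(n):
    x = random.randint(0, 1)   # source bit X
    y = channel(x, p)          # corrupted observation Y
    x_hat = g(y)               # estimate of X from Y
    errors += (x_hat != x)

print(f"empirical error probability: {errors / n:.3f}")  # close to p
```

With this decoder the empirical error rate tracks the crossover probability *p*, which is the quantity Fano's inequality will relate to *H*(*X*|*Y*).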

A question of performance now arises naturally: what is the probability that we have estimated the correct value of *X*? This can be explored in a variety of ways. One that will be fruitful to us in this class is Fano's inequality, which relates the probability of error to the conditional entropy *H*(*X*|*Y*). Intuitively, if there is little uncertainty about *X* when we know *Y*, then the probability of error should be small. In fact, when *H*(*X*|*Y*) = 0, the probability of error should be zero: there is no uncertainty left over after we observe *Y*. Fano's inequality makes a quantitative statement to this effect.

Let *P*<sub>e</sub> = Pr{*X̂* ≠ *X*} be the probability that the estimate *X̂* differs from *X*. Fano's inequality states that

*H*(*P*<sub>e</sub>) + *P*<sub>e</sub> log(|𝒳| − 1) ≥ *H*(*X*|*Y*),

where |𝒳| is the size of the alphabet of *X* and *H*(*P*<sub>e</sub>) is the binary entropy of *P*<sub>e</sub>. Note that if *P*<sub>e</sub> = 0, then the left-hand side vanishes, so *H*(*X*|*Y*) = 0.
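A quick numerical sanity check of the bound is possible for small alphabets. The sketch below draws random joint distributions and uses the MAP estimate of *X* from *Y*; the alphabet sizes and distributions are illustrative choices, not from the text:

```python
import random
from math import log2

random.seed(1)

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Check H(P_e) + P_e * log2(|X| - 1) >= H(X|Y) on random joints,
# using the MAP estimator x_hat(y) = argmax_x p(x | y).
nx, ny = 4, 3  # illustrative alphabet sizes
for _ in range(1000):
    w = [[random.random() for _ in range(ny)] for _ in range(nx)]
    s = sum(map(sum, w))
    p = [[wxy / s for wxy in row] for row in w]   # joint p(x, y)

    py = [sum(p[x][y] for x in range(nx)) for y in range(ny)]
    # Conditional entropy H(X|Y)
    hxy = -sum(p[x][y] * log2(p[x][y] / py[y])
               for x in range(nx) for y in range(ny) if p[x][y] > 0)
    # Error probability P_e under the MAP estimate
    pe = sum(p[x][y] for y in range(ny) for x in range(nx)
             if x != max(range(nx), key=lambda x_: p[x_][y]))
    assert h2(pe) + pe * log2(nx - 1) >= hxy - 1e-9

print("Fano's bound held in all trials")
```

Fano's inequality holds for any estimator, so the MAP choice here is just one convenient instance; swapping in a worse estimator increases *P*<sub>e</sub> and only loosens the left-hand side further.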