Some More Bounds


Fano's Inequality

A fundamental operation in communications is estimating a value based on some measurement. That is, suppose a value X is sent through a channel (where it is corrupted) and a value Y is received. Based on the received value we want to form an estimate of X by applying some function to the observation Y. Denote the estimate of X by $\hat{X}$:

\begin{displaymath}\hat{X} = g(Y).\end{displaymath}

A question of performance now arises naturally: what is the probability that we have estimated the correct value of X? This can be explored in a variety of ways. One that will be fruitful to us in this class is Fano's inequality, which relates the probability of error to the conditional entropy H(X|Y). Intuitively, if there is little uncertainty about X when we know Y, then the probability of error should be small. In fact, when H(X|Y) = 0 the probability of error should be zero: there is no uncertainty left after we observe Y. Fano's inequality makes a quantitative statement to this effect.


\begin{displaymath}P_e = \text{probability of error} = \text{Pr}\{\hat{X} \neq X\}.\end{displaymath}

(Fano's inequality)
\begin{displaymath}\boxed{H(P_e) + P_e \log(\vert\Xc\vert - 1) \geq H(X\vert Y)}\end{displaymath}
A weaker but often more convenient form is
\begin{displaymath}1 + P_e \log(\vert\Xc\vert) \geq H(X\vert Y).\end{displaymath}

Note that if $P_e = 0$ then $H(X\vert Y) = 0$.
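As a numerical illustration of the bound, consider a binary symmetric channel with a hypothetical crossover probability p = 0.1 and a uniform input; the channel and parameter value are assumed here for the sake of the example, not taken from the text. With the estimator g(Y) = Y we have P_e = p and, by symmetry, H(X|Y) = H(p), so Fano's inequality holds with equality in this case:

```python
import math

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1            # assumed BSC crossover probability for this example
# X uniform on {0,1}; the channel flips X with probability p to give Y.
# The estimator g(Y) = Y gives P_e = p.
P_e = p
# By symmetry, H(X|Y) = H(p) for a BSC with uniform input.
H_X_given_Y = h2(p)
alphabet_size = 2

# Fano's bound: H(P_e) + P_e * log(|X| - 1) >= H(X|Y)
fano_bound = h2(P_e) + P_e * math.log2(alphabet_size - 1)
# Weaker bound: 1 + P_e * log(|X|) >= H(X|Y)
weak_bound = 1 + P_e * math.log2(alphabet_size)

print(fano_bound >= H_X_given_Y - 1e-12)   # True (equality here, since |X| - 1 = 1)
print(weak_bound >= H_X_given_Y)           # True
```

For a binary alphabet the $P_e \log(\vert\Xc\vert - 1)$ term vanishes, so the bound reduces to $H(P_e) \geq H(X\vert Y)$, which this channel meets with equality.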

Define the random variable $E$ by
\begin{displaymath}E = \begin{cases}1 & \text{if } \hat{X} \neq X \\ 0 & \text{if } \hat{X} = X.\end{cases}\end{displaymath}
Expanding $H(E,X\vert Y)$ two ways by the chain rule,
\begin{displaymath}H(E,X\vert Y) = H(X\vert Y) + H(E\vert X,Y) = H(E\vert Y) + H(X\vert E,Y).\end{displaymath}
Since $E$ is determined by $X$ and $\hat{X} = g(Y)$, we have $H(E\vert X,Y) = 0$. Also, $H(E\vert Y) \leq H(E) = H(P_e)$, and
\begin{displaymath}H(X\vert E,Y) \leq P_e \log(\vert\Xc\vert - 1) \quad \text{(how many ways to make an error)},\end{displaymath}
since given $E = 1$ and $Y$, the value $X$ can take at most $\vert\Xc\vert - 1$ values. Combining these gives Fano's inequality.
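Since Fano's inequality holds for any estimator, it can also be checked numerically from a joint distribution. The sketch below is an assumption-laden illustration (the helper `fano_check` and the random ternary joint pmf are invented for this example): it computes H(X|Y) and the error probability of the MAP estimator and compares them against the bound.

```python
import math
import random

def fano_check(p_xy):
    """Given a joint pmf p_xy[x][y], return (Fano bound, H(X|Y)) in bits
    for the MAP estimator g(y) = argmax_x p(x, y)."""
    nx, ny = len(p_xy), len(p_xy[0])
    p_y = [sum(p_xy[x][y] for x in range(nx)) for y in range(ny)]
    g = [max(range(nx), key=lambda x: p_xy[x][y]) for y in range(ny)]
    # Probability of error: mass where the MAP guess differs from X.
    P_e = sum(p_xy[x][y] for x in range(nx) for y in range(ny) if x != g[y])
    # H(X|Y) = -sum_{x,y} p(x,y) log p(x|y)
    H = 0.0
    for y in range(ny):
        if p_y[y] > 0:
            for x in range(nx):
                if p_xy[x][y] > 0:
                    H -= p_xy[x][y] * math.log2(p_xy[x][y] / p_y[y])
    h_pe = (0.0 if P_e in (0.0, 1.0)
            else -P_e * math.log2(P_e) - (1 - P_e) * math.log2(1 - P_e))
    return h_pe + P_e * math.log2(nx - 1), H

# A random joint pmf on a ternary alphabet (assumed example data).
random.seed(0)
w = [[random.random() for _ in range(3)] for _ in range(3)]
s = sum(map(sum, w))
p_xy = [[v / s for v in row] for row in w]

bound, H = fano_check(p_xy)
print(bound >= H - 1e-9)   # True: Fano's bound holds
```

The MAP estimator minimizes P_e, but the check would pass for any choice of g, since the derivation above never used optimality of the estimator.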

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). Some More Bounds. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site. This work is licensed under a Creative Commons License.