
# Some More Bounds


Log Sum Inequality   ::   Data Processing Inequality   ::   Fano's Inequality

## Fano's Inequality

A fundamental operation in communications is estimating a value from a measurement. That is, suppose a value X is sent through a channel (where it may be corrupted) and a value Y is received. Based on that received value we want to form an estimate of X by applying some function g to the observed value Y. Denote the estimate of X by X̂ = g(Y).

A question of performance now arises naturally: what is the probability that we have estimated the correct value of X? This can be explored in a variety of ways. One of the ways that will be fruitful to us in this class is Fano's inequality, which relates the probability of error to the conditional entropy H(X|Y). Intuitively, if there is little uncertainty about X when we know Y, then the probability of error should be small. In fact, when H(X|Y) = 0, the probability of error should be zero: there is no uncertainty left over after we observe Y. Fano's inequality makes a quantitative statement to this effect.

Let P_e = Pr(X̂ ≠ X) be the probability of estimation error. Fano's inequality states that

H(P_e) + P_e log(|𝒳| − 1) ≥ H(X|Y),

where 𝒳 is the alphabet of X and H(P_e) denotes the binary entropy of P_e. Note that if P_e = 0 then H(X|Y) = 0: the left-hand side vanishes, forcing the (nonnegative) conditional entropy to zero.
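The inequality is easy to check numerically. The sketch below (a toy example; the channel, the alphabet sizes, and the input distribution are all invented for illustration) builds a small joint distribution p(x, y), computes H(X|Y), forms the maximum a posteriori estimate X̂(y), and verifies that Fano's bound holds for the resulting error probability:

```python
import numpy as np

# Hypothetical toy setup: X in {0, 1, 2} is sent through a noisy channel
# that delivers it intact with probability 0.8 and flips it to each of the
# other two symbols with probability 0.1.
px = np.array([0.5, 0.3, 0.2])            # input distribution p(x)
channel = np.array([[0.8, 0.1, 0.1],
                    [0.1, 0.8, 0.1],
                    [0.1, 0.1, 0.8]])      # channel[x, y] = p(y | x)
pxy = px[:, None] * channel                # joint p(x, y)
py = pxy.sum(axis=0)                       # output distribution p(y)

# Conditional entropy H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y)
p_x_given_y = pxy / py                     # column y holds p(x | y)
H_X_given_Y = -np.sum(pxy * np.log2(p_x_given_y))

# MAP estimator: for each observed y, guess the most probable x.
xhat = np.argmax(p_x_given_y, axis=0)      # xhat[y] = argmax_x p(x | y)
Pe = 1.0 - sum(pxy[xhat[y], y] for y in range(len(py)))

def h2(p):
    """Binary entropy in bits (0 at the endpoints)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Fano's inequality: H(Pe) + Pe * log2(|X| - 1) >= H(X|Y)
lhs = h2(Pe) + Pe * np.log2(len(px) - 1)
print(f"Pe = {Pe:.4f}, H(X|Y) = {H_X_given_Y:.4f}, Fano bound = {lhs:.4f}")
assert lhs >= H_X_given_Y
```

No estimator can do better than the MAP rule used here, so the bound constrains every possible choice of g.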

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). Some More Bounds. Retrieved January 7, 2011, from USU OpenCourseWare: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture3_3.htm. This work is licensed under a Creative Commons License.