
# More On Channel Capacity


## The Converse To The Coding Theorem

We will begin with a special case, the zero-error channel, and show that zero error requires that the rate be less than the capacity. A zero-error channel is one in which the output sequence $Y^n$ determines the input message $W$ without any error, so $H(W \mid Y^n) = 0$. We can obtain a bound by assuming that $W$ is uniformly distributed over the $2^{nR}$ input messages, so $H(W) = nR$. Now

$$
nR = H(W) = H(W \mid Y^n) + I(W; Y^n) = I(W; Y^n) \le I(X^n; Y^n) \le nC,
$$

from which we conclude that for a zero-error channel, $R \le C$.
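The zero-error case can be checked numerically on a toy channel (an illustrative sketch, not part of the original notes): take $nR = 2$, so $W$ is uniform over four messages, and let the channel be the identity map, so that $Y$ determines $W$ exactly.

```python
import math

# Toy zero-error channel (illustrative assumption): 2^(nR) = 4 messages,
# W uniform, and Y = W deterministically, so H(W|Y) = 0 and I(W;Y) = H(W).
M = 4
p_w = [1.0 / M] * M                                   # uniform prior on messages
channel = [[1.0 if y == w else 0.0 for y in range(M)] for w in range(M)]

p_wy = [[p_w[w] * channel[w][y] for y in range(M)] for w in range(M)]  # joint
p_y = [sum(p_wy[w][y] for w in range(M)) for y in range(M)]            # output marginal

def H(probs):
    """Entropy in bits, skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_W = H(p_w)                                          # = nR = 2 bits
H_W_given_Y = sum(p_y[y] * H([p_wy[w][y] / p_y[y] for w in range(M)])
                  for y in range(M) if p_y[y] > 0)    # = 0 for a zero-error channel
I_WY = H_W - H_W_given_Y                              # = H(W) = nR
```

With zero error, all of $H(W) = nR$ shows up as mutual information, which is what forces $nR \le nC$.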

Recall that Fano's inequality relates the probability of error to the conditional entropy. In the notation of the current context, with $P_e^{(n)} = \Pr\{\hat{W} \ne W\}$, it can be written as

$$
H(W \mid Y^n) \le 1 + P_e^{(n)} nR.
$$
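As a quick numerical sanity check (a sketch, not from the original notes), Fano's inequality in the form $H(W \mid Y^n) \le 1 + P_e^{(n)} nR$ can be verified for a single use of a binary symmetric channel, where the decoder $\hat{W} = Y$ gives $P_e^{(n)} = p$ and $H(W \mid Y) = H_b(p)$:

```python
import math

def h2(p):
    """Binary entropy H_b(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# One use (n = 1) of a BSC with crossover probability p: W uniform on {0, 1},
# decoder W_hat = Y, so P_e = p, H(W|Y) = h2(p), and nR = log2(2) = 1.
for p in [0.0, 0.05, 0.11, 0.25, 0.5]:
    lhs = h2(p)          # H(W | Y^n)
    rhs = 1 + p * 1      # Fano bound: 1 + P_e * nR
    assert lhs <= rhs + 1e-12, (p, lhs, rhs)
```

The bound is loose in this form; the tighter form of Fano, $H(W \mid Y) \le H_b(P_e) + P_e \log(|\mathcal{W}| - 1)$, in fact holds with equality for this decoder since $|\mathcal{W}| = 2$.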
We also observe (and could prove) that

$$
I(X^n; Y^n) \le nC.
$$

(Using the channel $n$ times, the capacity per transmission is not increased.)
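The bound $I(X^n; Y^n) \le nC$ can be illustrated numerically (a sketch with assumed numbers, not from the original notes): for $n = 2$ uses of a memoryless binary symmetric channel, any input distribution on the block, including a correlated one, yields mutual information at most $2C$.

```python
import math
from itertools import product

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1                      # assumed BSC crossover probability
C = 1 - h2(p)                # BSC capacity per channel use

# A correlated input distribution on two channel uses (chosen for illustration).
p_x = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def chan(y, x):
    """P(y | x) for two independent BSC uses (memoryless channel)."""
    return math.prod(p if yi != xi else 1 - p for yi, xi in zip(y, x))

p_y = {y: sum(px * chan(y, x) for x, px in p_x.items())
       for y in product((0, 1), repeat=2)}

H_Y = -sum(q * math.log2(q) for q in p_y.values() if q > 0)
H_Y_given_X = 2 * h2(p)      # channel acts independently on each use
I = H_Y - H_Y_given_X        # I(X^2; Y^2)

assert I <= 2 * C            # per-block mutual information never exceeds nC
```

Equality in $I(X^n; Y^n) \le nC$ requires i.i.d. capacity-achieving inputs; the correlated input here falls strictly below $2C$.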

We now have the tools necessary to prove the converse to the coding theorem: any sequence of $(2^{nR}, n)$ codes with $P_e^{(n)} \to 0$ must have $R \le C$.
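The proof follows the standard chain of inequalities (as in Cover and Thomas), combining Fano's inequality with the bound on block mutual information:

$$
nR = H(W) = H(W \mid Y^n) + I(W; Y^n) \le 1 + P_e^{(n)} nR + nC.
$$

Dividing by $n$ gives $R \le P_e^{(n)} R + \tfrac{1}{n} + C$, so letting $n \to \infty$ with $P_e^{(n)} \to 0$ forces $R \le C$.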

Looking ahead:

- Hamming codes?
- Feedback channels: the same capacity!

Copyright 2008, Todd Moon. admin. (2006, May 15). *More On Channel Capacity*. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture9_1.htm. This work is licensed under a Creative Commons License.