# More On Channel Capacity

Converse :: Source/Channel

## The Converse To The Coding Theorem

We will begin with a special case, the zero-error channel, and show that zero-error transmission requires the rate to be less than the capacity. A zero-error channel is one in which the output sequence $Y^n$ determines the input symbol $W$ without any error, so $H(W \mid Y^n) = 0$. We can obtain a bound by assuming that $W$ is uniformly distributed over the $2^{nR}$ input symbols, so $H(W) = nR$. Now

$$nR = H(W) = H(W \mid Y^n) + I(W; Y^n) = I(W; Y^n) \le I(X^n; Y^n) \le nC,$$

from which we conclude that for a zero-error channel, $R \le C$.
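The zero-error case can be checked numerically. The following sketch uses a hypothetical setup (not from the notes): $n = 2$ uses of a noiseless binary channel, so $W$ is uniform over $2^{nR} = 4$ messages at rate $R = 1$, and $Y^n = X^n$ determines $W$ exactly.

```python
# Sanity check of the zero-error argument on a tiny noiseless channel.
from collections import Counter
from math import log2

messages = [0, 1, 2, 3]                  # W uniform over 2^{nR} = 4 symbols
encode = {0: (0, 0), 1: (0, 1), 2: (1, 0), 3: (1, 1)}

# Noiseless channel: Y^n = X^n, so Y^n determines W without error.
joint = Counter()
for w in messages:
    y = encode[w]                        # received sequence equals codeword
    joint[(w, y)] += 1 / len(messages)

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

p_w, p_y = Counter(), Counter()
for (w, y), pr in joint.items():
    p_w[w] += pr
    p_y[y] += pr

H_W = entropy(p_w)                       # H(W) = nR = 2 bits
H_W_given_Y = entropy(joint) - entropy(p_y)   # H(W|Y^n) = 0 for zero error
I_WY = H_W - H_W_given_Y                 # I(W;Y^n) = H(W) = nR

print(H_W, H_W_given_Y, I_WY)            # → 2.0 0.0 2.0
```

As expected, $I(W; Y^n)$ carries the full $nR = 2$ bits, which is only possible if the channel's capacity permits it.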

Recall that Fano's inequality relates the probability of error to the conditional entropy. In the notation of the current context, it can be written as

$$H(W \mid Y^n) \le 1 + P_e^{(n)}\, nR.$$
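Fano's inequality can be spot-checked on a hypothetical one-use binary symmetric channel ($n = 1$, $R = 1$, so $nR = 1$): with $W$ uniform on $\{0, 1\}$ and crossover probability $p \le 1/2$, the optimal decoder guesses $\hat{W} = Y$, giving $P_e = p$ and $H(W \mid Y) = H(p)$.

```python
# Numerical check of H(W|Y^n) <= 1 + P_e^{(n)} * nR on a BSC, n = R = 1.
from math import log2

def h2(p):
    """Binary entropy H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

for p in [0.0, 0.05, 0.11, 0.25, 0.5]:
    H_W_given_Y = h2(p)          # conditional entropy after one channel use
    fano_bound = 1 + p * 1 * 1   # 1 + P_e * n * R
    assert H_W_given_Y <= fano_bound
```

The bound is loose here (since $H(p) \le 1$ always), but in the converse it is exactly what is needed: it forces $H(W \mid Y^n)$ to vanish per symbol as $P_e^{(n)} \to 0$.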
We also observe (and could prove) that

$$I(X^n; Y^n) \le nC.$$

(Using the channel $n$ times does not increase the capacity per transmission.)
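This bound can be verified numerically for a memoryless channel. The sketch below uses hypothetical parameters (not from the notes): $n = 2$ uses of a BSC with crossover $p = 0.1$, capacity $C = 1 - H(p)$ per use. Uniform i.i.d. inputs meet the bound with equality; any other input distribution falls strictly below $nC$.

```python
# Check I(X^n;Y^n) <= nC for n memoryless uses of a BSC.
from itertools import product
from math import log2

n, p = 2, 0.1
C = 1 - (-p * log2(p) - (1 - p) * log2(1 - p))   # BSC capacity 1 - H(p)

def mutual_information(px_n):
    """I(X^n;Y^n) for the memoryless BSC with input distribution px_n."""
    joint = {}
    for x in product([0, 1], repeat=n):
        for y in product([0, 1], repeat=n):
            flips = sum(a != b for a, b in zip(x, y))
            joint[(x, y)] = px_n[x] * p**flips * (1 - p)**(n - flips)
    py = {}
    for (x, y), pr in joint.items():
        py[y] = py.get(y, 0) + pr
    return sum(pr * log2(pr / (px_n[x] * py[y]))
               for (x, y), pr in joint.items() if pr > 0)

uniform = {x: 0.25 for x in product([0, 1], repeat=n)}
I_uniform = mutual_information(uniform)           # achieves nC
skewed = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
I_skewed = mutual_information(skewed)             # strictly below nC

assert abs(I_uniform - n * C) < 1e-9
assert I_skewed < n * C
```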

We now have the tools necessary to prove the converse to the coding theorem: any sequence of $(2^{nR}, n)$ codes with $P_e^{(n)} \to 0$ must have $R \le C$.
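The chain of inequalities behind the converse combines the two facts above; this is the standard argument:

```latex
\begin{align*}
nR &= H(W) && \text{($W$ uniform over $2^{nR}$ messages)} \\
   &= H(W \mid Y^n) + I(W; Y^n) \\
   &\le 1 + P_e^{(n)} nR + I(X^n; Y^n) && \text{(Fano; data processing)} \\
   &\le 1 + P_e^{(n)} nR + nC.
\end{align*}
```

Dividing by $n$ gives $R \le P_e^{(n)} R + \tfrac{1}{n} + C$; letting $n \to \infty$ with $P_e^{(n)} \to 0$ yields $R \le C$.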

Hamming codes?

Feedback channels: the same capacity!