Channel Capacity


Channel Capacity Definition and Examples

We are now ready to discuss the fundamental concept of the capacity of a channel: a measure of how much information per channel use we can get through the channel.

The {\bf information channel capacity} of a discrete memoryless channel is defined as
\begin{displaymath}
C = \max_{p(x)} I(X;Y),
\end{displaymath}
where the maximum is taken over all possible input distributions $p(x)$.
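Since the input alphabet is finite, the maximization over $p(x)$ can be checked numerically. The following sketch (not part of the original notes; the channel and grid resolution are illustrative choices) evaluates $I(X;Y)$ over a grid of binary input distributions for a BSC with crossover probability $0.1$:

```python
import numpy as np

def mutual_information(p_x, Q):
    """I(X;Y) in bits, for input distribution p_x and channel
    matrix Q with Q[i, j] = P(Y = j | X = i)."""
    p_xy = p_x[:, None] * Q            # joint P(X = i, Y = j)
    p_y = p_xy.sum(axis=0)             # output marginal P(Y = j)
    mask = p_xy > 0                    # skip zero-probability terms
    return float((p_xy[mask] *
                  np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum())

# Grid search over binary input distributions for a BSC with p = 0.1.
Q = np.array([[0.9, 0.1],
              [0.1, 0.9]])
pis = np.linspace(1e-6, 1 - 1e-6, 1001)
caps = [mutual_information(np.array([pi, 1 - pi]), Q) for pi in pis]
C = max(caps)
print(C)   # close to 1 - H(0.1), attained at p(x) = (1/2, 1/2)
```

The grid search is only a sanity check; $I(X;Y)$ is concave in $p(x)$ (see the properties below), so the maximum found this way is the global one.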

\begin{example}Consider the noiseless binary channel, with error-free transmission. For this channel there is no equivocation: $C = \max I(X;Y) = \max H(X) = 1$ bit,
which occurs when $p(x) = (1/2,1/2)$.\end{example}

\begin{example}Consider a noisy channel with non-overlapping outputs, so that the input is determined exactly by the output. Again $C = 1$ bit, achieved when $p(x) = (1/2,1/2)$.\end{example}

\begin{example}Consider the noisy typewriter, with crossover probability 1/2 and 26 input symbols: each input letter is received either unchanged or as the next letter of the alphabet, each with probability 1/2. Then
\begin{displaymath}
C = \max I(X;Y) = \max [H(Y) - H(Y\vert X)] = \max H(Y) - 1 = \log 26 - 1 = \log 13
\end{displaymath}
bits, achieved when the input distribution is uniform.\end{example}
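As a numerical check (not part of the original notes), the mutual information of the noisy typewriter under a uniform input distribution can be computed directly and compared with $\log 13$:

```python
import numpy as np

# Noisy typewriter: 26 inputs; input i is received as i or i+1 (mod 26),
# each with probability 1/2.
n = 26
Q = np.zeros((n, n))
for i in range(n):
    Q[i, i] = 0.5
    Q[i, (i + 1) % n] = 0.5

p_x = np.full(n, 1 / n)            # uniform inputs achieve capacity
p_xy = p_x[:, None] * Q            # joint distribution
p_y = p_xy.sum(axis=0)             # output marginal (also uniform)
mask = p_xy > 0
I = float((p_xy[mask] *
           np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum())
print(I, np.log2(13))              # both approximately 3.7004 bits
```

Intuitively, $\log 13$ bits means 13 of the 26 inputs (every other letter) can be used with no ambiguity at all.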

\begin{example}Consider the BSC, with crossover probability $p$. Then
\begin{displaymath}
C = \max [H(Y) - H(Y\vert X)] = \max H(Y) - H(p) = 1 - H(p)
\end{displaymath}
bits, where $H(p)$ is the binary entropy function, achieved when $p(x) = (1/2,1/2)$. There are thus $1 - H(p)$ bits of useful information left over for every bit of information sent through the channel.\end{example}
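A small sketch (added here as a sanity check, not from the original notes) evaluates $C = 1 - H(p)$ for a few crossover probabilities:

```python
import numpy as np

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

for p in (0.0, 0.1, 0.5):
    print(p, 1 - binary_entropy(p))
# p = 0   gives C = 1 bit  (noiseless channel)
# p = 1/2 gives C = 0 bits (output independent of input)
```

Note the symmetry $C(p) = C(1-p)$: a channel that flips every bit with certainty is as good as a noiseless one, since the flips can be undone.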

\begin{example}Consider the binary erasure channel: each input bit is received correctly with probability $1-\alpha$ and erased with probability $\alpha$. Letting $P(X=1) = \pi$, the output determines $X$ exactly unless it is an erasure, so
\begin{displaymath}
I(X;Y) = H(X) - H(X\vert Y) = H(\pi) - \alpha H(\pi) = (1-\alpha)H(\pi).
\end{displaymath}This is maximized when $\pi=.5$, and $C=1-\alpha.$\end{example}
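The erasure-channel result can also be verified numerically. The sketch below (an illustrative addition; the value $\alpha = 0.3$ is an arbitrary choice) maximizes $I(X;Y)$ over a grid of input distributions and recovers $C = 1-\alpha$ at $\pi = 1/2$:

```python
import numpy as np

alpha = 0.3   # erasure probability (assumed value for illustration)
# Rows: inputs 0, 1.  Columns: outputs 0, erasure, 1.
Q = np.array([[1 - alpha, alpha, 0.0],
              [0.0, alpha, 1 - alpha]])

def mutual_information(p_x, Q):
    """I(X;Y) in bits for input distribution p_x and channel matrix Q."""
    p_xy = p_x[:, None] * Q
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return float((p_xy[mask] *
                  np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum())

pis = np.linspace(1e-6, 1 - 1e-6, 1001)
caps = [mutual_information(np.array([pi, 1 - pi]), Q) for pi in pis]
best = pis[int(np.argmax(caps))]
print(best, max(caps))   # maximum near pi = 0.5, value 1 - alpha = 0.7
```

This matches the interpretation of the result: a fraction $\alpha$ of the transmitted bits is simply lost, and the remaining fraction $1-\alpha$ arrives intact.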

The channel capacity has the following properties:

  1. $C \geq 0$. (why?)
  2. $C \leq \log \vert\Xc\vert$. (why?)
  3. $C \leq \log\vert\Yc\vert$.
  4. $I(X;Y)$ is a continuous function of $p(x)$.
  5. $I(X;Y)$ is a concave function of $p(x)$. (Hence it has a maximum over the set of input distributions.)
Copyright 2008, Todd Moon. admin. (2006, May 15). Channel Capacity. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare. This work is licensed under a Creative Commons License.