
The Gaussian Channel


Definitions   ::   Band-limited   ::   Kuhn-Tucker   ::   Parallel   ::   Colored Noise

Band-limited channels

We now come to the first time in the book where the information is actually carried by a time waveform, instead of by a random variable. We will consider transmission over a band-limited channel (such as a phone channel). A key result is the sampling theorem:

Theorem 2 (Sampling theorem)

Suppose a function f(t) is band-limited to W, that is, its spectrum is zero for all frequencies outside [-W, W]. Then f(t) is completely determined by samples taken every 1/(2W) seconds.

This is the classical Nyquist sampling theorem. However, Shannon's name is also attached to it, since he provided a proof and made use of it. A representation of the function f(t) is

f(t) = \sum_n f\left(\frac{n}{2W}\right) \sinc\left(t-\frac{n}{2W}\right)

where

\sinc(t) = \frac{\sin(2 \pi Wt)}{2\pi W t}.

From this theorem, we conclude (the dimensionality theorem) that a band-limited function has only 2W degrees of freedom per second.
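As a sanity check, the reconstruction formula above can be verified numerically. The sketch below uses a hypothetical bandwidth and test signal: it samples a band-limited signal at the Nyquist rate 2W and rebuilds it by sinc interpolation. Note that NumPy's np.sinc(x) = sin(pi x)/(pi x), so the sinc(t - n/2W) of these notes corresponds to np.sinc(2W t - n).

```python
import numpy as np

W = 4.0       # bandwidth in Hz (hypothetical value for illustration)
fs = 2 * W    # Nyquist rate: 2W samples per second

# A test signal band-limited to W: two tones at 1 Hz and 3 Hz (both < W).
def f(t):
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * 3.0 * t)

# Samples f(n/2W); a wide window keeps the truncation error small.
n = np.arange(-400, 401)
samples = f(n / fs)

# Sinc interpolation: f(t) = sum_n f(n/2W) sinc(t - n/2W),
# where the note's sinc(t - n/2W) equals np.sinc(2W*t - n).
t = np.linspace(-1.0, 1.0, 51)
recon = np.array([np.sum(samples * np.sinc(fs * ti - n)) for ti in t])

err = np.max(np.abs(recon - f(t)))
print(err)  # small; limited only by truncating the infinite sum
```

The error is not exactly zero only because the infinite sum over n is truncated to a finite window.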

For a signal which has "most" of its energy in a bandwidth W and "most" of its energy in a time interval T, there are about 2WT degrees of freedom, and the time- and band-limited function can be represented using 2WT orthogonal basis functions, known as the prolate spheroidal functions. We can thus view band- and time-limited functions as vectors in a 2WT-dimensional vector space.

Assume that the noise power-spectral density of the channel is N_0/2. Then the noise power over the bandwidth W is (N_0/2)(2W) = N_0 W. If the signal power is P, then over a time interval of T seconds the signal energy is PT, spread across 2WT samples, so the signal energy per sample (per channel use) is

\frac{PT}{2WT} = \frac{P}{2W}.

Similarly, the noise energy per sample is N_0 W T/(2WT) = N_0/2.

Use this information in the capacity of the Gaussian channel:

C = \frac{1}{2}\log\left(1+\frac{P}{N}\right) = \frac{1}{2}\log\left(1+\frac{P/(2W)}{N_0/2}\right) = \frac{1}{2}\log\left(1+\frac{P}{N_0 W}\right) \text{ bits per channel use}.

There are 2 W samples each second (channel uses), so the capacity is

C = (2W) \frac{1}{2}\log(1+\frac{P}{N_0 W}) \text{ bits/second}


\boxed{C = W \log(1+\frac{P}{N_0 W})}

This is the famous capacity of the band-limited Gaussian channel, one of the key results of information theory.

As $W\rightarrow \infty$, a little calculus (using $\ln(1+x) \approx x$ for small $x$) shows that

C \rightarrow \frac{P}{N_0} \log_2 e \text{ bits per second}.

This is interesting: even with infinite bandwidth, the capacity is not infinite, but grows linearly with the power.
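This limit can be checked numerically. A minimal sketch, with hypothetical values for P and N_0, evaluates C = W log2(1 + P/(N_0 W)) for increasing W and compares it against (P/N_0) log2 e:

```python
import math

# Hypothetical values: transmit power P and noise spectral density N0.
P, N0 = 1.0, 1e-3
limit = (P / N0) * math.log2(math.e)  # (P/N0) log2(e), the W -> infinity limit

for W in [1e3, 1e5, 1e7, 1e9]:
    C = W * math.log2(1 + P / (N0 * W))  # capacity in bits/second
    print(f"W = {W:.0e} Hz  ->  C = {C:.2f} b/s  (limit {limit:.2f})")
```

The printed capacities increase with W but saturate at the limit rather than growing without bound.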

For a phone channel, take $W = 3300$ Hz. If the SNR is $P/(N_0 W) = 100$ (that is, 20 dB), then

C = 3300 \log_2(1 + 100) = 21972 \text{ bits/second}.

(The book is dated.)
We cannot do better than capacity!
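The phone-channel figure can be reproduced directly from the boxed capacity formula. A minimal sketch, assuming an SNR of P/(N_0 W) = 100 (20 dB), the value consistent with the quoted 21972 bits/second:

```python
import math

W = 3300.0    # phone-channel bandwidth in Hz
snr = 100.0   # assumed P/(N0 W), i.e. 20 dB, consistent with the quoted rate
C = W * math.log2(1 + snr)  # C = W log2(1 + P/(N0 W))
print(round(C))  # -> 21972 bits/second
```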

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). The Gaussian Channel. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare. This work is licensed under a Creative Commons License.