
The Gaussian Channel



Band-limited channels

We now come to the first time in the book where the information is actually carried by a time waveform, instead of a random variable. We will consider transmission over a band-limited channel (such as a phone channel). A key result is the sampling theorem:

Theorem 2 (Sampling theorem)  Suppose a function $f(t)$ is band-limited to $W$; that is, its spectrum is zero for all frequencies $|f| > W$. Then the function is completely determined by samples taken $1/(2W)$ seconds apart.

This is the classical Nyquist sampling theorem. However, Shannon's name is also attached to it, since he provided a proof and used it. A representation of the function $f(t)$ is


 \begin{displaymath}
f(t) = \sum_n f(\frac{n}{2W}) \sinc(t-\frac{n}{2W})
\end{displaymath}

where


 \begin{displaymath}
\sinc(t) = \frac{\sin(2 \pi Wt)}{2\pi W t}
\end{displaymath}
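
As a quick numerical illustration (all parameters below are assumed for the sketch, not taken from the lecture), the following Python snippet reconstructs a band-limited tone from samples taken at rate $2W$ using the interpolation formula above:

import numpy as np

# Sketch: reconstruct a band-limited signal from samples taken every 1/(2W) s.
# All parameters below are illustrative assumptions, not values from the lecture.
W = 100.0                      # assumed bandwidth (Hz)
f0 = 37.0                      # tone frequency, f0 < W, so the signal is band-limited
fs = 2 * W                     # Nyquist rate: 2W samples per second

n = np.arange(-500, 501)       # finite window of samples (truncation causes small error)
t_n = n / fs                   # sample instants n/(2W)
samples = np.cos(2 * np.pi * f0 * t_n)

# f(t) = sum_n f(n/2W) sinc(t - n/2W), with sinc(t) = sin(2*pi*W*t)/(2*pi*W*t).
# np.sinc(x) = sin(pi*x)/(pi*x), so sinc(tau) above equals np.sinc(2*W*tau).
t = np.linspace(-0.5, 0.5, 2001)
kernel = np.sinc(2 * W * (t[:, None] - t_n[None, :]))
f_hat = kernel @ samples

err = np.max(np.abs(f_hat - np.cos(2 * np.pi * f0 * t)))
print(f"max reconstruction error away from the window edges: {err:.1e}")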

From this theorem, we conclude (the dimensionality theorem) that a band-limited function has only $2W$ degrees of freedom per second.

For a signal that has ``most'' of its energy in a bandwidth $W$ and ``most'' of its energy in a time interval $T$, there are about $2WT$ degrees of freedom, and the time- and band-limited function can be represented using $2WT$ orthogonal basis functions, known as the prolate spheroidal functions. We can view band- and time-limited functions as vectors in a $2WT$-dimensional vector space.
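
For instance, a phone-band signal with $W = 3300$ Hz observed over $T = 1$ second occupies a space of roughly $2WT = 6600$ dimensions (the bandwidth figure here is the one used in the phone-channel example below).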

Assume that the noise power spectral density of the channel is $N_0/2$. Then the noise power in the band is $(N_0/2)(2W) = N_0 W$. If the signal power is $P$, then over a time interval of $T$ seconds the signal energy is $PT$, and spreading it over the $2WT$ samples gives a signal energy per sample (per channel use) of


 \begin{displaymath}
\frac{PT}{2WT} = \frac{P}{2W}.
\end{displaymath}

By the same argument, the noise energy per sample is $(N_0 W)T/(2WT) = N_0/2$. Using these per-sample values in the Gaussian channel capacity,

\begin{displaymath}\begin{aligned}
C &= \frac{1}{2}\log(1+\frac{P}{N}) \text{ bits per channel use} \\
  &= \frac{1}{2}\log(1+\frac{P/(2W)}{N_0/2}) = \frac{1}{2}\log(1+\frac{P}{N_0 W}) \text{ bits per channel use}.
\end{aligned}\end{displaymath}

There are $2W$ samples (channel uses) each second, so the capacity is


 \begin{displaymath}
C = (2W) \frac{1}{2}\log(1+\frac{P}{N_0 W}) \text{ bits/second}
\end{displaymath}

or


 \begin{displaymath}
\boxed{C = W \log(1+\frac{P}{N_0 W})}
\end{displaymath}

This is the famous capacity formula for the band-limited Gaussian channel, one of the key results of information theory.
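
As a small numerical sketch (the function name and the numbers are illustrative, not from the lecture), the boxed formula can be evaluated directly:

import math

def awgn_capacity_bps(W_hz: float, P: float, N0: float) -> float:
    """C = W log2(1 + P/(N0 W)) in bits per second for the band-limited
    white Gaussian noise channel (P: signal power, N0/2: noise PSD)."""
    return W_hz * math.log2(1.0 + P / (N0 * W_hz))

# Illustrative numbers (assumed): fix P/N0 and widen the bandwidth.
P, N0 = 1.0, 1e-4
for W in (1e3, 1e4, 1e5, 1e6):
    print(f"W = {W:>9.0f} Hz  ->  C = {awgn_capacity_bps(W, P, N0):>10.0f} bits/s")
# The capacities approach (P/N0) * log2(e), about 14427 bits/s here, as W grows (see below).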

As $W\rightarrow \infty$, we have to do a little calculus to find that


 \begin{displaymath}
C = \frac{P}{N_0} \log_2 e \text{ bits per second}.
\end{displaymath}
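
One way to see this is to use $\ln(1+x) \approx x$ for small $x$, with $x = P/(N_0 W)$:

\begin{displaymath}
\lim_{W\rightarrow\infty} W \log_2(1+\frac{P}{N_0 W})
 = \lim_{W\rightarrow\infty} \frac{P}{N_0} \, \frac{\ln(1+\frac{P}{N_0 W})}{P/(N_0 W)} \, \log_2 e
 = \frac{P}{N_0} \log_2 e .
\end{displaymath}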

This is interesting: even with infinite bandwidth, the capacity is not infinite, but grows linearly with the power.


\begin{example}
For a phone channel, take $W=3300$ Hz. If the SNR is $P/(N_0 W) = 100$ (that is, 20 dB), then
\begin{displaymath}
C = 3300 \log_2(1 + 100) = 21972 \text{ bits/second}.
\end{displaymath}
(The book is dated.)
\end{example}
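
For a quick check with the awgn_capacity_bps sketch above (a hypothetical helper, not part of the lecture), awgn_capacity_bps(3300.0, P=100 * 3300.0, N0=1.0) evaluates to about 21972 bits/second, matching the figure here (choosing $N_0 = 1$ and $P = 100\,N_0 W$ so that the SNR is 100).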
We cannot do better than capacity!

Copyright 2008, by the Contributing Authors. The Gaussian Channel. USU OpenCourseWare: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture11_1.htm. This work is licensed under a Creative Commons License.