Bits and Queues


Codes


\begin{definition}
An $(n,M,T,\epsilon)$ code for a queue consists of a codebook of $M$ codewords, each a sequence of $n$ interarrival times, together with a decoder that observes the $n$ departure times and selects the transmitted codeword with probability of error at most $\epsilon$, where the last departure from the queue
occurs on the average no later than $T$.
\end{definition}
The rate of the $(n,M,T,\epsilon)$ code is defined as
\begin{displaymath}
R = \frac{\log M}{T}.
\end{displaymath}
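As a quick illustrative check (the numbers here are chosen for illustration, not taken from the lecture): a code carrying $M = e^{100}$ messages whose last departure occurs on average by $T = 1000$ seconds has rate
\begin{displaymath}
\frac{\log M}{T} = \frac{100}{1000} = 0.1 \text{ nats/s}.
\end{displaymath}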

 


\begin{definition}
The capacity $C$ of the queue is the largest $R$ for which, for every $\gamma > 0$, there exist $(n, M_T, T, \epsilon_T)$ codes with
\begin{displaymath}
\frac{\log M_T}{T} > R - \gamma
\end{displaymath}
and $\epsilon_T \rightarrow 0$.
\end{definition}

Note that the definition involves a sequence of codes indexed by $T$.

Let $\lambda$ be the average output rate.

\begin{definition}
$R$ is $\epsilon$-achievable at output rate $\lambda$ if, for every $\gamma > 0$ and all sufficiently large $T$, there exist $(n,M,T,\epsilon)$ codes with $n \leq (\lambda + \gamma)T$ and $\frac{\log M}{T} > R - \gamma$. $R$ is achievable at output rate $\lambda$ if it is $\epsilon$-achievable at output rate $\lambda$ for all $0 < \epsilon <
1$. The capacity at output rate $\lambda$, denoted $C(\lambda)$, is the largest rate achievable at output rate $\lambda$.
\end{definition}


We have the following theorem (which we will not prove): the capacity of a single-server $\cdot/G/1$ queue with service rate $\mu$ satisfies
\begin{displaymath}
C = \sup_{\lambda<\mu} C(\lambda).
\end{displaymath}


We will need the following result. (There are some problems with this derivation, but the authors use it.)

\begin{theorem}
(sort of) Fano's inequality leads to
\begin{equation}
\log \vert\Xc\vert \leq \frac{1}{1-P_e}\left[ I(X;Y) + \log 2 \right].
\end{equation}
\end{theorem}
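
To see where this comes from (a standard derivation, assuming $X$ is uniform on $\Xc$ and $P_e$ is the probability that an estimate of $X$ formed from $Y$ is wrong):
\begin{displaymath}
\log \vert\Xc\vert = H(X) = I(X;Y) + H(X \mid Y) \leq I(X;Y) + h(P_e) + P_e \log \vert\Xc\vert,
\end{displaymath}
where the last step is Fano's inequality; bounding the binary entropy $h(P_e)$ by $\log 2$ and rearranging gives the stated inequality.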

As an important notational simplification, write $D^i = (D_1,\ldots,
D_i)$. Here $A_i$, $D_i$, and $S_i$ denote the arrival, departure, and service times of the $i$th packet, and $W_i = \max(0, A_i - D_{i-1})$ is the idle period the server experiences before serving the $i$th packet. The key theorem depends upon the following lemma:

\begin{lemma}
\begin{equation}
I(A_1,\ldots,A_n; D_1,\ldots, D_n) = \sum_{i=1}^n I(W_i; W_i +
S_i) - \sum_{i=2}^n I(D^{i-1};D_i).
\end{equation}
\end{lemma}
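
The mean constraint used below comes from the single-server recursion (a sketch, under the standard first-in first-out assumption):
\begin{displaymath}
D_i = \max(A_i, D_{i-1}) + S_i = D_{i-1} + W_i + S_i,
\qquad\text{so}\qquad
D_n = D_0 + \sum_{i=1}^n (W_i + S_i).
\end{displaymath}
Taking expectations, $\sum_{i=1}^n E[W_i + S_i] \leq T$ for a code whose last departure occurs on average by time $T$; with $n \approx \lambda T$ and $E[S_i] = 1/\mu$, the average idle time satisfies $\frac{1}{n}\sum_{i=1}^n E[W_i] \leq \frac{1}{\lambda} - \frac{1}{\mu}$.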

The important theorem we will work on is the following:

\begin{theorem}
For any $\cdot/G/1$ queue with service time $S$ and $E[S] = 1/\mu$,
\begin{equation}
C(\lambda) \leq \lambda \sup_{X \geq 0:\; E[X] \leq \frac{1}{\lambda} - \frac{1}{\mu}} I(X;X+S),
\end{equation}
where $X$ is independent of $S$.
\end{theorem}

\begin{proof}
Let $U \in \{ 1,2,\ldots, M\}$ indicate the (uniformly distributed) message, encoded as the interarrival sequence determining $A_1,\ldots,A_n$ and decoded from $D_1,\ldots,D_n$. By the Fano-type bound above and data processing,
\begin{displaymath}
\log M \leq \frac{1}{1-\epsilon}\left[ I(A_1,\ldots,A_n; D_1,\ldots,D_n) + \log 2 \right].
\end{displaymath}
By the lemma, $I(A_1,\ldots,A_n; D_1,\ldots,D_n) \leq \sum_{i=1}^n I(W_i; W_i+S_i)$, and since $\frac{1}{n}\sum_{i=1}^n E[W_i] \leq \frac{1}{\lambda} - \frac{1}{\mu}$, the average of these terms is bounded by the supremum in the theorem. Dividing by $T \approx n/\lambda$ and letting $T \rightarrow \infty$ gives
\begin{displaymath}
\frac{\log M}{T} \leq \lambda \sup_{X \geq 0:\; E[X] \leq \frac{1}{\lambda} - \frac{1}{\mu}} I(X;X+S) + o(1).
\end{displaymath}
This is equivalent to the statement of the theorem.
\end{proof}

Two questions remain: (1) how to compute the maximum mutual information, and (2) whether the upper bound is tight (that is, whether it is actually achieved). Let us now look at maximizing the mutual information in (6). The result is analogous to the classical one for second-moment-constrained inputs in Gaussian noise, with the exponential distribution playing the role of the Gaussian.

\begin{theorem}
Let $a$ and $b$ be nonnegative real numbers. Let $\Nbar$ be exponentially distributed with mean $b$, and let $\Xbar$ be the nonnegative input, independent of $\Nbar$, for which $\Ybar = \Xbar + \Nbar$ is exponentially distributed with mean $a+b$. Then:
\begin{enumerate}
\item Among nonnegative random variables $Y$ with $E[Y] \leq a+b$, the exponential maximizes differential entropy: $h(Y) \leq 1 + \log(a+b)$.
\item $E[\Xbar] = a$.
\item $I(\Xbar;\Xbar+\Nbar) = \log\frac{a+b}{b}$.
\item For any nonnegative $X$ independent of $N$, with $E[X] \leq a$ and $E[N] = b$,
\begin{displaymath}
I(X;X+N) \leq \log\frac{a+b}{b} + D(P_N\vert\vert P_{\Nbar}) - D(P_{X+N}\vert\vert P_{\Xbar+\Nbar}).
\end{displaymath}
\end{enumerate}
\end{theorem}

\begin{proof}
Let $\Ybar = \Xbar + \Nbar$. Then it can be checked directly that $E[\Xbar] = (a+b) - b = a$ and, since $\Ybar$ and $\Nbar$ are exponential,
\begin{displaymath}
I(\Xbar;\Ybar) = h(\Ybar) - h(\Nbar) = \left[1 + \log(a+b)\right] - \left[1 + \log b\right] = \log\frac{a+b}{b}.
\end{displaymath}
For part (4), expand both differential entropies against the exponential references: $h(X+N) = -D(P_{X+N}\vert\vert P_{\Ybar}) + \log(a+b) + \frac{E[X+N]}{a+b}$ and $h(N) = -D(P_N\vert\vert P_{\Nbar}) + \log b + \frac{E[N]}{b}$. Using $E[X+N] \leq a+b$ and $E[N] = b$,
\begin{displaymath}
I(X;X+N) = h(X+N) - h(N) \leq \log\frac{a+b}{b} + D(P_N\vert\vert P_{\Nbar}) - D(P_{X+N}\vert\vert P_{\Xbar+\Nbar}).
\end{displaymath}
\end{proof}
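
As an aside (a known fact about the exponential-noise channel, included here for concreteness), the maximizing input $\Xbar$ can be written explicitly through Laplace transforms:
\begin{displaymath}
E[e^{-s\Xbar}] = \frac{E[e^{-s\Ybar}]}{E[e^{-s\Nbar}]} = \frac{1+bs}{1+(a+b)s} = \frac{b}{a+b} + \frac{a}{a+b}\cdot\frac{1}{1+(a+b)s},
\end{displaymath}
so $\Xbar$ places mass $\frac{b}{a+b}$ at $0$ and is exponential with mean $a+b$ otherwise; in particular $E[\Xbar] = \frac{a}{a+b}(a+b) = a$, confirming part (2).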

\begin{theorem}
The $\cdot/M/1$ queue with service rate $\mu$ satisfies
\begin{displaymath}
C(\lambda) \leq \lambda \log\frac{\mu}{\lambda},
\end{displaymath}
and therefore
\begin{displaymath}
C \leq e^{-1} \mu \text{ nats/s}.
\end{displaymath}
\end{theorem}
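
The second bound follows from the first by maximizing over $\lambda$:
\begin{displaymath}
\frac{d}{d\lambda}\left[\lambda \log\frac{\mu}{\lambda}\right] = \log\frac{\mu}{\lambda} - 1 = 0
\quad\Longrightarrow\quad \lambda = \frac{\mu}{e},
\end{displaymath}
at which point $\lambda\log(\mu/\lambda) = \frac{\mu}{e}\log e = e^{-1}\mu$ nats/s.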

More generally, for a $\cdot/G/1$ queue we find
\begin{displaymath}
C(\lambda) \leq \lambda\log\frac{\mu}{\lambda} + \lambda D(P_S\vert\vert e_\mu),
\qquad \lambda \leq \mu,
\end{displaymath}
where $e_\mu$ denotes the exponential distribution with mean $1/\mu$. This follows from part (4) of the exponential-maximization theorem above.
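
To see the substitution, use the identifications from the proof of the key theorem: take $N = S$, $b = 1/\mu$, and $a = \frac{1}{\lambda} - \frac{1}{\mu}$, so that $a+b = 1/\lambda$ and
\begin{displaymath}
I(X;X+S) \leq \log\frac{a+b}{b} + D(P_S\vert\vert P_{\Nbar}) = \log\frac{\mu}{\lambda} + D(P_S\vert\vert e_\mu).
\end{displaymath}
Multiplying by $\lambda$ gives the bound; for exponential service times the divergence term vanishes, recovering the $\cdot/M/1$ bound.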
