
# Bits and Queues


## Codes

The rate of the code is defined as

R = (log M)/T,

where M is the number of codewords and T is the duration over which a codeword is transmitted. The sequence of codes seems to depend upon T.

Let λ be the average output rate.

We have the following theorem (which we will not prove): the capacity of a single-server queue with exponential service at rate μ satisfies

C = μ/e nats per second.
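Assuming the theorem intended here is the classical "Bits through Queues" result of Anantharam and Verdú, that the exponential-server timing channel has capacity C = μ/e nats per second, a quick computation looks like the following (the function names are my own):

```python
import math

def queue_capacity_nats(mu):
    """Capacity (nats/sec) of the exponential-server timing channel:
    C = mu / e, per the Anantharam-Verdu 'Bits through Queues' result."""
    return mu / math.e

def queue_capacity_bits(mu):
    # Convert nats to bits by dividing by ln 2.
    return queue_capacity_nats(mu) / math.log(2)

print(queue_capacity_nats(1.0))  # capacity for a unit-rate server, in nats/sec
print(queue_capacity_bits(1.0))  # the same capacity, in bits/sec
```

Note that the capacity grows linearly in μ: a faster server gives proportionally finer timing resolution to encode information in.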

We will need the following result. (There are some problems with this derivation, but the authors use it.)

As an important notational simplification, write . The key theorem depends upon the following lemma:

The important theorem we will work on is the following:

Two questions arise: (1) how do we compute the maximum mutual information, and (2) is the upper bound tight (that is, is it actually achieved)? Let us now look at maximizing the mutual information in (6). Our result is analogous to the classical one for second-moment-constrained random variables and Gaussian noise: there, the Gaussian distribution maximizes differential entropy under a second-moment constraint; here, the exponential distribution maximizes it under a mean constraint.
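The analogy can be checked numerically: among nonnegative random variables with a fixed mean, the exponential has the largest differential entropy. A small sketch using closed-form entropies, where the comparison distributions (a Gamma and a uniform, both with mean 1) are my own illustrative choices:

```python
import math

EULER_GAMMA = 0.5772156649015329

def h_exponential(mean):
    # Differential entropy (nats) of an exponential with the given mean: 1 + ln(mean).
    return 1.0 + math.log(mean)

def h_gamma(k, theta):
    # Differential entropy (nats) of Gamma(shape k, scale theta), integer k;
    # digamma at integer k: psi(k) = -EULER_GAMMA + sum_{i=1}^{k-1} 1/i.
    psi = -EULER_GAMMA + sum(1.0 / i for i in range(1, k))
    return k + math.log(theta) + math.lgamma(k) + (1 - k) * psi

def h_uniform(a, b):
    # Differential entropy (nats) of Uniform[a, b]: ln(b - a).
    return math.log(b - a)

# All three distributions below have mean 1; the exponential wins.
print(h_exponential(1.0))   # 1.0 nat
print(h_gamma(2, 0.5))      # Gamma(2, 1/2), mean 1: about 0.884 nats
print(h_uniform(0.0, 2.0))  # Uniform[0, 2], mean 1: about 0.693 nats
```

This is why exponentially distributed inter-departure times appear in the capacity-achieving analysis, just as Gaussian inputs do for the power-constrained Gaussian channel.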

More generally, for a queue, we find

where the service time is an exponential random variable with mean 1/μ. This follows from part (4) of two theorems back.
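The channel model behind these results treats a packet's arrival time as the input and its departure time as the output. A minimal sketch of a FIFO single-server queue with exponential service times of mean 1/μ (the function and variable names are my own):

```python
import random

def depart(arrivals, services):
    """Departure times of a FIFO single-server queue that starts empty at t = 0.

    Packet i begins service when it has arrived AND the server is free,
    and departs one service time later: d_i = max(a_i, d_{i-1}) + s_i.
    """
    departures, prev = [], 0.0
    for a, s in zip(arrivals, services):
        prev = max(a, prev) + s
        departures.append(prev)
    return departures

mu = 2.0                       # service rate
random.seed(0)                 # seeded for reproducibility
arrivals = [0.0, 0.3, 1.1]     # the encoder chooses these times
services = [random.expovariate(mu) for _ in arrivals]  # exponential, mean 1/mu
print(depart(arrivals, services))  # what the decoder observes
```

The decoder sees only the departure times, which are the chosen arrival times corrupted by queueing delay and the random exponential service times.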

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). Bits and Queues. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture12_2.htm. This work is licensed under a Creative Commons License.