Data Compression


Optimal codes

We deem a code optimal if it has the shortest average codeword length; the goal, after all, is to use the smallest number of bits to send the information. This may be regarded as an optimization problem: in designing the code, we must select the codeword lengths $l_1, l_2, \ldots, l_m$ so that the average length


\begin{displaymath}
L = \sum_i p_i l_i
\end{displaymath}


is as short as possible (no longer than that of any other prefix code), subject to the constraint that the lengths satisfy the Kraft inequality (so that a prefix code with those lengths exists). That is, minimize


\begin{displaymath}
L = \sum_i p_i l_i
\end{displaymath}

subject to

\begin{displaymath}
\sum_i D^{-l_i} \leq 1.
\end{displaymath}
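The Kraft constraint is easy to check numerically. A minimal sketch (the codeword lengths below are hypothetical example values):

```python
# Numerically check the Kraft inequality sum_i D**(-l_i) <= 1
# for a candidate set of codeword lengths.

def kraft_sum(lengths, D=2):
    """Sum of D**(-l) over the proposed codeword lengths."""
    return sum(D ** (-l) for l in lengths)

# Hypothetical binary code lengths: the Kraft sum is exactly 1,
# so a prefix code with these lengths exists.
lengths = [1, 2, 3, 3]
print(kraft_sum(lengths))        # 0.5 + 0.25 + 0.125 + 0.125 = 1.0
print(kraft_sum(lengths) <= 1)   # True
```

Lengths whose Kraft sum exceeds 1, such as `[1, 1, 2]`, cannot correspond to any prefix code.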

We will make two simplifying assumptions to get started: (1) we will neglect the integer constraint on the codeword lengths; and (2) we will assume the Kraft inequality holds with equality. We can then write the Lagrange-multiplier problem


\begin{displaymath}
J = \sum_i p_i l_i + \lambda\sum_i D^{-l_i}.
\end{displaymath}


Taking the derivative with respect to $l_j$ and equating it to zero,


\begin{displaymath}
\frac{\partial J}{\partial l_j} = p_j - \lambda D^{-l_j} \log D = 0,
\end{displaymath}


leads to


\begin{displaymath}
D^{-l_j} = \frac{p_j}{\lambda\log D}.
\end{displaymath}


Substituting into the constraint,


\begin{displaymath}
\sum_i \frac{p_i}{\lambda \log D} = 1,
\end{displaymath}


so $\lambda = 1/\log D$ , and


\begin{displaymath}
p_i = D^{-l_i},
\end{displaymath}


and the optimal codelengths are $l_i^* = -\log_D p_i$ . (The * denotes the optimal value.) Under this solution, the minimal average codeword length is
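The derivation can be checked numerically. A sketch with a hypothetical dyadic distribution, for which the optimal lengths come out as integers and the average length equals the entropy exactly:

```python
import math

def entropy(probs, D=2):
    """D-ary entropy H_D(X) = -sum_i p_i log_D p_i."""
    return -sum(p * math.log(p, D) for p in probs)

def optimal_lengths(probs, D=2):
    """Unconstrained optimal codeword lengths l_i* = -log_D p_i."""
    return [-math.log(p, D) for p in probs]

def avg_length(probs, lengths):
    """Average codeword length L = sum_i p_i l_i."""
    return sum(p * l for p, l in zip(probs, lengths))

# Hypothetical dyadic distribution (all p_i are powers of 1/2).
probs = [0.5, 0.25, 0.125, 0.125]
ls = optimal_lengths(probs)      # [1.0, 2.0, 3.0, 3.0]
print(avg_length(probs, ls))     # 1.75
print(entropy(probs))            # 1.75: L* = H_D(X), as derived
```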


\begin{displaymath}
L^* = \sum_i p_i l_i^* = H_D(X).
\end{displaymath}


(The subscript $D$ indicates that the entropy is computed with logarithms base $D$.)

Of course, in practice the codeword lengths must be integer values, so the result just obtained is a lower bound on the average codeword length. We will validate this lower bound in the following theorem:
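One way to see the effect of the integer constraint is to round the ideal lengths up to $l_i = \lceil -\log_D p_i \rceil$, which can only shrink each $D^{-l_i}$ and so preserves the Kraft inequality. A sketch with a hypothetical non-dyadic distribution:

```python
import math

def ceil_lengths(probs, D=2):
    """Round the ideal lengths up to integers: l_i = ceil(-log_D p_i)."""
    return [math.ceil(-math.log(p, D)) for p in probs]

# Hypothetical non-dyadic distribution: rounding up keeps Kraft
# satisfied but pushes the average length above the entropy.
probs = [0.4, 0.3, 0.2, 0.1]
ls = ceil_lengths(probs)                # [2, 2, 3, 4]
H = -sum(p * math.log2(p) for p in probs)
L = sum(p * l for p, l in zip(probs, ls))
print(sum(2 ** (-l) for l in ls))       # 0.6875 <= 1: a prefix code exists
print(L >= H)                           # True: entropy remains a lower bound
```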
The expected length $L$ of any instantaneous $D$-ary code for a random variable $X$ satisfies

\begin{displaymath}
L \geq H_D(X),
\end{displaymath}

with equality if and only if $D^{-l_i} = p_i$.

To prove this, let $c = \sum_i D^{-l_i}$ and define the distribution $r_i = D^{-l_i}/c$. Then

\begin{displaymath}
L - H_D(X) = \sum_i p_i l_i + \sum_i p_i \log_D p_i
= \sum_i p_i \log_D \frac{p_i}{r_i} + \log_D \frac{1}{c} \geq 0,
\end{displaymath}

since the relative entropy is nonnegative and $c \leq 1$ (Kraft inequality).
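The decomposition used in the proof, with $c = \sum_i D^{-l_i}$ and $r_i = D^{-l_i}/c$, can be verified numerically. A sketch with a hypothetical distribution and prefix-code lengths:

```python
import math

def length_gap_terms(probs, lengths, D=2):
    """Decompose L - H_D(X) as D(p || r) + log_D(1/c),
    where c = sum_i D**(-l_i) and r_i = D**(-l_i)/c."""
    c = sum(D ** (-l) for l in lengths)
    r = [D ** (-l) / c for l in lengths]
    L = sum(p * l for p, l in zip(probs, lengths))
    H = -sum(p * math.log(p, D) for p in probs)
    kl = sum(p * math.log(p / ri, D) for p, ri in zip(probs, r))
    return L - H, kl, math.log(1 / c, D)

# Hypothetical distribution and prefix-code lengths.
probs = [0.4, 0.3, 0.2, 0.1]
lengths = [1, 2, 3, 3]
gap, kl, slack = length_gap_terms(probs, lengths)
print(abs(gap - (kl + slack)) < 1e-9)   # the identity from the proof holds
print(kl >= 0 and slack >= 0)           # both terms nonnegative, so L >= H
```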

Copyright 2008, by the Contributing Authors. Data Compression, USU OpenCourseWare. This work is licensed under a Creative Commons License.