
# More On Channel Capacity


## Joint source/channel coding theorem

We have seen that for a source with entropy H(X), the data rate cannot be less than the entropy (R ≥ H). We have also seen that we can transmit reliably at any rate below capacity (R < C). How do these two major theorems tie together?
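For concreteness, the two bounds can be evaluated numerically. The sketch below (an illustration, not from the original notes) computes the entropy of a Bernoulli(p) source and the capacity of a binary symmetric channel with crossover probability ε, using the standard formula C = 1 − H(ε); the particular values p = 0.1 and ε = 0.05 are arbitrary choices:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """H(p) = -p log2(p) - (1-p) log2(1-p), entropy of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(eps: float) -> float:
    """Capacity of a binary symmetric channel: C = 1 - H(eps) bits per use."""
    return 1.0 - binary_entropy(eps)

# Illustrative values: a fairly redundant source and a fairly clean channel.
H = binary_entropy(0.1)   # source entropy, about 0.47 bits/symbol
C = bsc_capacity(0.05)    # channel capacity, about 0.71 bits/use
print(f"H = {H:.3f}, C = {C:.3f}, H < C: {H < C}")
```

Here H < C, so by the theorem below this source can be sent reliably over this channel (one source symbol per channel use).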

That is, is it better to remove the redundancy (source coding), then put some back in (channel coding)? Or is there some kind of joint coding method that would work better?

The joint source/channel coding theorem says (in essence) that provided a source satisfies the AEP and has entropy H < C, then there is a source/channel code whose probability of error tends to zero as the block length grows; conversely, if there is a code whose probability of error tends to zero, then H ≤ C. The proof of the forward part relies on the AEP: we code only the typical sequences, and don't worry about the rest. For the converse, we use (again) Fano's inequality.

Note that the theorem is asymptotic: in practice, we have to deal with codes of finite length and take extra precautions.
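The asymptotic nature of the AEP argument can be seen in a small Monte-Carlo sketch (an illustration with arbitrary parameter choices, not from the original notes): the fraction of Bernoulli(p) source sequences that are δ-typical, i.e. whose per-symbol surprisal −(1/n) log2 p(xⁿ) lies within δ of H(p), approaches 1 only as the block length n grows:

```python
import random
from math import log2

def binary_entropy(p: float) -> float:
    """H(p) for a Bernoulli(p) source."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def prob_typical(p: float, n: int, delta: float = 0.05,
                 trials: int = 500, seed: int = 0) -> float:
    """Monte-Carlo estimate of the probability that a length-n Bernoulli(p)
    sequence is delta-typical: |-(1/n) log2 p(x^n) - H(p)| <= delta."""
    rng = random.Random(seed)
    H = binary_entropy(p)
    hits = 0
    for _ in range(trials):
        k = sum(rng.random() < p for _ in range(n))   # number of ones drawn
        surprisal = -(k * log2(p) + (n - k) * log2(1 - p)) / n
        if abs(surprisal - H) <= delta:
            hits += 1
    return hits / trials

# The typical-set probability climbs toward 1 as n increases (AEP).
for n in (50, 500, 2000):
    print(n, prob_typical(0.1, n))
```

This is exactly why the finite-length caveat matters: at small n a substantial fraction of sequences falls outside the typical set, and a code built only for typical sequences fails on them.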

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). More On Channel Capacity. Free Online Course Materials — USU OpenCourseWare: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture9_2.htm. This work is licensed under a Creative Commons License.