
More On Channel Capacity



Joint source/channel coding theorem

We have seen that for a source with entropy $H(X)$, the rate of a lossless source code cannot be less than the entropy ($R \geq H$). We have also seen that we can transmit reliably over a channel at any rate less than capacity ($R < C$). How do these two major theorems tie together?
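In symbols: source coding requires $R \geq H(X)$, while reliable channel coding requires $R < C$, so a two-stage scheme (compress first, then channel-code) needs a rate $R$ satisfying

$$H(X) \leq R < C.$$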

That is, is it better to remove the redundancy (source coding), then put some back in (channel coding)? Or is there some kind of joint coding method that would work better?

The joint source/channel coding theorem says (in essence) that provided the source has entropy $H < C$, there is a source/channel code with $P_e^{(n)} \rightarrow 0$, and that (conversely) if $P_e^{(n)} \rightarrow 0$ then $H \leq C$. The proof of the forward part relies on the AEP: we code only the typical sequences, and don't worry about the rest. For the converse, we use (again) Fano's inequality.
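As a minimal sketch of that converse for a stationary source (the block notation $V^n$ for the source sequence and $\hat{V}^n$ for the decoder's reconstruction is ours, following the usual textbook argument):

$$H(V) \leq \frac{H(V^n)}{n} = \frac{1}{n} H(V^n | \hat{V}^n) + \frac{1}{n} I(V^n ; \hat{V}^n) \leq \frac{1}{n} \left( 1 + P_e^{(n)} \, n \log |\mathcal{V}| \right) + C,$$

where Fano's inequality bounds the first term, and the data-processing inequality together with $I(X^n ; Y^n) \leq nC$ bounds the second. Letting $n \rightarrow \infty$ with $P_e^{(n)} \rightarrow 0$ leaves $H(V) \leq C$.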

Note that the theorem is asymptotic: in practice, we have to deal with codes of finite length and take extra precautions.
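As a quick numerical illustration (a minimal sketch; the Bernoulli source and BSC parameters below are our own choices, not from the lecture), the condition $H < C$ is easy to check for a Bernoulli($q$) source sent over a binary symmetric channel with crossover probability $p$:

    import math

    def h2(p):
        """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    # Source: Bernoulli(q) bits, with entropy H = h2(q) bits/symbol.
    q = 0.1
    H = h2(q)

    # Channel: binary symmetric channel with crossover probability p,
    # whose capacity is C = 1 - h2(p) bits/channel use.
    p = 0.05
    C = 1 - h2(p)

    # Joint source/channel coding theorem: reliable transmission at one
    # source symbol per channel use is possible precisely when H < C.
    print(f"H = {H:.3f} bits/symbol, C = {C:.3f} bits/use")
    print("reliable transmission possible" if H < C else "not possible")

Here $H \approx 0.469 < C \approx 0.714$, so the theorem guarantees codes with vanishing error probability, whether built separately or jointly.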
