
Introduction to Information Theory



The Digital Communications Model

In the transfer of digital information, the following framework is often used:

[Figure: Digital Communications Model]

  • The source produces the (digital) data to be transmitted.
  • The source encoder removes as much redundancy as possible from the data. This is the data compression portion.
  • The channel coder puts a modest amount of redundancy back in, in order to do error detection or correction.
  • The channel is what the data passes through, possibly becoming corrupted along the way. There are a variety of channels of interest, including:
    • The magnetic recording channel
    • The telephone channel
    • Other bandlimited channels
    • The multi-user channel
    • Deep-space channels
    • Fading and/or jamming and/or interference channels
    • The genetic representation channel
    • Any place where there is the possibility of corruption in the data
  • The channel decoder performs error correction or detection.
  • The source decoder reverses the source encoding to recover the original data.
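The pipeline above can be sketched end to end. The following toy model is only an illustration (all names are hypothetical): a rate-1/3 repetition code stands in for the channel coder, and a binary symmetric channel models corruption.

```python
import random

def channel_encode(bits):
    """Channel coder: add redundancy via a rate-1/3 repetition code."""
    return [b for b in bits for _ in range(3)]

def channel(bits, p=0.05):
    """Binary symmetric channel: each bit is flipped with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def channel_decode(bits):
    """Channel decoder: majority vote over each block of three bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

random.seed(0)
message = [1, 0, 1, 1, 0, 0, 1, 0]
received = channel_decode(channel(channel_encode(message)))
```

The majority vote corrects any single bit flip within a block of three, which is exactly the "modest amount of redundancy" the channel coder provides; real systems use far more efficient codes.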

There are also other possible blocks that could be inserted into this model:

  • A block to enforce channel constraints. Some channels (e.g., the magnetic recording channel) have constraints on how long a run of zeros or ones can be. The constraints are enforced in what is often known as a line coder.
  • A block to perform encryption/decryption.
  • A block to perform lossy compression.
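As an example of the first item, a maximum-run-length constraint of the kind imposed by magnetic recording channels can be checked in a few lines; the function names here are illustrative, not a standard API.

```python
def max_zero_run(bits):
    """Length of the longest run of zeros in a bit sequence."""
    longest = run = 0
    for b in bits:
        run = run + 1 if b == 0 else 0
        longest = max(longest, run)
    return longest

def satisfies_k_constraint(bits, k):
    """True if no run of zeros exceeds k, as a line coder must guarantee."""
    return max_zero_run(bits) <= k
```

A line coder maps arbitrary data onto sequences that pass such a check, since long runs without transitions would make clock recovery on the channel unreliable.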

The first of these areas falls well within the scope of information theory, but unfortunately outside the scope of this class. I hope to get to the last one during the quarter.

In light of the model presented here, several questions of engineering interest arise:

  • How can we measure the amount of information?
  • How much can we compress?
  • How do we compress?
  • How do we keep errors from affecting performance?
  • How fast can we send through the channel?
  • What if the data rate exceeds the capacity of the channel?
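Shannon's answer to the first question is entropy: a source emitting symbols with probabilities p_i carries, on average, H = -Σ p_i log2 p_i bits of information per symbol. A minimal sketch:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum of p_i * log2(p_i)."""
    return -sum(p * log2(p) for p in probs if p > 0)

entropy([0.5, 0.5])   # → 1.0 bit: a fair coin toss
entropy([0.9, 0.1])   # ≈ 0.469 bits: a biased coin is more predictable
```

The second value hints at why compression is possible: a biased source can, on average, be represented in fewer bits than it takes to write down its raw symbols.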

These are largely theoretical questions, and the answers are largely theoretical: it may take years of research to turn the answers (often expressed as existence theorems) into practical implementations.

History: Information theory was first published in 1948 by Claude Shannon. He suggested some fundamental limits on the representation and transmission of information. Since that time, the results have been extended to cover a variety of problem areas, and people have worked (hard!) to find ways of achieving the bounds that the theory shows to be possible. In a sense, then, information theory has provided the theoretical motivation for many of the outstanding advances in digital communications and digital storage. For example, how much information can be sent over the phone system?

Besides the (almost) practical applications of the theory, there is great beauty and elegance in the theorems, the study of which has intrinsic merit in a university education.

Copyright 2008, Todd Moon. admin. (2006, May 15). Introduction to Information Theory. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare. This work is licensed under a Creative Commons License.