Some More Bounds


The data processing inequality

The data processing inequality is a simple but interesting theorem that states, in essence, the following: no matter what processing you apply to a set of data, you cannot get more information out of it than was there to begin with. In a sense, it provides a bound on how much can be accomplished with signal processing.

\begin{definition}
Random variables $X$, $Y$, and $Z$ are said to form a {\bf Markov chain} $X \rightarrow Y \rightarrow Z$ if the joint distribution factors as
\begin{displaymath}p(x,y,z) = p(x)p(y\vert x) p(z\vert y).
\end{displaymath}\end{definition}

A Markov chain is at the heart of the "state" idea in differential equations and is used commonly in controls. The concept of a state is that, knowing the present state, the future of the system is independent of the past. In other words, the state provides all the information necessary to move into the future: the necessary initial conditions of the differential equations.

The "conditional independence" idea means

 

\begin{displaymath}p(x,z\vert y) = \frac{p(x,y,z)}{p(y)} = \frac{p(x,y)p(z\vert y)}{p(y)} =
p(x\vert y)p(z\vert y).
\end{displaymath}
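As a quick check of this factorization, here is a small numerical sketch (not from the original notes; the distributions below are arbitrary toy choices) that builds $p(x,y,z) = p(x)p(y\vert x)p(z\vert y)$ for binary variables and verifies that $p(x,z\vert y) = p(x\vert y)p(z\vert y)$ for every $y$:

\begin{verbatim}
# Toy check of conditional independence under the Markov factorization.
# The distributions px, py_x, pz_y are made-up values for illustration only.
import numpy as np

px = np.array([0.3, 0.7])              # p(x), binary X
py_x = np.array([[0.9, 0.1],           # p(y|x), rows indexed by x
                 [0.2, 0.8]])
pz_y = np.array([[0.6, 0.4],           # p(z|y), rows indexed by y
                 [0.1, 0.9]])

# Joint p(x,y,z) from the Markov factorization p(x) p(y|x) p(z|y).
pxyz = px[:, None, None] * py_x[:, :, None] * pz_y[None, :, :]
assert np.isclose(pxyz.sum(), 1.0)

py = pxyz.sum(axis=(0, 2))             # p(y)
for y in range(2):
    p_xz_y = pxyz[:, y, :] / py[y]                    # p(x,z|y)
    p_x_y = pxyz[:, y, :].sum(axis=1) / py[y]         # p(x|y)
    p_z_y = pxyz[:, y, :].sum(axis=0) / py[y]         # p(z|y)
    assert np.allclose(p_xz_y, np.outer(p_x_y, p_z_y))
print("p(x,z|y) = p(x|y) p(z|y) for every y")
\end{verbatim}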

 

Note that if $Z = f(Y)$ then $X \rightarrow Y \rightarrow Z$.

\begin{theorem}
(Data processing inequality) If $X \rightarrow Y \rightarrow Z$, then
\begin{displaymath}I(X;Y) \geq I(X;Z).
\end{displaymath}\end{theorem}

Interpretation: If we think of $Z$ as the result of some processing done on the data $Y$, that is, $Z = f(Y)$ for some function $f$ (deterministic or random), then there is no function that can increase the amount of information that $Y$ tells about $X$.
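To make the interpretation concrete, the following sketch (my own toy example, not part of the lecture) treats each arrow in $X \rightarrow Y \rightarrow Z$ as a binary noisy channel, computes $I(X;Y)$ and $I(X;Z)$ directly from the joint distributions, and confirms the inequality; the mutual_information helper is defined locally here, not a library routine:

\begin{verbatim}
# Numerical illustration of the data processing inequality on a toy chain.
# All channel parameters are arbitrary choices for illustration.
import numpy as np

def mutual_information(pab):
    """I(A;B) in bits for a joint distribution pab[a, b]."""
    pa = pab.sum(axis=1, keepdims=True)
    pb = pab.sum(axis=0, keepdims=True)
    mask = pab > 0
    return float(np.sum(pab[mask] * np.log2(pab[mask] / (pa @ pb)[mask])))

px = np.array([0.5, 0.5])                    # p(x), uniform input
py_x = np.array([[0.85, 0.15],               # p(y|x): first noisy channel
                 [0.15, 0.85]])
pz_y = np.array([[0.70, 0.30],               # p(z|y): further processing/noise
                 [0.30, 0.70]])

pxy = px[:, None] * py_x                     # p(x,y)
pxz = pxy @ pz_y                             # p(x,z) = sum_y p(x,y) p(z|y)

I_xy = mutual_information(pxy)
I_xz = mutual_information(pxz)
print(f"I(X;Y) = {I_xy:.4f} bits, I(X;Z) = {I_xz:.4f} bits")
assert I_xy >= I_xz                          # data processing inequality
\end{verbatim}

With these particular made-up parameters the run gives roughly $I(X;Y) \approx 0.39$ bits and $I(X;Z) \approx 0.06$ bits: the second channel can only destroy information about $X$.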


\begin{proof}
By the chain rule for mutual information we can write
\begin{displaymath}I(X;Y,Z) = I(X;Z) + I(X;Y\vert Z) = I(X;Y) + I(X;Z\vert Y).
\end{displaymath}
Since $X$ and $Z$ are conditionally independent given $Y$, $I(X;Z\vert Y) = 0$. Then since $I(X;Y\vert Z) \geq 0$ we have
\begin{displaymath}I(X;Y) \geq I(X;Z).
\end{displaymath}\end{proof}
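The identity used in the proof can also be checked numerically. The sketch below (again with made-up distributions, not from the notes) computes the relevant entropies for a Markov joint distribution and confirms that $I(X;Z\vert Y) = 0$ and that the two chain-rule expansions of $I(X;Y,Z)$ agree:

\begin{verbatim}
# Check of the proof's decomposition: I(X;Z) + I(X;Y|Z) = I(X;Y) + I(X;Z|Y),
# with I(X;Z|Y) = 0 for a Markov chain. Distributions are arbitrary toy values.
import numpy as np

def H(p):
    """Entropy in bits of a (possibly multi-dimensional) distribution."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

px = np.array([0.4, 0.6])
py_x = np.array([[0.8, 0.2], [0.3, 0.7]])
pz_y = np.array([[0.9, 0.1], [0.25, 0.75]])
pxyz = px[:, None, None] * py_x[:, :, None] * pz_y[None, :, :]  # Markov joint

# Marginal and joint entropies.
Hx = H(pxyz.sum(axis=(1, 2)))
Hy = H(pxyz.sum(axis=(0, 2)))
Hz = H(pxyz.sum(axis=(0, 1)))
Hxy = H(pxyz.sum(axis=2))
Hxz = H(pxyz.sum(axis=1))
Hyz = H(pxyz.sum(axis=0))
Hxyz = H(pxyz)

I_xy = Hx + Hy - Hxy
I_xz = Hx + Hz - Hxz
I_xz_given_y = Hxy + Hyz - Hy - Hxyz          # I(X;Z|Y)
I_xy_given_z = Hxz + Hyz - Hz - Hxyz          # I(X;Y|Z)

assert np.isclose(I_xz_given_y, 0.0)                          # Markov property
assert np.isclose(I_xz + I_xy_given_z, I_xy + I_xz_given_y)   # chain rule both ways
assert I_xy >= I_xz
\end{verbatim}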

Copyright 2008, by the Contributing Authors. Source: USU OpenCourseWare, Information Theory, http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture3_2.htm. This work is licensed under a Creative Commons License.