
The Gaussian Channel


Kuhn-Tucker Conditions

Before proceeding with the next section, we need a result from constrained optimization theory known as the Kuhn-Tucker condition.

Suppose we are minimizing some convex objective function L ( x ),


 \begin{displaymath}
\min L(x)
\end{displaymath}

subject to a constraint


 \begin{displaymath}
f(x) \leq 0.
\end{displaymath}

Let the optimal value of $x$ be $x_0$. Then either the constraint is inactive, in which case we get


 \begin{displaymath}
\left.\frac{\partial L}{\partial x}\right|_{x_0} = 0
\end{displaymath}

or, if the constraint is active, it must be the case that the objective function increases for all admissible values of $x$:


 \begin{displaymath}
\left.\frac{\partial L}{\partial x}\right|_{x \in \mathcal{A}} \geq 0
\end{displaymath}

where $\mathcal{A}$ is the set of admissible values, for which


 \begin{displaymath}
\frac{\partial f}{\partial x} \leq 0.
\end{displaymath}

(Think about what happens if this is not the case.) Thus,


 \begin{displaymath}
\operatorname{sgn}\frac{\partial L}{\partial x} = - \operatorname{sgn}\frac{\partial f}{\partial x}
\end{displaymath}

or

 \begin{equation}
\frac{\partial L}{\partial x} + \lambda \frac{\partial f}{\partial
  x} = 0 \qquad \lambda \geq 0.
\end{equation}
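As a concrete check (an illustrative example, not part of the original notes), minimize $L(x) = (x-2)^2$ subject to $f(x) = x - 1 \leq 0$. The unconstrained minimum $x = 2$ is infeasible, so the constraint is active and $x_0 = 1$. There

 \begin{displaymath}
\left.\frac{\partial L}{\partial x}\right|_{x_0} = 2(x_0 - 2) = -2, \qquad
\left.\frac{\partial f}{\partial x}\right|_{x_0} = 1,
\end{displaymath}

so condition (1) holds with $\lambda = 2 \geq 0$, and the two derivatives indeed have opposite signs.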

We can create a new objective function


 \begin{displaymath}
J(x,\lambda) = L(x) + \lambda f(x),
\end{displaymath}

so the necessary conditions become


 \begin{displaymath}
\frac{\partial J}{\partial x} = 0
\end{displaymath}

and


 \begin{displaymath}
f(x) \leq 0
\end{displaymath}

where


 \begin{displaymath}
\lambda
\begin{cases}
  \geq 0 & f(x) = 0 \qquad \text{constraint is active} \\
  = 0 & f(x) < 0 \qquad \text{constraint is inactive}.
\end{cases}
\end{displaymath}
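These two cases can be verified numerically. The sketch below is my own illustration (the function name and the quadratic example are not from the notes): it solves $\min (x-2)^2$ subject to $x \leq c$ in closed form, then checks stationarity of $J$, the sign condition on $\lambda$, and complementary slackness for both an active ($c = 1$) and an inactive ($c = 3$) constraint.

```python
def solve_kkt(c):
    """Minimize L(x) = (x - 2)**2 subject to f(x) = x - c <= 0.

    Returns (x0, lam): the minimizer and the Kuhn-Tucker multiplier.
    """
    if c >= 2.0:
        # Constraint inactive: the unconstrained minimum x = 2 is feasible,
        # so lambda = 0.
        return 2.0, 0.0
    # Constraint active: x0 = c, and lambda solves the stationarity
    # condition L'(x0) + lam * f'(x0) = 0, i.e. 2*(c - 2) + lam = 0.
    return c, 2.0 * (2.0 - c)

for c in (1.0, 3.0):
    x0, lam = solve_kkt(c)
    dL = 2.0 * (x0 - 2.0)               # L'(x0)
    df = 1.0                            # f'(x0)
    assert abs(dL + lam * df) < 1e-12   # stationarity of J = L + lam*f
    assert lam >= 0.0                   # multiplier sign condition
    assert abs(lam * (x0 - c)) < 1e-12  # complementary slackness
    print(c, x0, lam)                   # 1.0 1.0 2.0, then 3.0 2.0 0.0
```

With $c = 1$ the constraint binds and $\lambda = 2 > 0$; with $c = 3$ it is slack and $\lambda = 0$, exactly matching the two cases above.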

For a vector variable $\mathbf{x}$, condition (1) means:


 \begin{displaymath}
\frac{\partial L}{\partial x} \text{ is parallel to } \frac{\partial
  f}{\partial x} \text{ and pointing in opposite directions},
\end{displaymath}

where $\frac{\partial L}{\partial x}$ is interpreted as the gradient.

In words, condition (1) says: at a minimum, the gradient of $L$ with respect to $x$ must point in such a way that any further decrease of $L$ can come only by violating the constraint. Otherwise, we could decrease $L$ further. This is the essence of the Kuhn-Tucker condition.
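The anti-parallel gradient picture can also be checked numerically. In this sketch (my own example, assuming NumPy is available), we minimize $L(\mathbf{x}) = \|\mathbf{x} - \mathbf{a}\|^2$ subject to $f(\mathbf{x}) = \|\mathbf{x}\|^2 - 1 \leq 0$ with $\|\mathbf{a}\| > 1$; the minimizer is the projection $\mathbf{x}_0 = \mathbf{a}/\|\mathbf{a}\|$ onto the unit ball, and the two gradients come out exactly opposed.

```python
import numpy as np

a = np.array([3.0, 4.0])        # ||a|| = 5 > 1, so the constraint is active
x0 = a / np.linalg.norm(a)      # projection of a onto the unit ball

grad_L = 2.0 * (x0 - a)         # gradient of L(x) = ||x - a||^2 at x0
grad_f = 2.0 * x0               # gradient of f(x) = ||x||^2 - 1 at x0

# Stationarity: grad_L + lam * grad_f = 0 with lam = ||a|| - 1 >= 0.
lam = np.linalg.norm(a) - 1.0
assert np.allclose(grad_L + lam * grad_f, 0.0)

# The gradients are parallel and point in opposite directions: cosine = -1.
cos = grad_L @ grad_f / (np.linalg.norm(grad_L) * np.linalg.norm(grad_f))
print(round(float(cos), 6))     # -1.0
```

Here $\lambda = \|\mathbf{a}\| - 1 = 4$, and the cosine of $-1$ confirms that $\nabla L$ and $\nabla f$ are parallel with opposite directions, as stated above.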

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). The Gaussian Channel. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture11_2.htm. This work is licensed under a Creative Commons License.