
The Gaussian Channel


Definitions   ::   Band-limited   ::   Kuhn-Tucker   ::   Parallel   ::   Colored Noise

Kuhn-Tucker Conditions

Before proceeding with the next section, we need a result from constrained optimization theory known as the Kuhn-Tucker condition.

Suppose we are minimizing some convex objective function $L(x)$,

subject to the constraint

$$x \ge 0.$$

Let the optimal value of $x$ be $x_0$. Then either the constraint is inactive, in which case we get

$$\left.\frac{\partial L}{\partial x}\right|_{x = x_0} = 0,$$

or, if the constraint is active ($x_0 = 0$), it must be the case that the objective function increases for all admissible values of $x$:

$$L(x) \ge L(x_0) \quad \text{for all } x \in \mathcal{X},$$

where $\mathcal{X}$ is the set of admissible values, for which $x \ge 0$.

(Think about what happens if this is not the case: there would be an admissible $x$ with a smaller value of $L$, contradicting the optimality of $x_0$.) Thus,

$$\left.\frac{\partial L}{\partial x}\right|_{x = x_0} = 0 \quad \text{if } x_0 > 0,$$

or

$$\left.\frac{\partial L}{\partial x}\right|_{x = x_0} \ge 0 \quad \text{if } x_0 = 0. \tag{1}$$
We can create a new objective function

$$J(x) = L(x) - \lambda x,$$

so the necessary conditions become

$$\left.\frac{\partial J}{\partial x}\right|_{x = x_0} = 0$$

and

$$\lambda x_0 = 0,$$

where

$$\lambda \ge 0.$$
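The multiplier form can be checked numerically on the same assumed example, $L(x) = (x - a)^2$ with $x \ge 0$: stationarity of $J$ recovers $\lambda$ as the derivative of $L$ at $x_0$, and in both the active and inactive cases $\lambda \ge 0$ with $\lambda x_0 = 0$:

```python
# Check the Lagrangian form J(x) = L(x) - lambda*x on the assumed example
# L(x) = (x - a)^2, x >= 0 (illustrative only). Stationarity of J gives
# lambda = dL/dx at x0; the remaining conditions are lambda >= 0 and
# complementary slackness lambda * x0 = 0.

def dL(x, a):
    return 2 * (x - a)

for a in (1.0, -1.0):
    x0 = max(a, 0.0)      # constrained minimizer of (x - a)^2 over x >= 0
    lam = dL(x0, a)       # multiplier recovered from dJ/dx = dL/dx - lam = 0
    assert lam >= 0       # sign condition on the multiplier
    assert lam * x0 == 0  # complementary slackness: only one can be nonzero
    print(a, x0, lam)
```

Note how complementary slackness encodes the two cases of condition (1) in a single equation: either $x_0 > 0$ and $\lambda = 0$, or $x_0 = 0$ and $\lambda$ is free to be positive.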

For a vector variable $x$, condition (1) means:

$$\left.\frac{\partial L}{\partial x_i}\right|_{x = x_0} = 0 \quad \text{if } x_{0,i} > 0, \qquad \left.\frac{\partial L}{\partial x_i}\right|_{x = x_0} \ge 0 \quad \text{if } x_{0,i} = 0,$$

where $\partial L / \partial x$ is interpreted as the gradient $\nabla L$.
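The componentwise conditions can be verified on an assumed vector example, $L(x) = \lVert x - a \rVert^2$ with $x \ge 0$ componentwise (again illustrative, not from the lecture):

```python
import numpy as np

# Componentwise Kuhn-Tucker check for the assumed vector objective
# L(x) = ||x - a||^2 subject to x >= 0 (illustrative only).
a = np.array([2.0, -1.0, 0.5, -3.0])
x0 = np.maximum(a, 0.0)   # constrained minimizer, taken componentwise
grad = 2 * (x0 - a)       # gradient of L evaluated at x0

for i in range(len(a)):
    if x0[i] > 0:
        # Inactive component: the partial derivative must vanish.
        assert abs(grad[i]) < 1e-12
    else:
        # Active component: the partial derivative must be nonnegative.
        assert grad[i] >= 0
print(x0, grad)
```

Each component behaves exactly like the scalar case: the gradient vanishes where the constraint is slack and is nonnegative where it binds.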

In words, condition (1) says: the gradient of $L$ with respect to $x$ at a minimum must point in such a direction that any decrease of $L$ can come only by violating the constraints. Otherwise, we could decrease $L$ further. This is the essence of the Kuhn-Tucker condition.

Copyright 2008, by the Contributing Authors. admin. (2006, May 17). The Gaussian Channel. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Information_Theory/lecture11_2.htm. This work is licensed under a Creative Commons License.