# The Gaussian Channel


## Kuhn-Tucker Conditions

Before proceeding with the next section, we need a result from constrained optimization theory known as the Kuhn-Tucker condition.

Suppose we are minimizing some convex objective function $L(x)$, subject to a constraint

$$x \geq 0.$$

Let the optimal value of $x$ be $x_0$. Then either the constraint is inactive, in which case we get

$$\left.\frac{\partial L}{\partial x}\right|_{x = x_0} = 0,$$

or, if the constraint is active, it must be the case that the objective function increases for all *admissible* values of $x$:

$$L(x) \geq L(x_0) \quad \text{for all } x \in \mathcal{X},$$

where $\mathcal{X}$ is the set of admissible values, for which $x \geq 0$. (Think about what happens if this is not the case.) Thus,

$$\left.\frac{\partial L}{\partial x}\right|_{x = x_0} \geq 0 \quad \text{if } x_0 = 0,$$

or

$$\left.\frac{\partial L}{\partial x}\right|_{x = x_0}
\begin{cases}
= 0 & \text{if } x_0 > 0, \\
\geq 0 & \text{if } x_0 = 0.
\end{cases} \tag{1}$$

We can create a new objective function

$$F(x) = L(x) - \lambda x,$$

so the necessary conditions become

$$\frac{\partial F}{\partial x} = \frac{\partial L}{\partial x} - \lambda = 0$$

and

$$\lambda x_0 = 0,$$

where $\lambda \geq 0$.
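The scalar conditions can be checked numerically. The following is a minimal sketch, not from the text: the example function $L(x) = (x - a)^2$ and the helper `kt_check` are assumptions chosen so that the constraint $x \geq 0$ is inactive for $a > 0$ and active for $a < 0$.

```python
# Sketch: verify the scalar Kuhn-Tucker conditions for the assumed example
# L(x) = (x - a)^2, minimized subject to x >= 0.
def kt_check(a):
    # Constrained minimizer: the unconstrained minimum x = a, clipped to x >= 0.
    x0 = max(a, 0.0)
    dL = 2.0 * (x0 - a)   # dL/dx evaluated at x0
    lam = dL              # choosing lambda = dL/dx at x0 makes dF/dx = dL/dx - lambda = 0
    return x0, dL, lam

# Inactive constraint (a = 2): x0 > 0, so dL/dx = 0 and lambda = 0.
x0, dL, lam = kt_check(2.0)
assert x0 == 2.0 and dL == 0.0 and lam * x0 == 0.0

# Active constraint (a = -1): x0 = 0, dL/dx = 2 >= 0, and lambda * x0 = 0.
x0, dL, lam = kt_check(-1.0)
assert x0 == 0.0 and dL >= 0.0 and lam * x0 == 0.0
```

In both cases the complementary-slackness product $\lambda x_0$ vanishes: either the multiplier is zero (inactive constraint) or the variable is zero (active constraint).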

For a vector variable $\mathbf{x}$, condition (1) means:

$$\left.\frac{\partial L}{\partial x_i}\right|_{\mathbf{x} = \mathbf{x}_0}
\begin{cases}
= 0 & \text{if } x_{0,i} > 0, \\
\geq 0 & \text{if } x_{0,i} = 0,
\end{cases}$$

where $\partial L / \partial x_i$ is interpreted as the $i$th component of the gradient $\nabla L$.
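The componentwise form can be illustrated the same way. This is a sketch under assumed data: the separable objective $L(\mathbf{x}) = \sum_i (x_i - a_i)^2$ and the particular vector `a` are not from the text.

```python
# Sketch: componentwise Kuhn-Tucker check for the assumed objective
# L(x) = sum_i (x_i - a_i)^2, minimized subject to x_i >= 0 for every i.
a = [3.0, -1.0, 0.5, -2.0]
x0 = [max(ai, 0.0) for ai in a]                    # constrained minimizer: clip each component at 0
grad = [2.0 * (xi - ai) for xi, ai in zip(x0, a)]  # gradient of L evaluated at x0

for xi, gi in zip(x0, grad):
    if xi > 0:
        assert abs(gi) < 1e-12   # inactive component: partial derivative is 0
    else:
        assert gi >= 0.0         # active component: partial derivative is >= 0
```

Each component behaves like an independent scalar problem here because the objective is separable; the general vector condition holds even when it is not.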

In words, what condition (1) says is: *the gradient of $L$ with respect to $x$ at a minimum must point in such a way that any decrease of $L$ can come only by violating the constraints.* Otherwise, we could decrease $L$ further. This is the essence of the Kuhn-Tucker condition.