
Maximum Entropy Estimation

The concept of entropy has been applied to estimation problems. Estimation is the art and science of computing a value when incomplete information is available. It is the incompleteness that makes the concept of maximum entropy useful. For many estimation problems, it is necessary to make assumptions about values which are not explicitly available. One possible choice is to assume that the values are such that the entropy is maximized.

Suppose we want to maximize $h(f)$ over all densities supported on a set $S$ and satisfying the constraints

\begin{displaymath}
f(x) \geq 0, \qquad \int_S f(x)\,dx = 1, \qquad
\int_S f(x) r_i(x)\,dx = \alpha_i, \quad i = 1, \ldots, m.
\end{displaymath}

We can "pseudo-solve" this as follows. Let

 

 \begin{displaymath}
J(f) = -\int f \ln f + \lambda_0 \int f + \sum_{i=1}^m \lambda_i
\int f r_i.
\end{displaymath}

 

"Differentiate w.r.t. f ( x )" (this is the part we are skimping on) and equate to zero:

 

 \begin{displaymath}
\frac{\partial J}{\partial f(x)} = -\ln f(x) - 1  + \lambda_0 +
\sum_{i=1}^m \lambda_i r_i(x) = 0
\end{displaymath}

 

which leads to

 

 \begin{displaymath}
f(x) = e^{\lambda_0 - 1 + \sum_{i=1}^m \lambda_i r_i(x)}
\end{displaymath}

 

where the Lagrange multipliers are chosen to make f satisfy the constraints.
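
In practice the multipliers usually must be found numerically. The following is a minimal sketch (not from the lecture) for the discrete analogue of this problem: it finds the multipliers by minimizing the convex dual $\ln Z(\lambda) - \sum_i \lambda_i \alpha_i$, where $Z$ is the normalizing constant. The function name \texttt{max\_entropy\_pmf} and the SciPy-based approach are illustrative choices, not part of the original notes.

\begin{verbatim}
# Maximum-entropy pmf on a finite support via the convex dual.
# Sketch only: p_i is proportional to exp(sum_k lambda_k r_k(x_i)),
# and the multipliers minimize log Z(lambda) - lambda . alpha.
import numpy as np
from scipy.optimize import minimize

def max_entropy_pmf(x, constraint_fns, targets):
    # R[i, k] = r_k(x_i): constraint functions evaluated on the support
    R = np.array([[r(xi) for r in constraint_fns] for xi in x])
    alpha = np.asarray(targets, dtype=float)

    def dual(lam):
        logits = R @ lam
        logZ = np.log(np.exp(logits).sum())
        return logZ - lam @ alpha          # convex in lam

    lam = minimize(dual, x0=np.zeros(len(constraint_fns))).x
    logits = R @ lam
    p = np.exp(logits - logits.max())      # stabilized exponentiation
    return p / p.sum(), lam

# Six-sided die constrained to have mean 4.5 (cf. the dice example below)
p, lam = max_entropy_pmf(np.arange(1, 7), [lambda v: v], [4.5])
print(p, p @ np.arange(1, 7))              # pmf tilts toward high faces
\end{verbatim}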

To show that this actually works, we will use an inequality approach. Let g be a density that satisfies the constraints. Then

 

\begin{displaymath}\begin{aligned}
h(g) &= -\int_S g \ln g \\
&= -\int_S g \ln\left(\frac{g}{f}\, f\right) \\
&= -D(g\|f) - \int_S g \ln f \\
&\leq -\int_S g \ln f \\
&= -\int_S g \Bigl(\lambda_0 - 1 + \sum_{i=1}^m \lambda_i r_i\Bigr) \\
&= -\int_S f \Bigl(\lambda_0 - 1 + \sum_{i=1}^m \lambda_i r_i\Bigr) \\
&= -\int_S f \ln f = h(f).
\end{aligned}\end{displaymath}

The inequality is just $D(g\|f) \geq 0$, and the next-to-last equality holds because $g$ and $f$ satisfy the same constraints, so $\int_S g = \int_S f$ and $\int_S g\, r_i = \int_S f\, r_i$ for each $i$.

 


\begin{example}
Suppose we have $EX = 0$ and $EX^2 = \sigma^2$. Then the
distribution has the form
\begin{displaymath}
f(x) = e^{\lambda_0 - 1 + \lambda_1 x + \lambda_2 x^2}.
\end{displaymath}
After finding the constants, we recognize the normal distribution.
\end{example}
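
As a worked version of the constant-finding step (filled in here; the notes leave it implicit): integrability forces $\lambda_2 < 0$, and the resulting Gaussian has mean $-\lambda_1/(2\lambda_2)$, so $EX = 0$ forces $\lambda_1 = 0$. Matching $EX^2 = \sigma^2$ gives $\lambda_2 = -1/2\sigma^2$, and $e^{\lambda_0 - 1}$ normalizes the density:

\begin{displaymath}
f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/2\sigma^2}.
\end{displaymath}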

\begin{example}
(Weighted dice) Suppose we have a six-sided die with $EX = \sum_{i=1}^6
i\, p_i = \alpha$. The maximum-entropy pmf has the form
\begin{displaymath}
p_i = \frac{e^{\lambda i}}{\sum_{j=1}^6 e^{\lambda j}},
\end{displaymath}
with $\lambda$ chosen to satisfy the mean constraint. If the die is
rolled $n$ times, a macrostate $(n_1,\ldots,n_6)$ can occur in
$n!/(n_1! \cdots n_6!)$ ways, and by Stirling's approximation this
multiplicity grows as $e^{nH(p)}$ with $p_i = n_i/n$, so maximizing the
multiplicity subject to the observed mean is exactly this
maximum-entropy problem. The most
probable macrostate is therefore $(np_1,np_2,\ldots, np_6)$.
\end{example}
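
(The numerical sketch above solves exactly this problem: for a target mean of $4.5$ it returns a pmf of the form $p_i \propto e^{\lambda i}$ with $\lambda > 0$, tilted toward the high faces.)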

\begin{example}
Let $S = [0,\infty)$ and let $EX = \mu$. Then the
entropy-maximizing density is the exponential,
\begin{displaymath}
f(x) = \frac{1}{\mu} e^{-x/\mu} \qquad x \geq 0.
\end{displaymath}\end{example}
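
Again filling in the constant-finding step (not in the original): the density must have the form $f(x) = e^{\lambda_0 - 1 + \lambda_1 x}$ on $[0,\infty)$, which is integrable only if $\lambda_1 < 0$. Normalization gives $e^{\lambda_0 - 1} = -\lambda_1$, and the mean constraint gives

\begin{displaymath}
EX = -\frac{1}{\lambda_1} = \mu \quad\Longrightarrow\quad \lambda_1 = -\frac{1}{\mu},
\end{displaymath}

which yields the exponential density above.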
Copyright 2008, Todd Moon. This work is licensed under a Creative Commons License.