
More on Random Variables


Some important inequalities

In general, when we observe an outcome of a random variable, we ``expect'' it to be near the mean (that is, near the expected value). Further, the farther the outcome is from the mean, the less likely we expect that outcome to be. There are some very useful inequalities which quantify these intuitive ``expectations'': the Markov inequality and its consequences, the Chebyshev inequality and the Chernoff bound. We introduce these here.

Let $B \in \Bc$, where $\Bc$ is the Borel $\sigma$-field. Recall that

\begin{displaymath}I_B(x) = \begin{cases}
1 & x \in B \\
0 & x \notin B.
\end{cases}\end{displaymath}

Let $X$ be a random variable, and let $Y = I_B(X)$ . This is a measurable function, so $Y$ is another random variable.

\begin{displaymath}E[Y] = P_X(B) = P(X \in B).\end{displaymath}

We will use this ``expectation as probability'' idea to get a bound.
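The identity $E[I_B(X)] = P(X \in B)$ is easy to check by simulation. A minimal sketch, assuming $X$ is standard normal and $B = [1,\infty)$ (both illustrative choices, not from the text):

```python
import random
import statistics

random.seed(0)

# Simulate X ~ N(0, 1) and take B = [1, infinity).
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Y = I_B(X): 1 when the outcome lands in B, 0 otherwise.
indicator = [1.0 if x >= 1.0 else 0.0 for x in samples]

# The sample mean of Y estimates E[Y] = P(X in B),
# which for N(0, 1) is the tail probability P(X >= 1) = 0.1587.
print(statistics.mean(indicator))
```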

Suppose $g$ is a nonnegative, nondecreasing function, and suppose $b \in \Rbb$ with $g(b) > 0$. Consider the function

\begin{displaymath}h(x) = \frac{g(x)}{g(b)}, \quad\text{which satisfies}\quad h(x) \begin{cases}
\geq 1 & x \geq b \\
\geq 0 & \text{for all } x.
\end{cases}\end{displaymath}

Observe that

\begin{displaymath}h(x) \geq I_{[b,\infty)}(x)\end{displaymath}

for all $x$ , since

\begin{displaymath}I_{[b,\infty)}(x) = \begin{cases}
1 & x \geq b \\
0 & x < b.
\end{cases}\end{displaymath}

Note that $h(b) = I_{[b,\infty)}(b)$ .

Now we have

\begin{displaymath}E[h(X)] \geq E[I_{[b,\infty)}(X)] = P(X \geq b).\end{displaymath}

Since

\begin{displaymath}E[h(X)] = \frac{E[g(X)]}{g(b)},\end{displaymath}

it follows that

\begin{displaymath}P(X \geq b) \leq \frac{E[g(X)]}{g(b)}.\end{displaymath}

A similar result can be established if $g$ is nonnegative, nondecreasing on $[0,\infty)$ and symmetric about 0. We can thus establish that

\begin{displaymath}\boxed{P(\vert X\vert \geq b) \leq \frac{E[g(X)]}{g(b)}.}\end{displaymath}

Special case: Assume $X \geq 0$, and let

\begin{displaymath}g(x) = \begin{cases}
x & x \geq 0 \\
0 & x < 0.
\end{cases}\end{displaymath}

The inequality above gives rise to the Markov inequality:

\begin{displaymath}\boxed{ P(X \geq b) \leq \frac{E[X]}{b}}\end{displaymath}

for all $b > 0$ . Somewhat more generally, the Markov inequality says

\begin{displaymath}\boxed{ P(\vert X\vert \geq b) \leq \frac{E[\vert X\vert]}{b}}\end{displaymath}

for any $b > 0$ .
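The Markov inequality is easy to check empirically. A minimal sketch, assuming $X$ is exponential with mean 1 (an illustrative choice; any nonnegative random variable would do):

```python
import random

random.seed(1)

# X ~ Exponential(rate=1), so X >= 0 and E[X] = 1.
samples = [random.expovariate(1.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)

for b in (1.0, 2.0, 4.0):
    tail = sum(x >= b for x in samples) / len(samples)  # estimate of P(X >= b)
    bound = mean / b                                     # Markov bound E[X]/b
    print(f"b={b}: P(X>=b) approx {tail:.4f} <= {bound:.4f}")
```

The bound is loose here (the true tail of the exponential decays much faster than $1/b$), which is typical: Markov trades sharpness for generality.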

Special case: Take $g(x) = x^2$. (This satisfies the requirements on $g$.) Then for a random variable $Y$,

\begin{displaymath}P(\vert Y\vert > b) \leq \frac{E[Y^2]}{b^2}.\end{displaymath}

Let $Y = X-\mu_x$. We obtain the Chebyshev inequality,

\begin{displaymath}\boxed{P(\vert X-\mu_x\vert > b) \leq \frac{E[(X-\mu_x)^2]}{b^2} = \frac{\sigma_x^2}{b^2}.}\end{displaymath}

Interpretation: The probability that $X$ differs from its mean by more than some amount $b$ is at most the variance of $X$ divided by $b^2$. The farther from the mean, the less probable the deviation; the higher the variance, the more probable.
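A minimal numerical sketch of the Chebyshev bound, assuming $X$ is uniform on $[0,1]$ (an illustrative choice, with mean $1/2$ and variance $1/12$):

```python
import random
import statistics

random.seed(2)

# X ~ Uniform(0, 1): mean 0.5, variance 1/12.
samples = [random.random() for _ in range(100_000)]
mu = statistics.mean(samples)
var = statistics.pvariance(samples)

for b in (0.25, 0.4):
    # Estimate P(|X - mu| > b) and compare to the Chebyshev bound var/b^2.
    dev = sum(abs(x - mu) > b for x in samples) / len(samples)
    bound = var / b**2
    print(f"b={b}: P(|X-mu|>b) approx {dev:.4f} <= {bound:.4f}")
```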

Special case: The Chernoff bound. In this case, let us take $X$ positive, and let $g(x) = e^{sx}$ for $s > 0$. Then we obtain

\begin{displaymath}P(X \geq b) \leq \frac{E[e^{sX}]}{e^{sb}}.\end{displaymath}

There is some flexibility in the choice of $s$ , which may be selected to make the bound as tight as possible.
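To illustrate the optimization over $s$, here is a sketch assuming $X$ is exponential with rate 1 (an illustrative choice), for which $E[e^{sX}] = 1/(1-s)$ for $0 < s < 1$ and the exact tail $P(X \geq b) = e^{-b}$ is available for comparison:

```python
import math

# X ~ Exponential(1): E[e^{sX}] = 1/(1 - s) for 0 < s < 1,
# and the exact tail is P(X >= b) = e^{-b}.
b = 5.0
true_tail = math.exp(-b)

# Chernoff bound E[e^{sX}]/e^{sb}, minimized over a grid of s in (0, 1)
# to make the bound as tight as possible.
best = min((1.0 / (1.0 - s)) * math.exp(-s * b)
           for s in [i / 100 for i in range(1, 100)])

print(f"true tail {true_tail:.5f}, optimized Chernoff bound {best:.5f}")
```

The optimized bound decays exponentially in $b$, unlike the Markov and Chebyshev bounds, which is why the choice of $s$ matters.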

The Chernoff bound is a powerful tool which has been put to good use in digital communications. (See, e.g., Wozencraft and Jacobs.)

Jensen's inequality

Jensen's inequality can be used in some cases to ``interchange'' expectation and function evaluation (at least, approximately). It is based on the idea of convex functions.

A function $c:\Rbb \rightarrow \Rbb$ is {\bf convex} if

\begin{displaymath}c(\alpha x + (1-\alpha) y) \leq \alpha c(x) + (1-\alpha) c(y)\end{displaymath}

for all $x, y \in \Rbb$ and $0 \leq \alpha \leq 1$. That is, a function is convex if the chord connecting the points $(x,c(x))$ and $(y,c(y))$ lies above the graph of $c$ between $x$ and $y$. (Draw a picture.) It can be shown that if $c$ is twice differentiable, then $c$ is convex iff $c''(x) \geq 0$ for all $x \in \Rbb$.
\begin{example}The functions $c(x) = e^x$, $c(x) = e^{-x}$, and $c(x) = x^2$ are convex, as is any affine function $c(x) = ax + b$.\end{example}
It can be shown that all convex functions are measurable with respect to the Borel field $\Bc(\Rbb)$ .

Theorem 2   If $c: \Rbb \rightarrow \Rbb$ is convex, then

\begin{displaymath}\boxed{E[c(X)] \geq c(E[X]),}\end{displaymath}

with equality if $c$ is linear or $X$ is constant.

In other words, we can interchange expectation and function evaluation, and at least obtain a bound.

For example: $E[X^2] \geq (E[X])^2$; $E[e^X] \geq e^{E[X]}$; if $X > 0$, $E[1/X] \geq 1/E[X]$; and if $X > 0$, $-E[\log X] \geq -\log E[X]$.
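These consequences of Jensen's inequality hold exactly for the empirical distribution of any positive sample, so they are easy to verify. A sketch assuming lognormal data (an illustrative choice of positive random variable):

```python
import math
import random
import statistics

random.seed(3)

# X > 0: lognormal samples, used to check several Jensen consequences
# on the empirical distribution (where each one holds exactly).
samples = [random.lognormvariate(0.0, 0.5) for _ in range(50_000)]

ex = statistics.mean(samples)                          # E[X]
ex2 = statistics.mean(x * x for x in samples)          # E[X^2]
e_exp = statistics.mean(math.exp(x) for x in samples)  # E[e^X]
e_inv = statistics.mean(1.0 / x for x in samples)      # E[1/X]
e_log = statistics.mean(math.log(x) for x in samples)  # E[log X]

# Each comparison below is an instance of E[c(X)] >= c(E[X]).
print(ex2 >= ex**2, e_exp >= math.exp(ex),
      e_inv >= 1.0 / ex, e_log <= math.log(ex))
# prints: True True True True
```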

Cauchy-Schwarz inequality

This is an inequality that holds in any Hilbert space. It more or less forms the theme for the first several weeks of 6030.

Theorem 3   If $E[X^2] < \infty$ and $E[Y^2] < \infty$ then

\begin{displaymath}\vert E[XY]\vert^2 \leq E[X^2]E[Y^2].\end{displaymath}

For example,

\begin{displaymath}\vert\cov(X,Y)\vert^2 \leq \var(X)\var(Y),\end{displaymath}

implying $\vert\rho\vert \leq 1$ .

Observe that $\var(X) = E[X^2] - (E[X])^2 \geq 0$, using either the Cauchy-Schwarz inequality (with $Y = 1$) or Jensen's inequality.
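The covariance form of the inequality, and the resulting bound $\vert\rho\vert \leq 1$, hold exactly for sample statistics. A sketch assuming a simple correlated pair $Y = 0.7X + \text{noise}$ (an illustrative model):

```python
import math
import random
import statistics

random.seed(4)

# A correlated pair: Y is a noisy linear function of X.
xs = [random.gauss(0.0, 1.0) for _ in range(20_000)]
ys = [0.7 * x + random.gauss(0.0, 0.5) for x in xs]

mx, my = statistics.mean(xs), statistics.mean(ys)
cov = statistics.mean((x - mx) * (y - my) for x, y in zip(xs, ys))
var_x = statistics.pvariance(xs, mu=mx)
var_y = statistics.pvariance(ys, mu=my)

# Cauchy-Schwarz: |cov(X,Y)|^2 <= var(X) var(Y), hence |rho| <= 1.
rho = cov / math.sqrt(var_x * var_y)
print(f"rho = {rho:.3f}, |rho| <= 1: {abs(rho) <= 1.0}")
```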

Copyright 2008, by the Contributing Authors. More on Random Variables. USU OpenCourseWare. This work is licensed under a Creative Commons License.