
# More on Random Variables



## Some important inequalities

In general, when we observe an outcome of a random variable, we ''expect'' it to be near the mean (that is, near the expected value). Further, the farther the outcome is from the mean, the less likely we expect the outcome to be. There are some very useful inequalities which quantify these intuitive ''expectations'': the Markov inequality, and its consequences, the Chebyshev inequality and the Chernoff bound. We will introduce these here.

Let $I_B(x) = \begin{cases} 1, & x \in B \\ 0, & x \notin B \end{cases}$ for a Borel set $B$. Recall that $E[I_B(X)] = P(X \in B)$.

Let $X$ be a random variable, and let $Y = I_B(X)$. The indicator $I_B$ is a measurable function, so $Y$ is another random variable, and $E[Y] = P(X \in B)$.
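The identity $E[I_B(X)] = P(X \in B)$ can be checked directly. The sketch below uses a small hypothetical discrete distribution, chosen only for illustration:

```python
# 'Expectation as probability': E[I_B(X)] = P(X in B), checked on a small
# hypothetical discrete distribution (values chosen only for illustration).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}   # P(X = x)
B = {1, 2}                        # the event {X in B}

def I_B(x):
    """Indicator function of the set B."""
    return 1 if x in B else 0

expectation = sum(I_B(x) * p for x, p in pmf.items())  # E[I_B(X)]
prob = sum(p for x, p in pmf.items() if x in B)        # P(X in B)
assert abs(expectation - prob) < 1e-12                 # both equal 0.8
```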

We will use this ''expectation as probability'' idea to get a bound.

Suppose $g$ is a nonnegative, nondecreasing function. Suppose $\varepsilon > 0$ with $g(\varepsilon) > 0$. Consider the function

$$f(x) = I_{[\varepsilon, \infty)}(x).$$

Observe that

$$g(x) \ge g(\varepsilon) f(x)$$

for all $x$, since $g(x) \ge g(\varepsilon)$ when $x \ge \varepsilon$ (because $g$ is nondecreasing), while $g(x) \ge 0 = g(\varepsilon) f(x)$ when $x < \varepsilon$.

Note that $E[f(X)] = P(X \ge \varepsilon)$.

Now we have

$$E[g(X)] \ge E[g(\varepsilon) f(X)].$$

Also,

$$E[g(\varepsilon) f(X)] = g(\varepsilon) E[f(X)] = g(\varepsilon) P(X \ge \varepsilon).$$

Thus

$$P(X \ge \varepsilon) \le \frac{E[g(X)]}{g(\varepsilon)}.$$

A similar result can be established if $g$ is nonnegative, symmetric about 0, and nondecreasing on $[0, \infty)$. We can thus establish that

$$P(|X| \ge \varepsilon) \le \frac{E[g(X)]}{g(\varepsilon)}.$$

Special case: Assume $X \ge 0$, and let $g(x) = x$.

The inequality above gives rise to the Markov inequality:

$$P(X \ge \varepsilon) \le \frac{E[X]}{\varepsilon}$$

for all $\varepsilon > 0$. Somewhat more generally, the Markov inequality says

$$P(|X| \ge \varepsilon) \le \frac{E[|X|^k]}{\varepsilon^k}$$

for any $k > 0$.
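Here is a minimal numerical check of the Markov inequality, assuming for illustration that $X$ is Exponential(1), so the tail $P(X \ge \varepsilon) = e^{-\varepsilon}$ and mean $E[X] = 1$ are known in closed form:

```python
import math

# Markov inequality P(X >= eps) <= E[X]/eps, checked for X ~ Exponential(1)
# (an illustrative choice with exact tail e^{-eps} and mean 1).
mean = 1.0
for eps in [1.5, 2.0, 5.0, 10.0]:
    tail = math.exp(-eps)    # exact P(X >= eps)
    bound = mean / eps       # Markov bound
    assert tail <= bound
```

The bound is loose here (e.g., at $\varepsilon = 5$ the true tail is about $0.0067$ versus the bound $0.2$), which is typical: Markov uses only the mean.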

Special Case: Take $g(x) = x^2$. (This satisfies the requirements for $g$: nonnegative, symmetric about 0, and nondecreasing on $[0, \infty)$.) Then

$$P(|X| \ge \varepsilon) \le \frac{E[X^2]}{\varepsilon^2}.$$

Let $Y = X - E[X]$. We obtain the Chebyshev inequality,

$$P(|X - E[X]| \ge \varepsilon) \le \frac{\operatorname{var}(X)}{\varepsilon^2}.$$

Interpretation: The probability that $X$ differs from its mean by more than some amount $\varepsilon$ is at most the variance of $X$ over $\varepsilon^2$. Farther away, less probable. Higher variance, more probable.
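The Chebyshev inequality can also be checked exactly, again assuming an Exponential(1) distribution (mean 1, variance 1) purely for illustration:

```python
import math

# Chebyshev inequality P(|X - E[X]| >= eps) <= var(X)/eps^2, checked for
# X ~ Exponential(1), which has mean 1 and variance 1 (illustrative choice).
mean, var = 1.0, 1.0
for eps in [1.5, 2.0, 3.0]:
    # P(|X - 1| >= eps) = P(X >= 1 + eps) + P(X <= 1 - eps);
    # the second term is 0 here since eps > 1 and X >= 0.
    tail = math.exp(-(1.0 + eps))
    assert tail <= var / eps**2
```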

Special case: The Chernoff bound. In this case, let us take $s$ positive, and let $g(x) = e^{sx}$ for $s > 0$. Then we obtain

$$P(X \ge \varepsilon) \le e^{-s\varepsilon} E[e^{sX}].$$

There is some flexibility in the choice of $s$, which may be selected to make the bound as tight as possible.
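The sketch below illustrates optimizing over $s$, assuming $X$ is Exponential(1) so that $E[e^{sX}] = 1/(1-s)$ for $0 < s < 1$; the optimizer $s^* = 1 - 1/a$ is specific to this illustrative distribution (found by calculus):

```python
import math

# Chernoff bound P(X >= a) <= e^{-s a} E[e^{s X}], assuming X ~ Exponential(1)
# so that the MGF is E[e^{sX}] = 1/(1 - s) for 0 < s < 1 (illustrative choice).
def chernoff_bound(a, s):
    return math.exp(-s * a) / (1.0 - s)

a = 5.0
s_star = 1.0 - 1.0 / a                 # minimizer of the bound for this MGF
best = chernoff_bound(a, s_star)       # = a * e^{-(a - 1)}, about 0.092
true_tail = math.exp(-a)               # exact tail P(X >= a), about 0.0067

assert true_tail <= best               # the bound holds
assert best <= chernoff_bound(a, 0.5)  # optimized s beats an arbitrary s
```

Note how much tighter this is than Markov or Chebyshev for the same tail: the Chernoff bound decays exponentially in $a$ rather than polynomially.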

The Chernoff bound is a powerful tool which has been put to good use in digital communications. (See, e.g., Wozencraft and Jacobs.)

## Jensen's inequality

Jensen's inequality can be used in some cases to ''interchange'' expectation and function evaluation (at least, approximately). It is based on the idea of convex functions.

A function $g$ is convex if

$$g(\lambda x + (1 - \lambda) y) \le \lambda g(x) + (1 - \lambda) g(y)$$

for all $x, y$ and all $\lambda \in [0, 1]$. That is, a function is convex if the chord connecting the points $(x, g(x))$ and $(y, g(y))$ lies above the function between $x$ and $y$. (Draw a picture.) It can be shown that if $g$ is twice differentiable then $g$ is convex iff $g''(x) \ge 0$.

It can be shown that all convex functions are measurable with respect to the Borel field.

Theorem 2   If $g$ is convex, then

$$E[g(X)] \ge g(E[X]),$$

with equality (for strictly convex $g$) if and only if $X$ is a constant.

In other words, we can interchange expectation and function evaluation, and at least get a bound.
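A quick check of Jensen's inequality, using the convex function $g(x) = x^2$ on a small hypothetical discrete distribution (values chosen only for illustration):

```python
# Jensen's inequality E[g(X)] >= g(E[X]) for convex g, checked with
# g(x) = x^2 on a small hypothetical discrete distribution.
pmf = {-1.0: 0.25, 0.0: 0.25, 2.0: 0.5}

def g(x):
    return x * x              # convex: g''(x) = 2 >= 0

mean = sum(x * p for x, p in pmf.items())    # E[X]    = 0.75
Eg = sum(g(x) * p for x, p in pmf.items())   # E[g(X)] = 2.25
assert Eg >= g(mean)                         # 2.25 >= 0.5625
```

The gap $E[X^2] - (E[X])^2$ is exactly the variance, which is one way to remember the direction of the inequality for this particular $g$.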

## Cauchy-Schwarz inequality

This is an inequality that holds in any Hilbert space. It more or less forms the theme for the first several weeks of 6030.

Theorem 3   If $E[X^2] < \infty$ and $E[Y^2] < \infty$, then

$$|E[XY]| \le \sqrt{E[X^2]\, E[Y^2]}.$$

For example,

$$|E[(X - E[X])(Y - E[Y])]| \le \sqrt{\operatorname{var}(X)\operatorname{var}(Y)},$$

implying $|\rho| \le 1$ for the correlation coefficient $\rho = \operatorname{cov}(X, Y)/\sqrt{\operatorname{var}(X)\operatorname{var}(Y)}$.

Observe that $E[|X|] \le \sqrt{E[X^2]}$, which can be obtained using either the Cauchy-Schwarz inequality (take $Y = 1$) or Jensen's inequality (apply the convex function $g(x) = x^2$ to $|X|$).
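The Cauchy-Schwarz inequality can be verified on a small hypothetical joint pmf (the values below are chosen only for illustration):

```python
# Cauchy-Schwarz: (E[XY])^2 <= E[X^2] E[Y^2], checked on a small hypothetical
# joint pmf over pairs (x, y).
joint = {(-1, 1): 0.2, (0, 0): 0.3, (1, 1): 0.3, (2, -1): 0.2}

Exy = sum(x * y * p for (x, y), p in joint.items())  # E[XY]  = -0.3
Ex2 = sum(x * x * p for (x, y), p in joint.items())  # E[X^2] = 1.3
Ey2 = sum(y * y * p for (x, y), p in joint.items())  # E[Y^2] = 0.7
assert Exy**2 <= Ex2 * Ey2                           # 0.09 <= 0.91
```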

Copyright 2008, by the Contributing Authors. admin. (2006, May 31). More on Random Variables. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/lec2_7.html. This work is licensed under a Creative Commons License.