More on Random Variables



When we say ''expectation,'' we mean ''average,'' the average being roughly what you would think of (i.e., the arithmetic average, as opposed to a median or mode). For a discrete r.v. $X$ , we define the expectation as

\begin{displaymath}\boxed{E[X] = \sum_i x_i p_X(x_i)}\end{displaymath}

For a continuous r.v., we define the expectation as

\begin{displaymath}\boxed{E[X] = \int_{-\infty}^\infty x f_X(x)\,dx}\end{displaymath}
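As a quick numerical sketch of the discrete formula (the fair die here is a hypothetical example, not from the notes), the expectation is just a probability-weighted sum:

```python
# Discrete expectation: E[X] = sum_i x_i * p_X(x_i).
# Hypothetical example: a fair six-sided die, whose mean is 3.5.
values = [1, 2, 3, 4, 5, 6]
pmf = [1 / 6] * 6                      # p_X(x_i) = 1/6 for each face

EX = sum(x * p for x, p in zip(values, pmf))
print(EX)  # ≈ 3.5
```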

Now a bit of technicality regarding integration, which introduces notation commonly used. When you integrate, you are typically doing a Riemann integral:

\begin{displaymath}\int_a^b x f_X(x)\,dx = \lim_{\max_{1 \leq i \leq n-1}(x_{i+1}-x_i)
\rightarrow 0} \sum_{i=1}^{n-1} z_i f_X(z_i) (x_{i+1}-x_i)\end{displaymath}

\begin{displaymath}a = x_1 < x_2 < \cdots < x_n = b \qquad\qquad z_i \in (x_i,x_{i+1})\end{displaymath}

In other words, we break up the interval into little slices and add up the vertical rectangular pieces.
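The same slice-and-sum idea can be carried out numerically. Below is a minimal sketch, using a hypothetical exponential(1) density (true mean 1) and a truncated interval, since any numerical sum must stop somewhere:

```python
import math

# Riemann-sum sketch of E[X] = ∫ x f_X(x) dx for an exponential(1) density.
# The tail is truncated at b = 50, where the remaining mass is negligible.
def f(x):
    return math.exp(-x) if x >= 0 else 0.0

a, b, n = 0.0, 50.0, 200_000
dx = (b - a) / n
# z_i taken as the midpoint of each slice (x_i, x_{i+1})
EX = sum((a + (i + 0.5) * dx) * f(a + (i + 0.5) * dx) * dx for i in range(n))
print(EX)  # ≈ 1.0, the true mean of an exponential(1) r.v.
```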

Another way of writing this is to recognize that

\begin{displaymath}z_i f_X(z_i) (x_{i+1}-x_i) \approx z_i P(x_i < X \leq x_{i+1}) =
z_i(F_X(x_{i+1}) - F_X(x_i))\end{displaymath}

and that in the limit, the approximation becomes exact. Note, however, that this is expressed in terms of the c.d.f., not the p.d.f., and so exists for all random variables, not just continuous ones.

This gives rise to what is known as the Riemann-Stieltjes Integral:

\begin{displaymath}E[X] = \lim_{\max_{1\leq i \leq n-1}(x_{i+1}-x_i) \rightarrow 0}
\sum_{i=1}^{n-1} z_i[F_X(x_{i+1}) - F_X(x_i)].\end{displaymath}

We write the limit as

\begin{displaymath}\int_a^b x\,dF_X(x)\end{displaymath}

This notation ''describes'' continuous, discrete, and mixed cases. That is,

\begin{displaymath}E[X] = \int_{-\infty}^\infty x\,dF_X(x).\end{displaymath}
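To see that the Stieltjes sum really does handle mixed cases, here is a sketch with a hypothetical mixed r.v. (not from the notes): $P(X=0)=1/2$, and otherwise $X \sim \text{Uniform}(0,1)$, so $E[X] = \frac{1}{2}\cdot 0 + \frac{1}{2}\cdot\frac{1}{2} = \frac{1}{4}$. The sum needs only the c.d.f., and the jump at 0 is absorbed automatically:

```python
# Riemann-Stieltjes sketch: E[X] ≈ ∑ z_i [F_X(x_{i+1}) - F_X(x_i)].
# Hypothetical mixed r.v.: P(X = 0) = 1/2, else X ~ Uniform(0,1).
def F(x):
    if x < 0:
        return 0.0
    if x < 1:
        return 0.5 + 0.5 * x      # jump of 1/2 at x = 0, then linear
    return 1.0

a, b, n = -1.0, 2.0, 300_000
dx = (b - a) / n
# z_i taken as the left endpoint of each slice; the same sum covers the
# discrete jump and the continuous part alike.
EX = sum((a + i * dx) * (F(a + (i + 1) * dx) - F(a + i * dx)) for i in range(n))
print(EX)  # ≈ 0.25
```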

We have defined the Riemann-Stieltjes integral in a context of expectation. However, it has a more general definition:

\begin{displaymath}\int_a^b f(x)\,dg(x) = \lim_{\max(x_{i+1}-x_i)\rightarrow 0}
\sum_{i} f(z_i)[g(x_{i+1})-g(x_i)]\end{displaymath}

When $g(x) = x$ , this reduces to the ordinary Riemann integral. Sufficient conditions for existence (either pair suffices):
  • $g(x)$ of bounded variation and $f(x)$ continuous on $[a,b]$ ; or
  • $f(x)$ of bounded variation and $g(x)$ continuous on $[a,b]$ .
The first case covers expectation: $f(x) = x$ is continuous, and the c.d.f. $g = F_X$ is monotone and bounded, hence of bounded variation.

In a directly analogous way we define

\begin{displaymath}\int_{-\infty}^\infty g(x)\,d F_X(x) = \lim \sum_{i=1}^{n-1} g(z_i)
[F_X(x_{i+1}) - F_X(x_i)].\end{displaymath}

Now consider the r.v. $Y = g(X)$ .

\begin{displaymath}E[Y] = \int_{-\infty}^\infty y\,d F_Y(y)\end{displaymath}

Note that $dF_Y(y)$ is the representation of the limiting value

\begin{displaymath}F_Y(y_{i+1}) - F_Y(y_i) = Pr(y_i < Y \leq y_{i+1})
= Pr(g^{-1}(y_i) < X \leq g^{-1}(y_{i+1})) = Pr(x_i < X \leq x_{i+1})\end{displaymath}

(taking $g$ increasing for simplicity, with $x_i = g^{-1}(y_i)$ ), which, in the limit, is equal to $dF_X(x)$ when $y = g(x)$ . Thus

\begin{displaymath}\boxed{E[Y] = \int_{-\infty}^\infty y\,d F_Y(y) =
\int_{-\infty}^\infty g(x)\,dF_X(x)}\end{displaymath}

Let us put this in more familiar terms: If $Y = g(X)$ , then
\begin{displaymath}\boxed{E[Y] = \int_{-\infty}^\infty g(x) f_X(x)\,dx}\end{displaymath} (1)

One might think that finding $E[Y]$ would require finding $f_Y(y)$ . However, as ( 1 ) shows, all that is necessary is to substitute $g(x)$ for $x$ in the expectation. This is sometimes called the law of the unconscious statistician , since it can be applied nearly thoughtlessly.
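The shortcut can be checked numerically. In this hypothetical setup (not from the notes), $X \sim \text{Uniform}(0,1)$ and $Y = g(X) = X^2$, so $E[Y] = 1/3$; the long way first derives $f_Y(y) = 1/(2\sqrt{y})$ on $(0,1)$:

```python
import math

# Law of the unconscious statistician, checked two ways.
# Hypothetical setup: X ~ Uniform(0,1), Y = g(X) = X^2, so E[Y] = 1/3.
n = 100_000
dx = 1.0 / n

# Route 1 (no density of Y needed): E[Y] = ∫ g(x) f_X(x) dx with f_X(x) = 1.
EY_lotus = sum(((i + 0.5) * dx) ** 2 * dx for i in range(n))

# Route 2 (the long way): use f_Y(y) = 1/(2*sqrt(y)) on (0,1), then
# E[Y] = ∫ y f_Y(y) dy.
EY_direct = sum((i + 0.5) * dx / (2 * math.sqrt((i + 0.5) * dx)) * dx
                for i in range(n))

print(EY_lotus, EY_direct)  # both ≈ 1/3
```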

An interesting result is obtained through the use of indicator functions. Let $I_A : \Omega \rightarrow \Rbb$ be defined by

\begin{displaymath}I_A(\omega) = \begin{cases}
1 & \omega \in A \\
0 & \omega \not \in A
\end{cases}\end{displaymath}

In other words, the indicator function indicates whether its argument lies in the subscripted set.

We define a simple function as one which is a linear combination of indicator functions: For some collection $A_1,A_2,\ldots,A_n \in \Fc$ ,

\begin{displaymath}g(\omega) = \sum_{k=1}^n b_k I_{A_k}(\omega)\end{displaymath}

This gives us a piecewise-constant function on $\Omega$ . It also defines a random variable.

Note that the collection need not be disjoint. However, we can shuffle things around to write the function as

\begin{displaymath}g(\omega) = \sum_{k=1}^{n^*} b_k^* I_{A_{k}^*}(\omega)\end{displaymath}

where the $A_{k}^*$ s are disjoint, and where the $b_k^*$ s are unique. Note that

\begin{displaymath}A_k^* = \{\omega \in \Omega : g(\omega) = b_k^*\}\end{displaymath}

Now note that

\begin{displaymath}\boxed{E[I_A] = P(A)}\end{displaymath}

since $I_A$ takes the value 1 with probability $P(A)$ and the value 0 otherwise.

Based on this, and the disjointness of the $A_k^*$ , we can write

\begin{displaymath}E[g] = \sum_{k=1}^{n^*} b_k^* P(A_k^*)\end{displaymath}

There are many instances where indicator functions are used to get a ''handle'' on the probability of an event.
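One common handle of this kind is Monte Carlo estimation: since $E[I_A] = P(A)$, the sample mean of indicator values estimates the probability of $A$. A minimal sketch, with a hypothetical event $A = \{X > 0.7\}$ for $X \sim \text{Uniform}(0,1)$, so $P(A) = 0.3$:

```python
import random

# E[I_A] = P(A): the sample mean of an indicator estimates a probability.
# Hypothetical event A = {X > 0.7} with X ~ Uniform(0,1), so P(A) = 0.3.
random.seed(0)
n = 200_000
indicator_samples = [1 if random.random() > 0.7 else 0 for _ in range(n)]
p_hat = sum(indicator_samples) / n
print(p_hat)  # ≈ 0.3
```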

Now we will get a bit more technical, dealing with some issues related to the existence of expectations. We have seen how to define expectations for simple functions (which are random variables). But what about more general random variables? Let $X \geq 0$ be a random variable. We define

\begin{displaymath}E[X] = \sup_{g \text{ simple},\, g \leq X} E[g]\end{displaymath}

where $\sup$ means ''least upper bound'', and the supremum is taken over all simple functions $g$ satisfying $g \leq X$ . It can be shown that this supremum always exists, though it may be infinite. There is thus no question of convergence or anything like that.

Generalizing further, let $X$ be an arbitrary r.v. Since the previous result holds for non-negative random variables, let us split $X$ :

\begin{displaymath}X = X^+ - X^-\end{displaymath}

where $X^+$ is the positive part and $X^-$ is the negative part:

\begin{displaymath}X^+ = \max(X(\omega),0) \geq 0 \qquad \qquad
X^- = -\min(X(\omega),0) \geq 0\end{displaymath}

Now $X^+$ and $X^-$ have well-defined expectations. We take

\begin{displaymath}E[X] = E[X^+] - E[X^-]\end{displaymath}

which is defined in every case except when $E[X^+]$ and $E[X^-]$ are both infinite (in which case the difference is undefined).
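The decomposition is easy to see on a finite sample space. A sketch with a hypothetical discrete r.v. taking values $-2, -1, 1, 3$, each with probability $1/4$:

```python
# Splitting X = X+ - X- on a hypothetical four-point sample space.
values = [-2, -1, 1, 3]
p = 0.25                                  # uniform probabilities

x_plus = [max(x, 0) for x in values]      # positive part X+
x_minus = [-min(x, 0) for x in values]    # negative part X- (itself >= 0)

E_plus = sum(x * p for x in x_plus)       # (1 + 3)/4 = 1.0
E_minus = sum(x * p for x in x_minus)     # (2 + 1)/4 = 0.75
EX = E_plus - E_minus                     # 0.25, matching the direct sum
print(EX)
```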

Let us examine the expectation in light of the Riemann-Stieltjes integral. We define

\begin{displaymath}E[X] = \int_{-\infty}^\infty x\,dF_X(x) = \lim_{a
\rightarrow-\infty,\, b \rightarrow \infty} \int_a^b x\,dF_X(x)\end{displaymath}

This is a stronger sense of the limit than, for example

\begin{displaymath}\int_{-\infty}^\infty x\,dF_X(x) = \lim_{a \rightarrow \infty}
\int_{-a}^a x\,dF_X(x)\end{displaymath}

For example, $\sin(x)$ has an integral over $(-\infty,\infty)$ in the latter (symmetric-limit) sense, where it equals 0 because the integrand is odd, but not in the former sense.

Now we will consider an example of a density where the expectation does not exist.
{\bf Cauchy density}:
\begin{displaymath}f_X(x) = \frac{1}{\pi(1+x^2)}\end{displaymath}
Here each of $E[X^+] = \int_0^\infty \frac{x}{\pi(1+x^2)}\,dx$ and $E[X^-]$ exists (although both are
$\infty$), but they can't be subtracted, so $E[X]$ is undefined.
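The practical symptom is that Cauchy sample means never settle down. A sketch (hypothetical simulation, generating samples via the inverse c.d.f. $F^{-1}(u) = \tan(\pi(u - \tfrac{1}{2}))$ of the standard Cauchy distribution):

```python
import math
import random

# The Cauchy distribution has no expectation, so running sample means
# wander instead of converging as the sample size grows.
random.seed(1)
samples = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(100_000)]
for n in (100, 1_000, 10_000, 100_000):
    mean_n = sum(samples[:n]) / n
    print(n, mean_n)   # no law of large numbers applies here
```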

Copyright 2008, Todd Moon. (2006, May 31). More on Random Variables. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare. This work is licensed under a Creative Commons License.