# More on Random Variables

Expectation :: Properties :: Pairs :: Independence :: Two R.V.s :: Functions :: Inequalities :: Conditional :: General

## Expectation

When we say "expectation," we mean "average," the average being roughly what you would think of (i.e., the arithmetic average, as opposed to a median or mode). For a discrete r.v. $X$, we define the **expectation** as

$$E[X] = \sum_x x\, p_X(x).$$

For a continuous r.v., we define the expectation as

$$E[X] = \int_{-\infty}^{\infty} x f_X(x)\, dx.$$

Now a bit of technicality regarding integration, which introduces some commonly used notation. When you integrate, you are typically doing a Riemann integral:

$$E[X] = \int_{-\infty}^{\infty} x f_X(x)\, dx \approx \sum_i x_i f_X(x_i)\, \Delta x.$$

In other words, we break up the interval into little slices and add up the vertical rectangular pieces.

Another way of writing this is to recognize that

$$f_X(x_i)\, \Delta x \approx F_X(x_i + \Delta x) - F_X(x_i),$$

and that in the limit, the approximation becomes exact. Note, however, that this is expressed in terms of the c.d.f., not the p.d.f., and so exists for all random variables, not just continuous ones.

This gives rise to what is known as the Riemann-Stieltjes integral:

$$E[X] = \lim_{\Delta x \to 0} \sum_i x_i \left[ F_X(x_i + \Delta x) - F_X(x_i) \right].$$

We write the limit as

$$E[X] = \int_{-\infty}^{\infty} x\, dF_X(x).$$

This notation "describes" continuous, discrete, and mixed cases. That is,

$$\int_{-\infty}^{\infty} x\, dF_X(x) = \begin{cases} \displaystyle\int_{-\infty}^{\infty} x f_X(x)\, dx & X \text{ continuous}, \\ \displaystyle\sum_x x\, p_X(x) & X \text{ discrete}. \end{cases}$$

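As a numerical sanity check, a single left-endpoint Stieltjes sum handles both cases. This is a rough sketch; the helper `stieltjes_mean` and the example distributions are mine, not the text's:

```python
import math

def stieltjes_mean(F, a, b, n):
    """Approximate E[X] = ∫ x dF(x) by the sum Σ x_i [F(x_i + Δx) − F(x_i)]."""
    dx = (b - a) / n
    return sum((a + i * dx) * (F(a + i * dx + dx) - F(a + i * dx))
               for i in range(n))

# Continuous case: exponential(1) c.d.f., true mean 1.
F_exp = lambda x: 1 - math.exp(-x) if x > 0 else 0.0
print(stieltjes_mean(F_exp, 0.0, 50.0, 200_000))   # ≈ 1.0

# Discrete case: Bernoulli(0.3) c.d.f. -- the same code, jumps and all.
F_bern = lambda x: 0.0 if x < 0 else (0.7 if x < 1 else 1.0)
print(stieltjes_mean(F_bern, -1.0, 2.0, 3_000))    # ≈ 0.3
```

The same sum approximates the integral in the continuous case and picks up the jumps of the c.d.f. in the discrete case, which is the point of the unified $dF$ notation.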
We have defined the Riemann-Stieltjes integral in a context of expectation. However, it has a more general definition:

$$\int_a^b g(x)\, dF(x) = \lim_{\Delta \to 0} \sum_i g(x_i) \left[ F(x_{i+1}) - F(x_i) \right],$$

where $a = x_0 < x_1 < \cdots < x_n = b$ and $\Delta = \max_i (x_{i+1} - x_i)$. When $F(x) = x$, this reduces to the ordinary Riemann integral. Sufficient conditions for existence (either pair suffices):

- $F$ of bounded variation and $g$ continuous on $[a, b]$

- $g$ of bounded variation and $F$ continuous on $[a, b]$

In a directly analogous way we define the integral over the whole line,

$$\int_{-\infty}^{\infty} g(x)\, dF(x) = \lim_{\substack{a \to -\infty \\ b \to \infty}} \int_a^b g(x)\, dF(x).$$

Now consider the r.v. $Y = g(X)$.

Note that $\sum_i g(x_i)\left[ F_X(x_i + \Delta x) - F_X(x_i) \right]$ is the representation of the limiting value

$$\int_{-\infty}^{\infty} g(x)\, dF_X(x),$$

which, in the limit, is equal to $E[Y]$, when $Y = g(X)$. Thus

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, dF_X(x). \tag{1}$$

Let us put this in more familiar terms: If $X$ is continuous with p.d.f. $f_X$, then

$$E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\, dx.$$

One might think that finding $E[g(X)]$ would require finding the distribution of $Y = g(X)$. However, as (1) shows, all that is necessary is to substitute $g(x)$ for $x$ in the expectation. This is sometimes called the *law of the unconscious statistician*, since it can be done nearly thoughtlessly.
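A quick Monte Carlo illustration of the law (a sketch; the uniform example is mine): for $X \sim \mathrm{Uniform}(0,1)$ and $g(x) = x^2$, we average $g$ over samples of $X$ and never compute the density of $Y = X^2$; the substitution rule gives $E[g(X)] = \int_0^1 x^2\, dx = 1/3$.

```python
import random

random.seed(0)

# X ~ Uniform(0, 1), g(x) = x².  By the law of the unconscious statistician,
# E[g(X)] = ∫₀¹ x² · 1 dx = 1/3 -- no density of Y = X² needed.
g = lambda x: x * x
n = 200_000
mc_estimate = sum(g(random.random()) for _ in range(n)) / n
print(mc_estimate)  # ≈ 0.333
```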

An interesting result is obtained through the use of **indicator** functions. Let $I_A : \mathbb{R} \to \{0, 1\}$ be defined by

$$I_A(x) = \begin{cases} 1 & x \in A, \\ 0 & x \notin A. \end{cases}$$

In other words, the indicator function indicates whether its argument is in the set $A$ appearing as the subscript.

We define a **simple function** as one which is a linear combination of indicator functions: For some collection of sets $\{A_i\}_{i=1}^n$ and constants $\{a_i\}_{i=1}^n$,

$$g(x) = \sum_{i=1}^n a_i I_{A_i}(x).$$

This gives us a piecewise-constant function on $\mathbb{R}$. It also defines a random variable $g(X)$.

Note that the collection $\{A_i\}$ need not be disjoint. However, we can shuffle things around to write the function as

$$g(x) = \sum_{j=1}^m b_j I_{B_j}(x),$$

where the $B_j$s *are* disjoint, and where the $b_j$s are unique. Note that

$$E[I_{B_j}(X)] = 1 \cdot P(X \in B_j) + 0 \cdot P(X \notin B_j) = P(X \in B_j).$$

Now note that

$$E[g(X)] = E\!\left[ \sum_{j=1}^m b_j I_{B_j}(X) \right] = \sum_{j=1}^m b_j E[I_{B_j}(X)].$$

Based on this, and the disjointness of the $B_j$, we can write

$$E[g(X)] = \sum_{j=1}^m b_j P(X \in B_j).$$
There are many instances where indicator functions are used to get a "handle" on the probability of an event.
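To make the expectation of a simple function concrete (the particular sets and values below are my own example): take $g = 2\, I_{[0,1)} + 5\, I_{[1,2)}$ and $X \sim \mathrm{Uniform}(0, 3)$, so $E[g(X)] = 2 \cdot \tfrac{1}{3} + 5 \cdot \tfrac{1}{3} = \tfrac{7}{3}$.

```python
import random

random.seed(1)

# Simple function with disjoint pieces: g = 2·I_[0,1) + 5·I_[1,2).
def g(x):
    return 2.0 * (0 <= x < 1) + 5.0 * (1 <= x < 2)

# X ~ Uniform(0, 3): E[g(X)] = 2·P(X ∈ [0,1)) + 5·P(X ∈ [1,2)) = 2/3 + 5/3 = 7/3.
n = 300_000
est = sum(g(random.uniform(0, 3)) for _ in range(n)) / n
print(est)  # ≈ 2.333
```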

Now we will get a bit more technical, dealing with some issues related to the existence of expectations. We have seen how to define expectations for simple functions (which are random variables). But what about more general random variables? Let $X \geq 0$ be a random variable. We *define*

$$E[X] = \sup_{\substack{Y \text{ simple} \\ 0 \le Y \le X}} E[Y],$$

where the $\sup$ means "least upper bound," and the supremum is taken over all simple functions $Y$ satisfying $0 \le Y \le X$. It can be shown that this supremum always exists, though it may be infinite. There is thus no question of convergence or anything like that.
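The usual way to realize this supremum (a sketch of the standard construction, not spelled out in the text) uses dyadic staircase functions $Y_n = \min(\lfloor 2^n X \rfloor / 2^n,\, n)$, which are simple, satisfy $Y_n \le X$, and increase toward $X$:

```python
import math
import random

random.seed(2)

def dyadic(x, n):
    """Largest dyadic staircase value ≤ x, capped at n: a simple function of x."""
    return min(math.floor(x * 2**n) / 2**n, n)

# X ~ exponential(1), so E[X] = 1; approximate the expectation from below.
xs = [random.expovariate(1.0) for _ in range(100_000)]
approx = {n: sum(dyadic(x, n) for x in xs) / len(xs) for n in (1, 2, 4, 8)}
print(approx)  # values increase toward the sample mean of xs (≈ 1)
```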

Generalizing further, let $X$ be an arbitrary r.v. Since the previous result holds for non-negative random variables, let us split $X$:

$$X = X^+ - X^-,$$

where $X^+$ is the positive part and $X^-$ is the negative part:

$$X^+ = \max(X, 0), \qquad X^- = \max(-X, 0).$$

Now $X^+ \geq 0$ and $X^- \geq 0$ have well-defined expectations. We take

$$E[X] = E[X^+] - E[X^-],$$

which is defined in every case except when $E[X^+]$ and $E[X^-]$ are both infinite (in which case the difference is undefined).
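A tiny numerical check of the split (helper names are mine):

```python
# Positive and negative parts: x = x⁺ − x⁻ and |x| = x⁺ + x⁻.
def pos(x):
    return max(x, 0.0)

def neg(x):
    return max(-x, 0.0)

xs = [-2.5, -1.0, 0.0, 0.5, 3.0]
mean = lambda v: sum(v) / len(v)

assert all(pos(x) - neg(x) == x for x in xs)
assert all(pos(x) + neg(x) == abs(x) for x in xs)

# E[X] = E[X⁺] − E[X⁻] under the empirical distribution on xs:
print(mean([pos(x) for x in xs]) - mean([neg(x) for x in xs]), mean(xs))
```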

Let us examine the expectation in light of the Riemann-Stieltjes integral. We define

$$E[X] = \int_{-\infty}^{\infty} x\, dF_X(x) = \lim_{\substack{a \to -\infty \\ b \to \infty}} \int_a^b x\, dF_X(x),$$

where $a$ and $b$ tend to their limits independently. This is a stronger sense of the limit than, for example,

$$\lim_{T \to \infty} \int_{-T}^{T} x\, dF_X(x).$$

For example, a symmetric heavy-tailed density such as the Cauchy has an integral in the latter sense (which is equal to 0), but not in the former sense.
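Assuming the heavy-tailed example is the standard Cauchy density $f(x) = 1/(\pi(1+x^2))$, a numerical sketch shows why the independent limits matter: symmetric truncation gives 0, while letting $b$ grow twice as fast as $|a|$ drives the integral toward $\ln 2 / \pi \approx 0.22$ instead.

```python
import math

# Standard Cauchy density f(x) = 1/(π(1 + x²)); midpoint Riemann sum of ∫ x f(x) dx.
def tail_integral(a, b, n=200_000):
    dx = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx
        total += x / (math.pi * (1 + x * x)) * dx
    return total

T = 1000.0
print(tail_integral(-T, T))       # symmetric truncation: ≈ 0
print(tail_integral(-T, 2 * T))   # b grows faster: ≈ ln 2 / π ≈ 0.2206
```

Since the answer depends on *how* $a$ and $b$ go to their limits, the integral in the stronger sense does not exist.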

Now we will consider an example of a density where the expectation
does not exist.