
Conditional Expectations and Distributions

Suppose $X$ is a discrete r.v. We define the conditional distribution of another r.v. $Y$ given $X=x_k$ (at any point where $P(X=x_k) > 0$) by

\begin{displaymath}\boxed{F_{Y\vert X}(y\vert x_k) = P(Y \leq y\vert X=x_k) = \frac{P(Y\leq
y,X=x_k)}{P(X=x_k)}.}
\end{displaymath}

By the law of total probability,

\begin{displaymath}F_Y(y) = \sum_k F_{Y\vert X}(y\vert x_k) p_X(x_k).
\end{displaymath}

As we discussed before, when we condition on an event, we are shrinking the sample space under consideration. So there is some normalization that takes place.
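The total-probability formula for the c.d.f. can be checked numerically. The conditional model below (a uniform $Y$ whose support depends on $x_k$) is made up purely for illustration:

```python
import numpy as np

# Hypothetical model: X takes values x_0 = 0 and x_1 = 1 with p_X = [0.3, 0.7].
# Given X = x_k, let Y be uniform on {0, ..., k+1} (an arbitrary illustrative choice).
p_X = np.array([0.3, 0.7])

def F_Y_given_X(y, k):
    """Conditional c.d.f. F_{Y|X}(y | x_k) for the toy model above."""
    support = np.arange(k + 2)          # {0, ..., k+1}, equally likely
    return float(np.mean(support <= y))

def F_Y(y):
    """Marginal c.d.f. by total probability: sum_k F_{Y|X}(y|x_k) p_X(x_k)."""
    return sum(F_Y_given_X(y, k) * p_X[k] for k in range(len(p_X)))

# F_Y behaves like a c.d.f.: 0 far to the left, 1 far to the right.
print(F_Y(-1.0), F_Y(0.5), F_Y(10.0))
```

Each conditional c.d.f. is a valid c.d.f. on the shrunken sample space, and the mixture weights $p_X(x_k)$ recombine them into the marginal.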

We also define

\begin{displaymath}E[Y\vert X=x_k] = \int_{-\infty}^\infty y dF_{Y\vert X}(y\vert x_k).
\end{displaymath}

Note that this depends on the value of $x_k$; that is, $E[Y\vert X=x_k]$ is a function of $x_k$. Let us now take the expectation with respect to $X$:

\begin{displaymath}E_X[E[Y\vert X]] = \sum_k E[Y\vert X=x_k]\, p_X(x_k) = E[Y].
\end{displaymath}

We can think of $E[Y\vert X=x_k]$ as a discrete random variable that is a function of $X$ .
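This iterated-expectation identity can be verified on a small joint p.m.f.; the table of probabilities below is a made-up example, not from the notes:

```python
import numpy as np

# Hypothetical joint pmf p_{XY}(x_k, y_l): rows index x, columns index y.
x_vals = np.array([0.0, 1.0])
y_vals = np.array([-1.0, 0.0, 2.0])
p_XY = np.array([[0.10, 0.20, 0.10],
                 [0.15, 0.25, 0.20]])   # entries sum to 1

p_X = p_XY.sum(axis=1)                  # marginal pmf of X

# E[Y | X = x_k] = sum_l y_l p_{XY}(x_k, y_l) / p_X(x_k): a function of x_k.
E_Y_given_X = (p_XY @ y_vals) / p_X

# Averaging over X recovers E[Y]: sum_k E[Y|X=x_k] p_X(x_k) = E[Y].
lhs = float(np.dot(E_Y_given_X, p_X))
rhs = float(np.dot(p_XY.sum(axis=0), y_vals))  # E[Y] from the marginal of Y
print(lhs, rhs)
```

The vector `E_Y_given_X` is exactly the "discrete random variable that is a function of $X$" described above: it takes the value `E_Y_given_X[k]` whenever $X = x_k$.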

For a discrete r.v. $X$, the conditional distribution $F_{Y\vert X}(y\vert x_k)$ may be either discrete or continuous. Discrete:

\begin{displaymath}p_{Y\vert X}(y\vert x_k) = P(Y=y\vert X=x_k)
\end{displaymath}

Continuous: There exists a function $f_{Y\vert X}$ such that

\begin{displaymath}F_{Y\vert X}(y\vert x_k) = \int_{-\infty}^y f_{Y\vert X}(z\vert x_k) dz.
\end{displaymath}

We can also write

\begin{displaymath}F_{Y\vert X}(y\vert x_k) = E[I_{(-\infty,y]}(Y)\vert X=x_k].
\end{displaymath}

If $Y$ is discrete we have

\begin{displaymath}p_{Y\vert X}(y\vert x_k) = \frac{P(Y=y,X=x_k)}{p_X(x_k)} =
\frac{p_{XY}(x_k,y)}{p_X(x_k)}.
\end{displaymath}
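The normalization view of conditioning is easy to see numerically: dividing each row of a (hypothetical) joint p.m.f. by $p_X(x_k)$ yields conditional p.m.f.s that each sum to 1:

```python
import numpy as np

# Hypothetical joint pmf p_{XY} (rows index x_k, columns index y_l).
p_XY = np.array([[0.10, 0.20, 0.10],
                 [0.15, 0.25, 0.20]])
p_X = p_XY.sum(axis=1)                  # marginal pmf of X

# Conditioning renormalizes each row of the joint pmf by p_X(x_k).
p_Y_given_X = p_XY / p_X[:, None]

print(p_Y_given_X.sum(axis=1))          # each row is a valid pmf
```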

When $X$ is a continuous r.v., conditional probabilities and expectations are somewhat more complicated, because $P(X=x) = 0$ for any particular value of $x$.

Recall that $E[Y\vert X=x_k] = g(x_k)$ for some function $g$ , and $E[g(X)] =
E[Y]$ .
\begin{definition}
Suppose $Y$ is an r.v. on the probability space $(\Omega,\F,P)$ and $A \in \F$. Define
\begin{displaymath}\int_A Y\, dP = E[I_A\, Y].
\end{displaymath}\end{definition}

\begin{definition}
Suppose $X$ and $Y$ are random variables and $E[\vert Y\vert] < \infty$. Then $E[Y\vert X=x]$ is defined to be a function $g(x)$ satisfying
\begin{displaymath}\int_{X^{-1}(B)} Y\, dP = \int_B g(x)\, P_X(dx)
\end{displaymath}
for every Borel set $B$, where $X^{-1}(B) = \{\omega \in \Omega: X(\omega)
\in B\}$.
\end{definition}

  1. It can be shown that, under the stated conditions, such a function always exists.

  2. If $X$ is discrete, then $E[Y\vert X=x_k]$ as defined earlier satisfies this property.
  3. $E[Y\vert X=x]$ is unique in the sense that if two functions $g(x)$ and $h(x)$ both satisfy the defining property, then $P(g(X) = h(X)) = 1$.
When a condition holds with probability 1, we say that it holds ``almost surely,'' or ``a.s.''

Once we have defined conditional expectation, we can define a conditional c.d.f.:

\begin{displaymath}F_{Y\vert X}(y\vert x) = E[I_{(-\infty,y]}(Y)\vert X=x].
\end{displaymath}

Properties:
  1. This definition agrees with the previous one when $X$ is discrete.
  2. $F_Y(y) = \int_\Rbb F_{Y\vert X}(y\vert x)P_X(dx)$ .
  3. As a function of $y$, $F_{Y\vert X}(y\vert x)$ satisfies all the properties of a c.d.f.
  4. If $X$ and $Y$ are jointly continuous, then $F_{Y\vert X}(y\vert x)$ has a density for every $x$ with $f_X(x) > 0$:

    \begin{displaymath}f_{Y\vert X}(y\vert x) = \frac{f_{XY}(x,y)}{f_X(x)}.
\end{displaymath}

There is another interpretation:

\begin{displaymath}\begin{aligned}
F_{Y\vert X}(y\vert x) &= \lim_{\Delta x \rightarrow 0} P(Y \leq y \vert x < X \leq x + \Delta x) \\
&= \lim_{\Delta x \rightarrow 0} \frac{F_{XY}(x+\Delta x, y) - F_{XY}(x,y)}{F_X(x+\Delta x) - F_X(x)} \\
&= \frac{\partiald{}{x}F_{XY}(x,y)}{f_X(x)}.
\end{aligned}\end{displaymath}

If $X$ and $Y$ are jointly continuous then $\partiald{}{y}
F_{Y\vert X}(y\vert x)$ exists and

\begin{displaymath}\boxed{\partiald{}{y} F_{Y\vert X}(y\vert x) = \frac{f_{XY}(x,y)}{f_X(x)}.}
\end{displaymath}

Also,

\begin{displaymath}
E[Y\vert X=x] = \int_{-\infty}^\infty y f_{Y\vert X}(y\vert x) dy.
\end{displaymath}

Analogously, when $Y$ is discrete,

\begin{displaymath}E[Y\vert X=x_k] = \sum_l y_l p_{Y\vert X}(y_l\vert x_k).
\end{displaymath}
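The jointly continuous formulas can be checked numerically on a toy joint density; the choice $f_{XY}(x,y) = x + y$ on the unit square is an illustrative assumption, not from the notes:

```python
import numpy as np

def trap(vals, grid):
    """Explicit trapezoidal rule (avoids NumPy version differences)."""
    dg = np.diff(grid)
    return float(np.sum(dg * (vals[:-1] + vals[1:]) / 2.0))

# Hypothetical jointly continuous pair: f_{XY}(x, y) = x + y on [0,1]^2.
def f_XY(x, y):
    return x + y

x = 0.4
y_grid = np.linspace(0.0, 1.0, 20_001)

# Marginal density: f_X(x) = int_0^1 (x + y) dy = x + 1/2.
f_X = trap(f_XY(x, y_grid), y_grid)

# Conditional density: f_{Y|X}(y|x) = f_{XY}(x, y) / f_X(x).
f_cond = f_XY(x, y_grid) / f_X

# E[Y | X = x] = int y f_{Y|X}(y|x) dy; closed form here: (x/2 + 1/3)/(x + 1/2).
E_num = trap(y_grid * f_cond, y_grid)
E_exact = (x / 2.0 + 1.0 / 3.0) / (x + 0.5)
print(E_num, E_exact)
```

Note that the conditional density is only defined where $f_X(x) > 0$; here $f_X(x) = x + 1/2 > 0$ on all of $[0,1]$.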

Copyright 2008, by the Contributing Authors. admin. (2006, May 31). More on Random Variables. Retrieved January 07, 2011, from USU OpenCourseWare: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/lec2_8.html. This work is licensed under a Creative Commons License.