
More on Random Variables



Expectations of functions of two r.v.s

Let $g: \Rbb^2 \rightarrow \Rbb$ be measurable (i.e., $\{(x,y) \in \Rbb^2: g(x,y) \in B\} \in \Bc^2$ for all $B \in \Bc$ ). Then for a bivariate r.v. $(X,Y)$ we can define the r.v. $Z = g(X,Y)$ .

\begin{displaymath}
E[Z] = \int_{-\infty}^\infty z\, dF_Z(z) =
\begin{cases}
\int_{-\infty}^\infty \int_{-\infty}^\infty g(x,y) f_{XY}(x,y)\, dx\, dy & (X,Y) \text{ continuous} \\
\sum_i \sum_j g(x_i,y_j) p_{XY}(x_i,y_j) & (X,Y) \text{ discrete}
\end{cases}
\end{displaymath}
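The discrete case of this formula is easy to check numerically. The sketch below, using a small made-up joint pmf on a $2 \times 2$ grid (the values are purely illustrative), computes $E[g(X,Y)]$ by summing $g(x_i,y_j)\,p_{XY}(x_i,y_j)$:

```python
import numpy as np

# Hypothetical joint pmf on a 2x2 grid; values chosen only for illustration.
x_vals = np.array([0.0, 1.0])
y_vals = np.array([0.0, 1.0])
p_xy = np.array([[0.1, 0.2],
                 [0.3, 0.4]])  # p_xy[i, j] = P(X = x_i, Y = y_j)

def expect_g(g):
    """E[g(X,Y)] = sum_i sum_j g(x_i, y_j) p_XY(x_i, y_j)."""
    X, Y = np.meshgrid(x_vals, y_vals, indexing="ij")
    return np.sum(g(X, Y) * p_xy)

e_x = expect_g(lambda x, y: x)       # E[X] = P(X=1) = 0.7
e_y = expect_g(lambda x, y: y)       # E[Y] = P(Y=1) = 0.6
e_sum = expect_g(lambda x, y: x + y)

# Linearity of expectation: E[X+Y] = E[X] + E[Y]
assert np.isclose(e_sum, e_x + e_y)
```

Note that linearity holds with no independence assumption on $(X,Y)$.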

  1. $E[X+Y] = E[X]+ E[Y]$
  2. If $X \geq Y$ then $E[X] \geq E[Y]$ .
  3. If $X$ and $Y$ are independent then

    \begin{displaymath}E[g_1(X) g_2(Y)] = E[g_1(X)]E[g_2(Y)] \quad \forall \text{ (measurable, well-defined) } g_1, g_2.\end{displaymath}

    Comments: If $X$ and $Y$ are independent, then

    \begin{displaymath}E[XY] = E[X]E[Y].\end{displaymath}

    However, if $E[X]E[Y] = E[XY]$ , this does not mean that they are independent. (Uncorrelated does not imply independence.)

    However, if $E[g_1(X) g_2(Y)] = E[g_1(X)]E[g_2(Y)]$ for all appropriate functions, then $X$ and $Y$ are independent. In fact, this is necessary and sufficient for independence.
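The gap between these two statements can be seen in a standard example: $X$ uniform on $\{-1,0,1\}$ and $Y = X^2$ are uncorrelated but clearly dependent. A minimal numerical check (numpy is assumed here for illustration):

```python
import numpy as np

# X uniform on {-1, 0, 1}, and Y = X^2 is a deterministic function of X.
x = np.array([-1.0, 0.0, 1.0])
p = np.array([1/3, 1/3, 1/3])
y = x**2

e_x = np.sum(x * p)        # E[X]  = 0
e_y = np.sum(y * p)        # E[Y]  = 2/3
e_xy = np.sum(x * y * p)   # E[XY] = E[X^3] = 0

# Uncorrelated: E[XY] = E[X]E[Y] holds ...
assert np.isclose(e_xy, e_x * e_y)

# ... yet X and Y are not independent:
# P(X=1, Y=1) = 1/3, while P(X=1)P(Y=1) = (1/3)(2/3).
p_joint = 1/3
p_prod = (1/3) * (2/3)
assert not np.isclose(p_joint, p_prod)
```

Choosing $g_1(x) = \mathbf{1}\{x = 1\}$ and $g_2(y) = \mathbf{1}\{y = 1\}$ makes the product rule fail for this pair, which is exactly why the "for all functions" version characterizes independence while the single identity $E[XY]=E[X]E[Y]$ does not.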

The {\bf covariance} of $X$ and $Y$ is defined as
\begin{displaymath}\cov(X,Y) = E[(X - E[X])(Y - E[Y])].\end{displaymath}
The {\bf variance} of $X$ is defined as
\begin{displaymath}\var(X) = \cov(X,X).\end{displaymath}
  1. $\cov(X,Y) = E[XY] - E[X]E[Y]$ . In particular, $\var(X) = E[X^2] - (E[X])^2$ .
  2. If $X$ and $Y$ are independent then $\cov(X,Y) = 0$ . If $\cov(X,Y) = 0$ , we say that $X$ and $Y$ are uncorrelated.

    Again, uncorrelated does not imply independence.

  3. $\var(X+Y) = \var(X) + \var(Y) + 2 \cov(X,Y)$ . If $\cov(X,Y) = 0$ then $\var(X+Y) = \var(X) + \var(Y)$ .
  4. $\cov(aX+b, cY+d) = ac \cov(X,Y)$ for all constants $a,b,c,d \in \Rbb$ . Thus

    \begin{displaymath}\var(aX) = a^2 \var(X).\end{displaymath}
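Properties 3 and 4 are exact algebraic identities, so they hold for sample moments too (not just in the limit). A quick sketch, using simulated data and the shortcut formula $\cov(X,Y) = E[XY] - E[X]E[Y]$ applied to sample means (the choice of constants is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)   # y is correlated with x by construction

def cov(u, v):
    """Sample analogue of cov(U,V) = E[UV] - E[U]E[V]."""
    return np.mean(u * v) - np.mean(u) * np.mean(v)

# Property 4: cov(aX+b, cY+d) = ac cov(X,Y); exact for sample moments.
a, b, c, d = 2.0, 1.0, -3.0, 4.0
lhs = cov(a * x + b, c * y + d)
rhs = a * c * cov(x, y)
assert np.isclose(lhs, rhs)

# Property 3: var(X+Y) = var(X) + var(Y) + 2 cov(X,Y), with var(U) = cov(U,U).
assert np.isclose(cov(x + y, x + y), cov(x, x) + cov(y, y) + 2 * cov(x, y))
```

Both assertions pass up to floating-point error because each identity follows by expanding the definitions, with no distributional assumptions.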

If $0 < \var(X) < \infty$ and $0 < \var(Y) < \infty$ , the {\bf correlation coefficient} of $X$ and $Y$ is defined as
\begin{displaymath}\rho(X,Y) = \frac{\cov(X,Y)}{\sqrt{\var(X)\var(Y)}}.\end{displaymath}
This is a normalized version of the covariance.
  1. $\vert\rho\vert \leq 1$ . This can be shown using the Cauchy-Schwarz inequality.

    $\vert\rho\vert=1$ iff $X$ and $Y$ are linearly related,

    \begin{displaymath}X = aY+b\end{displaymath}

    for some constants $(a,b)$ with $a \neq 0$ .
If $(X,Y) \sim \Nc(\mu_x,\mu_y,\sigma_x^2,\sigma_y^2,\rho)$, then
$\rho(X,Y) = \rho$.
    As we have observed before, if $X, Y$ are jointly Gaussian and $\rho = 0$ , then they are independent. In general, however, $\rho = 0$ does not imply independence.
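The two boundary behaviors of $\rho$ can be illustrated directly: a linear relation forces $\vert\rho\vert = 1$, while independent samples give $\rho \approx 0$. A sketch using sample moments (the slope, intercept, and sample sizes are arbitrary):

```python
import numpy as np

def corr(u, v):
    """Sample rho(U,V) = cov(U,V) / sqrt(var(U) var(V))."""
    c = np.mean(u * v) - np.mean(u) * np.mean(v)
    return c / np.sqrt(np.var(u) * np.var(v))

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)

# Linearly related: Y = aX + b with a = -2 != 0 gives rho = -1 exactly.
y_lin = -2.0 * x + 3.0
assert np.isclose(corr(x, y_lin), -1.0)

# Independent samples: rho is 0 in the limit, near 0 for finite n.
z = rng.normal(size=100_000)
assert abs(corr(x, z)) < 0.02
```

The sign of $\rho$ matches the sign of the slope, and the translation by $b$ has no effect, consistent with $\cov(aX+b, cY+d) = ac\cov(X,Y)$.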
Copyright 2008, by the Contributing Authors. admin. (2006, May 31). More on Random Variables. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare. This work is licensed under a Creative Commons License.