Utah State University
ECE 6010
Stochastic Processes
Homework # 3 Solutions

  1. Show that $\mathrm{cov}(aX+b,cY+d) = ac\,\mathrm{cov}(X,Y)$.

    $\displaystyle \mathrm{cov}(aX+b,cY+d) = E \left[ \left( aX+b - E[aX+b] \right)
\left(cY+d- E[cY+d] \right) \right] $

    $\displaystyle E[aX+b] = a\mu_{x}+b \hspace{1cm} \& \hspace{1cm} E[cY+d] =
c\mu_{y}+d $

    Therefore,

    $\displaystyle \mathrm{cov}(aX+b,cY+d) = E[ a(X-\mu_{x}) \cdot c(Y-\mu_{y})] = ac\, E[(X-\mu_{x})(Y-\mu_{y})] = ac\, \mathrm{cov}(X,Y) $
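
    A quick numerical sanity check of this identity (a minimal Monte Carlo sketch in Python/NumPy; the constants $a,b,c,d$ and the joint distribution chosen below are arbitrary, not part of the problem):

      # Monte Carlo check of cov(aX+b, cY+d) = ac * cov(X,Y)
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000
      x = rng.normal(1.0, 2.0, n)
      y = 0.5 * x + rng.normal(0.0, 1.0, n)   # correlated with x
      a, b, c, d = 3.0, -1.0, -2.0, 4.0

      lhs = np.cov(a * x + b, c * y + d)[0, 1]
      rhs = a * c * np.cov(x, y)[0, 1]
      print(lhs, rhs)   # the two estimates agree up to sampling error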

  2. Suppose $X \sim \mathcal{N}(0,\sigma^2)$. Use the ch.f. of $X$ to find an expression for $E[X^n]$, $n \in \mathbb{Z}^+$.

    $\displaystyle \phi_{X}(u) = e^{j \mu u - \frac{1}{2} u^{2}\sigma^{2}} = e^{-\frac{1}{2} u^{2}\sigma^{2}} \qquad (\mu = 0) $

    $\displaystyle E[X^{n}] = \left. j^{-n} \frac{d^{n}}{du^{n}} \phi_{X}(u) \right\vert_{u=0} = \left. j^{-n} \frac{d^{n}}{du^{n}} \left( 1 - \frac{\sigma^{2} u^{2}}{2} + \frac{\sigma^{4} u^{4}}{2^{2}\, 2!} - \frac{\sigma^{6} u^{6}}{2^{3}\, 3!} + \cdots \right)\right\vert_{u=0} $

    $\displaystyle = \left\{ \begin{array}{ll} 0 & n\ \mathrm{odd} \\ \frac{\sigma^{n}\, n!}{2^{n/2} (n/2)!} = 1 \cdot 3 \cdot 5 \cdots (n-1)\, \sigma^{n} & n\ \mathrm{even} \end{array}\right. $
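
    A minimal Monte Carlo sketch (Python/NumPy, with an arbitrary choice of $\sigma$) checking that the odd moments vanish and the even moments match $1 \cdot 3 \cdot 5 \cdots (n-1)\,\sigma^{n}$:

      # Monte Carlo check of E[X^n] for X ~ N(0, sigma^2)
      import numpy as np

      rng = np.random.default_rng(0)
      sigma = 1.5
      x = rng.normal(0.0, sigma, 1_000_000)

      for n in range(1, 7):
          empirical = np.mean(x ** n)
          if n % 2 == 1:
              predicted = 0.0
          else:
              predicted = np.prod(np.arange(1, n, 2)) * sigma ** n   # 1*3*...*(n-1)
          print(n, round(empirical, 3), round(predicted, 3))          # agree up to sampling error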

  3. Suppose $ X$ and $ Y$ are the indicator functions of events $ A$ and $ B$ , respectively. Find $ \rho(X,Y)$ , and show that $ X$ and $ Y$ are independent if and only if $ \rho(X,Y) = 0$ .

    $\displaystyle X = \left\{ \begin{array}{ll} 1 & x\in A \\ 0 & x \notin A \end{array} \right. \hspace{1cm} Y = \left\{ \begin{array}{ll} 1 & y\in B \\ 0 & y \notin B \end{array} \right.$

    $\displaystyle \rho(X,Y) = \frac{ \mathrm{cov}(X,Y)}{\sqrt{\mathrm{var}(X)\, \mathrm{var}(Y)}} = \frac{P(x\in A\ \mathrm{and}\ y \in B) - P(x \in A) P(y \in B)}{\sqrt{\mathrm{var}(X)\, \mathrm{var}(Y)}} $

    So from the equation above,

    $\displaystyle \rho = 0 \;\Rightarrow\; P(x\in A\ \mathrm{and}\ y \in B) = P(x \in A) P(y \in B) \;\Rightarrow\; X,Y$    are independent

    $\displaystyle X,Y$    independent $\displaystyle \;\Rightarrow\; P(x\in A\ \mathrm{and}\ y \in B) = P(x \in A) P(y \in B) \;\Rightarrow\; \rho = 0 $

    Therefore, $ \rho(X,Y) = 0  \Leftrightarrow$ $ X$ and $ Y$ are independent.
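
    The relationship can also be checked directly from the formulas above; the small Python sketch below (with arbitrary example probabilities) computes $\rho(X,Y)$ from $P(A)$, $P(B)$, and the joint probability:

      # Exact rho(X,Y) for indicator variables X = I_A, Y = I_B
      import numpy as np

      def rho_indicators(pA, pB, pAB):
          cov = pAB - pA * pB               # E[XY] - E[X]E[Y]
          var_x = pA * (1 - pA)             # Bernoulli variances
          var_y = pB * (1 - pB)
          return cov / np.sqrt(var_x * var_y)

      print(rho_indicators(0.4, 0.5, 0.4 * 0.5))   # independent events -> 0.0
      print(rho_indicators(0.4, 0.5, 0.3))         # dependent events -> nonzero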

  4. Suppose $ \phi(u)$ is a ch.f. Show that $ \vert\phi(u)\vert^2$ is also a ch.f.

    $\displaystyle \phi(u) = E[e^{jux}] = \int_{-\infty}^{\infty} e^{jux} f_{X}(x) dx $

    $\displaystyle \vert\phi(u)\vert^{2} = \phi(u) \cdot \phi^{*}(u) = E[e^{jux}]\, E[e^{-jux}] = \int_{-\infty}^{\infty} e^{jux} f_{X}(x) dx \int_{-\infty}^{\infty} e^{-jux} f_{X}(x) dx $

    Let $Y = -X'$, where $X'$ is an independent copy of $X$, so that $Y$ is independent of $X$ with density $f_{Y}(y) = f_{X}(-y)$. Substituting $y = -x$ in the second integral turns it into $\int_{-\infty}^{\infty} e^{juy} f_{Y}(y) dy$.

    $\displaystyle \vert\phi(u)\vert^{2} = \int_{-\infty}^{\infty} e^{jux} f_{X}(x) dx \int_{-\infty}^{\infty} e^{juy} f_{Y}(y) dy = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{ju(x+y)}f_{X}(x)f_{Y}(y) dx dy = E[e^{ju(X+Y)}]$

    So $\vert\phi(u)\vert^{2}$ is the ch.f. of the r.v. $X+Y$, and hence is itself a ch.f.
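
    A minimal Monte Carlo sketch (Python/NumPy; the exponential distribution and the test point $u$ are arbitrary choices) comparing $\vert\phi_X(u)\vert^2$ with the empirical ch.f. of $X - X'$:

      # Check that |phi_X(u)|^2 matches the ch.f. of X + Y with Y = -X' (independent copy)
      import numpy as np

      rng = np.random.default_rng(0)
      n = 500_000
      x = rng.exponential(1.0, n)
      x_prime = rng.exponential(1.0, n)   # independent copy of X
      u = 1.3                             # arbitrary test point

      phi = np.mean(np.exp(1j * u * x))                    # estimate of phi_X(u)
      lhs = np.abs(phi) ** 2                               # |phi_X(u)|^2
      rhs = np.mean(np.exp(1j * u * (x - x_prime))).real   # ch.f. of X - X' (imaginary part ~ 0)
      print(lhs, rhs)                                      # agree up to sampling error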

  5. Suppose $ X$ and $ Y$ are jointly Gaussian. Use ch.f.s to show that $ \rho(X,Y) = \rho$ .

    $\displaystyle \phi_{X,Y}(u,v) = \exp \left[ j(u\mu_{x} +v\mu_{y}) - \frac{1}{2}( u^{2} \sigma_{x}^{2} + v^{2} \sigma_{y}^{2} + 2uv\, \sigma_{x}\sigma_{y} \rho) \right] $

    $\displaystyle E[XY] = \left. -\frac{\partial}{\partial u}\frac{\partial}{\partial v} \phi_{X,Y}(u,v) \right\vert _{u=v=0} = \cdots = \sigma_{x} \sigma_{y} \rho + \mu_{x} \mu_{y} $

    Now,

    $\displaystyle \rho(X,Y) = \frac{ \mathrm{cov}(X,Y)}{\sqrt{\mathrm{var}(X)\, \mathrm{var}(Y)}} = \frac{E[XY] - \mu_{x}\mu_{y}}{\sigma_{x} \sigma_{y}} = \frac{\sigma_{x} \sigma_{y} \rho + \mu_{x} \mu_{y} - \mu_{x} \mu_{y}}{\sigma_{x} \sigma_{y}} = \rho $
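
    A minimal Monte Carlo sketch (Python/NumPy; the means, variances, and $\rho$ below are arbitrary) confirming that the sample correlation coefficient matches the parameter $\rho$:

      # Sample correlation of a simulated jointly Gaussian pair equals rho
      import numpy as np

      rng = np.random.default_rng(0)
      mu = [1.0, -2.0]
      sx, sy, rho = 2.0, 0.5, 0.7
      cov = [[sx**2, rho * sx * sy],
             [rho * sx * sy, sy**2]]
      xy = rng.multivariate_normal(mu, cov, size=500_000)

      print(np.corrcoef(xy[:, 0], xy[:, 1])[0, 1])   # close to 0.7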

  6. Suppose $ X$ and $ Y$ are jointly continuous. (a) Show that

    $\displaystyle F_{Y\vert X}(b\vert x) = \int_{-\infty}^b \frac{f_{XY}(x,y)}{f_X(x)} dy $

    and thus that

    $\displaystyle f_{Y\vert X}(y\vert x) = \frac{f_{XY}(x,y)}{f_X(x)}
$

    For $ X$ and $ Y$ jointly continuous,

    $\displaystyle F_{Y\vert X}(b\vert x) = P(Y \leq b \vert X=x) = \frac{\int_{-\infty}^{b}f_{XY}(x,y) dy}{f_{X}(x)} = \int_{-\infty}^{b} \frac{f_{XY}(x,y)}{f_{X}(x)} dy$

    Now,

    $\displaystyle f_{Y\vert X}(y\vert x) = \frac{\partial}{\partial y} F_{Y\vert X}(y\vert x) = \frac{\partial}{\partial y}\int_{-\infty}^{y} \frac{f_{XY}(x,t)}{f_{X}(x)} dt = \frac{f_{XY}(x,y)}{f_{X}(x)}$

    (b) Suppose $ \int_{-\infty}^\infty \vert y\vert f_{Y\vert X}(y\vert x) dy < \infty$ . Show that $ E[Y\vert X=x] = \int_{-\infty}^\infty y f_{Y\vert X}(y\vert x) dy.$

    $\displaystyle E[Y\vert X=x] = \int_{-\infty}^{\infty} y  dF_{Y\vert X}(y\vert x) $

    Now,

    $\displaystyle F_{Y\vert X}(y\vert x) = \int_{-\infty}^{y} f_{Y\vert X}(t\vert x) dt$

    therefore,

    $\displaystyle dF_{Y\vert X}(y\vert x) = f_{Y\vert X}(y\vert x) dy $

    Substituting this into the first equation, we get

    $\displaystyle E[Y\vert X=x] = \int_{-\infty}^{\infty} y f_{Y\vert X}(y\vert x) dy$
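
    As a numerical illustration of the conditional-expectation formula, the sketch below (Python/NumPy) uses the arbitrary joint density $f_{XY}(x,y) = x + y$ on $[0,1]^2$, for which the conditional mean can also be computed in closed form:

      # Numerical check of E[Y|X=x] = integral of y * f_{Y|X}(y|x) dy
      import numpy as np

      def f_xy(x, y):
          return x + y                          # joint density on the unit square (integrates to 1)

      x0 = 0.3                                  # condition on X = x0
      y = np.linspace(0.0, 1.0, 100_001)
      dy = y[1] - y[0]
      f_x = np.sum(f_xy(x0, y)) * dy            # marginal f_X(x0) ~= x0 + 1/2
      f_y_given_x = f_xy(x0, y) / f_x           # conditional density f_{Y|X}(y|x0)

      numeric = np.sum(y * f_y_given_x) * dy    # E[Y | X = x0] by numerical integration
      closed_form = (x0 / 2 + 1 / 3) / (x0 + 1 / 2)
      print(numeric, closed_form)               # both about 0.604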

  7. Suppose $X$ and $Y$ are independent continuous r.v.s with c.d.f.s $F_X$ and $F_Y$, respectively. Suppose further that $F_X(b) \geq F_Y(b)$ for all $b \in \mathbb{R}$. Show that $P(X \geq Y) \leq 1/2.$


    $\displaystyle P(X \geq Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{x} f_{XY}(x,y)\, dy\, dx$
    $\displaystyle = \int_{-\infty}^{\infty} \int_{-\infty}^{x} f_{X}(x) f_{Y}(y)\, dy\, dx$     (because independent)
    $\displaystyle = \int_{-\infty}^{\infty} F_{Y}(x) f_{X}(x)\, dx$
    $\displaystyle \leq \int_{-\infty}^{\infty} F_{X}(x) f_{X}(x)\, dx = \left. \frac{F_{X}^{2}(x)}{2} \right\vert _{-\infty}^{\infty} = \frac{1}{2} - 0 = \frac{1}{2}$
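
    A minimal Monte Carlo sketch (Python/NumPy) of the bound, using the arbitrary example $X \sim \mathcal{N}(0,1)$ and $Y \sim \mathcal{N}(1,1)$ independent, for which $F_X(b) \geq F_Y(b)$ for all $b$:

      # Monte Carlo check of P(X >= Y) <= 1/2 when F_X(b) >= F_Y(b) for all b
      import numpy as np

      rng = np.random.default_rng(0)
      n = 1_000_000
      x = rng.normal(0.0, 1.0, n)
      y = rng.normal(1.0, 1.0, n)

      print(np.mean(x >= y))   # about 0.24, which is <= 1/2 as claimed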

  8. Prove Jensen's inequality for the case of simple-function r.v.'s.

    First, we prove that the convexity idea generalizes to multiple points. For a convex function $c(x)$ and $0 \leq \lambda \leq 1$, we know that

    $\displaystyle \lambda c(x_1) + (1-\lambda) c(x_2) \geq c(\lambda x_1 +
(1-\lambda) x_2).
$

    The general result is:

    $\displaystyle \sum_{i=1}^n \lambda_i c(x_i) \geq c(\sum_{i=1}^n \lambda_i x_i)$ (***)

    where $ \sum_{i=1}^n \lambda_i = 1$ and $ \lambda_i \geq 0$ .

    We'll do it for three points, from which the induction to $ n$ points should be straightforward. Let $ \lambda_1 + \lambda_2 + \lambda_3 =
1$ . Consider

    $\displaystyle \lambda_1 c(x_1) + \lambda_2 c(x_2) + \lambda_3 c(x_3) = (\lambda_1
c(x_1) + \lambda_2 c(x_2)) + \lambda_3 c(x_3)
$

    Factor out of the first two terms the quantity $ \lambda_1 +
\lambda_2$ :

    $\displaystyle (\lambda_1 + \lambda_2)\left[\frac{\lambda_1}{\lambda_1 + \lambda_2}c(x_1) + \frac{\lambda_2}{\lambda_1+ \lambda_2}c(x_2)\right] + \lambda_3 c(x_3).
$

    Since $ \frac{\lambda_1}{\lambda_1 + \lambda_2} +
\frac{\lambda_2}{\lambda_1 + \lambda_2} = 1$ , convexity applies to the two terms in the square brackets:

    $\displaystyle (\lambda_1 + \lambda_2)\left[\frac{\lambda_1}{\lambda_1 + \lambda_2}c(x_1) + \frac{\lambda_2}{\lambda_1+ \lambda_2}c(x_2)\right] + \lambda_3 c(x_3) \geq (\lambda_1 + \lambda_2)\, c\left(\frac{\lambda_1}{\lambda_1 + \lambda_2} x_1 + \frac{\lambda_2}{\lambda_1 + \lambda_2} x_2\right) + \lambda_3 c(x_3).$ (*)

    Now letting $ x^* = \frac{\lambda_1}{\lambda_1 + \lambda_2} x_1 +
\frac{\lambda_2}{\lambda_1 + \lambda_2} x_2$ and $ \lambda^* =
\lambda_1 + \lambda_2$ , we see that we have obtained

    $\displaystyle \lambda^* c(x^*) + (1-\lambda^*)c(x_3)
$

    for which convexity applies again:

    $\displaystyle \lambda^* c(x^*) + (1-\lambda^*)c(x_3) \geq c(\lambda^* x^* + (1-\lambda^*) x_3).$ (**)

    Combining (*) and (**) we obtain

    $\displaystyle \lambda_1 c(x_1) + \lambda_2 c(x_2) + \lambda_3 c(x_3) \geq
c(\lambda_1 x_1 + \lambda_2 x_2 + \lambda_3 x_3).
$

    Now to Jensen's inequality. It, too, is proved by induction. We will demonstrate explicitly the first couple of steps. Suppose $X = b_1 I_{A_1}(\omega)$ (a simple function involving a single set $A_1 \subset \Omega$). Then $X$ takes on two values: $b_1$, with probability $P(A_1)$, and 0, with probability $P(A_1^c)$. Then

    $\displaystyle E[X] = b_1 P(A_1) + 0 P(A_1^c)
$

    and for a convex function $ c$

    $\displaystyle E[c(X)] = c(b_1) P(A_1) + c(0) P(A_1^c) \geq c(b_1 P(A_1) + 0 \cdot P(A_1^c)) = c(E[X]),
$

    where the inequality follows since $ c$ is convex, and $ P(A_1) +
P(A_1^c) = 1$ .

    Now consider a simple function involving two disjoint sets:

    $\displaystyle X = b_1 I_{A_1}(\omega) + b_2 I_{A_2}(\omega), \qquad A_1 \cap A_2
= \emptyset.
$

    Then $ X$ takes on three values, $ b_1$ , $ b_2$ , and 0, with probabilities $ P(A_1), P(A_2),$ and $ P(A_1^c \cap A_2^c)$ , respectively. Then

    $\displaystyle E[X] = b_1 P(A_1) + b_2 P(A_2) + 0 P(A_1^c \cap A_2^c).
$

    And

    $\displaystyle E[c(X)] = c(b_1)P(A_1) + c(b_2)P(A_2) + c(0) P(A_1^c \cap A_2^c) \geq c(b_1 P(A_1) + b_2 P(A_2) + 0 \cdot P(A_1^c \cap A_2^c)) = c(E[X])
$

    by the convexity in (***).
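
    A quick numerical check of Jensen's inequality for a simple-function r.v. (Python/NumPy; the values, probabilities, and the convex function $c(x) = x^2$ are arbitrary choices):

      # Check E[c(X)] >= c(E[X]) for X = b1*I_A1 + b2*I_A2 with disjoint A1, A2
      import numpy as np

      b1, b2 = 2.0, -3.0
      p1, p2 = 0.3, 0.5                      # P(A1), P(A2); remaining mass 0.2 at X = 0
      values = np.array([b1, b2, 0.0])
      probs = np.array([p1, p2, 1.0 - p1 - p2])

      c = lambda t: t ** 2                   # a convex function
      E_X = np.sum(probs * values)
      E_cX = np.sum(probs * c(values))
      print(E_cX, c(E_X), E_cX >= c(E_X))    # True: Jensen's inequality holds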

  9. Prove the Schwarz inequality.

    Consider the quantity $ E[(X-\alpha Y)^2]$ , which is $ \geq 0$ for all values of the real constant $ \alpha$ . Expanding, we have

    $\displaystyle 0 \leq E[X^2] - 2 \alpha E[XY] + \alpha^2 E[Y^2]$ (*)

    Now find the value of $ \alpha$ that minimizes the right-hand side by differentiating with respect to $ \alpha$ :

    $\displaystyle -2 E[XY] + 2\alpha E[Y^2] = 0
$

    so the minimizing $\alpha = E[XY]/E[Y^2]$. Substituting into (*), we have

    $\displaystyle 0 \leq E[X^2] - 2 E[XY]^2/E[Y^2] + (E[XY]^2/E[Y^2]^2)E[Y^2].
$

    Simplifying, we obtain the expression

    $\displaystyle E[X^2]E[Y^2] \geq (E[XY])^2.
$
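
    A minimal Monte Carlo sketch (Python/NumPy, with an arbitrary dependent pair $X$, $Y$) checking the inequality numerically:

      # Monte Carlo check of the Schwarz inequality E[X^2] E[Y^2] >= (E[XY])^2
      import numpy as np

      rng = np.random.default_rng(0)
      n = 500_000
      x = rng.normal(0.0, 1.0, n)
      y = 2.0 * x + rng.uniform(-1.0, 1.0, n)   # dependent on x

      lhs = np.mean(x ** 2) * np.mean(y ** 2)
      rhs = np.mean(x * y) ** 2
      print(lhs, rhs, lhs >= rhs)               # True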


Copyright 2008, Todd Moon. Retrieved January 07, 2011, from USU OpenCourseWare: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/hw3sol.html. This work is licensed under a Creative Commons License.