Utah State University
ECE 6010
Stochastic Processes
Homework # 6 Solutions

  1. Suppose $ \{X_n\}_{n=1}^\infty$ is a sequence of independent r.v.s, each of which is uniformly distributed on the interval $ (0,1)$ . Define a sequence of r.v.s $ \{Z_n\}$ by $ Z_n = n(1-Y_n)$ , where $ Y_n = \max_{1\leq i \leq n} X_i$ . Show that $ \{Z_n\}_{n=1}^\infty$ converges in distribution to an exponential r.v. with p.d.f.

    \begin{displaymath}f(x) =
\begin{cases}
e^{-x} & x \geq 0 \\
0 & \text{otherwise}.
\end{cases}\end{displaymath}

    Here,

    $\displaystyle F_{Z_{n}}(z) = P(Z_{n} \leq z) = P(n(1-Y_{n}) \leq z) =
 P( Y_{n} \geq 1-z/n) = 1 - P(Y_{n} < 1-z/n) $

    $\displaystyle = 1- P\Big( \max_{1 \leq i \leq n} X_{i} < 1-z/n\Big) = 1- P( X_{1} <
1-z/n, X_{2} < 1-z/n, \ldots, X_{n} < 1-z/n)$

    $\displaystyle = 1- P(X_{1} < 1-z/n) \cdot P(X_{2} < 1-z/n) \cdots P(X_{n}
< 1-z/n) \qquad \mbox{(by independence of the $X_{i}$)} $

    Now,

    $\displaystyle P(X_{i} < 1-z/n) = \left \{
\begin{array}{ll}
0 & 1-z/n < 0, \text{ that is, if } z > n \\
1-z/n & 0 \leq 1-z/n \leq 1, \text{ that is, if } 0\leq z \leq n \\
1 & 1-z/n > 1, \text{ that is, if } z<0
\end{array} \right.
$

    Therefore,

    $\displaystyle F_{Z_{n}}(z) = \left\{
\begin{array}{ll}
0 & z < 0 \\
1-(1-z/n)^{n} & 0 \leq z \leq n \\
1 & z > n
\end{array} \right. $

    We have $ \lim_{n \rightarrow \infty} (1-z/n)^{n} = e^{-z} $ , so

    $\displaystyle \lim_{n \rightarrow \infty} F_{Z_{n}} (z) = F_{Z}(z)= \left\{
\begin{array}{ll}
1-e^{-z} & z \geq 0 \\
0 & z<0
\end{array} \right. $

    Therefore,

    $\displaystyle f_{Z}(z) = \frac{d}{dz} F_{Z}(z) = \left\{
\begin{array}{ll}
e^{-z} & z \geq 0 \\
0 & z <0
\end{array} \right.
$

    Hence $ \{Z_n\}$ converges in distribution to an exponential r.v. with the stated p.d.f.
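    As a numerical sanity check (not part of the original solution), we can compare the empirical c.d.f. of $ Z_n = n(1-Y_n)$ with the limit $ 1-e^{-z}$ by Monte Carlo simulation. The function name `empirical_cdf_Zn` and the sample sizes are illustrative choices.

```python
import math
import random

random.seed(0)

def empirical_cdf_Zn(n, z, trials=10_000):
    """Monte Carlo estimate of P(Z_n <= z), where
    Z_n = n * (1 - max of n independent Uniform(0,1) draws)."""
    count = 0
    for _ in range(trials):
        y = max(random.random() for _ in range(n))
        if n * (1 - y) <= z:
            count += 1
    return count / trials

# Compare the empirical c.d.f. at z = 1 with the limit 1 - e^{-1}.
z = 1.0
for n in [5, 50, 500]:
    print(n, empirical_cdf_Zn(n, z), 1 - math.exp(-z))
```

    For each $ n$ the exact value is $ 1-(1-z/n)^n$ , so the printed estimates should approach $ 1-e^{-z}$ as $ n$ grows.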

  2. Suppose $ X_n \rightarrow X$ (i.p.) and that there is a constant $ C$ such that $ \vert X_n\vert \leq C$ for all $ n$ . Show that $ X_n \rightarrow X$ (m.s.)

    We have $ X_{n} \rightarrow X$ (i.p.) and $ \vert X_{n}\vert \leq C$ . Since convergence i.p. implies a.s. convergence along a subsequence, $ \vert X\vert \leq C$ a.s. as well. By hypothesis,

    $\displaystyle P(\vert X_{n} -X\vert > \varepsilon) \rightarrow 0 \qquad \text{for every } \varepsilon > 0. $

    Define

    $\displaystyle A = \{\vert X_{n} -X\vert > \varepsilon \} \qquad \text{and} \qquad B = \{\vert X_{n} -X\vert \leq \varepsilon \} $

    and let $ I_A(x)$ and $ I_B(x)$ be the corresponding indicator functions, so that $ I_{A} + I_{B} = 1$ .
    $\displaystyle E(\vert X_{n}-X\vert^{2}) = E(\vert X_{n}-X\vert^{2} (I_{A} + I_{B})) =
E(\vert X_{n}-X\vert^{2} I_{A}) + E(\vert X_{n}-X\vert^{2} I_{B})$

    $\displaystyle \leq E(\vert X_{n}-X\vert^{2} I_{A}) + \varepsilon^{2} \qquad \mbox{(since $\vert X_{n} -X\vert \leq \varepsilon$ on $B$)}$

    $\displaystyle \leq E(4C^{2} I_{A}) + \varepsilon^{2} \qquad \mbox{(since $\vert X_{n}\vert \leq C$ and $\vert X\vert \leq C$, so $\vert X_{n}-X\vert \leq 2C$)}$

    $\displaystyle = 4C^{2} P(A) + \varepsilon^{2} \rightarrow \varepsilon^{2} \qquad \mbox{(since $P(A) \rightarrow 0$)}$

    Thus $ \limsup_{n \rightarrow \infty} E(\vert X_{n}-X\vert^{2}) \leq \varepsilon^{2}$ for every $ \varepsilon > 0$ ; letting $ \varepsilon \rightarrow 0$ ,

    $\displaystyle \lim_{n \rightarrow \infty} E(\vert X_{n}-X\vert^{2}) = 0. $

    Therefore, $ X_{n} \rightarrow X$ (i.p.) $ \Rightarrow X_{n}
\rightarrow X$ (m.s.) if $ \vert X_{n}\vert \leq C$ .
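    To illustrate with a concrete example of my own (not from the original solution): take $ X \sim \mathrm{Uniform}(-1,1)$ and let $ X_n = X$ except that, with probability $ 1/n$ , $ X_n = -X$ . Then $ \vert X_n\vert \leq 1$ , $ X_n \rightarrow X$ i.p., and $ E\vert X_n - X\vert^2 = E(4X^2)/n = 4/(3n) \rightarrow 0$ , matching the theorem. A simulation sketch:

```python
import random

random.seed(1)

def mean_square_error(n, trials=20_000):
    """Monte Carlo estimate of E|X_n - X|^2 for X ~ Uniform(-1,1) and
    X_n = X except that, with probability 1/n, X_n = -X.
    Then |X_n| <= 1 for all n and X_n -> X in probability."""
    total = 0.0
    for _ in range(trials):
        x = random.uniform(-1, 1)
        xn = -x if random.random() < 1 / n else x
        total += (xn - x) ** 2
    return total / trials

# The exact value is 4/(3n); the estimates should shrink toward 0.
for n in [2, 20, 200]:
    print(n, mean_square_error(n))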

  3. Suppose $ X_n \rightarrow C$ (in distribution), where $ C$ is a constant. Show that $ X_n \rightarrow C$ (i.p.)

    We must show that $ P(\vert X_{n} -C\vert >
\varepsilon) \rightarrow 0 $ for every $ \varepsilon > 0$ . Note that the limiting c.d.f. is a unit step at $ C$ , so $ C+\varepsilon$ and $ C-\varepsilon$ are continuity points of it.

    $\displaystyle P(\vert X_{n}-C\vert > \varepsilon) = P(X_{n}-C > \varepsilon) +
P(X_{n}-C < - \varepsilon)$

    $\displaystyle = P(X_{n} > C+ \varepsilon) + P(X_{n} < C- \varepsilon)$

    $\displaystyle \leq P(X_{n} > C+ \varepsilon) + P(X_{n} \leq C-\varepsilon)$

    $\displaystyle = 1 - F_{X_{n}}(C+\varepsilon) + F_{X_{n}}(C -\varepsilon)$

    $\displaystyle \rightarrow 1 - 1 + 0 = 0 \qquad \mbox{(by convergence in distribution at the continuity points $C \pm \varepsilon$)}$

    Therefore,

    $\displaystyle \lim_{n \rightarrow \infty} P(\vert X_{n}-C\vert > \varepsilon) = 0 \quad \Rightarrow \quad X_{n} \rightarrow
{C} \mbox{ (i.p.)}$
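    As a numerical illustration (my own example, not from the original solution), take $ X_n \sim N(C, 1/n)$ , which converges in distribution to the constant $ C$ , and estimate $ P(\vert X_n - C\vert > \varepsilon)$ by simulation; the constants $ C = 3$ and $ \varepsilon = 0.5$ are arbitrary choices.

```python
import random

random.seed(2)

C, eps = 3.0, 0.5

def prob_far(n, trials=20_000):
    """Monte Carlo estimate of P(|X_n - C| > eps) for X_n ~ Normal(C, 1/n),
    which converges in distribution to the constant C."""
    count = 0
    for _ in range(trials):
        xn = random.gauss(C, 1 / n ** 0.5)  # std. dev. 1/sqrt(n)
        if abs(xn - C) > eps:
            count += 1
    return count / trials

# P(|X_n - C| > 0.5) should drop toward 0 as n grows.
for n in [1, 10, 100]:
    print(n, prob_far(n))
```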

Copyright 2008, Todd Moon. Cite/attribute Resource. admin. (2006, June 13). Homework Solutions. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/hw6sol.html. This work is licensed under a Creative Commons License.