Utah State University
ECE 6010
Stochastic Processes
Homework # 2 Solutions

  1. Suppose $ X$ is a r.v. with c.d.f. $ F_X$ . Prove the following:
    1. $ F_X$ is nondecreasing.

      Let $ b > a$ .

      $\displaystyle \begin{aligned}
F_X(b) - F_X(a) &= P(X \leq b) - P(X \leq a) \\
&= P(X \leq a) + P(a < X \leq b) - P(X \leq a) \\
&= P(a < X \leq b) \geq 0.
\end{aligned}$

      So $ F_X(b) \geq F_X(a)$ for $ b > a$ , which means $ F_X$ is nondecreasing.

    2. $ \lim_{a\rightarrow \infty} F_X(a) = 1$ .

      $\displaystyle \lim_{a\rightarrow \infty} F_X(a) = \lim_{a\rightarrow \infty}
P(\{\omega: X(\omega) \leq a\}) = P(\Omega) = 1,
$

      since the events $ \{X \leq a\}$ increase to $ \Omega$ as $ a \rightarrow \infty$ , and probability is continuous for monotone sequences of events.

    3. $ \lim_{a\rightarrow-\infty} F_X(a) = 0$ .

      $\displaystyle \lim_{a\rightarrow-\infty}F_X(a) = \lim_{a \rightarrow -\infty}
P(\{\omega: X(\omega) \leq a\}) = P(\emptyset) = 0,
$

      since the events $ \{X \leq a\}$ decrease to $ \emptyset$ as $ a \rightarrow -\infty$ .

    4. $ F_X$ is right continuous.

      Let $ B_n = \{\omega \in \Omega: X(\omega) \leq a + 1/n\}$ for $ n=1,2,\ldots$ . Note that this is a nested sequence, $ B_1 \supset
B_2 \supset \cdots$ . We have

      $\displaystyle \lim_{n\rightarrow \infty} F_X(a+1/n) = \lim_{n\rightarrow \infty}
P(B_n) = P(\lim_{n\rightarrow \infty} B_n)
$

      by continuity of probability. But $ \lim_{n\rightarrow \infty} B_n =
\{\omega: X(\omega) \leq a\}$ , so

      $\displaystyle \lim_{n\rightarrow \infty} F_X(a+1/n) = P(X \leq a).
$

      Since the limit from the right is equal to the limiting value, we have right continuity.

    5. $ P(a < X \leq b) = F_X(b) - F_X(a)$ if $ b > a$ .

      $ P(a < X \leq b) = P(X \leq b) - P(X \leq a) = F_X(b) - F_X(a)$ .

    6. $ P(X=a) = F_X(a) - \lim_{b \rightarrow a^-} F_X(b).$

      $ P(X=a) = P(X\leq a) - P(X<a) = F_X(a) - \lim_{b \rightarrow a^-} F_X(b)$ .

    Also, find expressions for $ P(a \leq X \leq b)$ , $ P(a \leq X < b)$ and $ P(a < X < b)$ in terms of $ F_X$ .

    $\displaystyle P(a \leq X \leq b) = P(a < X \leq b) + P(X=a) = F_X(b) - F_X(a) +
\left(F_X(a) - \lim_{c \rightarrow a^-} F_X(c)\right) = F_X(b) - \lim_{c \rightarrow a^-} F_X(c).
$

    \begin{equation*}\begin{aligned}
P(a \leq X < b) &= P(a < X \leq b) + P(X = a) - P(X = b) \\
&= F_X(b) - F_X(a) + \left(F_X(a) - \lim_{c \rightarrow a^-} F_X(c)\right) -
\left(F_X(b) - \lim_{c \rightarrow b^-} F_X(c)\right) \\
&= \lim_{c \rightarrow b^-} F_X(c) - \lim_{c \rightarrow a^-} F_X(c).
\end{aligned}\end{equation*}

    Similarly,

    $\displaystyle P(a < X < b) = P(a < X \leq b) - P(X = b) = \lim_{c \rightarrow b^-} F_X(c) - F_X(a).
$
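    These interval-probability identities can be sanity-checked numerically. A minimal sketch, assuming an arbitrary Binomial(10, 0.3) test distribution (chosen because it has atoms, so the left-limit term actually matters):

```python
from math import comb

# Assumed test distribution: Binomial(n=10, p=0.3).
n, p = 10, 0.3

def pmf(k):
    """P(X = k) for the binomial distribution."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def F(a):
    """CDF F_X(a) = P(X <= a)."""
    return sum(pmf(k) for k in range(n + 1) if k <= a)

a, b = 2, 5
# P(a < X <= b) computed directly, versus F_X(b) - F_X(a)
lhs = sum(pmf(k) for k in range(n + 1) if a < k <= b)
rhs = F(b) - F(a)

# P(X = a) = F_X(a) - lim_{c -> a^-} F_X(c); for integer support the
# left limit at a equals F_X(a - 1).
atom = F(a) - F(a - 1)
```

    The same checks apply to any distribution whose CDF can be evaluated.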

  2. Show that the following are valid p.m.f.s:
    1. Binomial: $ f_X(a) = n!/((n-a)!a!) \pi^a (1-\pi)^{n-a}$ if $ a\in\{0,1,\ldots,n\}$ .

      Need to show that $ \sum_a f_X(a) = 1$ . Use the binomial theorem:

      $\displaystyle (x+y)^n = \sum_{i=0}^n \binom{n}{i} x^i y^{n-i}
$

      with $ x = \pi$ and $ y = 1-\pi$ . Then

      $\displaystyle \begin{aligned}
\sum_{i=0}^n f_X(i) = \sum_{i=0}^n \binom{n}{i}\pi^i (1-\pi)^{n-i} =
(\pi + 1-\pi)^n = 1^n = 1.
\end{aligned}$

    2. Poisson: $ f_X(a) = e^{-\lambda}\lambda^a/a!$ for $ a \in
\{0,1,\ldots\}$ .

      Need to show that $ \sum_a f_X(a) = 1$ .

      $\displaystyle \sum_{i=0}^\infty f_X(i) = \sum_{i=0}^\infty e^{-\lambda}
\frac{\lambda^i}{i!} = e^{-\lambda} e^{\lambda} = 1.
$

  3. Find the mean and variance of $ X$ when $ X$ is
    1. $ \mathcal{N}(\mu,\sigma^{2})$ ;
      Let $ Z \sim \mathcal{N}(0,1)$ , therefore

      $\displaystyle E(Z)  =  \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}
ze^{-z^{2}/2} dz $

      The function $ ze^{-z^{2}/2}$ has odd symmetry, and integrating an odd function over a symmetric interval $ [-a,a]$ gives zero. Thus,

      $\displaystyle E(Z)  =  \lim_{a\rightarrow \infty} \int_{-a}^{a}\frac{1}{\sqrt{2\pi}}ze^{-z^{2}/2} dz  =  0. $

      For variance,

      $\displaystyle Var(Z)  =  \int_{-\infty}^{\infty} (z-E(Z))^{2} f(z)dz  =
 \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}
z^{2} e^{-z^{2}/2} dz $

      With $ u=z$ and $ dv = ze^{-z^{2}/2}dz$ , we have $ du = dz$ and $ v =
-e^{-z^{2}/2}$ .

      $\displaystyle Var(Z) = \underbrace{\left. - \frac{1}{\sqrt{2\pi}}ze^{-z^{2}/2}
\right\vert_{-\infty}^{\infty}}_{0} + \underbrace{\int_{-\infty}^{\infty}
\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2} dz}_{1}  =  1 $

      Now $ X \sim \mathcal{N}(\mu,\sigma^{2})$ . So, $ X = \mu + \sigma Z$ .

      $\displaystyle E(X) = E( \mu + \sigma Z) = \mu + \sigma E(Z) = \mu $

      $\displaystyle Var(X) = Var( \mu + \sigma Z) = \sigma^{2} Var(Z) = \sigma^{2} $
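      The standard normal moments computed above can be verified by numerical quadrature; a rough sketch using a midpoint rule on $ [-10,10]$ (the tails beyond that are negligible):

```python
from math import exp, pi, sqrt

def phi(z):
    """Standard normal density."""
    return exp(-z * z / 2) / sqrt(2 * pi)

def midpoint(f, a, b, steps=100_000):
    """Midpoint-rule quadrature; accuracy here is far better than 1e-6."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

EZ = midpoint(lambda z: z * phi(z), -10.0, 10.0)       # should be ~ 0
EZ2 = midpoint(lambda z: z * z * phi(z), -10.0, 10.0)  # Var(Z), since E(Z) = 0
```

      $ E(X)$ and $ Var(X)$ for $ X = \mu + \sigma Z$ then follow by linearity, as in the text.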

    2. Binomial $ (n,\pi)$ ;
      Binomial p.m.f is given by

      $\displaystyle f_{X}(x) = \frac{ n!}{x!(n-x)!} \pi^{x}(1-\pi)^{n-x} $

      Therefore,

      $\displaystyle E(X) = \sum_{x=0}^{n} xf_{X}(x) = \sum_{x=0}^{n}x \frac{ n!}{x!(n-x)!} \pi^{x}(1-\pi)^{n-x} $

      The first term in the summation is zero, so the sum can start from $ x=1$ ; cancelling the common factor of $ x$ in the numerator and denominator gives

      $\displaystyle E(X) = \sum_{x=1}^{n} \frac{ n!}{(x-1)!(n-x)!}
\pi^{x}(1-\pi)^{n-x} $

      Making change of variable $ x' = x-1$ above we get,

      $\displaystyle E(X) = \sum_{x'=0}^{n-1} \frac{
n!}{x'!(n-x'-1)!}\pi^{x'+1}(1-\pi)^{n-x'-1} $

      $\displaystyle E(X) = n\pi \sum_{x'=0}^{n-1} \frac{ (n-1)!}{x'!(n-x'-1)!}
\pi^{x'}(1-\pi)^{n-x'-1} $

      The terms in the summation are just the binomial p.m.f. for $ n-1$ trials, summed over all values of $ x'$ , so the sum is 1.

      $\displaystyle E(X) = n \pi $

      Now,

      $\displaystyle Var(X) = E(X^2) - [E(X)]^{2} $

      $\displaystyle E(X^2) = \sum_{x=0}^{n}x^{2} \frac{ n!}{x!(n-x)!}
\pi^{x}(1-\pi)^{n-x} = n\pi \sum_{x=0}^{n-1}(x+1) \frac{
(n-1)!}{x!(n-x-1)!}\pi^{x}(1-\pi)^{n-1-x}$

      $\displaystyle E(X^2) = n\pi \left( \underbrace{
\sum_{x=0}^{n-1}\frac{(n-1)!}{x!(n-x-1)!}\pi^{x}(1-\pi)^{n-1-x}}_{1} +
\underbrace{\sum_{x=0}^{n-1} x
\frac{(n-1)!}{x!(n-x-1)!}\pi^{x}(1-\pi)^{n-1-x}}_{(n-1)\pi} \right) $

      $\displaystyle E(X^{2}) = n\pi + n(n-1)\pi^{2} $

      Therefore,

      $\displaystyle Var(X) = n\pi + n(n-1)\pi^{2} - (n\pi)^{2} = n\pi - n\pi^{2} = n\pi(1-\pi) $
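      A direct numerical check of these binomial moment formulas, with arbitrary parameters $n=15$, $\pi=0.4$:

```python
from math import comb

n, pi_ = 15, 0.4
# Full pmf over the support {0, ..., n}
pmf = [comb(n, x) * pi_**x * (1 - pi_)**(n - x) for x in range(n + 1)]

EX = sum(x * px for x, px in enumerate(pmf))        # should equal n * pi
EX2 = sum(x * x * px for x, px in enumerate(pmf))   # should equal n*pi + n(n-1)*pi^2
var = EX2 - EX**2                                   # should equal n*pi*(1 - pi)
```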

    3. Poisson $ (\lambda)$ ;
      Poisson : $ p(x,\lambda) = \frac{\lambda^{x}}{x!} e^{-\lambda}$

      $\displaystyle E(X) = \sum_{x=0}^{\infty} x \frac{\lambda^{x}}{x!} e^{-\lambda} =
\lambda e^{-\lambda} \sum_{x=1}^{\infty} \frac{\lambda^{x-1}}{(x-1)!} =
\lambda e^{-\lambda} \sum_{x=0}^{\infty}
\frac{\lambda^{x}}{x!} = \lambda e^{-\lambda} e^{\lambda} = \lambda$

      $\displaystyle E(X^{2}) = \sum_{x=0}^{\infty} x^{2} \frac{\lambda^{x}}{x!}
e^{-\lambda} = \lambda e^{-\lambda} \sum_{x=1}^{\infty} x
\frac{\lambda^{x-1}}{(x-1)!} = \lambda e^{-\lambda} \sum_{x=0}^{\infty} (x+1)
\frac{\lambda^{x}}{x!}$

      $\displaystyle E(X^{2}) = \lambda e^{-\lambda} \left( \sum_{x=0}^{\infty} x
\frac{\lambda^{x}}{x!} + \sum_{x=0}^{\infty} \frac{\lambda^{x}}{x!} \right)
 = \lambda e^{-\lambda} (\lambda e^{\lambda}+e^{\lambda}) =
\lambda^2 + \lambda$

      So,

      $\displaystyle Var(X) = E(X^2) - [E(X)]^{2} = \lambda^2 + \lambda - \lambda^2 =
\lambda $

    4. Exponential $ (\lambda)$ ;
      Exponential : $ f_{X}(x) = \lambda e^{-\lambda x} \hspace{1cm} x \geq 0$ .

      $\displaystyle E(X) = \int_{0}^{\infty} x \lambda e^{-\lambda x} dx = \left. -x
e^{-\lambda x} \right\vert^{\infty}_{0} + \int_{0}^{\infty} e^{-\lambda x} dx
= 0 + \left. \left( -\frac{1}{\lambda} e^{-\lambda x} \right) \right\vert^{\infty}_{0} =
\frac{1}{\lambda} $

      $\displaystyle E(X^2) = \int_{0}^{\infty} x^{2} \lambda e^{-\lambda x} dx =
\left. \left( -x^{2} e^{-\lambda x} - \frac{2x}{\lambda} e^{-\lambda x} -
\frac{2}{\lambda^2} e^{-\lambda x} \right) \right\vert^{\infty}_{0} =
\frac{2}{\lambda^2} $

      So,

      $\displaystyle Var(X) = E(X^2) - [E(X)]^{2} = \frac{2}{\lambda^2} -
\frac{1}{\lambda^2} = \frac{1}{\lambda^2} $
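      These integration-by-parts results can be confirmed by numerical quadrature; a sketch with an arbitrary rate $\lambda=1.5$, truncating the upper limit at 40 (where $e^{-\lambda x}$ is already ~$10^{-26}$):

```python
from math import exp

lam = 1.5  # arbitrary rate parameter for the check

def midpoint(f, a, b, steps=400_000):
    """Midpoint-rule quadrature."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

EX = midpoint(lambda x: x * lam * exp(-lam * x), 0.0, 40.0)       # ~ 1/lam
EX2 = midpoint(lambda x: x * x * lam * exp(-lam * x), 0.0, 40.0)  # ~ 2/lam^2
var = EX2 - EX**2                                                 # ~ 1/lam^2
```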

  4. Suppose that $ X$ and $ Y$ are jointly continuous. Show that $ f_{X}(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)dy \hspace{1cm} x \in \Re$

    $\displaystyle \int_{-\infty}^{x} \int_{-\infty}^{y} f_{XY}(a,b) dadb =
F_{XY}(x,y) = P(X \leq x, Y \leq y) $

    Therefore,

    $\displaystyle \lim_{y \rightarrow \infty} \int_{-\infty}^{x} \int_{-\infty}^{y}
f_{XY}(a,b) dadb = \lim_{y
\rightarrow \infty} P(X \leq x, Y \leq y) = P(X \leq x) = F_{X}(x) $

    Now,

    $\displaystyle \frac{d}{dx} \int_{-\infty}^{x} \int_{-\infty}^{\infty}
f_{XY}(a,b) dadb = \frac{d}{dx} F_{X}(x) $

    $\displaystyle \Rightarrow  \int_{-\infty}^{\infty} f_{XY}(x,b) db = f_{X}(x) $

    $\displaystyle \Rightarrow  f_{X}(x) = \int_{-\infty}^{\infty} f_{XY}(x,y) dy $
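    The marginalization formula can be illustrated numerically. A sketch assuming a hypothetical joint density $ f_{XY}(x,y) = \lambda\mu e^{-\lambda x - \mu y}$ for $ x,y \geq 0$ (an arbitrary choice for the example), whose $x$-marginal should come out to $ \lambda e^{-\lambda x}$:

```python
from math import exp

lam, mu = 1.2, 0.8  # parameters of the assumed joint density

def f_xy(x, y):
    """Hypothetical joint density lam*mu*exp(-lam*x - mu*y) on x, y >= 0."""
    return lam * mu * exp(-lam * x - mu * y) if x >= 0 and y >= 0 else 0.0

def f_x(x, steps=200_000, hi=50.0):
    """Marginal f_X(x) = integral of f_XY(x, y) dy (midpoint rule, truncated)."""
    h = hi / steps
    return sum(f_xy(x, (i + 0.5) * h) for i in range(steps)) * h

x0 = 0.9  # the numerical marginal at x0 should match lam * exp(-lam * x0)
```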

  5. Suppose that $ X$ and $ Y$ are jointly Gaussian with parameters $ \mu_{x},\sigma_{x}^{2},\mu_{y},\sigma_{y}^{2},\rho
$ . Show that $ X \sim \mathcal{N}(\mu_{x},\sigma_{x}^{2})$ .
    In this case we have,

    $\displaystyle f_{XY}(x,y) = \frac{1}{2\pi \sigma_{x} \sigma_{y}
\sqrt{1-\rho^{2}}} \exp \left\{ -\frac{1}{2(1-\rho^{2})} \left[
\frac{(x-\mu_{x})^{2}}{\sigma_{x}^{2}} -
\frac{2\rho(x-\mu_{x})(y-\mu_{y})}{\sigma_{x} \sigma_{y}} +
\frac{(y-\mu_{y})^{2}}{\sigma_{y}^{2}} \right] \right\} $

    $\displaystyle f_{X}(x) = \int_{-\infty}^{\infty} f_{XY}(x,y) dy $

    $\displaystyle \vdots $

    (Hint: do a substitution of variables and complete the square.)
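    Completing the square as the hint suggests yields the claimed marginal. That conclusion can be spot-checked numerically by integrating the joint density over $y$; a sketch with arbitrary parameter choices:

```python
from math import exp, pi, sqrt

# Arbitrary parameters for the check.
mx, sx, my, sy, rho = 1.0, 2.0, -0.5, 1.5, 0.6

def f_xy(x, y):
    """Jointly Gaussian density with the parameters above."""
    q = ((x - mx)**2 / sx**2
         - 2 * rho * (x - mx) * (y - my) / (sx * sy)
         + (y - my)**2 / sy**2)
    return exp(-q / (2 * (1 - rho**2))) / (2 * pi * sx * sy * sqrt(1 - rho**2))

def marginal(x, steps=200_000, lo=-30.0, hi=30.0):
    """f_X(x) by midpoint-rule integration over y."""
    h = (hi - lo) / steps
    return sum(f_xy(x, lo + (i + 0.5) * h) for i in range(steps)) * h

def normal_pdf(x, mu, sigma):
    """N(mu, sigma^2) density, the claimed marginal."""
    return exp(-((x - mu) / sigma)**2 / 2) / (sigma * sqrt(2 * pi))
```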
  6. Suppose $ X \sim \mathcal{N}(0,1)$ , and define $ Y =
X^2$ . Are $ X$ and $ Y$ uncorrelated? Are $ X$ and $ Y$ independent? Find the pdf of $ Y$ . Are $ X$ and $ Y$ jointly continuous?

    Note that $ E[X] = 0$ and $ E[Y] = E[X^2] = \sigma_x^2 = 1$ . Then

    $\displaystyle \rho = \frac{\mathrm{cov}(X,Y)}{\sqrt{\mathrm{var}(X)
\mathrm{var}(Y)}} = \frac{E[(X-E[X])(Y-E[Y])]}{\sigma_{x}\sigma_{y}} =
\frac{E[X(X^2-1)]}{\sigma_{x}\sigma_{y}} =
\frac{E[X^3] - E[X]}{\sigma_{x}
\sigma_{y}}
$

    But for a Gaussian with mean zero, all odd moments are 0, so $ E[X^3] =
0$ . So $ \rho = 0$ , and $ X$ and $ Y$ are uncorrelated. As $ Y =
X^2$ , $ Y$ cannot be independent of $ X$ -- they are functionally related.

    $\displaystyle \{Y \leq y\} = \{X^{2} \leq y\} = \{-\sqrt{y} \leq X \leq \sqrt{y}
\} = \{-\sqrt{y} < X \leq \sqrt{y} \} \cup \{X = - \sqrt{y} \} $

    $\displaystyle F_{Y}(y) = F_{X}(\sqrt{y}) - F_{X}(-\sqrt{y})+P[X=-\sqrt{y}] $

    Now, X is a continuous r.v. so $ P[X=-\sqrt{y}] = 0$ , then for $ y>0$

    $\displaystyle f_{Y}(y) = \frac{d}{dy} [F_{Y}(y)] = \frac{1}{2 \sqrt{y}}
f_{X}(\sqrt{y}) + \frac{1}{2 \sqrt{y}} f_{X}(-\sqrt{y}) = \frac{1}{2
\sqrt{y}\sqrt{2 \pi}} \exp(-y/2) + \frac{1}{2 \sqrt{y}\sqrt{2
\pi}}\exp(-y/2) $

    $\displaystyle f_{Y}(y) = \frac{1}{\sqrt{2 \pi y}} e^{-\frac{1}{2}y}u(y) $

    where $ u(y)$ is the unit step function. (This is the chi-square density with one degree of freedom.)
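    The derived CDF of $ Y$ can be sanity-checked by Monte Carlo, using the closed form $ F_Y(y) = F_X(\sqrt{y}) - F_X(-\sqrt{y}) = \mathrm{erf}(\sqrt{y/2})$ for $ y > 0$ :

```python
import random
from math import erf, sqrt

random.seed(0)  # fixed seed so the check is reproducible
N = 200_000
# Draw Y = X^2 with X ~ N(0, 1)
samples = [random.gauss(0.0, 1.0) ** 2 for _ in range(N)]

y0 = 1.0
empirical = sum(1 for s in samples if s <= y0) / N
# F_Y(y) = F_X(sqrt(y)) - F_X(-sqrt(y)) = erf(sqrt(y / 2)) for y > 0
theoretical = erf(sqrt(y0 / 2))
```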

    Are $X$ and $Y$ jointly continuous? Look at the joint CDF:

    \begin{equation*}\begin{aligned}F_{XY}(\alpha,\beta) &= P(X \leq \alpha, Y \leq \beta)
= P(X \leq \alpha, -\sqrt{\beta} \leq X \leq \sqrt{\beta}) \\
&= P(-\sqrt{\beta} \leq X, X \leq \min(\alpha,\sqrt{\beta}))
\end{aligned}\end{equation*}

    This is a continuous function of $ \alpha$ and $ \beta$ . However, $ X$ and $ Y$ are not jointly continuous: all of the probability mass lies on the curve $ \{(x,y): y = x^{2}\}$ , which has zero area, so no joint density $ f_{XY}$ could integrate to 1 over it. Continuity of the joint CDF does not by itself imply the existence of a joint pdf.

Copyright 2008, Todd Moon. Cite/attribute Resource: admin. (2006, June 13). Homework Solutions. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/hw2sol.html. This work is licensed under a Creative Commons License.