Utah State University
ECE 6010
Stochastic Processes
Homework # 7 Solutions

  1. Suppose $ \{ X_{t}, t \geq 0 \}$ is a homogeneous Poisson process with parameter $ \lambda$ . Define a random variable $ \tau$ as the time of the first occurrence of an event. Find the p.d.f. and the mean of $ \tau$ .

    p.d.f. :
    We have,

    $\displaystyle P(X_{t} = k) = \frac{e^{- \lambda t} (\lambda t)^k}{k!} $

    So, $ P(\tau > t) = P(X_t = 0) = e^{-\lambda t}$
    and $ P(\tau \leq t) = 1 - e^{-\lambda t}$ , i.e. $ F_{\tau}(t) = 1 -
e^{-\lambda t}$ .

    Therefore,

    $\displaystyle f_{\tau}(t) = \lambda e^{-\lambda t} $

    Mean :

    $\displaystyle E[\tau] = \int_{0}^{\infty} \tau \lambda e^{-\lambda \tau}  d\tau = \left. \left( -\tau e^{-\lambda \tau} - \frac{1}{\lambda} e^{-\lambda \tau} \right) \right\vert _{0}^{\infty} = -0-0+0+ \frac{1}{\lambda} = \frac{1}{\lambda} $

  2. Suppose $ \{ X_{t}, t \in \mathbb{R} \}$ is a w.s.s. random process with autocorrelation function $ R_{X}(\tau)$ . Show that if $ R_{X}$ is continuous at $ \tau = 0$ then it is continuous for all $ \tau \in \mathbb{R}$ .

    $\displaystyle [R_X (\tau + \delta) - R_X(\tau)]^{2}$ $\displaystyle =$ $\displaystyle (E[X_{\tau + \delta} X_{0}] - E[X_{\tau } X_{0}])^2$  
      $\displaystyle =$ $\displaystyle (E[X_{0}(X_{\tau + \delta}- X_{\tau })])^2$  
      $\displaystyle \leq$ $\displaystyle E[X_{0}^2]E[(X_{\tau + \delta}- X_{\tau })^2]$    (Cauchy-Schwarz)  
      $\displaystyle =$ $\displaystyle R_{X}(0) [ R_{X}(0) - 2 R_{X}(\delta) + R_{X}(0)]$  
      $\displaystyle =$ $\displaystyle 2R_{X}(0) [ R_{X}(0) - R_{X}(\delta)]$  
      $\displaystyle \leq$ $\displaystyle 2 R_{X}(0) \varepsilon,$  

    where the last step uses continuity at $0$: given $\varepsilon > 0$, we can choose $\delta$ small enough that $R_{X}(0) - R_{X}(\delta) \leq \varepsilon$.

    So, for such $\delta$,

    $\displaystyle \vert R_X (\tau + \delta) - R_X(\tau)\vert \leq \sqrt{2 R_{X}(0) \varepsilon}, $

    which can be made smaller than any $\varepsilon' > 0$. Hence $R_{X}$ is continuous at every $\tau$.

  3. Under the conditions of problem 2, show that for $ a > 0$ ,

    $\displaystyle P(\vert X_{t+\tau} - X_{t}\vert \geq a) \leq \frac{2(R_X(0) -
R_X(\tau))}{a^2}.$

    Let $ g(x) = x^2$ . $ g$ is non-negative, non-decreasing on $ [0,\infty)$ and symmetric about 0. Then,

    $\displaystyle P(\vert X\vert \geq b) \leq \frac{E[g(X)]}{g(b)} = \frac{E[X^2]}{b^2}.$

    Now, our example corresponds to:
    $\displaystyle P(\vert X_{t+\tau} - X_{t}\vert \geq a)$ $\displaystyle \leq$ $\displaystyle \frac{E[\vert X_{t+\tau} -
X_{t}\vert^2]}{a^2}$  
      $\displaystyle =$ $\displaystyle \frac{R_{X}(0) - 2R_{X}(\tau) + R_{X}(0)}{a^2}$  
      $\displaystyle =$ $\displaystyle \frac{2(R_X(0) - R_X(\tau))}{a^2}$  
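The bound can be checked numerically (a sketch; the particular process is a hypothetical choice for the check): take $X_t = U \cos t + V \sin t$ with $U, V$ i.i.d. $N(0,1)$, which by problem 10 is WSS with $R_X(\tau) = \cos\tau$, and compare the empirical tail probability of an increment against $2(R_X(0) - R_X(\tau))/a^2$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
t, tau, a = 0.3, 0.5, 1.0   # hypothetical time, lag, and threshold

# Hypothetical WSS process for the check: X_t = U cos t + V sin t, U, V iid N(0,1),
# which has R_X(tau) = cos(tau) (cf. problem 10).
U, V = rng.standard_normal(n), rng.standard_normal(n)
X = lambda u: U * np.cos(u) + V * np.sin(u)
D = X(t + tau) - X(t)

p_emp = np.mean(np.abs(D) >= a)
bound = 2.0 * (1.0 - np.cos(tau)) / a**2   # 2(R_X(0) - R_X(tau)) / a^2
```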

  4. Suppose $ A$ and $ B$ are random variables with $ E[A^2] < \infty$ and $ E[B^2] < \infty$ . Define the random processes $ \{ X_{t}, t \in \mathbb{R} \}$ and $ \{ Y_{t}, t \in \mathbb{R} \}$ by

    $\displaystyle X_{t} =
A+Bt \quad \quad Y_{t} = B+At, \quad t \in \mathbb{R}. $

    Find the mean, autocorrelation, and cross correlations of these random processes in terms of the moments of $ A$ and $ B$ .

    Mean :

    $\displaystyle \mu_{X}(t) = E[ A+Bt] = E[A] +tE[B] $

    $\displaystyle \mu_{Y}(t) = E[ B+At] = E[B] +tE[A] $

    Autocorrelation :

    $\displaystyle R_{X}(t,s)$ $\displaystyle =$ $\displaystyle E[X_{t}X_{s}]$  
      $\displaystyle =$ $\displaystyle E[(A+Bt)(A+Bs)]$  
      $\displaystyle =$ $\displaystyle E[A^2 + ABs + ABt + B^2 ts]$  
      $\displaystyle =$ $\displaystyle E[A^2] + E[AB](s+t) + E[B^2]ts$  

    Similarly,
    $\displaystyle R_{Y}(t,s)$ $\displaystyle =$ $\displaystyle E[B^2] + E[AB](s+t) + E[A^2]st$  

    Cross correlation :
    $\displaystyle R_{XY}(t,s)$ $\displaystyle =$ $\displaystyle E[X_{t}Y_{s}]$  
      $\displaystyle =$ $\displaystyle E[(A+Bt)(B+As)]$  
      $\displaystyle =$ $\displaystyle E[AB + A^2 s + B^2 t +AB ts]$  
      $\displaystyle =$ $\displaystyle E[AB](1+st) + E[A^2]s + E[B^2]t$  
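A Monte Carlo sketch of these formulas, under an assumed joint law for $A$ and $B$ chosen only so the moments $E[A^2]$, $E[B^2]$, $E[AB]$ are known:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400_000
# Hypothetical joint law for A, B (assumed only for this check):
A = rng.standard_normal(n)
B = 0.5 * A + rng.standard_normal(n)       # gives E[A^2]=1, E[B^2]=1.25, E[AB]=0.5
EA2, EB2, EAB = 1.0, 1.25, 0.5

t, s = 0.4, 1.1                            # arbitrary time points
X = lambda u: A + B * u
Y = lambda u: B + A * u

RX_emp = np.mean(X(t) * X(s))
RX_formula = EA2 + EAB * (s + t) + EB2 * t * s
RXY_emp = np.mean(X(t) * Y(s))
RXY_formula = EAB * (1.0 + s * t) + EA2 * s + EB2 * t
```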

  5. Homogeneous Poisson

    Simply take $ p_k(t,s)$ and substitute it back into the differential equation and show that it works.

    $\displaystyle \frac{\partial}{\partial t} p_{k}(t,s)=\frac{\partial}{\partial t} \frac{e^{-\lambda(t-s)} (\lambda(t-s))^{k}}{k!}$

    $\displaystyle = \frac{-\lambda e^{-\lambda(t-s)}(\lambda(t-s))^{k}+e^{-\lambda(t-s)} k \lambda(\lambda(t-s))^{k-1}}{k!}$

    $\displaystyle =\lambda \left[\frac{-e^{-\lambda(t-s)}(\lambda(t-s))^{k}}{k!}+\frac{e^{-\lambda(t-s)}(\lambda(t-s))^{k-1}}{(k-1)!} \right]$

    $\displaystyle =\lambda [p_{k-1}(t,s)-p_{k}(t,s)]$
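The substitution above can be double-checked numerically: evaluate $\frac{\partial}{\partial t} p_k(t,s)$ by a central finite difference and compare with $\lambda[p_{k-1}(t,s) - p_k(t,s)]$ (a sketch, with arbitrary hypothetical values of $\lambda$, $s$, $t$, $k$).

```python
import numpy as np
from math import factorial

lam, s, t, k = 2.0, 0.5, 1.7, 3   # hypothetical parameters for the check

def p(k, t, s):
    """Transition probability p_k(t,s) of the homogeneous Poisson process."""
    if k < 0:
        return 0.0
    u = lam * (t - s)
    return np.exp(-u) * u**k / factorial(k)

h = 1e-6
lhs = (p(k, t + h, s) - p(k, t - h, s)) / (2.0 * h)   # central difference in t
rhs = lam * (p(k - 1, t, s) - p(k, t, s))             # right side of the ODE
```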

  6. Inhomogeneous Poisson

    Simply take $ p_k(t,s)$ and substitute it back into the differential equation and show that it works.

    $\displaystyle \frac{\partial}{\partial t} p_{k}(t,s)= \frac{\partial}{\partial t} \frac{e^{-\int_s^t \lambda_x dx} \left(\int_s^t \lambda_x dx\right)^{k}}{k!} = \frac{-\lambda_{t} e^{-\int_s^t\lambda_x dx}\left(\int_s^t\lambda_x dx\right)^{k}+\lambda_t k\, e^{-\int_s^t\lambda_x dx}\left(\int_s^t\lambda_x dx\right)^{k-1}}{k!}$

    $\displaystyle =\lambda_t \left[\frac{-e^{-\int_s^t \lambda_x dx}\left(\int_s^t \lambda_x dx\right)^{k}}{k!}+ \frac{e^{-\int_s^t \lambda_x dx}\left(\int_s^t \lambda_x dx\right)^{k-1}}{(k-1)!} \right]$

    $\displaystyle = \lambda_t[p_{k-1}(t,s)-p_k(t,s)]$

    Mean: $ \int_s^t \lambda_x dx$

    Covariance: Assume that $ t>s$ :

    By independence of increments, $E[(X_t-X_s)X_s] = E[X_t-X_s] E[X_s]$, and since $X_s$ is Poisson with mean $\int_0^s \lambda_x dx$, $E[X_s^2] = \int_0^s \lambda_x dx + \left(\int_0^s \lambda_x dx\right)^2$. So

    $\displaystyle \begin{aligned}
    E[X_t X_s] &= E[(X_t-X_s)X_s] + E[X_s^2] = E[X_t-X_s]E[X_s] + E[X_s^2] \\
    &= \int_s^t \lambda_x dx \int_0^s \lambda_x dx + \int_0^s \lambda_x dx + \left(\int_0^s \lambda_x dx\right)^2 \\
    &= \left[\int_0^t \lambda_x dx +1\right] \int_0^s \lambda_x dx
    \end{aligned}
    $

    Similarly when $ t < s$ , with the roles of $s$ and $t$ exchanged. Then

    $\displaystyle R_X(t,s) =
\left[\int_0^s \lambda_x dx +1\right] \int_0^t \lambda_x dx
$
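These formulas can be spot-checked by simulating the inhomogeneous process via thinning (a sketch; the rate function $\lambda_x = 1 + 0.5x$ and the time points are hypothetical choices for the check): generate a homogeneous Poisson process at rate $\lambda_{\max}$ and keep each point $x$ with probability $\lambda(x)/\lambda_{\max}$.

```python
import numpy as np

rng = np.random.default_rng(3)
lam = lambda x: 1.0 + 0.5 * x            # hypothetical rate, assumed for the check
T, s, lam_max, n_paths = 2.0, 1.0, 2.0, 20_000

Xt = np.empty(n_paths)
Xs = np.empty(n_paths)
for i in range(n_paths):
    # Thinning: homogeneous Poisson(lam_max) points on [0, T], kept w.p. lam(x)/lam_max.
    n = rng.poisson(lam_max * T)
    pts = rng.uniform(0.0, T, n)
    keep = pts[rng.uniform(0.0, 1.0, n) < lam(pts) / lam_max]
    Xt[i] = keep.size                    # X_T (counting from 0)
    Xs[i] = np.sum(keep <= s)            # X_s

m_t = T + 0.25 * T**2                    # \int_0^T lam_x dx = 3.0
m_s = s + 0.25 * s**2                    # \int_0^s lam_x dx = 1.25
R_formula = (m_t + 1.0) * m_s            # [\int_0^T lam + 1] \int_0^s lam = 5.0
```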

  7. Suppose $ \{ X_{t}, t \in \mathbb{R} \}$ is a random process with power spectral density

    $\displaystyle S_{X} (\omega) = \frac{1}{(1+
\omega^{2})^{2}}. $

    Find the autocorrelation function of $ X_{t}$ .

    $\displaystyle R_{X}(\tau)$ $\displaystyle =$ $\displaystyle \mathcal{F}^{-1} \{S_{X}(\omega) \} = \mathcal{F}^{-1} \left\{ \frac{1}{(1+\omega^{2})} \cdot \frac{1}{(1+\omega^{2})} \right\} = \mathcal{F}^{-1} \left\{ \frac{1}{(1+\omega^{2})} \right\} * \mathcal{F}^{-1} \left\{ \frac{1}{(1+\omega^{2})} \right\}$  
      $\displaystyle =$ $\displaystyle \frac{1}{2} e^{-\vert\tau\vert} * \frac{1}{2} e^{-\vert\tau\vert}$  
      $\displaystyle =$ $\displaystyle \frac{1}{4} \int_{-\infty}^{\infty} e^{-\vert t\vert} e^{-\vert\tau - t\vert}
 dt$  
      $\displaystyle =$ $\displaystyle \frac{1}{4} \int_{-\infty}^{0} e^{t} e^{t-\tau}  dt + \frac{1}{4} \int_{0}^{\tau} e^{-t} e^{t-\tau}  dt + \frac{1}{4} \int_{\tau}^{\infty} e^{-t} e^{\tau -t}  dt    $     $\displaystyle \mbox{ (for $\tau \geq 0$)}$  
      $\displaystyle =$ $\displaystyle \frac{1}{4} \left[ \int_{-\infty}^{0} e^{2t - \tau}  dt + \int_{0}^{\tau} e^{-\tau}  dt + \int_{\tau}^{\infty} e^{\tau - 2t}  dt \right]$  
      $\displaystyle =$ $\displaystyle \frac{1}{4} \left[ \frac{1}{2} e^{-\tau} + \tau e^{-\tau} +
\frac{1}{2} e^{-\tau} \right]$  
      $\displaystyle =$ $\displaystyle \frac{1}{4} \left( \tau e^{-\tau} + e^{-\tau} \right)  \
 $     $\displaystyle \mbox{ (for
$\tau \geq 0$)}$  

    Similarly,

    $\displaystyle R_{X}(\tau) = \frac{1}{4} \left( - \tau e^{\tau} + e^{\tau} \right)    $     $\displaystyle \mbox{ (for $\tau < 0$)}$

    Therefore,

    $\displaystyle R_{X}(\tau) = \frac{1}{4} e^{-\vert\tau\vert} ( \vert\tau \vert +1) $
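The closed form can be verified by inverting the spectral density numerically (a sketch using a plain Riemann sum; the integration window and lag values are arbitrary choices): since $S_X$ is even, $R_X(\tau) = \frac{1}{2\pi}\int S_X(\omega)\cos(\omega\tau)\,d\omega$.

```python
import numpy as np

# Numerically invert S_X(w) = 1/(1+w^2)^2 and compare with the closed form.
w = np.linspace(-200.0, 200.0, 400_001)
dw = w[1] - w[0]
S = 1.0 / (1.0 + w**2) ** 2

def R_numeric(tau):
    # R_X(tau) = (1/2pi) * integral of S_X(w) e^{j w tau}; S_X even, so cosine suffices.
    return np.sum(S * np.cos(w * tau)) * dw / (2.0 * np.pi)

taus = np.array([0.0, 0.5, 1.0, 2.5])
R_closed = 0.25 * np.exp(-np.abs(taus)) * (np.abs(taus) + 1.0)   # derived above
R_num = np.array([R_numeric(x) for x in taus])
```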

  8. Suppose that $ \omega$ is a random variable with p.d.f. $ f_{\omega}$ and $ \theta$ is a random variable independent of $ \omega$ uniformly distributed in $ (-\pi,\pi)$ . Define a random process by $ X_{t} = a \cos(\omega t + \theta),  t\in
\mathbb{R}$ where $ a$ is a constant. Find the power spectral density of $ \{X_{t} \}$ .

    $\displaystyle E[X_{t_{1}} X_{t_{2}} ]$ $\displaystyle =$ $\displaystyle E \{ a^{2} \cos(\omega t_{1} + \theta) \cos(\omega t_{2} + \theta) \}$  
      $\displaystyle =$ $\displaystyle \frac{1}{2} a^{2} E \{ \cos(\omega t_{1}-\omega t_{2}) - \cos(\omega
t_{1} + \omega t_{2} + 2\theta) \}$  
      $\displaystyle =$ $\displaystyle \frac{1}{2} a^{2} \left [ E \{ \cos(\omega t_{1}-\omega t_{2}) \} -
\underbrace{E \{ \cos(\omega t_{1} + \omega t_{2} + 2\theta)
\}}_{0} \right]$  
      $\displaystyle =$ $\displaystyle \frac{1}{2} a^{2} E \{ \cos(\omega t_{1}-\omega t_{2}) \}$  
      $\displaystyle =$ $\displaystyle \frac{1}{2} a^{2} \int_{-\infty}^{\infty} \cos(\tau \omega) f_{\omega}(\omega)  d\omega = \frac{1}{2} a^{2} \int_{-\infty}^{\infty} \frac{ e^{j \omega \tau} + e^{-j \omega \tau} } {2} f_{\omega}(\omega)  d\omega \quad \quad (\tau = t_{1} - t_{2})$  
      $\displaystyle =$ $\displaystyle \frac{1}{4} a^{2}  2\pi [ \mathcal{F}^{-1} \{
f_{\omega}(\omega) \} + \mathcal{F}^{-1} \{ f_{\omega}(-\omega) \}]$  
      $\displaystyle =$ $\displaystyle \frac{\pi a^{2}}{2} [ \mathcal{F}^{-1} \{
f_{\omega}(\omega) \} + \mathcal{F}^{-1} \{ f_{\omega}(-\omega) \}]$  

    Therefore,

    $\displaystyle S_{X}(\omega) = \frac{\pi a^{2}}{2} [f_{\omega}(\omega) +
f_{\omega}(-\omega)]
$
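The key intermediate identity, $R_X(\tau) = \frac{a^2}{2} E\{\cos(\omega\tau)\}$, lends itself to a Monte Carlo sketch (the frequency law $\omega \sim N(0,1)$ is a hypothetical choice for the check, for which $E\{\cos(\omega\tau)\} = e^{-\tau^2/2}$):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000
a = 2.0
# Hypothetical frequency p.d.f. for the check: omega ~ N(0,1).
omega = rng.standard_normal(n)
theta = rng.uniform(-np.pi, np.pi, n)      # phase uniform on (-pi, pi), as given

t, tau = 0.8, 1.3                          # arbitrary time and lag
X = lambda u: a * np.cos(omega * u + theta)

R_emp = np.mean(X(t + tau) * X(t))
R_half = 0.5 * a**2 * np.mean(np.cos(omega * tau))   # (a^2/2) E[cos(omega tau)]
R_exact = 0.5 * a**2 * np.exp(-tau**2 / 2.0)         # closed form for omega ~ N(0,1)
```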

  9. Suppose that $ \{X_t, t \in \mathbb{R} \}$ is a w.s.s., zero-mean, Gaussian random process with auto-correlation function $ R_{X}(\tau), \tau \in \mathbb{R}$ and power spectral density $ S_X(\omega), \omega \in \mathbb{R}$ . Define the random process $ \{ Y_{t}, t \in \mathbb{R} \}$ by $ Y_{t} = (X_{t})^{2}, t \in
\mathbb{R}$ . Find the mean, autocorrelation, and power spectral density of $ \{ Y_{t}, t \in \mathbb{R} \}$ .

    Mean :

    $\displaystyle \mu_{Y}(t) = E[X_{t}^{2}] = R_{X}(0) = \sigma^{2} $

    Autocorrelation :
    $\displaystyle R_{Y}(t,s)$ $\displaystyle =$ $\displaystyle E[Y_{t}Y_{s}] = E[X_{t}^{2} X_{s}^{2}]$  
      $\displaystyle =$ $\displaystyle \left. \frac{\partial ^{4}}{\partial u^{2} \partial v^{2}}
\Phi_{X_{t}X_{s}}(u,v) \right\vert _{u=v=0} \frac{1}{i^{4}}$  
      $\displaystyle =$ $\displaystyle E[X_{t}^{2}]E[X_{s}^{2}] + 2(E[X_{t}X_{s}])^{2} \quad \quad \mbox{(fourth moment of zero-mean jointly Gaussian variables)}$  
      $\displaystyle =$ $\displaystyle R_{X}^{2}(0) + 2 R_{X}^{2} (\tau) \quad \quad (\tau = t-s)$  

    PSD :

    $\displaystyle S_{Y}(\omega) = \mathcal{F} \{R_{Y}(\tau) \} = 2 \pi R_{X}^{2}(0)
\delta(\omega) + \frac{1}{\pi} (S_{X}(\omega) *S_{X}(\omega)),
$

    since the transform of the product $R_X(\tau) \cdot R_X(\tau)$ is $\frac{1}{2\pi}$ times the convolution $S_X * S_X$.
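The Gaussian fourth-moment step, $E[X_t^2 X_s^2] = R_X^2(0) + 2R_X^2(\tau)$, can be checked by Monte Carlo (a sketch; the values of $R_X(0)$ and $R_X(\tau)$ are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
r0, rtau = 1.0, 0.6                  # hypothetical R_X(0) and R_X(tau)

# Draw (X_t, X_s) zero-mean jointly Gaussian with var r0 and covariance rtau.
Xt = rng.standard_normal(n) * np.sqrt(r0)
Xs = (rtau / r0) * Xt + np.sqrt(r0 - rtau**2 / r0) * rng.standard_normal(n)

RY_emp = np.mean(Xt**2 * Xs**2)
RY_formula = r0**2 + 2.0 * rtau**2   # R_X^2(0) + 2 R_X^2(tau)
```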

  10. Suppose $ U$ and $ V$ are independent random variables with $ E[U] = E[V] = 0$ and $ \mathrm{var}(U) = \mathrm{var}(V) =
1$ . Define random processes by

    $\displaystyle X_t = U \cos t + V \sin t
\quad Y_{t} = U \sin t + V \cos t, \quad t \in \mathbb{R}.$

    Find the autocorrelation and cross-correlation functions of $ \{ X_{t}, t \in \mathbb{R} \}$ and $ \{ Y_{t}, t \in \mathbb{R} \}$ . Are $ \{X_{t} \}$ and $ \{Y_{t} \}$ jointly wide sense stationary? Are they individually wide sense stationary?


    $\displaystyle R_{X}(t,s)$ $\displaystyle =$ $\displaystyle E[X_{t}X_{s}] = E[(U \cos t + V \sin t)(U \cos s
+ V \sin s) ]$  
      $\displaystyle =$ $\displaystyle E[U^{2} \cos t \cos s + UV (\cos t \sin s + \sin t \cos s) +
V^{2} \sin t \sin s ]$  
      $\displaystyle =$ $\displaystyle \cos t \cos s E[U^{2}] + E[U] E[V] (\cos t \sin s + \sin t
\cos s) + E[V^{2}] \sin t \sin s$  
      $\displaystyle =$ $\displaystyle \cos t \cos s + \sin t \sin s$  
      $\displaystyle =$ $\displaystyle \cos(t-s)$  

    Similarly,

    $\displaystyle R_{Y}(t,s) = \cos(t-s)
$


    $\displaystyle R_{XY}(t,s)$ $\displaystyle =$ $\displaystyle E[X_{t}Y_{s}] = E[(U \cos t + V \sin t)(U \sin
s + V \cos s) ]$  
      $\displaystyle =$ $\displaystyle E[ U^{2} \cos t \sin s + UV ( \cos t \cos s + \sin t \sin
s) + V^{2} \sin t \cos s]$  
      $\displaystyle =$ $\displaystyle E[ U^{2}] \cos t \sin s + E[UV] ( \cos t \cos s + \sin t \sin
s) + E[ V^{2}] \sin t \cos s$  
      $\displaystyle =$ $\displaystyle \cos t \sin s + \sin t \cos s$  
      $\displaystyle =$ $\displaystyle \sin (t+s)$  

    $\displaystyle \mu_{X}(t) = 0 \quad \quad \mu_{Y}(t) = 0 $

    So, since the means are constant and $R_X$ and $R_Y$ depend only on $t-s$, $ \{X_{t} \}$ and $ \{Y_{t} \}$ are individually WSS; but $R_{XY}(t,s) = \sin(t+s)$ depends on $t+s$ rather than $t-s$, so they are not jointly WSS.
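Both correlation formulas are easy to confirm by Monte Carlo, taking $U, V$ i.i.d. standard normal (a sketch; any zero-mean, unit-variance, independent pair would do):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000
U, V = rng.standard_normal(n), rng.standard_normal(n)

t, s = 1.2, 0.4                    # arbitrary time points
X = lambda u: U * np.cos(u) + V * np.sin(u)
Y = lambda u: U * np.sin(u) + V * np.cos(u)

RX_emp = np.mean(X(t) * X(s))      # should be close to cos(t - s)
RXY_emp = np.mean(X(t) * Y(s))     # should be close to sin(t + s)
```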

Copyright 2008, Todd Moon. admin. (2006, June 13). Homework Solutions. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/hw7sol.html. This work is licensed under a Creative Commons License.