Utah State University
ECE 6010
Stochastic Processes
Homework # 9 Solutions

  1. Suppose $ \{ X_{t},t \in \mathbb{R} \}$ is a random process with power spectral density

    $\displaystyle S_{X} (\omega) = \frac{1}{(1+
\omega^{2})^{2}}. $

    Find the autocorrelation function of $ X_{t}$ .

    $\displaystyle
\begin{aligned}
R_{X}(\tau) &= \mathcal{F}^{-1} \{ S_{X}(\omega) \} = \mathcal{F}^{-1} \left\{ \frac{1}{1+\omega^{2}} \right\} * \mathcal{F}^{-1} \left\{ \frac{1}{1+\omega^{2}} \right\} \\
&= \frac{1}{2} e^{-\vert\tau\vert} * \frac{1}{2} e^{-\vert\tau\vert} \\
&= \frac{1}{4} \int_{-\infty}^{\infty} e^{-\vert t\vert} e^{-\vert\tau - t\vert} \, dt \\
&= \frac{1}{4} \int_{-\infty}^{0} e^{t} e^{t-\tau} \, dt + \frac{1}{4} \int_{0}^{\tau} e^{-t} e^{t-\tau} \, dt + \frac{1}{4} \int_{\tau}^{\infty} e^{-t} e^{\tau - t} \, dt \quad \mbox{(for $\tau \geq 0$)} \\
&= \frac{1}{4} \left[ \int_{-\infty}^{0} e^{2t - \tau} \, dt + \int_{0}^{\tau} e^{-\tau} \, dt + \int_{\tau}^{\infty} e^{\tau - 2t} \, dt \right] \\
&= \frac{1}{4} \left[ \frac{1}{2} e^{-\tau} + \tau e^{-\tau} + \frac{1}{2} e^{-\tau} \right] \\
&= \frac{1}{4} \left( \tau e^{-\tau} + e^{-\tau} \right) \quad \mbox{(for $\tau \geq 0$)}
\end{aligned}
$

    Similarly,

    $\displaystyle R_{X}(\tau) = \frac{1}{4} \left( -\tau e^{\tau} + e^{\tau} \right) \quad \mbox{(for $\tau < 0$)} $

    Therefore,

    $\displaystyle R_{X}(\tau) = \frac{1}{4} e^{-\vert\tau\vert} ( \vert\tau \vert +1) $
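    As a sanity check, the inverse transform can be evaluated numerically and compared against the closed form above. The grid limits and spacing below are illustrative choices, not part of the problem.

    ```python
    import numpy as np

    # Numerically invert S_X(w) = 1/(1+w^2)^2 via R(tau) = (1/2pi) Int S(w) e^{jw tau} dw
    # and compare with the closed form (1/4) e^{-|tau|} (|tau| + 1).
    w = np.linspace(-200, 200, 400001)      # frequency grid (truncation is negligible: S ~ w^-4)
    dw = w[1] - w[0]
    S = 1.0 / (1.0 + w**2)**2
    for tau in [0.0, 0.5, 1.0, 2.0]:
        # S is even, so only the cosine part of e^{jw tau} contributes
        R_num = np.sum(S * np.cos(w * tau)) * dw / (2 * np.pi)
        R_closed = 0.25 * np.exp(-abs(tau)) * (abs(tau) + 1)
        assert abs(R_num - R_closed) < 1e-4
    print("numerical inverse transform matches the closed form")
    ```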

  2. Suppose that $ \omega$ is a random variable with p.d.f. $ f_{\omega}$ and $ \theta$ is a random variable independent of $ \omega$ uniformly distributed in $ (-\pi,\pi)$ . Define a random process by $ X_{t} = a \cos(\omega t + \theta),  t\in
\mathbb{R}$ where $ a$ is a constant. Find the power spectral density of $ \{X_{t} \}$ .

    $\displaystyle
\begin{aligned}
E[X_{t_{1}} X_{t_{2}}] &= E \{ a^{2} \cos(\omega t_{1} + \theta) \cos(\omega t_{2} + \theta) \} \\
&= \frac{1}{2} a^{2} E \{ \cos(\omega t_{1} - \omega t_{2}) + \cos(\omega t_{1} + \omega t_{2} + 2\theta) \} \\
&= \frac{1}{2} a^{2} \left[ E \{ \cos(\omega t_{1} - \omega t_{2}) \} + \underbrace{E \{ \cos(\omega t_{1} + \omega t_{2} + 2\theta) \}}_{0} \right] \\
&= \frac{1}{2} a^{2} E \{ \cos(\omega \tau) \} \quad (\tau = t_{1} - t_{2}) \\
&= \frac{1}{2} a^{2} \int_{-\infty}^{\infty} \cos(\tau \omega) f_{\omega}(\omega) \, d\omega
= \frac{1}{2} a^{2} \int_{-\infty}^{\infty} \frac{e^{j\omega\tau} + e^{-j\omega\tau}}{2} f_{\omega}(\omega) \, d\omega \\
&= \frac{1}{4} a^{2} \, 2\pi \left[ \mathcal{F}^{-1} \{ f_{\omega}(\omega) \} + \mathcal{F}^{-1} \{ f_{\omega}(-\omega) \} \right] \\
&= \frac{\pi a^{2}}{2} \left[ \mathcal{F}^{-1} \{ f_{\omega}(\omega) \} + \mathcal{F}^{-1} \{ f_{\omega}(-\omega) \} \right]
\end{aligned}
$

    Therefore,

    $\displaystyle S_{X}(\omega) = \frac{\pi a^{2}}{2} [f_{\omega}(\omega) +
f_{\omega}(-\omega)]
$
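    The key intermediate step, $R_{X}(\tau) = \frac{1}{2} a^{2} E\{\cos(\omega\tau)\}$, can be checked by Monte Carlo for any particular $f_{\omega}$. The standard-normal density below is an assumed example (for which $E\{\cos(\omega\tau)\} = e^{-\tau^{2}/2}$), not part of the problem.

    ```python
    import numpy as np

    # Monte Carlo check of R_X(tau) = (a^2/2) E[cos(omega*tau)] for the random-phase
    # cosine X_t = a cos(omega*t + theta), assuming omega ~ N(0,1) for illustration.
    rng = np.random.default_rng(0)
    n, a = 1_000_000, 2.0
    omega = rng.standard_normal(n)             # omega drawn from the assumed f_omega
    theta = rng.uniform(-np.pi, np.pi, n)      # theta ~ U(-pi, pi), independent of omega
    t1, t2 = 1.3, 0.5
    tau = t1 - t2
    X1 = a * np.cos(omega * t1 + theta)
    X2 = a * np.cos(omega * t2 + theta)
    R_mc = np.mean(X1 * X2)                    # sample estimate of E[X_{t1} X_{t2}]
    R_th = 0.5 * a**2 * np.exp(-tau**2 / 2)    # (a^2/2) E[cos(omega*tau)] for N(0,1)
    assert abs(R_mc - R_th) < 0.02
    ```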

  3. Suppose events occur randomly in $ T = [0,\infty)$ in the following way:
    1. The numbers of events in nonoverlapping intervals are independent of one another.
    2. $ P($ exactly one event in $ (t,t+\Delta t)) =
\lambda(t) \Delta t + o(\Delta t)$ , where $ \lambda(t)$ is a continuous nonnegative function on $ [0,\infty)$ .
    3. $ P($ more than one event in an interval of length $ \Delta t) = o(\Delta t).$
    Define a random process $ \{X_{t}, t \in T \} $ by $ X_{0} = 0 $ and $ X_{t}$ is the number of events occurring in $ (0,t]$ .
    1. For $ t> s \geq 0$ , show that $ (X_{t} - X_{s})$ is a Poisson random variable with parameter $ \int_{s}^{t} \lambda (x) \ dx$ .

      Let $ P(k$ events in $ (s,t]) = p_{k}(t,s)$ for $ t> s \geq 0$ and $ k \geq 0$ .

      $\displaystyle
\begin{aligned}
p_{k}(t+\Delta t,s) &= p_{k}(t,s)(1-\lambda(t)\Delta t + o(\Delta t)) + p_{k-1}(t,s)(\lambda(t)\Delta t + o(\Delta t)) + p_{<k-1}(t,s) \, o(\Delta t) \\
&= p_{k}(t,s)(1-\lambda(t)\Delta t + o(\Delta t)) + p_{k-1}(t,s)(\lambda(t)\Delta t + o(\Delta t)) + o(\Delta t)
\end{aligned}
$

      $\displaystyle \frac{d}{dt} p_{k}(t,s) = \lim_{\Delta t \rightarrow 0} \frac{p_{k}(t+\Delta t,s) - p_{k}(t,s)}{\Delta t} = - p_{k}(t,s) \lambda(t) + p_{k-1}(t,s) \lambda(t) $

      If $ p_{k}(t,s) = \frac{e^{-\int_{s}^{t} \lambda(q) dq} \left(\int_{s}^{t} \lambda(q) dq \right)^{k} }{k!}$ satisfies the above equation, then we can say that $ X_{t}-X_{s}$ is a Poisson random variable with parameter $ \int_{s}^{t} \lambda (q)  dq$ .

      Now, let $ I = \int_{s}^{t} \lambda (q)  dq$ , so that $ I' = \frac{dI}{dt} = \lambda (t)$ .

      Therefore,

      $\displaystyle
\begin{aligned}
\frac{d}{dt} p_{k}(t,s) &= \frac{-I' e^{-I} I^{k} + e^{-I} k I^{k-1} I'}{k!} \\
&= -\frac{I' e^{-I} I^{k}}{k!} + \frac{e^{-I} I^{k-1} I'}{(k-1)!} \\
&= -\frac{e^{-I} I^{k}}{k!} \lambda(t) + \frac{e^{-I} I^{k-1}}{(k-1)!} \lambda(t) \\
&= - p_{k}(t,s) \lambda(t) + p_{k-1}(t,s) \lambda(t)
\end{aligned}
$

      So $ p_{k}(t,s)$ does satisfy the differential equation, and $ X_{t}-X_{s}$ is Poisson with parameter $ \int_{s}^{t} \lambda (q)  dq$ .
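      The Poisson claim can be checked by simulation: generate the events by thinning a homogeneous process, then verify that the count on $(s,t]$ has mean equal to variance equal to $\int_{s}^{t}\lambda(q)\,dq$. The rate function $\lambda(t) = 1 + \sin t$ is an assumed example.

      ```python
      import numpy as np

      # Thinning sketch: candidates arrive at rate lam_max on (s, t]; each is kept
      # with probability lam(x)/lam_max, giving an inhomogeneous Poisson process.
      rng = np.random.default_rng(1)

      def lam(x):
          return 1.0 + np.sin(x)   # assumed rate, bounded above by lam_max

      lam_max, s, t = 2.0, 1.0, 4.0
      n_runs = 100_000
      counts = np.empty(n_runs)
      for i in range(n_runs):
          n_cand = rng.poisson(lam_max * (t - s))
          x = rng.uniform(s, t, n_cand)
          counts[i] = np.sum(rng.uniform(0, lam_max, n_cand) < lam(x))
      I = (t - s) + (np.cos(s) - np.cos(t))   # closed-form integral of lam over (s, t]
      assert abs(counts.mean() - I) < 0.05    # Poisson mean = I
      assert abs(counts.var() - I) < 0.1      # Poisson variance = I
      ```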

    2. Find the mean and autocorrelation functions of $ \{X_{t} \}$ .

      We have, $ E[X_{t}-X_{s}] = \sum_{k=0}^{\infty} k p_{k}(t,s)$ . So,

      $\displaystyle E[X_{t}]$ $\displaystyle =$ $\displaystyle E[X_{t}-X_{0}] = \sum_{k=0}^{\infty} k p_{k}(t,0)$  
        $\displaystyle =$ $\displaystyle \sum_{k=0}^{\infty} k \frac{e^{-\int_{0}^{t} \lambda(q) dq}
\left(\int_{0}^{t} \lambda(q) dq \right)^{k} }{k!}$  
        $\displaystyle =$ $\displaystyle \int_{0}^{t} \lambda(q)  dq$  

      Now, for $ s > t$ , write $ Y = X_{s} - X_{t}$ , which is independent of $ X_{t}$ , and let $ \lambda_{X} = \int_{0}^{t} \lambda(q)  dq$ and $ \lambda_{Y} = \int_{t}^{s} \lambda(q)  dq$ . Using $ E[X_{t}^{2}] = \lambda_{X}^{2} + \lambda_{X}$ for a Poisson variable,
      $\displaystyle
\begin{aligned}
E[X_{t}X_{s}] &= E[X_{t}(X_{t}+X_{s}-X_{t})] = E[X_{t}^2] + E[X_{t}(X_{s}-X_{t})] \\
&= E[X_{t}^2] + E[X_{t}Y] = E[X_{t}^2] + E[X_{t}]E[Y] \\
&= \lambda_{X}^{2} + \lambda_{X} + \lambda_{X} \lambda_{Y} = \lambda_{X}( \lambda_{X}+ 1 +\lambda_{Y} ) \\
&= \int_{0}^{t} \lambda(q) dq \left[ \int_{0}^{t} \lambda(q) dq + 1 + \int_{t}^{s} \lambda(q) dq \right]
\end{aligned}
$

      So,

      $\displaystyle R_X(t,s) = \int_{0}^{t} \lambda(q) dq \left[ \int_{0}^{s} \lambda(q)  dq + 1 \right] \quad \mbox{(for $s > t$)} $

      and

      $\displaystyle R_X(t,s) = \int_{0}^{s} \lambda(q) dq \left[ \int_{0}^{t} \lambda(q)  dq + 1 \right] \quad \mbox{(for $t > s$)} $

  4. Suppose that $ \{X_t, t \in \mathbb{R} \}$ is a w.s.s., zero-mean, Gaussian random process with autocorrelation function $ R_{X}(\tau), \tau \in \mathbb{R}$ and power spectral density $ S_X(\omega), \omega \in \mathbb{R}$ . Define the random process $ \{Y_{t}, t \in \mathbb{R} \}$ by $ Y_{t} = (X_{t})^{2}, t \in \mathbb{R}$ . Find the mean, autocorrelation, and power spectral density of $ \{Y_{t}, t \in \mathbb{R} \}$ .

    Mean :

    $\displaystyle \mu_{Y}(t) = E[X_{t}^{2}] = R_{X}(0) = \sigma^{2} $

    Autocorrelation :
    $\displaystyle
\begin{aligned}
R_{Y}(t,s) &= E[Y_{t}Y_{s}] = E[X_{t}^{2} X_{s}^{2}] \\
&= \left. \frac{\partial ^{4}}{\partial u^{2} \partial v^{2}} \Phi_{X_{t}X_{s}}(u,v) \right\vert _{u=v=0} \frac{1}{i^{4}} \\
&= E[X_{t}^{2}]E[X_{s}^{2}] + 2E[X_{t}X_{s}]^{2} \quad \mbox{(Gaussian moment factoring)} \\
&= R_{X}^{2}(0) + 2 R_{X}^{2} (\tau) \quad \quad (\tau = t-s)
\end{aligned}
$

    PSD :

    $\displaystyle S_{Y}(\omega) = \mathcal{F} \{R_{Y}(\tau) \} = 2 \pi R_{X}^{2}(0) \delta(\omega) + \frac{1}{\pi} (S_{X}(\omega) * S_{X}(\omega)) $
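    The moment-factoring result $E[X_{t}^{2}X_{s}^{2}] = R_{X}^{2}(0) + 2R_{X}^{2}(\tau)$ can be verified by sampling a zero-mean Gaussian pair; the covariance values below are assumed for illustration.

    ```python
    import numpy as np

    # Monte Carlo check of E[X_t^2 X_s^2] = R_X^2(0) + 2 R_X^2(tau) for a zero-mean
    # Gaussian pair with assumed R_X(0) = 1 and R_X(tau) = 0.6.
    rng = np.random.default_rng(3)
    r0, rtau, n = 1.0, 0.6, 2_000_000
    cov = [[r0, rtau], [rtau, r0]]
    X = rng.multivariate_normal([0.0, 0.0], cov, n)
    R_Y_mc = np.mean(X[:, 0]**2 * X[:, 1]**2)
    R_Y_th = r0**2 + 2 * rtau**2
    assert abs(R_Y_mc - R_Y_th) < 0.03
    ```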

  5. Suppose $ U$ and $ V$ are independent random variables with $ E[U] = E[V] = 0$ and $ \mathrm{var}(U) = \mathrm{var}(V) =
1$ . Define random processes by

    $\displaystyle X_t = U \cos t + V \sin t
\quad Y_{t} = U \sin t + V \cos t, \quad t \in \mathbb{R}.$

    Find the autocorrelation and cross-correlation functions of $ \{ X_{t},t \in \mathbb{R} \}$ and $ \{Y_{t}, t \in \mathbb{R} \}$ . Are $ \{X_{t} \}$ and $ \{Y_{t} \}$ jointly wide sense stationary? Are they individually wide sense stationary?


    $\displaystyle
\begin{aligned}
R_{X}(t,s) &= E[X_{t}X_{s}] = E[(U \cos t + V \sin t)(U \cos s + V \sin s)] \\
&= E[U^{2} \cos t \cos s + UV (\cos t \sin s + \sin t \cos s) + V^{2} \sin t \sin s] \\
&= \cos t \cos s \, E[U^{2}] + E[U] E[V] (\cos t \sin s + \sin t \cos s) + E[V^{2}] \sin t \sin s \\
&= \cos t \cos s + \sin t \sin s \\
&= \cos(t-s)
\end{aligned}
$

    Similarly,

    $\displaystyle R_{Y}(t,s) = \cos(t-s) $


    $\displaystyle
\begin{aligned}
R_{XY}(t,s) &= E[X_{t}Y_{s}] = E[(U \cos t + V \sin t)(U \sin s + V \cos s)] \\
&= E[U^{2} \cos t \sin s + UV (\cos t \cos s + \sin t \sin s) + V^{2} \sin t \cos s] \\
&= E[U^{2}] \cos t \sin s + E[UV] (\cos t \cos s + \sin t \sin s) + E[V^{2}] \sin t \cos s \\
&= \cos t \sin s + \sin t \cos s \\
&= \sin(t+s)
\end{aligned}
$

    $\displaystyle \mu_{X}(t) = 0 \quad \quad \mu_{Y}(t) = 0 $

    So, $ \{X_{t} \}$ and $ \{Y_{t} \}$ are individually WSS (constant means, autocorrelations depending only on $t-s$), but not jointly WSS, since $ R_{XY}(t,s) = \sin(t+s)$ is not a function of $t-s$ alone.
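    Both correlation functions are easy to confirm by Monte Carlo; taking $U, V$ standard normal is an assumed choice satisfying the zero-mean, unit-variance, independence conditions.

    ```python
    import numpy as np

    # Monte Carlo check of R_X(t,s) = cos(t-s) and R_XY(t,s) = sin(t+s),
    # assuming U, V iid standard normal (zero mean, unit variance, independent).
    rng = np.random.default_rng(4)
    n = 2_000_000
    U, V = rng.standard_normal(n), rng.standard_normal(n)
    t, s = 0.9, 0.4
    Xt = U * np.cos(t) + V * np.sin(t)
    Xs = U * np.cos(s) + V * np.sin(s)
    Ys = U * np.sin(s) + V * np.cos(s)
    assert abs(np.mean(Xt * Xs) - np.cos(t - s)) < 0.01   # autocorrelation
    assert abs(np.mean(Xt * Ys) - np.sin(t + s)) < 0.01   # cross-correlation
    ```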

Copyright 2008, Todd Moon. admin. (2006, June 13). Homework Solutions. Retrieved January 07, 2011, from USU OpenCourseWare: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/hw9sol.html. This work is licensed under a Creative Commons License.