
Basic Concepts of Random Processes


The Homogeneous Poisson Counting Process

Let $T = [0,\infty)$ . Suppose events occur randomly in time in the following fashion:

  1. The numbers of events occurring in non-overlapping intervals of time are independent.
  2. The probability of exactly one event in any interval of length $\Delta t$ is equal to $\lambda \Delta t + o(\Delta t)$ for $\Delta t$ sufficiently small.

    \begin{displaymath}\frac{o(\Delta t)}{\Delta t} \rightarrow 0 \text{ as } \Delta t
\rightarrow 0.
\end{displaymath}

    (That is, $o(\Delta t)$ is the generic term for terms of order higher than $\Delta t$.) Also, the probability of more than one event occurring during an interval of length $\Delta t$ is $o(\Delta t)$.

Now define a r.p. $\{X_t, t\in T\}$ by letting $X_t$ be the number of events occurring in the interval $[0,t]$. Then $X_t$ has the following properties:

  1. $X_t - X_s$ is Poisson with parameter $\lambda(t-s)$ :

    \begin{displaymath}P(X_t - X_s = k) = \frac{(\lambda (t-s))^k e^{-\lambda(t-s)}}{k!}.
\end{displaymath}

  2. $(X_{t_1} - X_{s_1})$ and $(X_{t_2} - X_{s_2})$ are independent r.v.s for all nonoverlapping intervals $[s_1,t_1]$ and $[s_2,t_2]$ .
The parameter $\lambda$ is called the rate of $X_t$. Property 2 follows from the first assumption. We say that such a process has independent increments.

Such a process is called a Poisson counting process (PCP) with rate $\lambda$. These two properties completely determine a PCP: all finite-dimensional distributions (fdds) of the process can be determined from them.
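The two defining properties suggest a direct way to simulate a sample path of a PCP. The sketch below (helper names are our own, not from the text) uses the standard fact that the gaps between events of a rate-$\lambda$ Poisson process are i.i.d. Exponential($\lambda$) random variables.

```python
import random

def poisson_counting_path(rate, t_end, seed=None):
    """Simulate a rate-`rate` Poisson counting process on [0, t_end].

    Returns the sorted event (jump) times; X_t is then the number of
    event times <= t.  Interarrival gaps are i.i.d. Exponential(rate).
    """
    rng = random.Random(seed)
    times = []
    t = 0.0
    while True:
        t += rng.expovariate(rate)  # time until the next event
        if t > t_end:
            break
        times.append(t)
    return times

def count_at(times, t):
    """X_t: the number of events in [0, t]."""
    return sum(1 for u in times if u <= t)

events = poisson_counting_path(rate=2.0, t_end=10.0, seed=1)
print(count_at(events, 5.0), count_at(events, 10.0))
```

Each call produces a staircase sample path: $X_t$ starts at 0 and jumps by 1 at each event time.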

How do we show the Poisson distribution property? Pick $t>s \geq 0$ . Let

\begin{displaymath}p_k(t,s) = \Pr(\text{exactly $k$ occurrences in $[s,t]$})
\end{displaymath}

for $k \geq 0$ . Then

\begin{displaymath}\begin{aligned}
p_k(t+\Delta t,s) &= \Pr(\text{$k$ occurrences in $[s,t]$})
\Pr(\text{no occurrences in $[t,t+\Delta t]$}) \\
&\quad + \Pr(\text{$k-1$ occurrences in $[s,t]$})
\Pr(\text{one occurrence in $[t,t+\Delta t]$}) \\
&\quad + \Pr(\text{$k-2$ or fewer occurrences in $[s,t]$})
\Pr(\text{all the rest}).
\end{aligned}\end{displaymath}

By assumption 2,

\begin{displaymath}\begin{aligned}
p_k(t+\Delta t, s) &= p_k(t,s)(1-\lambda \Delta t) + p_{k-1}(t,s)\lambda \Delta t + o(\Delta t) \\
&= \lambda \Delta t(p_{k-1}(t,s) - p_k(t,s)) + p_k(t,s) + o(\Delta t).
\end{aligned}\end{displaymath}

Now

\begin{displaymath}\begin{aligned}\partiald{}{t} p_k(t,s) &= \lim_{\Delta t
\rightarrow 0} \frac{p_k(t+\Delta t,s) - p_k(t,s)}{\Delta t} \\
&= \lambda(p_{k-1}(t,s) - p_k(t,s)) + \lim_{\Delta t \rightarrow 0}
\frac{o(\Delta t)}{\Delta t}.
\end{aligned}\end{displaymath}

So

\begin{displaymath}\partiald{}{t} p_k(t,s) = \lambda[p_{k-1}(t,s) - p_k(t,s)] \qquad t
\geq s.
\end{displaymath}

Now $p_{-1}(t,s) = 0$. When $k=0$ we get

\begin{displaymath}
\partiald{}{t} p_0(t,s) = -\lambda p_0(t,s),
\end{displaymath}

so

\begin{displaymath}p_0(t,s) = C(s) e^{-\lambda t}.
\end{displaymath}

We have another boundary condition: $p_0(s,s) = 1$ , giving

\begin{displaymath}p_0(t,s) = e^{-\lambda (t-s)}
\end{displaymath}

Now we could proceed to solve this set of equations for $k=1,2,\ldots$. For example, when $k=1$:

\begin{displaymath}\partiald{}{t} p_1(t,s) = \lambda(p_0(t,s) - p_1(t,s))
\end{displaymath}

This could be solved, e.g., using Laplace transforms. In general we would find

\begin{displaymath}p_k(t,s) = \frac{e^{-\lambda(t-s)}(\lambda(t-s))^k}{k!} \qquad
k=0,1,\ldots, \qquad t>s \geq 0.
\end{displaymath}
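As a check on this formula, the $k=1$ equation can be solved directly with the integrating factor $e^{\lambda t}$:

\begin{displaymath}\partiald{}{t}\left[e^{\lambda t} p_1(t,s)\right]
= \lambda e^{\lambda t} p_0(t,s) = \lambda e^{\lambda s},
\end{displaymath}

and integrating from $s$ to $t$ with the boundary condition $p_1(s,s) = 0$ gives

\begin{displaymath}p_1(t,s) = \lambda (t-s) e^{-\lambda(t-s)},
\end{displaymath}

which agrees with the general formula at $k=1$.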

As stated, the properties allow us to find all finite dimensional distributions. For example, suppose we want to find the joint distribution of $X_{t_1}$ and $X_{t_2}$ for $t_1 < t_2$ .

\begin{displaymath}\begin{aligned}
P(X_{t_1} = i, X_{t_2} = j) &= P(X_{t_1} = i, X_{t_2} - X_{t_1} = j-i) \\
&= P(X_{t_1} = i)\,P(X_{t_2} - X_{t_1} = j-i) \\
&= \frac{(\lambda t_1)^i e^{-\lambda t_1}}{i!} \cdot
\frac{(\lambda (t_2-t_1))^{j-i} e^{-\lambda(t_2-t_1)}}{(j-i)!}
\end{aligned}\end{displaymath}

where the factorization occurs because of independent increments.
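This factorization can be checked by Monte Carlo. The sketch below (illustrative code with made-up parameter values) simulates event times via exponential interarrivals and compares the empirical joint probability to the product of the two Poisson pmfs.

```python
import math
import random

def poisson_pmf(k, mean):
    """P(N = k) for N ~ Poisson(mean)."""
    return mean**k * math.exp(-mean) / math.factorial(k)

def joint_pmf_estimate(rate, t1, t2, i, j, n_trials=20000, seed=0):
    """Estimate P(X_{t1} = i, X_{t2} = j) by simulating event times
    (i.i.d. exponential interarrivals) and counting joint hits."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        n1 = n2 = 0
        u = 0.0
        while True:
            u += rng.expovariate(rate)  # next event time
            if u > t2:
                break
            n2 += 1          # event lands in [0, t2]
            if u <= t1:
                n1 += 1      # event also lands in [0, t1]
        hits += (n1 == i and n2 == j)
    return hits / n_trials

# Compare against the factored formula derived above.
rate, t1, t2, i, j = 1.0, 1.0, 2.5, 1, 2
est = joint_pmf_estimate(rate, t1, t2, i, j)
exact = poisson_pmf(i, rate * t1) * poisson_pmf(j - i, rate * (t2 - t1))
print(est, exact)
```

With 20,000 trials the two values should agree to about two decimal places.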

Draw a typical sample path...

The process is called homogeneous because the rate at which the events occur does not depend on $t$ .

Let us work out the mean and autocorrelation functions.

\begin{displaymath}\mu_X(t) = E[X_t] = E[X_t - X_0] = \lambda t
\end{displaymath}

(Poisson).

Assume $t>s$ :

\begin{displaymath}\begin{aligned}
E[X_tX_s] &= E[(X_t-X_s)X_s] + E[X_s^2] = E[X_t-X_s]E[X_s] + E[X_s^2] \\
&= \lambda(t-s)\lambda s + [\lambda s + (\lambda s)^2] \\
&= \lambda^2 ts + \lambda s
\end{aligned}\end{displaymath}

and if $t < s$, $E[X_tX_s] = \lambda^2ts + \lambda t$. In both cases, $E[X_tX_s] = \lambda^2 ts + \lambda \min(t,s)$.

This process is not WSS! The mean is not constant, and the autocorrelation is not a function of the time difference.

Now create a new process $Z_t = X_{t+\Delta t} - X_t$, for some fixed $\Delta t$. The random process $\{Z_t\}$ is WSS: the distribution of the increase in the count over an interval of fixed length does not depend on when the interval starts.
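To verify WSS, split overlapping increments into counts over disjoint intervals and use the moments computed above. For any $t$,

\begin{displaymath}E[Z_t] = E[X_{t+\Delta t} - X_t] = \lambda \Delta t,
\end{displaymath}

which is constant, and

\begin{displaymath}E[Z_{t+\tau} Z_t] = (\lambda \Delta t)^2 + \lambda \max(0, \Delta t - \vert \tau \vert),
\end{displaymath}

which depends only on the lag $\tau$: for $\vert\tau\vert \geq \Delta t$ the two increments cover nonoverlapping intervals and are independent, while for $\vert\tau\vert < \Delta t$ they share an interval of length $\Delta t - \vert\tau\vert$.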

We could create an inhomogeneous Poisson process if the probability of an occurrence in the interval $[t,t + \Delta t]$ is $\lambda_t \Delta t
+ o(\Delta t)$ . Then

\begin{displaymath}X_t - X_0 \sim \text{Poisson with parameter } \int_0^t \lambda_x\, dx.
\end{displaymath}
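An inhomogeneous process with a bounded rate function can be simulated by thinning (the Lewis-Shedler method, not described in the text): generate candidate events at a constant rate $\lambda_{\max} \geq \lambda_t$, then keep a candidate at time $t$ with probability $\lambda_t/\lambda_{\max}$. A minimal sketch, with an arbitrarily chosen rate function:

```python
import math
import random

def thin_inhomogeneous(rate_fn, rate_max, t_end, seed=None):
    """Simulate event times of an inhomogeneous Poisson process on
    [0, t_end] by thinning a homogeneous rate-`rate_max` process.

    Requires rate_fn(t) <= rate_max for all t in [0, t_end].
    """
    rng = random.Random(seed)
    events = []
    t = 0.0
    while True:
        t += rng.expovariate(rate_max)       # candidate event time
        if t > t_end:
            break
        if rng.random() < rate_fn(t) / rate_max:
            events.append(t)                 # accept with prob lambda_t / lambda_max
    return events

# Example: lambda_t = 1 + sin(t)^2, bounded above by rate_max = 2.
events = thin_inhomogeneous(lambda t: 1 + math.sin(t) ** 2, 2.0, 50.0, seed=3)
print(len(events))
```

The accepted count over $[0, t]$ is then Poisson with parameter $\int_0^t \lambda_x\, dx$ (here about 75 over $[0,50]$).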
Copyright 2008, by the Contributing Authors. admin. (2006, June 07). Basic Concepts of Random Processes. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/lecture6_5.htm. This work is licensed under a Creative Commons License.