
More on Random Variables


Characteristic functions

The characteristic function is essentially the Fourier transform of the p.d.f. or p.m.f. Characteristic functions are useful in practice not for the usual reasons engineers use Fourier transforms (e.g., frequency content), but because they provide a means of computing moments (as we will see), and because they are useful in finding distributions of sums of independent random variables.

Let $X$ be a r.v. The {\bf characteristic function} (ch.f.) of $X$ is

\begin{displaymath}\phi_X(u) = E[e^{iuX}], \qquad u \in \Rbb.
\end{displaymath}

(Here, $i = \sqrt{-1}$. We will not use $\sqrt{-1} = j$.)
Let us write some more explicit formulas. Suppose $X$ is a continuous random variable. Then (by the law of the unconscious statistician)

\begin{displaymath}\boxed{\phi_X(u) = \int_{-\infty}^\infty e^{iux} f_X(x) dx.}
\end{displaymath}

This may be recognized as the Fourier transform of $f_X(x)$, where $u$ is the ``frequency'' variable. (Comment on sign of exponent.) Note that given $\phi_X$ we can determine $f_X$ by an inverse Fourier transform:

\begin{displaymath}f_X(x) = \frac{1}{2\pi} \int_{-\infty}^\infty e^{-iux} \phi_X(u) du.
\end{displaymath}

If $X$ is a discrete r.v.,

\begin{displaymath}\boxed{\phi_X(u) = \sum_i e^{i u x_i} p_X(x_i),}
\end{displaymath}

which we recognize as the discrete-time Fourier transform, and as before $u$ is the ``frequency'' variable. (Comment on the sign of the exponent.) Given a $\phi_X$, we can find $p_X$ by the inverse discrete-time Fourier transform.


Some properties of the characteristic function:

  1. $\phi_X(0) = 1$. (Why?)
  2. $\vert\phi_X(u)\vert \leq 1 \forall u$ . (Why?)
  3. $\phi_X$ and $f_X$ form a unique Fourier transform pair.

    \begin{displaymath}f_X \leftrightarrow \phi_X.
    \end{displaymath}

    Thus, $\phi_X$ provides yet another way of displaying the probability structure of $X$.
  4. $\phi_X(u) = \int_{-\infty}^\infty e^{iux} dF_X(x)$ . This is referred to as the Fourier-Stieltjes transform of $F_X$ .
  5. $\phi_X$ is uniformly continuous.
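
Properties 1 and 2 are easy to check numerically. The following is a minimal sketch (the fair six-sided die is an assumed example, not from the notes) that computes $\phi_X(u) = \sum_i e^{iux_i} p_X(x_i)$ directly from a p.m.f.:

```python
import cmath

def char_fn(pmf, u):
    """phi_X(u) = sum_i e^{i u x_i} p_X(x_i) for a discrete p.m.f."""
    return sum(cmath.exp(1j * u * x) * p for x, p in pmf.items())

# Assumed example: a fair six-sided die.
die = {x: 1 / 6 for x in range(1, 7)}

print(abs(char_fn(die, 0.0) - 1) < 1e-9)  # property 1: phi_X(0) = 1
print(abs(char_fn(die, 2.3)) <= 1.0)      # property 2: |phi_X(u)| <= 1
```

Property 1 holds because $\phi_X(0) = E[e^0] = 1$ (probabilities sum to one), which is exactly what the first check exercises.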

For an r.v. $X$, the $k$th {\bf moment} of $X$ is $E[X^k]$, for $k
\in \Nbb$.
We can write

\begin{displaymath}E[X^k]= \int_{-\infty}^\infty x^k dF_X(x).
\end{displaymath}

Theorem 1   If $E[\vert X\vert^k] < \infty$ then

\begin{displaymath}\boxed{E[X^k] = \left. i^{-k} \frac{d^k}{du^k} \phi_X(u)
\right\vert _{u=0}.}
\end{displaymath}

That is, we can obtain moments by differentiating the characteristic function. For this reason, characteristic functions (or functions which are defined very similarly) are sometimes referred to as moment generating functions.
Proof sketch: expanding the exponential in a power series and taking expectations term by term,

\begin{displaymath}\phi_X(u) = E[e^{iuX}] = E \sum_{k=0}^\infty \frac{(iuX)^k}{k!}
= \sum_{k=0}^\infty \frac{(iu)^k E[X^k]}{k!},
\end{displaymath}

so that

\begin{displaymath}\left. \frac{d^j}{du^j} \phi_X(u) \right\vert _{u=0} = i^j E[X^j].
\end{displaymath}
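
The theorem can also be checked numerically. Here is a sketch (assuming a fair six-sided die as the example, and a central-difference approximation in place of the analytic derivative) that recovers $E[X] = i^{-1} \phi_X'(0)$:

```python
import cmath

def char_fn(pmf, u):
    """phi_X(u) = sum_i e^{i u x_i} p_X(x_i) for a discrete p.m.f."""
    return sum(cmath.exp(1j * u * x) * p for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}  # fair die: E[X] = 3.5

# E[X] = i^{-1} phi_X'(0); approximate phi_X'(0) by a central difference.
h = 1e-5
dphi = (char_fn(die, h) - char_fn(die, -h)) / (2 * h)
mean = (dphi / 1j).real
print(round(mean, 6))  # approximately 3.5
```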

Example: $X \sim \Nc(\mu,\sigma^2)$. Then it can be shown (homework!) that

\begin{displaymath}\phi_X(u) = \exp(iu\mu - \sigma^2 u^2/2),
\end{displaymath}

from which

\begin{displaymath}E[X] = \mu, \qquad E[X^2] = \sigma^2 + \mu^2,
\end{displaymath}

so that $\var(X) = \sigma^2$.
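
The second moment can be checked numerically as well. This sketch (with assumed parameter values $\mu = 1$, $\sigma^2 = 4$) applies a second central difference to the Gaussian ch.f., using $E[X^2] = i^{-2}\phi_X''(0) = -\phi_X''(0)$:

```python
import cmath

MU, SIGMA2 = 1.0, 4.0  # assumed parameter values for the check

def phi(u):
    # Gaussian ch.f. from the example: exp(i u mu - sigma^2 u^2 / 2)
    return cmath.exp(1j * u * MU - SIGMA2 * u**2 / 2)

# E[X^2] = i^{-2} phi''(0) = -phi''(0); second central difference for phi''(0).
h = 1e-4
d2 = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2
second_moment = -d2.real
print(round(second_moment, 3))  # approximately sigma^2 + mu^2 = 5.0
```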

For a joint r.v. $(X,Y)$ we define a {\bf joint characteristic function}

\begin{displaymath}\phi_{XY}(u,v) = E[e^{iuX + ivY}].
\end{displaymath}

Then $\phi_{XY}$ and $F_{XY}$ are uniquely related (two-dimensional Fourier transforms).

The $n$th order moments of two random variables are the quantities $E[X^k Y^l]$, and the $n$th order central moments are

\begin{displaymath}E[(X-E[X])^k (Y-E[Y])^l], \qquad k \geq 0, l \geq 0, k+l = n.
\end{displaymath}

For $n=2$, the second order moments are $E[X^2]$, $E[Y^2]$ and $E[XY]$. The central moments are $\cov(X,Y)$, $\var(X)$ and $\var(Y)$.

  1. Moments:

    \begin{displaymath}\mu_{k,l} = i^{-(k+l)} \left. \frac{\partial^{k+l}}{\partial u^k \partial v^l}
\phi_{XY}(u,v) \right\vert _{u=v=0}
    \end{displaymath}

  2. $X$ and $Y$ are independent if and only if $\phi_{X,Y}(u,v) =
\phi_X(u) \phi_Y(v)$ for all $(u,v) \in \Rbb^2$ .
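
The factorization in property 2 is easy to see numerically. This sketch (an assumed example: an independent fair coin and fair die) checks $\phi_{X,Y}(u,v) = \phi_X(u)\phi_Y(v)$ at one point:

```python
import cmath

def char_fn(pmf, u):
    """Marginal ch.f. phi_X(u) for a discrete p.m.f."""
    return sum(cmath.exp(1j * u * x) * p for x, p in pmf.items())

def joint_char_fn(joint_pmf, u, v):
    """Joint ch.f. phi_XY(u, v) = E[e^{i u X + i v Y}] for a discrete joint p.m.f."""
    return sum(cmath.exp(1j * (u * x + v * y)) * p
               for (x, y), p in joint_pmf.items())

coin = {0: 0.5, 1: 0.5}
die = {x: 1 / 6 for x in range(1, 7)}
# Independence: the joint p.m.f. is the product of the marginals.
joint = {(x, y): px * py for x, px in coin.items() for y, py in die.items()}

u, v = 0.7, -1.3
lhs = joint_char_fn(joint, u, v)
rhs = char_fn(coin, u) * char_fn(die, v)
print(abs(lhs - rhs) < 1e-12)  # True: phi_XY factors for independent X, Y
```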

Sums of independent random variables

Let $X$ and $Y$ be independent r.v.s, and let

\begin{displaymath}Z = X+Y.
\end{displaymath}

Then

\begin{displaymath}\phi_Z(u) = E[\exp(iuZ)] = E[\exp(iuX + iuY)] = \phi_{X,Y}(u,u).
\end{displaymath}

But also

\begin{displaymath}E[\exp(iuX + iuY)]= E[\exp(iuX) \exp(iuY)] = \phi_X(u) \phi_Y(u),
\end{displaymath}

using independence. Hence

\begin{displaymath}\boxed{\phi_Z(u) = \phi_X(u) \phi_Y(u).}
\end{displaymath}

If $X$ and $Y$ are continuous r.v.s, then so is $Z$ .

\begin{displaymath}f_Z(z) = \Fc^{-1}[\phi_Z(u)] = \Fc^{-1}[\phi_X(u) \phi_Y(u)] =
f_X(z) * f_Y(z)
\end{displaymath}

by the convolution theorem.

Thus, when continuous independent random variables are added, the p.d.f. of the sum is the convolution of the p.d.f.s; similarly, when discrete independent r.v.s are added, the p.m.f. of the sum is the convolution of the p.m.f.s.
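
As an illustration of the discrete case, the sketch below (two independent fair dice, an assumed example) computes the p.m.f. of $Z = X + Y$ by convolving the two p.m.f.s and checks it against direct enumeration of outcomes:

```python
from collections import defaultdict

die = {x: 1 / 6 for x in range(1, 7)}  # p.m.f. of one fair die

# p.m.f. of Z = X + Y via the convolution of the two p.m.f.s.
conv = defaultdict(float)
for x, px in die.items():
    for y, py in die.items():
        conv[x + y] += px * py

# Direct enumeration of the 36 equally likely outcomes gives the same answer.
direct = defaultdict(float)
for x in range(1, 7):
    for y in range(1, 7):
        direct[x + y] += 1 / 36

print(all(abs(conv[z] - direct[z]) < 1e-12 for z in range(2, 13)))  # True
```

The most likely sum is $z = 7$, with probability $6/36$, which the convolution reproduces.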

An example: Jointly Gaussian

If $(X,Y) \sim \Nc(\mu_x,\mu_y,\sigma_x^2,\sigma_y^2,\rho)$ , then

\begin{displaymath}\phi_{X,Y}(u,v) = \exp[ i(u \mu_x + v \mu_y) - \frac{1}{2}(u^2
\sigma_x^2 + v^2 \sigma_y^2 + 2 u v \rho \sigma_x \sigma_y)].
\end{displaymath}

We make an observation here: the ``form'' of the Gaussian p.d.f. is the exponential of a quadratic, and the Fourier transform of an exponential of a quadratic is again an exponential of a quadratic. This little fact gives rise to much of the analytical and practical usefulness of Gaussian r.v.s.

Characteristic functions and marginals

We observe that

\begin{displaymath}\phi_{X,Y}(u,0) = \phi_X(u).
\end{displaymath}

In our Gaussian example, we have

\begin{displaymath}\phi_X(u) = \phi_{X,Y}(u,0) = \exp(iu\mu_x - \sigma_x^2 u^2/2),
\end{displaymath}

which is the ch.f. for a Gaussian,

\begin{displaymath}X \sim \Nc(\mu_x,\sigma_x^2).
\end{displaymath}

We could, of course, have obtained a similar result via integration, but this is much easier.
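
A numerical sanity check of this shortcut (a sketch with assumed parameter values, not from the notes): plugging $v = 0$ into the jointly Gaussian ch.f. reproduces the marginal Gaussian ch.f.

```python
import cmath

MUX, MUY = 1.0, -2.0           # assumed means
SX2, SY2, RHO = 4.0, 9.0, 0.5  # assumed variances and correlation

def phi_xy(u, v):
    """Jointly Gaussian ch.f. phi_{X,Y}(u, v) from the formula above."""
    sx, sy = SX2 ** 0.5, SY2 ** 0.5
    return cmath.exp(1j * (u * MUX + v * MUY)
                     - 0.5 * (u**2 * SX2 + v**2 * SY2 + 2 * u * v * RHO * sx * sy))

def phi_x(u):
    """Marginal ch.f. of X ~ N(mu_x, sigma_x^2)."""
    return cmath.exp(1j * u * MUX - SX2 * u**2 / 2)

# phi_{X,Y}(u, 0) = phi_X(u) for every u.
print(all(abs(phi_xy(u, 0.0) - phi_x(u)) < 1e-12 for u in (-1.0, 0.3, 2.0)))  # True
```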
Copyright 2008, by the Contributing Authors. admin. (2006, May 31). More on Random Variables. Free Online Course Materials, USU OpenCourseWare. This work is licensed under a Creative Commons License.