Change of Variable Theorems



Changing variables: One dimension

A simply invertible function

Let $Y = g(X)$ , where $X$ is a continuous r.v. and $g$ is a one-to-one, onto, measurable function. If $g$ is increasing, then

\begin{displaymath}F_Y(y) = P(Y \leq y) = P(g(X) \leq y) = P(X \leq g^{-1}(y)) = F_X(g^{-1}(y)).
\end{displaymath}

(If $g$ is decreasing, the last step becomes $P(X \geq g^{-1}(y)) = 1 - F_X(g^{-1}(y))$ .)

So we can determine the distribution of $Y$ . Let us now take a different point of view, one that generalizes to higher dimensions and leads to a commonly used formula.

Consider an interval along the $X$ axis,

\begin{displaymath}P(x \leq X \leq x+dx) \approx f_X(x)dx
\end{displaymath}

Suppose the function $g(x)$ has a positive derivative. When $Y = g(X)$ , the corresponding interval along the $Y$ axis at the point $x$ has length $dy \approx \frac{dy}{dx} dx$ . The probability that $X$ falls in its interval equals the probability that $Y$ falls in its interval:

\begin{displaymath}P(x \leq X \leq x+dx) = P(y \leq Y \leq y + dy)
\end{displaymath}

where $y = g(x)$ , or equivalently, $x = g^{-1}(y)$ . Then

\begin{displaymath}f_X(x)dx \approx f_Y(y) dy
\end{displaymath}

That is

\begin{displaymath}f_Y(y) = \left. f_X(x) \frac{dx}{dy}\right\vert _{x=g^{-1}(y)} =
f_X(g^{-1}(y)) \frac{dx}{dy}.
\end{displaymath}

In the other case, where $g(x)$ has a negative derivative, $dy$ is negative and we have to take $f_X(x)dx = f_Y(y) (-dy)$ . Combining the two cases together we obtain

\begin{displaymath}\boxed{f_Y(y) = f_X(g^{-1}(y)) \left\vert \frac{dx}{dy}\right\vert =
\frac{f_X(g^{-1}(y))}{\vert dy/dx\vert} = \frac{f_X(g^{-1}(y))}{\vert g'(g^{-1}(y))\vert}.}
\end{displaymath}


\begin{example}
Let $Y = aX+b$. Then
\begin{displaymath}f_Y(y) = \frac{1}{\vert a\vert} f_X((y-b)/a).
\end{displaymath}\end{example}
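As a numerical sanity check of the linear-transform formula, the sketch below (hypothetical helper names `normal_pdf` and `linear_pdf`; $X$ is assumed standard normal purely for illustration) compares $\frac{1}{\vert a\vert} f_X((y-b)/a)$ against the density of $\Nc(b, a^2)$ computed directly:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Gaussian density, playing the role of f_X (illustrative choice).
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def linear_pdf(y, a, b, f_X):
    # f_Y(y) = (1/|a|) f_X((y - b)/a) for Y = aX + b
    return f_X((y - b) / a) / abs(a)

# Y = 2X + 3 with X ~ N(0,1) should be N(3, 2^2).
a, b = 2.0, 3.0
via_formula = linear_pdf(4.5, a, b, normal_pdf)
direct = normal_pdf(4.5, mu=3.0, sigma=2.0)
```

The two values agree to machine precision, as the change-of-variable formula predicts.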

\begin{example}
Suppose $f_X(x) = \frac{a/\pi}{x^2+a^2}$ (Cauchy). Let $Y = 1/X$, so that $x =
1/y$ and $\vert dx/dy\vert = 1/y^2$. Then
\begin{displaymath}f_Y(y) = \frac{1}{y^2} f_X(1/y) = \frac{1/a\pi}{y^2 + 1/a^2}
\end{displaymath}(Cauchy)
\end{example}
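This closure of the Cauchy family under reciprocals can be checked numerically; a minimal sketch, with an arbitrarily chosen parameter $a=2$ and evaluation point $y=0.7$:

```python
import math

def cauchy_pdf(x, a):
    # f_X(x) = (a/pi) / (x^2 + a^2)
    return (a / math.pi) / (x * x + a * a)

a = 2.0
y = 0.7
# Change of variables for Y = 1/X: x = 1/y, |dx/dy| = 1/y^2
via_formula = cauchy_pdf(1.0 / y, a) / (y * y)
# Claimed result: Cauchy again, now with parameter 1/a
direct = cauchy_pdf(y, 1.0 / a)
```

Both evaluations give the same number, confirming the algebra above at that point.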

\begin{example}
Suppose $X \sim \Uc(a,b)$, with $0 < a < b$, so that $f_X(x) =
\frac{1}{b-a}$ for $a < x < b$. Let $Y = 1/X$. Then
\begin{displaymath}f_Y(y) = \frac{1}{(b-a)y^2} \text{ for } \frac{1}{b} < y <
\frac{1}{a}.
\end{displaymath}\end{example}

\begin{example}
Let $Y = e^X$. Then
\begin{displaymath}f_Y(y) = \frac{1}{y} f_X(\ln y).
\end{displaymath}If $X$ is Gaussian with mean $\mu$ and variance $\sigma^2$, then
\begin{displaymath}f_Y(y) = \frac{1}{y\sigma\sqrt{2\pi}} e^{-(\ln y - \mu)^2/2\sigma^2}.
\end{displaymath}This density is {\bf log-normal.}
\end{example}
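The log-normal density $\frac{1}{y} f_X(\ln y)$ should agree with the numerical derivative of the c.d.f. $F_Y(y) = \Phi((\ln y - \mu)/\sigma)$; a short sketch verifying this, with illustrative values of $\mu$ and $\sigma$:

```python
import math
from statistics import NormalDist

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def lognormal_pdf(y, mu, sigma):
    # f_Y(y) = (1/y) f_X(ln y) for Y = e^X, X ~ N(mu, sigma^2)
    return normal_pdf(math.log(y), mu, sigma) / y

mu, sigma = 0.5, 1.2
y = 2.0
via_formula = lognormal_pdf(y, mu, sigma)

# Central-difference derivative of F_Y(y) = Phi((ln y - mu)/sigma)
F = lambda t: NormalDist(mu, sigma).cdf(math.log(t))
h = 1e-6
numeric = (F(y + h) - F(y - h)) / (2 * h)
```

The density formula and the numerical derivative of the c.d.f. match to within the finite-difference error.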

\begin{example}
Suppose $y = g(x) = \tan x$, or $x = \tan^{-1} y$. This has an ...
...splaymath}f_Y(y) = \frac{1}{\pi(1+y^2)}
\end{displaymath}(Cauchy).
\end{example}

\begin{example}
Suppose $X$ has continuous distribution $F_X(x)$, and let $Y = F_X(X)$. Then
for $0 \leq y \leq 1$,
\begin{displaymath}F_Y(y) = P(F_X(X) \leq y) = P(X \leq F_X^{-1}(y)) = y,
\end{displaymath}so $Y \sim \Uc(0,1)$. In the image processing literature, this is called {\em histogram
equalization}.
\end{example}
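A Monte Carlo sketch of this probability integral transform, using an exponential $X$ (an illustrative choice, since its c.d.f. $F_X(x) = 1 - e^{-x}$ is available in closed form) and checking that $Y = F_X(X)$ looks uniform:

```python
import math
import random

random.seed(1)

# X ~ exponential(rate 1): F_X(x) = 1 - exp(-x).  Y = F_X(X) should be U(0,1).
n = 200_000
ys = [1.0 - math.exp(-random.expovariate(1.0)) for _ in range(n)]

mean_y = sum(ys) / n                       # should be near 1/2
frac_below_03 = sum(y < 0.3 for y in ys) / n  # should be near 0.3
```

Both empirical statistics land close to the uniform values, as the example predicts.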

\begin{example}
Let $X \sim \Uc(0,1)$, and let $Y$ have a specified c.d.f. $F_Y(y)$. Take
$Y = F_Y^{-1}(X)$. Then
\begin{displaymath}P(Y \leq y) = P(F_Y^{-1}(X) \leq y) = P(X \leq F_Y(y)) = F_Y(y),
\end{displaymath}so $Y$ has the desired distribution. Given a uniform random variable
we can (in principle!) transform it to produce any other continuous
distribution.
\end{example}
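This inverse-c.d.f. construction is how many random number generators work in practice. A minimal sketch (the target distribution, exponential with rate $\lambda$, and the helper name `sample_exponential` are illustrative choices, not from the notes):

```python
import math
import random

random.seed(2)

def sample_exponential(lam):
    # Inverse c.d.f. method: F_Y(y) = 1 - e^{-lam*y}, so F_Y^{-1}(u) = -ln(1-u)/lam.
    u = random.random()              # X ~ U(0,1)
    return -math.log(1.0 - u) / lam

lam = 2.0
n = 200_000
samples = [sample_exponential(lam) for _ in range(n)]
sample_mean = sum(samples) / n       # should be near 1/lam = 0.5
```

The empirical mean of the generated samples is close to $1/\lambda$, consistent with an exponential distribution.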

Multiple inverses

It may happen that $g$ is not a uniquely invertible function. That is, for a given $y$ there may be more than one value of $x$ such that $y = g(x)$ . For example, $y = g(x) = x^2$ : then $x = \sqrt{y}$ and $x = -\sqrt{y}$ are both inverses.

We will demonstrate the concept for two solutions. Let $y = g(x_1) =
g(x_2)$ , assuming to be specific that the slope is positive at $x_1$ and negative at $x_2$ .

\begin{displaymath}P(y < Y < y + dy) = P(x_1 < X < x_1 + dx_1) + P(x_2 + dx_2 < X <
x_2)
\end{displaymath}

That is

\begin{displaymath}f_Y(y)dy = f_X(x_1) dx_1 + f_X(x_2) \vert dx_2\vert
\end{displaymath}

From this,

\begin{displaymath}f_Y(y) = f_X(x_1)\left\vert\frac{dx_1}{dy}\right\vert +
f_X(x_2)\left\vert\frac{dx_2}{dy}\right\vert
\end{displaymath}

This is sometimes written

\begin{displaymath}f_Y(y) = \frac{f_X(x_1)}{\vert g'(x_1)\vert} + \frac{f_X(x_2)}{\vert g'(x_2)\vert}
\end{displaymath}

In general, with $n$ solutions $x_1, x_2, \ldots, x_n$ we have

\begin{displaymath}\boxed{f_Y(y) = \frac{f_X(x_1)}{\vert g'(x_1)\vert} +
\frac{f_X(x_2)}{\vert g'(x_2)\vert} + \cdots + \frac{f_X(x_n)}{\vert g'(x_n)\vert}}
\end{displaymath}
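The classic instance of the multiple-inverse formula is $Y = X^2$ with $X$ standard normal (an illustrative choice): the two inverses $x = \pm\sqrt{y}$ each contribute a term with $\vert g'(x)\vert = 2\sqrt{y}$, and the sum should reproduce the chi-square density with one degree of freedom. A quick numerical check:

```python
import math

def normal_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def square_pdf(y):
    # Two inverses x = +-sqrt(y); |g'(x)| = |2x| = 2 sqrt(y) at both.
    r = math.sqrt(y)
    return (normal_pdf(r) + normal_pdf(-r)) / (2 * r)

y = 1.3
via_formula = square_pdf(y)
# Closed form for chi-square with 1 degree of freedom:
chi2_1 = math.exp(-y / 2) / math.sqrt(2 * math.pi * y)
```

The two-term sum matches the chi-square density exactly, as expected.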


\begin{example}
Suppose $X \sim \Uc(-\pi,\pi)$ and $Y = a \sin(X+\theta)$. For each $y$ with
$\vert y\vert < a$ there are two solutions $x$ in the interval, and at each of them
$\vert g'(x)\vert = a\vert\cos(x+\theta)\vert = \sqrt{a^2 - y^2}$. Then
\begin{displaymath}f_Y(y) = \frac{2 \cdot 1/2\pi}{\sqrt{a^2-y^2}} =
\frac{1}{\pi\sqrt{a^2 - y^2}}, \qquad \vert y\vert
< a.
\end{displaymath}\end{example}
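A Monte Carlo sketch of this arcsine-type density (taking $a=1$, $\theta=0$ for simplicity), comparing an empirical tail probability of $Y = \sin(X)$ against the integral of $\frac{1}{\pi\sqrt{1-y^2}}$:

```python
import math
import random

random.seed(3)

# X ~ U(-pi, pi), Y = sin(X); claimed density 1/(pi*sqrt(1-y^2)) for |y| < 1.
n = 200_000
ys = [math.sin(random.uniform(-math.pi, math.pi)) for _ in range(n)]
emp = sum(y <= 0.5 for y in ys) / n

# Analytic: integral of the density from -1 to 1/2 is (arcsin(1/2) + pi/2)/pi = 2/3.
theory = (math.asin(0.5) + math.pi / 2) / math.pi
```

The simulated fraction agrees with the analytic value of $2/3$ to within Monte Carlo error.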

$g(X)$ constant in an interval

If the function $g(X)$ is constant over any interval, then there is no inverse, nor even multiple inverses. However, we can still compute the distribution. Let $g(x) = y_1$ for $x_0 < x \leq x_1$ (i.e., constant). Then

\begin{displaymath}P(Y = y_1) = P(x_0 < X \leq x_1) = F_X(x_1) - F_X(x_0).
\end{displaymath}

Hence, there is probability mass at the point $y_1$ . This results in a c.d.f. which is not continuous at that point.
\begin{example}
Suppose $g(x)$ is the limiter:
\begin{displaymath}g(x) =
\begin{cases}
-b & x < -b \\
x & -b \leq x \leq b \\
b & x > b.
\end{cases}
\end{displaymath}Then $P(Y = -b) = F_X(-b)$ and $P(Y = b) = 1-F_X(b)$.
For $-b \leq Y < b$, $F_Y(y) = F_X(y)$.
\end{example}
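The point masses produced by a limiter are easy to see in simulation. A sketch with $X$ standard normal and $b = 1$ (illustrative choices), checking that the mass at $+b$ matches $1 - F_X(b)$:

```python
import random
from statistics import NormalDist

random.seed(4)

def limiter(x, b):
    # Clip x to [-b, b]; g is constant outside that interval.
    return max(-b, min(b, x))

b = 1.0
n = 200_000
ys = [limiter(random.gauss(0.0, 1.0), b) for _ in range(n)]

p_top = sum(y == b for y in ys) / n     # empirical point mass at +b
theory = 1.0 - NormalDist().cdf(b)      # P(X > b) = 1 - F_X(b), about 0.159
```

A noticeable fraction of the samples sits exactly at $+b$, matching $1 - F_X(b)$, which is exactly the probability mass the c.d.f. jump represents.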

\begin{example}
Suppose
\begin{displaymath}g(x) =
\begin{cases}
1 & x > 0 \\
-1 & x \leq 0.
\end{cases}
\end{displaymath}Then $P(Y = 1) = 1 - F_X(0)$ and $P(Y = -1) = F_X(0)$.
We have a two-valued discrete random variable.
\end{example}
Copyright 2008, Todd Moon. (2006, May 31). Change of Variable Theorems. Retrieved January 07, 2011, from Free Online Course Materials, USU OpenCourseWare: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/lec4_1.html. This work is licensed under a Creative Commons License.