
# More on Random Variables


## Expectations of functions of two r.v.s

Let $g : \mathbb{R}^2 \to \mathbb{R}$ be measurable (e.g., $g(x,y) = xy$). Then for a bivariate r.v. $(X,Y)$ we can define

$$E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{XY}(x,y)\, dx\, dy.$$

Properties:
1. If $g(x,y) = g_1(x) + g_2(y)$ then $E[g(X,Y)] = E[g_1(X)] + E[g_2(Y)]$.
2. If $X$ and $Y$ are independent then $E[g_1(X)\,g_2(Y)] = E[g_1(X)]\,E[g_2(Y)]$.
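Property 2 can be checked by Monte Carlo. A minimal sketch, assuming illustrative choices not taken from the text: independent uniform samples for $X$ and $Y$, with $g_1(x) = x^2$ and $g_2(y) = y + 1$.

```python
import random

random.seed(0)
N = 200_000

# Independent X ~ Uniform(0,1) and Y ~ Uniform(0,1) (assumed for illustration)
xs = [random.random() for _ in range(N)]
ys = [random.random() for _ in range(N)]

g1 = lambda x: x * x      # illustrative choice g1(x) = x^2
g2 = lambda y: y + 1.0    # illustrative choice g2(y) = y + 1

# Sample estimates of E[g1(X) g2(Y)] and E[g1(X)] E[g2(Y)]
lhs = sum(g1(x) * g2(y) for x, y in zip(xs, ys)) / N
rhs = (sum(map(g1, xs)) / N) * (sum(map(g2, ys)) / N)
print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

Because the samples are drawn independently, both estimates converge to the same value as $N$ grows.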

Comments: If $X$ and $Y$ are independent, then $E[XY] = E[X]E[Y]$.

However, if $E[XY] = E[X]E[Y]$, this does not mean that they are independent. (Uncorrelated does not imply independence.)

However, if $E[g_1(X)\,g_2(Y)] = E[g_1(X)]\,E[g_2(Y)]$ for all appropriate functions $g_1$ and $g_2$, then $X$ and $Y$ are independent. In fact, this condition is necessary and sufficient for independence.
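The classic counterexample to "uncorrelated implies independent" can be checked numerically. A minimal sketch, with the standard illustrative choice (not taken from the text) $X \sim N(0,1)$ and $Y = X^2$: here $Y$ is completely determined by $X$, yet the two are uncorrelated.

```python
import random

random.seed(1)
N = 200_000

xs = [random.gauss(0.0, 1.0) for _ in range(N)]  # X ~ N(0,1)
ys = [x * x for x in xs]                          # Y = X^2: fully dependent on X

# Sample covariance Cov(X, Y) = E[X^3] - E[X] E[X^2], which is 0 for N(0,1)
mx = sum(xs) / N
my = sum(ys) / N
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N
print(cov)  # near 0: uncorrelated, yet Y is a deterministic function of X
```

The sample covariance hovers near zero even though knowing $X$ tells you $Y$ exactly, so independence clearly fails.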

1. $\mathrm{Cov}(X,Y) = E\big[(X - E[X])(Y - E[Y])\big] = E[XY] - E[X]E[Y]$.
2. If $X$ and $Y$ are independent then $\mathrm{Cov}(X,Y) = 0$. If $\mathrm{Cov}(X,Y) = 0$, we say that $X$ and $Y$ are uncorrelated.

Again, uncorrelated does not imply independence.

3. $\mathrm{Var}(X+Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X,Y)$. If $\mathrm{Cov}(X,Y) = 0$ then $\mathrm{Var}(X+Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$.
4. $\mathrm{Cov}(aX + c,\, bY + d) = ab\,\mathrm{Cov}(X,Y)$ for all constants $a, b, c, d$. Thus the covariance depends on the scale of $X$ and $Y$.
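Property 3 is an algebraic identity, so it holds exactly even for sample moments (with the same normalization). A minimal sketch, assuming an illustrative construction not taken from the text ($Y = 0.5X + \text{noise}$, so the covariance term is nonzero):

```python
import random

random.seed(2)
N = 200_000

xs = [random.gauss(0.0, 1.0) for _ in range(N)]
# Assumed construction: Y correlated with X via Y = 0.5 X + independent noise
ys = [0.5 * x + random.gauss(0.0, 1.0) for x in xs]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

zs = [x + y for x, y in zip(xs, ys)]
lhs = var(zs)                                  # Var(X + Y)
rhs = var(xs) + var(ys) + 2.0 * cov(xs, ys)    # Var(X) + Var(Y) + 2 Cov(X, Y)
print(lhs, rhs)  # equal up to floating-point rounding
```

Unlike the Monte Carlo checks above, the two sides here match to machine precision, because the identity is exact for any data set, not just in expectation.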

The correlation coefficient normalizes away this scale dependence:

$$\rho = \frac{\mathrm{Cov}(X,Y)}{\sigma_X \sigma_Y}.$$

1. $|\rho| \le 1$. This can be shown using the Cauchy-Schwarz inequality.

$|\rho| = 1$ iff $X$ and $Y$ are linearly related,

$$Y = aX + b$$

for some constants $a, b$ with $a \neq 0$.
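When the linear relation is exact, the sample correlation coefficient equals $\pm 1$ to machine precision (the sign following the sign of $a$). A minimal sketch, with the illustrative choice $a = -2$, $b = 5$ (not taken from the text):

```python
import math
import random

random.seed(3)
N = 100_000

xs = [random.random() for _ in range(N)]
ys = [-2.0 * x + 5.0 for x in xs]   # exact linear relation Y = aX + b, a = -2

def mean(v):
    return sum(v) / len(v)

mx, my = mean(xs), mean(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / N)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / N)
rho = cov / (sx * sy)
print(rho)  # -1 up to rounding, since a < 0
```

Algebraically, $\mathrm{Cov}(X, aX+b) = a\,\mathrm{Var}(X)$ and $\sigma_Y = |a|\,\sigma_X$, so the ratio collapses to $a/|a| = \pm 1$.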

As we have observed before, if $X$ and $Y$ are jointly Gaussian and $\rho = 0$, then they are independent. Otherwise, $\rho = 0$ does not imply independence.
Copyright 2008, by the Contributing Authors. admin. (2006, May 31). More on Random Variables. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/lec2_5.html. This work is licensed under a Creative Commons License.