Homework Solutions

\begin{displaymath}cov(aX+b, cY+d) =ac\,cov(X,Y) \end{displaymath}

\begin{displaymath}cov(X,Y)= E[(X-E[X])(Y-E[Y])]\end{displaymath}


\begin{displaymath}E(aX+bY)=aE[X]+bE[Y] \end{displaymath}

\begin{displaymath}cov(aX+b, cY+d)=E[(aX+b-aE[X]-b)(cY+d-cE[Y]-d)]=E[ac(X-E[X])(Y-E[Y])]\end{displaymath}


\begin{displaymath}cov(aX+b,cY+d)= ac\underbrace{E[(X-\mu_{x})(Y-\mu_{y})]}_{cov(X,Y)} =ac\,cov(X,Y)\end{displaymath}
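As a quick sanity check, the bilinearity identity can be verified numerically; it is an algebraic identity, so it holds exactly (up to rounding) for the sample covariance of any data set. The constants, sample size, and distributions below are arbitrary choices.

```python
import random

def cov(xs, ys):
    """Sample covariance: mean of (x - mean(x))(y - mean(y)) over paired data."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(1000)]
ys = [0.5 * x + random.gauss(0, 1) for x in xs]  # correlated with X

a, b, c, d = 2.0, 3.0, -1.5, 4.0
lhs = cov([a * x + b for x in xs], [c * y + d for y in ys])
rhs = a * c * cov(xs, ys)
print(abs(lhs - rhs) < 1e-9)  # True: the identity is exact, not asymptotic
```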



\begin{displaymath}X\sim \mathcal{N}(0,\sigma^{2})\end{displaymath}

\begin{displaymath}\Phi_{X}(u) = E[e^{iuX}]= \int_{-\infty}^{\infty}e^{iux}f_{X}(x)\,dx \end{displaymath}

\begin{displaymath}f_{X}(x)= \frac{1}{\sqrt{2\pi}\,\sigma_x} e^{-\frac{(x-\mu_x)^2}{2\sigma_{x}^{2}}} \end{displaymath}

\begin{displaymath}\Phi_{X}(u) = e^{iu\mu-\frac{1}{2}u^2\sigma^2} = e^{-\frac{1}{2}u^2\sigma^2} \qquad \mbox{since } \mu=0 \end{displaymath}

\begin{displaymath}E[X^{k}]= i^{-k} \frac{d^{k}}{du^{k}}\Phi_x(u)\mid_{u=0} \end{displaymath}

\begin{displaymath}= i^{-k} \frac{d^{k}}{du^{k}}e^{-\frac{1}{2}u^2\sigma^2}\mid_{u=0} \end{displaymath}

\begin{displaymath}=\frac{d^{k}}{du^{k}} i^{-k}\left(1-x+\frac{x^2}{2!}-\frac{x^3}{3!}+\cdots\right)\mid_{u=0}, \qquad\mbox{where } x=\frac{\sigma^{2}u^{2}}{2} \end{displaymath}

\begin{displaymath}=\frac{d^{k}}{du^{k}} i^{-k}\left(1-\frac{\sigma^{2}u^{2}}{2^{1}1!}+\frac{\sigma^{4}u^{4}}{2^{2}2!}-\frac{\sigma^{6}u^{6}}{2^{3}3!}+\frac{\sigma^{8}u^{8}}{2^{4}4!}-\cdots\right)\mid_{u=0} \end{displaymath}

\begin{displaymath}E[X^{k}]=\left\{ \begin{array}{ll}
0 & \mbox{when $k$ is odd}\\
\frac{k!\,\sigma^{k}}{2^{k/2}(k/2)!} & \mbox{when $k$ is even}
\end{array}\right. \end{displaymath}
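The moment formula can be checked against the familiar low-order Gaussian moments, $E[X^2]=\sigma^2$, $E[X^4]=3\sigma^4$, $E[X^6]=15\sigma^6$; a minimal sketch:

```python
from math import factorial

def gaussian_moment(k, sigma):
    """E[X^k] for X ~ N(0, sigma^2), from the characteristic-function series."""
    if k % 2 == 1:
        return 0
    m = k // 2
    return factorial(k) * sigma**k / (2**m * factorial(m))

sigma = 2.0
print(gaussian_moment(2, sigma))  # 4.0   = sigma^2
print(gaussian_moment(4, sigma))  # 48.0  = 3*sigma^4
print(gaussian_moment(6, sigma))  # 960.0 = 15*sigma^6
print(gaussian_moment(3, sigma))  # 0     (odd moments vanish)
```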

1.4.4 Solution: The final calculation of $\frac{2}{3}$ refers not to a single draw of one ball from an urn containing three, but rather to a composite experiment comprising more than one stage. While it is true that $\{$two black, one white$\}$ is the only fixed collection of balls for which a random choice is black with probability $\frac{2}{3}$, the composition of the urn is not determined prior to the final draw.

After all, if Carroll's argument were correct then it would apply also in the situation when the urn originally contains just one ball, either black or white. The final probability is now $\frac{3}{4}$ , implying that the original ball was one half black and one half white! Carroll was himself aware of the fallacy in this argument.

1.4.5 Solution: (a) (i)

\begin{displaymath}P(C_{3}\mid G)=\frac{P(C_{3}\cap G\mid
C_{1})P(C_{1})+P(C_{3}\cap G\mid C_{1}^{c})P(C_{1}^{c})}{P(G\mid C_{1})P(C_{1})+P(G\mid C_{1}^{c})P(C_{1}^{c})}\end{displaymath}



\begin{displaymath}P(C_{3}\mid B)=\frac{P(C_{3}\cap B\mid
C_{1})P(C_{1})+P(C_{3}\cap B\mid C_{1}^{c})P(C_{1}^{c})}{P(B\mid C_{1})P(C_{1})+P(B\mid C_{1}^{c})P(C_{1}^{c})}\end{displaymath}





(b) Let $\alpha \in[\frac{1}{2},\frac{2}{3}]$, and suppose the presenter possesses a coin which falls with heads upward with probability $\beta =6\alpha-3$. He flips the coin before the show, and adopts strategy (i) if and only if the coin shows heads, and otherwise strategy (iii). The probability in question is now $\frac{2}{3}\beta+\frac{1}{2}(1-\beta)=\frac{1}{2}+\frac{\beta}{6}=\alpha$.


You never lose by swapping, but whether you gain depends on the presenter's protocol.
(c) Let D denote the first door chosen, and consider the following protocols: (iv) If D conceals a goat, open it. Otherwise open one of the other two doors at random. In this case $p=0$ .
(v) If D conceals a car, open it. Otherwise open the unique remaining door which conceals a goat. In this case $p=1$ .
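For comparison with protocols (iv) and (v), the familiar protocol in which the presenter always opens a door concealing a goat gives $p=\frac{2}{3}$ for swapping. This can be confirmed by exact enumeration; the sketch below fixes the contestant's initial pick at door 0, which loses no generality by symmetry.

```python
from fractions import Fraction

# Exact enumeration: the presenter always opens a door that is neither
# the contestant's pick nor the car, choosing uniformly when he has a choice.
win_switch = Fraction(0)
for car in range(3):                 # car placed uniformly, probability 1/3 each
    pick = 0
    goat_doors = [d for d in range(3) if d != pick and d != car]
    for opened in goat_doors:
        # swapping means taking the remaining unopened door
        new_pick = next(d for d in range(3) if d not in (pick, opened))
        win = Fraction(1) if new_pick == car else Fraction(0)
        win_switch += Fraction(1, 3) * Fraction(1, len(goat_doors)) * win
print(win_switch)  # 2/3
```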

1.5.1 Solution:

\begin{displaymath}P(A^{c}\cap B) =P(B \setminus (A \cap B)) =
P(B)-P(A \cap B)\end{displaymath}


\begin{displaymath}P(A^{c}\cap B^{c}) =P(A^{c} \setminus (B \cap A^{c})) = P(A^{c})-P(B \cap A^{c})\end{displaymath}
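Both identities can be confirmed by enumeration on a small concrete probability space; the fair die and the two example events below are arbitrary choices.

```python
from fractions import Fraction

# One fair die; A = "even", B = "at least 4" (arbitrary example events).
omega = set(range(1, 7))
A = {x for x in omega if x % 2 == 0}
B = {x for x in omega if x >= 4}
Ac, Bc = omega - A, omega - B        # complements

def P(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(len(event), len(omega))

print(P(Ac & B) == P(B) - P(A & B))     # True: P(A^c ∩ B) = P(B) - P(A ∩ B)
print(P(Ac & Bc) == P(Ac) - P(B & Ac))  # True: P(A^c ∩ B^c) = P(A^c) - P(B ∩ A^c)
```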


1.5.2 Solution: Suppose $ i < j$ and $ m < n$ . If $ j < m$ , then $A_{ij}$ and $A_{mn}$ are determined by distinct independent rolls, and are therefore independent.

For the case $j=m$ we have that

\begin{displaymath}P(A_{ij} \cap A_{jn}) = P(\mbox{$i$th, $j$th, and $n$th rolls all show the same number}) = \sum_{r=1}^{6}\left(\frac{1}{6}\right)^{3} = \frac{1}{36},\end{displaymath}

which equals $P(A_{ij})P(A_{jn}) = \frac{1}{6}\cdot\frac{1}{6}$, so we have pairwise independence.

But for distinct $i$, $j$, $n$,

\begin{displaymath}P(A_{ij} \cap A_{jn} \cap A_{in}) =\frac{1}{36}\neq
\frac{1}{216}= P(A_{ij})P(A_{jn})P(A_{in}),\end{displaymath}

since $A_{ij}\cap A_{jn}\subseteq A_{in}$. Therefore the events are not independent.
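Both the pairwise equality and the failure of triple independence can be checked by enumerating all $6^{3}$ equally likely outcomes of three rolls:

```python
from fractions import Fraction
from itertools import product

# A_ij = "rolls i and j show the same number", for rolls indexed 1, 2, 3.
outcomes = list(product(range(6), repeat=3))

def P(event):
    """Probability of a predicate over the 216 equally likely outcomes."""
    return Fraction(sum(event(w) for w in outcomes), len(outcomes))

A12 = lambda w: w[0] == w[1]
A23 = lambda w: w[1] == w[2]
A13 = lambda w: w[0] == w[2]

print(P(lambda w: A12(w) and A23(w)))             # 1/36 = P(A12)*P(A23)
print(P(lambda w: A12(w) and A23(w) and A13(w)))  # 1/36, not 1/216
```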

1.5.7 Solution:

\begin{displaymath}P(A)=P(BBB \cup
GGG)=\frac{1}{8}+\frac{1}{8}=\frac{1}{4} \end{displaymath}

\begin{displaymath}P(B)=P(BGG\cup GBG \cup GGB \cup GGG)=4\cdot\frac{1}{8}=\frac{1}{2}\end{displaymath}


\begin{displaymath}P(A\cap B)=P(GGG)=\frac{1}{8}=\frac{1}{4}\cdot\frac{1}{2}=P(A)P(B)\end{displaymath}

\begin{displaymath}P(B\cap C)=\frac{3}{8}=\frac{1}{2}\cdot\frac{3}{4}=P(B)P(C)\end{displaymath}


\begin{displaymath}P(A\cap C)=0\neq P(A)P(C)\end{displaymath}

(c) Only in the trivial cases when children are either almost surely boys or almost surely girls.

(d) No.
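These calculations can be reproduced by enumerating the eight equally likely sex patterns. The event definitions below are inferred from the probabilities above ($A$: all children the same sex, $B$: at most one boy, $C$: the family includes both sexes) and should be checked against the problem statement.

```python
from fractions import Fraction
from itertools import product

omega = list(product("BG", repeat=3))   # eight equally likely patterns

def P(pred):
    return Fraction(sum(pred(w) for w in omega), len(omega))

A = lambda w: len(set(w)) == 1          # all the same sex
B = lambda w: w.count("B") <= 1         # at most one boy
C = lambda w: len(set(w)) == 2          # both sexes present

print(P(lambda w: A(w) and B(w)) == P(A) * P(B))  # True: 1/8 = 1/4 * 1/2
print(P(lambda w: B(w) and C(w)) == P(B) * P(C))  # True: 3/8 = 1/2 * 3/4
print(P(lambda w: A(w) and C(w)))                 # 0, but P(A)P(C) > 0
```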

1.8.5 Solution:

\begin{displaymath}P(A\,\triangle\, B)=P((A\cup B)\setminus (A\cap B
))=P(A\cup B)-P(A\cap B)\end{displaymath}

\begin{displaymath}=P(A)+P(B)-2P(A\cap B).\end{displaymath}

1.8.6 Solution:

\begin{displaymath}P(A\cup B \cup C)=P((A^{c}\cap B^{c}\cap C^{c})^{c})\end{displaymath}
\begin{displaymath}=1-P(A^{c}\cap B^{c}\cap C^{c}) \end{displaymath}

\begin{displaymath}=1-P(A^{c}\mid B^{c} \cap C^{c})P(B^{c}\mid C^{c})P(C^{c})\end{displaymath}

1.8.19 Solution:

\begin{displaymath}P(A\leftrightarrow D\mid
AD^{c})=P(A\leftrightarrow D\mid AD^{c}\cap
BC^{c})p+P(A\leftrightarrow D\mid AD^{c}\cap BC)(1-p)\end{displaymath}



\begin{displaymath}P(A\leftrightarrow D\mid
BC^{c})=P(A\leftrightarrow D\mid AD^{c}\cap
BC^{c})p+P(A\leftrightarrow D\mid BC^{c}\cap AD)(1-p)\end{displaymath}



\begin{displaymath}P(A\leftrightarrow D\mid
AB^{c})=P(A\leftrightarrow D\mid AB^{c}\cap
AD^{c})p+P(A\leftrightarrow D\mid AB^{c}\cap AD)(1-p)\end{displaymath}



\begin{displaymath}P(A\leftrightarrow D\mid
AB^{c})=P(A\leftrightarrow D\mid AD^{c})p+P(A
\leftrightarrow D\mid AD)(1-p)\end{displaymath}


1.8.20 Solution: We condition on the result of the first toss. If this is a head, then we require an odd number of heads in the next $n-1$ tosses. Similarly, if the first toss is a tail, we require an even number of heads in the next $n-1$ tosses.

Hence, writing $p_{n}$ for the probability of an even number of heads after $n$ tosses,
$p_{n}$ = P(even number in first $n-1$ tosses) $\cdot$ P(tail on $n$th)
+ P(odd number in first $n-1$ tosses) $\cdot$ P(head on $n$th)
$=(1-p)p_{n-1}+p(1-p_{n-1})$, with $p_{0}=1$.

As an alternative to induction, we may seek a solution of the form $p_{n}=A+B\lambda^{n}$. Substituting into $p_{n}=(1-2p)p_{n-1}+p$ gives $A=\frac{1}{2}$, $B=\frac{1}{2}$, $\lambda=1-2p$, so $p_{n}=\frac{1}{2}\left(1+(1-2p)^{n}\right)$.
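The closed form can be checked exactly against the recurrence using rational arithmetic; the bias $p=\frac{1}{3}$ and the range of $n$ below are arbitrary example choices.

```python
from fractions import Fraction

p = Fraction(1, 3)   # probability of heads (arbitrary example)
pn = Fraction(1)     # p_0 = 1: zero heads is an even number
for n in range(1, 11):
    pn = (1 - p) * pn + p * (1 - pn)          # the recurrence
    assert pn == (1 + (1 - 2 * p) ** n) / 2   # closed form matches exactly
print("closed form verified for n = 1..10")
```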

1.8.30 In general, there are $365^{m}$ possible combinations of birthdays, of which $\frac{365!}{(365-m)!}$ have all birthdays different. The probability that all $m$ birthdays differ is therefore $\frac{365!}{(365-m)!\,365^{m}}$, so the probability that at least two of them are the same is $1-\frac{365!}{(365-m)!\,365^{m}}$. For $m=23$ this first exceeds $\frac{1}{2}$.
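The claim for $m=23$ is easily verified with exact arithmetic:

```python
from fractions import Fraction

def p_shared(m):
    """Exact probability that at least two of m people share a birthday."""
    prob_all_distinct = Fraction(1)
    for i in range(m):
        prob_all_distinct *= Fraction(365 - i, 365)
    return 1 - prob_all_distinct

print(float(p_shared(22)))  # ~ 0.476, still below 1/2
print(float(p_shared(23)))  # ~ 0.507, the first m exceeding 1/2
```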


Copyright 2008, Todd Moon. Homework Solutions, retrieved January 07, 2011, from USU OpenCourseWare. This work is licensed under a Creative Commons License.