Homework Solutions
Utah State University
ECE 6010
Stochastic Processes
Homework # 11 Solutions

Problem 1. Let $M_n$ denote the sequence of sample means from an iid random process $X_n$:
$$ M_n = \frac{X_1 + X_2 + \cdots + X_n}{n}. $$

(a) Is $M_n$ a Markov process?

Since
$$ M_{n+1} = \frac{n M_n + X_{n+1}}{n+1}, $$
clearly, if $M_n$ is given, then $M_{n+1}$ depends only on $X_{n+1}$ and is independent of $M_{n-1}, M_{n-2}, \ldots$. Therefore, $M_n$ is a Markov process.

(b) If the answer to part (a) is yes, find the state transition pdf $f_{M_{n+1}}(x \mid M_n = m)$.

Given $M_n = m$, we have $M_{n+1} = \frac{nm + X_{n+1}}{n+1}$, i.e., $X_{n+1} = (n+1)M_{n+1} - nm$. By the change-of-variables formula for a linear transformation,
$$ f_{M_{n+1}}(x \mid M_n = m) = (n+1)\, f_X\bigl((n+1)x - nm\bigr). $$
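The Markov argument hinges on the recursion $M_{n+1} = (nM_n + X_{n+1})/(n+1)$: the next sample mean depends on the past only through the current one. A minimal numerical sketch (the sample values are arbitrary illustrative numbers) confirms the recursion:

```python
# Numerical check of the sample-mean recursion M_{n+1} = (n*M_n + X_{n+1})/(n+1).
# The data values below are arbitrary illustrative numbers.

def sample_means(xs):
    """Return the sequence of sample means M_1, M_2, ..., M_len(xs)."""
    means = []
    total = 0.0
    for i, x in enumerate(xs, start=1):
        total += x
        means.append(total / i)
    return means

xs = [2.0, -1.0, 4.0, 0.5, 3.5]
M = sample_means(xs)

# Verify the one-step recursion for every n.
for n in range(1, len(xs)):
    recursed = (n * M[n - 1] + xs[n]) / (n + 1)
    assert abs(recursed - M[n]) < 1e-12

print(M)  # → [2.0, 0.5, 1.666..., 1.375, 1.8]
```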


Problem 2. An urn initially contains five black balls and five white balls. The following experiment is repeated indefinitely: a ball is drawn from the urn; if the ball is white it is put back in the urn, otherwise it is left out. Let $X_n$ be the number of black balls remaining in the urn after $n$ draws from the urn.

(a) Is $X_n$ a Markov process? If so, find the appropriate transition probabilities.

The number of black balls in the urn completely specifies the probabilities of the outcomes of a trial; therefore, given $X_n$, the next value $X_{n+1}$ is independent of the past values, and $X_n$ is a Markov process. If $X_n = k$, the urn contains $k$ black balls and five white balls, so
$$ P[X_{n+1} = k-1 \mid X_n = k] = \frac{k}{k+5}, \qquad P[X_{n+1} = k \mid X_n = k] = \frac{5}{k+5}, \qquad k = 0, 1, \ldots, 5. $$

(b) Do the transition probabilities depend on $n$?

All of the transition probabilities are independent of time $n$.
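The transition probabilities can be spot-checked by simulation. In this sketch the state is the number of black balls $k$ (the five white balls are always returned), and the Monte Carlo estimate of $P[k \to k-1]$ is compared against $k/(k+5)$ for $k = 2$:

```python
# Monte Carlo check of the urn transition probabilities:
# P[k -> k-1] = k/(k+5)  (a black ball is drawn and left out),
# P[k -> k]   = 5/(k+5)  (a white ball is drawn and put back).

import random

random.seed(0)

def draw(k):
    """One draw from an urn with k black and 5 white balls.
    Returns the new number of black balls."""
    if random.random() < k / (k + 5):
        return k - 1   # black drawn, left out
    return k           # white drawn, put back

k = 2
trials = 100_000
removed = sum(1 for _ in range(trials) if draw(k) == k - 1)
estimate = removed / trials

print(estimate)                       # close to 2/7 ≈ 0.2857
assert abs(estimate - 2 / 7) < 0.02
```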
Problem 3. Let $X_n$ be the Bernoulli iid process with $p = P[X_n = 1]$, and let $Y_n$ be given by $Y_n = X_n + X_{n-1}$. It was shown in Example 8.2 that $Y_n$ is not a Markov process. Consider the vector process defined by $Z_n = (X_n, Y_n)$.

(a) Show that $Z_n$ is a Markov process.

Given $Z_n = (X_n, Y_n)$, the next state is $Z_{n+1} = (X_{n+1}, X_{n+1} + X_n)$, which depends only on the first component of $Z_n$ and on the new input $X_{n+1}$, which is independent of the past. Therefore, $Z_n$ is a Markov process.

(b) Find the state transition diagram for $Z_n$.

The states are $(0,0)$, $(0,1)$, $(1,1)$, and $(1,2)$. From either state with $X_n = 0$, the process moves to $(0,0)$ with probability $1-p$ and to $(1,1)$ with probability $p$; from either state with $X_n = 1$, it moves to $(0,1)$ with probability $1-p$ and to $(1,2)$ with probability $p$, where $p = P[X_n = 1]$.
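A small enumeration sketch, assuming the definitions $Y_n = X_n + X_{n-1}$ and $Z_n = (X_n, Y_n)$ (assumptions here, since the exact formulas are stated in the referenced example), confirms that the transitions out of each state depend only on the $X$-component:

```python
# Sketch, assuming Y_n = X_n + X_{n-1} and Z_n = (X_n, Y_n): enumerate the
# one-step transitions and confirm the next state is determined by the
# current X-component alone.

from itertools import product

p = 0.5  # P[X_n = 1]; any 0 < p < 1 gives the same diagram

transitions = {}
# From state (x, y), the next input x1 gives next state (x1, x1 + x).
for (x, y), x1 in product([(0, 0), (0, 1), (1, 1), (1, 2)], [0, 1]):
    prob = p if x1 == 1 else 1 - p
    transitions.setdefault((x, y), {})[(x1, x1 + x)] = prob

# States with x = 0 both go to (0,0) or (1,1); states with x = 1 both go
# to (0,1) or (1,2) -- the y-component does not affect the transitions.
assert transitions[(0, 0)] == transitions[(0, 1)]
assert transitions[(1, 1)] == transitions[(1, 2)]
print(transitions[(0, 0)], transitions[(1, 1)])
```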


Problem 4. Show that the following autoregressive process is a Markov process:
$$ Y_n = \alpha Y_{n-1} + X_n, $$
where $\alpha$ is a constant and $X_n$ is an iid process.

Given $Y_{n-1}$, the value $Y_n = \alpha Y_{n-1} + X_n$ depends only on $Y_{n-1}$ and on the input $X_n$, which is independent of all past values. Therefore, $Y_n$ is a Markov process.
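A short simulation sketch of the recursion (the values $\alpha = 0.8$ and Gaussian inputs are arbitrary illustrative choices) shows that the past enters only through $Y_{n-1}$:

```python
# Minimal sketch of the autoregressive recursion Y_n = alpha*Y_{n-1} + X_n:
# the next value is a function of the current value and the fresh input X_n,
# which is what makes the process Markov.

import random

random.seed(1)
alpha = 0.8          # illustrative choice
y = 0.0              # illustrative initial condition
history = [y]
inputs = []

for _ in range(10):
    x = random.gauss(0.0, 1.0)   # iid input (Gaussian chosen for illustration)
    y = alpha * y + x
    inputs.append(x)
    history.append(y)

# Y_n - alpha*Y_{n-1} recovers X_n exactly: given Y_{n-1}, nothing older matters.
for n in range(1, len(history)):
    assert abs((history[n] - alpha * history[n - 1]) - inputs[n - 1]) < 1e-12

print(history[-1])
```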
Problem 5. Let $X_n$ be the Markov chain defined in Problem 2.

(a) Find the one-step transition probability matrix $P$ for $X_n$.

With states $k = 0, 1, \ldots, 5$ (the number of black balls remaining),
$$ P = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 1/6 & 5/6 & 0 & 0 & 0 & 0 \\ 0 & 2/7 & 5/7 & 0 & 0 & 0 \\ 0 & 0 & 3/8 & 5/8 & 0 & 0 \\ 0 & 0 & 0 & 4/9 & 5/9 & 0 \\ 0 & 0 & 0 & 0 & 1/2 & 1/2 \end{pmatrix}. $$

(b) Find the two-step transition probability matrix by matrix multiplication. Check your answer by computing one entry directly and comparing it to the corresponding entry in $P^2$.

$$ P^2 = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 11/36 & 25/36 & 0 & 0 & 0 & 0 \\ 1/21 & 65/147 & 25/49 & 0 & 0 & 0 \\ 0 & 3/28 & 225/448 & 25/64 & 0 & 0 \\ 0 & 0 & 1/6 & 85/162 & 25/81 & 0 \\ 0 & 0 & 0 & 2/9 & 19/36 & 1/4 \end{pmatrix}. $$
For example, computing directly over the two paths $5 \to 5 \to 4$ and $5 \to 4 \to 4$,
$$ P[X_2 = 4 \mid X_0 = 5] = \tfrac12 \cdot \tfrac12 + \tfrac12 \cdot \tfrac59 = \tfrac{19}{36}, $$
which matches the corresponding entry of $P^2$.

(c) What happens to $P^n$ as $n$ approaches infinity? Use your answer to guess the limit of $P[X_n = 0]$ as $n \to \infty$.

As $n \to \infty$, eventually all black balls are removed. Thus $P^n$ converges to the matrix whose first column is all ones and whose remaining entries are zero, and $P[X_n = 0] \to 1$.
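The urn chain's matrices can be checked numerically. This pure-Python sketch builds $P$ from $P[k \to k-1] = k/(k+5)$ and $P[k \to k] = 5/(k+5)$, squares it, and raises it to a high power to see the absorption at $k = 0$:

```python
# Numerical check for the urn chain: one-step matrix P (states k = 0..5 black
# balls), the two-step matrix P^2, and a high power showing absorption at k = 0.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# P[k][k-1] = k/(k+5), P[k][k] = 5/(k+5)
P = [[0.0] * 6 for _ in range(6)]
for k in range(6):
    P[k][k] = 5 / (k + 5)
    if k > 0:
        P[k][k - 1] = k / (k + 5)

P2 = matmul(P, P)
# Spot check: P[X_2 = 4 | X_0 = 5] = (1/2)(1/2) + (1/2)(5/9) = 19/36
assert abs(P2[5][4] - 19 / 36) < 1e-12

# High power: every row converges to (1, 0, 0, 0, 0, 0).
Pn = P
for _ in range(9):          # repeated squaring gives P^(2^9) = P^512
    Pn = matmul(Pn, Pn)
assert all(abs(Pn[k][0] - 1.0) < 1e-6 for k in range(6))
print([round(Pn[k][0], 6) for k in range(6)])
```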


Problem 6. Two gamblers play the following game. A fair coin is flipped; if the outcome is heads, player $A$ pays player $B$ \$1, and if the outcome is tails, player $B$ pays player $A$ \$1. The game is continued until one of the players goes broke. Suppose that initially player $A$ has \$1 and player $B$ has \$2, so a total of \$3 is up for grabs. Let $X_n$ denote the number of dollars held by player $A$ after $n$ trials.

(a) Show that $X_n$ is a Markov chain.

$X_{n+1} = X_n \pm 1$, each with probability $1/2$, for $0 < X_n < 3$, and $X_{n+1} = X_n$ if $X_n = 0$ or $X_n = 3$. In either case $X_{n+1}$ depends only on $X_n$, so $X_n$ is a Markov chain.

(b) Sketch the state transition diagram for $X_n$ and give the one-step transition probability matrix $P$.

The states are $0, 1, 2, 3$, with $0$ and $3$ absorbing:
$$ P = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 1/2 & 0 & 1/2 & 0 \\ 0 & 1/2 & 0 & 1/2 \\ 0 & 0 & 0 & 1 \end{pmatrix}. $$

(c) Use the state transition diagram to help you show that for $n$ even, $p_{ii}(n) = (1/2)^n$ for $i = 1, 2$, and $p_{12}(n) = p_{21}(n) = 0$.

To remain unabsorbed for $n$ steps starting from state 1, the chain must complete the cycle $1 \to 2 \to 1$ exactly $n/2$ times; each cycle has probability $(1/2)^2$, so $p_{11}(n) = (1/2)^n$, and after an even number of steps the chain cannot be in state 2, so $p_{12}(n) = 0$. State 2 likewise cycles $2 \to 1 \to 2$ and is then absorbed, so $p_{22}(n) = (1/2)^n$ and $p_{21}(n) = 0$ by symmetry.

(d) Find the $n$-step transition probability matrix for $n$ even using part (c).

Starting from state 1, absorption at 0 occurs at step $2k+1$ with probability $(1/4)^k (1/2)$, and absorption at 3 at step $2k$ with probability $(1/4)^k$; summing the geometric series up to $n$ steps,
$$ p_{10}(n) = \tfrac23\bigl(1 - (1/2)^n\bigr), \qquad p_{13}(n) = \tfrac13\bigl(1 - (1/2)^n\bigr), $$
and by symmetry $p_{20}(n) = \frac13\bigl(1 - (1/2)^n\bigr)$ and $p_{23}(n) = \frac23\bigl(1 - (1/2)^n\bigr)$. Hence for $n$ even,
$$ P(n) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ \frac23(1 - 2^{-n}) & 2^{-n} & 0 & \frac13(1 - 2^{-n}) \\ \frac13(1 - 2^{-n}) & 0 & 2^{-n} & \frac23(1 - 2^{-n}) \\ 0 & 0 & 0 & 1 \end{pmatrix}. $$

(e) Find the limit of $P(n)$ as $n \to \infty$.

$$ \lim_{n \to \infty} P(n) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 2/3 & 0 & 0 & 1/3 \\ 1/3 & 0 & 0 & 2/3 \\ 0 & 0 & 0 & 1 \end{pmatrix}. $$

(f) Find the probability that player $A$ eventually wins.

Player $A$ wins when the chain is absorbed at state 3 starting from state 1, so $P[\text{player } A \text{ wins}] = \lim_{n \to \infty} p_{13}(n) = 1/3$.
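The absorption probabilities for the gambler's-ruin chain (states $0$–$3$ dollars for player $A$, with $0$ and $3$ absorbing and a fair coin) can be verified numerically by raising $P$ to a high power:

```python
# Numerical check for the gambler's ruin chain: P^n for large n gives the
# absorption probabilities; starting from $1, P[A wins] should be 1/3.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
]

# Even-n check: p11(2) = 1/4 and p12(2) = 0.
P2 = matmul(P, P)
assert abs(P2[1][1] - 0.25) < 1e-12 and P2[1][2] == 0.0

Pn = P
for _ in range(7):          # repeated squaring gives P^(2^7) = P^128
    Pn = matmul(Pn, Pn)

print(round(Pn[1][3], 6))   # P[A eventually wins | A starts with $1] -> 1/3
assert abs(Pn[1][3] - 1 / 3) < 1e-6
assert abs(Pn[1][0] - 2 / 3) < 1e-6
```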


Problem 7. A machine consists of two parts that fail and are repaired independently. A working part fails during any given day with probability $a$. A part that is not working is repaired by the next day with probability $b$. Let $X_n$ be the number of working parts in day $n$.

(a) Show that $X_n$ is a three-state Markov chain and give its one-step transition probability matrix $P$.

Each part's status on day $n+1$ depends only on its status on day $n$, and the parts fail and are repaired independently, so $X_{n+1}$ depends only on $X_n$; hence $X_n$ is a Markov chain on the states $0, 1, 2$. Counting the ways the two parts can fail or be repaired,
$$ P = \begin{pmatrix} (1-b)^2 & 2b(1-b) & b^2 \\ a(1-b) & (1-a)(1-b) + ab & (1-a)b \\ a^2 & 2a(1-a) & (1-a)^2 \end{pmatrix}. $$

(b) Show that the steady state pmf $\pi$ is binomial with parameter $p = b/(a+b)$.

Claim: the steady state pmf is
$$ \pi_j = \binom{2}{j} p^j (1-p)^{2-j}, \qquad j = 0, 1, 2, \qquad p = \frac{b}{a+b}. $$
Each part by itself is a two-state Markov chain whose steady state probability of working satisfies the balance equation $\pi_w a = (1 - \pi_w) b$, giving $\pi_w = b/(a+b) = p$. Since the parts are independent, the steady state number of working parts is binomial with parameters $2$ and $p$; one can verify directly that this $\pi$ satisfies $\pi P = \pi$.

(c) What do you expect the steady state pmf to be for a machine that consists of $n$ parts?

By the same argument, binomial with parameters $n$ and $p = b/(a+b)$.
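The steady-state claim can be checked numerically for sample failure and repair probabilities (the values $a = 0.2$, $b = 0.3$ are arbitrary illustrative choices):

```python
# Sketch checking the two-part machine's steady state: with failure
# probability a and repair probability b per part, the stationary pmf of the
# number of working parts should be binomial(2, p) with p = b/(a+b).

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

a, b = 0.2, 0.3   # illustrative values

# One-step matrix for X_n = number of working parts (states 0, 1, 2).
P = [
    [(1 - b) ** 2,            2 * b * (1 - b),            b ** 2],
    [a * (1 - b),   (1 - a) * (1 - b) + a * b,      (1 - a) * b],
    [a ** 2,                  2 * a * (1 - a),      (1 - a) ** 2],
]

# Stationary pmf via a high matrix power (every row converges to pi).
Pn = P
for _ in range(10):
    Pn = matmul(Pn, Pn)
pi = Pn[0]

p = b / (a + b)
binom = [(1 - p) ** 2, 2 * p * (1 - p), p ** 2]   # binomial(2, p) pmf
print([round(v, 6) for v in pi])                  # → [0.16, 0.48, 0.36]
for got, want in zip(pi, binom):
    assert abs(got - want) < 1e-9
```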
