
# Homework Solutions

## Utah State University ECE 6010 Stochastic Processes Homework # 11 Solutions

1. Let $M_n$ denote the sequence of sample means from an iid random process $X_n$: $M_n = \frac{X_1 + X_2 + \cdots + X_n}{n}$.

1. Is $M_n$ a Markov process?

Since $M_n = \frac{(n-1)M_{n-1} + X_n}{n}$, if $M_{n-1}$ is given then $M_n$ depends only on $X_n$ and is independent of $M_1, \ldots, M_{n-2}$. Therefore, $M_n$ is a Markov process.

2. If the answer to part a is yes, find the state transition pdf $f_{M_n \mid M_{n-1}}(x \mid y)$.
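Assuming the standard sample-mean recursion (notation $M_n$, $X_n$ as above), a sketch of the derivation:

```latex
% Given M_{n-1} = y, the next sample mean is a shifted, scaled copy of X_n:
M_n = \frac{(n-1)M_{n-1} + X_n}{n},
\qquad\text{so}\qquad
f_{M_n \mid M_{n-1}}(x \mid y) = n\, f_X\bigl(nx - (n-1)y\bigr),
% the factor n being the Jacobian of the map x_n \mapsto ((n-1)y + x_n)/n.
```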

2. An urn initially contains five black balls and five white balls. The following experiment is repeated indefinitely: a ball is drawn from the urn; if the ball is white it is put back in the urn, otherwise it is left out. Let $Y_n$ be the number of black balls remaining in the urn after $n$ draws from the urn.
1. Is $Y_n$ a Markov process? If so, find the appropriate transition probabilities.
2. Do the transition probabilities depend on $n$?

The number of black balls in the urn completely specifies the probabilities of the outcomes of a trial; therefore $Y_n$ is independent of its past values and is a Markov process. With $k$ black balls remaining the urn holds $k + 5$ balls, so $P[Y_{n+1} = k-1 \mid Y_n = k] = \frac{k}{k+5}$ and $P[Y_{n+1} = k \mid Y_n = k] = \frac{5}{k+5}$.

All the transition probabilities are independent of the time index $n$.
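These probabilities can be sketched numerically; the state labeling below (k = number of black balls remaining, with five white balls always present) is this sketch's convention:

```python
import numpy as np

# Urn chain sketch: state k = black balls remaining, plus 5 white balls.
# Drawing black (prob k/(k+5)) removes it: k -> k-1.
# Drawing white (prob 5/(k+5)) replaces it: k -> k.
P = np.zeros((6, 6))
for k in range(6):
    P[k, k] = 5 / (k + 5)          # white drawn, state unchanged
    if k > 0:
        P[k, k - 1] = k / (k + 5)  # black drawn and removed

assert np.allclose(P.sum(axis=1), 1.0)  # every row is a valid pmf
print(P[5, 4])  # from 5 black balls, a black ball is drawn w.p. 0.5
```

Because `P` is built once with no dependence on the draw index, the chain is time-homogeneous, matching part 2.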

3. Let $X_n$ be the Bernoulli iid process, and let $Y_n$ be the process derived from it in Example 8.2, where it was shown that $Y_n$ is not a Markov process. Consider the vector process $Z_n$ defined by $Z_n = (X_n, Y_n)$.
1. Show that $Z_n$ is a Markov process.

Given $Z_{n-1} = (X_{n-1}, Y_{n-1})$, the distribution of $Z_n$ is determined by the new input $X_n$ alone and does not depend on earlier values. Therefore, $Z_n$ is a Markov process.

2. Find the state transition diagram for $Z_n$.

where $p = P[X_n = 1]$.
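As a sketch of the diagram's states: if, as in some texts, $Y_n = X_n + X_{n-1}$ (an assumption here, since the original definition is not reproduced above), the reachable states of $Z_n$ and their transitions can be enumerated:

```python
from itertools import product

# Assumed (hypothetical) setup: X_n iid Bernoulli(p), Y_n = X_n + X_{n-1},
# state Z_n = (X_n, Y_n). Given Z_{n-1} = (x_prev, _), a new flip x yields
# Z_n = (x, x + x_prev), so the next state depends only on the current one.
states = sorted({(x, x + x_prev) for x_prev, x in product((0, 1), repeat=2)})
print(states)  # the four states of the diagram

for (x_prev, y_prev) in states:
    for x in (0, 1):  # edge taken w.p. p if x == 1, else 1 - p
        print((x_prev, y_prev), "->", (x, x + x_prev))
```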

4. Show that the following autoregressive process is a Markov process: $Y_n = \alpha Y_{n-1} + X_n$, where $\alpha$ is a constant and $X_n$ is an iid process.

Since $Y_n$ is determined by $Y_{n-1}$ and the independent input $X_n$, $Y_n$ is a Markov process.
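The argument can be made explicit through the conditional cdf (a sketch, using $\alpha$ for the AR coefficient):

```latex
P[Y_n \le y \mid Y_{n-1} = y_{n-1}, \ldots, Y_0 = y_0]
  = P[\alpha y_{n-1} + X_n \le y]
  = F_X(y - \alpha y_{n-1}),
% which depends on the past only through y_{n-1}, because X_n is
% independent of Y_0, \ldots, Y_{n-1}.
```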
5. Let $Y_n$ be the Markov chain defined in Problem 2.
1. Find the one-step transition probability matrix $P$ for $Y_n$.

2. Find the two-step transition probability matrix $P^2$ by matrix multiplication. Check your answer by computing a two-step transition probability directly and comparing it to the corresponding entry in $P^2$.

The entry read from the matrix $P^2$ agrees with the direct computation.
3. What happens to $Y_n$ as $n$ approaches infinity? Use your answer to guess the limit of $P^n$ as $n \to \infty$.
As $n \to \infty$, eventually all black balls are removed, so $Y_n \to 0$. Thus every row of $P^n$ converges to the pmf concentrated on state $0$.
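A numerical sketch of parts b and c (the state ordering, 0 through 5 black balls remaining, is this sketch's assumption):

```python
import numpy as np

# One-step matrix of the urn chain from Problem 2 (states 0..5 black balls).
P = np.zeros((6, 6))
for k in range(6):
    P[k, k] = 5 / (k + 5)
    if k > 0:
        P[k, k - 1] = k / (k + 5)

P2 = P @ P                               # two-step matrix by multiplication
# Direct check of one entry: stay at state 5 twice = (1/2) * (1/2).
assert abs(P2[5, 5] - 0.25) < 1e-12

# State 0 is absorbing, so P^n should approach rows of (1, 0, 0, 0, 0, 0).
Pinf = np.linalg.matrix_power(P, 2000)
print(np.round(Pinf[:, 0], 6))  # first column converges to all ones
```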

6. Two gamblers, A and B, play the following game. A fair coin is flipped; if the outcome is heads, player A pays player B \$1, and if the outcome is tails player B pays player A \$1. The game continues until one of the players goes broke. Suppose that initially player A has \$1 and player B has \$2, so a total of \$3 is up for grabs. Let $X_n$ denote the number of dollars held by player A after $n$ trials.
1. Show that $X_n$ is a Markov chain.

The next state depends only on the current fortune and an independent coin flip: $P[X_{n+1} = i+1 \mid X_n = i] = P[X_{n+1} = i-1 \mid X_n = i] = \frac{1}{2}$ for $i = 1, 2$, and $P[X_{n+1} = i \mid X_n = i] = 1$ if $i = 0$ or $i = 3$. Hence $X_n$ is a Markov chain.

2. Sketch the state transition diagram for $X_n$ and give the one-step transition probability matrix $P$.

3. Use the state transition diagram to help you show that $p_{ii}(n) = \left(\frac{1}{2}\right)^n$ for $n$ even, for $i = 1$ and $i = 2$.

For the chain to be in state 1 after an even number of steps $n$, starting from state 1, it must avoid absorption at 0 and 3; that is, it must complete $n/2$ cycles $1 \to 2 \to 1$, each of probability $\frac{1}{4}$. Hence $p_{11}(n) = \left(\frac{1}{4}\right)^{n/2} = \left(\frac{1}{2}\right)^n$, and the same holds for state 2 by symmetry.

4. Find the $n$-step transition probability matrix $P^n$ for $n$ even using part c.

5. Find the limit of $P^n$ as $n \to \infty$.

6. Find the probability that player A eventually wins.

$P[\text{player A wins}] = \lim_{n \to \infty} p_{13}(n) = \frac{1}{3}$.
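A numerical sketch of parts e and f (states 0 through 3 dollars held by player A; the player labeling is this sketch's assumption):

```python
import numpy as np

# Gambler's ruin with a fair coin: states = dollars held by player A,
# with 0 and 3 absorbing.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],  # $0: A is broke (absorbing)
    [0.5, 0.0, 0.5, 0.0],  # $1: lose or win a dollar, each w.p. 1/2
    [0.0, 0.5, 0.0, 0.5],  # $2
    [0.0, 0.0, 0.0, 1.0],  # $3: A has all the money (absorbing)
])

Pinf = np.linalg.matrix_power(P, 200)  # transient mass decays like (1/2)^n
# Starting from $1, A reaches $3 w.p. 1/3 and goes broke w.p. 2/3.
print(round(Pinf[1, 3], 6), round(Pinf[1, 0], 6))
```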

7. A machine consists of two parts that fail and are repaired independently. A working part fails during any given day with probability $a$. A part that is not working is repaired by the next day with probability $r$. Let $X_n$ be the number of working parts in day $n$.
1. Show that $X_n$ is a three-state Markov chain and give its one-step transition probability matrix $P$.

2. Show that the steady state pmf is binomial with parameter $p = \frac{r}{a+r}$.

Claim: the steady state pmf is $\pi_k = \binom{2}{k} p^k (1-p)^{2-k}$ for $k = 0, 1, 2$, with $p = \frac{r}{a+r}$.

3. What do you expect the steady-state pmf to be for a machine that consists of $n$ parts? By the same independence argument, binomial with parameters $n$ and $p = \frac{r}{a+r}$.
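The claim can be checked numerically; the symbols $a$ (daily failure probability) and $r$ (repair probability) and the values chosen below are illustrative assumptions of this sketch:

```python
import numpy as np
from math import comb

a, r = 0.2, 0.5   # assumed failure / repair probabilities (illustrative)
n_parts = 2

def trans(i, j):
    """P[X_{n+1} = j | X_n = i]: k of the i working parts survive (each
    w.p. 1 - a), m of the broken parts are repaired (each w.p. r), k + m = j."""
    p = 0.0
    for k in range(i + 1):
        for m in range(n_parts - i + 1):
            if k + m == j:
                p += (comb(i, k) * (1 - a)**k * a**(i - k)
                      * comb(n_parts - i, m) * r**m * (1 - r)**(n_parts - i - m))
    return p

P = np.array([[trans(i, j) for j in range(n_parts + 1)]
              for i in range(n_parts + 1)])
pi = np.linalg.matrix_power(P, 500)[0]   # steady-state pmf (any row works)

p_work = r / (a + r)                     # per-part steady-state P[working]
binom = [comb(n_parts, k) * p_work**k * (1 - p_work)**(n_parts - k)
         for k in range(n_parts + 1)]
print(np.allclose(pi, binom))            # the steady state is binomial
```

Since the parts evolve independently, the stationary distribution of the joint chain factors into per-part stationary probabilities, which is exactly why the count of working parts ends up binomial.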