# Markov Processes


## Classes of States

Let $f_i$ denote the probability that, starting in state $i$, the process will ever re-enter state $i$. Then state $i$ is **recurrent** if $f_i = 1$.

If $f_i < 1$, then state $i$ is said to be **transient**.

- If the chain starts in a transient state, then with probability 1 that state recurs only a finite number of times.
- If the chain starts in a recurrent state, then with probability 1 that state recurs an infinite number of times.
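A small simulation can make the return probability concrete. The 3-state chain below is a made-up example (its transition matrix is an assumption, not from the notes): states 0 and 1 are transient, and state 2 is absorbing, hence recurrent. The function estimates the probability of ever re-entering the starting state; a sketch:

```python
import random

# Hypothetical 3-state transition matrix (an assumption for illustration):
# states 0 and 1 are transient, state 2 is absorbing (recurrent).
P = [
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.0, 0.0, 1.0],
]

def step(state, rng):
    """Sample the next state from row `state` of P."""
    u = rng.random()
    cum = 0.0
    for j, p in enumerate(P[state]):
        cum += p
        if u < cum:
            return j
    return len(P) - 1

def estimate_return_prob(i, trials=20000, horizon=200, seed=0):
    """Monte Carlo estimate of the probability of ever returning to i,
    starting from i. The finite horizon truncates paths, so this slightly
    underestimates the true return probability."""
    rng = random.Random(seed)
    returns = 0
    for _ in range(trials):
        s = step(i, rng)
        for _ in range(horizon):
            if s == i:
                returns += 1
                break
            s = step(s, rng)
    return returns / trials

print("return prob for state 0:", estimate_return_prob(0))  # strictly < 1: transient
print("return prob for state 2:", estimate_return_prob(2))  # = 1: recurrent
```

For this particular matrix the exact return probability for state 0 works out to 0.62, so the estimate should land near that value, while the absorbing state returns with probability 1 on every trial.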

Let $P_{ii}^n$ denote the probability that the process is in state $i$ at time $n$ given that it starts in state $i$. Then the expected number of time periods the process spends in state $i$ after time 0 is $\sum_{n=1}^{\infty} P_{ii}^n$.

We see that recurrent means that $\sum_{n=1}^{\infty} P_{ii}^n = \infty$.

Transient means that $\sum_{n=1}^{\infty} P_{ii}^n < \infty$.
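The summability criterion can be checked numerically by accumulating the partial sums $\sum_{n=1}^{N} P_{ii}^n$ via repeated matrix multiplication. The chain below is a made-up example (an assumption, not from the notes): for a transient state the partial sums converge to a finite limit, while for a recurrent state they grow without bound.

```python
import numpy as np

# Hypothetical 3-state chain (an assumption for illustration):
# states 0 and 1 are transient, state 2 is absorbing (recurrent).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.0, 0.0, 1.0],
])

def partial_sum(i, N):
    """Compute sum_{n=1}^{N} (P^n)_{ii}: the expected number of visits
    to state i during the first N steps, starting from i."""
    total = 0.0
    Pn = np.eye(len(P))
    for _ in range(N):
        Pn = Pn @ P          # Pn now holds P^n
        total += Pn[i, i]
    return total

for N in (10, 100, 1000):
    print(f"N={N:5d}  state 0: {partial_sum(0, N):8.4f}  "
          f"state 2: {partial_sum(2, N):8.1f}")
```

The state-0 sums level off near a finite value (transient), whereas the state-2 sums equal $N$ and diverge with it (recurrent).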

Observation: The states of an irreducible, finite-state Markov chain are all recurrent. (The chain spends infinitely many steps among finitely many states, so at least one state must be visited infinitely often; since recurrence is a class property and the chain is irreducible, every state is recurrent.)

## Limiting probabilities

If all states are transient, then all the state probabilities approach 0 as $n \to \infty$. If a M.C. has some transient classes and some recurrent classes, then eventually the process enters and remains in one of the recurrent classes. For limiting purposes, we can focus on individual recurrent classes.
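This absorption behavior is visible in the powers of the transition matrix. In the made-up chain below (an assumption, not from the notes), state 0 is transient and states 1 and 2 form a single recurrent class; for large $n$, every row of $P^n$ puts vanishing mass on the transient state and concentrates on the recurrent class.

```python
import numpy as np

# Hypothetical chain (an assumption for illustration): state 0 is
# transient; states {1, 2} form one recurrent class.
P = np.array([
    [0.5, 0.25, 0.25],
    [0.0, 0.3,  0.7 ],
    [0.0, 0.6,  0.4 ],
])

# For large n, each row of P^n puts (numerically) zero probability on the
# transient state 0, and the mass on states 1 and 2 settles to the
# recurrent class's limiting distribution (6/13, 7/13 for this matrix).
Pn = np.linalg.matrix_power(P, 200)
print(Pn.round(4))
```

Note that all three rows of the result agree: wherever the chain starts, it ends up distributed over the recurrent class alone.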

Suppose a M.C. starts in a recurrent state $i$ at time 0. Let $T_1, T_2, \ldots$ denote the times when the process returns to state $i$, where $T_j$ is the time that elapses between the