9.4.24 Markov chains: markov
The markov command finds characteristic features of a Markov
chain.
markov takes one argument:
- M, the transition matrix of a Markov process.
markov(M) returns a sequence consisting of:
- the list of positive recurrent states,
- the list of corresponding invariant probability vectors,
- the list of the other strongly connected components (the transient classes),
- the matrix whose entry in row i, column j is the probability of ending up in the j-th recurrent class when starting from state i.
Example.
Input:
markov([[0,0,1/2,0,1/2],[0,0,1,0,0],[1/4,1/4,0,1/4,1/4],[0,0,1/2,0,1/2],[0,0,0,0,1]])
Output:
[4],[[0,0,0,0,1]],[[0,1,2,3]],[[1],[1],[1],[1],[1]]
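The structure of this answer (state 4 is the only recurrent, in fact absorbing, state; states 0–3 form a single transient class from which absorption is certain) can be checked independently of Xcas. A minimal sketch in Python, using exact rational arithmetic; the helper names are illustrative, not part of any library:

```python
from fractions import Fraction

# The example transition matrix (states numbered 0..4, exact rationals).
h = Fraction(1, 2); q = Fraction(1, 4)
M = [[0, 0, h, 0, h],
     [0, 0, 1, 0, 0],
     [q, q, 0, q, q],
     [0, 0, h, 0, h],
     [0, 0, 0, 0, 1]]
n = len(M)

def reachable(i):
    """States reachable from state i (including i itself)."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v in range(n):
            if M[u][v] != 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

# Strongly connected components via mutual reachability (fine for small n).
reach = [reachable(i) for i in range(n)]
comps = []
for i in range(n):
    if not any(i in c for c in comps):
        comps.append({j for j in reach[i] if i in reach[j]})

# A component is recurrent iff it is closed: no edge leaves it.
recurrent = [c for c in comps
             if all(M[u][v] == 0 for u in c for v in range(n) if v not in c)]
transient = [c for c in comps if c not in recurrent]

print(sorted(recurrent[0]))   # -> [4]
print(sorted(transient[0]))   # -> [0, 1, 2, 3]

# Invariant distribution of the single recurrent class: pi * M == pi.
pi = [0, 0, 0, 0, 1]
assert all(sum(pi[i] * M[i][j] for i in range(n)) == pi[j] for j in range(n))
```

Since there is only one recurrent class, the fourth component of markov's answer, the column of ones, simply records that absorption in that class is certain from every starting state.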