20.4.24 Markov chains
The markov command finds characteristic features of a Markov chain.
markov takes M, a transition matrix for a Markov process.
markov(M) returns a sequence consisting of:
- the list of the positive recurrent states,
- the list of the corresponding invariant probabilities,
- the list of the other strongly connected components,
- the list of probabilities of ending up in each of the recurrent states.
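The first and third items can be understood graph-theoretically: the recurrent classes of a finite chain are exactly the closed strongly connected components of its transition graph (components no edge leaves), and the remaining components are transient. A minimal Python sketch of that classification (not Giac's implementation; `recurrent_classes` is a hypothetical helper):

```python
from fractions import Fraction as F

def strongly_connected_components(adj):
    """Kosaraju's algorithm: returns the list of SCCs of a digraph
    given as adjacency lists."""
    n = len(adj)
    order, seen = [], [False] * n

    def dfs(u):  # iterative DFS recording postorder finish times
        stack = [(u, iter(adj[u]))]
        seen[u] = True
        while stack:
            v, it = stack[-1]
            for w in it:
                if not seen[w]:
                    seen[w] = True
                    stack.append((w, iter(adj[w])))
                    break
            else:
                order.append(v)
                stack.pop()

    for u in range(n):
        if not seen[u]:
            dfs(u)
    # second pass on the reversed graph, in decreasing finish time
    radj = [[] for _ in range(n)]
    for u in range(n):
        for v in adj[u]:
            radj[v].append(u)
    comp, c = [-1] * n, 0
    for u in reversed(order):
        if comp[u] == -1:
            stack, comp[u] = [u], c
            while stack:
                v = stack.pop()
                for w in radj[v]:
                    if comp[w] == -1:
                        comp[w] = c
                        stack.append(w)
            c += 1
    comps = [[] for _ in range(c)]
    for u in range(n):
        comps[comp[u]].append(u)
    return comps

def recurrent_classes(M):
    """Closed communicating classes of a stochastic matrix M."""
    n = len(M)
    adj = [[j for j in range(n) if M[i][j] != 0] for i in range(n)]
    closed = []
    for comp in strongly_connected_components(adj):
        s = set(comp)
        if all(j in s for i in comp for j in adj[i]):
            closed.append(sorted(comp))
    return closed

M = [[0, 0, F(1, 2), 0, F(1, 2)],
     [0, 0, 1, 0, 0],
     [F(1, 4), F(1, 4), 0, F(1, 4), F(1, 4)],
     [0, 0, F(1, 2), 0, F(1, 2)],
     [0, 0, 0, 0, 1]]
print(recurrent_classes(M))  # → [[4]]
```

On the example matrix used below, state 4 is the only closed class; states 0, 1, 2, 3 form one transient strongly connected component.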
Example
markov([[0,0,1/2,0,1/2],[0,0,1,0,0],[1/4,1/4,0,1/4,1/4],[0,0,1/2,0,1/2],[0,0,0,0,1]])

  [4],[0,0,0,0,1],[[3,1,2,0]],[[1],[1],[1],[1],[1]]
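The second and fourth parts of the result can be checked by hand: [0,0,0,0,1] is invariant under the transition matrix, and since state 4 is the only recurrent state, every starting state ends there with probability 1 (the column of ones). A quick verification of the invariance claim, using exact rationals (a check written for this example, not part of the manual):

```python
from fractions import Fraction as F

M = [[0, 0, F(1, 2), 0, F(1, 2)],
     [0, 0, 1, 0, 0],
     [F(1, 4), F(1, 4), 0, F(1, 4), F(1, 4)],
     [0, 0, F(1, 2), 0, F(1, 2)],
     [0, 0, 0, 0, 1]]

# every row of a transition matrix sums to 1
assert all(sum(row) == 1 for row in M)

# pi = [0,0,0,0,1] is invariant: pi*M == pi
pi = [0, 0, 0, 0, 1]
piM = [sum(pi[i] * M[i][j] for i in range(5)) for j in range(5)]
assert piM == pi
```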