Simulation of Markov chains by TRANSFORM
TRANSFORM <data> BY #MARKOV(P)
generates random values according to a Markov chain.
Each observation (variables X1,X2,...,Xm)
will contain one realization of the chain,
starting from the state given by START=i (i=1,2,...).
The default is START=1.
The transition probabilities are given as
a square matrix P.
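
To illustrate the mechanism only (a minimal Python sketch, assuming numpy
is available; simulate_chain is a hypothetical name, not part of Survo),
one realization of such a chain could be generated like this:

import numpy as np

def simulate_chain(P, length, start=1, rng=None):
    # P:      square transition matrix (each row sums to 1)
    # length: number of generated states (one per variable X1,...,Xm)
    # start:  1-based start state, as in START=i (default 1)
    # Note: whether X1 records the start state itself or the first drawn
    # transition is an assumption; here all 'length' values are drawn.
    rng = np.random.default_rng() if rng is None else rng
    P = np.asarray(P, dtype=float)
    m = P.shape[0]
    state = start - 1                      # 0-based current state
    chain = []
    for _ in range(length):
        state = rng.choice(m, p=P[state])  # draw next state from row 'state'
        chain.append(state + 1)            # store as a 1-based state number
    return chain

# Illustrative 2-state chain of length 10, starting from state 1:
P = [[0.9, 0.1],
     [0.5, 0.5]]
print(simulate_chain(P, 10))
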
TRANSFORM <data> BY #MARKOV(P,<var>,<n>)
works like the previous operation but saves only
the state of the chain after <n> steps
in variable <var>.
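
Correspondingly, a sketch of this second form, which keeps only the final
state (again a Python illustration; the function name is hypothetical):

import numpy as np

def state_after_n_steps(P, n, start=1, rng=None):
    # Run the chain for n steps and return only the final state,
    # one value per observation, as #MARKOV(P,<var>,<n>) stores in <var>.
    rng = np.random.default_rng() if rng is None else rng
    P = np.asarray(P, dtype=float)
    state = start - 1
    for _ in range(n):
        state = rng.choice(P.shape[0], p=P[state])
    return state + 1

print(state_after_n_steps([[0.9, 0.1], [0.5, 0.5]], 100))
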
Markov chains of degree 2,3,...,8 can also be simulated by the two
TRANSFORM operations above. Then P is an m^k x m matrix (m^k rows, m columns),
where m is the number of states and k=2,3,...,8 is the degree.
In these cases the start state is always the first one.
See the example below!
.......................................................................
Example: Simulation of a 3-state Markov chain of degree 2:
MATRIX P93
///     A     B     C
AA    0.6   0.4     0
AB      0     1     0
AC    0.2   0.2   0.6
BA    0.6   0.2   0.2
BB    0.2   0.2   0.6
BC    0.2   0.2   0.6
CA      1     0     0
CB      0     0     1
CC      0     0     1   / This is a final state!
MAT SAVE P93
FILE MAKE TEST,30,1000,L,S / Space for 1000 chains of length 30
TRANSFORM TEST BY #MARKOV(P93) / Generating the chains RND=1111
FILE LOAD -TEST / DELIMITER=NULL / Loading the chains (first 3 shown)
ABBCAABBAABBAABBAAAABBBCCCCCCC
AAAABBBAAAABBACCCCCCCCCCCCCCCC
BBABBAABBAABBCBCBCBCCCCCCCCCCC
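
For comparison, a Python sketch that imitates the degree-2 example above.
The row indexing is inferred from the row labels AA,AB,...,CC of P93
(the older state varies slowest); simulate_degree2 and the use of numpy
are illustrative assumptions, and since the random numbers differ from
RND=1111 the output will not match the chains shown:

import numpy as np

# Transition matrix P93 from the example: rows AA,AB,...,CC; columns A,B,C.
P93 = np.array([
    [0.6, 0.4, 0.0],   # AA
    [0.0, 1.0, 0.0],   # AB
    [0.2, 0.2, 0.6],   # AC
    [0.6, 0.2, 0.2],   # BA
    [0.2, 0.2, 0.6],   # BB
    [0.2, 0.2, 0.6],   # BC
    [1.0, 0.0, 0.0],   # CA
    [0.0, 0.0, 1.0],   # CB
    [0.0, 0.0, 1.0],   # CC (final state)
])

def simulate_degree2(P, length, rng=None):
    # Degree-2 chain over m states: the row for the pair of previous states
    # (older, newer) is m*older + newer, matching the label order AA,AB,...,CC.
    # The start history is the first state twice (here AA).
    rng = np.random.default_rng() if rng is None else rng
    m = P.shape[1]
    older, newer = 0, 0                     # start from the first state (AA)
    out = []
    for _ in range(length):
        row = m * older + newer
        nxt = rng.choice(m, p=P[row])       # draw the next state from that row
        out.append("ABC"[nxt])
        older, newer = newer, nxt
    return "".join(out)

print(simulate_degree2(P93, 30))            # one chain of length 30
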
M = More information on Markov chains
V = More information on transformations