
Markov Chain matrix calculations

  • 07-01-2012 1:03pm
    #1
    Registered Users, Registered Users 2 Posts: 3,803 ✭✭✭


    Alright folks,

    Have a question regarding Markov chains. I've started working with them recently and I'm a little unsure of some of the theory. I managed to produce the first observed tally matrix and the observed probability matrix, which gave me the first step easily enough. It's after that I'm having trouble and I'm pretty lost. How do I get the second step? I went back over the data manually, counted up a second-step tally matrix and then produced the probability matrix from it, but I'm still pretty lost as to what I'm doing, and I've never worked with matrices before.

    I came across this website: http://www.math.ucdavis.edu/~daddel/linear_algebra_appl/Applications/MarkovChain/MarkovChain_9_18/node1.html. I thought it was useful because they were able to show the cyclical nature of the dataset, which is what I'm interested in. I was working through this example but I'm lost as to how they were able to produce the second, third and nth steps.

    Any help would be really appreciated!


Comments

  • Registered Users, Registered Users 2 Posts: 2,103 ✭✭✭misslt


    I have briefly studied these so might be able to help - are you looking for the n-step transition matrix, as in P(in state j at time n | in state i at time 0)? If so, you raise the transition matrix to the power of n.
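    Raising the transition matrix to the power of n can be sketched in a few lines of NumPy. The 2-state matrix here is a made-up example, not the poster's data:

    ```python
    import numpy as np

    # Hypothetical 2-state transition matrix; each row sums to 1.
    # P[i, j] = probability of moving from state i to state j in one step.
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    n = 3
    # n-step transition matrix: P raised to the power n.
    P_n = np.linalg.matrix_power(P, n)

    # P_n[i, j] = P(in state j at time n | in state i at time 0)
    print(P_n)
    ```

    Each row of P_n still sums to 1, since it is itself a transition matrix.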

    If you are looking for the long-term behaviour of the chain (as n approaches infinity), known as the stationary distribution, you let the stationary distribution be a vector (I used pi) and solve the matrix equation {pi} = {pi}P, where P is the initial transition matrix.
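    Solving pi = pi P together with the constraint that the entries of pi sum to 1 can be done as a least-squares system. Again, the transition matrix below is an assumed example:

    ```python
    import numpy as np

    # Hypothetical transition matrix P (rows sum to 1).
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    k = P.shape[0]
    # pi = pi P rearranges to pi (P - I) = 0, i.e. (P - I)^T pi^T = 0.
    # Stack on a row of ones to enforce sum(pi) = 1.
    A = np.vstack([(P - np.eye(k)).T, np.ones(k)])
    b = np.zeros(k + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(pi)  # stationary distribution
    ```

    The result is a probability vector: multiplying it by P gives it back unchanged.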

    I have no idea if this is what you're after (the text in the link you posted seems like this is it, but the notation I'm not familiar with), if not sorry, if it is and you need anything else let me know!


  • Registered Users, Registered Users 2 Posts: 3,803 ✭✭✭El Siglo


    I figured it out after a while: I take the observed probability matrix and multiply it by itself, take that result and multiply it by itself, and so on until there's an equilibrium state after n steps. I have a FORTRAN 77 program for running it but it's giving me grief with my compiler. So yeah, I think I have it sorted now! Thanks for the reply! :D
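    The repeated self-multiplication described above can be sketched like this (a Python/NumPy sketch rather than the poster's FORTRAN 77 program, and with a made-up matrix):

    ```python
    import numpy as np

    # Hypothetical observed probability matrix (rows sum to 1).
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    M = P.copy()
    for _ in range(100):
        M_next = M @ M           # each squaring doubles the number of steps
        if np.allclose(M_next, M, atol=1e-12):
            break                # stop once the matrix has stabilised
        M = M_next

    # At equilibrium, every row of M is the same: the stationary distribution.
    print(M)
    ```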


  • Registered Users, Registered Users 2 Posts: 2,481 ✭✭✭Fremen


    Do you know that if A is the transition matrix for a finite-state discrete time Markov chain, then the eigenvector of A with eigenvalue one is the stationary distribution of the chain?
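    With the row-stochastic convention (rows of A summing to one, as in the thread so far), the stationary distribution is a *left* eigenvector with eigenvalue one, so one takes eigenvectors of the transpose. A minimal NumPy sketch with an assumed 2-state matrix:

    ```python
    import numpy as np

    # Hypothetical row-stochastic transition matrix A (rows sum to 1).
    A = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    # Stationary distribution is a left eigenvector of A with eigenvalue 1,
    # i.e. a right eigenvector of A transposed.
    vals, vecs = np.linalg.eig(A.T)

    # Pick the eigenvector whose eigenvalue is (numerically) 1.
    idx = np.argmin(np.abs(vals - 1.0))
    pi = np.real(vecs[:, idx])
    pi = pi / pi.sum()           # normalise to a probability vector

    print(pi)
    ```

    As a check, pi @ A should return pi unchanged.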


  • Registered Users, Registered Users 2 Posts: 3,803 ✭✭✭El Siglo


    Fremen wrote: »
    Do you know that if A is the transition matrix for a finite-state discrete time Markov chain, then the eigenvector of A with eigenvalue one is the stationary distribution of the chain?

    I'm not sure, to be honest. I know it's a one-step, first-order regular Markov observed probability matrix. I see that the eigenvalues add up to one and that the eigenvectors are associated with these, but I'm not sure after that.


  • Registered Users, Registered Users 2 Posts: 2,103 ✭✭✭misslt


    What exactly are you trying to do?

