
Markov Chain limiting probability vector/distributions

  • 07-03-2013 12:59pm
    #1
    Registered Users, Registered Users 2 Posts: 412 ✭✭


    Hi,

    Does anyone know if the limiting probability vector of a Markov Chain is the same as the limiting distribution of a MC? I know how to find the vector by using simultaneous equations but can't find anything about a distribution in my notes.

    Thanks for any help


Comments

  • Registered Users, Registered Users 2 Posts: 5,141 ✭✭✭Yakuza


    They're essentially two names for the same thing. My CT4 notes call it the stationary probability distribution; strictly speaking, where the limiting distribution of a Markov Chain exists (e.g. for an irreducible, aperiodic chain) it coincides with the unique stationary distribution.
    Such a distribution (where it exists) has the property [LATEX]\pi = \pi P[/LATEX] (where [LATEX]\pi[/LATEX] is a row vector representing the probability distribution and [LATEX]P[/LATEX] is the transition matrix), or more formally:

    [LATEX]\pi_{j} = \sum\limits_{i \in S} \pi_{i}p_{ij}[/LATEX]

    where [LATEX]S[/LATEX] is the (finite) state space.
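The simultaneous equations [LATEX]\pi = \pi P[/LATEX] plus the normalisation [LATEX]\sum_i \pi_i = 1[/LATEX] can be solved numerically. A minimal sketch in Python/NumPy, using a made-up 2-state transition matrix purely for illustration (not from this thread):

```python
import numpy as np

# Hypothetical 2-state transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# pi = pi P  is the singular system (P^T - I) pi^T = 0, so we stack the
# normalisation equation sum(pi) = 1 on top and solve by least squares.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)  # stationary distribution, here [5/6, 1/6]

# For an irreducible, aperiodic chain this is also the limiting
# distribution: every row of P^k converges to the same vector.
print(np.linalg.matrix_power(P, 50)[0])
```

By hand this is the same working: [LATEX]0.9\pi_1 + 0.5\pi_2 = \pi_1[/LATEX] gives [LATEX]\pi_2 = 0.2\pi_1[/LATEX], and [LATEX]\pi_1 + \pi_2 = 1[/LATEX] then gives [LATEX]\pi = (5/6,\ 1/6)[/LATEX].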

