A transition probability matrix is said to be doubly stochastic if

$\sum_{i=0}^{M} P_{ij} = 1$

for all states $j = 0, 1, \ldots, M$. Show that if such a Markov chain is ergodic, then

$\pi_j = \frac{1}{M+1}, \qquad j = 0, 1, \ldots, M.$

Short Answer


It is proved that if such a Markov chain is ergodic, then $\pi_j = \frac{1}{M+1}$ for $j = 0, 1, \ldots, M$.

Step by step solution

01

Given Information

We are given that the transition probability matrix is doubly stochastic, that is,

$\sum_{i=0}^{M} P_{ij} = 1$

for all states $j = 0, 1, \ldots, M$.
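
As a concrete illustration (the matrix below is a hypothetical example, not part of the exercise), the following Python/NumPy sketch checks the doubly stochastic condition: every row of a transition matrix sums to 1 by definition, and double stochasticity additionally requires every column to sum to 1.

    import numpy as np

    # Hypothetical 3-state transition matrix (M = 2): all rows and all
    # columns sum to 1, so it is doubly stochastic.
    P = np.array([
        [0.2, 0.3, 0.5],
        [0.5, 0.2, 0.3],
        [0.3, 0.5, 0.2],
    ])

    row_sums = P.sum(axis=1)   # stochastic:        each row sums to 1
    col_sums = P.sum(axis=0)   # doubly stochastic: each column also sums to 1

    print("row sums:   ", row_sums)
    print("column sums:", col_sums)
    print("doubly stochastic:", np.allclose(row_sums, 1) and np.allclose(col_sums, 1))

Because all entries of this particular P are positive, the corresponding chain is also irreducible and aperiodic, hence ergodic, so it fits the hypotheses of the exercise.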

02

Simplify

Since the chain is ergodic and the transition matrix is doubly stochastic, there exists a unique stationary distribution $\pi$. We only need to check that $\pi_i = \frac{1}{M+1}$ solves the system of equations $\pi = \pi P$, i.e., that

$\pi_j = \sum_i \pi_i P_{ij}$

for every state $j$. Since $\sum_i P_{ij} = 1$, substituting $\pi_i = \frac{1}{M+1}$ gives

$\sum_i \pi_i P_{ij} = \frac{1}{M+1} \sum_i P_{ij} = \frac{1}{M+1} \cdot 1 = \frac{1}{M+1} = \pi_j.$

Moreover, $\sum_j \pi_j = (M+1) \cdot \frac{1}{M+1} = 1$, so this is indeed a probability distribution. By uniqueness, we have proved that the stationary distribution is $\pi_j = \frac{1}{M+1}$, $j = 0, 1, \ldots, M$.
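
To make the computation tangible, here is a minimal numerical check (a sketch, reusing the hypothetical matrix P from above) that the uniform vector $\pi_j = \frac{1}{M+1}$ satisfies $\pi = \pi P$, and that an ergodic doubly stochastic chain converges to it from any starting distribution.

    import numpy as np

    # Hypothetical doubly stochastic transition matrix (M = 2).
    P = np.array([
        [0.2, 0.3, 0.5],
        [0.5, 0.2, 0.3],
        [0.3, 0.5, 0.2],
    ])
    n_states = P.shape[0]          # M + 1

    # Candidate stationary distribution: pi_j = 1/(M+1) for every state j.
    pi = np.full(n_states, 1.0 / n_states)

    # Stationarity: pi P = pi, because each column of P sums to 1.
    print("pi P =", pi @ P)                    # -> [1/3, 1/3, 1/3]
    print("stationary:", np.allclose(pi @ P, pi))

    # Ergodicity in action: mu P^n approaches pi from any initial mu.
    mu = np.array([1.0, 0.0, 0.0])             # start in state 0
    for _ in range(50):
        mu = mu @ P
    print("mu P^50 =", mu)                     # close to the uniform distribution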
