Exercise 1: Consider an undirected graph G = (V, E) without loops, where V is the set of vertices and E the set of edges of G. A random walk on G is a Markov chain with transition matrix P = (pij)i,j∈V defined by
\[
p_{ij} =
\begin{cases}
1/\deg(i) & \text{if } i \text{ and } j \text{ are adjacent},\\
0 & \text{otherwise},
\end{cases}
\tag{1}
\]
where deg(i) = |{j : j ∼ i}| is the number of vertices adjacent to i. Show that π with πi = deg(i)/d, where d = Σi∈V deg(i), is a stationary distribution.
Solution: If π is a stationary distribution, it makes the probability outflow from any node i ∈ V,
\[
\sum_{j \sim i} \pi_i \,\frac{1}{\deg(i)} = \pi_i,
\]
equal to the probability inflow at the same node,
\[
\sum_{j \sim i} \pi_j \,\frac{1}{\deg(j)}.
\]
Substituting in the equality
\[
\pi_i = \sum_{j \sim i} \pi_j \,\frac{1}{\deg(j)}
\]
the values πi = deg(i)/d, with d = Σi∈V deg(i), we get the identity
\[
\frac{\deg(i)}{d} = \sum_{j \sim i} \frac{\deg(j)}{d}\,\frac{1}{\deg(j)} = \sum_{j \sim i} \frac{1}{d} = \frac{\deg(i)}{d},
\]
so πi = deg(i)/d is indeed stationary.
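Remark (numerical check): the computation above can be verified numerically. Below is a minimal Python sketch (not part of the original exercise; the small graph, and the use of NumPy, are our own illustrative choices) checking that π with πi = deg(i)/d satisfies πP = π.

import numpy as np

# Undirected, loop-free example graph on V = {0, 1, 2, 3}
# (a 4-cycle plus the chord {1, 3}), given by adjacency lists.
adj = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
n = len(adj)

# Transition matrix of the random walk, Eq. (1).
P = np.zeros((n, n))
for i, nbrs in adj.items():
    for j in nbrs:
        P[i, j] = 1.0 / len(nbrs)

deg = np.array([len(adj[i]) for i in range(n)])
pi = deg / deg.sum()            # pi_i = deg(i)/d, d = sum of the degrees

assert np.allclose(pi @ P, pi)  # stationarity: pi P = pi
print(pi)                       # [0.2 0.3 0.2 0.3]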
Exercise 2: Compute the stationary distribution for an HMC with values in I = {1, 2, 3}
and transition matrix
\[
P = \begin{pmatrix} 1-\alpha & \alpha & 0 \\ 0 & 1-\beta & \beta \\ \gamma & 0 & 1-\gamma \end{pmatrix}, \tag{2}
\]
where α, β, γ ∈ (0, 1).
Solution: Let π = (π(1), π(2), π(3)) be a distribution on I. By definition, π is stationary for P if the following equality holds:
\[
(\pi(1), \pi(2), \pi(3)) \begin{pmatrix} 1-\alpha & \alpha & 0 \\ 0 & 1-\beta & \beta \\ \gamma & 0 & 1-\gamma \end{pmatrix} = (\pi(1), \pi(2), \pi(3)), \tag{3}
\]
which is equivalent to the system
απ(1) = γπ(3);
βπ(2) = απ(1);
βπ(2) = γπ(3).
The solution (the value of π(1) will be fixed later) is
\[
\pi(3) = \frac{\alpha}{\gamma}\,\pi(1) \quad\text{and}\quad \pi(2) = \frac{\alpha}{\beta}\,\pi(1). \tag{4}
\]
Now, imposing the normalization, we have
\[
\pi(1) + \pi(2) + \pi(3) = \pi(1) + \frac{\alpha}{\beta}\,\pi(1) + \frac{\alpha}{\gamma}\,\pi(1) = 1, \tag{5}
\]
which gives
\[
\pi(1) = \frac{\gamma\beta}{\gamma\beta + \alpha\beta + \gamma\alpha}. \tag{6}
\]
Substituting this expression for π(1) in (4), we recover
\[
\pi(2) = \frac{\gamma\alpha}{\gamma\beta + \alpha\beta + \gamma\alpha}; \tag{7}
\]
\[
\pi(3) = \frac{\alpha\beta}{\gamma\beta + \alpha\beta + \gamma\alpha}. \tag{8}
\]
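Remark (numerical check): formulas (6)-(8) are easy to verify numerically; a minimal Python sketch (the values of α, β, γ are illustrative only):

import numpy as np

a, b, g = 0.3, 0.5, 0.7            # alpha, beta, gamma in (0, 1)
P = np.array([[1 - a, a,     0.0  ],
              [0.0,   1 - b, b    ],
              [g,     0.0,   1 - g]])

D = g * b + a * b + g * a          # common denominator in (6)-(8)
pi = np.array([g * b, g * a, a * b]) / D

assert np.allclose(pi @ P, pi)     # stationarity, Eq. (3)
assert np.isclose(pi.sum(), 1.0)   # normalization, Eq. (5)
print(pi)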
Exercise 3: Let τ be a stopping time for the process (Xn )n≥0 and n̄ ∈ N = {0, 1, 2, . . .}.
Show that τ̄ = τ + n̄ is a stopping time.
Solution: Since τ is a stopping time, 1{τ=n} = fn(X0, X1, . . . , Xn) for some function fn.
It is clear that τ̄ ≥ n̄ a.s.; hence {τ̄ = n} = ∅ for n < n̄, while for n ≥ n̄ we have
\[
\mathbf{1}_{\{\bar\tau = n\}} = \mathbf{1}_{\{\tau = n - \bar n\}} = f_{n-\bar n}(X_0, X_1, \dots, X_{n-\bar n}) = g_n(X_0, X_1, \dots, X_n),
\]
having set gn(x0, x1, . . . , xn) = fn−n̄(x0, x1, . . . , xn−n̄). In both cases the event {τ̄ = n} is determined by X0, . . . , Xn, so τ̄ is a stopping time.
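Remark (illustration): to make the construction of gn concrete, here is a small Python sketch (our own illustration, not part of the original solution) where τ is the first hitting time of a fixed state a, so that fn and gn can be written explicitly; the names f, g, a and the value of n̄ are hypothetical.

# tau = first hitting time of state a (a stopping time); tau_bar = tau + n_bar.
def f(n, path, a=1):
    # f_n(x_0, ..., x_n) = 1 iff the first visit to a occurs at time n
    prefix = list(path[: n + 1])
    return 1.0 if (a in prefix and prefix.index(a) == n) else 0.0

n_bar = 2

def g(n, path, a=1):
    # g_n(x_0, ..., x_n) = f_{n - n_bar}(x_0, ..., x_{n - n_bar}); it never
    # inspects the path beyond index n - n_bar, so the event {tau_bar = n}
    # is determined by X_0, ..., X_n alone.
    return 0.0 if n < n_bar else f(n - n_bar, path[: n - n_bar + 1], a)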
Exercise 4: Let (Xn )n≥0 be an HMC on I ⊂ N. Assume X0 = 0 and that the chain has no
closed sets. Moreover, let (Sj )j≥0 be random times defined as follows:
S0 = 0 and Sj = inf{n > Sj−1 : Xn ≠ XSj−1} for j ≥ 1.
Define the process (Yn)n≥0, setting Y0 = X0 ∈ I and Yn = XSn for n ≥ 1.
1. Prove that (Sj )j≥0 are stopping times for (Xn )n≥0 .
2. Argue that (Yn )n≥0 is a Markov chain.
3. Compute the transition matrix of (Yn )n≥0 .
Solution:
1. Sj is the (random) time at which (Xn)n≥0 changes its state for the j-th time. The chain changes its state exactly at those times n for which Xn−1 ≠ Xn. Thus
\[
\{S_j = m\} = \left\{ \sum_{i=1}^{m} \mathbf{1}_{\{X_{i-1} \neq X_i\}} = j,\; X_{m-1} \neq X_m \right\}. \tag{9}
\]
Eq. (9) shows that the event {Sj = m} depends only on X0, . . . , Xm, hence (Sj)j≥0 is a sequence of stopping times.
2. Since (Sj)j≥0 is a sequence of stopping times, (Yn)n≥0 is a Markov chain by the strong Markov property.
3. Let us denote by P = (pij)i,j∈I the transition matrix of (Xn)n≥0 and by P̃ = (p̃ij)i,j∈I the transition matrix of the process (Yn)n≥0. The matrix P̃ reads
\[
\tilde p_{ii} = 0 \quad\text{and}\quad \tilde p_{ij} = \frac{p_{ij}}{\sum_{h \neq i} p_{ih}} \quad \text{for } j \neq i. \tag{10}
\]
First take: A first way for a rigorous derivation of (10) is to start from the definition of p̃ij:
\begin{align*}
\tilde p_{ij} &= P(Y_n = j \mid Y_{n-1} = i)\\
&= P(X_{S_n} = j \mid X_{S_{n-1}} = i)\\
&= \frac{P(X_{S_n} = j,\, X_{S_{n-1}} = i)}{P(X_{S_{n-1}} = i)}\\
&= \frac{\sum_{t > h,\, h \geq n-1} P(X_t = j,\, X_h = i,\, S_n = t,\, S_{n-1} = h)}{P(X_{S_{n-1}} = i)}\\
&= \frac{\sum_{t > h,\, h \geq n-1} P(X_t = j,\, X_t \neq i,\, X_{t-1} = i, \dots, X_h = i,\, S_{n-1} = h)}{P(X_{S_{n-1}} = i)}\\
&= \dots
\end{align*}
Now, we condition with respect to {Xt−1 = i, . . . , Xh = i, Sn−1 = h} and use the Markov property. Notice that the Markov property can be applied because, Sn−1 being a stopping time, the event {Sn−1 = h} depends solely on X0, . . . , Xh. We obtain
\begin{align*}
\dots &= \frac{\sum_{t > h,\, h \geq n-1} P(X_t = j,\, X_t \neq i \mid X_{t-1} = i)\, P(X_{t-1} = i, \dots, X_h = i,\, S_{n-1} = h)}{P(X_{S_{n-1}} = i)}\\
&= P(X_t = j,\, X_t \neq i \mid X_{t-1} = i)\, \frac{\sum_{t > h,\, h \geq n-1} P(X_{t-1} = i, \dots, X_h = i,\, S_{n-1} = h)}{P(X_{S_{n-1}} = i)}\\
&= p_{ij}\,(1 - \delta_{ij})\, \frac{\sum_{t > h,\, h \geq n-1} P(X_{t-1} = i, \dots, X_h = i,\, S_{n-1} = h)}{P(X_{S_{n-1}} = i)}\\
&= p_{ij}\,(1 - \delta_{ij})\, Z_i,
\end{align*}
where δij is the Kronecker delta, i.e. it equals 1 if i = j and 0 otherwise, and we used that, by homogeneity, P(Xt = j, Xt ≠ i|Xt−1 = i) = pij(1 − δij) does not depend on t, which allowed taking it out of the sum. In Zi we collected all the remaining terms, which depend only on i. The precise expression of Zi does not matter: Zi is a normalizing factor satisfying the condition
\[
\sum_{j \in I} p_{ij}\,(1 - \delta_{ij})\, Z_i = 1,
\]
which implies
\[
Z_i = \frac{1}{\sum_{h \neq i} p_{ih}}.
\]
Second take: A second way to prove (10) is to use first-step analysis. Take i ≠ j. Then, by definition of p̃ij and by homogeneity, we have
\begin{align*}
\tilde p_{ij} &= P(Y_n = j \mid Y_{n-1} = i)\\
&= P(X_{S_1} = j \mid X_0 = i)\\
&=: u(i). \tag{11}
\end{align*}
We compute
\[
u(i) = \sum_{k \in I} P(X_{S_1} = j \mid X_1 = k)\, p_{ik} = u(i)\, p_{ii} + p_{ij}, \tag{12}
\]
where we used that P(XS1 = j|X1 = i) = u(i) (by homogeneity) and, for k ≠ i, P(XS1 = j|X1 = k) = δjk (since X1 = k ≠ i = X0 forces S1 = 1).
The latter equation gives, for i ≠ j,
\[
\tilde p_{ij} := u(i) = \frac{p_{ij}}{1 - p_{ii}} = \frac{p_{ij}}{\sum_{h \neq i} p_{ih}}.
\]
Finally, p̃ii = 0 follows from the condition Σj∈I p̃ij = 1.
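Remark (numerical check): formula (10) can also be checked by simulation, building the jump chain (Yn)n≥0 from a long path of (Xn)n≥0 and comparing the empirical transition frequencies with (10). A minimal Python sketch (the 3-state matrix P and all parameter values are illustrative only):

import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
n_states, n_steps = 3, 200_000

# Simulate (X_n) starting from X_0 = 0.
X = np.zeros(n_steps, dtype=int)
for n in range(1, n_steps):
    X[n] = rng.choice(n_states, p=P[X[n - 1]])

# (Y_n) = (X_{S_n}): keep X_0 and the value at every jump time S_j.
Y = X[np.insert(X[1:] != X[:-1], 0, True)]

# Empirical transition matrix of (Y_n).
counts = np.zeros((n_states, n_states))
for a, b in zip(Y[:-1], Y[1:]):
    counts[a, b] += 1
P_tilde_emp = counts / counts.sum(axis=1, keepdims=True)

# Eq. (10): zero the diagonal of P and renormalize the rows.
P_tilde = P * (1 - np.eye(n_states))
P_tilde /= P_tilde.sum(axis=1, keepdims=True)

print(np.round(P_tilde_emp, 3))   # should be close to P_tilde
print(np.round(P_tilde, 3))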
Exercise 5: Let (Xn)n≥0 be a random walk on Z with X0 = 0, which moves up with probability p and down with probability q = 1 − p. Show that Yn = |Xn| is a birth-death process on N0 and compute its transition probabilities.
Solution: It is not difficult to see that any given path of length m that starts at X0 = 0 and ends at Xm = j has probability
\[
P(X_m = j, \dots, X_0 = 0) = p^{j}\,(pq)^{(m-j)/2} \quad \text{if } j \geq 0,
\]
\[
P(X_m = j, \dots, X_0 = 0) = q^{-j}\,(pq)^{(m+j)/2} \quad \text{if } j \leq 0.
\]
Now,
\[
\Pr(Y_{m+1} = j+1 \mid Y_m = j, \dots, Y_0 = 0) = \frac{\Pr(Y_{m+1} = j+1,\, Y_m = j, \dots, Y_0 = 0)}{\Pr(Y_m = j, \dots, Y_0 = 0)}\,.
\]
As for the numerator, assuming j ≥ 1 and m = j + 2k for some k ≥ 0 (from j = 0 the chain moves to 1 with probability 1),
\begin{align*}
\Pr(Y_{m+1} &= j+1,\, Y_m = j, \dots, Y_0 = 0)\\
&= \Pr(Y_{m+1} = j+1,\, X_m = j,\, Y_m = j, \dots, Y_0 = 0,\, X_0 = 0)\\
&\quad + \Pr(Y_{m+1} = j+1,\, X_m = -j,\, Y_m = j, \dots, Y_0 = 0,\, X_0 = 0)\\
&= (pq)^{(m-j)/2}\,(p^{j+1} + q^{j+1}),
\end{align*}
up to a common combinatorial factor counting the paths of (Xn)n≥0 compatible with the prescribed values of (Yn)n≥0; this factor cancels in the ratio.
Doing the same for the denominator, we obtain
\[
\Pr(Y_{m+1} = j+1 \mid Y_m = j, \dots, Y_0 = 0) = \frac{p^{j+1} + q^{j+1}}{p^{j} + q^{j}} = \Pr(Y_{m+1} = j+1 \mid Y_m = j).
\]
Since the right-hand side depends on the past only through the current state j, (Yn)n≥0 is a Markov chain: a birth-death process with birth probability (p^{j+1} + q^{j+1})/(p^j + q^j) and death probability pq(p^{j−1} + q^{j−1})/(p^j + q^j) for j ≥ 1, while from 0 it moves to 1 with probability 1.
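Remark (numerical check): a Monte Carlo sketch of the birth probability (our own check; the values of p, m, j and the sample size are illustrative, with m and j of equal parity so that {Ym = j} has positive probability):

import numpy as np

rng = np.random.default_rng(0)
p, q = 0.6, 0.4                   # up / down probabilities, q = 1 - p
m, j, n_sim = 31, 3, 400_000      # m and j must have the same parity

steps = rng.choice([1, -1], size=(n_sim, m + 1), p=[p, q])
X = np.cumsum(steps, axis=1)      # columns hold X_1, ..., X_{m+1}
Y = np.abs(X)

at_j = Y[:, m - 1] == j           # trajectories with Y_m = j
up = Y[at_j, m] == j + 1          # among them, those with Y_{m+1} = j + 1
print(up.mean())                                   # empirical estimate
print((p**(j + 1) + q**(j + 1)) / (p**j + q**j))   # exact value, ~0.554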