Summary: A lecture note from MATH275B (Winter 2012) covering stationary stochastic processes, with a focus on Markov chains and their stationary distributions. The lecture also discusses the abstract setting of measure-preserving transformations and Kolmogorov's extension theorem, which allows realizing all real-valued stationary stochastic processes in this framework.
MATH275B - Winter 2012
Lecturer: Sebastien Roch
References: [Var01, Chapter 6], [Dur10, Section 6.1], [Bil95, Chapter 24].
DEF 13.1 (Stationary stochastic process) A real-valued process {X_n}_{n≥0} is stationary if for every k, m

(X_m, ..., X_{m+k}) ∼ (X_0, ..., X_k).
EX 13.2 IID sequences are stationary.
1.1.1 Markov chains
DEF 13.3 (Discrete-time finite-space MC) Let A be a finite space, μ a distribution on A, and {p(i, j)}_{i,j∈A} a transition matrix on A. Let (X_n)_{n≥0} be a process with distribution

P[X_0 = x_0, ..., X_n = x_n] = μ(x_0) p(x_0, x_1) · · · p(x_{n−1}, x_n),

for all n ≥ 0 and x_0, ..., x_n ∈ A.
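As an illustration (not part of the original notes), here is a minimal Python sketch of sampling a path X_0, ..., X_n of such a chain; the function name sample_path, the initial law mu, and the two-state example are made up for this sketch.

import numpy as np

def sample_path(mu, p, n, rng=None):
    """Sample X_0, ..., X_n from the finite MC with initial law mu and
    transition matrix p, i.e. with
    P[X_0 = x_0, ..., X_n = x_n] = mu(x_0) p(x_0, x_1) ... p(x_{n-1}, x_n)."""
    rng = np.random.default_rng() if rng is None else rng
    states = np.arange(len(mu))
    path = [rng.choice(states, p=mu)]                     # X_0 ~ mu
    for _ in range(n):
        path.append(rng.choice(states, p=p[path[-1]]))    # X_{k+1} ~ p(X_k, .)
    return path

# Two-state toy example (A = {0, 1}); mu and p are made up for illustration.
mu = np.array([0.5, 0.5])
p = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(sample_path(mu, p, n=10))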
EX 13.4 (RW on a graph) Let G = (V, E) be a finite, undirected graph. Define
p(i, j) = 1{(i, j) ∈ E} / |N(i)|,

where N(i) = {j ∈ V : (i, j) ∈ E}.

This defines the RW on G as the finite MC with the above transition matrix (and an arbitrary initial distribution μ on V). More generally, any finite MC can be seen as a RW on a weighted directed graph.
EX 13.5 (Asymmetric SRW on an interval) Let (S_n)_{n≥0} be an asymmetric SRW with parameter 1/2 < p < 1. Let a < 0 < b and N = T_a ∧ T_b, where T_x denotes the first hitting time of x. Then (X_n)_{n≥0} = (S_{N∧n})_{n≥0} is a Markov chain.
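A small simulation sketch (again not from the notes) of the stopped walk S_{N∧n}: steps are +1 with probability p and −1 with probability 1 − p, and the walk is frozen once it first hits a or b. The function name and the parameter values are illustrative.

import numpy as np

def stopped_asymmetric_srw(p, a, b, n_steps, rng=None):
    """Return X_0, ..., X_{n_steps} with X_n = S_{N ∧ n}, where S is the
    asymmetric SRW started at 0 and N = T_a ∧ T_b."""
    rng = np.random.default_rng() if rng is None else rng
    x, path = 0, [0]
    for _ in range(n_steps):
        if x not in (a, b):                    # before time N: take a ±1 step
            x += 1 if rng.random() < p else -1
        path.append(x)                         # from time N on, the walk stays put
    return path

print(stopped_asymmetric_srw(p=0.7, a=-3, b=5, n_steps=20))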
1.1.2 Stationarity
DEF 13.6 (Stationary Distribution) A probability measure π on A is a stationary distribution if

∑_i π(i) p(i, j) = π(j),

for all j ∈ A. In other words, if X_0 ∼ π then X_1 ∼ π, since P[X_1 = j] = ∑_i π(i) p(i, j) = π(j); by induction, X_n ∼ π for all n ≥ 0.
EX 13.7 (RW on a graph) In the RW on a graph example above, define
π(i) = |N(i)| / (2|E|).

Then

∑_{i∈V} π(i) p(i, j) = ∑_{i:(i,j)∈E} (|N(i)| / (2|E|)) · (1 / |N(i)|) = ∑_{i:(i,j)∈E} 1 / (2|E|) = |N(j)| / (2|E|) = π(j),

so that π is a stationary distribution.
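As a quick numerical check (not in the original notes), the following sketch builds the RW transition matrix for a small undirected graph and verifies that π(i) = |N(i)| / (2|E|) satisfies πp = π; the particular graph and the variable names are illustrative.

import numpy as np

# Undirected graph: a triangle on {0, 1, 2} with a pendant vertex 3 attached to 2.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
V = 4

# Neighborhoods N(i) = {j : (i, j) in E}.
N = {i: set() for i in range(V)}
for i, j in edges:
    N[i].add(j)
    N[j].add(i)

# RW transition matrix p(i, j) = 1{(i, j) in E} / |N(i)|.
p = np.zeros((V, V))
for i in range(V):
    for j in N[i]:
        p[i, j] = 1.0 / len(N[i])

# Candidate stationary distribution pi(i) = |N(i)| / (2 |E|).
pi = np.array([len(N[i]) for i in range(V)]) / (2 * len(edges))

print(np.allclose(pi @ p, pi))   # True: pi p = pi
print(pi.sum())                  # 1.0: pi is a probability measure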
EX 13.8 (ASRW on interval) In the ASRW on [a, b], π = δ_a and π = δ_b, as well as all their mixtures, are stationary.
EX 13.9 (Stationary Markov chain) Let X be a MC on A (countable) with transition matrix {p_{ij}}_{i,j∈A} and stationary distribution π > 0. Then X started at π is a stationary stochastic process. Indeed, by definition of π and induction,

X_0 ∼ X_n,

for all n ≥ 0. Then for all m, k, by definition of MCs,

(X_0, ..., X_k) ∼ (X_m, ..., X_{m+k}).
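A rough Monte Carlo illustration of EX 13.9 (not in the notes): for a two-state chain started from its stationary distribution π, the empirical law of (X_0, X_1) should match that of (X_3, X_4). The chain, π, and the helper joint_law are made up for this sketch.

import numpy as np

rng = np.random.default_rng(1)
p = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi = np.array([0.8, 0.2])        # solves pi p = pi for this chain

def run(n_steps):
    x = rng.choice(2, p=pi)      # X_0 ~ pi
    path = [x]
    for _ in range(n_steps):
        x = rng.choice(2, p=p[x])
        path.append(x)
    return path

paths = np.array([run(4) for _ in range(50_000)])

def joint_law(a, b):
    """Empirical law of the pair (a, b) on {0, 1}^2."""
    return np.array([[np.mean((a == i) & (b == j)) for j in range(2)] for i in range(2)])

# The two matrices should agree up to Monte Carlo error.
print(joint_law(paths[:, 0], paths[:, 1]))
print(joint_law(paths[:, 3], paths[:, 4]))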
EX 13.10 (A canonical example) Let (Ω, F, P) be a probability space. A map T : Ω → Ω is said to be measure-preserving (for P) if for all A ∈ F,

P[T^{−1} A] = P[ω : Tω ∈ A] = P[A].

If X is an F-measurable random variable, then X_n(ω) = X(T^n ω), n ≥ 0, defines a stationary sequence. Indeed, since T^m is also measure-preserving, for all B ∈ B(R^{k+1})

P[(X_0, ..., X_k)(ω) ∈ B] = P[(X_0, ..., X_k)(T^m ω) ∈ B] = P[(X_m, ..., X_{m+k})(ω) ∈ B].
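A concrete instance of this construction (my own illustrative choice, not from the notes): take Ω = [0, 1) with Lebesgue measure P, the rotation Tω = (ω + α) mod 1 for an irrational α, which preserves P, and X = 1_{[0,1/2)}. The sketch below is a rough Monte Carlo check that the resulting sequence X_n(ω) = X(T^n ω) has shift-invariant finite-dimensional distributions.

import numpy as np

rng = np.random.default_rng(0)
alpha = np.sqrt(2) - 1           # irrational rotation angle; T(w) = (w + alpha) mod 1

def X(w):                        # a fixed random variable on (Omega, F, P)
    return (w < 0.5).astype(float)

# Sample omega ~ P (uniform on [0, 1)) and build X_n(omega) = X(T^n omega), n = 0, ..., 4.
omega = rng.random(100_000)
seq = np.stack([X((omega + n * alpha) % 1.0) for n in range(5)])

# Marginals: both should be ~0.5.
print(seq[0:2].mean(axis=1))
print(seq[3:5].mean(axis=1))
# Joint moment E[X_0 X_1] vs E[X_3 X_4]: should agree up to Monte Carlo error.
print((seq[0] * seq[1]).mean(), (seq[3] * seq[4]).mean())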
[Bil95] Patrick Billingsley. Probability and measure. Wiley Series in Probability and Mathematical Statistics. John Wiley & Sons Inc., New York, 1995.
[Dur10] Rick Durrett. Probability: theory and examples. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, fourth edition, 2010.
[Var01] S. R. S. Varadhan. Probability theory, volume 7 of Courant Lecture Notes in Mathematics. New York University Courant Institute of Mathematical Sciences, New York, 2001.