A lecture note from MATH275B (Winter 2012) covering stationary stochastic processes, with a focus on Markov chains and their stationary distributions. The lecture also discusses the abstract setting of measure-preserving transformations and Kolmogorov's extension theorem, which allows realizing all real-valued stationary stochastic processes in this framework.

Lecture 13: Stationary Stochastic Processes
MATH275B - Winter 2012 Lecturer: Sebastien Roch
References: [Var01, Chapter 6], [Dur10, Section 6.1], [Bil95, Chapter 24].


1 Stationary stochastic processes

DEF 13.1 (Stationary stochastic process) A real-valued process {Xn}n≥0 is stationary if for every k, m

(Xm, ..., Xm+k) ∼ (X0, ..., Xk).

EX 13.2 IID sequences are stationary.

1.1 Stationary Markov chains

1.1.1 Markov chains

DEF 13.3 (Discrete-time finite-space MC) Let A be a finite space, μ a distribution on A and {p(i, j)}i,j∈A a transition matrix on A. Let (Xn)n≥0 be a process with distribution

P[X0 = x0, ..., Xn = xn] = μ(x0) p(x0, x1) · · · p(xn−1, xn),

for all n ≥ 0 and x0, ..., xn ∈ A.
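The product formula in DEF 13.3 corresponds to a simple sequential sampling procedure: draw X0 ∼ μ, then each Xk+1 from the row p(Xk, ·). A minimal sketch (names hypothetical; the distribution and transition matrix are plain dicts):

```python
import random

def sample_chain(mu, p, n, seed=0):
    """Sample (X_0, ..., X_n) from a finite-space Markov chain.

    mu: dict state -> probability (initial distribution)
    p:  dict (i, j) -> transition probability p(i, j)
    """
    rng = random.Random(seed)
    states = list(mu)
    # draw X_0 ~ mu
    x = rng.choices(states, weights=[mu[s] for s in states])[0]
    path = [x]
    for _ in range(n):
        # draw X_{k+1} ~ p(x, .)
        x = rng.choices(states, weights=[p[(x, s)] for s in states])[0]
        path.append(x)
    return path
```

With a deterministic two-state chain (always jump), the path simply alternates, which matches the product formula directly.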

EX 13.4 (RW on a graph) Let G = (V, E) be a finite, undirected graph. Define

p(i, j) = 1{(i, j) ∈ E} / |N(i)|,

where N(i) = {j : (i, j) ∈ E}.

This defines a RW on a graph as the finite MC with the above transition matrix (for each μ, an arbitrary distribution on V). More generally, any finite MC can be seen as a RW on a weighted directed graph.
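To make EX 13.4 concrete, the transition matrix of the RW on a graph can be built directly from the edge set. A sketch (function name hypothetical; assumes no isolated vertices, so |N(i)| > 0):

```python
def rw_transition(V, E):
    """Transition matrix p(i, j) = 1{(i, j) in E} / |N(i)| for the RW on an
    undirected graph G = (V, E). Assumes every vertex has at least one neighbor."""
    nbrs = {v: set() for v in V}
    for (i, j) in E:
        nbrs[i].add(j)
        nbrs[j].add(i)  # undirected: each edge contributes both directions
    return {(i, j): (1.0 / len(nbrs[i]) if j in nbrs[i] else 0.0)
            for i in V for j in V}
```

On the triangle graph each vertex has degree 2, so every allowed transition has probability 1/2 and every row sums to 1.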

EX 13.5 (Asymmetric SRW on an interval) Let (Sn)n≥0 be an asymmetric SRW with parameter 1/2 < p < 1. Let a < 0 < b and N = Ta ∧ Tb, where Tx denotes the first hitting time of x. Then (Xn)n≥0 = (SN∧n)n≥0 is a Markov chain.
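EX 13.5 can be simulated directly: run the walk from 0 and freeze it the first time it hits a or b. A sketch (names hypothetical):

```python
import random

def stopped_asrw(p, a, b, n, seed=0):
    """Path (S_{N∧0}, ..., S_{N∧n}) of an asymmetric SRW started at 0,
    frozen on first hitting a or b (N = Ta ∧ Tb). Requires a < 0 < b."""
    rng = random.Random(seed)
    s, path = 0, [0]
    for _ in range(n):
        if s not in (a, b):                     # before time N: take a ±1 step
            s += 1 if rng.random() < p else -1
        path.append(s)                          # after N the path is constant
    return path
```

The stopped path never leaves [a, b], moves by at most one unit per step, and is constant from the first visit to a or b onward, exactly as S_{N∧n} should be.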

1.1.2 Stationarity

DEF 13.6 (Stationary Distribution) A probability measure π on A is a stationary distribution if

∑i∈A π(i) p(i, j) = π(j),

for all j ∈ A. In other words, if X0 ∼ π then X1 ∼ π and in fact Xn ∼ π for all n ≥ 0.
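The balance equations in DEF 13.6 are easy to check numerically for a finite chain. A sketch (function name hypothetical) that verifies ∑i π(i) p(i, j) = π(j) for every j:

```python
def is_stationary(pi, p, states, tol=1e-9):
    """Check the balance equations sum_i pi(i) p(i, j) = pi(j) for every j."""
    return all(abs(sum(pi[i] * p[(i, j)] for i in states) - pi[j]) < tol
               for j in states)
```

For example, the two-state chain with p(0, 1) = 0.1 and p(1, 0) = 0.2 has stationary distribution π = (2/3, 1/3), while the uniform distribution is not stationary for it.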

EX 13.7 (RW on a graph) In the RW on a graph example above, define

π(i) = |N(i)| / (2|E|).

Then

∑i∈V π(i) p(i, j) = ∑i:(i,j)∈E (|N(i)| / (2|E|)) · (1/|N(i)|) = |N(j)| / (2|E|) = π(j),

so that π is a stationary distribution.
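As a concrete instance of EX 13.7 (an assumed example, not from the lecture), the balance equations can be verified exactly on the path graph 0–1–2 using rational arithmetic:

```python
from fractions import Fraction

# Path graph 0 - 1 - 2: verify that pi(i) = |N(i)| / (2|E|) is stationary.
V = [0, 1, 2]
E = [(0, 1), (1, 2)]
nbrs = {v: {j for (i, j) in E if i == v} | {i for (i, j) in E if j == v} for v in V}
pi = {v: Fraction(len(nbrs[v]), 2 * len(E)) for v in V}         # degrees / 2|E|
p = {(i, j): (Fraction(1, len(nbrs[i])) if j in nbrs[i] else Fraction(0))
     for i in V for j in V}
# exact balance check: sum_i pi(i) p(i, j) == pi(j) for every j
balanced = all(sum(pi[i] * p[(i, j)] for i in V) == pi[j] for j in V)
```

Here the degrees are (1, 2, 1) and 2|E| = 4, so π = (1/4, 1/2, 1/4), and the check is exact rather than up to floating-point error.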

EX 13.8 (ASRW on interval) In the ASRW on [a, b], π = δa and π = δb, as well as all their mixtures, are stationary.

EX 13.9 (Stationary Markov chain) Let X be a MC on A (countable) with transition matrix {p(i, j)}i,j∈A and stationary distribution π > 0. Then X started at π is a stationary stochastic process. Indeed, by definition of π and induction

X0 ∼ Xn,

for all n ≥ 0. Then for all m, k, by definition of MCs

(X0, ..., Xk) ∼ (Xm, ..., Xm+k).
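The induction step behind EX 13.9 — if Xn ∼ π then Xn+1 ∼ π — amounts to pushing the marginal distribution forward through the transition matrix, μn+1(j) = ∑i μn(i) p(i, j). A sketch on an assumed two-state chain whose stationary distribution is π = (2/3, 1/3):

```python
# Push the marginal of X_n forward: mu_{n+1}(j) = sum_i mu_n(i) p(i, j).
p = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}   # assumed 2-state chain
pi = {0: 2/3, 1: 1/3}                                      # its stationary distribution
mu = dict(pi)                                              # start the chain at pi
for _ in range(5):                                         # five steps of the chain
    mu = {j: sum(mu[i] * p[(i, j)] for i in (0, 1)) for j in (0, 1)}
# the marginal never moves: X_n ~ pi for every n
marginals_fixed = all(abs(mu[j] - pi[j]) < 1e-9 for j in (0, 1))
```

Since each forward step maps π to itself, induction gives Xn ∼ π (hence X0 ∼ Xn) for all n, which is the first half of the argument in EX 13.9.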

1.2 Abstract setting

EX 13.10 (A canonical example) Let (Ω, F, P) be a probability space. A map T : Ω → Ω is said to be measure-preserving (for P) if for all A ∈ F,

P[T⁻¹A] = P[A], that is, P[ω : Tω ∈ A] = P[A].

If X is F-measurable then Xn(ω) = X(T^n ω), n ≥ 0, defines a stationary sequence. Indeed, for all B ∈ B(R^(k+1)),

P[(X0, ..., Xk)(ω) ∈ B] = P[(X0, ..., Xk)(T^m ω) ∈ B] = P[(Xm, ..., Xm+k)(ω) ∈ B],

where the first equality holds because T^m is itself measure-preserving.
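A finite analogue of EX 13.10 (my illustration, not from the lecture): the rotation T(ω) = (ω + 1) mod n on Ω = {0, ..., n−1} with uniform P is measure-preserving, so the law of a window (Xm, Xm+1) should not depend on m. This can be checked exactly by counting:

```python
from collections import Counter

# Rotation T(w) = (w + 1) mod n on Omega = {0, ..., n-1} with uniform P
# preserves P, so X_k(w) = X(T^k w) is a stationary sequence.
n = 12
omega = range(n)
X = lambda w: w % 3                 # any fixed observable X (hypothetical choice)

def window(m, w):
    """(X_m, X_{m+1})(w) = (X(T^m w), X(T^{m+1} w))."""
    return (X((w + m) % n), X((w + m + 1) % n))

law0 = Counter(window(0, w) for w in omega)   # law of (X_0, X_1) under uniform P
law5 = Counter(window(5, w) for w in omega)   # law of (X_5, X_6)
```

The two counters agree because w ↦ T^5 w is a bijection of Ω preserving the uniform measure, mirroring the first equality in the displayed computation above.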

References

[Bil95] Patrick Billingsley. Probability and measure. Wiley Series in Probability and Mathematical Statistics. John Wiley & Sons Inc., New York, 1995.

[Dur10] Rick Durrett. Probability: theory and examples. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, fourth edition, 2010.

[Var01] S. R. S. Varadhan. Probability theory, volume 7 of Courant Lecture Notes in Mathematics. New York University Courant Institute of Mathematical Sciences, New York, 2001.