Understanding Trend, Seasonality, and Autocorrelation in Stationary Time Series (Statistics study notes)

An overview of simple descriptive techniques for analyzing stationary time series data. The author, Dr. Bo Li, discusses various types of variation in time series, including trend, seasonality, and autocorrelation. The document also covers transformations to stabilize variance, make seasonal effects additive, and normalize data for modeling and forecasting. Examples and R code are included.

What you will learn

  • How can autocorrelation and the correlogram be used to analyze time series data?
  • What are the different types of variation in time series data?
  • What are the three main reasons for transforming time series data?
  • Why is it important to understand trend and seasonality in time series data?
  • What are some common seasonal models for time series analysis?

Simple Descriptive Techniques

Dr. Bo Li

January 9, 2012

Outline: Type of variation · Stationary time series · Transformations



Simple Descriptive Techniques

  • Descriptive methods should generally be tried before attempting more complicated procedures, because they can be vital in "cleaning" the data and getting a "feel" for them before trying to generate ideas as regards a suitable model.
  • If a time series contains trend, seasonality or some other systematic component, the usual summary statistics (e.g., mean and standard deviation) can be seriously misleading and should not be calculated.
  • Moreover, even when a series does not contain any systematic components, the summary statistics do not have their usual properties.
  • Focus on ways of understanding typical time-series effects, such as trend, seasonality and correlations between successive observations.

Example of type of variation

[Figure: hospital admission counts for circulatory (black line) and respiratory (red line) disease in 2002; x-axis: days, y-axis: counts.]

Stationary time series

  • A time series is said to be stationary if there is no systematic change in mean (no trend), no systematic change in variance, and if strictly periodic variations have been removed.
  • Intuitively, the properties of one section of the data are much like those of any other section.
  • Strictly speaking, there is no such thing as a "stationary time series", as the stationarity property is defined for a model.
  • However, the phrase is often used for time-series data, meaning that they exhibit characteristics which suggest that a stationary model can sensibly be fitted.
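A tiny R illustration of this idea, using simulated data purely for demonstration: for a series that a stationary model could plausibly have generated, the mean and standard deviation of one half of the data are close to those of the other half, whereas the same comparison on a trending series shows a systematic difference.

set.seed(1)
stat <- arima.sim(list(ar = 0.5), 400)       # series consistent with a stationary model
trend <- stat + 0.05 * (1:400)               # the same series plus a deterministic trend
halves <- function(x) rbind(first  = c(mean = mean(x[1:200]),   sd = sd(x[1:200])),
                            second = c(mean = mean(x[201:400]), sd = sd(x[201:400])))
halves(stat)    # the two halves look much alike
halves(trend)   # the means differ systematically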

Transformations

Three main reasons for transformation

  • Stabilize the variance: if there is a trend in the series and the variance appears to increase with the mean, it may be advisable to transform the data (e.g., by taking logarithms) so that the variance becomes roughly constant.
  • Make the seasonal effect additive: if there is a trend in the series and the size of the seasonal effect appears to increase with the mean, it may be advisable to transform the data so as to make the seasonal effect constant from year to year. In the additive case the seasonal effect is constant; in the multiplicative case its size is directly proportional to the mean, and a logarithmic transformation is appropriate to make the effect additive (illustrated below).
  • Make the data normally distributed: model building and forecasting are usually carried out on the assumption that the data are normally distributed; for example, via a Box-Cox transformation.
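A short R illustration of the multiplicative case, using the built-in AirPassengers series as an example: its seasonal swings grow with the level of the series, and taking logarithms makes the seasonal effect roughly additive (and also helps to stabilize the variance).

x <- AirPassengers          # monthly airline passenger totals
par(mfrow = c(2, 1))
plot(x,      main = "Original scale: seasonal effect grows with the mean")
plot(log(x), main = "Log scale: seasonal effect roughly additive")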

Three main reasons for transformation: use with caution!

  • There are problems in practice with transformations, in that a transformation which makes the seasonal effect additive may fail to stabilize the variance. Thus it may be impossible to achieve all of the above requirements at the same time.
  • A transformed series can be more difficult to interpret, and forecasts produced on the transformed scale may have to be "transformed back" in order to be of use. This can introduce biasing effects (see the illustration below).
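A tiny simulated R illustration of the biasing effect: if forecasting is done on the log scale and the forecasts are simply exponentiated, they target the median rather than the mean of the original series and so tend to be too low.

set.seed(1)
z <- rnorm(1e5, mean = 0, sd = 0.5)   # "forecast errors" on the log scale
x <- exp(z)                           # corresponding values on the original scale
exp(mean(z))                          # naively back-transformed value, close to 1
mean(x)                               # true mean, close to exp(0.5^2 / 2), about 1.13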

Time series with a trend

Three approaches to describing trend:

  • Curve fitting: a traditional method of dealing with non-seasonal data that contain a trend, particularly yearly data, is to fit a simple function of time such as a polynomial curve (linear, quadratic, etc.); see the sketch below.
  • The fitted function provides a measure of the trend, and the residuals provide an estimate of local fluctuations, where the residuals are the differences between the observations and the corresponding values of the fitted curve.
  • Polynomial curve: m_t = α + βt, e.g., m_t = 0.4 + 2t
  • Gompertz curve: log(m_t) = a + b r^t, e.g., log(m_t) = 3 + 2 × 0.5^t
  • Logistic curve: m_t = a / (1 + b e^(−ct)), e.g., m_t = a / (1 + 0.3 e^(−2t))
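A minimal R sketch of curve fitting, using the built-in AirPassengers series merely as a stand-in (any numeric series would do): a quadratic polynomial in time is fitted by least squares with lm(), the fitted values measure the trend, and the residuals estimate the local fluctuations.

x <- as.numeric(AirPassengers)        # stand-in series for illustration
t <- seq_along(x)
fit <- lm(x ~ t + I(t^2))             # polynomial curve m_t = a + b t + c t^2
trend <- fitted(fit)                  # measure of the trend
local <- x - trend                    # residuals: local fluctuations
plot(t, x, type = "l"); lines(t, trend, col = "red")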

Time series with a trend (continued)

  • Filtering: a linear filter converts one time series, x_t, into another, y_t, by the linear operation

        y_t = Σ_{r=−q}^{+s} a_r x_{t+r}.

    If Σ_r a_r = 1, the operation is referred to as a moving average. Moving averages are often symmetric, with s = q and a_j = a_{−j}; e.g., a_r = 1/(2q+1) for r = −q, ..., +q, in which case the smoothed value of x_t is given by

        Sm(x_t) = (1 / (2q+1)) Σ_{r=−q}^{+q} x_{t+r}

    (see the sketch below).
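A small R sketch of the symmetric moving average above, with an illustrative half-width q and the built-in AirPassengers series as a stand-in; stats::filter() applies exactly this weighted sum (the ends of the output are NA where the window does not fit).

x <- as.numeric(AirPassengers)
q <- 6
w <- rep(1 / (2 * q + 1), 2 * q + 1)                           # weights a_r, summing to 1
sm <- stats::filter(x, w, method = "convolution", sides = 2)   # Sm(x_t)
plot(x, type = "l"); lines(sm, col = "red")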

Time series with a trend (continued)

  • Convolution: applying two linear filters, with weights {a_r} and {b_j}, in succession is equivalent to a single filter whose weights are given by the convolution

        c_k = Σ_{r=−∞}^{+∞} a_r b_{k−r},    written {c_k} = {a_r} ∗ {b_j}.

    Example: (1/4, 1/2, 1/4) = (1/2, 1/2) ∗ (1/2, 1/2); see the check below.
  • Differencing
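An illustrative R check of the convolution example, together with simple differencing; convolve(a, rev(b), type = "open") is base R's way of computing the usual convolution of two weight sequences.

a <- c(1/2, 1/2)
b <- c(1/2, 1/2)
convolve(a, rev(b), type = "open")    # 0.25 0.50 0.25, i.e. (1/4, 1/2, 1/4)

x <- 0.4 + 2 * (1:10)                 # the example linear trend m_t = 0.4 + 2t
diff(x)                               # first differences are constant (= 2): the trend is removed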

Analyzing series that contain seasonal variation

  • Three common seasonal models:
      A  X_t = m_t + S_t + ε_t
      B  X_t = m_t S_t + ε_t
      C  X_t = m_t S_t ε_t
  • Smoothing with moving averages for monthly, quarterly data, ...
  • Seasonal differencing (see the sketch below)
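A brief R sketch, using the built-in AirPassengers series as an illustrative monthly example: decompose() fits the additive model A (or model B with type = "multiplicative"), and diff(x, lag = 12) performs seasonal differencing.

x <- AirPassengers
dec <- decompose(x, type = "additive")   # estimates m_t, S_t and the irregular component
plot(dec)
sx <- diff(x, lag = 12)                  # period-12 seasonal differencing
plot(sx)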

Autocorrelation and correlogram

The correlogram is also called the sample autocorrelation function (ac.f.).

[Figure: ac.f. of the house sales series; x-axis: Lag (0-20), y-axis: ACF.]

[Figure: ac.f. of the wheat price index series; x-axis: Lag (0-25), y-axis: ACF.]

R code to generate the correlogram:

acf(sales)
acf(wheat)
acf(temp$temperature)
help(acf)

Another example:

x <- 1:100            # upper end of the range is missing in the original; 100 is an arbitrary choice
y <- sin(x)
plot(y)
plot(y, type = "l")
acf(y)

Interpreting the correlogram

  • Random series
  • Short-term correlation
  • Alternating series
  • Non-stationary series
  • Seasonal series
  • Outliers
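A small R sketch, with simulated data chosen purely for illustration, reproducing some of these characteristic correlogram shapes: a purely random series, a short-term-correlated AR(1) series, an alternating series, and a non-stationary (trending) series.

set.seed(1)
n <- 200
par(mfrow = c(2, 2))
acf(rnorm(n), main = "Random series")                                # spikes inside the bounds beyond lag 0
acf(arima.sim(list(ar = 0.7), n), main = "Short-term correlation")   # ac.f. decays quickly from lag 1
acf(rnorm(n) * (-1)^(1:n), main = "Alternating series")              # ac.f. alternates in sign
acf(cumsum(rnorm(n)), main = "Non-stationary series")                # ac.f. dies out very slowly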