Simple Descriptive Techniques
Dr. Bo Li
January 9, 2012
Simple Descriptive Techniques
- Descriptive methods should generally be tried before attempting more complicated procedures, because they can be vital in “cleaning” the data and then getting a “feel” for them, before trying to generate ideas as regards a suitable model.
- If a time series contains trend, seasonality or some other systematic component, the usual summary statistics (e.g., mean and standard deviation) can be seriously misleading and should not be calculated.
- Moreover, even when a series does not contain any systematic components, the summary statistics do not have their usual properties.
- Focus on ways of understanding typical time-series effects, such as trend, seasonality and correlations between successive observations.
Example of types of variation
[Figure: daily hospital admission counts (0 to 80) plotted against day of year (0 to 300) for circulatory (black line) and respiratory (red line) disease in 2002.]
Stationary time series
- A time series is said to be stationary if there is no systematic change in mean (no trend), if there is no systematic change in variance, and if strictly periodic variations have been removed.
- Intuitively, the properties of one section of the data are much like those of any other section.
- Strictly speaking, there is no such thing as a “stationary time series”, as the stationarity property is defined for a model.
- However, the phrase is often used for time-series data, meaning that they exhibit characteristics that suggest a stationary model can sensibly be fitted (illustrated by the sketch below).
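As a hedged illustration (not part of the original slides), the R sketch below contrasts a simulated series that a stationary model could sensibly describe with one whose mean changes systematically; the series, coefficients and slope are assumed purely for illustration.

set.seed(6)
n <- 300

stationary_like <- arima.sim(model = list(ar = 0.5), n = n)   # stable mean and variance
trended         <- 0.05 * (1:n) + rnorm(n)                    # systematic change in mean

par(mfrow = c(2, 1))
plot(stationary_like, type = "l", main = "No systematic change in mean or variance")
plot(trended,         type = "l", main = "Systematic change in mean (trend)")
par(mfrow = c(1, 1))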
Three main reasons for transformation
- Stabilize the variance: if the variability changes with the level of the series (for example, growing as the mean grows), a transformation such as taking logarithms may make it roughly constant.
- Make the seasonal effect additive: if there is a trend in the series and the size of the seasonal effect appears to increase with the mean, then it may be advisable to transform the data so as to make the seasonal effect constant from year to year.
  - additive: constant seasonal effect
  - multiplicative: the size of the seasonal effect is directly proportional to the mean. A logarithmic transformation is appropriate to make the effect additive.
- Make the data normally distributed: model building and forecasting are usually carried out on the assumption that the data are normally distributed. For example, the Box-Cox transformation (a sketch follows this list).
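As a hedged illustration of the transformations above (not part of the original slides), the R sketch below uses the built-in AirPassengers series, whose seasonal swings grow with the level; boxcox() from the MASS package profiles the likelihood over the Box-Cox parameter, and a value near zero supports the log transformation. The trend regression used inside boxcox() is an assumption made for the sketch.

library(MASS)   # for boxcox(); shipped with standard R installations

y <- AirPassengers

op <- par(mfrow = c(2, 1))
plot(y,      main = "Original scale: seasonal effect grows with the mean")
plot(log(y), main = "Log scale: seasonal effect roughly constant")
par(op)

# Profile likelihood for the Box-Cox parameter in a simple trend regression;
# a lambda near 0 points towards the log transformation.
boxcox(y ~ time(y), lambda = seq(-1, 1, 0.05))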
Three main reasons for transformation
Use with caution!
- There are problems in practice with transformations, in that a transformation which makes the seasonal effect additive may fail to stabilize the variance. Thus it may be impossible to achieve all of the above requirements at the same time.
- A transformed series is more difficult to interpret, and forecasts produced by the transformed model may have to be “transformed back” in order to be of use. This can introduce biasing effects (a small simulation follows this list).
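As an assumed illustration of the back-transformation bias (not from the slides): if forecasting is done on the log scale and the log-scale errors are roughly normal with variance sigma^2, then simply exponentiating the log-scale forecast targets the median rather than the mean; a standard correction multiplies by exp(sigma^2/2). The numbers below are simulated purely for illustration.

set.seed(1)
mu    <- 2      # hypothetical log-scale forecast (mean of log X)
sigma <- 0.5    # hypothetical log-scale forecast standard deviation

x <- exp(rnorm(1e5, mean = mu, sd = sigma))   # simulated "future" values

mean(x)                 # mean on the original scale
exp(mu)                 # naive back-transform: too small (it is the median)
exp(mu + sigma^2 / 2)   # bias-corrected back-transform, close to mean(x)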
Time Series with a trend
Three approaches to describe trend:
- Curve fitting: a traditional method of dealing with non-seasonal data that contain a trend, particularly yearly data, is to fit a simple function of time such as a polynomial curve (linear, quadratic, etc.).
- The fitted function provides a measure of the trend, and the residuals provide an estimate of local fluctuations, where the residuals are the differences between the observations and the corresponding values of the fitted curve (a fitting sketch follows this list).
- Polynomial curve: m_t = α + βt, e.g., m_t = 0.4 + 2t
- Gompertz curve: log(m_t) = a + b r^t, e.g., log(m_t) = 3 + 2 × 0.5^t
- Logistic curve: m_t = a / (1 + b e^(-ct)), e.g., m_t = a / (1 + 0.3 e^(-2t))
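A minimal curve-fitting sketch (an assumed example, not from the lecture): fit a straight-line trend to a simulated series with lm() and take the residuals as an estimate of the local fluctuations. The simulated coefficients are illustrative only.

set.seed(2)
tt <- 1:120
x  <- 0.4 + 2 * tt + arima.sim(model = list(ar = 0.5), n = 120)   # trend plus correlated noise

fit   <- lm(x ~ tt)        # linear trend m_t = alpha + beta * t
trend <- fitted(fit)
res   <- residuals(fit)    # observation minus fitted trend value

plot(tt, x, type = "l"); lines(tt, trend, col = "red")
plot(tt, res, type = "l", main = "Residuals about the fitted trend")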
Time Series with a trend
Three approaches to describe trend:
- Filtering: a linear filter converts one time series, x_t, into another, y_t, by the linear operation

    y_t = sum_{r = -q}^{+s} a_r x_{t+r}.

  If sum_r a_r = 1, the operation is referred to as a moving average. Moving averages are often symmetric, with s = q and a_j = a_{-j}; e.g., a_r = 1/(2q + 1) for r = -q, ..., +q, and the smoothed value of x_t is given by

    Sm(x_t) = 1/(2q + 1) * sum_{r = -q}^{+q} x_{t+r}

  (a sketch follows this list).
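A minimal moving-average sketch (an assumed example, not from the lecture): stats::filter() applies the symmetric weights a_r = 1/(2q + 1) to a simulated series; the series itself and the choice q = 3 are assumptions for illustration.

set.seed(3)
x <- cumsum(rnorm(200)) + sin(2 * pi * (1:200) / 50)   # illustrative noisy series

q <- 3
w <- rep(1 / (2 * q + 1), 2 * q + 1)   # a_r = 1/(2q + 1) for r = -q, ..., +q

sm <- stats::filter(x, filter = w, method = "convolution", sides = 2)

plot(x, type = "l", col = "grey")
lines(sm, col = "red", lwd = 2)   # Sm(x_t); NA at the q end points on each side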
Time Series with a trend
- Convolution: c_k = sum_{r = -∞}^{∞} a_r b_{k-r}, written {c_k} = {a_r} * {b_j}.
  Example: (1/4, 1/2, 1/4) = (1/2, 1/2) * (1/2, 1/2)
- Differencing (a sketch of both ideas follows this list)
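A minimal sketch of the two ideas above (an assumed example, not from the lecture). Both weight vectors in the convolution example are symmetric, so the orientation convention of R's convolve() does not matter here; the trended series for differencing is simulated purely for illustration.

convolve(c(0.5, 0.5), c(0.5, 0.5), type = "open")   # approximately 0.25 0.50 0.25

# First differencing y_t = x_t - x_{t-1} removes a linear trend.
set.seed(4)
x <- 5 + 0.3 * (1:200) + rnorm(200)   # linear trend plus noise
plot(diff(x), type = "l", main = "First differences: trend removed")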
Analyzing series that contain seasonal variation
- Three common seasonal models (m_t: trend, S_t: seasonal effect, ε_t: random error):
  A: X_t = m_t + S_t + ε_t
  B: X_t = m_t S_t + ε_t
  C: X_t = m_t S_t ε_t
- Moving-average smoothing for monthly, quarterly, ... data
- Seasonal differencing (a sketch follows this list)
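A minimal sketch (an assumed example, not from the lecture): classical decomposition of the built-in AirPassengers series under the additive and multiplicative forms, plus seasonal differencing at lag 12 for monthly data.

x <- AirPassengers

plot(decompose(x, type = "additive"))         # model A: X_t = m_t + S_t + e_t
plot(decompose(x, type = "multiplicative"))   # model C: X_t = m_t * S_t * e_t
plot(diff(x, lag = 12), main = "Seasonally differenced series (lag 12)")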
Autocorrelation and Correlogram
The correlogram is also called the sample autocorrelation function (ac.f.).
[Figure: correlogram (sample ac.f.) of the house sales series, ACF plotted for lags 0 to 20.]
Autocorrelation and Correlogram
[Figure: correlogram (sample ac.f.) of the wheat price index series, ACF plotted for lags 0 to 25.]
Autocorrelation and Correlogram
R code to generate the correlogram
acf(sales)              # correlogram of the house sales series
acf(wheat)              # correlogram of the wheat price index
acf(temp$temperature)   # correlogram of the temperature series
help(acf)

Another example:

x <- 1:100     # upper limit is cut off on the slide; 100 is used here for illustration
y <- sin(x)
plot(y)
plot(y, type = 'l')
acf(y)         # the sinusoidal pattern gives an ac.f. that oscillates at the same period
Interpreting the correlogram
- Random series
- Short-term correlation
- Alternating series
- Non-stationary series
- Seasonal series
- Outliers
(Simulated examples of several of these patterns are sketched below.)
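A minimal simulation sketch (assumed examples, not from the lecture) of how several of these cases typically look in a correlogram; the AR coefficients and sample size are chosen purely for illustration.

set.seed(5)
n <- 200

par(mfrow = c(2, 2))
acf(rnorm(n),                          main = "Random series")
acf(arima.sim(list(ar =  0.7), n = n), main = "Short-term correlation")
acf(arima.sim(list(ar = -0.7), n = n), main = "Alternating series")
acf(cumsum(rnorm(n)),                  main = "Non-stationary series")
par(mfrow = c(1, 1))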