
3.2.3 Properties of the AR and MA Processes

AR and MA processes are stationary, and thus both conform to the conditions of a stationary series. We will not show this for the general AR(p) and MA(q) processes, but only for a few values of the parameters. Along the way, we will derive the ACF and PACF of some AR and MA processes.

(AR1 Properties) Consider the AR(1) model

$$Y_t = \delta + \phi Y_{t-1} + \varepsilon_t.$$

Stationarity implies that the expected value must be constant: $E[Y_t] = E[Y_{t-1}] = \mu$. We will use this in deriving $\mu$:

$$
\begin{aligned}
E[Y_t] &= E[\delta + \phi Y_{t-1} + \varepsilon_t] \\
E[Y_t] &= \delta + \phi E[Y_{t-1}] + E[\varepsilon_t] \\
\mu &= \delta + \phi\mu \\
\mu(1 - \phi) &= \delta \\
\mu &= \frac{\delta}{1 - \phi}.
\end{aligned}
$$
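For example, with the illustrative values $\delta = 1$ and $\phi = 0.5$,

$$\mu = \frac{1}{1 - 0.5} = 2,$$

so the series fluctuates around 2, not around the constant term $\delta = 1$.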

Remarks:
The effect of the constant term $\delta$ depends on the autoregressive parameter $\phi$.
Note that the mean is not defined when $\phi = 1$. The process where the AR parameter is unity is not stationary.

Next, we calculate the variance. For convenience, define a mean-adjusted series $X_t = Y_t - \mu$ so that

$$
\begin{aligned}
Y_t &= \delta + \phi Y_{t-1} + \varepsilon_t \\
Y_t &= (1 - \phi)\mu + \phi Y_{t-1} + \varepsilon_t \\
Y_t - \mu &= \phi(Y_{t-1} - \mu) + \varepsilon_t \\
X_t &= \phi X_{t-1} + \varepsilon_t.
\end{aligned}
$$

Note that the series and the mean-adjusted one have the same variance: $\gamma_0 = \mathrm{Var}(Y_t) = \mathrm{Var}(Y_t - \mu) = \mathrm{Var}(X_t)$. To derive it,

$$
\begin{aligned}
\mathrm{Var}(X_t) &= E\big[(X_t - E[X_t])^2\big] = E[X_t^2] \\
&= E\big[(\phi X_{t-1} + \varepsilon_t)^2\big] \\
&= E\big[\phi^2 X_{t-1}^2 + \varepsilon_t^2 + 2\phi X_{t-1}\varepsilon_t\big] \\
&= \phi^2 E[X_{t-1}^2] + E[\varepsilon_t^2] + 2\phi E[X_{t-1}\varepsilon_t] \\
&= \phi^2\,\mathrm{Var}(X_{t-1}) + \sigma^2 + 0.
\end{aligned}
$$

But by stationarity, $\gamma_0 = \mathrm{Var}(X_t) = \mathrm{Var}(X_{t-1})$, so that

$$
\begin{aligned}
\gamma_0 &= \phi^2\gamma_0 + \sigma^2 \\
\gamma_0(1 - \phi^2) &= \sigma^2 \\
\gamma_0 &= \frac{\sigma^2}{1 - \phi^2}.
\end{aligned}
$$
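For example, with the illustrative values $\phi = 0.9$ and $\sigma^2 = 1$,

$$\gamma_0 = \frac{1}{1 - 0.81} \approx 5.26,$$

more than five times the noise variance: strong autocorrelation inflates the variance of the series well beyond that of the noise.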

Remarks:
The effect of the noise variance $\sigma^2$ also depends on the autoregressive parameter $\phi$.
Note that the variance is defined only when $|\phi| < 1$. This is the restriction on the AR parameter for AR(1) to be stationary.

For the covariances, $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(X_t, X_{t-k}) = E(X_t X_{t-k})$:

$$
\begin{aligned}
k = 1:\quad \gamma_1 &= E(X_t X_{t-1}) = E[(\phi X_{t-1} + \varepsilon_t)X_{t-1}] \\
&= \phi E[X_{t-1}^2] + E[X_{t-1}\varepsilon_t] = \phi\gamma_0 = \frac{\phi\sigma^2}{1 - \phi^2} \\
k = 2:\quad \gamma_2 &= E(X_t X_{t-2}) = E[(\phi X_{t-1} + \varepsilon_t)X_{t-2}] \\
&= \phi E[X_{t-1}X_{t-2}] + E[X_{t-2}\varepsilon_t] = \phi\gamma_1 = \phi^2\gamma_0 = \frac{\phi^2\sigma^2}{1 - \phi^2}
\end{aligned}
$$

In general, $\gamma_k = E(X_t X_{t-k}) = \phi^k\gamma_0 = \dfrac{\phi^k\sigma^2}{1 - \phi^2}$.

The covariance structure of AR(1) does not depend on the time $t$ but only on the lag $k$ between the two variables.

The ACF is given by

$$\rho_k = \frac{\gamma_k}{\gamma_0} = \frac{\phi^k\gamma_0}{\gamma_0} = \phi^k.$$

The PACF is simply $\phi, 0, 0, \ldots$: the first partial autocorrelation equals the autoregressive coefficient and all higher-order ones are zero.

Here are some simulated examples of AR(1) models. The red lines in the ACF and PACF plots show the expected behavior:

Noise: $y_t = \varepsilon_t$.

[Figure: simulated white-noise series of length 1000 with its sample ACF and PACF.]

High AR parameter: $y_t = 0.9\,y_{t-1} + \varepsilon_t$. The pull back toward the mean is weak, so the series wanders away from it for long stretches.

[Figure: simulated series of length 1000 with its sample ACF and PACF.]

High negative AR parameter: $y_t = -0.9\,y_{t-1} + \varepsilon_t$. Note the alternating sign of the ACF when the parameter is negative.

[Figure: simulated series of length 1000 with its sample ACF and PACF.]

Low AR parameter: $y_t = 0.3\,y_{t-1} + \varepsilon_t$. The pull back toward the mean is strong, so the series is kept from meandering away from it.

[Figure: simulated series of length 1000 with its sample ACF and PACF.]
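Simulations like the ones above are easy to reproduce. Below is a minimal sketch in Python (the helper names, seed, and parameter value are illustrative, not from the notes) that generates an AR(1) series and compares its sample ACF with the theoretical $\rho_k = \phi^k$:

```python
import numpy as np

def simulate_ar1(phi, n=1000, sigma=1.0, seed=0):
    """Simulate y_t = phi * y_{t-1} + eps_t, starting from y_0 = 0
    (for n = 1000 the effect of the arbitrary start is negligible)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + eps[t]
    return y

def sample_acf(y, nlags=10):
    """Sample autocorrelations rho_1, ..., rho_nlags."""
    y = y - y.mean()
    c0 = y @ y / len(y)
    return np.array([y[:-k] @ y[k:] / len(y) / c0
                     for k in range(1, nlags + 1)])

phi = 0.9
y = simulate_ar1(phi)
print("sample ACF:       ", np.round(sample_acf(y, 5), 3))
print("theoretical phi^k:", np.round(phi ** np.arange(1, 6), 3))
```

The sample PACF could be checked the same way: by the result above, it should be near $\phi$ at lag 1 and near zero at all higher lags.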

(AR2 Properties) Consider the AR(2) model

$$Y_t = \delta + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t.$$

By stationarity, $E[Y_t] = E[Y_{t-1}] = E[Y_{t-2}] = \mu$. To derive the mean $\mu$:

$$
\begin{aligned}
E[Y_t] &= E[\delta + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t] \\
E[Y_t] &= \delta + \phi_1 E[Y_{t-1}] + \phi_2 E[Y_{t-2}] + E[\varepsilon_t] \\
\mu &= \delta + \phi_1\mu + \phi_2\mu \\
\mu(1 - \phi_1 - \phi_2) &= \delta \\
\mu &= \frac{\delta}{1 - \phi_1 - \phi_2}.
\end{aligned}
$$

For the autocovariance function, define again a mean-adjusted series $X_t = Y_t - \mu$ so that

$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \varepsilon_t.$$

Multiplying both sides by $X_t$ and taking expectations yields

$$E[X_t^2] = \phi_1 E(X_{t-1}X_t) + \phi_2 E(X_{t-2}X_t) + \underbrace{E(\varepsilon_t X_t)}_{\sigma^2\;(\text{why?})}$$

$$\gamma_0 = \phi_1\gamma_1 + \phi_2\gamma_2 + \sigma^2. \qquad (1)$$

Multiplying instead by $X_{t-1}$ gets us

$$E[X_t X_{t-1}] = \phi_1 E(X_{t-1}X_{t-1}) + \phi_2 E(X_{t-2}X_{t-1}) + E(\varepsilon_t X_{t-1})$$

$$\gamma_1 = \phi_1\gamma_0 + \phi_2\gamma_1. \qquad (2)$$

Now, multiplying instead by $X_{t-2}$ yields

$$E[X_t X_{t-2}] = \phi_1 E(X_{t-1}X_{t-2}) + \phi_2 E(X_{t-2}X_{t-2}) + E(\varepsilon_t X_{t-2})$$

$$\gamma_2 = \phi_1\gamma_1 + \phi_2\gamma_0. \qquad (3)$$

Lastly, multiplying by $X_{t-3}$ gets us

$$E[X_t X_{t-3}] = \phi_1 E(X_{t-1}X_{t-3}) + \phi_2 E(X_{t-2}X_{t-3}) + E(\varepsilon_t X_{t-3})$$

$$\gamma_3 = \phi_1\gamma_2 + \phi_2\gamma_1. \qquad (4)$$

In fact, for any $k \geq 3$, $\gamma_k = \phi_1\gamma_{k-1} + \phi_2\gamma_{k-2}$. $\qquad$ (5)

Equations (1) to (5) above are called the Yule-Walker equations. Since $\rho_k = \gamma_k/\gamma_0$, we can derive the ACF:

$$
\begin{aligned}
\rho_1 &= \phi_1 + \phi_2\rho_1 \;\Longrightarrow\; \rho_1 = \frac{\phi_1}{1 - \phi_2} \\
\rho_2 &= \phi_1\rho_1 + \phi_2 = \frac{\phi_1^2}{1 - \phi_2} + \phi_2 \\
\rho_k &= \phi_1\rho_{k-1} + \phi_2\rho_{k-2}, \quad k \geq 3.
\end{aligned}
$$

The partial autocorrelations for the AR(2) process are (derivation not shown):

$$\rho_1,\; \frac{\rho_2 - \rho_1^2}{1 - \rho_1^2},\; 0,\; 0,\; \ldots$$
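To make the algebra concrete, equations (2) and (3) can be divided by $\gamma_0$ and solved as a small linear system for $\rho_1$ and $\rho_2$, which also gives the second partial autocorrelation. A minimal sketch in Python (the parameter values are illustrative, matching the first simulated example below):

```python
import numpy as np

# Illustrative AR(2) parameters (assumed stationary)
phi1, phi2 = 0.5, 0.3

# Dividing Yule-Walker equations (2) and (3) by gamma_0 gives
#   rho_1 = phi1 + phi2 * rho_1
#   rho_2 = phi1 * rho_1 + phi2,
# i.e. a linear system A @ [rho_1, rho_2] = b:
A = np.array([[1.0 - phi2, 0.0],
              [-phi1,      1.0]])
b = np.array([phi1, phi2])
rho1, rho2 = np.linalg.solve(A, b)

# Compare with the closed forms derived above
print(rho1, phi1 / (1 - phi2))                 # rho_1
print(rho2, phi1**2 / (1 - phi2) + phi2)       # rho_2

# Second partial autocorrelation; for AR(2) it equals phi2
print((rho2 - rho1**2) / (1 - rho1**2), phi2)
```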

In AR(1) we found the condition on the AR parameter for the process to be stationary. For AR(2), the following is the stationarity condition (proof omitted), which forms a triangular region in the $(\phi_1, \phi_2)$ plane:

$$\phi_2 - \phi_1 < 1, \qquad \phi_2 + \phi_1 < 1 \qquad \text{and} \qquad |\phi_2| < 1.$$
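This condition is easy to check mechanically. A minimal sketch (the helper name is ours, not from the notes):

```python
def ar2_is_stationary(phi1, phi2):
    """Check the AR(2) triangle condition derived above."""
    return (phi2 - phi1 < 1) and (phi2 + phi1 < 1) and (abs(phi2) < 1)

print(ar2_is_stationary(0.5, 0.3))    # True  (first example below)
print(ar2_is_stationary(1.3, -0.8))   # True  (third example below)
print(ar2_is_stationary(0.5, 0.6))    # False: phi1 + phi2 >= 1
```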

Here are some simulated examples of AR(2) processes:

$y_t = 0.5\,y_{t-1} + 0.3\,y_{t-2} + \varepsilon_t$

[Figure: simulated series of length 1000 with its sample ACF and PACF.]

$y_t = 0.8\,y_{t-2} + \varepsilon_t$

[Figure: simulated series of length 1000 with its sample ACF and PACF.]

$y_t = 1.3\,y_{t-1} - 0.8\,y_{t-2} + \varepsilon_t$

[Figure: simulated series of length 1000 with its sample ACF and PACF.]

(MA1 & MA2 Properties) Consider the MA(1) process $Y_t = \mu + \varepsilon_t + \theta\varepsilon_{t-1}$, where $\theta$ is called the MA parameter. The mean is provided below:

$$E[Y_t] = E[\mu + \varepsilon_t + \theta\varepsilon_{t-1}] = \mu.$$

For the variance, define again a mean-adjusted series $X_t = Y_t - \mu$ so that

$$
\begin{aligned}
\mathrm{Var}(Y_t) = \mathrm{Var}(X_t) = E[X_t^2] &= E\big[(\varepsilon_t + \theta\varepsilon_{t-1})^2\big] \\
&= E[\varepsilon_t^2] + \theta^2 E[\varepsilon_{t-1}^2] + 2\theta E[\varepsilon_t\varepsilon_{t-1}] \\
&= \sigma^2 + \theta^2\sigma^2 \\
&= (1 + \theta^2)\sigma^2.
\end{aligned}
$$

To derive the autocovariance function, note that $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(X_t, X_{t-k}) = E(X_t X_{t-k})$:

$$
\begin{aligned}
\gamma_1 &= E(X_t X_{t-1}) = E[(\varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t-1} + \theta\varepsilon_{t-2})] \\
&= E[\varepsilon_t\varepsilon_{t-1} + \theta\varepsilon_t\varepsilon_{t-2} + \theta\varepsilon_{t-1}^2 + \theta^2\varepsilon_{t-1}\varepsilon_{t-2}] = \theta\sigma^2 \\
\gamma_2 &= E(X_t X_{t-2}) = E[(\varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t-2} + \theta\varepsilon_{t-3})] \\
&= E[\varepsilon_t\varepsilon_{t-2} + \theta\varepsilon_t\varepsilon_{t-3} + \theta\varepsilon_{t-1}\varepsilon_{t-2} + \theta^2\varepsilon_{t-1}\varepsilon_{t-3}] = 0 \\
&\;\;\vdots \\
\gamma_k &= E(X_t X_{t-k}) = 0, \quad k \geq 2.
\end{aligned}
$$

The ACF can right away be derived from the autocovariance function:

$$
\begin{aligned}
\rho_1 &= \frac{\gamma_1}{\gamma_0} = \frac{\theta\sigma^2}{(1 + \theta^2)\sigma^2} = \frac{\theta}{1 + \theta^2} \\
\rho_k &= \frac{\gamma_k}{\gamma_0} = \frac{0}{(1 + \theta^2)\sigma^2} = 0, \quad k \geq 2.
\end{aligned}
$$
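The lag-1 spike and the cutoff afterwards are easy to verify by simulation. A minimal sketch in Python, assuming $\theta = 0.9$ and normal noise (both illustrative):

```python
import numpy as np

theta, n = 0.9, 5000
rng = np.random.default_rng(1)
eps = rng.normal(size=n + 1)

# MA(1): y_t = eps_t + theta * eps_{t-1}
y = eps[1:] + theta * eps[:-1]

# Sample autocorrelations at lags 1..3
yc = y - y.mean()
c0 = yc @ yc / n
acf = [yc[:-k] @ yc[k:] / n / c0 for k in (1, 2, 3)]

print("sample ACF:", np.round(acf, 3))
print("theory:    ", round(theta / (1 + theta**2), 3), 0.0, 0.0)
```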

The properties of the MA(2) process $Y_t = \mu + \varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2\varepsilon_{t-2}$ are as follows (derive):

Mean and variance:

$$E[Y_t] = E[\mu + \varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2\varepsilon_{t-2}] = \mu$$
$$\mathrm{Var}[Y_t] = \gamma_0 = (1 + \theta_1^2 + \theta_2^2)\sigma^2$$

Autocovariance and autocorrelation functions:

$$\gamma_1 = (\theta_1 + \theta_1\theta_2)\sigma^2 \qquad \rho_1 = \frac{\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2}$$
$$\gamma_2 = \theta_2\sigma^2 \qquad \rho_2 = \frac{\theta_2}{1 + \theta_1^2 + \theta_2^2}$$
$$\gamma_k = 0, \; k \geq 3 \qquad \rho_k = 0, \; k \geq 3$$
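As a check on these formulas, here is a minimal sketch computing the theoretical $\rho_1$ and $\rho_2$ of an MA(2) process; the parameter values match the third simulated example below:

```python
def ma2_acf(theta1, theta2):
    """Theoretical rho_1, rho_2 of an MA(2) process (rho_k = 0 for k >= 3)."""
    g0 = 1 + theta1**2 + theta2**2          # gamma_0 / sigma^2
    rho1 = (theta1 + theta1 * theta2) / g0
    rho2 = theta2 / g0
    return rho1, rho2

# Parameters from the third MA example below
print(ma2_acf(-1.3, 0.85))
```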

$y_t = \varepsilon_t + 0.9\,\varepsilon_{t-1}$

[Figure: simulated series of length 1000 with its sample ACF and PACF.]

$y_t = \varepsilon_t - 0.5\,\varepsilon_{t-1}$

[Figure: simulated series of length 1000 with its sample ACF and PACF.]

$y_t = \varepsilon_t - 1.3\,\varepsilon_{t-1} + 0.85\,\varepsilon_{t-2}$

[Figure: simulated series of length 1000 with its sample ACF and PACF.]

