Time Series Models Explained

Chapter 4: Models for Stationary Time Series

Moving Average Processes
Terminology: Suppose {et} is a zero-mean white noise process with var(et) = σe².
The process Yt = et − θ1et−1 − θ2et−2 − … − θqet−q is called a moving average process of order q, MA(q).
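
As an illustration (not part of the original notes), the sketch below generates an MA(q) series directly from the defining equation using NumPy; the coefficient values are chosen arbitrarily for the example.

import numpy as np

rng = np.random.default_rng(0)
theta = np.array([0.5, -0.4])              # illustrative theta_1, theta_2 -> an MA(2)
sigma_e, n = 1.0, 10_000
q = len(theta)

e = rng.normal(0.0, sigma_e, size=n + q)   # white noise, with q extra start-up values
weights = np.concatenate(([1.0], -theta))  # Y_t = e_t - theta_1 e_{t-1} - ... - theta_q e_{t-q}
y = np.convolve(e, weights, mode="valid")  # length-n series

print(y.mean(), y.var())                   # near 0 and sigma_e^2 (1 + theta_1^2 + ... + theta_q^2)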

MA(1) process
With q = 1, the MA process defined above becomes Yt = et − θet−1 (writing θ for θ1); this is called an MA(1) process.
For this process:
- the mean is E(Yt) = E(et − θet−1) = 0.
- the variance is γ0 = var(et − θet−1) = σe²(1 + θ²).
Autocovariance function:
The autocovariance function for an MA(1) process is:
γk = { σe²(1 + θ²),  k = 0
     { −θσe²,        k = 1
     { 0,            k > 1
Autocorrelation function:
The autocorrelation function for an MA(1) process is:
ρk = γk/γ0 = { 1,             k = 0
             { −θ/(1 + θ²),   k = 1
             { 0,             k > 1
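
As a quick numerical check (not from the notes), this sketch simulates a long MA(1) series with an illustrative θ = 0.6 and compares the sample lag-1 autocorrelation to the theoretical −θ/(1 + θ²) ≈ −0.44; sample_acf is an ad-hoc helper, not a library routine.

import numpy as np

rng = np.random.default_rng(0)
theta, sigma_e, n = 0.6, 1.0, 100_000

e = rng.normal(0.0, sigma_e, size=n + 1)   # white noise e_0, ..., e_n
y = e[1:] - theta * e[:-1]                 # Y_t = e_t - theta e_{t-1}

def sample_acf(x, k):
    # sample autocorrelation at lag k (mean-corrected)
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

print(-theta / (1 + theta**2))             # theoretical rho_1 ≈ -0.441
print(sample_acf(y, 1), sample_acf(y, 2))  # sample rho_1 ≈ -0.44, sample rho_2 ≈ 0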

MA(2) process
With q = 2, the MA process defined above becomes Yt = et − θ1et−1 − θ2et−2; this is called an MA(2) process.
For this process:
- the mean is E(Yt) = E(et − θ1et−1 − θ2et−2) = 0.
- the variance is γ0 = var(et − θ1et−1 − θ2et−2) = σe²(1 + θ1² + θ2²).
Autocovariance function:
The autocovariance function for an MA(2) process is:
γk = { (1 + θ1² + θ2²)σe²,  k = 0
     { (−θ1 + θ1θ2)σe²,     k = 1
     { −θ2σe²,              k = 2
     { 0,                   k > 2
Autocorrelation function:
The autocorrelation function for an MA(2) process is:
ρk = γk/γ0 = { 1,                              k = 0
             { (−θ1 + θ1θ2)/(1 + θ1² + θ2²),   k = 1
             { −θ2/(1 + θ1² + θ2²),            k = 2
             { 0,                              k > 2
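
For example (values chosen here for illustration, not from the notes): with θ1 = 0.5 and θ2 = −0.4, the denominator is 1 + θ1² + θ2² = 1.41, so ρ1 = (−0.5 + (0.5)(−0.4))/1.41 ≈ −0.50, ρ2 = −(−0.4)/1.41 ≈ 0.28, and ρk = 0 for every k > 2.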

MA(q) process
For general q, the MA process Yt = et − θ1et−1 − θ2et−2 − … − θqet−q is called an MA(q) process.
For this process:
- the mean is E(Yt) = 0.
- the variance is γ0 = var(Yt) = (1 + θ1² + θ2² + … + θq²)σe².
Autocorrelation function:
ρk = { 1,                                                               k = 0
     { (−θk + θ1θk+1 + θ2θk+2 + … + θq−kθq)/(1 + θ1² + θ2² + … + θq²),  k = 1, 2, …, q−1
     { −θq/(1 + θ1² + θ2² + … + θq²),                                   k = q
     { 0,                                                               k > q
The salient feature is that the (population) ACF:
- ρk is nonzero for lags k = 1, 2, …, q;
- ρk = 0 for all lags k > q.
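
The piecewise ACF formula above is easy to evaluate in code. Below is a sketch (not from the notes) of a small function, ma_acf, that returns the theoretical ACF of an MA(q) from its coefficient vector; the function name and test values are our own choices.

import numpy as np

def ma_acf(theta, max_lag):
    # theoretical ACF of Y_t = e_t - theta_1 e_{t-1} - ... - theta_q e_{t-q}
    theta = np.asarray(theta, dtype=float)
    q = len(theta)
    denom = 1.0 + np.sum(theta**2)
    rho = np.zeros(max_lag + 1)
    rho[0] = 1.0
    for k in range(1, min(q, max_lag) + 1):
        # -theta_k + theta_1 theta_{k+1} + ... + theta_{q-k} theta_q
        rho[k] = (-theta[k - 1] + np.sum(theta[: q - k] * theta[k:])) / denom
    return rho                              # entries stay 0 for all lags k > q

print(ma_acf([0.6], 3))                     # MA(1): 1, -0.441..., 0, 0
print(ma_acf([0.5, -0.4], 4))               # MA(2): 1, -0.496..., 0.283..., 0, 0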

Autoregressive Processes
Terminology: Suppose {et} is a zero-mean white noise process with var(et) = σe².
The process Yt = φ1Yt−1 + φ2Yt−2 + … + φpYt−p + et is called an autoregressive process of order p, AR(p) (a simulation sketch follows the bullet points below).
● In this model, the value of the process at time t, Yt, is a weighted linear combination of the values of the process from the previous p time points plus a "shock" or "innovation" term et at time t.
● We assume that et, the innovation at time t, is independent of all previous values Yt−1, Yt−2, ….
● We continue to assume that E(Yt) = 0. A nonzero mean could be added to the model by replacing Yt with Yt − µ (for all t); this would not affect the stationarity properties.
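
Unlike an MA(q), an AR(p) series must be built up recursively from its own past values. The sketch below (not from the notes) does this with an ad-hoc helper, simulate_ar, discarding an initial burn-in stretch so the arbitrary zero start-up values are forgotten; all parameter values are illustrative.

import numpy as np

def simulate_ar(phi, n, sigma_e=1.0, burn_in=500, seed=0):
    # Y_t = phi_1 Y_{t-1} + ... + phi_p Y_{t-p} + e_t, started from zeros
    phi = np.asarray(phi, dtype=float)
    p = len(phi)
    e = np.random.default_rng(seed).normal(0.0, sigma_e, size=n + burn_in)
    y = np.zeros(n + burn_in)
    for t in range(p, n + burn_in):
        y[t] = phi @ y[t - p:t][::-1] + e[t]   # phi_1 y_{t-1} + ... + phi_p y_{t-p}
    return y[burn_in:]

y1 = simulate_ar([0.8], n=5000)        # an AR(1) with phi = 0.8
y2 = simulate_ar([0.5, 0.3], n=5000)   # an AR(2) with phi_1 = 0.5, phi_2 = 0.3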

AR(1) process
With p = 1, the AR process defined above becomes Yt = φYt−1 + et; this is called an AR(1) process.
Note that:
- If φ = 1, this process reduces to a random walk.
- If φ = 0, this process reduces to white noise.
Autocovariance function:
γk = φᵏσe²/(1 − φ²) for k = 0, 1, 2, …
Autocorrelation function:
ρk = γk/γ0 = φᵏ for k = 0, 1, 2, …
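
A minimal numerical sketch of these two formulas (not from the notes), with an illustrative φ = 0.8 and σe² = 1:

import numpy as np

phi, sigma_e2 = 0.8, 1.0
k = np.arange(11)
gamma = phi**k * sigma_e2 / (1.0 - phi**2)   # gamma_k = phi^k sigma_e^2 / (1 - phi^2)
rho = gamma / gamma[0]                       # rho_k = gamma_k / gamma_0

print(np.allclose(rho, phi**k))              # True: rho_k = phi^k
print(np.round(rho, 3))                      # 1, 0.8, 0.64, 0.512, ... (exponential decay)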

Important: For an AR(1) process, because −1 < φ < 1, the (population) ACF ρk = φᵏ decays exponentially as k increases.
- If φ is close to ±1, then the decay will be slow.
- If φ is not close to ±1, then the decay will be rapid.
- If φ > 0, then all of the autocorrelations will be positive.
- If φ < 0, then the autocorrelations will alternate between negative (k = 1), positive (k = 2), negative (k = 3), positive (k = 4), and so on.
Remember these theoretical patterns so that when we see sample ACFs (from real data), we can make sensible decisions about potential model selection (see the sketch below).
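
As a sketch of that kind of reasoning (not from the notes), the code below simulates an MA(2) and an AR(1) with illustrative parameters and prints their sample ACFs: the MA(2) values drop to roughly zero after lag 2, while the AR(1) values decay gradually.

import numpy as np

rng = np.random.default_rng(1)
n = 50_000
e = rng.normal(size=n + 2)

y_ma = e[2:] - 0.5 * e[1:-1] + 0.4 * e[:-2]   # MA(2): theta_1 = 0.5, theta_2 = -0.4
y_ar = np.zeros(n)                            # AR(1): Y_t = 0.8 Y_{t-1} + e_t
for t in range(1, n):
    y_ar[t] = 0.8 * y_ar[t - 1] + e[t]

def sample_acf(x, max_lag):
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

print("MA(2):", np.round(sample_acf(y_ma, 6), 2))   # nonzero at lags 1-2, ~0 afterwards
print("AR(1):", np.round(sample_acf(y_ar, 6), 2))   # ~0.8, 0.64, 0.51, ... decaying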

Stationarity condition:
The AR(1) process is stationary if and only if |φ| < 1, that is, if −1 < φ < 1.
The AR(1) process is not stationary if |φ| ≥ 1.
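
An informal illustration of this condition (not from the notes, and not a proof): simulating many independent AR(1) paths shows that with φ = 0.5 the variance across paths is the same at early and late time points, whereas with φ = 1 (a random walk) it keeps growing with t.

import numpy as np

def ar1_paths(phi, n, reps, seed=0):
    e = np.random.default_rng(seed).normal(size=(reps, n))
    y = np.zeros((reps, n))
    for t in range(1, n):
        y[:, t] = phi * y[:, t - 1] + e[:, t]
    return y

for phi in (0.5, 1.0):
    y = ar1_paths(phi, n=2000, reps=2000)
    print(phi, round(np.var(y[:, 100]), 2), round(np.var(y[:, -1]), 2))
# phi = 0.5: both values near sigma_e^2 / (1 - phi^2) ≈ 1.33 -- no dependence on t
# phi = 1.0: the variance grows roughly like t (about 100 vs about 2000)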
