Chapter 2
Fundamental Concepts
2.1 Stochastic Processes
A stochastic process is a family of time-indexed random variables $Z(\omega, t)$.
$\omega$ belongs to a sample space $\Omega$; $t$ to an index set $T$.
For a fixed $\omega$, $Z(\omega, t)$ as a function of $t$ is called a sample function.
A time series is a sample function from a stochastic process.
Consider the finite set of random variables $\{Z_{t_1}, \dots, Z_{t_n}\}$ from $\{Z(\omega, t) : t = 0, \pm 1, \pm 2, \dots\}$.
The $n$-dimensional distribution function is
$F_{Z_{t_1},\dots,Z_{t_n}}(x_1, \dots, x_n) = P\{\omega : Z_{t_1} \le x_1, \dots, Z_{t_n} \le x_n\}$
The process is first-order stationary in distribution if
$F_{Z_{t_1}}(x_1) = F_{Z_{t_1+k}}(x_1)$
for any integers $t_1$ and $k$.
The process is second-order stationary in distribution if
$F_{Z_{t_1}, Z_{t_2}}(x_1, x_2) = F_{Z_{t_1+k}, Z_{t_2+k}}(x_1, x_2)$
for any integers $t_1$, $t_2$, and $k$.
The process is $n$-th order stationary in distribution if
$F_{Z_{t_1},\dots,Z_{t_n}}(x_1, \dots, x_n) = F_{Z_{t_1+k},\dots,Z_{t_n+k}}(x_1, \dots, x_n)$
for any $n$-tuple $(t_1, \dots, t_n)$ and $k$ of integers. A process is strictly stationary if this holds for any $n$.
Consider the real-valued process
$\{Z_t : t = 0, \pm 1, \pm 2, \dots\}$.
The mean function of the process
$\mu_t = E(Z_t)$.
The variance function of the process
$\sigma_t^2 = E(Z_t - \mu_t)^2$.
The covariance function between $Z_{t_1}$ and $Z_{t_2}$
$\gamma(t_1, t_2) = E(Z_{t_1} - \mu_{t_1})(Z_{t_2} - \mu_{t_2})$.
The correlation function between $Z_{t_1}$ and $Z_{t_2}$
$\rho(t_1, t_2) = \dfrac{\gamma(t_1, t_2)}{\sqrt{\sigma_{t_1}^2}\sqrt{\sigma_{t_2}^2}}$
For a strictly stationary process
$\mu_t = \mu$, a constant for all $t$, if $E(|Z_t|) < \infty$.
$\sigma_t^2 = \sigma^2$, a constant for all $t$, if $E(Z_t^2) < \infty$.
$\gamma(t_1, t_2) = \gamma(t_1 + k, t_2 + k)$
$\rho(t_1, t_2) = \rho(t_1 + k, t_2 + k)$
for any integer $k$. Setting $t_1 = t - k$ and $t_2 = t$, the previous two expressions give
$\gamma(t_1, t_2) = \gamma(t - k, t) = \gamma(t, t + k) = \gamma_k$
$\rho(t_1, t_2) = \rho(t - k, t) = \rho(t, t + k) = \rho_k$
The covariance and correlation between $Z_t$ and $Z_{t+k}$ depend
only on the time difference $k$.
Example of a strictly stationary process: a sequence of i.i.d. random
variables.
A process is $n$-th order weakly stationary if all its joint moments
up to order $n$ exist and are time invariant.
A second-order weakly stationary process has constant
mean and variance, and covariance and correlation that are
functions of the time difference alone.
A second-order weakly stationary process is sometimes called
covariance stationary.
A strictly stationary process with finite first two moments is also covariance stationary.
Examples 2.1 to 2.3 in Wei.
Gaussian process: the joint probability distribution of the
process is normal.
2.2 The Autocovariance and Autocorrelation Functions
For a stationary process with $E(Z_t) = \mu$ and $\mathrm{Var}(Z_t) = E(Z_t - \mu)^2 = \sigma^2$, both constant, the covariance and correlation functions
can be written as
$\gamma_k = \mathrm{Cov}(Z_t, Z_{t+k}) = E(Z_t - \mu)(Z_{t+k} - \mu)$
and
$\rho_k = \dfrac{\mathrm{Cov}(Z_t, Z_{t+k})}{\sqrt{\mathrm{Var}(Z_t)}\sqrt{\mathrm{Var}(Z_{t+k})}} = \dfrac{\gamma_k}{\gamma_0}$
$\gamma_k$ is the autocovariance function; $\rho_k$ the autocorrelation
function (ACF).
Properties:
1. $\gamma_0 = \mathrm{Var}(Z_t)$; $\rho_0 = 1$.
2. $|\gamma_k| \le \gamma_0$; $|\rho_k| \le 1$.
3. $\gamma_k = \gamma_{-k}$ and $\rho_k = \rho_{-k}$ for all $k$.
4. $\gamma_k$ and $\rho_k$ must be positive semidefinite, i.e.,
$\sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j \gamma_{|t_i - t_j|} \ge 0$
$\sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j \rho_{|t_i - t_j|} \ge 0$
for any time points $t_1, \dots, t_n$ and real numbers $\alpha_1, \dots, \alpha_n$.
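These properties can be checked numerically. A minimal sketch, assuming NumPy; the MA(1) series with coefficient 0.6 is an arbitrary illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
a = rng.normal(size=n + 1)
z = a[1:] + 0.6 * a[:-1]   # illustrative MA(1) series

def acvf(z, k):
    """Sample autocovariance with divisor n (see Section 2.5)."""
    zbar = z.mean()
    return ((z[:len(z) - k] - zbar) * (z[k:] - zbar)).sum() / len(z)

gam = np.array([acvf(z, k) for k in range(20)])
rho = gam / gam[0]

print(np.abs(rho).max() <= 1.0)        # property 2: |rho_k| <= 1
# Property 4: the matrix [gamma_{|t_i - t_j|}] is positive semidefinite
G = gam[np.abs(np.subtract.outer(np.arange(20), np.arange(20)))]
print(np.linalg.eigvalsh(G).min() > -1e-10)
```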
2.3 The Partial Autocorrelation Function
The PACF is defined as $\mathrm{Corr}(Z_t, Z_{t+k} \mid Z_{t+1}, \dots, Z_{t+k-1})$.
Consider the regression equation
$Z_{t+k} = \phi_{k1} Z_{t+k-1} + \phi_{k2} Z_{t+k-2} + \cdots + \phi_{kk} Z_t + e_{t+k}$
$Z_t$ is from a zero-mean stationary process.
$\phi_{ki}$ denotes the $i$-th regression parameter.
$e_{t+k}$ is an error term with zero mean and uncorrelated
with the independent variables $Z_{t+k-j}$, $j = 1, \dots, k$.
Multiplying on both sides by $Z_{t+k-j}$ and taking expected
values, gives
$\rho_j = \phi_{k1}\rho_{j-1} + \phi_{k2}\rho_{j-2} + \cdots + \phi_{kk}\rho_{j-k}$
For $j = 1, 2, \dots, k$ we have the system of equations
$\rho_1 = \phi_{k1}\rho_0 + \phi_{k2}\rho_1 + \cdots + \phi_{kk}\rho_{k-1}$
$\rho_2 = \phi_{k1}\rho_1 + \phi_{k2}\rho_0 + \cdots + \phi_{kk}\rho_{k-2}$
$\vdots$
$\rho_k = \phi_{k1}\rho_{k-1} + \phi_{k2}\rho_{k-2} + \cdots + \phi_{kk}\rho_0$
Solving successively for $k = 1, 2, \dots$ by Cramer's rule, we have
$\phi_{11} = \rho_1$
$\phi_{22} = \dfrac{\begin{vmatrix} 1 & \rho_1 \\ \rho_1 & \rho_2 \end{vmatrix}}{\begin{vmatrix} 1 & \rho_1 \\ \rho_1 & 1 \end{vmatrix}} = \dfrac{\rho_2 - \rho_1^2}{1 - \rho_1^2}$
etc. The PACF at lag $k$ is $\phi_{kk}$.
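The system above can also be solved numerically rather than by determinants. A minimal sketch, assuming NumPy; `pacf_from_acf` is a hypothetical helper name:

```python
import numpy as np

def pacf_from_acf(rho, kmax):
    """phi_kk for k = 1..kmax by solving the system R phi = r,
    where R[i, j] = rho_{|i-j|} and r = (rho_1, ..., rho_k)."""
    rho = np.asarray(rho)              # rho[0] = 1, rho[1] = rho_1, ...
    out = []
    for k in range(1, kmax + 1):
        R = rho[np.abs(np.subtract.outer(np.arange(k), np.arange(k)))]
        phi = np.linalg.solve(R, rho[1:k + 1])
        out.append(phi[-1])            # the last coefficient is phi_kk
    return np.array(out)

# An AR(1) with phi = 0.5 has rho_k = 0.5**k; its PACF cuts off after lag 1
rho = 0.5 ** np.arange(10)
print(pacf_from_acf(rho, 5))           # ~[0.5, 0, 0, 0, 0]
```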
2.4 White Noise Processes
A process $\{a_t\}$ is called a white noise process if
it is a sequence of uncorrelated random variables with
$E(a_t) = \mu_a$, which can be zero; $\mathrm{Var}(a_t) = \sigma_a^2$;
$\gamma_k = \mathrm{Cov}(a_t, a_{t+k}) = 0$ for $k \ne 0$.
A white noise process $\{a_t\}$ is stationary with autocovariance
function
$\gamma_k = \begin{cases} \sigma_a^2, & k = 0 \\ 0, & k \ne 0 \end{cases}$
and autocorrelation function
$\rho_k = \begin{cases} 1, & k = 0 \\ 0, & k \ne 0 \end{cases}$
and PACF
$\phi_{kk} = \begin{cases} 1, & k = 0 \\ 0, & k \ne 0 \end{cases}$
A white noise process is Gaussian if its joint distribution is
normal.
The basic phenomenon of the process is that the ACF and PACF
are identically equal to zero for all $k \ne 0$.
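A minimal simulation sketch of this, assuming NumPy (series length and seed are arbitrary), using the sample ACF defined in Section 2.5:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
a = rng.normal(size=n)                  # Gaussian white noise, mu_a = 0

abar = a.mean()
denom = ((a - abar) ** 2).sum()
rho_hat = np.array([((a[:n - k] - abar) * (a[k:] - abar)).sum() / denom
                    for k in range(1, 11)])

print(rho_hat)                           # all near zero
print(np.abs(rho_hat) < 2 / np.sqrt(n))  # roughly 95% True under white noise
```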
2.5 Estimation
Mean square convergence: a sequence of estimators $\{\hat{\theta}_n\}$ converges in mean square to
$\theta$ if
$E(\hat{\theta}_n - \theta)^2 \to 0$ as $n \to \infty$
The sample mean
$\bar{Z} = \dfrac{1}{n}\sum_{t=1}^{n} Z_t$
is an unbiased estimator for $\mu = E(Z_t)$.
The variance is
$\mathrm{Var}(\bar{Z}) = \dfrac{1}{n^2}\sum_{t=1}^{n}\sum_{s=1}^{n}\gamma_{t-s} = \dfrac{\gamma_0}{n}\sum_{k=-(n-1)}^{n-1}\left(1 - \dfrac{|k|}{n}\right)\rho_k$
If
$\lim_{n \to \infty} \sum_{k=-(n-1)}^{n-1}\left(1 - \dfrac{|k|}{n}\right)\rho_k$
is finite,
then $\mathrm{Var}(\bar{Z}) \to 0$ as $n \to \infty$ and $\bar{Z}$ is a consistent
estimator for $\mu$, i.e.,
$\lim_{n \to \infty} E(\bar{Z} - \mu)^2 = 0$
The process is ergodic for the mean if this last result holds.
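A minimal Monte Carlo sketch of the variance formula, assuming NumPy and an AR(1) with $\rho_k = \phi^{|k|}$ and $\gamma_0 = \sigma_a^2/(1 - \phi^2)$; the parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
phi, n, reps, burn = 0.7, 200, 5000, 200

# Theoretical Var(Z_bar) from the formula above, with sigma_a^2 = 1
gamma0 = 1.0 / (1 - phi**2)
k = np.arange(-(n - 1), n)
var_zbar = gamma0 / n * np.sum((1 - np.abs(k) / n) * phi ** np.abs(k))

# Monte Carlo: the spread of the sample mean across simulated paths
means = np.empty(reps)
for r in range(reps):
    a = rng.normal(size=n + burn)
    z = np.zeros(n + burn)
    for t in range(1, n + burn):
        z[t] = phi * z[t - 1] + a[t]
    means[r] = z[burn:].mean()

print(var_zbar, means.var())            # the two values should agree closely
```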
Sample Autocovariance Function
The following estimators are employed for $\gamma_k$:
$\hat{\gamma}_k = \dfrac{1}{n}\sum_{t=1}^{n-k}(Z_t - \bar{Z})(Z_{t+k} - \bar{Z})$
or
$\tilde{\gamma}_k = \dfrac{1}{n-k}\sum_{t=1}^{n-k}(Z_t - \bar{Z})(Z_{t+k} - \bar{Z})$
The expected values can be approximated by
$E(\hat{\gamma}_k) \approx \left(1 - \dfrac{k}{n}\right)\gamma_k - \left(1 - \dfrac{k}{n}\right)\mathrm{Var}(\bar{Z})$
$E(\tilde{\gamma}_k) \approx \gamma_k - \mathrm{Var}(\bar{Z})$
Both estimators are biased.
$\tilde{\gamma}_k$ becomes unbiased if $\mathrm{Var}(\bar{Z})$ is ignored.
$\hat{\gamma}_k$ has a larger bias than $\tilde{\gamma}_k$, especially when $k$ is large w.r.t. $n$.
If $\mathrm{Var}(\bar{Z}) \to 0$ as $n \to \infty$, then both $\hat{\gamma}_k$ and $\tilde{\gamma}_k$ are asymptotically
unbiased.
$\hat{\gamma}_k$ is positive semidefinite; not necessarily so for $\tilde{\gamma}_k$.
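A sketch contrasting the two estimators, assuming NumPy; the helper names are hypothetical:

```python
import numpy as np

def acvf_hat(z, k):
    """gamma_hat_k: divisor n; positive semidefinite, larger bias at large k."""
    zbar = z.mean()
    return ((z[:len(z) - k] - zbar) * (z[k:] - zbar)).sum() / len(z)

def acvf_tilde(z, k):
    """gamma_tilde_k: divisor n - k; smaller bias, not necessarily PSD."""
    zbar = z.mean()
    return ((z[:len(z) - k] - zbar) * (z[k:] - zbar)).sum() / (len(z) - k)

rng = np.random.default_rng(3)
z = rng.normal(size=100)
for k in (1, 50, 90):
    print(k, round(acvf_hat(z, k), 4), round(acvf_tilde(z, k), 4))
# The two agree at small k; at k close to n the 1/n version is shrunk toward 0
```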
Sample Autocorrelation Function
For a given observed time series $Z_1, \dots, Z_n$, the sample ACF is
defined as
$\hat{\rho}_k = \dfrac{\sum_{t=1}^{n-k}(Z_t - \bar{Z})(Z_{t+k} - \bar{Z})}{\sum_{t=1}^{n}(Z_t - \bar{Z})^2} = \dfrac{\hat{\gamma}_k}{\hat{\gamma}_0}$
A plot of $\hat{\rho}_k$ vs $k$ is called a correlogram.
For stationary Gaussian processes with $\rho_j = 0$ for $j > m$, we have for $k > m$
$\mathrm{Var}(\hat{\rho}_k) \approx \dfrac{1}{n}\left(1 + 2\sum_{j=1}^{m}\rho_j^2\right)$
which can be estimated by
$S_{\hat{\rho}_k}^2 = \dfrac{1}{n}\left(1 + 2\sum_{j=1}^{m}\hat{\rho}_j^2\right)$
To test a white noise process, use $S_{\hat{\rho}_k} = 1/\sqrt{n}$.
The sample ACF is symmetric about the origin, i.e., $\hat{\rho}_k = \hat{\rho}_{-k}$.
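A sketch of this approximation in use, assuming NumPy; the MA(1) coefficient and helper names are illustrative assumptions:

```python
import numpy as np

def sample_acf(z, kmax):
    zbar = z.mean()
    denom = ((z - zbar) ** 2).sum()
    return np.array([((z[:len(z) - k] - zbar) * (z[k:] - zbar)).sum() / denom
                     for k in range(kmax + 1)])

def bartlett_se(rho_hat, m, n):
    """Standard error of rho_hat_k for k > m, when rho_j = 0 beyond lag m."""
    return np.sqrt((1 + 2 * (rho_hat[1:m + 1] ** 2).sum()) / n)

rng = np.random.default_rng(4)
n = 400
a = rng.normal(size=n + 1)
z = a[1:] - 0.8 * a[:-1]                # MA(1): rho_k = 0 for k > 1

rho_hat = sample_acf(z, 10)
print(rho_hat[1])                       # near -0.8 / (1 + 0.64) ~ -0.488
print(bartlett_se(rho_hat, m=1, n=n))   # S.E. for judging rho_hat_k at k > 1
print(1 / np.sqrt(n))                   # white-noise standard error, for comparison
```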
Sample Partial Autocorrelation Function
The sample PACF $\hat{\phi}_{kk}$ can be calculated by substituting the $\rho_j$
by the sample estimates $\hat{\rho}_j$ in the expressions for $\phi_{kk}$.
A recursive formula is available for calculating the sample PACF.
See Wei p. 22.
To test for a white noise process, the variance of $\hat{\phi}_{kk}$ can be
approximated by
$\mathrm{Var}(\hat{\phi}_{kk}) \approx \dfrac{1}{n}$
The quantity $\pm 2/\sqrt{n}$ can be used as limits on $\hat{\phi}_{kk}$ to test the
hypothesis of a white noise process.
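A sketch of the recursion (Durbin's formula, as referenced above from Wei p. 22), assuming NumPy; the helper name is hypothetical:

```python
import numpy as np

def sample_pacf(rho_hat, kmax):
    """Durbin's recursion for phi_hat_kk, given rho_hat[0..kmax]."""
    phi = np.zeros((kmax + 1, kmax + 1))
    phi[1, 1] = rho_hat[1]
    pacf = [phi[1, 1]]
    for k in range(1, kmax):
        num = rho_hat[k + 1] - sum(phi[k, j] * rho_hat[k + 1 - j]
                                   for j in range(1, k + 1))
        den = 1.0 - sum(phi[k, j] * rho_hat[j] for j in range(1, k + 1))
        phi[k + 1, k + 1] = num / den
        for j in range(1, k + 1):
            phi[k + 1, j] = phi[k, j] - phi[k + 1, k + 1] * phi[k, k + 1 - j]
        pacf.append(phi[k + 1, k + 1])
    return np.array(pacf)

# White-noise check: compare |phi_hat_kk| against the 2/sqrt(n) limits
rng = np.random.default_rng(5)
n = 400
z = rng.normal(size=n)
zbar = z.mean()
denom = ((z - zbar) ** 2).sum()
rho_hat = np.array([((z[:n - k] - zbar) * (z[k:] - zbar)).sum() / denom
                    for k in range(11)])
print(np.abs(sample_pacf(rho_hat, 10)) < 2 / np.sqrt(n))   # mostly True
```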
2.6 Two Representations of a Time Series Process
Moving average representation
The process is written as a linear combination of a sequence
of uncorrelated random variables, i.e.,
$Z_t = \mu + a_t + \psi_1 a_{t-1} + \psi_2 a_{t-2} + \cdots = \mu + \sum_{j=0}^{\infty}\psi_j a_{t-j}$
where $\psi_0 = 1$, $\{a_t\}$ is a zero-mean white noise process, and $\sum_{j=0}^{\infty}\psi_j^2 < \infty$.
An infinite sum is defined as the limit in quadratic mean of the
finite partial sums, i.e.,
$\lim_{n \to \infty} E\left[\left(\dot{Z}_t - \sum_{j=0}^{n}\psi_j a_{t-j}\right)^2\right] = 0$
where $\dot{Z}_t = Z_t - \mu$.
Using the backshift operator $B$, defined by $B^j a_t = a_{t-j}$, the process can also
be written as
$\dot{Z}_t = \psi(B)\, a_t$
where $\psi(B) = \sum_{j=0}^{\infty}\psi_j B^j$. For this process
$E(Z_t) = \mu$
and
$\mathrm{Var}(Z_t) = \sigma_a^2 \sum_{j=0}^{\infty}\psi_j^2$
The autocovariance function is given by
$\gamma_k = E(\dot{Z}_t \dot{Z}_{t+k}) = \sigma_a^2 \sum_{j=0}^{\infty}\psi_j \psi_{j+k}$
The ACF is
$\rho_k = \dfrac{\gamma_k}{\gamma_0} = \dfrac{\sum_{j=0}^{\infty}\psi_j \psi_{j+k}}{\sum_{j=0}^{\infty}\psi_j^2}$
For the process to be stationary, $\gamma_k$ must be finite for each $k$.
Thus, by the Cauchy-Schwarz inequality,
$|\gamma_k| = \sigma_a^2 \left|\sum_{j=0}^{\infty}\psi_j\psi_{j+k}\right| \le \sigma_a^2 \left[\sum_{j=0}^{\infty}\psi_j^2 \sum_{j=0}^{\infty}\psi_{j+k}^2\right]^{1/2}$
A required condition for stationarity is $\sum_{j=0}^{\infty}\psi_j^2 < \infty$.
The process is also called a linear process.
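A sketch computing $\gamma_k$ from truncated $\psi$-weights, assuming NumPy; the AR(1) weights $\psi_j = \phi^j$ are a standard illustration and the helper name is hypothetical:

```python
import numpy as np

def linear_process_acvf(psi, sigma2_a, kmax):
    """gamma_k = sigma_a^2 * sum_j psi_j psi_{j+k}, for a truncated psi sequence."""
    psi = np.asarray(psi)
    return np.array([sigma2_a * (psi[:len(psi) - k] * psi[k:]).sum()
                     for k in range(kmax + 1)])

# An AR(1) has psi_j = phi**j; truncating the sum at a large J approximates it
phi = 0.6
psi = phi ** np.arange(200)
gam = linear_process_acvf(psi, sigma2_a=1.0, kmax=5)
print(gam[0], 1 / (1 - phi**2))   # both ~ 1.5625
print(gam / gam[0])               # ACF ~ phi**k
```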
Let $\{\gamma_k : k = 0, \pm 1, \pm 2, \dots\}$ be a sequence of autocovariances. The
autocovariance generating function is defined as
$\gamma(B) = \sum_{k=-\infty}^{\infty}\gamma_k B^k$
The variance, $\gamma_0$, is the coefficient of $B^0$, and the
autocovariance of lag $k$ is the coefficient of both $B^k$ and
$B^{-k}$. Using the expression for $\gamma_k$, we have
$\gamma(B) = \sigma_a^2 \sum_{k=-\infty}^{\infty}\sum_{j=0}^{\infty}\psi_j\psi_{j+k}B^k = \sigma_a^2 \sum_{i=0}^{\infty}\sum_{j=0}^{\infty}\psi_i\psi_j B^{i-j} = \sigma_a^2\,\psi(B)\,\psi(B^{-1})$
where $i = j + k$. This provides a way of calculating the autocovariances of some
linear processes. The autocorrelation generating function is
$\rho(B) = \dfrac{\gamma(B)}{\gamma_0}$
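As an illustrative check, take an MA(1) written as $\dot{Z}_t = (1 - \theta B)a_t$ (this parameterization is an assumption for the example). Then
$\gamma(B) = \sigma_a^2 (1 - \theta B)(1 - \theta B^{-1}) = \sigma_a^2\left[(1 + \theta^2) - \theta B - \theta B^{-1}\right]$
Reading off coefficients gives $\gamma_0 = (1 + \theta^2)\sigma_a^2$, $\gamma_1 = \gamma_{-1} = -\theta\sigma_a^2$, and $\gamma_k = 0$ for $|k| > 1$.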
Autoregressive (AR) representation
$\dot{Z}_t = \pi_1 \dot{Z}_{t-1} + \pi_2 \dot{Z}_{t-2} + \cdots + a_t$
or
$\pi(B)\,\dot{Z}_t = a_t$
where $\pi(B) = 1 - \sum_{j=1}^{\infty}\pi_j B^j$ and $1 + \sum_{j=1}^{\infty}|\pi_j| < \infty$.
A process is called invertible if it can be written in this form.
Not every stationary process is invertible.
The linear process $\dot{Z}_t = \psi(B)\,a_t$ will be invertible if the roots
of $\psi(B) = 0$ lie outside the unit circle.
If $R$ is a real root, then $|R| > 1$. If the root is complex, $R = c + di$, then
$|R| = \sqrt{c^2 + d^2} > 1$.
An invertible process will be stationary if it can be written in
the linear form
$\dot{Z}_t = \dfrac{1}{\pi(B)}\,a_t = \psi(B)\,a_t$
so that $\sum_{j=0}^{\infty}\psi_j^2 < \infty$ is satisfied. The condition is that the roots
of $\pi(B) = 0$ lie outside the unit circle.
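A sketch of the root check, assuming NumPy; `all_roots_outside_unit_circle` is a hypothetical helper and the MA(1) coefficients are arbitrary illustrations:

```python
import numpy as np

def all_roots_outside_unit_circle(coefs):
    """Roots of 1 + c_1 B + ... + c_p B^p; coefs = [c_1, ..., c_p].
    numpy.roots expects coefficients with the highest power first."""
    poly = np.r_[np.asarray(coefs)[::-1], 1.0]   # c_p B^p + ... + c_1 B + 1
    roots = np.roots(poly)
    return bool(np.all(np.abs(roots) > 1.0)), roots

# psi(B) = 1 - 0.5B: root B = 2 lies outside the unit circle -> invertible
print(all_roots_outside_unit_circle([-0.5]))
# psi(B) = 1 - 2B: root B = 0.5 lies inside -> not invertible
print(all_roots_outside_unit_circle([-2.0]))
```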
Different time series models
Autoregressive process of order $p$: AR($p$).
Moving average process of order $q$: MA($q$).
Mixed autoregressive moving average model of order $(p, q)$: ARMA($p, q$).
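In the operator notation introduced above, these models take the following forms (sign conventions as in the MA and AR representations of this chapter):
$\text{AR}(p): \quad (1 - \phi_1 B - \cdots - \phi_p B^p)\,\dot{Z}_t = a_t$
$\text{MA}(q): \quad \dot{Z}_t = (1 - \theta_1 B - \cdots - \theta_q B^q)\,a_t$
$\text{ARMA}(p, q): \quad (1 - \phi_1 B - \cdots - \phi_p B^p)\,\dot{Z}_t = (1 - \theta_1 B - \cdots - \theta_q B^q)\,a_t$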