Name:
Student ID number:
Midterm 1
Statistics 153 Introduction to Time Series
March 7th, 2019
General comments:
1. Flip this page only after the midterm has started.
2. Before handing in, write your name on every sheet of paper!
3. Anyone caught cheating on this midterm will receive a failing grade
and will also be reported to the University Office of Student Conduct.
To avoid any suspicion of cheating, please keep your eyes on your own materials
and do not converse with others during the midterm.
1. Consider the following model for time series data: Xt = Xt−1 + Zt + δ, where δ is some non-zero
constant and Zt is white noise with variance σ².
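For intuition, the recursion can be simulated directly. The following is a minimal sketch; the values δ = 0.5 and σ = 1, the Gaussian noise, and the sample sizes are illustrative assumptions, not part of the problem.

```python
import numpy as np

# Simulate X_t = X_{t-1} + Z_t + delta with X_0 = 0 over many independent paths.
# delta = 0.5, sigma = 1.0, and Gaussian noise are illustrative choices.
rng = np.random.default_rng(0)
delta, sigma, n, n_paths = 0.5, 1.0, 200, 2000

Z = rng.normal(0.0, sigma, size=(n_paths, n))   # white noise
X = np.cumsum(Z + delta, axis=1)                # X_t = sum of (Z_s + delta) for s <= t

print("empirical mean at t=200:", X[:, -1].mean(), "(compare delta*t =", delta * n, ")")
print("empirical var  at t=200:", X[:, -1].var(), "(compare sigma^2*t =", sigma**2 * n, ")")
```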
(a) Give the definitions of weak and strong stationarity.
(4 Points)
(b) Show that there exists no stationary solution for Xt in the above model.
(2 Points)
(c) From now on suppose that X0 = 0. Compute the mean and the variance of Xt for all t > 0.
(3 Points)
(d) Is Xt homoscedastic? Explain.
(1 Point)
(e) Propose an invertible function f(·) such that the transformed data f(Xt) has approximately
constant variance. Explain.
Hint: You may assume that all your observations are positive.
(3 Points)
(f) Propose an invertible transformation of Xt such that the transformed series is stationary. Explain.
(3 Points)
2. Consider the stationary, zero-mean AR(1) model Xt = 0.5Xt−1 + Zt and the MA(1) model
Wt = 0.5Zt−1 + Zt, where Zt is some white noise with variance σ².
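All three processes can be simulated and their sample ACFs computed directly. The following minimal sketch assumes Gaussian noise with σ = 1 (an illustrative choice) and starts the AR(1) recursion at zero, omitting any burn-in.

```python
import numpy as np

# Simulate Z_t (white noise), W_t = 0.5 Z_{t-1} + Z_t, and X_t = 0.5 X_{t-1} + Z_t,
# then compute their sample ACFs.  sigma = 1 and Gaussian noise are illustrative choices.
rng = np.random.default_rng(1)
n, sigma = 100, 1.0

Z = rng.normal(0.0, sigma, size=n + 1)
W = 0.5 * Z[:-1] + Z[1:]                 # MA(1)
X = np.zeros(n)
for t in range(1, n):                    # AR(1), started at zero (burn-in omitted)
    X[t] = 0.5 * X[t - 1] + Z[t + 1]

def sample_acf(x, max_lag=10):
    # Standard sample ACF: gamma_hat(h) / gamma_hat(0) with the biased ACVF estimator.
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - h], x[h:]) / len(x) / c0
                     for h in range(max_lag + 1)])

for name, series in [("Z", Z[1:]), ("W", W), ("X", X)]:
    print(name, np.round(sample_acf(series, 5), 2))
```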
(a) For each of Zt, Wt, and Xt give the ACVF and the ACF.
i. For Zt:
(1 Point)
ii. For Wt:
(2 Points)
iii. For Xt:
(2 Points)
(b) For each of Zt, Wt, and Xt give the approximate mean and variance of its sample ACF at
lag 2 for n = 100 observations.
Hint: Recall Bartlett’s formula Wij = Σ_{m=1}^∞ (ρ(m + i) + ρ(m − i) − 2ρ(i)ρ(m)) (ρ(m + j) + ρ(m − j) − 2ρ(j)ρ(m)).
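The formula in the hint can also be evaluated numerically for a given ACF ρ(·) by truncating the infinite sum. The sketch below does this; the cutoff M = 500 and the AR(1) example ρ(h) = 0.5^|h| are illustrative choices.

```python
import numpy as np

# Numerically evaluate Bartlett's formula W_ij for a given ACF rho(.),
# truncating the infinite sum at m = M (M = 500 is an arbitrary cutoff).
# The approximate variance of the sample ACF at lag i is then W_ii / n.
def bartlett_w(rho, i, j, M=500):
    total = 0.0
    for m in range(1, M + 1):
        a = rho(m + i) + rho(abs(m - i)) - 2.0 * rho(i) * rho(m)   # rho is even
        b = rho(m + j) + rho(abs(m - j)) - 2.0 * rho(j) * rho(m)
        total += a * b
    return total

# Illustrative example: the AR(1) ACF rho(h) = 0.5**|h|.
rho_ar1 = lambda h: 0.5 ** abs(h)
n = 100
print("approx. Var(rho_hat(2)) for the AR(1):", bartlett_w(rho_ar1, 2, 2) / n)
```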
i. For Zt:
(2 Points)
ii. For Wt:
(4 Points)
iii. For Xt:
(4 Points)
[Three panels, each showing a sample ACF (y-axis: ACF, from −0.2 to 1.0) plotted against Lag (x-axis: 0 to 10).]
Figure 1: Sample ACFs of different time series data.
(c) Figure 1 shows sample ACFs for each of the three models for n = 100 observations. Which
plot corresponds to which process? Explain.
(3 Points)
3. For zero-mean time series data {Xt} consider the model (1 − 0.2B)(Xt − 0.5Xt−1) = Zt −
0.6Zt−1 + 0.05Zt−2, where {Zt} is white noise with variance σ² = 4.
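A quick way to inspect such a model numerically is to expand its AR and MA polynomials and look at their roots; the process is causal (respectively invertible) when all roots of the AR (respectively MA) polynomial lie strictly outside the unit circle. The following is a minimal sketch using numpy.

```python
import numpy as np
from numpy.polynomial import polynomial as P

# Expand the AR polynomial (1 - 0.2z)(1 - 0.5z) and inspect the roots of both
# the AR and the MA polynomials (coefficients in increasing powers of z).
ar_poly = P.polymul([1.0, -0.2], [1.0, -0.5])
ma_poly = np.array([1.0, -0.6, 0.05])

print("AR polynomial coefficients:", ar_poly)
print("moduli of AR roots:", np.abs(P.polyroots(ar_poly)))
print("moduli of MA roots:", np.abs(P.polyroots(ma_poly)))
```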
(a) Identify {Xt} as an ARMA(p, q) model and give its MA and AR polynomials.
(4 Points)
(b) Is the model invertible and causal?
(2 Points)
(c) Find its unique stationary solution.
(4 Points)
(d) Compute its ACVF.
(4 Points)
(e) Assume someone wants to use this model to predict weekly car sales. On average the
company sells 100 cars per week. Two weeks ago they sold 95 cars and last week they sold
101 cars. Based on this, what is the best linear predictor of car sales next week?
Hint: You do not have to compute the actual value, it is enough to write down a linear
system of equations that needs to be solved.
(3 Points)
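Once the ACVF is known, the linear system from the hint can be solved numerically. The sketch below uses placeholder ACVF values (the γ(0), γ(1), γ(2) shown are illustrative, not computed from the model) and solves the prediction equations for the mean-corrected predictor of next week's sales from the last two observations.

```python
import numpy as np

# Best linear predictor of X_{n+1} from X_n and X_{n-1} for a process with mean mu:
#   prediction = mu + a1*(X_n - mu) + a2*(X_{n-1} - mu),
# where (a1, a2) solves Gamma a = (gamma(1), gamma(2))^T.
gamma = {0: 1.0, 1: 0.5, 2: 0.25}        # placeholder ACVF values (illustrative only)
mu, x_last, x_prev = 100.0, 101.0, 95.0  # weekly mean and the two observed weeks

Gamma = np.array([[gamma[0], gamma[1]],
                  [gamma[1], gamma[0]]])
rhs = np.array([gamma[1], gamma[2]])
a = np.linalg.solve(Gamma, rhs)

prediction = mu + a[0] * (x_last - mu) + a[1] * (x_prev - mu)
print("coefficients:", a, "predicted sales:", prediction)
```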
4. A scientist considers the model Xt = mt + st + Wt for some time series data, where mt = at + b
is a linear trend function with parameters a, b, and st is a seasonal component with period 2, that
is, st = st+2 for all t. Wt is some zero-mean stationary process.
(a) First, the scientist wants to estimate the trend function mt using a filter of the form 1 +
αB + βB² + γB³, where B denotes the backshift operator and α, β, γ are parameters. How
should she choose the parameters α, β, γ such that the filtered time series is an unbiased
estimator of the trend mt, that is, E((1 + αB + βB² + γB³)Xt) = mt?
Hint: First, argue that without loss of generality you can assume that s1 + s2 = 0.
(5 Points)
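Candidate values of α, β, γ can be checked numerically by applying the filter to a noise-free series mt + st and comparing the result with mt. In the sketch below the coefficients are placeholders to be replaced with your own candidates, and the trend and seasonal values are illustrative.

```python
import numpy as np

# Quick numerical check for candidate filter coefficients (alpha, beta, gamma):
# apply 1 + alpha*B + beta*B^2 + gamma*B^3 to a noise-free m_t + s_t and compare
# the result with m_t.  All numeric values below are placeholders / illustrative.
alpha, beta, gamma = 0.0, 0.0, 0.0        # candidate coefficients (to be replaced)
a, b = 2.0, 5.0                           # illustrative trend parameters
s = np.array([1.0, -1.0])                 # period-2 seasonality with s1 + s2 = 0

t = np.arange(4, 50)                      # start at t = 4 so all lagged values exist
m = a * t + b

def series(k):
    # Noise-free X_{t-k} = m_{t-k} + s_{t-k}
    return a * (t - k) + b + s[(t - k) % 2]

filtered = series(0) + alpha * series(1) + beta * series(2) + gamma * series(3)
print("max deviation from the trend:", np.max(np.abs(filtered - m)))
```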
(b) Is Xt a stationary process? Explain.
(1 Point)
(c) Propose a transformation using differencing to make the process stationary. Explain.
(3 Points)
(d) For the stationary process Wt the scientist considers two different models:
• an MA(1) model,
• an AR(1) model.
For both of these choices identify the transformed data from (4c) as some ARMA model.
Hint: It is enough to state the orders of the respective ARMA models with explanation.
(6 Points)
5. For each statement, indicate whether it is true or false and give a short explanation.
You only get points when both the True/False answer and the explanation are correct.
(a) For the sample autocorrelations of n = 1,000 i.i.d. white noise random variables at lags
h = 1, . . . , 100, you expect on average 5 of them to be larger than 1.96 in absolute value.
[ ] True [ ] False
Explanation:
(b) The sample autocorrelations of an AR(1) process with i.i.d. white noise are (for large sample
size) approximately i.i.d.
[ ] True [ ] False
Explanation:
(c) Applying a linear (time invariant) filter to a stationary process results again in a stationary
process.
[ ] True [ ] False
Explanation:
(d) When you want to fit a seasonal parametric function of the form st = a0 + Σ_{f=1}^k (af cos(2πf t/d) + bf sin(2πf t/d))
with parameters a0, a1, . . . , ak, b1, . . . , bk, it can be helpful to choose k > d/2.
[ ] True [ ] False
Explanation:
(e) A time series {Xt} where Xt follows a Gaussian distribution for each t is a Gaussian process.
[ ] True [ ] False
Explanation:
(f) Whether a time series is invertible or not is fully determined by its finite dimensional
distributions.
[ ] True [ ] False
Explanation:
(g) Whether a time series is strongly stationary or not is fully determined by its mean and
covariance function.
[ ] True [ ] False
Explanation:
(h) Whether a Gaussian process is strongly stationary or not is fully determined by its mean
and covariance function.
[ ] True [ ] False
Explanation:
(8 Points)