Chapter Four discusses violations of classical assumptions in regression analysis, focusing on multicollinearity, heteroscedasticity, and autocorrelation. It outlines the nature, sources, consequences, detection methods, and remedial measures for each issue. The chapter emphasizes the importance of addressing these violations to ensure accurate and efficient estimation in regression models.


Chapter Four: Violations of Classical Assumptions

By Getnet Shita (MA)



Multicollinearity: Nature of the Problem

One assumption of the classical linear regression model is that there is no perfect multicollinearity among the regressors.
Multicollinearity: high correlation between two or more regressors.
Leads to difficulties in estimating individual effects of
explanatory variables.
Example: In a model with income and wealth explaining
consumption, strong correlation between income and wealth
makes it hard to isolate their separate effects.
More problematic in time series data due to common trends.



Sources of Multicollinearity

1 Data collection over a narrow range of variable values.
2 Population constraints, e.g., income vs. house size.
3 Model specification, like polynomial terms.
4 Overdetermined models (more variables than observations).
5 Common trends in time series (e.g., income, wealth,
population all grow).



Consequences of Multicollinearity

Perfect Multicollinearity:
Coefficients are indeterminate; standard errors are infinite.
Cannot separate effects of collinear variables.
High (Imperfect) Multicollinearity:
OLS estimators have high variance; less precise.
Wide confidence intervals and low t-values.
Insignificant coefficients despite a high R².
Estimates are sensitive to small changes in data.



Detecting Multicollinearity

Definition: Linear dependence among the explanatory variables in a regression model.
Sample Problem: Multicollinearity is a feature of the sample,
not the population.
Indicators:
High R² but few significant t-ratios.
High pairwise correlation (r > 0.8) among regressors.
High R² in an auxiliary regression (regressing one X_i on the others).
VIF: VIF_i = 1 / (1 − R_i²); VIF_i > 10 signals a serious problem.
Tolerance: TOL_i = 1 − R_i² = 1 / VIF_i
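The auxiliary-regression route to the VIF can be sketched in a few lines of numpy (an illustration, not code from the slides; the function name and simulated data are mine):

```python
import numpy as np

def vif(X):
    """Variance inflation factors via auxiliary regressions.

    X: (n, k) array of regressors, without a constant column.
    VIF_i = 1 / (1 - R_i^2), where R_i^2 comes from regressing
    X_i on the remaining regressors plus a constant.
    """
    n, k = X.shape
    out = []
    for i in range(k):
        y = X[:, i]
        Z = np.column_stack([np.ones(n), np.delete(X, i, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        out.append(1.0 / (1.0 - r2))
    return out
```

Two nearly collinear columns should each show a VIF well above the rule-of-thumb threshold of 10, while an unrelated column stays near 1.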



Effect on Variance

Variance of coefficient increases with multicollinearity:

Var(β̂_i) = σ² / [ Σx_i² (1 − R_i²) ] = (σ² / Σx_i²) · VIF_i

As R_i² → 1, VIF_i → ∞, hence:

Standard errors ↑ ⇒ imprecise estimates

TOL ↓ (closer to 0) indicates stronger multicollinearity.



Remedial Measures

More data: Reduces the variance of the estimators.
Transform variables:
First-differencing to remove trends.
Use ratios (e.g., per capita).
Formalize relationships: Model interdependencies using
simultaneous equations.
Pool data: Combine time-series and cross-sectional data.
Drop variable: Avoid if theory mandates its inclusion.



Nature of Heteroscedasticity

Homoscedasticity: Constant variance of the error terms: Var(u_i) = σ².
Heteroscedasticity: The variance of u_i is not constant across observations.
Common in cross-sectional data.
Example: In a savings–income model, savings variability
increases with income.
This violates CLRM assumptions and impacts inference accuracy.



Causes and Consequences of Heteroscedasticity

Causes:
Outliers and data from different populations.
Model misspecification (e.g., omitted variables).
Skewed regressors (e.g., income, wealth).
Incorrect functional forms or data transformations.
Consequences:
OLS remains unbiased and consistent.
OLS is no longer BLUE (inefficient).
Invalid standard errors → misleading t and F tests.



Detection of Heteroscedasticity
1. Informal Methods
Visual inspection of residual plots.
Patterned residuals may indicate non-constant variance.
2. Park Test
Regress ln(û_i²) on ln(X_i).
Significance of β indicates heteroscedasticity.
3. Spearman’s Rank Correlation
Correlate the ranks of |û_i| and X_i.
Use a t-test on the rank correlation coefficient r_s.
4. Goldfeld–Quandt Test
Split ordered data, omit middle, regress separately.
Compare variances using F* = SSR2 / SSR1.
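A minimal numpy sketch of the Goldfeld–Quandt procedure (illustrative; omitting 20% of the middle observations is a common choice, not mandated by the slides):

```python
import numpy as np

def goldfeld_quandt(y, x, omit=0.2):
    """Order observations by x, drop the middle `omit` fraction,
    fit OLS on each remaining half, and return F* = SSR2 / SSR1
    (residual sum of squares of the large-x half over the small-x half)."""
    order = np.argsort(x)
    y, x = y[order], x[order]
    n = len(y)
    half = (n - int(n * omit)) // 2

    def ssr(ys, xs):
        Z = np.column_stack([np.ones(len(xs)), xs])
        b, *_ = np.linalg.lstsq(Z, ys, rcond=None)
        r = ys - Z @ b
        return r @ r

    return ssr(y[n - half:], x[n - half:]) / ssr(y[:half], x[:half])
```

Under homoscedasticity F* follows an F distribution and stays near 1; if the error variance grows with x, F* is large.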



Breusch–Pagan (BP) Test

Purpose: Detect heteroscedasticity in linear regression models.
Steps:
1 Estimate Y_i = β_0 + β_1 X_1i + … + β_k X_ki + u_i by OLS.
2 Obtain residuals û_i and run the auxiliary regression:
û_i² = α_0 + α_1 X_1i + … + α_k X_ki + v_i
3 Compute R² from the auxiliary regression and the test statistic:
LM = nR² ∼ χ²_k
Reject H0 (homoscedasticity) if p-value is small.
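The steps above can be sketched directly in numpy (an illustration on simulated data, not code from the slides):

```python
import numpy as np

def breusch_pagan_lm(y, X):
    """Breusch-Pagan LM statistic: regress squared OLS residuals on the
    regressors (X includes the constant column) and return n * R^2,
    which is chi-square with (number of slopes) df under H0."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u2 = (y - X @ beta) ** 2
    alpha, *_ = np.linalg.lstsq(X, u2, rcond=None)
    e = u2 - X @ alpha
    r2 = 1.0 - (e @ e) / ((u2 - u2.mean()) @ (u2 - u2.mean()))
    return n * r2
```

With one slope the 5% chi-square critical value is 3.84; an error whose standard deviation grows with x pushes LM far above it.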



White Test for Heteroscedasticity

A general test for heteroscedasticity of any form: the auxiliary regression includes the regressors, their squares, and their cross-products.
Steps:
1 Estimate: Y_i = β_0 + β_1 X_1 + β_2 X_2 + β_3 X_3 + u_i
2 Run the auxiliary regression:
û_i² = α_0 + α_1 X_1 + α_2 X_2 + α_3 X_3 + α_4 X_1² + … + α_9 X_2 X_3 + v_i
3 Compute: LM = nR² ∼ χ²_9
Reject H0 if p-value is below significance level.
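A sketch of the White test that builds the squares and cross-products programmatically (illustrative; with k slopes the auxiliary regression has k + k(k+1)/2 slope terms, which gives the χ²_9 above when k = 3):

```python
import numpy as np
from itertools import combinations_with_replacement

def white_lm(y, X):
    """White's LM = n * R^2: regress squared OLS residuals on the
    regressors, their squares, and their cross-products.
    X has the constant in column 0; the remaining columns are slopes."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u2 = (y - X @ beta) ** 2
    S = X[:, 1:]
    cols = [np.ones(n)] + [S[:, i] for i in range(S.shape[1])]
    cols += [S[:, i] * S[:, j]
             for i, j in combinations_with_replacement(range(S.shape[1]), 2)]
    Z = np.column_stack(cols)
    alpha, *_ = np.linalg.lstsq(Z, u2, rcond=None)
    e = u2 - Z @ alpha
    r2 = 1.0 - (e @ e) / ((u2 - u2.mean()) @ (u2 - u2.mean()))
    return n * r2
```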



Remedial Measures for Heteroscedasticity

Goal: Restore the efficiency of OLS estimates.

1. When σ_i² Known: Weighted Least Squares (WLS)
Transform the model:
Y_i* = Y_i / σ_i,  X_i* = X_i / σ_i  ⇒ homoscedastic error term

2. When σ_i² Unknown: White's Robust Standard Errors
Use heteroscedasticity-consistent standard errors for inference.

3. Log Transformation:
ln Y_i = β_0 + β_1 ln X_i + u_i
Often reduces heteroscedasticity by compressing scale.
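When σ_i is known, the WLS transformation is essentially one line of numpy (a sketch on simulated data where the error s.d. equals X_i; names and data are mine):

```python
import numpy as np

def wls(y, X, sigma):
    """Weighted least squares with known error s.d. sigma_i:
    divide y and every column of X by sigma_i, then run OLS on
    the transformed (homoscedastic) model."""
    w = 1.0 / np.asarray(sigma, dtype=float)
    beta, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
    return beta
```

Note that the constant column is also divided by σ_i, so the intercept becomes the coefficient on 1/σ_i in the transformed model.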



Nature, Sources, and Consequences of
Autocorrelation
Definition: Correlation among successive disturbance terms in a
time series model.
AR(1) Process: u_t = ρu_{t−1} + ε_t, where −1 < ρ < 1
Sources:
Inertia in economic series (e.g., GDP, prices)
Data smoothing or interpolation
Omitted variables or incorrect model specification
Lagged dependent variables omitted
Consequences:
OLS remains unbiased and consistent
OLS loses efficiency; t, F tests become invalid
Predictions become inefficient due to higher variance
Detecting Autocorrelation

Graphical Method: Plot residuals vs. lagged residuals.
Clusters in the 1st and 3rd quadrants suggest positive autocorrelation.
Durbin-Watson (DW) Test:
d = Σ_{t=2}^{n} (û_t − û_{t−1})² / Σ_{t=1}^{n} û_t² ≈ 2(1 − ρ̂)

Decision Rule:
d ≈ 2: No autocorrelation
d < d_L: Positive autocorrelation
d > 4 − d_L: Negative autocorrelation
d_L ≤ d ≤ d_U: Inconclusive
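The statistic itself is a one-liner (a numpy sketch; the simulated AR(1) residuals below are illustrative):

```python
import numpy as np

def durbin_watson(resid):
    """Durbin-Watson d = sum_{t=2}^n (u_t - u_{t-1})^2 / sum_t u_t^2.
    Since d is approximately 2(1 - rho_hat), d near 2 indicates no
    first-order autocorrelation."""
    u = np.asarray(resid, dtype=float)
    return np.sum(np.diff(u) ** 2) / np.sum(u ** 2)
```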



Breusch–Godfrey (BG) Test

Steps:
1 Estimate: Y_t = β_0 + β_1 X_t + u_t
2 Run the auxiliary regression:
û_t = α + βX_t + ρ_1 û_{t−1} + … + ρ_p û_{t−p} + ε_t
3 Compute: (n − p)R² ∼ χ²_p
Decision: Reject H0 if test statistic exceeds chi-square critical value.
Advantages: Handles higher-order autocorrelation and lagged
dependent variables.
Limitation: Lag length p must be specified.
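The three BG steps can be sketched as (illustrative numpy; padding the first p lagged residuals with zeros is one common convention):

```python
import numpy as np

def breusch_godfrey_lm(y, X, p=1):
    """Breusch-Godfrey LM: regress OLS residuals on X plus p lagged
    residuals; (n - p) * R^2 is chi-square with p df under H0
    (no autocorrelation up to order p)."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    lags = np.column_stack([np.concatenate([np.zeros(j), u[:-j]])
                            for j in range(1, p + 1)])  # zero-padded lags
    Z = np.column_stack([X, lags])
    g, *_ = np.linalg.lstsq(Z, u, rcond=None)
    e = u - Z @ g
    r2 = 1.0 - (e @ e) / ((u - u.mean()) @ (u - u.mean()))
    return (n - p) * r2
```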



Diagnosing and Addressing Model Misspecification
Causes of Autocorrelation:
Omitted variables
Incorrect functional form
Before applying corrective methods, check:
Is the autocorrelation pure?
Or due to specification errors?
Remedies for Misspecification:
Include omitted variables
Use correct functional form
If pure autocorrelation:
Use transformations (GLS/FGLS)
Apply methods like Cochrane–Orcutt, Prais–Winsten,
Newey–West
Generalized Least Squares (GLS) Approach

Assume: u_t = ρu_{t−1} + ε_t, −1 < ρ < 1
Transformed Model (if ρ known):

Y_t* = β_0(1 − ρ) + β_1(X_t − ρX_{t−1}) + ε_t

where Y_t* = Y_t − ρY_{t−1}.


When ρ ≈ 1: First-Difference Method

ΔY_t = β_1 ΔX_t + ε_t

Valid under strong autocorrelation (rule of thumb: d < R²).
No intercept; run the regression through the origin.
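With ρ known, the quasi-differencing transform can be sketched as (illustrative numpy; it drops the first observation, as the plain Cochrane-Orcutt transform does):

```python
import numpy as np

def quasi_difference_ols(y, x, rho):
    """GLS via quasi-differencing with known rho: regress
    Y_t - rho*Y_{t-1} on a constant and X_t - rho*X_{t-1}.
    The estimated constant equals beta0*(1 - rho), so divide by
    (1 - rho) to recover beta0; the slope is beta1 directly."""
    ys = y[1:] - rho * y[:-1]
    xs = x[1:] - rho * x[:-1]
    Z = np.column_stack([np.ones(len(xs)), xs])
    a, *_ = np.linalg.lstsq(Z, ys, rcond=None)
    return a[0] / (1.0 - rho), a[1]
```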



Estimating ρ and Iterative Methods
Estimation Methods for ρ:
Approximate from the DW statistic: ρ̂ ≈ 1 − d/2
Regress residuals: û_t = ρû_{t−1} + v_t
Cochrane–Orcutt Iterative Procedure:
1 Run OLS and obtain residuals û_t
2 Estimate ρ from û_t = ρû_{t−1} + v_t
3 Transform the model: regress Y_t* on X_t*
4 Re-estimate the residuals and iterate until ρ̂ converges
Notes:
Feasible GLS (FGLS): uses estimated ρ
BLUE properties hold asymptotically (large samples)
Keep the first observation (e.g., Prais–Winsten) in small samples
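The four Cochrane-Orcutt steps can be sketched as an iteration (illustrative numpy for a single-regressor model; the convergence tolerance is my choice):

```python
import numpy as np

def cochrane_orcutt(y, x, tol=1e-6, max_iter=100):
    """Cochrane-Orcutt: alternate between (a) OLS on the
    quasi-differenced model given the current rho and (b) re-estimating
    rho by regressing residuals on their first lag, until rho converges."""
    rho, b0, b1 = 0.0, 0.0, 0.0
    for _ in range(max_iter):
        ys = y[1:] - rho * y[:-1]
        xs = x[1:] - rho * x[:-1]
        Z = np.column_stack([np.ones(len(xs)), xs])
        a, *_ = np.linalg.lstsq(Z, ys, rcond=None)
        b0, b1 = a[0] / (1.0 - rho), a[1]
        u = y - b0 - b1 * x
        rho_new = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])
        if abs(rho_new - rho) < tol:
            rho = rho_new
            break
        rho = rho_new
    return b0, b1, rho
```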


Newey–West Method for Standard Error Correction

Corrects OLS standard errors for both:
Heteroscedasticity
Autocorrelation
Known as HAC (Heteroscedasticity and Autocorrelation
Consistent) standard errors.
Extension of White’s method:
White: for heteroscedasticity only.
Newey–West: for both issues simultaneously.
Advantages:
Allows continued use of OLS coefficients.
Appropriate in large samples.
Caution: May not perform well in small samples.
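The full Newey-West correction applies to OLS coefficients; as a minimal illustration of the same idea, here is the Bartlett-weighted long-run variance used to get a HAC standard error for a sample mean (my simplification, not the full regression version):

```python
import numpy as np

def newey_west_se_mean(u, lags):
    """HAC (Newey-West) standard error of the sample mean of a series:
    long-run variance = gamma_0 + 2 * sum_j w_j * gamma_j, with Bartlett
    weights w_j = 1 - j/(lags + 1); se = sqrt(lrv / n)."""
    u = np.asarray(u, dtype=float)
    n = len(u)
    d = u - u.mean()
    lrv = (d @ d) / n  # gamma_0
    for j in range(1, lags + 1):
        w = 1.0 - j / (lags + 1.0)
        lrv += 2.0 * w * (d[j:] @ d[:-j]) / n
    return np.sqrt(lrv / n)
```

For positively autocorrelated data the HAC standard error is noticeably larger than the naive s/√n, which is exactly why uncorrected OLS standard errors understate uncertainty.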

