Further, in most cases the whole set of explanatory variables is not known. In econometric research, in particular, it has become a common practice to 'play' with various formulations, experimenting with the specification, that is, by gradually including additional variables and judging their importance by using, among other criteria, their standard errors. For example, suppose that the true relationship is
Y = b₀ + b₁X₁ + b₂X₂ + u
Since the true specification is not known, the researcher usually starts his study with the simpler tentative formulation
Y = a₀ + a₁X₁ + v
This model will most probably yield significant results, although a₁ will be biased, since the model in this case suffers from specification error due to the omission of X₂. The addition of X₂ to the relationship should normally improve the fit, since the model then becomes correctly specified. However, if X₁ and X₂ are highly correlated and their standard errors are large, the researcher will usually reject X₂, being misled by its large standard error. In this case multicollinearity results in the wrong decision and hence in the wrong specification of the model, since X₂ is by assumption (that is, in the postulated model) an important explanatory variable in the relationship. (See Theil, Economic Forecasts and Policy, p. 217. See also Farrar and Glauber, op. cit., p. 94.)
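To make the dilemma concrete, the following sketch (illustrative only; the data, coefficient values and variable names are assumptions, not taken from the text) simulates a true relationship in X₁ and X₂ with strongly collinear regressors, fits the tentative one-variable formulation and then the correctly specified model, and compares coefficients and standard errors.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50

# X1 and X2 built to be strongly collinear (correlation around 0.95)
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + np.sqrt(1 - 0.95 ** 2) * rng.normal(size=n)

# Assumed "true" relationship: Y = b0 + b1*X1 + b2*X2 + u
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(scale=2.0, size=n)

# Tentative formulation Y = a0 + a1*X1 + v: a1 absorbs part of X2's effect
short = sm.OLS(y, sm.add_constant(x1)).fit()

# Correctly specified model: better fit, but inflated standard errors on X1, X2
full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

print(short.params, short.bse)        # a1 biased upward, yet "significant"
print(full.params, full.bse)          # b1, b2 near 2.0 and 1.5, larger std. errors
print(short.rsquared, full.rsquared)  # adding X2 improves the fit
```

In runs of this kind the coefficient of X₂ in the full model can easily appear "insignificant" by the usual t-criterion, which is exactly the temptation to drop it that the text describes.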
However, large standard errors do not always appear even in functions in which the regressors are strongly multicollinear. For example, production functions with overall correlations much in excess of 0.950 have been well estimated with intercorrelations between labour and capital as high as 0.800 to 0.900. If these functions were not well estimated, we would tend to find high sampling errors of the estimated coefficients. Yet by conventional criteria the estimated parameters of most Cobb–Douglas production functions are large relative to their standard errors: the coefficients are generally high multiples of their standard errors. (See L. R. Klein, Introduction to Econometrics, p. …)
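The Cobb–Douglas remark can be illustrated with a hedged simulation; all magnitudes below are hypothetical, chosen only so that the labour–capital intercorrelation falls in the 0.800–0.900 range while the disturbance variance is small.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200

# Hypothetical log-inputs with a labour-capital correlation of roughly 0.86
log_L = rng.normal(loc=4.0, scale=0.6, size=n)
log_K = 0.85 * (log_L - 4.0) + 4.0 + 0.3 * rng.normal(size=n)

# ln Q = ln A + a*ln L + b*ln K + u, with a small disturbance variance
log_Q = 0.5 + 0.7 * log_L + 0.3 * log_K + rng.normal(scale=0.05, size=n)

fit = sm.OLS(log_Q, sm.add_constant(np.column_stack([log_L, log_K]))).fit()
print(np.corrcoef(log_L, log_K)[0, 1])  # strong intercorrelation of the regressors
print(fit.params / fit.bse)             # t-ratios: coefficients many times their s.e.
```

Despite the strong intercorrelation, both elasticities come out as high multiples of their standard errors, because the disturbance variance is small relative to the independent variation left in each regressor.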
Recently S. D. Silvey has published a study on the problem of multicollinearity (S. D. Silvey, 'Multicollinearity and Imprecise Estimation', Journal of the Royal Statistical Society, 1969). We will not examine Silvey's approach here, since it is not substantially superior to Farrar's and Glauber's test, which will be developed in the next section.
11.4. TESTS FOR DETECTING MULTICOLLINEARITY
From the above discussion it should be clear that large standard errors do not always appear with multicollinear variables (see the Cobb–Douglas production functions above), and that large standard errors may arise for other reasons, for example because of weak relationships between the dependent variable and the explanatory variables. Moreover, the intercorrelations of the explanatory variables need not be high for multicollinearity to affect the b's and their standard errors. Thus neither the standard errors nor the intercorrelations provide an adequate criterion by itself: the overall R² may be high (relative to the rₓᵢₓⱼ's) and yet the results may be highly imprecise and insignificant (with wrong signs and/or large standard errors).
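These criteria can at least be inspected side by side. The sketch below is a minimal illustration, not a prescribed test: the helper name collinearity_report and the toy data are assumptions, and it merely tabulates the overall R², the rₓᵢₓⱼ's, and the coefficients with their standard errors so that the combination of criteria can be read off together.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def collinearity_report(y, X):
    """Overall R2, pairwise correlations of the regressors, and the
    coefficients with their standard errors and t-ratios."""
    fit = sm.OLS(y, sm.add_constant(X)).fit()
    table = pd.DataFrame({"coef": fit.params, "std_err": fit.bse, "t": fit.tvalues})
    return fit.rsquared, X.corr(), table

# Toy usage with hypothetical data
rng = np.random.default_rng(2)
X = pd.DataFrame({"x1": rng.normal(size=40)})
X["x2"] = 0.9 * X["x1"] + 0.4 * rng.normal(size=40)
y = 1 + X["x1"] + X["x2"] + rng.normal(size=40)

r2, r_xx, table = collinearity_report(y, X)
print(r2)     # overall fit
print(r_xx)   # the r_xixj's, to be compared against R2
print(table)  # signs, magnitudes, standard errors
```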
However, a combination of all these criteria may help the detection of multicollinearity. In order to gain as much knowledge as possible as to the seriousness of multicollinearity, we suggest the application of the following method, which is in essence a simplified version of Frisch's 'Confluence Analysis' (or 'Bunch-Map Analysis').
The procedure is to regress the dependent variable on each one of the explanatory variables separately. Thus we obtain all the elementary regressions, and we examine their results on the basis of a priori and statistical criteria. We choose the elementary regression which appears to give the most plausible results, on both a priori and statistical criteria. We then gradually insert additional variables and examine their effects on the individual coefficients, on their standard errors, and on the overall R². A new variable is classified as useful, superfluous or detrimental, as follows (a schematic sketch of this stepwise inspection follows the classification below):
(1) If the new variable improves R² without rendering the individual coefficients unacceptable on a priori considerations, the variable is considered useful and is retained as an explanatory variable.
(2) If the new variable does not improve R² and does not affect to any considerable extent the values of the individual coefficients, it is considered as superfluous (and is rejected, that is, not included among the explanatory variables).
(3) If the new variable affects considerably the signs or the values of the coefficients, it is considered as detrimental. If the individual coefficients are affected in such a way as to become unacceptable on theoretical, a priori, grounds, then we may say that this is a warning that multicollinearity is a serious problem. The new variable is important, but because of intercorrelations with the other explanatory variables its influence cannot be assessed statistically by ordinary least squares. This does not mean that we must reject the detrimental variable. If we did so, we would ignore information valuable for the purposes of approaching as best we can the true relationship.
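A rough schematic of the stepwise inspection described above might look as follows. The helper confluence_tables and its interface are assumptions made for illustration; the useful/superfluous/detrimental judgement is deliberately left to the researcher, as the text requires.

```python
import pandas as pd
import statsmodels.api as sm

def confluence_tables(y, X, order=None):
    """y: Series of the dependent variable; X: DataFrame of candidate
    regressors; order: sequence in which variables are inserted
    (defaults to the column order of X)."""
    if order is None:
        order = list(X.columns)
    else:
        order = list(order)

    # Step 1: the elementary regressions of Y on each regressor separately
    elementary = {col: sm.OLS(y, sm.add_constant(X[[col]])).fit()
                  for col in X.columns}

    # Step 2: gradual insertion of additional variables, recording R2,
    # the coefficients and their standard errors at each step
    steps = []
    for k in range(1, len(order) + 1):
        cols = order[:k]
        fit = sm.OLS(y, sm.add_constant(X[cols])).fit()
        steps.append({"included": tuple(cols),
                      "R2": fit.rsquared,
                      "coef": fit.params.to_dict(),
                      "std_err": fit.bse.to_dict()})
    return elementary, pd.DataFrame(steps)
```

One would call confluence_tables(y, X) with the candidate regressors ordered according to the most plausible elementary regression, and then read off how each added variable moves R², the coefficients, and their standard errors before classifying it as useful, superfluous, or detrimental.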