Multiple Regression
Multiple Regression Model
Least Squares Method
Multiple Coefficient of Determination
Model Assumptions
Testing for Significance
Using the Estimated Regression Equation for Estimation and Prediction
Multiple Regression Model
The equation that describes how the dependent variable y is related to the independent variables x1, x2, . . . , xp and an error term is:

y = β0 + β1x1 + β2x2 + . . . + βpxp + ε

where:
β0, β1, β2, . . . , βp are the parameters, and
ε is a random variable called the error term
Multiple Regression Equation

E(y) = β0 + β1x1 + β2x2 + . . . + βpxp

Estimated Multiple Regression Equation

ŷ = b0 + b1x1 + b2x2 + . . . + bpxp

where:
b0, b1, b2, . . . , bp are sample statistics that provide estimates of β0, β1, β2, . . . , βp
Least Squares Method
• Least Squares Criterion

  min Σ(yi - ŷi)²
Solving for the Estimates of β0, β1, β2

[Diagram: the input data x1, x2, y (e.g., 4, 78, 24 and 7, 100, 43) are entered into a computer package for solving multiple regression problems, which outputs the least squares estimates b0, b1, b2, R2, etc.]
SALARY = 3.174 + 1.404(EXPER) + 0.251(SCORE)

b1 = 1.404: salary is expected to increase by $1,404 (1.404 × $1000) for each additional year of experience, holding the aptitude test score constant.

b2 = 0.251: salary is expected to increase by $251 for each additional point scored on the aptitude test, holding years of experience constant.
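As a rough illustration (not part of the original slides), a least squares fit of this form could be obtained with Python and statsmodels; the DataFrame `salary_df` and its column names are assumptions, since the full salary data set is not reproduced here.

```python
# Sketch: least squares estimates for SALARY = b0 + b1(EXPER) + b2(SCORE).
import pandas as pd
import statsmodels.api as sm

def fit_salary_model(df: pd.DataFrame):
    """df is assumed to hold the salary data with columns EXPER, SCORE, SALARY
    (hypothetical names; the full data set is not shown on these slides)."""
    X = sm.add_constant(df[["EXPER", "SCORE"]])   # add the intercept column
    return sm.OLS(df["SALARY"], X).fit()          # minimizes the sum of squared residuals

# Usage (hypothetical): results = fit_salary_model(salary_df)
# results.params would hold b0, b1, b2; the slides report 3.174, 1.404, 0.251.
```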
SST = SSR + SSE

Σ(yi - ȳ)² = Σ(ŷi - ȳ)² + Σ(yi - ŷi)²

where:
SST = total sum of squares
SSR = sum of squares due to regression
SSE = sum of squares due to error
Multiple Coefficient of Determination
R2 = SSR/SST
R2 = 500.3285/599.7855 = .83418
Adjusted Multiple Coefficient of Determination

Ra2 = 1 - (1 - R2)(n - 1)/(n - p - 1)

Ra2 = 1 - (1 - .834179)(20 - 1)/(20 - 2 - 1) = .814671
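To make the arithmetic above concrete, a short check of both formulas using the numbers reported on these slides:

```python
# Verify the slides' R2 and adjusted R2 values.
SSR, SST = 500.3285, 599.7855     # sums of squares from the slides
n, p = 20, 2                      # 20 observations, 2 independent variables

R2 = SSR / SST                                  # multiple coefficient of determination
Ra2 = 1 - (1 - R2) * (n - 1) / (n - p - 1)      # adjusted for the number of variables
print(round(R2, 5), round(Ra2, 6))              # 0.83418 0.814671
```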
Assumptions About the Error Term
The error ε is a random variable with mean of zero.

The variance of ε, denoted by σ², is the same for all values of the independent variables.

The values of ε are independent.

The error ε is a normally distributed random variable reflecting the deviation between the y value and the expected value of y given by β0 + β1x1 + β2x2 + . . . + βpxp.
Testing for Significance
In simple linear regression, the F and t tests provide the same conclusion.

In multiple regression, the F and t tests have different purposes.
Testing for Significance: F Test
The F test is used to determine whether a significant relationship exists between the dependent variable and the set of all the independent variables.

The F test is referred to as the test for overall significance.
Testing for Significance: t Test
If the F test shows an overall significance, the t test is used to determine whether each of the individual independent variables is significant.

A separate t test is conducted for each of the independent variables in the model.

We refer to each of these t tests as a test for individual significance.
Testing for Significance: F Test
Hypotheses:
H0: β1 = β2 = . . . = βp = 0
Ha: One or more of the parameters is not equal to zero.

For the example with two independent variables (p = 2):
H0: β1 = β2 = 0
Ha: One or both of the parameters is not equal to zero.

Testing for Significance: t Test

Hypotheses:
H0: βi = 0
Ha: βi ≠ 0

Test Statistic: t = bi/sbi
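A sketch of how both tests might be read off software output, assuming `results` is a fitted statsmodels OLS results object such as the hypothetical one from the earlier sketch:

```python
# Sketch: overall (F) and individual (t) significance tests from a fitted model.
def significance_tests(results, alpha=0.05):
    """`results` is assumed to be a fitted statsmodels OLS results object."""
    print("F =", results.fvalue, " p-value =", results.f_pvalue)   # test for overall significance
    print(results.tvalues)    # each t_i = b_i / s_bi, one per coefficient
    print(results.pvalues)    # compare each p-value to alpha for individual significance
    return results.f_pvalue < alpha, (results.pvalues < alpha)
```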
Testing for Significance: Multicollinearity

The term multicollinearity refers to the correlation among the independent variables.

When the independent variables are highly correlated (say, |r| > .7), it is not possible to determine the separate effect of any particular independent variable on the dependent variable.
Testing for Significance: Multicollinearity
If the estimated regression equation is to be used only for predictive purposes, multicollinearity is usually not a serious problem.

Every attempt should be made to avoid including independent variables that are highly correlated.
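One practical way to follow this advice, sketched here with pandas (the .7 threshold comes from the slide above; the function name and data frame are assumptions):

```python
# Sketch: flag pairs of independent variables with |r| > .7.
import pandas as pd

def high_correlations(X: pd.DataFrame, threshold: float = 0.7):
    corr = X.corr()
    flagged = []
    cols = list(X.columns)
    for i in range(len(cols)):
        for j in range(i + 1, len(cols)):
            r = corr.iloc[i, j]
            if abs(r) > threshold:
                flagged.append((cols[i], cols[j], round(float(r), 3)))
    return flagged   # pairs of highly correlated independent variables

# Usage (hypothetical): high_correlations(salary_df[["EXPER", "SCORE"]])
```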
Using the Estimated Regression Equation
for Estimation and Prediction
The procedures for estimating the mean value of y and predicting an individual value of y in multiple regression are similar to those in simple regression.

We substitute the given values of x1, x2, . . . , xp into the estimated regression equation and use the corresponding value of ŷ as the point estimate.
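For instance, using the estimated salary equation from the earlier slide, the substitution amounts to nothing more than this (the inputs 4 years and a score of 78 mirror the first input row shown earlier and are used purely for illustration):

```python
# Point estimate from SALARY = 3.174 + 1.404(EXPER) + 0.251(SCORE).
def predicted_salary(exper: float, score: float) -> float:
    return 3.174 + 1.404 * exper + 0.251 * score

# Illustrative inputs: 4 years of experience, aptitude test score of 78.
print(round(predicted_salary(4, 78), 3))   # point estimate of annual salary ($1000s)
```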
Using the Estimated Regression Equation
for Estimation and Prediction
The formulas required to develop interval estimates for the mean value of ŷ and for an individual value of y are beyond the scope of this text.

Software packages for multiple regression will often provide these interval estimates.
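As one example of such software support, statsmodels can return both intervals for a new observation; this is only a sketch and again assumes the hypothetical `results` object from the earlier OLS fit:

```python
# Sketch: confidence and prediction intervals for a new observation.
def salary_intervals(results, exper: float, score: float, alpha: float = 0.05):
    new_x = [[1.0, exper, score]]   # must match the fitted design: constant, x1, x2
    frame = results.get_prediction(new_x).summary_frame(alpha=alpha)
    # mean_ci_*: interval estimate for the mean value of y
    # obs_ci_*:  prediction interval for an individual value of y
    return frame[["mean_ci_lower", "mean_ci_upper", "obs_ci_lower", "obs_ci_upper"]]
```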
Categorical Independent Variables
In many situations we must work with categorical independent variables such as gender (male, female), method of payment (cash, check, credit card), etc.

For example, x2 might represent gender where x2 = 0 indicates male and x2 = 1 indicates female.

In this case, x2 is called a dummy or indicator variable.
Categorical Independent Variables
E(y) = β0 + β1x1 + β2x2 + β3x3

where:
y = annual salary ($1000)
x1 = years of experience
x2 = score on programmer aptitude test
x3 = 0 if individual does not have a graduate degree, 1 if individual does have a graduate degree

x3 is a dummy variable
Categorical Independent Variables
[Regression output not shown; a callout flags the dummy variable x3 as not significant.]
Categorical Independent Variables
If a categorical variable has k levels, k - 1 dummy variables are required, with each dummy variable being coded as 0 or 1.

For example, a variable with levels A, B, and C could be represented by x1 and x2 values of (0, 0) for A, (1, 0) for B, and (0, 1) for C.

Care must be taken in defining and interpreting the dummy variables.
More Complex Categorical Variables
Highest Degree    x1    x2
Bachelor’s         0     0
Master’s           1     0
Ph.D.              0     1
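A small sketch of the k - 1 coding rule using pandas; the column and level names simply mirror the table above.

```python
# Code a 3-level categorical variable (k = 3) with k - 1 = 2 dummy variables.
import pandas as pd

degrees = pd.DataFrame({"Degree": ["Bachelor's", "Master's", "Ph.D."]})
dummies = pd.get_dummies(degrees["Degree"], drop_first=True).astype(int)
print(dummies)
# Bachelor's is the dropped (baseline) level, so the rows come out as
# (0, 0) for Bachelor's, (1, 0) for Master's, and (0, 1) for Ph.D.
```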
Residual Analysis
[Figure: plot of standard residuals against predicted salary.]
Logistic Regression
Interpretation of E(y) as a
Probability in Logistic Regression
If the two values of y are coded as 0 or 1, the value of E(y) provides the probability that y = 1 given a particular set of values for x1, x2, . . . , xp.

In other words, E(y) is an estimate of P(y = 1 | x1, x2, . . . , xp).
Logistic Regression
Estimated Logistic Regression Equation

ŷ = e^(b0 + b1x1 + b2x2 + . . . + bpxp) / (1 + e^(b0 + b1x1 + b2x2 + . . . + bpxp))
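The estimated equation is easy to evaluate directly; a minimal helper (our own naming, not from the slides):

```python
# Evaluate yhat = e^(b0 + b1*x1 + ... + bp*xp) / (1 + e^(b0 + b1*x1 + ... + bp*xp)).
import math

def logistic_yhat(b, x):
    """b = [b0, b1, ..., bp]; x = [x1, ..., xp]."""
    z = b[0] + sum(bi * xi for bi, xi in zip(b[1:], x))
    return math.exp(z) / (1 + math.exp(z))
```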
Example: Simmons Stores

Simmons’ catalogs are expensive and Simmons would like to send them to only those customers who have the highest probability of making a $200 purchase using the discount coupon included in the catalog.

Simmons’ management thinks that annual spending at Simmons Stores and whether a customer has a Simmons credit card are two variables that might be helpful in predicting whether a customer who receives the catalog will use the coupon to make a $200 purchase.
Logistic Regression
Example: Simmons Stores

Simmons conducted a study by sending out 100 catalogs, 50 to customers who have a Simmons credit card and 50 to customers who do not have the card.

At the end of the test period, Simmons noted for each of the 100 customers:
1) the amount the customer spent last year at Simmons,
2) whether the customer had a Simmons credit card, and
3) whether the customer made a $200 purchase.
Logistic Regression
Simmons Test Data (partial)

Customer   Annual Spending ($1000) (x1)   Simmons Credit Card (x2)   $200 Purchase (y)
1          2.291                          1                          0
2          3.215                          1                          0
3          2.135                          1                          0
4          3.924                          0                          0
5          2.528                          1                          0
6          2.473                          0                          1
7          2.384                          0                          0
8          7.076                          0                          0
9          1.182                          1                          1
10         3.345                          0                          0
Logistic Regression
[Logistic regression output; the coefficient table (Predictor, Coef, SE Coef, Z, p, Odds Ratio, 95% CI Lower/Upper) is not reproduced here.]

Log-Likelihood = -60.487
Test that all slopes are zero: G = 13.628, DF = 2, P-Value = 0.001
Logistic Regression
Estimated Logistic Regression Equation

ŷ = e^(-2.1464 + 0.3416x1 + 1.0987x2) / (1 + e^(-2.1464 + 0.3416x1 + 1.0987x2))
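Plugging the Simmons estimates into that equation reproduces the probabilities used in the odds comparison below (x1 is annual spending in $1000s, x2 = 1 if the customer has a Simmons card):

```python
import math

b0, b1, b2 = -2.1464, 0.3416, 1.0987     # estimates from the output above

def p_purchase(spending: float, card: int) -> float:
    z = b0 + b1 * spending + b2 * card
    return math.exp(z) / (1 + math.exp(z))

print(round(p_purchase(2, 1), 4))   # ≈ .4099: spends $2000, has a Simmons card
print(round(p_purchase(2, 0), 4))   # ≈ .1880: spends $2000, no card
```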
Logistic Regression
Hypotheses:
H0: β1 = β2 = 0
Ha: One or both of the parameters is not equal to zero.

Test Statistic: z = bi/sbi

Odds Ratio = odds1/odds0
Logistic Regression
Estimated Probabilities

[Table not shown: estimated probabilities of a $200 purchase for annual spending of $1000 through $7000; one of these values was computed earlier.]
Logistic Regression
Comparing Odds
Suppose we want to compare the odds of making a $200 purchase for customers who spend $2000 annually and have a Simmons credit card to the odds of making a $200 purchase for customers who spend $2000 annually and do not have a Simmons credit card.

estimate of odds1 = .4099/(1 - .4099) = .6946

estimate of odds0 = .1880/(1 - .1880) = .2315

Estimate of odds ratio = .6946/.2315 = 3.00
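The same comparison, written out step by step:

```python
# Odds and odds ratio from the estimated probabilities above.
p1, p0 = 0.4099, 0.1880          # P($200 purchase) with and without the card
odds1 = p1 / (1 - p1)            # ≈ .6946
odds0 = p0 / (1 - p0)            # ≈ .2315
print(round(odds1, 4), round(odds0, 4), round(odds1 / odds0, 2))   # odds ratio ≈ 3.0
```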