Chapter 13
Introduction to Multiple Regression
Copyright © 2016 Pearson Education, Ltd. Chapter 13, Slide 1
Objectives
In this chapter, you learn:
How to develop a multiple regression model
How to interpret the regression coefficients
How to determine which independent variables to
include in the regression model
How to use categorical independent variables in a
regression model
The Multiple Regression Model
Idea: Examine the linear relationship between
1 dependent (Y) & 2 or more independent variables (Xi)
Multiple Regression Model with k Independent Variables:
Yi = β0 + β1X1i + β2X2i + … + βkXki + εi

(β0 = Y-intercept; β1, …, βk = population slopes; εi = random error)
Multiple Regression Equation
The coefficients of the multiple regression model are
estimated using sample data
Multiple regression equation with k independent variables:
Ŷi = b0 + b1X1i + b2X2i + … + bkXki

(Ŷi = estimated or predicted value of Y; b0 = estimated intercept; b1, …, bk = estimated slope coefficients)
In this chapter we will use Excel to obtain the regression
slope coefficients and other regression summary measures.
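Although this chapter uses Excel to obtain the slope coefficients, the same least-squares estimates can be cross-checked by solving the normal equations (X'X)b = X'y directly. A minimal pure-Python sketch (not part of the original slides), using the 15-week pie-sales data introduced later in this chapter:

```python
# Least-squares fit of Sales = b0 + b1*Price + b2*Advertising,
# done by solving the normal equations (X'X) b = X'y in pure Python.
# Data: the 15 weeks of pie sales used throughout this chapter.

sales = [350, 460, 350, 430, 350, 380, 430, 470,
         450, 490, 340, 300, 440, 450, 300]
price = [5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
         7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00]
adv = [3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7,
       3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7]

X = [[1.0, p, a] for p, a in zip(price, adv)]  # column of 1s for the intercept

def least_squares(X, y):
    m = len(X[0])
    # Build A = X'X and c = X'y.
    A = [[sum(r[i] * r[j] for r in X) for j in range(m)] for i in range(m)]
    c = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for i in range(m):
        p = max(range(i, m), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for r in range(i + 1, m):
            f = A[r][i] / A[i][i]
            for j in range(i, m):
                A[r][j] -= f * A[i][j]
            c[r] -= f * c[i]
    # Back-substitution.
    b = [0.0] * m
    for i in reversed(range(m)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, m))) / A[i][i]
    return b

b0, b1, b2 = least_squares(X, sales)
print(f"Sales = {b0:.3f} {b1:+.3f}(Price) {b2:+.3f}(Advertising)")
```

Running this reproduces the coefficients reported in the Excel output shown on the following slides.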
Multiple Regression Equation (continued)
Two variable model
Ŷ = b0 + b1X1 + b2X2

[Figure: the estimated regression plane plotted over the (X1, X2) plane; b1 is the slope for variable X1 and b2 is the slope for variable X2.]
Example: 2 Independent Variables
A distributor of frozen dessert pies wants to
evaluate factors thought to influence demand
Dependent variable: Pie sales (units per week)
Independent variables: Price (in $)
Advertising ($100’s)
Data are collected for 15 weeks
Pie Sales Example
Multiple regression equation:
Sales = b0 + b1 (Price) + b2 (Advertising)

Week   Pie Sales   Price ($)   Advertising ($100s)
  1      350         5.50           3.3
  2      460         7.50           3.3
  3      350         8.00           3.0
  4      430         8.00           4.5
  5      350         6.80           3.0
  6      380         7.50           4.0
  7      430         4.50           3.0
  8      470         6.40           3.7
  9      450         7.00           3.5
 10      490         5.00           4.0
 11      340         7.20           3.5
 12      300         7.90           3.2
 13      440         5.90           4.0
 14      450         5.00           3.5
 15      300         7.00           2.7
Excel Multiple Regression Output
Regression Statistics
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

Sales = 306.526 – 24.975(Price) + 74.131(Advertising)

ANOVA           df          SS          MS         F   Significance F
Regression       2   29460.027   14730.013   6.53861          0.01201
Residual        12   27033.306    2252.776
Total           14   56493.333

              Coefficients   Standard Error    t Stat   P-value   Lower 95%   Upper 95%
Intercept        306.52619        114.25389   2.68285   0.01993    57.58835   555.46404
Price            -24.97509         10.83213  -2.30565   0.03979   -48.57626    -1.37392
Advertising       74.13096         25.96732   2.85478   0.01449    17.55303   130.70888
The Multiple Regression Equation
Sales = 306.526 – 24.975(Price) + 74.131(Advertising)

where
  Sales is in number of pies per week
  Price is in $
  Advertising is in $100’s.

b1 = -24.975: sales will decrease, on average, by 24.975 pies per week
for each $1 increase in selling price, net of the effects of changes
due to advertising.

b2 = 74.131: sales will increase, on average, by 74.131 pies per week
for each $100 increase in advertising, net of the effects of changes
due to price.
Using The Equation to Make Predictions
Predict sales for a week in which the selling
price is $5.50 and advertising is $350:
Sales = 306.526 – 24.975(Price) + 74.131(Advertising)
      = 306.526 – 24.975(5.50) + 74.131(3.5)
      = 428.62

Note that Advertising is in $100s, so $350 means that X2 = 3.5.
Predicted sales is 428.62 pies.
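The arithmetic above can be checked in a few lines of Python (a sketch using the rounded coefficients from the Excel output):

```python
# Predicted pie sales from the fitted regression equation.
b0, b1, b2 = 306.526, -24.975, 74.131
price = 5.50        # selling price in $
advertising = 3.5   # $350 of advertising, expressed in $100s
predicted = b0 + b1 * price + b2 * advertising
print(round(predicted, 2))  # 428.62
```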
Predictions in Excel using PHStat
PHStat | regression | multiple regression …
Check the “confidence and prediction interval estimates” box.
Predictions in PHStat (continued)
The output shows the input values, the predicted Y value, a confidence
interval for the mean value of Y given these X values, and a prediction
interval for an individual Y value given these X values.
The Coefficient of Multiple
Determination, r2
Reports the proportion of total variation in Y
explained by all X variables taken together
r² = SSR / SST = regression sum of squares / total sum of squares
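Using the ANOVA values from the Excel output that follows, this ratio can be verified directly (a sketch with the sums of squares copied from the output):

```python
# Coefficient of multiple determination from the ANOVA sums of squares.
SSR = 29460.027   # regression sum of squares
SST = 56493.333   # total sum of squares
r2 = SSR / SST
print(round(r2, 5))  # 0.52148
```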
Multiple Coefficient of Determination In Excel
Regression Statistics
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

r² = SSR / SST = 29460.0 / 56493.3 = .52148

52.1% of the variation in pie sales is explained by the variation in
price and advertising.

ANOVA           df          SS          MS         F   Significance F
Regression       2   29460.027   14730.013   6.53861          0.01201
Residual        12   27033.306    2252.776
Total           14   56493.333

              Coefficients   Standard Error    t Stat   P-value   Lower 95%   Upper 95%
Intercept        306.52619        114.25389   2.68285   0.01993    57.58835   555.46404
Price            -24.97509         10.83213  -2.30565   0.03979   -48.57626    -1.37392
Advertising       74.13096         25.96732   2.85478   0.01449    17.55303   130.70888
Adjusted r2
r2 never decreases when a new X variable is
added to the model
This can be a disadvantage when comparing
models
What is the net effect of adding a new variable?
We lose a degree of freedom when a new X
variable is added
Did the new X variable add enough
explanatory power to offset the loss of one
degree of freedom?
Adjusted r2 (continued)
Shows the proportion of variation in Y explained by
all X variables adjusted for the number of X
variables used
r²adj = 1 – (1 – r²) [(n – 1) / (n – k – 1)]

(where n = sample size, k = number of independent variables)
Penalizes excessive use of unimportant independent
variables
Smaller than r2
Useful in comparing among models
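This adjustment can be computed from the ANOVA values (a sketch; SSE and SST are taken from the Excel output):

```python
# Adjusted r-squared, penalizing for the number of predictors.
SSE, SST = 27033.306, 56493.333   # error and total sums of squares
n, k = 15, 2                      # sample size, number of X variables
r2 = 1 - SSE / SST
r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(r2, 5), round(r2_adj, 5))  # 0.52148 0.44172
```

Note that r²adj (0.44172) is smaller than r² (0.52148), as expected.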
Adjusted r2 in Excel
Regression Statistics
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

r²adj = .44172

44.2% of the variation in pie sales is explained by the variation in
price and advertising, taking into account the sample size and number
of independent variables.

ANOVA           df          SS          MS         F   Significance F
Regression       2   29460.027   14730.013   6.53861          0.01201
Residual        12   27033.306    2252.776
Total           14   56493.333

              Coefficients   Standard Error    t Stat   P-value   Lower 95%   Upper 95%
Intercept        306.52619        114.25389   2.68285   0.01993    57.58835   555.46404
Price            -24.97509         10.83213  -2.30565   0.03979   -48.57626    -1.37392
Advertising       74.13096         25.96732   2.85478   0.01449    17.55303   130.70888
Is the Model Significant?
F Test for Overall Significance of the Model
Shows if there is a linear relationship between all of the X
variables considered together and Y
Use F-test statistic
Hypotheses:
H0: β1 = β2 = … = βk = 0 (no linear relationship)
H1: at least one βi ≠ 0 (at least one independent
variable affects Y)
F Test for Overall Significance
Test statistic:
FSTAT = MSR / MSE = (SSR / k) / [SSE / (n – k – 1)]

where FSTAT has numerator d.f. = k and denominator d.f. = (n – k – 1)
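For the pie-sales model, the statistic can be computed from the ANOVA sums of squares (a sketch; the critical value 3.885 is read from an F table, as on the slide that follows):

```python
# Overall F test statistic for the pie-sales regression.
SSR, SSE = 29460.027, 27033.306
n, k = 15, 2
MSR = SSR / k              # mean square regression, df = k
MSE = SSE / (n - k - 1)    # mean square error, df = n - k - 1
F = MSR / MSE
print(round(F, 4))         # 6.5386
print(F > 3.885)           # True: exceeds F(0.05; 2, 12), so reject H0
```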
F Test for Overall Significance In Excel
(continued)
Regression Statistics
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

FSTAT = MSR / MSE = 14730.0 / 2252.8 = 6.5386, with 2 and 12 degrees of
freedom. The Significance F value (0.01201) is the p-value for the F test.

ANOVA           df          SS          MS         F   Significance F
Regression       2   29460.027   14730.013   6.53861          0.01201
Residual        12   27033.306    2252.776
Total           14   56493.333

              Coefficients   Standard Error    t Stat   P-value   Lower 95%   Upper 95%
Intercept        306.52619        114.25389   2.68285   0.01993    57.58835   555.46404
Price            -24.97509         10.83213  -2.30565   0.03979   -48.57626    -1.37392
Advertising       74.13096         25.96732   2.85478   0.01449    17.55303   130.70888
F Test for Overall Significance (continued)
H0: β1 = β2 = 0
H1: β1 and β2 not both zero
α = .05, df1 = 2, df2 = 12

Test statistic: FSTAT = MSR / MSE = 6.5386
Critical value: F0.05 = 3.885

Decision: Since the FSTAT test statistic falls in the rejection region
(6.5386 > 3.885, p-value .0120 < .05), reject H0.

Conclusion: There is evidence that at least one independent variable
affects Y.

[Figure: F distribution with the α = .05 rejection region to the right
of F0.05 = 3.885.]
Multiple Regression Assumptions
Errors (residuals) from the regression model:
ei = (Yi – Ŷi)
Assumptions:
The errors are normally distributed
Errors have a constant variance
The model errors are independent
Residual Plots Used in Multiple Regression
These residual plots are used in multiple
regression:
Residuals vs. Ŷi
Residuals vs. X1i
Residuals vs. X2i
Residuals vs. time (if time series data)
Use the residual plots to check for
violations of regression assumptions
Are Individual Variables Significant?
Use t tests of individual variable slopes
Shows if there is a linear relationship between
the variable Xj and Y holding constant the
effects of other X variables
Hypotheses:
H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (linear relationship does exist
between Xj and Y)
Are Individual Variables Significant?
(continued)
H0: βj = 0 (no linear relationship between Xj and Y)
H1: βj ≠ 0 (linear relationship does exist
between Xj and Y)
Test Statistic:

tSTAT = (bj – 0) / Sbj      (df = n – k – 1)
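Applied to the pie-sales output, each t statistic is the coefficient divided by its standard error (a sketch with values copied from the Excel coefficient table):

```python
# t statistics for the individual slopes.
b_price, s_price = -24.97509, 10.83213   # Price coefficient and its SE
b_adv, s_adv = 74.13096, 25.96732        # Advertising coefficient and its SE
t_price = (b_price - 0) / s_price
t_adv = (b_adv - 0) / s_adv
print(round(t_price, 3), round(t_adv, 3))  # -2.306 2.855
```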
Are Individual Variables Significant? Excel Output (continued)
Regression Statistics
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

The t Stat for Price is tSTAT = -2.306, with p-value .0398.
The t Stat for Advertising is tSTAT = 2.855, with p-value .0145.

ANOVA           df          SS          MS         F   Significance F
Regression       2   29460.027   14730.013   6.53861          0.01201
Residual        12   27033.306    2252.776
Total           14   56493.333

              Coefficients   Standard Error    t Stat   P-value   Lower 95%   Upper 95%
Intercept        306.52619        114.25389   2.68285   0.01993    57.58835   555.46404
Price            -24.97509         10.83213  -2.30565   0.03979   -48.57626    -1.37392
Advertising       74.13096         25.96732   2.85478   0.01449    17.55303   130.70888
Inferences about the Slope:
t Test Example
From the Excel output:
  For Price, tSTAT = -2.306, with p-value .0398
  For Advertising, tSTAT = 2.855, with p-value .0145

H0: βj = 0    H1: βj ≠ 0
d.f. = 15 – 2 – 1 = 12
α = .05, tα/2 = 2.1788

The test statistic for each variable falls in the rejection region
(both p-values < .05).

Decision: Reject H0 for each variable.

Conclusion: There is evidence that both Price and Advertising affect
pie sales at α = .05.

[Figure: t distribution with two-tailed rejection regions (α/2 = .025 in
each tail) beyond ±2.1788.]
Confidence Interval Estimate for the Slope
Confidence interval for the population slope βj
bj ± tα/2 Sbj      where t has (n – k – 1) d.f.

              Coefficients   Standard Error
Intercept        306.52619        114.25389
Price            -24.97509         10.83213
Advertising       74.13096         25.96732

Here, t has (15 – 2 – 1) = 12 d.f.
Example: Form a 95% confidence interval for the effect of changes in
price (X1) on pie sales:
-24.975 ± (2.1788)(10.832)
So the interval is (-48.576 , -1.374)
(This interval does not contain zero, so price has a significant effect on sales)
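The interval above can be reproduced in a couple of lines (a sketch; the critical value 2.1788 is t(.025, 12) from a t table, as on the slide):

```python
# 95% confidence interval for the Price slope, df = 12.
b1, s_b1 = -24.975, 10.832
t_crit = 2.1788
lower = b1 - t_crit * s_b1
upper = b1 + t_crit * s_b1
print(round(lower, 3), round(upper, 3))  # -48.576 -1.374
```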
Confidence Interval Estimate for the Slope
(continued)
Confidence interval for the population slope βj
              Coefficients   Standard Error   …   Lower 95%   Upper 95%
Intercept        306.52619        114.25389   …    57.58835   555.46404
Price            -24.97509         10.83213   …   -48.57626    -1.37392
Advertising       74.13096         25.96732   …    17.55303   130.70888
Example: Excel output also reports these interval endpoints:
Weekly sales are estimated to be reduced by between 1.37 and 48.58 pies
for each $1 increase in the selling price, holding the effect of
advertising constant.
Using Dummy Variables
A dummy variable is a categorical independent
variable with two levels:
yes or no, on or off, male or female
coded as 0 or 1
Assumes the slopes associated with numerical
independent variables do not change with the
value for the categorical variable
If more than two levels, the number of dummy
variables needed is (number of levels - 1)
Dummy-Variable Example
(with 2 Levels)
Ŷ = b0 + b1X1 + b2X2
Let:
Y = pie sales
X1 = price
X2 = holiday (X2 = 1 if a holiday occurred during the week)
(X2 = 0 if there was no holiday that week)
Dummy-Variable Example
(with 2 Levels) (continued)
Ŷ = b0 + b1X1 + b2(1) = (b0 + b2) + b1X1    Holiday
Ŷ = b0 + b1X1 + b2(0) = b0 + b1X1           No Holiday

The two lines have different intercepts but the same slope.

If H0: β2 = 0 is rejected, then “Holiday” has a significant effect on
pie sales.

[Figure: sales (Y) vs. price (X1); two parallel lines, with the Holiday
line (X2 = 1, intercept b0 + b2) above the No Holiday line (X2 = 0,
intercept b0).]
Interpreting the Dummy Variable
Coefficient (with 2 Levels)
Example: Sales = 300 – 30(Price) + 15(Holiday)

Sales: number of pies sold per week
Price: pie price in $
Holiday: 1 if a holiday occurred during the week, 0 if no holiday occurred
b2 = 15: on average, sales were 15 pies greater in
weeks with a holiday than in weeks without a
holiday, given the same price
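The dummy coefficient can be read off as the gap between the two lines at any fixed price (a sketch using the example equation above, whose coefficients are illustrative values, not estimates from the pie-sales data):

```python
# Effect of the Holiday dummy in Sales = 300 - 30(Price) + 15(Holiday).
def predicted_sales(price, holiday):
    return 300 - 30 * price + 15 * holiday

# At any fixed price, a holiday week shifts predicted sales up by b2 = 15.
print(predicted_sales(6, 1) - predicted_sales(6, 0))  # 15
```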
Interaction Between Independent Variables
Hypothesizes interaction between pairs of X
variables
Response to one X variable may vary at different
levels of another X variable
Contains two-way cross product terms
Ŷ = b0 + b1X1 + b2X2 + b3X3
  = b0 + b1X1 + b2X2 + b3(X1X2)
Effect of Interaction
Given: Y = β0 + β1X1 + β2X2 + β3X1X2 + ε
Without interaction term, effect of X1 on Y is
measured by β1
With interaction term, effect of X1 on Y is
measured by β1 + β3 X2
Effect changes as X2 changes
Interaction Example
Suppose X2 is a dummy variable and the estimated
regression equation is Ŷ = 1 + 2X1 + 3X2 + 4X1X2
If X2 = 1:  Ŷ = 1 + 2X1 + 3(1) + 4X1(1) = 4 + 6X1
If X2 = 0:  Ŷ = 1 + 2X1 + 3(0) + 4X1(0) = 1 + 2X1

[Figure: both lines plotted for X1 from 0 to 1.5; the X2 = 1 line is steeper.]

Slopes are different if the effect of X1 on Y depends on X2 value
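A short sketch using the estimated equation from this example confirms that the slope of X1 changes with X2:

```python
# Estimated equation with interaction: Yhat = 1 + 2*X1 + 3*X2 + 4*X1*X2
def y_hat(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope of X1 at a given X2 is b1 + b3*X2: a unit change in X1 adds
# 2 to Yhat when X2 = 0, but 6 when X2 = 1.
slope_when_x2_is_0 = y_hat(1, 0) - y_hat(0, 0)   # 2
slope_when_x2_is_1 = y_hat(1, 1) - y_hat(0, 1)   # 6
print(slope_when_x2_is_0, slope_when_x2_is_1)
```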
Chapter Summary
In this chapter we discussed:
How to develop a multiple regression model
How to interpret the regression coefficients
How to determine which independent variables to
include in the regression model
How to use categorical independent variables in a
regression model