Ch. 4: Mathematical Expectation
Mohammad Adam & Ruba Hamouri
Palestine Polytechnic University
February 7, 2021
M. Adam & R. Hamouri (PPU) PS Ch.4 February 7, 2021 1 / 24
4.1 Mean of a Random Variable

Definition 4.1
Let X be a random variable with probability distribution f(x). The mean, or expected value, of X is

    µ = E(X) = Σ_x x f(x)

if X is discrete, and

    µ = E(X) = ∫_{−∞}^{∞} x f(x) dx

if X is continuous.
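Both formulas are straightforward to evaluate numerically. A minimal Python sketch (the fair-die and uniform-density examples below are illustrative, not from the slides):

```python
from fractions import Fraction

def mean_discrete(pmf):
    """E(X) = sum over x of x*f(x), for a pmf given as {x: f(x)}."""
    return sum(x * p for x, p in pmf.items())

def mean_continuous(f, a, b, n=100_000):
    """E(X) = integral of x*f(x) dx over [a, b], midpoint rule."""
    h = (b - a) / n
    return sum((a + (k + 0.5) * h) * f(a + (k + 0.5) * h) * h for k in range(n))

# Hypothetical discrete example: a fair die, so E(X) = 7/2.
die = {x: Fraction(1, 6) for x in range(1, 7)}
mu_die = mean_discrete(die)

# Hypothetical continuous example: uniform density on (0, 1), so E(X) = 1/2.
mu_unif = mean_continuous(lambda x: 1.0, 0.0, 1.0)
```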
Example 4.1
A lot containing 7 components is sampled by a quality inspector; the lot contains 4 good components and 3 defective components. A sample of 3 is taken by the inspector. Find the expected value of the number of good components in this sample.

Example 4.2
In a gambling game, a man is paid $5 if he gets all heads or all tails when 3 coins are tossed, and he will pay $3 if either one or two heads show. What is his expected gain?
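Both examples reduce to the discrete formula µ = Σ_x x f(x); a quick check with exact arithmetic:

```python
from fractions import Fraction
from math import comb

# Example 4.1: number of good components in a sample of 3 from 4 good + 3 defective,
# so f(x) = C(4, x) C(3, 3 - x) / C(7, 3) for x = 0, 1, 2, 3.
f = {x: Fraction(comb(4, x) * comb(3, 3 - x), comb(7, 3)) for x in range(4)}
mu_good = sum(x * p for x, p in f.items())    # 12/7 ≈ 1.7 good components

# Example 4.2: win $5 with probability 2/8 (HHH or TTT), lose $3 with probability 6/8.
mu_gain = 5 * Fraction(2, 8) - 3 * Fraction(6, 8)   # expected gain of -$1
```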
Example 4.3
Let X be the random variable that denotes the life in hours of a certain electronic device. The probability density function is

    f(x) = 20,000/x³ for x > 100, and f(x) = 0 elsewhere.

Find the expected life of this type of device.
Theorem 4.1
Let X be a random variable with probability distribution f(x). The expected value of the random variable g(X) is

    µ_{g(X)} = E[g(X)] = Σ_x g(x) f(x)

if X is discrete, and

    µ_{g(X)} = E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx

if X is continuous.
Example 4.4
Suppose that the number of cars X that pass through a car wash between 4:00 P.M. and 5:00 P.M. on any sunny Friday has the following probability distribution:

    x          4     5     6     7     8     9
    P(X = x)   1/12  1/12  1/4   1/4   1/6   1/6

Let g(X) = 2X − 1 represent the amount of money, in dollars, paid to the attendant by the manager. Find the attendant's expected earnings for this particular time period.
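Applying Theorem 4.1 directly, E[g(X)] = Σ_x (2x − 1) f(x); exact arithmetic with fractions avoids rounding:

```python
from fractions import Fraction

# P(X = x) for cars passing through the car wash (Example 4.4).
pmf = {4: Fraction(1, 12), 5: Fraction(1, 12),
       6: Fraction(1, 4),  7: Fraction(1, 4),
       8: Fraction(1, 6),  9: Fraction(1, 6)}
assert sum(pmf.values()) == 1            # sanity check: probabilities sum to 1

# E[g(X)] with g(x) = 2x - 1 (Theorem 4.1).
earnings = sum((2 * x - 1) * p for x, p in pmf.items())   # 38/3 ≈ $12.67
```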
Example 4.5
Let X be a random variable with density function

    f(x) = x²/3 for −1 < x < 2, and f(x) = 0 elsewhere.

Find the expected value of g(X) = 4X + 3.
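By the continuous case of Theorem 4.1, E(4X + 3) = ∫_{−1}^{2} (4x + 3) x²/3 dx = (1/3)[x⁴ + x³]_{−1}^{2} = 8. A midpoint-rule check:

```python
def f(x):
    """Density of Example 4.5: x^2/3 on (-1, 2), else 0."""
    return x * x / 3.0 if -1.0 < x < 2.0 else 0.0

n = 100_000
a, b = -1.0, 2.0
h = (b - a) / n
# E[g(X)] = integral of (4x + 3) f(x) dx over (-1, 2).
e_g = sum((4 * (a + (k + 0.5) * h) + 3) * f(a + (k + 0.5) * h) * h for k in range(n))
# Exact value: 8.
```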
Definition 4.2
Let X and Y be random variables with joint probability distribution f(x, y). The mean, or expected value, of the random variable g(X, Y) is

    µ_{g(X,Y)} = E[g(X, Y)] = Σ_x Σ_y g(x, y) f(x, y)

if X and Y are discrete, and

    µ_{g(X,Y)} = E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy

if X and Y are continuous.

Example 4.6
Let X and Y be the random variables with joint probability distribution indicated in Table 3.1 on page 96. Find the expected value of g(X, Y) = XY.
Example 4.7
Find E(Y/X) for the density function

    f(x, y) = x(1 + 3y²)/4 for 0 < x < 2, 0 < y < 1, and f(x, y) = 0 elsewhere.
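This is a double integral of (y/x) f(x, y) over the rectangle 0 < x < 2, 0 < y < 1; it works out to 5/8. A midpoint-rule sketch:

```python
def f(x, y):
    """Joint density of Example 4.7."""
    return x * (1 + 3 * y * y) / 4.0 if (0 < x < 2 and 0 < y < 1) else 0.0

nx, ny = 400, 400
hx, hy = 2.0 / nx, 1.0 / ny
# E(Y/X) = double integral of (y/x) f(x, y) dx dy.
val = sum((y / x) * f(x, y) * hx * hy
          for i in range(nx) for j in range(ny)
          for x in [(i + 0.5) * hx] for y in [(j + 0.5) * hy])
# Exact value: 5/8 = 0.625.
```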
Remarks
Note that if g(X, Y) = X in Definition 4.2, we have

    E(X) = Σ_x Σ_y x f(x, y) = Σ_x x g(x)   (discrete case),
    E(X) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f(x, y) dy dx = ∫_{−∞}^{∞} x g(x) dx   (continuous case),

where g(x) is the marginal distribution of X. Therefore, in calculating E(X) over a two-dimensional space, one may use either the joint probability distribution of X and Y or the marginal distribution of X. Similarly, we define

    E(Y) = Σ_x Σ_y y f(x, y) = Σ_y y h(y)   (discrete case),
    E(Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} y f(x, y) dx dy = ∫_{−∞}^{∞} y h(y) dy   (continuous case),

where h(y) is the marginal distribution of the random variable Y.
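The equivalence of the joint-distribution and marginal-distribution routes is easy to verify on a small discrete table (the joint pmf below is a hypothetical illustration, not from the slides):

```python
from fractions import Fraction

# Hypothetical joint pmf f(x, y) with x in {0, 1}, y in {0, 1, 2}.
joint = {(0, 0): Fraction(1, 8), (0, 1): Fraction(1, 4), (0, 2): Fraction(1, 8),
         (1, 0): Fraction(1, 8), (1, 1): Fraction(1, 8), (1, 2): Fraction(1, 4)}
assert sum(joint.values()) == 1

# E(X) computed from the joint distribution ...
ex_joint = sum(x * p for (x, y), p in joint.items())

# ... equals E(X) from the marginal g(x) = sum over y of f(x, y).
g = {}
for (x, y), p in joint.items():
    g[x] = g.get(x, Fraction(0)) + p
ex_marginal = sum(x * px for x, px in g.items())

assert ex_joint == ex_marginal
```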
4.2 Variance and Covariance of Random Variables

Definition 4.3
Let X be a random variable with probability distribution f(x) and mean µ. The variance of X is

    σ² = E[(X − µ)²] = Σ_x (x − µ)² f(x)   if X is discrete, and
    σ² = E[(X − µ)²] = ∫_{−∞}^{∞} (x − µ)² f(x) dx   if X is continuous.

The positive square root of the variance, σ, is called the standard deviation of X.
Example 4.8
Let the random variable X represent the number of automobiles that are used for official business purposes on any given workday. The probability distribution for company A is

    x      1    2    3
    f(x)   0.3  0.4  0.3

and that for company B is

    x      0    1    2    3    4
    f(x)   0.2  0.1  0.3  0.3  0.1

Show that the variance of the probability distribution for company B is greater than that for company A.

Theorem 4.2
The variance of a random variable X is

    σ² = E(X²) − µ².
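Using the Theorem 4.2 shortcut, the two variances come out to 0.6 for company A and 1.6 for company B:

```python
def variance(pmf):
    """sigma^2 = E(X^2) - mu^2 (Theorem 4.2), for a pmf given as {x: f(x)}."""
    mu = sum(x * p for x, p in pmf.items())
    return sum(x * x * p for x, p in pmf.items()) - mu * mu

var_a = variance({1: 0.3, 2: 0.4, 3: 0.3})                  # 0.6
var_b = variance({0: 0.2, 1: 0.1, 2: 0.3, 3: 0.3, 4: 0.1})  # 1.6
assert var_b > var_a   # company B's distribution is more spread out
```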
Example 4.9
Let the random variable X represent the number of defective parts for a machine when 3 parts are sampled from a production line and tested. The following is the probability distribution of X:

    x      0     1     2     3
    f(x)   0.51  0.38  0.10  0.01

Using Theorem 4.2, calculate σ².

Example 4.10
The weekly demand for a drinking-water product, in thousands of liters, from a local chain of efficiency stores is a continuous random variable X having the probability density

    f(x) = 2(x − 1) for 1 < x < 2, and f(x) = 0 elsewhere.

Find the mean and variance of X.
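Both examples can be checked numerically; the discrete case gives σ² = 0.87 − 0.61² = 0.4979, and the continuous case gives µ = 5/3 and σ² = 1/18:

```python
# Example 4.9 (discrete): sigma^2 = E(X^2) - mu^2.
pmf = {0: 0.51, 1: 0.38, 2: 0.10, 3: 0.01}
mu9 = sum(x * p for x, p in pmf.items())                    # 0.61
var9 = sum(x * x * p for x, p in pmf.items()) - mu9 * mu9   # 0.4979

# Example 4.10 (continuous): f(x) = 2(x - 1) on (1, 2), midpoint rule.
n = 100_000
h = 1.0 / n
xs = [1.0 + (k + 0.5) * h for k in range(n)]
mu10 = sum(x * 2 * (x - 1) * h for x in xs)                  # 5/3
var10 = sum((x - mu10) ** 2 * 2 * (x - 1) * h for x in xs)   # 1/18
```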
Theorem 4.3
Let X be a random variable with probability distribution f(x). The variance of the random variable g(X) is

    σ²_{g(X)} = E{[g(X) − µ_{g(X)}]²} = Σ_x [g(x) − µ_{g(X)}]² f(x)   if X is discrete, and
    σ²_{g(X)} = E{[g(X) − µ_{g(X)}]²} = ∫_{−∞}^{∞} [g(x) − µ_{g(X)}]² f(x) dx   if X is continuous.

Example 4.11
Calculate the variance of g(X) = 2X + 3, where X is a random variable with probability distribution

    x      0    1    2    3
    f(x)   1/4  1/8  1/2  1/8

Example 4.12
Let X be a random variable having the density function given in Example 4.5 on page 115. Find the variance of the random variable g(X) = 4X + 3.
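For Example 4.11, one first finds µ_{g(X)} = E(2X + 3) = 6, then the variance Σ_x (2x + 3 − 6)² f(x) = 4; exact arithmetic confirms this:

```python
from fractions import Fraction

# Distribution of X from Example 4.11.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 8), 2: Fraction(1, 2), 3: Fraction(1, 8)}
g = lambda x: 2 * x + 3

mu_g = sum(g(x) * p for x, p in pmf.items())                   # 6
var_g = sum((g(x) - mu_g) ** 2 * p for x, p in pmf.items())    # 4
```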
Definition 4.4
Let X and Y be random variables with joint probability distribution f(x, y). The covariance of X and Y is

    σ_XY = E[(X − µ_X)(Y − µ_Y)] = Σ_x Σ_y (x − µ_X)(y − µ_Y) f(x, y)

if X and Y are discrete, and

    σ_XY = E[(X − µ_X)(Y − µ_Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − µ_X)(y − µ_Y) f(x, y) dx dy

if X and Y are continuous.

Theorem 4.4
The covariance of two random variables X and Y with means µ_X and µ_Y, respectively, is given by

    σ_XY = E(XY) − µ_X µ_Y.
Example 4.14
For the random variables X and Y in Example 3.14, find the covariance of X and Y.

Example 4.15
The fraction X of male runners and the fraction Y of female runners who compete in marathon races are described by the joint density function

    f(x, y) = 8xy for 0 ≤ y ≤ x ≤ 1, and f(x, y) = 0 elsewhere.

Find the covariance of X and Y.
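For the marathon density, µ_X = 4/5, µ_Y = 8/15, E(XY) = 4/9, so by Theorem 4.4 σ_XY = 4/9 − (4/5)(8/15) = 4/225. A numeric sketch using iterated midpoint-rule integrals that respect the triangular region 0 ≤ y ≤ x ≤ 1:

```python
# Joint density f(x, y) = 8xy on the triangle 0 <= y <= x <= 1 (Example 4.15).
n = 400

def e(g):
    """E[g(X, Y)] = integral over 0<x<1 of (integral over 0<y<x of g*8xy dy) dx."""
    total, hx = 0.0, 1.0 / n
    for i in range(n):
        x = (i + 0.5) * hx
        hy = x / n                      # inner grid covers 0 < y < x exactly
        for j in range(n):
            y = (j + 0.5) * hy
            total += g(x, y) * 8 * x * y * hy * hx
    return total

mu_x = e(lambda x, y: x)                       # 4/5
mu_y = e(lambda x, y: y)                       # 8/15
cov = e(lambda x, y: x * y) - mu_x * mu_y      # sigma_XY = 4/225
```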
Definition 4.5
Let X and Y be random variables with covariance σ_XY and standard deviations σ_X and σ_Y, respectively. The correlation coefficient of X and Y is

    ρ_XY = σ_XY / (σ_X σ_Y).

Example 4.16
Find the correlation coefficient of X and Y (a) in Example 4.13, and (b) in Example 4.14.
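The correlation coefficient always lies in [−1, 1], with the extremes reached under exact linear dependence. A small sketch with hypothetical joint pmfs (not from the slides) makes this concrete: Y = X forces ρ = 1, and a perfectly anti-aligned pmf forces ρ = −1:

```python
from fractions import Fraction
from math import sqrt

def corr(joint):
    """rho_XY = sigma_XY / (sigma_X sigma_Y) for a discrete joint pmf {(x, y): p}."""
    ex = sum(x * p for (x, y), p in joint.items())
    ey = sum(y * p for (x, y), p in joint.items())
    vx = sum(x * x * p for (x, y), p in joint.items()) - ex * ex
    vy = sum(y * y * p for (x, y), p in joint.items()) - ey * ey
    cov = sum(x * y * p for (x, y), p in joint.items()) - ex * ey
    return cov / sqrt(vx * vy)

same = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}   # Y = X, so rho = 1
opp  = {(0, 1): Fraction(1, 2), (1, 0): Fraction(1, 2)}   # Y = 1 - X, so rho = -1
```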
4.3 Means and Variances of Linear Combinations of Random Variables

Theorem 4.5
If a and b are constants, then

    E(aX + b) = aE(X) + b.

Corollary 4.1 & Corollary 4.2
Setting a = 0, we see that E(b) = b.
Setting b = 0, we see that E(aX) = aE(X).

Example 4.17
Applying Theorem 4.5 to the discrete random variable f(X) = 2X − 1, rework Example 4.4 on page 115.

Example 4.18
Applying Theorem 4.5 to the continuous random variable g(X) = 4X + 3, rework Example 4.5 on page 115.
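In the spirit of Example 4.18, the linearity property can be verified numerically for the Example 4.5 density f(x) = x²/3 on (−1, 2): computing E(4X + 3) directly and as 4E(X) + 3 gives the same answer, 8 (since E(X) = 5/4):

```python
# f(x) = x^2/3 on (-1, 2); check E(4X + 3) = 4 E(X) + 3 (Theorem 4.5).
n = 100_000
a, b = -1.0, 2.0
h = (b - a) / n
xs = [a + (k + 0.5) * h for k in range(n)]
f = lambda x: x * x / 3.0

ex = sum(x * f(x) * h for x in xs)              # E(X) = 5/4
lhs = sum((4 * x + 3) * f(x) * h for x in xs)   # E(4X + 3), computed directly
assert abs(lhs - (4 * ex + 3)) < 1e-9           # Theorem 4.5 holds
```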
Theorem 4.6
The expected value of the sum or difference of two or more functions of a random variable X is the sum or difference of the expected values of the functions. That is,

    E[g(X) ± h(X)] = E[g(X)] ± E[h(X)].

Example 4.19
Let X be a random variable with probability distribution as follows:

    x      0    1    2    3
    f(x)   1/3  1/2  0    1/6

Find the expected value of Y = (X − 1)².
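Expanding (X − 1)² = X² − 2X + 1 and applying Theorem 4.6 gives E(X²) − 2E(X) + 1; both routes yield 1:

```python
from fractions import Fraction

# Distribution of X from Example 4.19.
pmf = {0: Fraction(1, 3), 1: Fraction(1, 2), 2: Fraction(0), 3: Fraction(1, 6)}

# Direct computation of E[(X - 1)^2] ...
direct = sum((x - 1) ** 2 * p for x, p in pmf.items())

# ... and term by term via Theorem 4.6: E(X^2) - 2E(X) + 1.
ex  = sum(x * p for x, p in pmf.items())
ex2 = sum(x * x * p for x, p in pmf.items())
split = ex2 - 2 * ex + 1

assert direct == split
```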
Example 4.20
The weekly demand for a certain drink, in thousands of liters, at a chain of convenience stores is a continuous random variable g(X) = X² + X − 2, where X has the density function

    f(x) = 2(x − 1) for 1 < x < 2, and f(x) = 0 elsewhere.

Find the expected value of the weekly demand for the drink.
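By Theorem 4.6, E(X² + X − 2) = E(X²) + E(X) − 2 = 17/6 + 5/3 − 2 = 5/2, i.e. 2,500 liters per week. A numeric check:

```python
# f(x) = 2(x - 1) on (1, 2); E(X^2 + X - 2) via Theorem 4.6.
n = 100_000
h = 1.0 / n
xs = [1.0 + (k + 0.5) * h for k in range(n)]

ex  = sum(x * 2 * (x - 1) * h for x in xs)        # E(X)   = 5/3
ex2 = sum(x * x * 2 * (x - 1) * h for x in xs)    # E(X^2) = 17/6
demand = ex2 + ex - 2                             # 5/2 thousand liters
```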
Theorem 4.7
The expected value of the sum or difference of two or more functions of the random variables X and Y is the sum or difference of the expected values of the functions. That is,

    E[g(X, Y) ± h(X, Y)] = E[g(X, Y)] ± E[h(X, Y)].

Corollary 4.3
Setting g(X, Y) = g(X) and h(X, Y) = h(Y), we see that

    E[g(X) ± h(Y)] = E[g(X)] ± E[h(Y)].

Theorem 4.8
Let X and Y be two independent random variables. Then

    E(XY) = E(X)E(Y).
Corollary 4.5
Let X and Y be two independent random variables. Then σ_XY = 0.

Example 4.21
It is known that the ratio of gallium to arsenide does not affect the functioning of gallium-arsenide wafers, which are the main components of microchips. Let X denote the ratio of gallium to arsenide and Y denote the functional wafers retrieved during a 1-hour period. X and Y are independent random variables with the joint density function

    f(x, y) = x(1 + 3y²)/4 for 0 < x < 2, 0 < y < 1, and f(x, y) = 0 elsewhere.

Show that E(XY) = E(X)E(Y), as Theorem 4.8 suggests.
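For this density, E(X) = 4/3, E(Y) = 5/8, and E(XY) = 5/6 = (4/3)(5/8), as Theorem 4.8 predicts. A midpoint-rule verification:

```python
# f(x, y) = x(1 + 3y^2)/4 on 0 < x < 2, 0 < y < 1 (Example 4.21).
nx, ny = 500, 500
hx, hy = 2.0 / nx, 1.0 / ny

def e(g):
    """E[g(X, Y)] by a midpoint-rule double integral against the joint density."""
    return sum(g(x, y) * x * (1 + 3 * y * y) / 4 * hx * hy
               for i in range(nx) for j in range(ny)
               for x in [(i + 0.5) * hx] for y in [(j + 0.5) * hy])

exy = e(lambda x, y: x * y)        # 5/6
ex  = e(lambda x, y: x)            # 4/3
ey  = e(lambda x, y: y)            # 5/8
assert abs(exy - ex * ey) < 1e-4   # E(XY) = E(X)E(Y) under independence
```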
Theorem 4.9
If X and Y are random variables with joint probability distribution f(x, y) and a, b, and c are constants, then

    σ²_{aX+bY+c} = a²σ²_X + b²σ²_Y + 2ab σ_XY.

Corollary 4.6, Corollary 4.7, & Corollary 4.8
Setting b = 0, we see that σ²_{aX+c} = a²σ²_X = a²σ².
Setting a = 1 and b = 0, we see that σ²_{X+c} = σ²_X = σ².
Setting b = 0 and c = 0, we see that σ²_{aX} = a²σ²_X = a²σ².
Corollary 4.9 & Corollary 4.10
If X and Y are independent random variables, then

    σ²_{aX±bY} = a²σ²_X + b²σ²_Y.

Example 4.23
Let X and Y denote the amounts of two different types of impurities in a batch of a certain chemical product. Suppose that X and Y are independent random variables with variances σ²_X = 2 and σ²_Y = 3. Find the variance of the random variable Z = 3X − 2Y + 5.
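Theorem 4.9 with a = 3, b = −2, c = 5 and σ_XY = 0 (independence) gives σ²_Z = 9(2) + 4(3) = 30:

```python
def var_linear(a, b, var_x, var_y, cov_xy=0):
    """Theorem 4.9: Var(aX + bY + c) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).
    The constant c never affects the variance (Corollary 4.7)."""
    return a * a * var_x + b * b * var_y + 2 * a * b * cov_xy

# Example 4.23: Z = 3X - 2Y + 5 with independent X, Y (so cov = 0).
var_z = var_linear(3, -2, var_x=2, var_y=3)   # 9*2 + 4*3 = 30
```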