
IE 6200: Probability and Statistics
Lecture 2, Chapter 3
Hany Sadaka

Outline
Sec. 3.2: Discrete Random Variables
Sec. 3.3: Continuous Random Variables
Sec. 3.4: Joint Probability Distributions
Sec. 4.1: Expected Values
Sec. 4.2: The Variance
Sec. 5.2-5.4: Binomial, Negative Binomial and Hypergeometric Distributions
Sec. 5.5: The Poisson Distribution
Sec. 6.2: The Normal Distribution
Section 3.2: Discrete Random Variables
Discrete Random Variables

Motivation Example
Toss 2 fair 6-sided dice. What is the probability that the sum of the numbers equals 9?

The sample space S has 36 sample points:

    S = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6),
          (2,1), (2,2), (2,3), (2,4), (2,5), (2,6),
          (3,1), (3,2), (3,3), (3,4), (3,5), (3,6),
          (4,1), (4,2), (4,3), (4,4), (4,5), (4,6),
          (5,1), (5,2), (5,3), (5,4), (5,5), (5,6),
          (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }

We know that the probability of each sample point is 1/36.
The event A is given by A = {(6,3), (5,4), (4,5), (3,6)}.
The probability of A is P(A) = 4/36.
Example 1
In this example, we only care about the sum of the two numbers, so we redefine the sample space S as a "smaller sample space" in R using a random variable X that assigns each outcome a real number:

    X : S → R

In our example, we define X : (a, b) → a + b.
As a set, X(S) = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12} ⊂ R, which is called the range of X.
If the range of X is a finite or countable subset of R, then X is called a discrete random variable.
Definition
For every discrete random variable X, we define a probability density function (pdf) by

    p_X(k) = P(X = k) := P({s ∈ S | X(s) = k})

(If k ∉ X(S), then p_X(k) = 0.)

Remark:
1. A probability function maps events of the sample space S to [0, 1] ⊂ R.
2. A random variable is a function from the sample space S to R.
3. The pdf p_X(k) is a function from R to R.
Question
Find the probability p_X(k) for all k.

Solution
In our example, the probability p_X(k) of each number k in the range is given by

    k       2     3     4     5     6     7     8     9     10    11    12
    p_X(k)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

The probability that the sum of the numbers equals 9 is p_X(9) = 4/36.
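As a quick computational check, here is a minimal Python sketch (mine, not part of the original slides): it enumerates the 36 outcomes, applies X : (a, b) → a + b, and tallies the pdf with exact fractions.

    from fractions import Fraction
    from collections import Counter
    from itertools import product

    # Enumerate the 36 equally likely outcomes and apply X: (a, b) -> a + b.
    counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
    pmf = {k: Fraction(c, 36) for k, c in sorted(counts.items())}

    print(pmf[9])              # 1/9, i.e. 4/36
    print(sum(pmf.values()))   # 1, so the values form a valid pdf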
Question
What is the probability that the sum of the numbers is ≤ 4?

Solution
It is given by P(X ≤ 4) = p_X(2) + p_X(3) + p_X(4) = 1/36 + 2/36 + 3/36.

This motivates another useful function, the cumulative distribution function (cdf), defined as

    F_X(t) = P(X ≤ t) := P({s ∈ S | X(s) ≤ t})

In our example, the cumulative distribution function (cdf) is

    t       2     3     4     5     6     7     8     9     10    11    12
    F_X(t)  1/36  3/36  6/36  10/36 15/36 21/36 26/36 30/36 33/36 35/36 36/36
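The cdf values can be checked by accumulating the pdf, as in this self-contained sketch (again an illustration of mine, not from the slides):

    from fractions import Fraction
    from collections import Counter
    from itertools import product, accumulate

    counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
    ks = sorted(counts)
    # Running sums of the pdf values give the cdf F_X(t) at t = 2, ..., 12.
    cdf = dict(zip(ks, accumulate(Fraction(counts[k], 36) for k in ks)))
    print(cdf[4])  # 1/6, i.e. 6/36 = P(X <= 4)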
Theorem
For any discrete random variable X, the pdf satisfies
1. p_X(k) ≥ 0.
2. Σ_{k ∈ X(S)} p_X(k) = 1.
Example 2
Suppose a pdf is given by p_X(k) = c · k² for X(S) = {1, 2, 3, 4}. Find c, and graph the pdf.

Solution
By the above theorem, p_X(1) + p_X(2) + p_X(3) + p_X(4) = 1.
So c + 4c + 9c + 16c = 1, which implies c = 1/30.
So p_X(k) = (1/30) · k².

Theorem
For any discrete random variable X, the cdf F_X(t) and the pdf p_X(k) satisfy

    p_X(k) = F_X(k) − F_X(k − 1).
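As a small illustration (mine, not the slides'), the normalizing constant can be found by forcing the pdf values to sum to 1:

    from fractions import Fraction

    # p_X(k) = c * k^2 on {1, 2, 3, 4}; the values must sum to 1.
    c = Fraction(1, sum(k**2 for k in range(1, 5)))
    print(c)                                 # 1/30
    print([c * k**2 for k in range(1, 5)])   # [1/30, 2/15, 3/10, 8/15]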
Example 3
Suppose the cumulative distribution function (cdf) is given by

    t       1     3     5     6     8     9     10    12    15    16     19
    F_X(t)  1/39  1/36  2/37  3/35  1/11  2/13  1/6   1/5   2/3   33/35  1

What is the pdf value P(X = 9)?

Solution
P(X = 9) = F_X(9) − F_X(8) = 2/13 − 1/11
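In code, the pdf at each support point is the jump of the cdf there; a minimal sketch with exact fractions, assuming the table above:

    from fractions import Fraction as F

    ts  = [1, 3, 5, 6, 8, 9, 10, 12, 15, 16, 19]
    cdf = [F(1, 39), F(1, 36), F(2, 37), F(3, 35), F(1, 11), F(2, 13),
           F(1, 6), F(1, 5), F(2, 3), F(33, 35), F(1)]
    # p_X at each support point is the jump of the cdf there.
    pmf = {t: c - prev for t, c, prev in zip(ts, cdf, [F(0)] + cdf[:-1])}
    print(pmf[9])  # 2/13 - 1/11 = 9/143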
Section 3.3: Continuous Random Variables
Continuous Random Variables

The range of a continuous random variable X is a (piecewise) continuous interval of R.

Motivation Example
Choose a real number randomly from the interval [0, 2] (the sample space). If we assume all numbers are equally likely, we have the following probabilities:
• P(X ≤ 2) = 1
• P(X ≤ 0.2) = 0.1
• P(X ≤ 0.02) = 0.01
  ⋮
• P(X ≤ x) = x/2

We can continue this way and find P(X = 0) = 0. In fact, P(X = a) = 0 for any real number a. So we care about the probability of an interval.
Definition
The probability density function (pdf) of a continuous random variable X is a piecewise continuous function f_X(x) satisfying
1. f_X(x) ≥ 0
2. ∫_{−∞}^{∞} f_X(x) dx = 1.

We also define f_X(x) = 0 if x is not in the range of X.

Definition
The probability that X is in an interval [a, b] is

    P(a ≤ X ≤ b) = ∫_a^b f_X(x) dx
Definition
The cumulative distribution function (cdf) of a continuous random variable X is

    F_X(x) = P(X ≤ x) = ∫_{−∞}^x f_X(t) dt

From the Fundamental Theorem of Calculus, we have the relation between the cdf and the pdf:

    F_X′(x) = f_X(x)

We can use the cdf to find probabilities:

    P(a ≤ X ≤ b) = F_X(b) − F_X(a)
Example
In our motivation example, we have the cdf

    F_X(x) = P(X ≤ x) = 0    for x ≤ 0
                        x/2  for 0 ≤ x ≤ 2
                        1    for x ≥ 2

So the pdf is

    f_X(x) = F_X′(x) = 1/2  for x ∈ [0, 2]
                       0    otherwise
Example
[Graphs of the pdf f_X(x) and the cdf F_X(x).]

The cdf is always a continuous, non-decreasing function; its minimum is 0 and its maximum is 1.
Example 1
Choose a number randomly from the interval [a, b], assuming all numbers are equally likely. (This distribution is called the uniform distribution.) Find the pdf f_X(x) and the cdf F_X(x).

Solution
The cdf is

    F_X(x) = P(X ≤ x) = 0                for x ≤ a
                        (x − a)/(b − a)  for a ≤ x ≤ b
                        1                for x ≥ b

The pdf is

    f_X(x) = F_X′(x) = 1/(b − a)  for x ∈ [a, b]
                       0          otherwise
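These formulas can be checked numerically; a minimal sketch assuming SciPy is available (scipy.stats.uniform takes loc = a and scale = b − a):

    from scipy.stats import uniform  # assumes SciPy is available

    a, b = 0.0, 2.0
    X = uniform(loc=a, scale=b - a)   # uniform distribution on [a, b]

    print(X.cdf(0.2))             # 0.1, matching F_X(x) = (x - a)/(b - a)
    print(X.pdf(1.0))             # 0.5, matching f_X(x) = 1/(b - a)
    print(X.cdf(b) - X.cdf(a))    # 1.0 = P(a <= X <= b)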
Example 2
Suppose the pdf of a random variable Y is f_Y(y) = c · y³ for 0 ≤ y ≤ 2.
1. Find c and calculate P(0 ≤ Y ≤ 1).
2. Find the cdf F_Y(y).
Solution
1. Since

       1 = ∫_{−∞}^{∞} c y³ dy = ∫_0^2 c y³ dy = [c y⁴/4]_0^2 = 4c,

   we get c = 1/4. The probability is

       P(0 ≤ Y ≤ 1) = ∫_0^1 (1/4) y³ dy = [y⁴/16]_0^1 = 1/16.

2. The cdf is

       F_Y(y) = ∫_{−∞}^y f_Y(t) dt = ∫_0^y (1/4) t³ dt = [t⁴/16]_0^y = y⁴/16

   for 0 ≤ y ≤ 2, with F_Y(y) = 0 when y < 0 and F_Y(y) = 1 when y > 2.
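Both parts can be verified symbolically; a sketch assuming SymPy is available (any computer algebra system would do):

    import sympy as sp  # assumes SymPy is available

    y, t, c = sp.symbols('y t c', positive=True)
    c_val = sp.solve(sp.integrate(c * t**3, (t, 0, 2)) - 1, c)[0]
    print(c_val)                                    # 1/4
    print(sp.integrate(c_val * t**3, (t, 0, 1)))    # 1/16 = P(0 <= Y <= 1)
    print(sp.integrate(c_val * t**3, (t, 0, y)))    # y**4/16, the cdf on [0, 2]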
Example 3
An important continuous distribution is the exponential distribution, defined by

    f_X(x) = λ e^{−λx} for x ≥ 0,

where λ is a positive parameter.
1. Check that f_X(x) ≥ 0 and ∫_{−∞}^{∞} f_X(x) dx = 1.
2. Calculate the cdf F_X(x) of X.
Solution
1. It is clear that f_X(x) ≥ 0. For the second equality,

       ∫_{−∞}^{∞} f_X(x) dx = ∫_0^∞ λ e^{−λx} dx = [−e^{−λx}]_0^∞ = 1

2.
       F_X(x) = ∫_{−∞}^x f_X(t) dt = ∫_0^x λ e^{−λt} dt
              = [−e^{−λt}]_0^x
              = −e^{−λx} − (−1)
              = 1 − e^{−λx}
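The closed-form cdf also gives a sampler: if U is uniform on (0, 1), then −ln(1 − U)/λ has cdf 1 − e^{−λx}. A minimal simulation sketch (λ = 1.5 is an arbitrary choice of mine):

    import math
    import random

    lam = 1.5        # arbitrary positive rate, for illustration only
    n = 100_000
    # Inverse-cdf sampling: solve u = 1 - exp(-lam * x) for x.
    samples = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

    x = 1.0
    empirical = sum(s <= x for s in samples) / n
    print(empirical, 1 - math.exp(-lam * x))  # both approximately 0.777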
Example 4
The cdf of a random variable X is given by

    F_X(x) = 0        for x < 0
             x/4      for 0 ≤ x < 1
             1/4      for 1 ≤ x < 2
             √(2x)/8  for 2 ≤ x < 32
             1        for x ≥ 32

1. Sketch the graph of the cdf F_X(x).
2. Find P(1/2 ≤ X < 4).
3. Find P(X > 4).
4. Find the pdf of X.
Solution
1. [Graph of the cdf F_X(x).]
2. P(1/2 ≤ X < 4) = F_X(4) − F_X(1/2) = √8/8 − 1/8
3. P(X > 4) = 1 − P(X ≤ 4) = 1 − F_X(4) = 1 − √8/8
4. The pdf of X is given by

    f_X(x) = F_X′(x) = 0                        for x < 0
                       1/4                      for 0 ≤ x < 1
                       0                        for 1 ≤ x < 2
                       (√2/8) · (1/2) x^{−1/2}  for 2 ≤ x < 32
                       0                        for x ≥ 32
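A direct translation of the piecewise cdf into code makes parts 2 and 3 one-liners (a sketch of mine, not from the slides):

    import math

    def F(x: float) -> float:
        # Piecewise cdf from Example 4.
        if x < 0:
            return 0.0
        if x < 1:
            return x / 4
        if x < 2:
            return 0.25
        if x < 32:
            return math.sqrt(2 * x) / 8
        return 1.0

    print(F(4) - F(0.5))  # P(1/2 <= X < 4) = sqrt(8)/8 - 1/8, about 0.2286
    print(1 - F(4))       # P(X > 4) = 1 - sqrt(8)/8, about 0.6464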
Section 3.4: Joint Probability Distributions
Joint Densities

Example 1: Toss 2 fair 6-sided dice
Let X be the difference of the two numbers.
Let Y be the larger of the two numbers.
The sample space S consists of the same 36 ordered pairs (a, b), a, b ∈ {1, ..., 6}, as before.

From Section 3.5, the range of X is X(S) = {0, 1, 2, 3, 4, 5} and the pdf of X is

    X = x    0     1      2     3     4     5
    p_X(x)   6/36  10/36  8/36  6/36  4/36  2/36
Example 1 (continued)
Similarly, the range of Y is Y(S) = {1, 2, 3, 4, 5, 6} and the pdf of Y is

    Y = y    1     2     3     4     5     6
    p_Y(y)   1/36  3/36  5/36  7/36  9/36  11/36

Definition: Discrete Joint Density
Let S be a discrete sample space, and let X and Y be two random variables on S. The joint probability density function (joint pdf) of X and Y, denoted p_{X,Y}(x, y), is defined as

    p_{X,Y}(x, y) := P(X = x, Y = y).

Here, P(X = x, Y = y) is the probability that X = x and Y = y.
Theorem
The joint pdf p_{X,Y}(x, y) satisfies
1. p_{X,Y}(x, y) ≥ 0.
2. Σ_{all x} Σ_{all y} p_{X,Y}(x, y) = 1.

Question
Find the joint pdf of X and Y, p_{X,Y}(x, y).

    Y \ X    0     1      2     3     4     5    | p_Y(y)
    1        1/36  0      0     0     0     0    | 1/36
    2        1/36  2/36   0     0     0     0    | 3/36
    3        1/36  2/36   2/36  0     0     0    | 5/36
    4        1/36  2/36   2/36  2/36  0     0    | 7/36
    5        1/36  2/36   2/36  2/36  2/36  0    | 9/36
    6        1/36  2/36   2/36  2/36  2/36  2/36 | 11/36
    p_X(x)   6/36  10/36  8/36  6/36  4/36  2/36 | 1
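The whole table can be regenerated by brute force; a minimal Python sketch (illustrative, not from the slides):

    from fractions import Fraction
    from collections import Counter
    from itertools import product

    # Joint pdf of X = |a - b| and Y = max(a, b) over the 36 outcomes.
    joint = Counter((abs(a - b), max(a, b))
                    for a, b in product(range(1, 7), repeat=2))
    p = {xy: Fraction(n, 36) for xy, n in joint.items()}

    # Marginals: sum the joint pdf over the other variable.
    pX, pY = Counter(), Counter()
    for (x, y), v in p.items():
        pX[x] += v
        pY[y] += v
    print(p[(0, 3)], pX[1], pY[6])  # 1/36, 5/18 (= 10/36), 11/36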
Theorem
Let p_{X,Y}(x, y) be the joint pdf of X and Y. Then

    p_X(x) = Σ_{all y} p_{X,Y}(x, y)  and  p_Y(y) = Σ_{all x} p_{X,Y}(x, y).

These are called the marginal pdfs of the random variables X and Y respectively. Equivalently,

    p_X(a) = P(X = a) = Σ_{all y} P(X = a, Y = y)
    p_Y(b) = P(Y = b) = Σ_{all x} P(X = x, Y = b)

Remark: In general, one can NOT recover the joint pdf p_{X,Y}(x, y) from the marginal pdfs p_X(x) and p_Y(y).
Question
Find the marginal pdfs of X and Y in Example 1.

Definition
Two random variables X and Y are called independent if and only if p_{X,Y}(x, y) = p_X(x) p_Y(y).

Definition
If X and Y are continuous random variables, the joint pdf f_{X,Y}(x, y) of X and Y is a piecewise continuous multi-variable function satisfying
1. f_{X,Y}(x, y) ≥ 0.
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1.
The probability that (X, Y) lies in a region R of the xy-plane R² is given by

    P((X, Y) ∈ R) = ∬_R f_{X,Y}(x, y) dx dy

This involves calculating the double integral ∬_R f_{X,Y}(x, y) dx dy from Calculus 3. If you have not taken Calculus 3, we will learn what we need through some easy examples here.

Definition
Let f_{X,Y}(x, y) be the joint pdf of random variables X and Y. Then the marginal pdf of X is

    f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy

and the marginal pdf of Y is

    f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx
Example 2
Suppose the joint pdf is f_{X,Y}(x, y) = c(x + y) for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1.
1. Find c.
2. Find P(Y ≤ 1/2).
3. Find P(X + Y ≤ 1).
Solution
1.
    1 = ∫_0^1 ∫_0^1 c(x + y) dy dx
      = c ∫_0^1 [xy + y²/2]_0^1 dx
      = c ∫_0^1 (x + 1/2) dx
      = c [x²/2 + x/2]_0^1
      = c (1/2 + 1/2) = c  ⇒  c = 1.
2.
    P(Y ≤ 1/2) = ∫_0^1 ∫_0^{1/2} (x + y) dy dx
               = ∫_0^1 [xy + y²/2]_0^{1/2} dx
               = ∫_0^1 (x/2 + 1/8) dx
               = [x²/4 + x/8]_0^1
               = 1/4 + 1/8 = 3/8.
3.
    P(X + Y ≤ 1) = ∫_0^1 ∫_0^{1−x} (x + y) dy dx
                 = ∫_0^1 [xy + y²/2]_0^{1−x} dx
                 = ∫_0^1 (x(1 − x) + (1 − x)²/2) dx
                 = ∫_0^1 (1/2 − x²/2) dx
                 = [x/2 − x³/6]_0^1
                 = 1/2 − 1/6 = 1/3.
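All three answers can be confirmed numerically with an iterated-integral routine; a sketch assuming SciPy is available (dblquad integrates its first argument, here y, innermost):

    from scipy import integrate  # assumes SciPy is available

    f = lambda y, x: x + y  # c = 1; dblquad expects f(y, x)

    total, _ = integrate.dblquad(f, 0, 1, lambda x: 0, lambda x: 1)      # 1.0, so c = 1
    p2, _    = integrate.dblquad(f, 0, 1, lambda x: 0, lambda x: 0.5)    # 0.375 = 3/8
    p3, _    = integrate.dblquad(f, 0, 1, lambda x: 0, lambda x: 1 - x)  # about 0.3333 = 1/3
    print(total, p2, p3)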
Example 3
Suppose the joint pdf is f_{X,Y}(x, y) = cxy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1 and y < x.
1. Find c.
2. Find P(X > 1/2).
3. Find the marginal pdfs f_X(x) and f_Y(y).
Solution
1.
    1 = ∫_0^1 ∫_0^x cxy dy dx
      = c ∫_0^1 [xy²/2]_0^x dx
      = c ∫_0^1 (x³/2) dx
      = c [x⁴/8]_0^1
      = c/8  ⇒  c = 8.
2.
    P(X > 1/2) = ∫_{1/2}^1 ∫_0^x 8xy dy dx
               = ∫_{1/2}^1 [4xy²]_0^x dx
               = ∫_{1/2}^1 4x³ dx
               = [x⁴]_{1/2}^1
               = 1 − (1/2)⁴ = 15/16.
3.
    f_X(x) = ∫_0^x 8xy dy = [4xy²]_0^x = 4x³,  0 ≤ x ≤ 1.
    f_Y(y) = ∫_y^1 8xy dx = [4x²y]_y^1 = 4y − 4y³,  0 ≤ y ≤ 1.
Two random variables X and Y are called independent if and only if f_{X,Y}(x, y) = f_X(x) f_Y(y).

If the joint pdf f_{X,Y}(x, y) = c is constant on a region R, it is called a bivariate uniform density (HW 3.7.10). This means all points are equally likely.

Example 4
Two independent random variables X and Y both have uniform distributions: X is uniform on [0, 20], Y is uniform on [5, 10].
1. Find the joint pdf f_{X,Y}(x, y) of X and Y.
2. Find the probability that |X − Y| ≤ 5.
Solution
Method 1: By independence, f_{X,Y}(x, y) = f_X(x) f_Y(y) = (1/20)(1/5) = 1/100 on the rectangle [0, 20] × [5, 10].
[Figure: the band |x − y| ≤ 5 inside the rectangle; its area times 1/100 gives the probability.]
Method 2:

    P(|X − Y| ≤ 5) = ∫_5^{10} ∫_{y−5}^{y+5} (1/100) dx dy
                   = ∫_5^{10} (10/100) dy = 0.5
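A quick Monte Carlo sanity check of this answer (my sketch, using only the standard library):

    import random

    n = 1_000_000
    hits = sum(abs(random.uniform(0, 20) - random.uniform(5, 10)) <= 5
               for _ in range(n))
    print(hits / n)  # approximately 0.5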
Theorem
Two continuous random variables X and Y are independent if and only if there are functions g(x) and h(y) such that f_{X,Y}(x, y) = g(x)h(y), with f_X(x) = g(x) and f_Y(y) = h(y).

It is easy to see that X and Y in Example 2 are not independent.

Example 5
Suppose two random variables X and Y are independent, with f_X(x) = 3x² for 0 ≤ x ≤ 1 and f_Y(y) = (1/2)y for 0 ≤ y ≤ 2. Find P(Y > X).
Solution

    f_{X,Y}(x, y) = f_X(x) · f_Y(y) = (3/2) x² y

    P(Y > X) = ∫_0^1 ∫_x^2 (3/2) x² y dy dx
             = ∫_0^1 [(3/4) x² y²]_x^2 dx
             = ∫_0^1 (3x² − (3/4) x⁴) dx
             = [x³ − (3/20) x⁵]_0^1
             = 1 − 3/20 = 17/20.
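The same double integral, checked numerically (a sketch assuming SciPy is available):

    from scipy import integrate  # assumes SciPy is available

    f = lambda y, x: 1.5 * x**2 * y   # joint pdf on [0, 1] x [0, 2]
    p, _ = integrate.dblquad(f, 0, 1, lambda x: x, lambda x: 2)  # y from x to 2
    print(p)  # 0.85 = 17/20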
Example 6
Two friends agree to meet at the Curry Student Center "sometime around 12:30pm". Each will arrive at a random time between 12pm and 1pm. If one arrives and the other is not there, the first person will wait 15 minutes or until 1pm, whichever comes first, and then leave. What is the probability that the two will get together?

Solution
Let x and y denote the two arrival times (in minutes after 12pm); both are uniform on [0, 60]. The two friends meet if and only if |x − y| ≤ 15, that is, −15 ≤ x − y ≤ 15. Let M denote this band inside the square [0, 60]²; its complement consists of two right triangles with legs of length 60 − 15 = 45. So

    P(Meet) = P(|X − Y| ≤ 15) = (area of M)/60²
            = (60² − 45²)/60² ≈ 0.44
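The area bookkeeping in one tiny script (illustrative only):

    # Complement of the band |x - y| <= 15 in the 60 x 60 square:
    # two right triangles, each with legs 60 - 15 = 45.
    area_square = 60**2
    area_missed = 2 * (45**2 / 2)
    print((area_square - area_missed) / area_square)  # 0.4375, about 0.44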
Lecture 2: Joint Probability Density Functions
Northeastern University
Joint Probability Density Functions

Definition: Discrete Case
The conditional distribution of X given that Y = y is

    P(X = x | Y = y) = P(x, y) / P_Y(y).

Definition: Continuous Case
Let X and Y be continuous random variables with joint density function f(x, y). The conditional density of X given that Y = y is

    f(x | Y = y) = f(x, y) / f_Y(y).
Conditional Expected Value

For discrete random variables:

    E(Y | X = x) = Σ_y y P(y|x)
    E(X | Y = y) = Σ_x x P(x|y)

For continuous random variables:

    E(Y | X = x) = ∫_{−∞}^{∞} y f(y|x) dy
    E(X | Y = y) = ∫_{−∞}^{∞} x f(x|y) dx
Independent Random Variables

Definition: Discrete Random Variables
Two discrete random variables X and Y are independent if

    P(x, y) = P_X(x) P_Y(y)

for all pairs of outcomes (x, y).

Definition: Continuous Random Variables
Two continuous random variables X and Y are independent if

    f(x, y) = f_X(x) f_Y(y)

for all pairs (x, y).
Example 7
A diagnostic test for the presence of a disease has two possible outcomes: 1 for disease present and 0 for disease not present. Let X denote the disease state of a patient, and let Y denote the outcome of the diagnostic test. The joint probability function of X and Y is given by:

    P(X = 0, Y = 0) = 0.800    P(X = 1, Y = 0) = 0.050
    P(X = 0, Y = 1) = 0.025    P(X = 1, Y = 1) = 0.125

Calculate Var(Y | X = 1).
Solution
We can calculate this variance once we know the conditional distribution of Y given that X = 1.

    Y \ X    0      1
    0        0.800  0.050
    1        0.025  0.125
    P_X(x)   0.825  0.175

    P(Y = 0 | X = 1) = P(Y = 0, X = 1)/P(X = 1) = 0.050/0.175 = 0.2857
    P(Y = 1 | X = 1) = P(Y = 1, X = 1)/P(X = 1) = 0.125/0.175 = 0.7143
    E[Y | X = 1] = 0.2857(0) + 0.7143(1) = 0.7143
    E[Y² | X = 1] = 0.2857(0)² + 0.7143(1)² = 0.7143
    Var(Y | X = 1) = 0.7143 − 0.7143² = 0.204
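The same conditioning, spelled out in a short script (mine, for illustration):

    # Joint pmf from Example 7, keyed as (x, y).
    joint = {(0, 0): 0.800, (0, 1): 0.025, (1, 0): 0.050, (1, 1): 0.125}

    pX1 = joint[(1, 0)] + joint[(1, 1)]               # P(X = 1) = 0.175
    cond = {y: joint[(1, y)] / pX1 for y in (0, 1)}   # pmf of Y given X = 1

    ey  = sum(y * p for y, p in cond.items())         # E[Y | X = 1]
    ey2 = sum(y**2 * p for y, p in cond.items())      # E[Y^2 | X = 1]
    print(ey2 - ey**2)                                # approximately 0.204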
Example 8
Once a fire is reported to a fire insurance company, the company makes an initial estimate, X, of the amount it will pay to the claimant for the fire loss. When the claim is finally settled, the company pays an amount, Y, to the claimant. The company has determined that X and Y have the joint density function

    f(x, y) = [2 / (x²(x − 1))] · y^{−(2x−1)/(x−1)},  x > 1, y > 1.

Given that the initial claim estimated by the company is 2, determine the probability that the final settlement amount is between 1 and 3.
Solution
To find P(1 < Y < 3 | X = 2) we need the conditional density

    f(y | X = 2) = f(2, y)/f_X(2) = 0.5 y^{−3} / f_X(2)

    f_X(2) = ∫_1^∞ f(2, y) dy = ∫_1^∞ 0.5 y^{−3} dy = [−y^{−2}/4]_1^∞ = 1/4

    f(y | X = 2) = 0.5 y^{−3} / 0.25 = 2 y^{−3}

    P(1 < Y < 3 | X = 2) = ∫_1^3 f(y | X = 2) dy
                         = ∫_1^3 2 y^{−3} dy = [−y^{−2}]_1^3 = 8/9.
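A symbolic verification of the whole chain, sketched with SymPy (an assumption of mine, not part of the slides):

    import sympy as sp  # assumes SymPy is available

    y = sp.symbols('y', positive=True)
    f2y = sp.Rational(1, 2) * y**-3             # f(2, y) = 0.5 * y^(-3)
    fX2 = sp.integrate(f2y, (y, 1, sp.oo))      # marginal at x = 2, gives 1/4
    cond = f2y / fX2                            # conditional density 2 * y^(-3)
    print(sp.integrate(cond, (y, 1, 3)))        # 8/9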
