Ma2013e Mat IV.1
Topic
• Probability
  — What is Probability?
  — Elements of Probability
  — Naive Definition
  — Axiomatic Definition
• Random Variable
  — Examples
  — Definition
  — Types of Random Variables
• Distribution Function
• Probability Function
• Distribution Function and Probability Function
  — Examples
• Probability Density Function
• Some Characteristics of Distributions
• Functions of a Random Variable
• Expectation
• Moment-Generating Function
• Some Important Discrete Distributions
  — Binomial Distribution
  — Geometric Distribution
  — Poisson Distribution
• Some Important Continuous Distributions
• Markov's and Chebyshev's Inequalities
1 Probability
What is Probability?
Elements of Probability
Experiment
An experiment is a systematic and controlled process or activity carried out to gather data
or information about a particular phenomenon or system.
• Deterministic Experiment
An experiment or a process in which the outcome can be predicted with certainty
before it is observed or performed.
— For instance, if we add 5 and 3, we know the outcome will always be 8, and there is no
randomness involved.
• Random Experiment
An experiment or a process in which the outcome cannot be predicted with
certainty.
— For example, tossing a coin is a random experiment because we cannot predict the
outcome of the coin toss with certainty.
Sample Space
• The set of all possible outcomes of a random experiment is called the sample space
and is represented by the symbol S.
— For example, the sample space of a coin toss is S = {H, T}, where H represents the
outcome of getting heads and T represents the outcome of getting tails.
• Each outcome in a sample space is called an element or a member of the sample
space, or simply a sample point.
Event
• An event is a subset of a sample space and is denoted by the symbol E.
— For example, the event of getting heads in a coin toss is E = {H}.
• Equally Likely Events
Equally likely events are events that have the same theoretical probability (or
likelihood) of occurring.
Naive Definition
Definition
If all outcomes of a random experiment are equally likely, then the probability of an event E is
P(E) = #E / #S,
the ratio of the number of outcomes favorable to E to the total number of outcomes in S.
Set Theory
Theorem
If E = ∅ is the empty event (the event that does not happen), then P(∅) = 0.
Theorem
If E^c is the complement of an event E with respect to S, then P(E^c) = 1 − P(E).
Theorem
Suppose E1 and E2 are any two events in S; then
P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2).
Theorem
Suppose E1 and E2 are two mutually exclusive events in S, that is, E1 ∩ E2 = ∅; then
P(E1 ∪ E2) = P(E1) + P(E2).
Theorem
Suppose E1 and E2 are any two events in S such that E1 ⊂ E2; then
P(E1) ≤ P(E2).
Theorem
Suppose E1, E2 and E3 are any three events in S; then
P(E1 ∪ E2 ∪ E3) = P(E1) + P(E2) + P(E3) − P(E1 ∩ E2) − P(E2 ∩ E3) − P(E1 ∩ E3) + P(E1 ∩ E2 ∩ E3).
Theorem
Suppose E1, E2, ..., En are any n events in S; then
P(∪_{i=1}^{n} Ei) = ∑_{i=1}^{n} P(Ei) − ∑_{i<j} P(Ei ∩ Ej) + ∑_{i<j<k} P(Ei ∩ Ej ∩ Ek) − ⋯ + (−1)^{n−1} P(∩_{i=1}^{n} Ei).
Definition
The events E1, E2, ..., En in S are mutually exclusive (or mutually disjoint) if and only if
P(∪_{i=1}^{n} Ei) = ∑_{i=1}^{n} P(Ei).
Axiomatic Definition of Probability
Definition
The axiomatic definition of probability consists of the following axioms:
• For any event E in a sample space S, 0 ≤ P(E) ≤ 1.
• P(S) = 1.
• For any finite number n of mutually exclusive events E1, ..., En in S,
P(∪_{i=1}^{n} Ei) = ∑_{i=1}^{n} P(Ei).
Multiplication Rule
Suppose we consider a compound experiment E consisting of two experiments E1 and E2. If #S1 = n1 and #S2 = n2 are the numbers of outcomes of E1 and E2, respectively, then E has #S = n1 · n2 outcomes.
Example
Consider tossing a true coin two times. Ans. = 4.
Example
A true coin is tossed three times. Using a tree diagram, find all possible outcomes.
Ans. = 8.
Example
Suppose we toss a true coin and cast a true die. Using a tree diagram, find all possible outcomes.
Ans. = 12.
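As a quick sanity check of the multiplication rule, the following sketch (Python, standard library only; the variable names are ours) enumerates the three compound sample spaces directly:

```python
from itertools import product

coin = ["H", "T"]
die = [1, 2, 3, 4, 5, 6]

# Two coin tosses: 2 * 2 = 4 outcomes.
print(len(list(product(coin, repeat=2))))   # 4
# Three coin tosses: 2 * 2 * 2 = 8 outcomes.
print(len(list(product(coin, repeat=3))))   # 8
# One coin toss and one die cast: 2 * 6 = 12 outcomes.
print(len(list(product(coin, die))))        # 12
```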
Permutation and Combination
Number of ways of selecting k objects from n distinct objects:

                         Ordered (permutation)      Unordered (combination)
With replacement         n^k                        C(n + k − 1, k)
Without replacement      nPk = n!/(n − k)!          C(n, k) = n!/(k!(n − k)!)
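All four counts in the table are available in Python's standard library; the sketch below evaluates them for an arbitrary illustrative choice of n = 6 and k = 3:

```python
from math import comb, perm

n, k = 6, 3  # illustrative values

print(n ** k)              # ordered, with replacement: n^k = 216
print(perm(n, k))          # ordered, without replacement: nPk = 120
print(comb(n + k - 1, k))  # unordered, with replacement: C(n+k-1, k) = 56
print(comb(n, k))          # unordered, without replacement: C(n, k) = 20
```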
Conditional Probability and Product Rule
Definition
For events E1 and E2 in S with P(E1) > 0, the conditional probability of E2 given E1 is
P(E2 | E1) = P(E2 ∩ E1) / P(E1).
Product Rule
For any two events E1 and E2,
P(E1 ∩ E2) = P(E1) · P(E2 | E1).
Definition
Two events E1 and E2 are said to be independent if
P(E1 ∩ E2) = P(E1) · P(E2).
Theorem
If events E1 and E2 are independent, then
P(E2 | E1) = P(E2) and P(E1 | E2) = P(E1).
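A minimal sketch of the conditional-probability formula, computed by direct counting on the two-dice sample space (the particular events E1 and E2 are our own illustration):

```python
from itertools import product
from fractions import Fraction

S = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
E1 = {e for e in S if e[0] % 2 == 0}       # first die shows an even face
E2 = {e for e in S if e[0] + e[1] == 7}    # sum of the up faces is 7

def P(A):
    return Fraction(len(A), len(S))

print(P(E1 & E2) / P(E1))   # P(E2 | E1) = (3/36)/(18/36) = 1/6
print(P(E2))                # also 1/6, so E1 and E2 are independent here
```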
Independent Events
Theorem
If E1 and E2 are independent events, then the following holds:
1. E1 and E2^c are independent events.
2. E1^c and E2^c are independent events.
3. E1^c and E2 are independent events.
Definition
The n events E1, E2, ..., En are mutually independent if and only if the probability of the intersection of any k (k = 2, 3, ..., n) of these events is the product of their respective probabilities; that is, for k = 2, 3, ..., n and any indices 1 ≤ i1 < i2 < ⋯ < ik ≤ n,
P(E_{i1} ∩ E_{i2} ∩ ⋯ ∩ E_{ik}) = P(E_{i1}) P(E_{i2}) ⋯ P(E_{ik}).
Partition of the Sample Space
Definition
If E1, E2, ..., En are mutually exclusive events in the sample space S, then E1, E2, ..., En form a partition of S if
E1 ∪ E2 ∪ ⋯ ∪ En = S.
Rule of Elimination or Total Probability, and Bayes' Theorem
Theorem (Total Probability)
If events E1, E2, ..., En represent a partition of the sample space S, then the total probability of an arbitrary event E in S is
P(E) = ∑_{i=1}^{n} P(Ei) · P(E | Ei).
Theorem (Bayes)
Under the same hypotheses, if P(E) > 0, then
P(Ek | E) = P(Ek) · P(E | Ek) / ∑_{i=1}^{n} P(Ei) · P(E | Ei), for k = 1, 2, ..., n.
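The following sketch applies both rules to a made-up three-part partition (the machine proportions and defect rates are purely illustrative numbers, not from the slides):

```python
from fractions import Fraction as F

# Partition E1, E2, E3: item made by machine 1, 2, or 3 (hypothetical data).
prior = [F(50, 100), F(30, 100), F(20, 100)]   # P(E_i)
defect = [F(1, 100), F(2, 100), F(3, 100)]     # P(E | E_i), E = "item defective"

total = sum(p * d for p, d in zip(prior, defect))            # total probability
posterior = [p * d / total for p, d in zip(prior, defect)]   # Bayes' theorem

print(total)      # P(E) = 17/1000
print(posterior)  # [Fraction(5, 17), Fraction(6, 17), Fraction(6, 17)]
```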
2 Random Variable
Examples
Example
Experiment: Consider tossing two fair coins and observe the up faces.
• It is a random experiment with 4 outcomes.
— So the sample space is: S = {HH, HT, TH, TT}.
• Let X(e), for outcome e ∈ S, represent the number of heads obtained in a toss of the two coins; then X(e) = 2, 1, or 0.
Under X, the sample points map to the range RX = {0, 1, 2} as follows: HH ↦ 2, HT ↦ 1, TH ↦ 1, TT ↦ 0.
(X: random variable; RX: range of X.)
What is a Random Variable?
Definition
Let S be the sample space for an experiment and X : S → ℝ be a real-valued function that assigns a real number X(e) to every outcome e ∈ S; then X is called a random variable.
Examples of Random Variables
Example
• Experiment: Roll a pair of dice and observe the up faces.
— The sample space:
S = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
(3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
(4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
(5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
(6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}.
— The attribute we are observing is whether the sum of the up faces is 7.
— Then the random variable X : S → ℝ is given by
X(e) = 1 if e ∈ {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}, and 0 otherwise.
Example
• Experiment: Suppose that an individual purchases two electronic components, each of which may be either defective (d) or acceptable (a).
— The sample space: S = {(d, d), (d, a), (a, d), (a, a)}.
— The attribute we are observing is the number of acceptable components.
— Then the random variable X : S → ℝ is given by
X(e) = 2 if e ∈ {(a, a)}; 1 if e ∈ {(d, a), (a, d)}; 0 otherwise.
Example
• Experiment: A cathode ray tube is manufactured, put on life test, and aged to failure. The elapsed time (in hours) at failure is recorded.
— The sample space: S = {t : t ≥ 0} = [0, ∞).
— The attribute we are observing is the time to failure.
— Then the random variable X : S → ℝ is given by X(t) = t.
Example
• Experiment: In a particular chemical plant the volume produced per day for a particular product ranges between a minimum value a and a maximum value b, which corresponds to the capacity. A day is randomly selected and the amount produced is observed.
— The sample space: S = {x : a ≤ x ≤ b} = [a, b].
— The attribute we are observing is the amount produced.
— Then the random variable X : S → ℝ is given by X(x) = x.
Types of Random Variables
• A random variable X is discrete if its range space RX is finite or countably infinite.
• A random variable X is continuous if its range space RX is an interval, or a union of intervals, of real numbers.
Important Points and Notations
Let S be a sample space of an experiment and X a random variable with range space RX.
Definition
If A is an event in S (that is, A ⊂ S) and B is an event in RX (that is, B ⊂ RX), then A and B are equivalent events if
A = {e ∈ S : X(e) ∈ B}.
More simply, if event A in S consists of all outcomes e in S for which X(e) ∈ B, then A and B are equivalent. Whenever A occurs, B occurs, and vice versa.
Definition
If A is an event in S and B is an event in RX equivalent to A, then we define the probability of B as
PX(B) := P(A) = P({e ∈ S : X(e) ∈ B}).
Example
Experiment: Toss two coins and observe the up faces.
• It is a random experiment with 4 outcomes.
— The sample space is: S = {HH, HT, TH, TT}.
• The random variable X represents the number of heads in a toss of two coins.
• RX = {0, 1, 2}.

Events in RX    Events in S    PX(X = xi)
X = 2           {HH}           1/4
X = 1           {HT, TH}       2/4
X = 0           {TT}           1/4

Note that ∑_{i=1}^{#RX} PX(X = xi) = 1.
Important Points
Example
• Experiment: Roll a pair of fair dice and observe the up faces.
— The random variable X represents the sum of the up faces in a roll of two fair dice.
— RX = {2, 3, ..., 12}.

Events (X = xi) in RX    Events in S                                       PX(X = xi)
X = 2                    {(1, 1)}                                          1/36
X = 3                    {(1, 2), (2, 1)}                                  2/36
X = 4                    {(1, 3), (2, 2), (3, 1)}                          3/36
X = 5                    {(1, 4), (2, 3), (3, 2), (4, 1)}                  4/36
X = 6                    {(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)}          5/36
X = 7                    {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}  6/36
X = 8                    {(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)}          5/36
X = 9                    {(3, 6), (4, 5), (5, 4), (6, 3)}                  4/36
X = 10                   {(4, 6), (5, 5), (6, 4)}                          3/36
X = 11                   {(5, 6), (6, 5)}                                  2/36
X = 12                   {(6, 6)}                                          1/36

Again, ∑_{i=1}^{#RX} PX(X = xi) = 1.
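The whole table can be generated by enumeration; a short sketch using only the standard library (Fraction reduces 2/36 to 1/18, and so on):

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Count how often each sum occurs among the 36 equally likely outcomes.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

for x, p in pmf.items():
    print(x, p)              # 2: 1/36, 3: 1/18, ..., 7: 1/6, ..., 12: 1/36
print(sum(pmf.values()))     # 1, as required
```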
Important Points
• Let S be a sample space for an experiment and X be a discrete random variable on S with range space RX = {x1, x2, ...}; then
1 = P(S) = P(∪_{i=1}^{#RX} (X = xi)) = ∑_{i=1}^{#RX} PX(X = xi).
3 Distribution Function
Definition
Let X be a random variable; the function FX : ℝ → ℝ defined by
FX(x) := PX(X ≤ x), −∞ < x < ∞,
is called the distribution function (or cumulative distribution function, CDF) of X.
Distribution Function
Example
In the case of the three-times coin-tossing experiment, the range of the random variable X is RX = {0, 1, 2, 3}, where X represents the number of heads. Find the distribution function for the following table:

Events X = xi in RX    PX(X = xi)
X = 0                  1/8
X = 1                  3/8
X = 2                  3/8
X = 3                  1/8

Ans.
FX(x) = 0 for x < 0; 1/8 for 0 ≤ x < 1; 4/8 for 1 ≤ x < 2; 7/8 for 2 ≤ x < 3; 1 for x ≥ 3.
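A small sketch showing how the step-function CDF arises from the PMF of this example (the helper name cdf is ours):

```python
from fractions import Fraction

pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def cdf(x):
    """F_X(x) = P(X <= x): a right-continuous step function."""
    return sum(p for xi, p in pmf.items() if xi <= x)

for x in [-1, 0, 0.5, 1, 2, 3, 10]:
    print(x, cdf(x))   # 0, 1/8, 1/8, 1/2, 7/8, 1, 1
```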
4 Probability Function
Definition
Let X be a discrete random variable with range space RX = {x1, x2, ...}. The function pX given by
pX(x) := PX(X = x) for x ∈ RX (and 0 otherwise)
is called the probability function (or probability mass function, PMF) of X.
Probability Distribution
Example
In the case of the three-times coin-tossing experiment, the range of the random variable X is RX = {0, 1, 2, 3}, where X represents the number of heads.

x    p(x)
0    1/8
1    3/8
2    3/8
3    1/8

Table: Probability distribution table.
5 Distribution Function and Probability Function
Properties
A distribution function FX satisfies:
• 0 ≤ FX(x) ≤ 1 and FX is non-decreasing.
• lim_{x→−∞} FX(x) = 0 and lim_{x→+∞} FX(x) = 1.
• FX is right-continuous: lim_{δ→0⁺} FX(x + δ) = FX(x).
Examples
Example
A discrete random variable X has the following probability distribution table:

x       0    1    2     3     4     5     6      7
p(x)    0    k    2k    2k    3k    k²    2k²    7k² + k

Then
1. Find k. Ans. k = 1/10.
2. Evaluate P(X < 6), P(X ≥ 6) and P(0 < X < 5). Ans. 81/100, 19/100 and 4/5.
3. If P(X ≤ a) > 1/6, find the minimum value of a. Ans. a = 2.
4. Determine the distribution function of X.
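The answers can be verified mechanically with exact rational arithmetic; a sketch (the helper below is our own, not from the slides):

```python
from fractions import Fraction

def pmf(k):
    # p(x) for x = 0..7 in terms of k
    return [0, k, 2*k, 2*k, 3*k, k**2, 2*k**2, 7*k**2 + k]

k = Fraction(1, 10)
p = pmf(k)
print(sum(p))                          # 1, so k = 1/10 is valid
print(sum(p[x] for x in range(6)))     # P(X < 6)    = 81/100
print(sum(p[x] for x in range(6, 8)))  # P(X >= 6)   = 19/100
print(sum(p[x] for x in range(1, 5)))  # P(0 < X < 5) = 4/5
```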
Example
If p(x) = x/15 for x = 1, 2, 3, 4, 5, and 0 otherwise, then find
1. P(X = 1 or 2). Ans. 1/5.
2. P(1/2 < X < 5/2 | X > 1). Ans. 1/7.
6 Probability Density Function
Definition
Let X be a continuous random variable. A function fX : ℝ → ℝ is called the probability density function (PDF) of X if
P(a ≤ X ≤ b) = ∫_a^b fX(x) dx for all a ≤ b.
Properties
• fX(x) ≥ 0 for all x, and ∫_{−∞}^{∞} fX(x) dx = 1.
• FX(x) = ∫_{−∞}^{x} fX(t) dt, and fX(x) = (d/dx) FX(x) wherever FX is differentiable.
• P(X = c) = 0 for every single point c.
7 Some Characteristics of Distributions
Mean
Definition
The mean of X is µX := ∑_x x pX(x) if X is discrete, or µX := ∫_{−∞}^{∞} x fX(x) dx if X is continuous.
Example
In the case of the three-times coin-tossing experiment, the range of the random variable X is RX = {0, 1, 2, 3}, where X represents the number of heads. Find the mean of X.
Sol.
x            0     1     2     3
PX(X = x)    1/8   3/8   3/8   1/8
µX = 0 × 1/8 + 1 × 3/8 + 2 × 3/8 + 3 × 1/8 = 3/2.
Example
Let X be a continuous random variable with PDF
fX(x) = 3/x⁴ for x ≥ 1, and 0 otherwise.
Find the mean of X.
Sol. µX = ∫_{−∞}^{∞} x fX(x) dx = ∫_1^∞ (3/x³) dx = 3/2.
Median
Definition
A median mX of X is any value satisfying P(X ≤ mX) ≥ 1/2 and P(X ≥ mX) ≥ 1/2; for a continuous random variable, mX solves FX(mX) = 1/2.
Example
Let X be a continuous random variable with PDF fX(x) = 3/x⁴ for x ≥ 1, and 0 otherwise. Find the median of X.
Sol. Clearly mX > 1; then ∫_1^{mX} fX(x) dx = 1/2 = ∫_{mX}^∞ fX(x) dx ⟹ mX = 2^{1/3}.
Example
In the case of the three-times coin-tossing experiment, where X represents the number of heads, find the median of X.
Sol.
x            0     1     2     3
PX(X = x)    1/8   3/8   3/8   1/8
P(X ≤ 1) = P(X = 0) + P(X = 1) = 1/8 + 3/8 = 1/2 ≥ 0.5 and
P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2) = 1/8 + 3/8 + 3/8 = 7/8 ≥ 0.5. Thus, mX = (1 + 2)/2 = 3/2 = 1.5.
Mode
Definition
A mode of X is a value at which the PMF pX (discrete case) or the PDF fX (continuous case) attains its maximum value.
Variance and Standard Deviation
Definition
The variance of X is var(X) := E[(X − µX)²], and the standard deviation is σX := √var(X).
Example
Let X be a continuous random variable with PDF
fX(x) = 3/x⁴ for x ≥ 1, and 0 otherwise.
Find the variance and standard deviation of X.
Sol.
var(X) = ∫_{−∞}^{∞} (x − µX)² fX(x) dx = ∫_1^∞ (x − 3/2)² (3/x⁴) dx = 3 ∫_1^∞ (1/x² − 3/x³ + 9/(4x⁴)) dx = 3/4.
σX = √var(X) = √3/2.
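Assuming SymPy is available, the mean, median, and variance of this PDF can all be checked symbolically in a few lines:

```python
import sympy as sp

x, m = sp.symbols("x m", positive=True)
fX = 3 / x**4  # PDF on [1, oo)

mean = sp.integrate(x * fX, (x, 1, sp.oo))             # 3/2
median = sp.solve(sp.integrate(fX, (x, 1, m)) - sp.Rational(1, 2), m)
var = sp.integrate((x - mean)**2 * fX, (x, 1, sp.oo))  # 3/4

print(mean, median, var, sp.sqrt(var))  # 3/2, [2**(1/3)], 3/4, sqrt(3)/2
```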
Moment about the Origin and Moment about the Mean
Definition
The kth moment about the origin of X is µ′ₖ := E[X^k], and the kth moment about the mean (the kth central moment) is µₖ := E[(X − µX)^k]. In particular, µ′₁ = µX and µ₂ = var(X).
8 Functions of a Random Variable
Functions of a Discrete Random Variable
• Let X be a discrete random variable (defined on a sample space S) with range space RX = {x1, x2, ...} and H be a real-valued function on a domain that contains RX.
— Then, the composition function H ∘ X : S → ℝ given by (H ∘ X)(e) = H(X(e)) is well defined.
— The function Y = H ∘ X = H(X) is also a discrete random variable.
— Let RY = {y1, y2, ...} be the range space of Y.
— Suppose Ωi := H⁻¹(yi) = {x ∈ RX : H(x) = yi}; then the probability mass function (PMF) of Y at yi is given by
pY(yi) = PY(Y = yi) = ∑_{x ∈ Ωi} pX(x),
where pX is the PMF of X.
Example
In the case of the three-times coin-tossing experiment, the range of the random variable X is RX = {0, 1, 2, 3}, where X represents the number of heads:

x            0     1     2     3
PX(X = x)    1/8   3/8   3/8   1/8

Find the PMF of (1) Y = 2X − 1 and (2) Y = |X − 2|.
Sol.
1. pY(y) = PY(Y = y) = PX(X = (y + 1)/2); then

y         −1    1     3     5
pY(y)     1/8   3/8   3/8   1/8

2. pY(y) = PY(Y = y) = PX((X = y + 2) or (X = 2 − y)); then

y         0     1     2
pY(y)     3/8   1/2   1/8
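The rule pY(y) = ∑_{x ∈ Ωᵢ} pX(x) translates directly into code; a sketch with a generic transform helper (the helper name is ours):

```python
from collections import defaultdict
from fractions import Fraction

pX = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def transform(pX, H):
    """p_Y(y) = sum of p_X(x) over all x with H(x) = y."""
    pY = defaultdict(Fraction)
    for x, p in pX.items():
        pY[H(x)] += p
    return dict(sorted(pY.items()))

print(transform(pX, lambda x: 2 * x - 1))   # {-1: 1/8, 1: 3/8, 3: 3/8, 5: 1/8}
print(transform(pX, lambda x: abs(x - 2)))  # {0: 3/8, 1: 1/2, 2: 1/8}
```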
Continuous Functions of a Continuous Random Variable
• Let X be a continuous random variable with probability density function (PDF) fX and H be a real-valued function on a domain that contains RX.
— Then, the function Y = H ∘ X = H(X) is also a continuous random variable.
— Let fY be the PDF of Y. fY may be found by performing the following three steps:
1. Obtain the CDF, FY(y) = PY(Y ≤ y) = PX(B), where B is the event in RX equivalent to (Y ≤ y), that is, B := {x ∈ RX : H(x) ≤ y}.
2. Then obtain fY(y) = (d/dy) FY(y).
3. Find the range space of the new random variable Y.
Example
Suppose the random variable X has the following PDF:
fX(x) = x/8 for 0 ≤ x ≤ 4, and 0 otherwise.
If Y = H(X), find the PDF fY of Y, where H(x) = 2x + 8.
Sol. For 8 ≤ y ≤ 16,
FY(y) = P(2X + 8 ≤ y) = P(X ≤ (y − 8)/2) = ∫_0^{(y−8)/2} (x/8) dx = (y − 8)²/64,
so
fY(y) = (d/dy) FY(y) = (y − 8)/32.
Hence
fY(y) = (y − 8)/32 for 8 ≤ y ≤ 16, and 0 otherwise.
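Assuming SymPy is available, the two steps of the CDF method for this example can be checked symbolically:

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
fX = x / 8  # PDF of X on [0, 4]

# Step 1: F_Y(y) = P(2X + 8 <= y) = integral of fX from 0 to (y - 8)/2.
FY = sp.integrate(fX, (x, 0, (y - 8) / 2))
# Step 2: differentiate the CDF to get the PDF.
fY = sp.simplify(sp.diff(FY, y))

print(FY)                            # (y - 8)**2/64
print(fY)                            # y/32 - 1/4, i.e. (y - 8)/32
print(sp.integrate(fY, (y, 8, 16)))  # 1, so fY is a valid PDF on [8, 16]
```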
9 Expectation
Definition (Expectation)
Let X be a random variable and Y = H(X) be a function of X; then the expected value of H(X) is defined as follows:
E[Y] = E[H(X)] := ∑_i H(xi) · pX(xi), if X is a discrete random variable;
E[Y] = E[H(X)] := ∫_{−∞}^{∞} H(x) · fX(x) dx, if X is a continuous random variable.
Note that:
• In the above definition, in the case where X is a continuous random variable, we restrict H so that Y = H(X) is a continuous random variable.
• We assume that, in the above definition, the right-hand side sum and integral exist (are finite).
Some Important Observations
• The kth moment about the origin: µ′ₖ = E[X^k] (origin moment of X).
• The kth moment about the mean: µₖ = E[(X − E[X])^k] (central moment of X).
• (E[X])² ≤ (E[|X|])² ≤ E[X²].
• Let a, b ∈ ℝ be constants; then
— E[a] = a,
— E[aX + b] = aE[X] + b,
— var(aX + b) = a² var(X),
— if P(a < X ≤ b) = 1, then a < E[X] ≤ b.
10 Moment-Generating Function
Definition
The moment-generating function (MGF) of a random variable X is
MX(t) := E[exp(tX)],
for all t for which the expectation exists.
Remark
For real constants a and b,
M_{aX+b}(t) = E[exp(t(aX + b))] = exp(bt) E[exp(atX)] = exp(bt) MX(at).
MGF and Moments
Theorem
If X has moment-generating function MX(t), with MX(t) < ∞ for |t| < a for some a > 0, then the distribution of X is determined uniquely, and
MX(t) = ∑_{k=0}^{∞} E[X^k] t^k / k! = ∑_{k=0}^{∞} µ′ₖ t^k / k!,
where µ′ₖ is the kth moment about the origin. In this case,
µ′ₖ = (d^k/dt^k) MX(t) |_{t=0} = E[X^k exp(tX)] |_{t=0}.
Limitations of MGF
A random variable X may have an MGF and some or all moments, yet the MGF does not generate the moments.
E.g., consider a discrete random variable X with PMF:
pX(2^x) = exp(−1)/x!, x = 0, 1, 2, 3, ....
A random variable X may have some or all moments, but the MGF does not exist except possibly at one point.
E.g., consider a discrete random variable X with PMF:
pX(±2^x) = exp(−1)/(2 · x!), x = 0, 1, 2, 3, ....
11 Some Important Discrete Distributions
Binomial Distribution
Consider n independent trials, each of which results in a success (S) with probability q and in a failure (F) with probability 1 − q. If X counts the number of successes in the n trials, then X follows the binomial distribution with parameters n and q, with PMF
pX(x) = C(n, x) q^x (1 − q)^{n−x}, x = 0, 1, ..., n.
For n = 3 trials the outcomes map to values of X as follows:
FFF ↦ 0; FFS, FSF, SFF ↦ 1; FSS, SFS, SSF ↦ 2; SSS ↦ 3.
(S: success, F: failure.)
Mean and Variance of Binomial Distribution
Theorem
If X follows the binomial distribution with parameters n and 0 < q < 1, then
E[X] = nq and var(X) = nq(1 − q).
Proof.
Mean:
E[X] = ∑_{x=0}^{n} x pX(x) = ∑_{x=0}^{n} x C(n, x) q^x (1 − q)^{n−x}
= ∑_{x=1}^{n} x (n!/(x!(n − x)!)) q^x (1 − q)^{n−x}
= nq ∑_{x=1}^{n} ((n − 1)!/((x − 1)!(n − x)!)) q^{x−1} (1 − q)^{n−x}
= nq ∑_{y=0}^{n−1} C(n − 1, y) q^y (1 − q)^{n−1−y} = nq.
Variance:
E[X(X − 1)] = ∑_{x=0}^{n} x(x − 1) pX(x)
= n(n − 1)q² ∑_{x=2}^{n} C(n − 2, x − 2) q^{x−2} (1 − q)^{n−x}
= n(n − 1)q² ∑_{y=0}^{n−2} C(n − 2, y) q^y (1 − q)^{n−2−y} = n(n − 1)q².
var(X) = E[X²] − (E[X])² = E[X(X − 1)] + E[X] − (E[X])²
= n(n − 1)q² + nq − n²q² = nq(1 − q).
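A sketch verifying E[X] = nq and var(X) = nq(1 − q) exactly from the PMF, for an arbitrary illustrative choice of n and q:

```python
from math import comb
from fractions import Fraction

n, q = 10, Fraction(3, 10)  # illustrative parameters
pmf = [comb(n, x) * q**x * (1 - q)**(n - x) for x in range(n + 1)]

mean = sum(x * p for x, p in enumerate(pmf))
var = sum(x**2 * p for x, p in enumerate(pmf)) - mean**2

print(mean, n * q)           # both 3
print(var, n * q * (1 - q))  # both 21/10
```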
MGF of Binomial Distribution
Theorem
The moment-generating function for the binomial distribution with parameter 0 < q < 1 is
MX(t) = (q exp(t) + 1 − q)^n for t ∈ ℝ.
Proof.
MX(t) = E[exp(tX)] = ∑_{i=0}^{n} exp(ti) pX(i)
= ∑_{i=0}^{n} exp(ti) C(n, i) (1 − q)^{n−i} q^i
= ∑_{i=0}^{n} C(n, i) (1 − q)^{n−i} (exp(t) q)^i
= (q exp(t) + (1 − q))^n, for all t.
Geometric Distribution
Definition
A random variable X follows the geometric distribution with parameter 0 < q < 1 if its PMF is
pX(x) = (1 − q)^{x−1} q, x = 1, 2, 3, ...,
i.e., X is the number of independent Bernoulli trials needed to obtain the first success.
Theorem
If X follows the geometric distribution with parameter q, then
E[X] = 1/q and var(X) = (1 − q)/q².
Proof. Exercise.
MGF of Geometric Distribution
Theorem
The moment-generating function for the geometric distribution with parameter 0 < q < 1 is
MX(t) = q exp(t) / (1 − (1 − q) exp(t)), for t < log(1/(1 − q)).
Proof.
MX(t) = E[exp(tX)] = ∑_{i=1}^{∞} exp(ti) pX(i)
= ∑_{i=1}^{∞} exp(ti) (1 − q)^{i−1} q
= q exp(t) ∑_{i=1}^{∞} (exp(t)(1 − q))^{i−1}
= q exp(t) / (1 − (1 − q) exp(t)), for t < log(1/(1 − q)).
Poisson Distribution
Definition
A random variable X follows the Poisson distribution with parameter λ > 0 if its PMF is
pX(x) = λ^x e^{−λ} / x!, x = 0, 1, 2, ....
Example
Suppose that the average number of accidents occurring weekly on a particular stretch of a highway equals 3. Calculate the probability that there is at least one accident this week.
Ans. = 1 − e^{−3} ≈ 0.9502.
Poisson Approximation of Binomial Distribution
If X follows the binomial distribution with parameters n and q, where n is large and q is small so that λ = nq is of moderate size, then
pX(x) = C(n, x) q^x (1 − q)^{n−x} ≈ λ^x e^{−λ} / x!.
Example
Suppose the probability that an item produced by a certain machine will be defective is 0.1. Find the probability that a sample of 10 items will contain at most one defective item. Assume that the quality of successive items is independent.
Ans. = 2e^{−1} ≈ 0.7358 (Poisson approximation with λ = 1).
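A sketch comparing the exact binomial value with the Poisson approximation for this example (n = 10, q = 0.1, λ = 1):

```python
from math import comb, exp, factorial

n, q = 10, 0.1
lam = n * q  # λ = 1

exact = sum(comb(n, x) * q**x * (1 - q)**(n - x) for x in range(2))
approx = sum(exp(-lam) * lam**x / factorial(x) for x in range(2))

print(exact)   # P(X <= 1) exactly:     0.7360989...
print(approx)  # Poisson approximation: 2/e = 0.7357588...
```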
Mean and Variance of Poisson Distribution
Theorem
If X follows the Poisson distribution with parameter λ > 0, then
E[X] = λ and var(X) = λ.
Proof.
Mean:
E[X] = ∑_{x=0}^{∞} x pX(x) = ∑_{x=0}^{∞} x λ^x e^{−λ} / x!
= λ e^{−λ} ∑_{x=1}^{∞} λ^{x−1} / (x − 1)! = λ e^{−λ} ∑_{y=0}^{∞} λ^y / y! = λ e^{−λ} e^{λ} = λ.
Variance:
E[X(X − 1)] = ∑_{x=0}^{∞} x(x − 1) λ^x e^{−λ} / x!
= λ² e^{−λ} ∑_{x=2}^{∞} λ^{x−2} / (x − 2)! = λ² e^{−λ} e^{λ} = λ².
var(X) = E[X²] − (E[X])² = E[X(X − 1)] + E[X] − (E[X])² = λ² + λ − λ² = λ.
MGF of Poisson Distribution
Theorem
The moment-generating function for the Poisson distribution with parameter λ > 0 is
MX(t) = e^{λ(e^t − 1)} for all t ∈ ℝ.
Proof.
MX(t) = E[e^{tX}] = ∑_{i=0}^{∞} e^{ti} pX(i) = ∑_{i=0}^{∞} e^{ti} λ^i e^{−λ} / i!
= e^{−λ} ∑_{i=0}^{∞} (λ e^t)^i / i! = e^{−λ} e^{λ e^t} = e^{λ(e^t − 1)}, for all t ∈ ℝ.
12 Some Important Continuous Distributions
Uniform Distribution
Definition
A random variable X follows the uniform distribution over the interval [α, β] if its PDF is
fX(x) = 1/(β − α) for α ≤ x ≤ β, and 0 otherwise.
Example
Buses arrive at a specified stop at 15-minute intervals starting at 7 AM, that is, they arrive at 7, 7:15, 7:30, 7:45, and so on. If a passenger arrives at the stop at a time that is uniformly distributed between 7 and 7:30, find the probability that he waits
1. less than 5 minutes for a bus;
2. at least 12 minutes for a bus.
Ans. = 1. 1/3, 2. 1/5.
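A Monte Carlo sketch of this example (modelling bus arrivals at t = 0, 15, 30 minutes past 7:00, as in the problem statement; the helper name wait is ours):

```python
import random

def wait(t):
    """Minutes until the next bus, for arrival time t past 7:00."""
    return 15 - t % 15

N = 10**6
samples = [wait(random.uniform(0, 30)) for _ in range(N)]

print(sum(w < 5 for w in samples) / N)    # ~ 1/3  (exact: 10/30)
print(sum(w >= 12 for w in samples) / N)  # ~ 1/5  (exact: 6/30)
```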
Mean and Variance of Uniform Distribution
Theorem
If X follows the uniform distribution over the interval [α, β], then
E[X] = (α + β)/2 and var(X) = (β − α)²/12.
Proof.
Mean:
E[X] = ∫_{−∞}^{∞} x fX(x) dx = (1/(β − α)) ∫_α^β x dx = (β² − α²)/(2(β − α)) = (α + β)/2.
Variance:
E[X²] = ∫_{−∞}^{∞} x² fX(x) dx = (1/(β − α)) ∫_α^β x² dx = (β³ − α³)/(3(β − α)) = (α² + αβ + β²)/3.
var(X) = E[X²] − (E[X])² = (α² + αβ + β²)/3 − (α + β)²/4 = (β − α)²/12.
MGF of Uniform Distribution
Theorem
The moment-generating function for the uniform distribution X over the interval [α, β] is
MX(t) = (e^{βt} − e^{αt}) / ((β − α)t) for t ∈ ℝ \ {0}, and MX(0) = 1.
Proof.
MX(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} fX(x) dx = (1/(β − α)) ∫_α^β e^{tx} dx
= (e^{βt} − e^{αt}) / ((β − α)t) for all t ∈ ℝ \ {0}, and MX(0) = 1.
Gamma Distribution
Definition
A random variable X follows the Gamma distribution with parameters λ > 0 and r > 0 if its PDF is
fX(x) = (λ / Γ(r)) (λx)^{r−1} e^{−λx} for x > 0, and 0 otherwise,
where Γ(r) = ∫_0^∞ t^{r−1} e^{−t} dt is the Gamma function.
Mean and Variance of Gamma Distribution
Theorem
If X follows the Gamma distribution with parameters λ > 0 and r > 0, then
E[X] = r/λ and var(X) = r/λ².
Proof.
Mean:
E[X] = ∫_0^∞ x fX(x) dx = (1/Γ(r)) ∫_0^∞ (λx)^r e^{−λx} dx
= (1/(λ Γ(r))) ∫_0^∞ t^r e^{−t} dt (taking λx = t)
= Γ(r + 1)/(λ Γ(r)) = r Γ(r)/(λ Γ(r)) = r/λ.
Variance:
E[X²] = ∫_0^∞ x² fX(x) dx = (1/(λ Γ(r))) ∫_0^∞ (λx)^{r+1} e^{−λx} dx
= (1/(λ² Γ(r))) ∫_0^∞ t^{r+1} e^{−t} dt (taking λx = t)
= Γ(r + 2)/(λ² Γ(r)) = (r + 1) r/λ².
var(X) = E[X²] − (E[X])² = (r + 1) r/λ² − r²/λ² = r/λ².
MGF of Gamma Distribution
Theorem
The moment-generating function for the Gamma distribution X with parameters λ > 0 and r > 0 is
MX(t) = (1 − t/λ)^{−r}, for t < λ.
Proof.
MX(t) = E[e^{tX}] = ∫_0^∞ e^{tx} (λ/Γ(r)) (λx)^{r−1} e^{−λx} dx
= (λ/Γ(r)) ∫_0^∞ (λx)^{r−1} e^{−(λ−t)x} dx
= (1/Γ(r)) (1 − t/λ)^{−r} ∫_0^∞ s^{r−1} e^{−s} ds (taking (λ − t)x = s)
= (1 − t/λ)^{−r} Γ(r)/Γ(r) = (1 − t/λ)^{−r}.
Exponential Distribution
Definition
A random variable X follows the exponential distribution with parameter λ > 0 (the Gamma distribution with r = 1) if its PDF is
fX(x) = λ e^{−λx} for x ≥ 0, and 0 otherwise.
Example
An electronic component is known to have a useful life represented by an exponential density with a failure rate of 10^{−5} failures per hour (i.e., λ = 10^{−5}). Determine the fraction of such components that would fail before the mean or expected life.
Ans.: 1 − e^{−1} ≈ 0.63212; that is, 63.212% would fail before the mean life.
Mean, Variance and MGF of Exponential Distribution
Theorem
If X follows the exponential distribution with parameter λ > 0, then (taking r = 1 in the Gamma distribution results)
E[X] = 1/λ, var(X) = 1/λ², and MX(t) = (1 − t/λ)^{−1} for t < λ.
Normal Distribution
Definition
A random variable X follows the normal distribution with parameters µ (−∞ < µ < ∞) and σ > 0 if its PDF is
fX(x) = (1/(σ√(2π))) e^{−(x − µ)²/(2σ²)}, −∞ < x < ∞.
[Figure: the bell-shaped density fX, symmetric about x = µ, with maximum value 1/(σ√(2π)).]
Properties of Normal Distribution
1. fX(x) ≥ 0 for all x, and ∫_{−∞}^{∞} fX(x) dx = 1.
2. lim_{x→−∞} fX(x) = 0 and lim_{x→∞} fX(x) = 0.
3. fX(µ − x) = fX(µ + x), that is, fX is symmetric about x = µ.
4. The maximum value of fX occurs at x = µ, that is,
max_{−∞<x<∞} fX(x) = fX(µ) = 1/(σ√(2π)) ≈ 0.39894/σ.
5. The points of inflection of fX are at x = µ ± σ.
Note
The shorthand notation X ~ N(µ, σ²) is often employed to indicate that the random variable X is normally distributed.
Mean and Variance of Normal Distribution
Theorem
If X follows the normal distribution with parameters µ (−∞ < µ < ∞) and σ > 0, then
E[X] = µ and var(X) = σ².
Proof.
Mean:
E[X − µ] = ∫_{−∞}^{∞} (x − µ) fX(x) dx = ∫_{−∞}^{∞} (x − µ) (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)} dx
= (σ/√(2π)) ∫_{−∞}^{∞} t e^{−t²/2} dt (taking (x − µ)/σ = t)
= 0 (odd integrand), ⟹ E[X] = µ.
Variance:
var(X) = E[(X − µ)²] = ∫_{−∞}^{∞} (x − µ)² fX(x) dx = ∫_{−∞}^{∞} (x − µ)² (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)} dx
= (σ²/√(2π)) ∫_{−∞}^{∞} t (t e^{−t²/2}) dt (taking (x − µ)/σ = t)
= (σ²/√(2π)) ∫_{−∞}^{∞} e^{−t²/2} dt (applying integration by parts)
= (σ²/√(2π)) √(2π) = σ².
MGF of Normal Distribution
Theorem
The moment-generating function for the normal distribution X with parameters µ (−∞ < µ < ∞) and σ > 0 is
MX(t) = e^{(σt)²/2 + µt} for all −∞ < t < ∞.
Proof.
MX(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} fX(x) dx = ∫_{−∞}^{∞} e^{tx} (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)} dx
= (1/(σ√(2π))) ∫_{−∞}^{∞} e^{−((x−µ)² − 2σ²tx)/(2σ²)} dx
= e^{(σt)²/2 + µt} (1/(σ√(2π))) ∫_{−∞}^{∞} e^{−(x−r)²/(2σ²)} dx, where r = µ + σ²t (completing the square),
= e^{(σt)²/2 + µt}.
The Normal Approximation to the Binomial Distribution
Let the random variable X denote the number of successes in n Bernoulli trials. We recall the binomial distribution as
pX(x) = C(n, x) q^x (1 − q)^{n−x}, x = 0, 1, 2, ..., n,
with mean µX = nq and variance σX² = nq(1 − q).
By Stirling's approximation,
log(pX(x) √(2π nq(1 − q))) ≈ −(x + 0.5) log(x/(nq)) − (n − x + 0.5) log((n − x)/(n(1 − q))).
Using the change of variable x = nq + √(nq(1 − q)) z, we have
x/(nq) = 1 + √((1 − q)/(nq)) z and (n − x)/(n(1 − q)) = 1 − √(q/(n(1 − q))) z,
and
log(pX(x) √(2π nq(1 − q))) = −(nq + √(nq(1 − q)) z + 0.5) log(1 + √((1 − q)/(nq)) z) − (n(1 − q) − √(nq(1 − q)) z + 0.5) log(1 − √(q/(n(1 − q))) z) ≈ −z²/2 + O(n^{−0.5}).
Hence, for large n,
pX(x) ≈ (1/√(2π nq(1 − q))) e^{−z²/2}, z = (x − nq)/√(nq(1 − q)),
that is, X is approximately N(nq, nq(1 − q)).
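A sketch illustrating the approximation numerically, with a continuity correction; the choice n = 40, q = 1/2 and the interval are our own illustration:

```python
from math import erf, sqrt, comb

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

n, q = 40, 0.5
mu, sigma = n * q, sqrt(n * q * (1 - q))

# P(15 <= X <= 25): exact binomial vs. normal with continuity correction.
exact = sum(comb(n, x) * q**x * (1 - q)**(n - x) for x in range(15, 26))
approx = Phi((25.5 - mu) / sigma) - Phi((14.5 - mu) / sigma)

print(exact, approx)  # both approximately 0.92 for these values
```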
Theorem
If X ~ N(µ, σ²), then for any constants a and b, b ≠ 0, the random variable Y = a + bX is also normally distributed, with Y ~ N(a + bµ, b²σ²). In particular, Z = (X − µ)/σ ~ N(0, 1), so that
P(X ≤ x) = Φ((x − µ)/σ),
where
Φ(z) := (1/√(2π)) ∫_{−∞}^{z} e^{−s²/2} ds
is called the Standard Normal Cumulative Distribution Function.
Standard Normal Cumulative Distribution Function Table
Values of Φ(z): the row label gives z to one decimal place; the column adds the second decimal (0.00 to 0.09).
z     0.00    0.01    0.02    0.03    0.04    0.05    0.06    0.07    0.08    0.09
2.1 0.98214 0.98257 0.983 0.98341 0.98382 0.98422 0.98461 0.985 0.98537 0.98574
2.2 0.9861 0.98645 0.98679 0.98713 0.98745 0.98778 0.98809 0.9884 0.9887 0.98899
2.3 0.98928 0.98956 0.98983 0.9901 0.99036 0.99061 0.99086 0.99111 0.99134 0.99158
2.4 0.9918 0.99202 0.99224 0.99245 0.99266 0.99286 0.99305 0.99324 0.99343 0.99361
2.5 0.99379 0.99396 0.99413 0.9943 0.99446 0.99461 0.99477 0.99492 0.99506 0.9952
2.6 0.99534 0.99547 0.9956 0.99573 0.99585 0.99598 0.99609 0.99621 0.99632 0.99643
2.7 0.99653 0.99664 0.99674 0.99683 0.99693 0.99702 0.99711 0.9972 0.99728 0.99736
2.8 0.99744 0.99752 0.9976 0.99767 0.99774 0.99781 0.99788 0.99795 0.99801 0.99807
2.9 0.99813 0.99819 0.99825 0.99831 0.99836 0.99841 0.99846 0.99851 0.99856 0.99861
3.0 0.99865 0.99869 0.99874 0.99878 0.99882 0.99886 0.99889 0.99893 0.99896 0.999
3.1 0.99903 0.99906 0.9991 0.99913 0.99916 0.99918 0.99921 0.99924 0.99926 0.99929
3.2 0.99931 0.99934 0.99936 0.99938 0.9994 0.99942 0.99944 0.99946 0.99948 0.9995
3.3 0.99952 0.99953 0.99955 0.99957 0.99958 0.9996 0.99961 0.99962 0.99964 0.99965
3.4 0.99966 0.99968 0.99969 0.9997 0.99971 0.99972 0.99973 0.99974 0.99975 0.99976
3.5 0.99977 0.99978 0.99978 0.99979 0.9998 0.99981 0.99981 0.99982 0.99983 0.99983
3.6 0.99984 0.99985 0.99985 0.99986 0.99986 0.99987 0.99987 0.99988 0.99988 0.99989
3.7 0.99989 0.9999 0.9999 0.9999 0.99991 0.99991 0.99992 0.99992 0.99992 0.99992
3.8 0.99993 0.99993 0.99993 0.99994 0.99994 0.99994 0.99994 0.99995 0.99995 0.99995
3.9 0.99995 0.99995 0.99996 0.99996 0.99996 0.99996 0.99996 0.99996 0.99997 0.99997
4.0 0.99997 0.99997 0.99997 0.99997 0.99997 0.99997 0.99998 0.99998 0.99998 0.99998
Example
If X is a normal random variable with mean µ = 3 and variance σ² = 16, find
1. P(X < 11),
2. P(X > 1),
3. P(2 < X < 7).
Ans.: 1. Φ(2) ≈ 0.9772, 2. Φ(1/2) ≈ 0.6915, 3. Φ(1) − Φ(−1/4) ≈ 0.4401.
https://eee.poriyaan.in/topic/normal-distributions--solved-example-problems-11271/
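Φ can be evaluated with the standard library's error function, which reproduces the answers above:

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 3, 4  # variance 16

print(Phi((11 - mu) / sigma))                         # P(X < 11)    ~ 0.9772
print(1 - Phi((1 - mu) / sigma))                      # P(X > 1)     ~ 0.6915
print(Phi((7 - mu) / sigma) - Phi((2 - mu) / sigma))  # P(2 < X < 7) ~ 0.4401
```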
13 Markov's and Chebyshev's Inequalities
Markov’s Inequality
13 Markov’s and Chebyshev’s Inequalities
Theorem
If X is a random variable that takes only non-negative values, then for any value a > 0
E[X]
P(X a)
a
proof. We have two Cases:
When X is discrete with mass pX When XZis continuous with density fX .
X X 1
E[X] = xpX (x)dx = xpX (x)dx E[X] = xfX (x)dx
x x 0 Z0 a Z
X X 1
= xpX (x)dx + xpX (x)dx = xfX (x)dx + xfX (x)dx
0xa x a Z0 1 a
Z
X X 1
xpX (x)dx apX (x)dx xfX (x)dx afX (x)dx
x X
a x a aZ a
1
=a pX (x)dx = aP(X a). =a fX (x)dx = aP(X a).
a
x a
Theorem
If X is a random variable with mean µX and variance var(X) = 2
X, then for any a > 0
2
X
P(|X µX | a)
a2
proof. Since (X µX )2 is a non-negative random variable, by applying Markov’s inequality we have
E[(x µX )2 ]
P (X µX )2 a2
a2
But since (X µX )2 a2 if and only if |X µX | a, the above equation is equivalent to
E[(x µX )2 ] 2
X
P |X µX | a =
a2 a2
Corollary
If X is a random variable with mean µX and variance var(X) = 2
X, then for any k > 0
1
P(|X µX | k X ) 2
k
116/118
Examples
Example
Suppose that it is known that the number of items produced in a factory during a week is
a random variable with the mean 50.
1. What can be said about the probability that this week’s production will exceed 75?
2. If the variance of a week’s production is known to equal 25, then what can be said
about the probability that this week’s production will be between 40 and 60?
Ans.: 1. By Markov's inequality, P(X > 75) ≤ 50/75 = 2/3. 2. By Chebyshev's inequality, P(40 < X < 60) = P(|X − 50| < 10) ≥ 1 − 25/100 = 3/4.
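The inequalities are bounds, not estimates; the simulation sketch below makes this concrete (modelling the weekly production as a normal distribution is our own illustrative assumption — the bounds hold for any distribution with the stated mean and variance):

```python
import random

# Illustrative model only: weekly production ~ Normal(mean 50, variance 25).
random.seed(0)
N = 10**6
xs = [random.gauss(50, 5) for _ in range(N)]

print(sum(x > 75 for x in xs) / N)       # far below the Markov bound 2/3
print(sum(40 < x < 60 for x in xs) / N)  # ~ 0.954, above the Chebyshev bound 3/4
```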
Thank you for listening!
Any questions?