
Probability and Statistics

Department of Mathematics, NIT Calicut

Topic
▶ Probability
    What is Probability?
    Elements of Probability
    Naive Definition
    Axiomatic Definition
▶ Random Variable
    Examples
    Definition
    Types of Random Variables
▶ Distribution Function
▶ Probability Function
▶ Distribution Function and Probability Function
    Examples
▶ Probability Density Function
▶ Some Characteristics of Distributions
▶ Functions of a Random Variable
▶ Expectation
▶ Moment-Generating Function
▶ Some Important Discrete Distributions
    Binomial Distribution
    Geometric Distribution
    Poisson Distribution
▶ Some Important Continuous Distributions
    Uniform Distribution
    Gamma Distribution
    Exponential Distribution
    Normal Distribution
▶ Markov's and Chebyshev's Inequalities
What is Probability?
1 Probability

• Probability is a branch of mathematics that studies random events or experiments.
• It measures the chance that a particular event will occur.
• The probability of an event can be calculated by dividing the number of favorable
outcomes by the total number of possible outcomes.
How can we define it?
• To define probability mathematically, we need to set an appropriate context.
• We will do so by defining the elements of probability.
Elements of Probability
1 Probability

Experiment
An experiment is a systematic and controlled process or activity carried out to gather data
or information about a particular phenomenon or system.
• Deterministic Experiment
An experiment or a process in which the outcome can be predicted with certainty
before it is observed or performed.
— For instance, if we add 5 and 3, we know the outcome will always be 8, and there is no
randomness involved.
• Random Experiment
An experiment or a process in which the outcome cannot be predicted with
certainty.
— For example, tossing a coin is a random experiment because we cannot predict the
outcome of the coin toss with certainty.

Elements of Probability
1 Probability

Sample Space
• The set of all possible outcomes of a random experiment is called the sample space
and is represented by the symbol S.
— For example, the sample space of a coin toss is S = {H, T}, where H represents the
outcome of getting heads and T represents the outcome of getting tails.
• Each outcome in a sample space is called an element or a member of the sample
space, or simply a sample point.

Event
• An event is a subset of a sample space and is denoted by the symbol E.
— For example, the event of getting heads in a coin toss is E = {H}.
• Equally Likely Events
Equally likely events are events that have the same theoretical probability (or
likelihood) of occurring.
Naive Definition
1 Probability

Naive Definition of Probability

• If the sample space S is finite, then the probability of an event E is the likelihood of
the event E in S, so it can be defined as

P(E) = #E / #S,

where #A denotes the number of elements in the set A and P(E) denotes the
probability of E.
— For example, the probability of getting heads in a fair coin toss is 1/2.
• P(S) = 1, P(∅) = 0 and 0 ≤ P(E) ≤ 1, for any event E in S.
• Is this definition okay?
— The elements mentioned above form the basis of probability, but we need to develop a
strong foundation using set theory and counting principles to define it rigorously.
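To make the naive definition concrete, here is a minimal Python sketch (not from the slides): it computes P(E) = #E/#S by enumerating a finite sample space, here a roll of two fair dice.

from itertools import product

S = list(product(range(1, 7), repeat=2))   # sample space of a two-dice roll
E = [e for e in S if sum(e) == 7]          # event: the up faces sum to 7

print(len(E), "/", len(S), "=", len(E) / len(S))  # 6 / 36 = 1/6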
Set Theory
1 Probability

Theorem
If an event E cannot happen, i.e., E = ∅ is the empty set, then

P(∅) = 0.

Theorem
If E^c is the complement of an event E with respect to S, then P(E^c) = 1 − P(E).

Definition (Mutually Exclusive or Disjoint Events)

Two events E1 and E2 are mutually exclusive, or disjoint, if

E1 ∩ E2 = ∅,

that is, if E1 and E2 have no elements in common.
Set Theory
1 Probability

Theorem
Suppose E1 and E2 are any two events in S, then

P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2).

Theorem
Suppose E1 and E2 are two mutually exclusive events in S, then

P(E1 ∪ E2) = P(E1) + P(E2).
Set Theory
1 Probability

Theorem
Suppose E1 and E2 are any two events in S such that E1 ⊂ E2, then

P(E1) ≤ P(E2).

Theorem
Suppose E1, E2 and E3 are any three events in S, then

P(E1 ∪ E2 ∪ E3) = P(E1) + P(E2) + P(E3) − P(E1 ∩ E2) − P(E2 ∩ E3) − P(E1 ∩ E3) + P(E1 ∩ E2 ∩ E3).
Set Theory
1 Probability

Theorem
Suppose E1, E2, . . . , En are any n events in S, then

P(∪_{i=1}^{n} Ei) = Σ_{i=1}^{n} P(Ei) + (−1)^1 Σ_{i<j} P(Ei ∩ Ej) + (−1)^2 Σ_{i<j<k} P(Ei ∩ Ej ∩ Ek)
                    + · · · + (−1)^{n−1} P(∩_{i=1}^{n} Ei).

Definition
The events E1, E2, . . . , En in S are mutually exclusive (or mutually disjoint) if and only if

P(∪_{i=1}^{n} Ei) = Σ_{i=1}^{n} P(Ei).
Axiomatic Definition of Probability
1 Probability

Definition
The axiomatic definition of probability includes the following four axioms:
• For any event E in a sample space S, 0 ≤ P(E) ≤ 1.
• P(S) = 1.
• For any finite number n of mutually exclusive events E1, . . . , En in S,

P(∪_{i=1}^{n} Ei) = Σ_{i=1}^{n} P(Ei).

• For any denumerable sequence of mutually exclusive events E1, E2, . . . , En, . . . in S,

P(∪_{i=1}^{∞} Ei) = Σ_{i=1}^{∞} P(Ei).
Counting
1 Probability

Multiplication Rule
Suppose we consider a compound experiment E consisting of two experiments E1 and E2.
If #S1 = n1 and #S2 = n2 are the sample sizes (numbers of outcomes) of E1 and E2,
respectively, then there are #S = n1 · n2 outcomes to E.

Example
Consider tossing a true coin two times (E1: first toss, E2: second toss). A tree diagram
branches H/T on the first toss and H/T again on the second, giving the outcomes
HH, HT, TH, TT, so
#S = #S1 · #S2 = 2 · 2 = 4.
Counting
1 Probability

Generalized Multiplication Rule

Suppose we consider a compound experiment E consisting of k experiments E1, . . . , Ek. If
the sample spaces S1, . . . , Sk contain n1, . . . , nk outcomes of the experiments
E1, . . . , Ek, respectively, then there are n1 · n2 · . . . · nk outcomes to E.

Example
A true coin is tossed three times. Using a tree diagram, find all possible outcomes.
Ans. 8.

Example
Suppose we toss a true coin and cast a true die. Using a tree diagram, find all possible
outcomes.
Ans. 12.
Permutation and Combination
1 Probability

Choose k objects from n objects:

                        Order Matters            Order Doesn't Matter
With Replacement        n^k                      C(n+k−1, k)
Without Replacement     nPk = n!/(n−k)!          C(n, k) = n!/(k!(n−k)!)
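A minimal Python sketch (not from the slides) evaluating all four entries of the counting table for small n and k, using only the standard library:

from math import comb, perm

n, k = 4, 2
print(perm(n, k))          # ordered, without replacement: nPk = 12
print(n ** k)              # ordered, with replacement: n^k = 16
print(comb(n, k))          # unordered, without replacement: C(n, k) = 6
print(comb(n + k - 1, k))  # unordered, with replacement: C(n+k-1, k) = 10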
Conditional Probability and Product Rule
1 Probability

Definition (Conditional Probability)

The conditional probability of event E2, given that the event E1 has already occurred, is
denoted by P(E2|E1) and defined as

P(E2|E1) = P(E2 ∩ E1) / P(E1),

provided that P(E1) > 0.

Product Rule
For any two events E1 and E2,

P(E2 ∩ E1) = P(E2|E1) · P(E1),

provided P(E1) > 0.
Independent Events
1 Probability

Definition
Two events E1 and E2 are said to be independent if

P(E1|E2) = P(E1) and P(E2|E1) = P(E2),

provided P(E1) ≠ 0 and P(E2) ≠ 0.

Theorem
If events E1 and E2 are independent, then

P(E1 ∩ E2) = P(E1) · P(E2),

provided P(E1) > 0.
Independent Events
1 Probability

Theorem
If E1 and E2 are independent events, then the following hold:
1. E1 and E2^c are independent events.
2. E1^c and E2^c are independent events.
3. E1^c and E2 are independent events.

Definition
The n events E1, E2, . . . , En are mutually independent if and only if the probability of the
intersection of any k (k = 2, 3, . . . , n) of these events is the product of their respective
probabilities, that is, for k = 2, 3, . . . , n,

P(E_{i1} ∩ E_{i2} ∩ . . . ∩ E_{ik}) = P(E_{i1}) · P(E_{i2}) · . . . · P(E_{ik}).
Partition of the Sample Space
1 Probability

Definition
If E1, E2, . . . , En are mutually exclusive events in the sample space S, then E1, E2, . . . , En
form a partition of S if

E1 ∪ E2 ∪ . . . ∪ En = S.
Rule of Elimination or Total Probability, and Bayes' Theorem
1 Probability

Theorem
If events E1, E2, . . . , En represent a partition of the sample space S, then the total
probability of an arbitrary event E in S is

P(E) = Σ_{i=1}^{n} P(Ei) · P(E|Ei).

Theorem (Bayes' Theorem)

If events E1, E2, . . . , En represent a partition of the sample space S, and E is an arbitrary
event in S, then

P(Ek|E) = P(Ek) · P(E|Ek) / Σ_{i=1}^{n} P(Ei) · P(E|Ei),  for k = 1, 2, . . . , n.
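A minimal Python sketch (not from the slides) of both theorems on a hypothetical partition: three machines E1, E2, E3 produce 50%, 30%, 20% of the items, with defect rates 1%, 2%, 3%; these numbers are illustrative assumptions.

prior = [0.5, 0.3, 0.2]          # P(Ei): the partition of S
likelihood = [0.01, 0.02, 0.03]  # P(E|Ei): defect probability per machine

# Rule of total probability: P(E) = sum_i P(Ei) * P(E|Ei)
p_defect = sum(p * l for p, l in zip(prior, likelihood))

# Bayes' theorem: P(Ek|E) = P(Ek) * P(E|Ek) / P(E)
posterior = [p * l / p_defect for p, l in zip(prior, likelihood)]
print(p_defect)   # 0.017
print(posterior)  # [0.294..., 0.352..., 0.352...]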
Examples
2 Random Variable

Example
Experiment: Consider tossing two fair coins and observe the up faces.
• It is a random experiment with 4 outcomes.
— So the sample space is: S = {HH, HT, TH, TT}.
• Let X(e), for outcome e ∈ S, represent the number of heads obtained in a toss of the
two coins; then X(e) = 2 or 1 or 0.

Outcome e in S      X(e) in RX
HH                  2
HT, TH              1
TT                  0

X : random variable.  RX : range of X.
Definition
2 Random Variable
What is a Random Variable?
Definition
Let S be the sample space for an experiment and X : S → R be a real-valued function that
assigns a real number X(e) to every outcome e ∈ S; then X is called a random variable.

OR

A random variable is a real-valued function whose domain is the sample space of a
random experiment.

• Random variables are denoted by capital letters X, Y, and so on to distinguish them
from their possible values given in lowercase x, y.
• A random variable which takes only two values 0 and 1 is called a Bernoulli random
variable.
Examples of Random Variables
2 Random Variable

Example
• Experiment: Roll a pair of dice and observe the up faces.
— The sample space:
S = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
     (2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
     (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
     (4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
     (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
     (6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}.
— The attribute we are observing is whether the sum of the up faces is 7.
— Then the random variable X : S → R is given by
X(e) = 1, if e ∈ {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)},
       0, otherwise.
Examples of Random Variables
2 Random Variable

Example
• Experiment: Suppose that an individual purchases two electronic components, each
of which may be either defective (d) or acceptable (a).
— The sample space: S = {(d, d), (d, a), (a, d), (a, a)}.
— The attribute we are observing is the number of acceptable components.
— Then the random variable X : S → R is given by
X(e) = 2, if e ∈ {(a, a)},
       1, if e ∈ {(d, a), (a, d)},
       0, otherwise.
Examples of Random Variables
2 Random Variable

Example
• Experiment: A cathode ray tube is manufactured, put on life test, and aged to failure.
The elapsed time (in hours) at failure is recorded.
— The sample space: S = {t : t ≥ 0} = [0, ∞).
— The attribute we are observing is the time to failure.
— Then the random variable X : S → R is given by X(t) = t.

Example
• Experiment: In a particular chemical plant the volume produced per day for a
particular product ranges between a minimum value a and a maximum value b,
which corresponds to the capacity. A day is randomly selected and the amount
produced is observed.
— The sample space: S = {x : a ≤ x ≤ b} = [a, b].
— The attribute we are observing is the amount produced.
— Then the random variable X : S → R is given by X(t) = t.
Types of Random Variables
2 Random Variable

1. Discrete Random Variable

A discrete random variable is a random variable that takes on a countable number of
possible values.
— The values that a discrete random variable can take on are typically integers or a finite
set of values.
— For Example:
— Consider tossing two coins.
Here S = {HH, HT, TH, TT}.
Define X : S → R by
X(A) = (number of heads in A) − 1.
X takes 3 values: −1, 0 and 1. So it is a discrete random variable.
— The number of defective items in a production run.
OR
The number of heads in a series of coin flips.
Types of Random Variables
2 Random Variable

2. Continuous Random Variable

A continuous random variable is a random variable whose range consists of an
interval of values.
— The probability that the random variable takes any single particular value is 0.
— For Example:
— Consider a bulb A and observe the duration of time for which it stays switched on.
Here S = [0, ∞).
Define X : S → R by
X(t) = time taken for the bulb A to go off.
X has infinitely many possible values; it may take any value in [0, ∞), so it is a
continuous random variable.
— The height or weight of a person.
OR
The amount of rainfall in a given day.
Important Points and Notations
2 Random Variable

Let S be a sample space of an experiment and X a random variable with range space RX.
Definition
If A is an event in S (that is, A ⊂ S) and B is an event in RX (that is, B ⊂ RX), then A and B
are equivalent events if

A = {e ∈ S : X(e) ∈ B}.

More simply, if event A in S consists of all outcomes e in S for which X(e) ∈ B, then A and
B are equivalent. Whenever A occurs, B occurs, and vice versa.
Important Points and Notations
2 Random Variable

Definition
If A is an event in S and B is an event in RX, then we define the probability of B as

PX(B) = P(A), where A = {e ∈ S : X(e) ∈ B}.

• In the above definition, A = {e ∈ S : X(e) ∈ B} = X^{−1}(B).
• Parentheses (X = x), braces {X = x}, or square brackets [X = x] are used to
denote an event in RX, where x ∈ RX.
• Thus (X = x), (X < x), (X ≤ x), (x < X), (x ≤ X) and (x ≤ X ≤ y) are all events in
RX, where x, y ∈ RX.
• Here, the event (X = x) in RX is equivalent to the event {e ∈ S : X(e) = x} in S.
• Thus, PX(X = x) = P(A), where A = {e ∈ S : X(e) = x}.
Important Points and Notations
2 Random Variable

Example
Experiment: Toss two coins and observe the up faces.
• It is a random experiment with 4 outcomes.
— The sample space is: S = {HH, HT, TH, TT}.
• The random variable X represents the number of heads in a toss of two coins.
• RX = {0, 1, 2}.

Events in RX    Events in S    PX(X = xi)
X = 2           {HH}           1/4
X = 1           {HT, TH}       2/4
X = 0           {TT}           1/4

Σ_{i=1}^{#RX} PX(X = xi) = 1
Important Points
2 Random Variable

Example
• Experiment: Roll a pair of fair dice and observe the up faces.
— The random variable X represents the sum of the up faces in a roll of two fair dice.
— RX = {2, 3, . . . , 12}.

Events (X = xi) in RX    Events in S                                       PX(X = xi)
X = 2                    {(1, 1)}                                          1/36
X = 3                    {(1, 2), (2, 1)}                                  2/36
X = 4                    {(1, 3), (2, 2), (3, 1)}                          3/36
X = 5                    {(1, 4), (2, 3), (3, 2), (4, 1)}                  4/36
X = 6                    {(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)}          5/36
X = 7                    {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}  6/36
X = 8                    {(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)}          5/36
X = 9                    {(3, 6), (4, 5), (5, 4), (6, 3)}                  4/36
X = 10                   {(4, 6), (5, 5), (6, 4)}                          3/36
X = 11                   {(5, 6), (6, 5)}                                  2/36
X = 12                   {(6, 6)}                                          1/36

Σ_{i=1}^{#RX} PX(X = xi) = 1
Important Points
2 Random Variable
• Let S be a sample space for an experiment and X be a discrete random variable on S
with range space RX = {x1, x2, . . .}; then

1 = P(S) = PX(∪_{i=1}^{#RX} (X = xi)) = Σ_{i=1}^{#RX} PX(X = xi).

• If X and Y are random variables, then c1 X + c2 Y, c1, c2 ∈ R, is also a random
variable.
• If X is a random variable on a sample space S, then
1. 1/X is also a random variable, where (1/X)(e) := 1 if X(e) = 0, e ∈ S.
2. X^+(e) := max{0, X(e)}, for e ∈ S. Then X^+ is a random variable.
3. X^−(e) := min{0, X(e)}, for e ∈ S. Then X^− is a random variable.
4. |X| is a random variable.
• If X and Y are random variables, then max{X, Y} and min{X, Y} are also random
variables.
Definition
3 Distribution Function

Definition
Let X be a random variable; the function FX : R → R defined by

FX(x) := PX(X ≤ x)

is called the distribution function, or cumulative function, or cumulative distribution
function (CDF) of the random variable X.
Distribution Function
3 Distribution Function

Example
In the case of the three-times coin-tossing experiment, the range of the random variable X is
RX = {0, 1, 2, 3}, where X represents the number of heads. Find the distribution function for the
following table:

Events X = xi in RX    PX(X = xi)
X = 0                  1/8
X = 1                  3/8
X = 2                  3/8
X = 3                  1/8

Ans. FX(x) = 0,    x < 0,
             1/8,  0 ≤ x < 1,
             4/8,  1 ≤ x < 2,
             7/8,  2 ≤ x < 3,
             1,    x ≥ 3.
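A minimal Python sketch (not from the slides): it builds the CDF of the three-coin-toss example from its PMF as a right-continuous step function and evaluates it at a few points.

from bisect import bisect_right

xs = [0, 1, 2, 3]              # values of X (number of heads)
pmf = [1/8, 3/8, 3/8, 1/8]     # PX(X = x)

def cdf(x):
    """F_X(x) = sum of pmf over all xi <= x."""
    k = bisect_right(xs, x)
    return sum(pmf[:k])

for x in [-1, 0, 0.5, 1, 2, 3, 10]:
    print(x, cdf(x))           # 0, 1/8, 1/8, 4/8, 7/8, 1, 1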
Definition
4 Probability Function

Definition (Probability Function or Probability Mass Function)

Let X be a random variable with range space RX = {x1, x2, . . . , xn, . . .}. The function pX
which associates

pX(x) = PX(X = x), for x ∈ RX,

and satisfies the following conditions
1. pX(xi) ≥ 0 for all i,
2. Σ_{i=1}^{∞} pX(xi) = 1,
is called the probability function or probability mass function (PMF) or discrete density
function of X.
Probability Distribution
4 Probability Function

Definition (Probability Distribution)

The collection of pairs (xi, pX(xi)), i = 1, 2, . . . , n, . . . , is called the probability distribution of the
discrete random variable X.
• The probability distribution is represented in tabular, graphical or mathematical form.

Example
In the case of the three-times coin-tossing experiment, the range of the random variable X is
RX = {0, 1, 2, 3}, where X represents the number of heads.

x    p(x)
0    1/8
1    3/8
2    3/8
3    1/8

Table: Probability distribution table.
Properties
5 Distribution Function and Probability Function

• 0 ≤ FX(x) ≤ 1, for all x ∈ R.
• lim_{x→∞} FX(x) = 1 and lim_{x→−∞} FX(x) = 0.
• The function FX is non-decreasing, that is, if x1 ≤ x2 then FX(x1) ≤ FX(x2).
• The function is continuous from the right, that is, for all x and δ > 0,

lim_{δ→0} FX(x + δ) = FX(x).

• pX(xi) = FX(xi) − FX(x_{i−1}), i > 1.
• FX(xi) = PX(X ≤ xi) = Σ_{x ≤ xi} pX(x).
Examples
5 Distribution Function and Probability Function

Example
A discrete random variable X has the following probability distribution table:

x       0    1    2     3     4     5      6       7
p(x)    0    k    2k    2k    3k    k^2    2k^2    7k^2 + k

Then
1. Find k. Ans. k = 1/10.
2. Evaluate P(X < 6), P(X ≥ 6) and P(0 < X < 5). Ans. 81/100, 19/100 and 4/5.
3. If P(X ≤ a) > 1/6, find the minimum value of a. Ans. a = 2.
4. Determine the distribution function of X.
Examples
5 Distribution Function and Probability Function

Example
If p(x) = x/15,  x = 1, 2, 3, 4, 5,
         0,      otherwise.
Then find
1. P(X = 1 or 2). Ans. 1/5.
2. P(1/2 < X < 5/2 | X > 1). Ans. 1/7.
Definition
6 Probability Density Function

Definition (Probability Density Function (PDF))

Let X be a continuous random variable. Define fX(x) as follows:

fX(x) := lim_{δ→0⁺} P(x < X ≤ x + δ)/δ  (whenever the limit exists);

then fX(x) gives us the probability density at the point x. If fX(x) exists for all x ∈ R, then the
function fX is called the probability density function.

Consider a continuous random variable X with a differentiable CDF FX; then

fX(x) = d/dx FX(x).

Hence, it follows that

PX(X ≤ x) = FX(x) = ∫_{−∞}^{x} fX(s) ds.
Properties
6 Probability Density Function

Let fX be the PDF of a continuous random variable X; then

• fX(x) ≥ 0, for all x ∈ RX, and fX(x) = 0, if x ∉ RX.
• ∫_{−∞}^{∞} fX(x) dx = 1.
• fX is piece-wise continuous.
• For δ > 0, we have for any value x0
  PX(X = x0) = lim_{δ→0} P(x0 ≤ X ≤ x0 + δ) = lim_{δ→0} [FX(x0 + δ) − FX(x0)] = 0.
• P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b).
• FX(x) = ∫_{−∞}^{x} fX(s) ds.
• P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a) = ∫_{a}^{b} fX(x) dx.
Examples
6 Probability Density Function

Let X be a continuous random variable with the following PDF:

fX(x) = 3 exp(−λx),  x ≥ 0,
        0,           otherwise,

where λ is a positive constant.
1. Find the value of λ.
2. Find the CDF of X, FX.
3. Find PX(1 < X < 3).
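A minimal Python sketch (not from the slides), under the reconstructed density fX(x) = 3·exp(−λx): normalization gives ∫₀^∞ 3e^{−λx} dx = 3/λ = 1, so λ = 3, FX(x) = 1 − e^{−3x}, and PX(1 < X < 3) = FX(3) − FX(1).

from math import exp

lam = 3.0  # forced by the normalization 3/lam = 1 (assumes the reconstruction above)
F = lambda x: 1 - exp(-lam * x) if x >= 0 else 0.0

print(F(3) - F(1))          # ~0.04966
print(exp(-3) - exp(-9))    # same value, written directly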
Mean
7 Some Characteristics of Distributions

The mean µX of a random variable X.

µX = Σ_i xi pX(xi),             for discrete X,
µX = ∫_{−∞}^{∞} x fX(x) dx,     for continuous X.

This measure provides an indication of central tendencies in the random variable X.

Example
In the case of the three-times coin-tossing experiment, the range of the random variable
X is RX = {0, 1, 2, 3}, where X represents the number of heads. Find the mean of X.

Sol.
x            0     1     2     3
PX(X = x)    1/8   3/8   3/8   1/8

µX = 0 × 1/8 + 1 × 3/8 + 2 × 3/8 + 3 × 1/8 = 3/2.
Mean
7 Some Characteristics of Distributions

Example
Let X be a continuous random variable with PDF
fX(x) = 3/x^4,  x ≥ 1,
        0,      otherwise.
Find the mean of X.

Sol. µX = ∫_{1}^{∞} x fX(x) dx = ∫_{1}^{∞} (3/x^3) dx = 3/2.
Median
7 Some Characteristics of Distributions

The median mX of a random variable X

In case of continuous X:
The median divides the entire distribution into two equal parts, that is,

∫_{−∞}^{mX} fX(x) dx = ∫_{mX}^{∞} fX(x) dx = 1/2.

Thus it can be calculated using the relations:

∫_{−∞}^{mX} fX(x) dx = 1/2  and  ∫_{mX}^{∞} fX(x) dx = 1/2.

In case of discrete X:
The median mX = (x1 + x2)/2, where
• x1 is the greatest value X can take such that P(X ≤ x1) ≤ 0.5,
• x2 is the smallest value X can take such that P(X ≤ x2) ≥ 0.5.
Median
7 Some Characteristics of Distributions

Example
Let X be a continuous random variable with PDF fX(x) = 3/x^4 for x ≥ 1, and 0 otherwise.
Find the median of X.

Sol. Clearly mX > 1; then ∫_{1}^{mX} fX(x) dx = 1/2 = ∫_{mX}^{∞} fX(x) dx ⟹ mX = 2^{1/3}.

Example
In the case of the three-times coin-tossing experiment, where X represents the number of heads,
find the median of X.

Sol.
x            0     1     2     3
PX(X = x)    1/8   3/8   3/8   1/8

P(X ≤ 1) = P(X = 0) + P(X = 1) = 1/8 + 3/8 = 1/2 ≤ 0.5 and
P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2) = 1/8 + 3/8 + 3/8 = 7/8 ≥ 0.5. Thus, mX = (1 + 2)/2 = 1.5.
Mode
7 Some Characteristics of Distributions

The mode MX of a random variable X

In case of continuous X:
Given a continuous random variable X, its mode MX is the value of X that is most likely to
occur.
Consequently, the mode is equal to the value x ∈ RX at which the probability density
function, fX(x), reaches a maximum.

In case of discrete X:
Given a discrete random variable X, its mode MX is the value of X that is most likely to
occur.
Consequently, the mode is equal to the value x ∈ RX at which the probability mass
function, pX(x) = P(X = x), reaches a maximum.
Variance and Standard Deviation
7 Some Characteristics of Distributions

The variance var(X) of a random variable X.

var(X) = Σ_i (xi − µX)^2 pX(xi),            for discrete X,
       = ∫_{−∞}^{∞} (x − µX)^2 fX(x) dx,    for continuous X.

This measure describes the spread or dispersion of the probability associated with the
elements in RX.

The standard deviation σX of a random variable X.

The standard deviation, σX, of a random variable X tells us how far away from the mean
µX we can expect the values of X to be. It is the positive square root of the variance of X,
that is,

σX = √var(X).

It is noted that a small value of σX indicates little dispersion, whereas a large value of σX
indicates a greater dispersion.
Variance and Standard Deviation
7 Some Characteristics of Distributions

Example
Let X be a continuous random variable with PDF
fX(x) = 3/x^4,  x ≥ 1,
        0,      otherwise.
Find the variance and standard deviation of X.

Sol. var(X) = ∫_{1}^{∞} (x − µX)^2 fX(x) dx = ∫_{1}^{∞} (x − 3/2)^2 (3/x^4) dx
            = 3 ∫_{1}^{∞} (1/x^2 − 3/x^3 + 9/(4x^4)) dx = 3/4.

σX = √var(X) = √3/2.
Moment about the Origin and Moment about the Mean
7 Some Characteristics of Distributions

The origin moment / kth origin moment µ'_k of a random variable X.

The kth moment about the origin, or origin moment, is given by
µ'_k = Σ_i xi^k pX(xi),             for discrete X,
     = ∫_{−∞}^{∞} x^k fX(x) dx,     for continuous X.

The central moment / kth central moment µ_k of a random variable X.

The moment about the mean is called the central moment and is given by
µ_k = Σ_i (xi − µX)^k pX(xi),            for discrete X,
    = ∫_{−∞}^{∞} (x − µX)^k fX(x) dx,    for continuous X.

Note that: µX = µ'_1 and var(X) = µ_2.
Functions of a Discrete Random Variable
8 Functions of a Random Variable

• Let X be a discrete random variable (defined on a sample space S) with range space
RX = {x1, x2, . . .} and H be a real-valued function on a domain that contains RX.
— Then, the composition function H ∘ X : S → R given by (H ∘ X)(e) = H(X(e)) is well
defined.
— The function Y = H ∘ X = H(X) is also a discrete random variable.
— Let RY = {y1, y2, . . .} be the range space of Y.
— Suppose Ω_i := H^{−1}(yi) = {x ∈ RX : H(x) = yi}; then the probability mass function
(PMF) for Y at yi is given by

pY(yi) = PY(Y = yi) = Σ_{x ∈ Ω_i} pX(X = x),

where pX is the PMF of X.
Functions of a Discrete Random Variable
8 Functions of a Random Variable

Example
In the case of the three-times coin-tossing experiment, the range of the random variable X is
RX = {0, 1, 2, 3}, where X represents the number of heads.

x            0     1     2     3
PX(X = x)    1/8   3/8   3/8   1/8

1. If Y = 2X − 1, find the PMF pY of Y.
2. If Y = |X − 2|, find the PMF pY of Y.

Sol.
1. pY(y) = PY(Y = y) = PX(X = (y + 1)/2), then

   y        −1    1     3     5
   pY(y)    1/8   3/8   3/8   1/8

2. pY(y) = PY(Y = y) = PX((X = y + 2) or (X = 2 − y)), then

   y        0     1     2
   pY(y)    3/8   1/2   1/8
Continuous Functions of a Continuous Random Variable
8 Functions of a Random Variable

• Let X be a continuous random variable with probability density function (PDF) fX and
H be a real-valued function on a domain that contains RX.
— Then, the function Y = H ∘ X = H(X) is also a continuous random variable.
— Let fY be the PDF of Y. fY may be found by performing the following three steps:
1. Obtain the CDF, FY(y) = PY(Y ≤ y) = PX(B), where B is the event in RX equivalent to
(Y ≤ y), that is, B := {x ∈ RX : H(x) ≤ y}.
2. Then obtain fY(y) = d/dy FY(y).
3. Find the range space of the new random variable Y.
Continuous Functions of a Continuous Random Variable
8 Functions of a Random Variable

Example
Suppose the random variable X has the following PDF:
fX(x) = x/8,  0 ≤ x ≤ 4,
        0,    otherwise.
If Y = H(X), find the PDF fY of Y, where H(x) = 2x + 8.

Sol. FY(y) = PY(Y ≤ y) = PX(2X + 8 ≤ y) = PX(X ≤ (y − 8)/2).

Thus, FY(y) = ∫_{−∞}^{(y−8)/2} fX(x) dx = ∫_{0}^{(y−8)/2} fX(x) dx = (y^2 − 16y + 64)/64.

fY(y) = d/dy FY(y) = (y − 8)/32.

fY(y) = (y − 8)/32,  8 ≤ y ≤ 16,
        0,           otherwise.
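A minimal Monte Carlo sketch (not from the slides) checking the derived density: X is sampled by inverse transform (FX(x) = x²/16 on [0,4], so X = 4√U for uniform U), and the empirical probability of an interval of Y = 2X + 8 is compared with the integral of (y − 8)/32.

import random
from math import sqrt

random.seed(0)
n = 200_000
ys = [2 * (4 * sqrt(random.random())) + 8 for _ in range(n)]

# empirical P(8 <= Y <= 12) vs the exact integral of (y-8)/32 over [8, 12]
emp = sum(8 <= y <= 12 for y in ys) / n
exact = (12 - 8) ** 2 / 64       # = 0.25
print(emp, exact)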
Definition
9 Expectation

Definition (Expectation)
Let X be a random variable and Y = H(X) be a function of X; then the expected value of
H(X) is defined as follows:

E[Y] = E[H(X)] := Σ_i H(xi) · pX(xi),           if X is a discrete random variable,
                  ∫_{−∞}^{∞} H(x) · fX(x) dx,   if X is a continuous random variable.

Note that:
• In the above definition, in the case where X is a continuous random variable, we
restrict H so that Y = H(X) is a continuous random variable.
• We assume that, in the above definition, the right-hand side sum and the integral
exist (are finite).
Some Important Observations
9 Expectation

• If H(X) = X, then E[X] = µX (mean of X).
• If H(X) = (X − µX)^2, then E[(X − µX)^2] = var(X) (variance of X).
— Thus, var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2.
— In general, var(H(X)) = E[(H(X) − E[H(X)])^2].
• The kth moment about the origin µ'_k = E[X^k] (origin moment of X).
• The kth moment about the mean µ_k = E[(X − E[X])^k] (central moment of X).
• (E[X])^2 ≤ (E[|X|])^2 ≤ E[X^2].
• Let a, b ∈ R be constants; then
— E[a] = a,
— E[aX + b] = aE[X] + b,
— var(aX + b) = a^2 var(X),
— if P(a < X ≤ b) = 1, then a < E[X] ≤ b.
Definition
10 Moment-Generating Function

Definition (Moment-Generating Function (MGF))

Given a random variable X, the moment-generating function MX(t) of its probability
distribution is the expected value of exp(tX). Expressed mathematically,

MX(t) := E[exp(tX)] = Σ_i exp(t xi) · pX(xi),          if X is a discrete random variable,
                      ∫_{−∞}^{∞} exp(tx) · fX(x) dx,   if X is a continuous random variable.

The domain of MX is the set of all real numbers t such that E[exp(tX)] exists.

Remark
For real constants a and b,

M_{aX+b}(t) = E[exp(t(aX + b))] = exp(bt) E[exp(atX)] = exp(bt) MX(at).
MGF and Moments
10 Moment-Generating Function

Theorem (Uniqueness Theorem)

The MGF of a random variable X (or a distribution), if it exists, uniquely determines the
distribution.

Theorem
If X has moment-generating function MX(t), with MX(t) < ∞ for |t| < a for some a > 0,
then the distribution of X is determined uniquely, and

MX(t) = Σ_{k=0}^{∞} E[X^k] t^k / k! = Σ_{k=0}^{∞} µ'_k t^k / k!,

where µ'_k is the kth moment about the origin. In this case,

µ'_k = d^k/dt^k MX(t) |_{t=0} = E[X^k exp(tX)] |_{t=0}.
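A minimal symbolic sketch (not from the slides) of this theorem, recovering moments by differentiating an MGF at t = 0; it uses the exponential MGF MX(t) = (1 − t/λ)^{−1}, which is derived later in these notes.

import sympy as sp

t, lam = sp.symbols("t lambda", positive=True)
M = (1 - t / lam) ** -1             # MGF of Exp(lambda), valid for t < lambda

m1 = sp.diff(M, t, 1).subs(t, 0)    # first moment  E[X]   = 1/lambda
m2 = sp.diff(M, t, 2).subs(t, 0)    # second moment E[X^2] = 2/lambda^2
print(m1, m2, sp.simplify(m2 - m1**2))  # variance = 1/lambda^2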
Limitations of MGF
10 Moment-Generating Function

A random variable X may have no moments, but the MGF exists.
E.g., consider a discrete random variable X with PMF:

pX(x) = 1/(x(x + 1)),  x = 1, 2, 3, . . . .

X has no moments, but its MGF is

MX(t) = 1 + (exp(−t) − 1) log(1 − exp(t)), for t < 0.

A random variable X may have an MGF and some or all moments, yet the MGF
does not generate the moments.
E.g., consider a discrete random variable X with PMF:

pX(2^x) = exp(−1)/x!,  x = 0, 1, 2, 3, . . . .

A random variable X may have some or all moments, but the MGF does not exist
except possibly at one point.
E.g., consider a discrete random variable X with PMF:

pX(±2^x) = exp(−1)/(2 · x!),  x = 0, 1, 2, 3, . . . .
Definition
11 Some Important Discrete Distributions

Definition (Binomial Distribution)

The random variable X that denotes the number of successes in n Bernoulli trials has the
binomial distribution given by

pX(x) = C(n, x) (1 − q)^{n−x} q^x;  x = 0, 1, 2, . . . , n,

where the parameter 0 ≤ q ≤ 1. Sometimes, it is also denoted by b(x; n, q).
For example: pX(x) = PX(X = "x successes in n = 3 trials").

Outcome in S     X(outcome) in RX
FFF              0
FFS, FSF, SFF    1
FSS, SFS, SSF    2
SSS              3

S : success.  F : failure.
Mean and Variance of Binomial Distribution
11 Some Important Discrete Distributions

Theorem
If X follows the binomial distribution with parameter 0 < q < 1, then
E[X] = nq and var(X) = nq(1 − q).

Proof.
Mean:
E[X] = Σ_{x=0}^{n} x pX(x) = Σ_{x=0}^{n} x C(n, x) (1 − q)^{n−x} q^x
     = Σ_{x=1}^{n} x · n!/(x!(n − x)!) · (1 − q)^{n−x} q^x
     = nq Σ_{x=1}^{n} (n − 1)!/((x − 1)!(n − x)!) · (1 − q)^{n−x} q^{x−1}
     = nq Σ_{y=0}^{n−1} C(n − 1, y) (1 − q)^{n−1−y} q^y = nq.

Variance:
E[X(X − 1)] = Σ_{x=0}^{n} x(x − 1) pX(x)
            = n(n − 1)q^2 Σ_{x=2}^{n} C(n − 2, x − 2) (1 − q)^{n−x} q^{x−2}
            = n(n − 1)q^2 Σ_{y=0}^{n−2} C(n − 2, y) (1 − q)^{n−2−y} q^y
            = n(n − 1)q^2.

var(X) = E[X^2] − (E[X])^2 = E[X(X − 1)] + E[X] − (E[X])^2
       = n(n − 1)q^2 + nq − n^2 q^2 = nq(1 − q).
MGF of Binomial Distribution
11 Some Important Discrete Distributions

Theorem
The moment-generating function for the binomial distribution with parameter 0 < q < 1 is

MX(t) = (q exp(t) + 1 − q)^n, for t ∈ R.

Proof.
MX(t) = E[exp(tX)] = Σ_{i=0}^{n} exp(ti) pX(i)
      = Σ_{i=0}^{n} exp(ti) C(n, i) (1 − q)^{n−i} q^i
      = Σ_{i=0}^{n} C(n, i) (1 − q)^{n−i} (exp(t) q)^i
      = (q exp(t) + (1 − q))^n, for all t.
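A minimal Python sketch (not from the slides) verifying E[X] = nq and var(X) = nq(1 − q) directly from the binomial PMF:

from math import comb

n, q = 10, 0.3
pmf = [comb(n, x) * q**x * (1 - q) ** (n - x) for x in range(n + 1)]

mean = sum(x * p for x, p in enumerate(pmf))
var = sum((x - mean) ** 2 * p for x, p in enumerate(pmf))
print(mean, n * q)            # 3.0  3.0
print(var, n * q * (1 - q))   # 2.1  2.1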
Definition
11 Some Important Discrete Distributions

Definition (Geometric Distribution)

The number X of Bernoulli trials needed to get one success follows the geometric
distribution with PMF

pX(x) = (1 − q)^{x−1} q;  x = 1, 2, . . . ,

where the parameter 0 < q < 1.

Theorem
If X follows the geometric distribution with parameter q, then
E[X] = 1/q and var(X) = (1 − q)/q^2.
Proof. Exercise.
Definition
11 Some Important Discrete Distributions

Theorem
The moment-generating function for the geometric distribution with parameter 0 < q < 1 is

MX(t) = q exp(t) / (1 − (1 − q) exp(t)), for t < log(1/(1 − q)).

Proof.
MX(t) = E[exp(tX)] = Σ_{i=1}^{∞} exp(ti) pX(i)
      = Σ_{i=1}^{∞} exp(ti) (1 − q)^{i−1} q
      = q exp(t) Σ_{i=1}^{∞} (exp(t)(1 − q))^{i−1}
      = q exp(t) / (1 − (1 − q) exp(t)), for t < log(1/(1 − q)).
Definition
11 Some Important Discrete Distributions

Definition (Poisson Distribution)

A random variable X, taking the values 0, 1, 2, . . . , is said to follow the Poisson distribution
if its probability mass function is given by

pX(x) = λ^x e^{−λ}/x!;  x = 0, 1, 2, . . . ,

where the parameter λ > 0. Sometimes, it is also denoted by f(x; λ).

Example
Suppose that the average number of accidents occurring weekly on a particular stretch of
a highway equals 3. Calculate the probability that there is at least one accident this week.
Ans. = 0.9502
Poisson Approximation of Binomial Distribution
11 Some Important Discrete Distributions

Derivation of Poisson distribution from the binomial

From the binomial distribution

pX(x) = C(n, x) (1 − q)^{n−x} q^x;  x = 0, 1, 2, . . . , n.

If we let λ = nq, that is, q = λ/n and 1 − q = 1 − λ/n, we obtain

pX(x) = [n(n − 1) · · · (n − x + 1)/x!] (λ/n)^x (1 − λ/n)^{n−x}
      = (λ^x/x!) (1)(1 − 1/n)(1 − 2/n) · · · (1 − (x − 1)/n) (1 − λ/n)^n (1 − λ/n)^{−x}.

Now, in the above, letting n → ∞ and q → 0 in such a way that nq = λ is fixed, we have

pX(x) = λ^x e^{−λ}/x!;  x = 0, 1, 2, . . . .
Poisson Approximation of Binomial Distribution
11 Some Important Discrete Distributions

Example
Suppose the probability that an item produced by a certain machine will be defective is
0.1. Find the probability that a sample of 10 items will contain at most one defective item.
Assume that the quality of successive items is independent.
Ans. = 2e^{−1} ≈ 0.7358
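A minimal Python sketch (not from the slides) of this example, comparing the exact binomial P(X ≤ 1) for n = 10, q = 0.1 with its Poisson approximation using λ = nq = 1:

from math import comb, exp, factorial

n, q = 10, 0.1
binom = sum(comb(n, x) * q**x * (1 - q) ** (n - x) for x in range(2))
lam = n * q
poisson = sum(lam**x * exp(-lam) / factorial(x) for x in range(2))
print(binom)    # ~0.7361 (exact)
print(poisson)  # 2/e ~ 0.7358 (approximation)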
Mean and Variance of Poisson Distribution
11 Some Important Discrete Distributions

Theorem
If X follows the Poisson distribution with parameter λ > 0, then
E[X] = λ and var(X) = λ.

Proof.
Mean:
E[X] = Σ_{x=0}^{∞} x pX(x) = Σ_{x=0}^{∞} x λ^x e^{−λ}/x!
     = Σ_{x=1}^{∞} λ^x e^{−λ}/(x − 1)!
     = λ e^{−λ} Σ_{y=0}^{∞} λ^y/y! = λ e^{−λ} e^{λ} = λ.

Variance:
E[X(X − 1)] = Σ_{x=0}^{∞} x(x − 1) pX(x) = Σ_{x=2}^{∞} x(x − 1) λ^x e^{−λ}/x!
            = λ^2 e^{−λ} Σ_{x=2}^{∞} λ^{x−2}/(x − 2)!
            = λ^2 e^{−λ} Σ_{y=0}^{∞} λ^y/y! = λ^2 e^{−λ} e^{λ} = λ^2.

var(X) = E[X^2] − (E[X])^2 = E[X(X − 1)] + E[X] − (E[X])^2
       = λ^2 + λ − λ^2 = λ.
MGF of Poisson Distribution
11 Some Important Discrete Distributions

Theorem
The moment-generating function for the Poisson distribution with parameter λ > 0 is

MX(t) = e^{λ(e^t − 1)}, for all t ∈ R.

Proof.
MX(t) = E[e^{tX}] = Σ_{i=0}^{∞} e^{ti} pX(i) = Σ_{i=0}^{∞} e^{ti} λ^i e^{−λ}/i!
      = e^{−λ} Σ_{i=0}^{∞} (λ e^t)^i/i! = e^{−λ} e^{λ e^t} = e^{λ(e^t − 1)}, for all t ∈ R.
Definition
12 Some Important Continuous Distributions

Definition (Uniform Distribution)

The continuous random variable X is said to be uniformly distributed over the interval
[α, β] if its probability density function is given by

fX(x) = 1/(β − α),  α ≤ x ≤ β,
        0,          otherwise.

The distribution fX is called the uniform distribution.

Example
Buses arrive at a specified stop at 15-minute intervals starting at 7 AM, that is, they arrive at 7, 7:15, 7:30,
7:45, and so on. If a passenger arrives at the stop at a time that is uniformly distributed between 7 and
7:30, find the probability that he waits
1. less than 5 minutes for a bus;
2. at least 12 minutes for a bus.
Ans. = 1. 1/3, 2. 1/5.
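A minimal simulation sketch (not from the slides) of the bus example: the arrival time is uniform on (0, 30) minutes past 7, with buses at 0, 15, 30 minutes.

import random

random.seed(0)
n = 100_000
waits = []
for _ in range(n):
    t = random.uniform(0, 30)
    waits.append(15 - t % 15)           # time until the next bus

print(sum(w < 5 for w in waits) / n)    # ~1/3
print(sum(w >= 12 for w in waits) / n)  # ~1/5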
Mean and Variance of Uniform Distribution
12 Some Important Continuous Distributions

Theorem
If X follows the uniform distribution over the interval [α, β], then
E[X] = (α + β)/2 and var(X) = (β − α)^2/12.

Proof.
Mean:
E[X] = ∫_{−∞}^{∞} x fX(x) dx = 1/(β − α) ∫_{α}^{β} x dx
     = (β^2 − α^2)/(2(β − α)) = (α + β)/2.

Variance:
E[X^2] = ∫_{−∞}^{∞} x^2 fX(x) dx = 1/(β − α) ∫_{α}^{β} x^2 dx
       = (β^3 − α^3)/(3(β − α)) = (α^2 + αβ + β^2)/3.

var(X) = E[X^2] − (E[X])^2 = (α^2 + αβ + β^2)/3 − (α + β)^2/4
       = (β − α)^2/12.
MGF of Uniform Distribution
12 Some Important Continuous Distributions

Theorem
The moment-generating function for the uniform distribution X over the interval [α, β] is

MX(t) = (e^{βt} − e^{αt})/((β − α)t),  for t ∈ R \ {0},
        1,                             if t = 0.

Proof.
MX(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} fX(x) dx = 1/(β − α) ∫_{α}^{β} e^{tx} dx
      = (e^{βt} − e^{αt})/((β − α)t), for all t ∈ R \ {0}, and MX(0) = 1.
Definition
12 Some Important Continuous Distributions

Definition (Gamma Distribution)

The continuous random variable X is said to have a Gamma distribution with parameters
λ > 0 and r > 0 if its probability density function is given by

fX(x) = (λ/Γ(r)) (λx)^{r−1} e^{−λx},  x > 0,
        0,                            otherwise,

where

Γ(r) = ∫_{0}^{∞} t^{r−1} e^{−t} dt

is the Gamma function. We know that

Γ(r) = (r − 1) Γ(r − 1),

and when r = n is a positive integer, then

Γ(n) = (n − 1)! with Γ(1) = 1.
Mean and Variance of Gamma Distribution
12 Some Important Continuous Distributions

Theorem
If X follows the Gamma distribution with parameters λ > 0 and r > 0, then
E[X] = r/λ and var(X) = r/λ^2.

Proof.
Mean:
E[X] = ∫_{0}^{∞} x fX(x) dx = 1/Γ(r) ∫_{0}^{∞} (λx)^r e^{−λx} dx
     = 1/(λ Γ(r)) ∫_{0}^{∞} t^r e^{−t} dt,  taking λx = t,
     = Γ(r + 1)/(λ Γ(r)) = r Γ(r)/(λ Γ(r)) = r/λ.

Variance:
E[X^2] = ∫_{0}^{∞} x^2 fX(x) dx = 1/(λ Γ(r)) ∫_{0}^{∞} (λx)^{r+1} e^{−λx} dx
       = 1/(λ^2 Γ(r)) ∫_{0}^{∞} t^{r+1} e^{−t} dt,  taking λx = t,
       = Γ(r + 2)/(λ^2 Γ(r)) = (r + 1) r Γ(r)/(λ^2 Γ(r)) = (r + 1) r/λ^2.

var(X) = E[X^2] − (E[X])^2 = (r + 1) r/λ^2 − r^2/λ^2 = r/λ^2.
MGF of Gamma Distribution
12 Some Important Continuous Distributions

Theorem
The moment-generating function for the Gamma distribution X with parameters λ > 0
and r > 0 is

MX(t) = (1 − t/λ)^{−r}, for t < λ.

Proof.
MX(t) = E[e^{tX}] = ∫_{0}^{∞} e^{tx} (λ/Γ(r)) (λx)^{r−1} e^{−λx} dx
      = (λ/Γ(r)) ∫_{0}^{∞} (λx)^{r−1} e^{−(λ−t)x} dx
      = (1/Γ(r)) (1 − t/λ)^{−r} ∫_{0}^{∞} s^{r−1} e^{−s} ds,  taking (λ − t)x = s,
      = (1/Γ(r)) (1 − t/λ)^{−r} Γ(r) = (1 − t/λ)^{−r}.
Definition
12 Some Important Continuous Distributions

Definition (Exponential Distribution)

The continuous random variable X is said to have an exponential distribution with
parameter λ > 0 if its probability density function is given by

fX(x) = λ e^{−λx},  x > 0,
        0,          otherwise.

This is a special case of the Gamma distribution with r = 1.

Example
An electronic component is known to have a useful life represented by an exponential
density with a failure rate of 10^{−5} failures per hour (i.e., λ = 10^{−5}). Determine the
fraction of such components that would fail before the mean or expected life.
Ans.: 63.212% would fail before the mean life.
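A minimal Python sketch (not from the slides) of this example: the fraction of exponential lifetimes ending before the mean life 1/λ is FX(1/λ) = 1 − 1/e, independent of λ.

import random
from math import exp

random.seed(0)
lam = 1e-5                       # failure rate per hour, as in the example
n = 100_000
fails_before_mean = sum(random.expovariate(lam) < 1 / lam for _ in range(n))
print(fails_before_mean / n)     # ~0.632
print(1 - exp(-1))               # 0.63212...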
Mean, Variance and MGF of Exponential Distribution
12 Some Important Continuous Distributions

Theorem (Mean and Variance of Exponential Distribution)

If X follows the exponential distribution with parameter λ > 0, then
E[X] = 1/λ and var(X) = 1/λ^2.

Theorem
The moment-generating function for the exponential distribution X with parameter λ > 0 is

MX(t) = (1 − t/λ)^{−1}, for t < λ.
Definition
12 Some Important Continuous Distributions

Definition (Normal Distribution)

The continuous random variable X is said to have a normal distribution with parameters
µ (−∞ < µ < ∞) and σ > 0 if its probability density function is given by

fX(x) = (1/(σ√(2π))) e^{−(x−µ)^2/(2σ^2)},  −∞ < x < ∞.

[Figure: the bell-shaped normal density fX, symmetric about x = µ, with peak value
1/(σ√(2π)) at x = µ.]
Properties of Normal Distribution
12 Some Important Continuous Distributions
1. fX(x) ≥ 0 for all x, and ∫_{−∞}^{∞} fX(x) dx = 1.
2. lim_{x→−∞} fX(x) = 0 and lim_{x→∞} fX(x) = 0.
3. fX(µ − x) = fX(µ + x), that is, fX is symmetric about x = µ.
4. The maximum value of fX occurs at x = µ, that is,
   max_{−∞<x<∞} fX(x) = fX(µ) = 1/(σ√(2π)) ≈ 0.39894/σ.
5. The points of inflection of fX are at x = µ ± σ.

Note
The shorthand notation X ~ N(µ, σ^2) is often employed to indicate that the random
variable X is normally distributed.
Mean and Variance of Normal Distribution
12 Some Important Continuous Distributions

Theorem
If X follows the normal distribution with parameters µ (−∞ < µ < ∞) and σ > 0, then
E[X] = µ and var(X) = σ^2.

Proof.
Mean:
E[X − µ] = ∫_{−∞}^{∞} (x − µ) fX(x) dx
         = ∫_{−∞}^{∞} (x − µ) (1/(σ√(2π))) e^{−(x−µ)^2/(2σ^2)} dx
         = (σ/√(2π)) ∫_{−∞}^{∞} t e^{−t^2/2} dt,  taking (x − µ)/σ = t,
         = 0.
⟹ E[X] = µ.

Variance:
var(X) = E[(X − µ)^2] = ∫_{−∞}^{∞} (x − µ)^2 fX(x) dx
       = ∫_{−∞}^{∞} (x − µ)^2 (1/(σ√(2π))) e^{−(x−µ)^2/(2σ^2)} dx
       = (σ^2/√(2π)) ∫_{−∞}^{∞} t (t e^{−t^2/2}) dt,  taking (x − µ)/σ = t,
       = (σ^2/√(2π)) ∫_{−∞}^{∞} e^{−t^2/2} dt,  applying integration by parts,
       = (σ^2/√(2π)) √(2π) = σ^2.
MGF of Normal Distribution
12 Some Important Continuous Distributions

Theorem
The moment-generating function for the normal distribution X with parameters
µ (−∞ < µ < ∞) and σ > 0 is

MX(t) = e^{(σt)^2/2 + µt}, for all −∞ < t < ∞.

Proof.
MX(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} fX(x) dx = ∫_{−∞}^{∞} e^{tx} (1/(σ√(2π))) e^{−(x−µ)^2/(2σ^2)} dx
      = (1/(σ√(2π))) ∫_{−∞}^{∞} e^{−((x−µ)^2 − 2σ^2 tx)/(2σ^2)} dx
      = e^{(σt)^2/2 + µt} (1/(σ√(2π))) ∫_{−∞}^{∞} e^{−(x−r)^2/(2σ^2)} dx,  where r = µ + σ^2 t,
      = e^{(σt)^2/2 + µt}.
The Normal Approximation to the Binomial Distribution
12 Some Important Continuous Distributions

Let the random variable X denote the number of successes in n Bernoulli trials. We recall the
binomial distribution as

pX(x) = C(n, x) q^x (1 − q)^{n−x};  x = 0, 1, 2, . . . , n,

with mean µX = nq and variance σX^2 = nq(1 − q).

Stirling's approximation to n! is

n! ≈ √(2π) e^{−n} n^{n+0.5},

where the relative error

(n! − √(2π) e^{−n} n^{n+0.5}) / n! → 0  as n → ∞.

Using the Stirling approximation to n! in the binomial distribution, we have

pX(x) ≈ (1/√(2π)) (1/√(nq(1 − q))) (nq/x)^{x+0.5} (n(1 − q)/(n − x))^{n−x+0.5}.

Let ψ = (x/(nq))^{x+0.5} ((n − x)/(n(1 − q)))^{n−x+0.5}; then
The Normal Approximation to the Binomial Distribution
12 Some Important Continuous Distributions

log ψ = (x + 0.5) log(x/(nq)) + (n − x + 0.5) log((n − x)/(n(1 − q))).

Using the change of variable x = nq + √(nq(1 − q)) z, we have

x/(nq) = 1 + √((1 − q)/(nq)) z  and  (n − x)/(n(1 − q)) = 1 − √(q/(n(1 − q))) z,

and

log ψ = (nq + √(nq(1 − q)) z + 0.5) log(1 + √((1 − q)/(nq)) z)
        + (n(1 − q) − √(nq(1 − q)) z + 0.5) log(1 − √(q/(n(1 − q))) z)
      ≈ 0.5 z^2 + O(n^{−0.5}).

Thus, for large n,

pX(x) ≈ (1/√(2π)) (1/√(nq(1 − q))) e^{−(x−nq)^2/(2nq(1−q))}
⟹ pX(x) ≈ (1/(σX √(2π))) e^{−(x−µX)^2/(2σX^2)}.
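A minimal Python sketch (not from the slides) comparing an exact binomial PMF value with the normal density approximation just derived; n = 100 and q = 0.4 are illustrative choices.

from math import comb, exp, pi, sqrt

n, q = 100, 0.4
mu, sigma = n * q, sqrt(n * q * (1 - q))

x = 42
exact = comb(n, x) * q**x * (1 - q) ** (n - x)
approx = exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
print(exact, approx)   # both ~0.075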
Theorem
If X ~ N(µ, σ^2), then for any constants a and b, b ≠ 0, the random variable Y = a + bX
is also a normal random variable, that is, Y ~ N(a + bµ, b^2 σ^2).

Proof. Clearly, the mean of Y is µY = E[Y] = E[a + bX] = a + bE[X] = a + bµ,
and var(Y) = E[(Y − µY)^2] = E[b^2 (X − µ)^2] = b^2 E[(X − µ)^2] = b^2 var(X) = b^2 σ^2.

FY(y) = PY(Y ≤ y) = PY(a + bX ≤ y) = PX(X ≤ (y − a)/b),      b > 0,
                                     1 − PX(X ≤ (y − a)/b),  b < 0,
      = FX((y − a)/b),      b > 0,
        1 − FX((y − a)/b),  b < 0.

fY(y) = d/dy FY(y) = (1/b) fX((y − a)/b),   b > 0,
                     −(1/b) fX((y − a)/b),  b < 0,
      = (1/|b|) fX((y − a)/b).

Thus Y ~ N(a + bµ, b^2 σ^2).
Normal Cumulative Distribution Function
12 Some Important Continuous Distributions
The cumulative distribution function FX for the normally distributed random variable X is
given by

FX(x) = PX(X ≤ x) = ∫_{−∞}^{x} fX(t) dt = (1/(σ√(2π))) ∫_{−∞}^{x} e^{−((t−µ)/σ)^2/2} dt
      = (1/√(2π)) ∫_{−∞}^{(x−µ)/σ} e^{−s^2/2} ds,  taking (t − µ)/σ = s,
      = Φ((x − µ)/σ),

where

Φ(z) := (1/√(2π)) ∫_{−∞}^{z} e^{−s^2/2} ds

is called the standard normal cumulative distribution function.
Definition
12 Some Important Continuous Distributions

Definition (Standard Normal Distribution)

The continuous random variable Z is said to have a standard normal distribution if its
probability density function is given by

φZ(z) = (1/√(2π)) e^{−z^2/2},  −∞ < z < ∞.

The mean and variance of the standard normal distribution are 0 and 1, respectively.

[Figure: the standard normal density φZ; by symmetry, PZ(Z < −z) = PZ(Z > z).]

Standard normal cumulative distribution function:

Φ(z) = PZ(Z ≤ z) = ∫_{−∞}^{z} φZ(s) ds = (1/√(2π)) ∫_{−∞}^{z} e^{−s^2/2} ds.
Standard Normal Cumulative Distribution Function Table
12 Some Important Continuous Distributions
Table: Standard Normal Cumulative Distribution Function Φ(z) = (1/√(2π)) ∫_{−∞}^{z} e^{−s^2/2} ds.
z 0 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09
0 0.5 0.50399 0.50798 0.51197 0.51595 0.51994 0.52392 0.5279 0.53188 0.53586
0.1 0.53983 0.5438 0.54776 0.55172 0.55567 0.55962 0.56356 0.56749 0.57142 0.57535
0.2 0.57926 0.58317 0.58706 0.59095 0.59483 0.59871 0.60257 0.60642 0.61026 0.61409
0.3 0.61791 0.62172 0.62552 0.6293 0.63307 0.63683 0.64058 0.64431 0.64803 0.65173
0.4 0.65542 0.6591 0.66276 0.6664 0.67003 0.67364 0.67724 0.68082 0.68439 0.68793
0.5 0.69146 0.69497 0.69847 0.70194 0.7054 0.70884 0.71226 0.71566 0.71904 0.7224
0.6 0.72575 0.72907 0.73237 0.73565 0.73891 0.74215 0.74537 0.74857 0.75175 0.7549
0.7 0.75804 0.76115 0.76424 0.7673 0.77035 0.77337 0.77637 0.77935 0.7823 0.78524
0.8 0.78814 0.79103 0.79389 0.79673 0.79955 0.80234 0.80511 0.80785 0.81057 0.81327
0.9 0.81594 0.81859 0.82121 0.82381 0.82639 0.82894 0.83147 0.83398 0.83646 0.83891
1 0.84134 0.84375 0.84614 0.84849 0.85083 0.85314 0.85543 0.85769 0.85993 0.86214
1.1 0.86433 0.8665 0.86864 0.87076 0.87286 0.87493 0.87698 0.879 0.881 0.88298
1.2 0.88493 0.88686 0.88877 0.89065 0.89251 0.89435 0.89617 0.89796 0.89973 0.90147
1.3 0.9032 0.9049 0.90658 0.90824 0.90988 0.91149 0.91309 0.91466 0.91621 0.91774
1.4 0.91924 0.92073 0.9222 0.92364 0.92507 0.92647 0.92785 0.92922 0.93056 0.93189
1.5 0.93319 0.93448 0.93574 0.93699 0.93822 0.93943 0.94062 0.94179 0.94295 0.94408
1.6 0.9452 0.9463 0.94738 0.94845 0.9495 0.95053 0.95154 0.95254 0.95352 0.95449
1.7 0.95543 0.95637 0.95728 0.95818 0.95907 0.95994 0.9608 0.96164 0.96246 0.96327
1.8 0.96407 0.96485 0.96562 0.96638 0.96712 0.96784 0.96856 0.96926 0.96995 0.97062
1.9 0.97128 0.97193 0.97257 0.9732 0.97381 0.97441 0.975 0.97558 0.97615 0.9767
2 0.97725 0.97778 0.97831 0.97882 0.97932 0.97982 0.9803 0.98077 0.98124 0.98169

Standard Normal Cumulative Distribution Function Table
12 Some Important Continuous Distributions

2.1 0.98214 0.98257 0.983 0.98341 0.98382 0.98422 0.98461 0.985 0.98537 0.98574
2.2 0.9861 0.98645 0.98679 0.98713 0.98745 0.98778 0.98809 0.9884 0.9887 0.98899
2.3 0.98928 0.98956 0.98983 0.9901 0.99036 0.99061 0.99086 0.99111 0.99134 0.99158
2.4 0.9918 0.99202 0.99224 0.99245 0.99266 0.99286 0.99305 0.99324 0.99343 0.99361
2.5 0.99379 0.99396 0.99413 0.9943 0.99446 0.99461 0.99477 0.99492 0.99506 0.9952
2.6 0.99534 0.99547 0.9956 0.99573 0.99585 0.99598 0.99609 0.99621 0.99632 0.99643
2.7 0.99653 0.99664 0.99674 0.99683 0.99693 0.99702 0.99711 0.9972 0.99728 0.99736
2.8 0.99744 0.99752 0.9976 0.99767 0.99774 0.99781 0.99788 0.99795 0.99801 0.99807
2.9 0.99813 0.99819 0.99825 0.99831 0.99836 0.99841 0.99846 0.99851 0.99856 0.99861
3 0.99865 0.99869 0.99874 0.99878 0.99882 0.99886 0.99889 0.99893 0.99896 0.999
3.1 0.99903 0.99906 0.9991 0.99913 0.99916 0.99918 0.99921 0.99924 0.99926 0.99929
3.2 0.99931 0.99934 0.99936 0.99938 0.9994 0.99942 0.99944 0.99946 0.99948 0.9995
3.3 0.99952 0.99953 0.99955 0.99957 0.99958 0.9996 0.99961 0.99962 0.99964 0.99965
3.4 0.99966 0.99968 0.99969 0.9997 0.99971 0.99972 0.99973 0.99974 0.99975 0.99976
3.5 0.99977 0.99978 0.99978 0.99979 0.9998 0.99981 0.99981 0.99982 0.99983 0.99983
3.6 0.99984 0.99985 0.99985 0.99986 0.99986 0.99987 0.99987 0.99988 0.99988 0.99989
3.7 0.99989 0.9999 0.9999 0.9999 0.99991 0.99991 0.99992 0.99992 0.99992 0.99992
3.8 0.99993 0.99993 0.99993 0.99994 0.99994 0.99994 0.99994 0.99995 0.99995 0.99995
3.9 0.99995 0.99995 0.99996 0.99996 0.99996 0.99996 0.99996 0.99996 0.99997 0.99997
4 0.99997 0.99997 0.99997 0.99997 0.99997 0.99997 0.99998 0.99998 0.99998 0.99998

Standard Normal Cumulative Distribution Function Table
12 Some Important Continuous Distributions

Example
If X is a normal random variable with mean µ = 3 and variance σ^2 = 16, find
1. P(X < 11),
2. P(X > 1),
3. P(2 < X < 7).
Ans.: 1. Φ(2), 2. Φ(1/2), 3. Φ(1) − Φ(−1/4).

https://eee.poriyaan.in/topic/normal-distributions--solved-example-problems-11271/
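A minimal Python sketch (not from the slides) evaluating the example's answers numerically, using the identity Φ(z) = (1 + erf(z/√2))/2:

from math import erf, sqrt

def Phi(z):
    return (1 + erf(z / sqrt(2))) / 2

mu, sigma = 3, 4                  # variance 16
print(Phi((11 - mu) / sigma))     # P(X < 11)  = Phi(2)    ~ 0.9772
print(1 - Phi((1 - mu) / sigma))  # P(X > 1)   = Phi(1/2)  ~ 0.6915
print(Phi(1) - Phi(-0.25))        # P(2 < X < 7)           ~ 0.4400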
Markov’s Inequality
13 Markov’s and Chebyshev’s Inequalities

Theorem
If X is a random variable that takes only non-negative values, then for any value a > 0,

P(X ≥ a) ≤ E[X]/a.

Proof. We have two cases.

When X is discrete with PMF pX:
E[X] = Σ_x x pX(x) = Σ_{0≤x<a} x pX(x) + Σ_{x≥a} x pX(x)
     ≥ Σ_{x≥a} x pX(x) ≥ Σ_{x≥a} a pX(x) = a Σ_{x≥a} pX(x) = a P(X ≥ a).

When X is continuous with density fX:
E[X] = ∫_{0}^{∞} x fX(x) dx = ∫_{0}^{a} x fX(x) dx + ∫_{a}^{∞} x fX(x) dx
     ≥ ∫_{a}^{∞} x fX(x) dx ≥ ∫_{a}^{∞} a fX(x) dx = a ∫_{a}^{∞} fX(x) dx = a P(X ≥ a).

In both cases, P(X ≥ a) ≤ E[X]/a.
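A minimal empirical sketch (not from the slides): Markov's bound P(X ≥ a) ≤ E[X]/a checked against exponential samples with mean 1, where the exact tail is e^{−a}.

import random

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(100_000)]  # non-negative, mean ~1
mean = sum(xs) / len(xs)

for a in [1, 2, 5]:
    tail = sum(x >= a for x in xs) / len(xs)
    print(a, tail, "<=", mean / a)   # e^{-a} versus 1/a: the bound holds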
Chebyshev’s Inequality
13 Markov’s and Chebyshev’s Inequalities

Theorem
If X is a random variable with mean µX and variance var(X) = σX^2, then for any a > 0,

P(|X − µX| ≥ a) ≤ σX^2/a^2.

Proof. Since (X − µX)^2 is a non-negative random variable, by applying Markov's inequality we have

P((X − µX)^2 ≥ a^2) ≤ E[(X − µX)^2]/a^2.

But since (X − µX)^2 ≥ a^2 if and only if |X − µX| ≥ a, the above inequality is equivalent to

P(|X − µX| ≥ a) ≤ E[(X − µX)^2]/a^2 = σX^2/a^2.

Corollary
If X is a random variable with mean µX and variance var(X) = σX^2, then for any k > 0,

P(|X − µX| ≥ k σX) ≤ 1/k^2.
Examples
13 Markov’s and Chebyshev’s Inequalities

Example
Suppose that it is known that the number of items produced in a factory during a week is
a random variable with the mean 50.
1. What can be said about the probability that this week’s production will exceed 75?
2. If the variance of a week’s production is known to equal 25, then what can be said
about the probability that this week’s production will be between 40 and 60?
Ans.: 1. 2/3, 2. 3/4

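A minimal Python sketch (not from the slides) computing the two bounds of the factory example:

mean, var = 50, 25

# 1. Markov: P(X > 75) <= P(X >= 75) <= mean / 75
print(mean / 75)         # 2/3

# 2. Chebyshev: P(40 < X < 60) = 1 - P(|X - 50| >= 10) >= 1 - var / 10**2
print(1 - var / 10**2)   # 3/4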
Thank you for listening!
Any questions?
