Chapter 3-2857

This document provides an overview of discrete distributions and related concepts in statistics. It defines discrete random variables as those that can take on countable values. Examples include Bernoulli random variables that can be 0 or 1, and the number of heads when flipping a coin multiple times. The probability mass function specifies the probability of each possible value. Expected value, variance, and other moments are defined for discrete random variables. The document also introduces moment generating functions and how they relate to expected values and variances. It concludes with discussions of binomial experiments and sampling without replacement.

Uploaded by Kevin Fontyn

Stats 2857

Chapter 3: DISCRETE DISTRIBUTIONS


- Random Variables
o For a given sample space, a random variable is any rule associating a number
with each outcome in the sample space
o A random variable is a function whose domain is the sample space and whose
range is a subset of the real numbers
o E.g. flipping a coin three times:
S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}; the random variable
X = number of heads takes the values {0, 1, 2, 3}
- Discrete Random Variable
o A random variable whose possible values constitute a finite set or a
countably infinite sequence
 By contrast, a continuous random variable has a set of possible values
consisting of all numbers in a single interval or a disjoint union of
intervals, e.g. [0,10] ∪ [20,30], and no single value has positive probability
- Bernoulli Random Variable
o Any random variable whose only possible values are 0 and 1
o Y = 1 if success, 0 if failure
- Probability Distribution
o The probability distribution of X says how the total probability of 1 is distributed
among the various possible X values
o Also called the probability mass function
o Is defined for every number x by p(x) = P(X = x)
 The pmf specifies the probability of observing that value when the
experiment is performed
 NOTE: a pmf must satisfy p(x) ≥ 0 and ∑ p(x) = 1
- Parameters of a distribution
o Suppose p( x ) depends on a quantity that can be assigned any one of a number
of possible values, with each different value determining a different probability
distribution
o Such a quantity is called a parameter
o The collection of all probability distributions for different values of the
parameter is called a family of probability distributions
- Cumulative Distribution Function
o The cumulative distribution function F(x) of a discrete random variable X with
pmf p(x) is defined for every number x by:
F(x) = P(X ≤ x) = ∑ p(y), summing over all y with y ≤ x
o For any number x, F(x) is the probability that the observed value will be at
most x
- Expected Value of X
o Let X be a discrete random variable with set of possible values D and pmf p(x).
The expected value (mean value) of X, denoted E(X) or μ_X, is:
E(X) = ∑ x ∙ p(x), summing over x ∈ D
o Provided that the series converges absolutely
- Expected Value of a Function
o If the random variable X has set of possible values D and pmf p(x), the
expected value of the function h(X), E[h(X)], is computed by:

E[h(X)] = ∑ h(x) ∙ p(x), summing over x ∈ D
- Expected Value of a Linear Function
o The h(x) function is frequently linear, h(X) = aX + b. In this case E[h(X)]
can be computed directly from E(X):
E[aX + b] = a ∙ E(X) + b
o This shortcut only holds when h(x) is LINEAR; otherwise compute the expected
value from the definition
- Variance of X
o Let X have pmf p(x) and expected value μ. The variance of X, denoted V(X),
σ²_X, or just σ², is:
V(X) = ∑ (x − μ)² ∙ p(x) = E[(X − μ)²], summing over x ∈ D
o The standard deviation is σ_X = √V(X)

o SHORTCUT: V(X) = σ² = [∑ x² ∙ p(x)] − μ² = E(X²) − [E(X)]²

- Rules of Variance
o The variance of h(X) is the expected value of the squared difference between
h(X) and its expected value
o For a linear function:
V(aX + b) = σ²_(aX+b) = a² ∙ σ²_X
σ_(aX+b) = |a| ∙ σ_X

Example: A chemical supply company currently has in stock 100 lb of a chemical, which it sells
to customers in 5-lb containers. Let X = the number of containers ordered by a randomly
chosen customer, and suppose that X has pmf:
x      1    2    3    4
p(x)   0.2  0.4  0.3  0.1

E(X) = ∑ x ∙ p(x)
E(X) = 1(0.2) + 2(0.4) + 3(0.3) + 4(0.1) = 0.2 + 0.8 + 0.9 + 0.4
E(X) = 2.3

E(X²) = ∑ x² ∙ p(x)
= 1²(0.2) + 2²(0.4) + 3²(0.3) + 4²(0.1)
= 0.2 + 1.6 + 2.7 + 1.6
= 6.1

V(X) = E(X²) − [E(X)]²
V(X) = 6.1 − 2.3² = 6.1 − 5.29
V(X) = 0.81

Y = number of pounds left = 100 − 5X

E(Y) = E(100 − 5X) = 100 − 5 ∙ E(X)
= 100 − 5(2.3)
= 88.5 pounds

V(Y) = V(100 − 5X) = (−5)² ∙ V(X)
= 25(0.81)
= 20.25
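The arithmetic above can be verified with a short Python sketch, a direct translation of the definitions (variable names are my own):

```python
# Numeric check of the container example (pmf from the table above).
pmf = {1: 0.2, 2: 0.4, 3: 0.3, 4: 0.1}          # x -> p(x)

E_X = sum(x * p for x, p in pmf.items())         # E(X) = sum of x * p(x)
E_X2 = sum(x**2 * p for x, p in pmf.items())     # E(X^2) = sum of x^2 * p(x)
V_X = E_X2 - E_X**2                              # shortcut: V(X) = E(X^2) - [E(X)]^2

# Linear function Y = 100 - 5X (pounds of chemical left in stock)
E_Y = 100 - 5 * E_X                              # E(aX + b) = a*E(X) + b
V_Y = (-5)**2 * V_X                              # V(aX + b) = a^2 * V(X)

print(round(E_X, 4), round(V_X, 4), round(E_Y, 4), round(V_Y, 4))
# 2.3 0.81 88.5 20.25
```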
- Moments
o Sometimes expected values of integer powers of X and X −μ are called moments
o Expected value of powers of X are called moments about 0
o Expected values of powers of X −μ are called moments about the mean
- Measure of Lack of Symmetry Using Moments:
o A measure of departure from symmetry is the third moment about the mean,
E[(X − μ)³] (often standardized by dividing by σ³)
- Moment Generating Functions:
o The mgf of a discrete random variable X is defined to be:
M_X(t) = E(e^(tX)) = ∑ e^(tx) ∙ p(x)
Example: For a new car, the number of defects X has the distribution given by the
accompanying table. Find M_X(t).

x      0     1    2     3    4     5     6
p(x)   0.04  0.2  0.34  0.2  0.15  0.04  0.03

M_X(t) = e^0(0.04) + e^t(0.2) + e^(2t)(0.34) + e^(3t)(0.2) + e^(4t)(0.15) + e^(5t)(0.04) + e^(6t)(0.03)

= 0.04 + 0.2e^t + 0.34e^(2t) + 0.2e^(3t) + 0.15e^(4t) + 0.04e^(5t) + 0.03e^(6t)
- If the mgf exists, E(X^r) = M_X^(r)(0)
o That is, E(X^r) equals the rth derivative of the moment generating function
evaluated at t = 0
Example (continued): For the same defect distribution, use the mgf to find E(X)
and V(X).

M_X(t) = 0.04 + 0.2e^t + 0.34e^(2t) + 0.2e^(3t) + 0.15e^(4t) + 0.04e^(5t) + 0.03e^(6t)

M_X^(1)(t) = 0.2e^t + 0.68e^(2t) + 0.6e^(3t) + 0.6e^(4t) + 0.2e^(5t) + 0.18e^(6t)

E(X) = M_X^(1)(0) = 0.2 + 0.68 + 0.6 + 0.6 + 0.2 + 0.18 = 2.46

THEN

E(X²) = M_X^(2)(0) = 7.84

THEN

V(X) = E(X²) − [E(X)]² = 7.84 − 2.46² = 7.84 − 6.0516 = 1.7884

EXAMPLE: Given M_X(t) = 0.2 + 0.3e^t + 0.5e^(3t), find p(x), E(X), V(X)

Since M_X(t) = ∑ e^(tx) p(x), the coefficients give the pmf:
p(0) = 0.2, p(1) = 0.3, p(3) = 0.5

E(X) is the first derivative of M_X(t) evaluated at t = 0:

E(X) = M_X^(1)(0) = 0.3e^0 + 1.5e^(3∙0) = 0.3 + 1.5 = 1.8

E(X²) is the second derivative of M_X(t) evaluated at t = 0:

E(X²) = M_X^(2)(0) = 0.3e^0 + 4.5e^(3∙0) = 0.3 + 4.5 = 4.8

V(X) = E(X²) − [E(X)]²
V(X) = 4.8 − 1.8² = 4.8 − 3.24
V(X) = 1.56
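Both mgf examples have the form M(t) = ∑ c_k e^(kt), so M^(r)(0) = ∑ c_k ∙ k^r. A minimal Python sketch of that shortcut, applied to the second example (the helper name mgf_moment is my own, not a library function):

```python
# Moments from an mgf of the form sum of c_k * e^(k*t).
def mgf_moment(terms, r):
    """terms: {k: c_k} for M(t) = sum c_k e^(kt); returns M^(r)(0) = E(X^r)."""
    return sum(c * k**r for k, c in terms.items())

terms = {0: 0.2, 1: 0.3, 3: 0.5}     # M_X(t) = 0.2 + 0.3e^t + 0.5e^(3t)
E_X = mgf_moment(terms, 1)           # 0.3(1) + 0.5(3) = 1.8
E_X2 = mgf_moment(terms, 2)          # 0.3(1) + 0.5(9) = 4.8
V_X = E_X2 - E_X**2                  # 4.8 - 3.24 = 1.56
print(E_X, E_X2, round(V_X, 2))
```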

- MGF of a Linear Function:
o Let X have the mgf M_X(t) and let Y = aX + b; then M_Y(t) = e^(bt) ∙ M_X(at)
- Binomial Experiments:
o MUST satisfy:
 The experiment consists of n smaller experiments called trials, where n is
fixed in advance of the experiment
 Each trial has two outcomes: success/failure
 Trials are independent: Outcome of one trial doesn’t influence the
outcome of another
 The probability of success (called p) is constant from trial to trial
- Sampling Without Replacement:
o When sampling without replacement from a population of size N, where each
selection yields one of two outcomes, the trials are not independent; however,
if the sample size n is at most 5% of the population size, the experiment can
be analyzed as approximately binomial
- Binomial Random Variable
o Given a binomial experiment of n trials, the binomial random variable X
associated to this experiment is defined as:
 X = the number of successes among the n trials
o The pmf of a binomial random variable X depends on parameters n and p; the
pmf is denoted by b(x; n,p)
- Binomial Probability Mass Function

o b(x; n, p) = (n choose x) ∙ p^x ∙ (1 − p)^(n−x) if x = 0, 1, 2, ..., n
b(x; n, p) = 0 otherwise

Example:
1. b(3; 8, 0.6)

b(3; 8, 0.6) = (8 choose 3) ∙ 0.6^3 ∙ (1 − 0.6)^(8−3) = 56(0.216)(0.01024) = 0.1239

2. b(5; 8, 0.6)

b(5; 8, 0.6) = (8 choose 5) ∙ 0.6^5 ∙ (1 − 0.6)^(8−5) = 56(0.07776)(0.064) = 0.2787

3. P(3 ≤ X ≤ 5) when n = 8 and p = 0.6

Step 1: b(4; 8, 0.6) = (8 choose 4) ∙ 0.6^4 ∙ (1 − 0.6)^(8−4) = 70(0.1296)(0.0256) = 0.2322

Step 2: P(3 ≤ X ≤ 5) = p(3) + p(4) + p(5) = 0.1239 + 0.2322 + 0.2787 = 0.6348

4. P(1 ≤ X) when n = 12 and p = 0.1

This event includes every value except 0, so:

P(1 ≤ X) = 1 − p(0)

b(0; 12, 0.1) = (12 choose 0) ∙ 0.1^0 ∙ (1 − 0.1)^(12−0) = 0.2824

P(1 ≤ X) = 1 − p(0) = 1 − 0.2824 = 0.7176
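The four binomial calculations above can be reproduced with a short Python sketch, a direct translation of the pmf (the helper name binom_pmf is my own):

```python
from math import comb

def binom_pmf(x, n, p):
    """b(x; n, p) = C(n, x) * p^x * (1 - p)^(n - x), 0 outside 0..n."""
    if not 0 <= x <= n:
        return 0.0
    return comb(n, x) * p**x * (1 - p)**(n - x)

print(round(binom_pmf(3, 8, 0.6), 4))     # 0.1239
print(round(binom_pmf(5, 8, 0.6), 4))     # 0.2787
print(round(sum(binom_pmf(x, 8, 0.6) for x in range(3, 6)), 4))  # 0.6348
print(round(1 - binom_pmf(0, 12, 0.1), 4))  # P(1 <= X) = 0.7176
```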

Example 2: The College Board reports that 2% of the two million high school students who
take the SAT each year receive special accommodations because of documented disabilities.
Consider a random sample of 25 students who have recently taken the test.

NOTE: a sample of 25 students is less than 5% of the population, and each trial
has two outcomes: accommodation or no accommodation, so X can be treated as binomial
n = 25, p = 0.02, X = the number of successes = the number who received an accommodation

a. What is the probability that exactly 1 received a special accommodation?

b(1; 25, 0.02) = (25 choose 1) ∙ 0.02^1 ∙ (1 − 0.02)^(25−1) = 0.3079

b. What is the probability that at least 1 received a special accommodation?

We need P(X ≥ 1) = 1 − p(0)

b(0; 25, 0.02) = (25 choose 0) ∙ 0.02^0 ∙ (1 − 0.02)^(25−0) = 1(1)(0.603464729) = 0.6035

P(X ≥ 1) = 1 − 0.6035 = 0.3965

c. What is the probability that at least 2 received a special accommodation?

P(X ≥ 2) = 1 − p(0) − p(1) = 1 − 0.6035 − 0.3079 = 0.0886

- Using Binomial Tables:

o Notation: for X ~ Bin(n, p), the cdf will be denoted by:
 P(X ≤ x) = B(x; n, p) = ∑ b(y; n, p), summing over y = 0, 1, ..., x
o These values are given and PRE-CALCULATED
o If a question requires a binomial table, one will be provided
- Mean and Variance of X
o If X ~ Bin(n, p), then E(X) = np and V(X) = np(1 − p)
o σ_X = √(np(1 − p))
What is the probability that the number among the 25 who received a special accommodation
is within 2 standard deviations of the number you would expect to be accommodated?

E(X) = np = 25(0.02) = 0.5

σ_X = √(np(1 − p)) = √(25(0.02)(0.98)) = √0.49 = 0.7

So two standard deviations below the mean is 0.5 − 2(0.7) = −0.9; two standard deviations
above is 0.5 + 2(0.7) = 1.9

P(−0.9 ≤ X ≤ 1.9) = P(0 ≤ X ≤ 1) = P(X ≤ 1), since X can only take on 0, 1, 2, 3, ...

From parts (a) and (b): P(μ − 2σ < X < μ + 2σ) = p(0) + p(1) = 0.6035 + 0.3079 = 0.9114
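The two-standard-deviation check can be sketched in Python under the same binomial model (variable names are my own):

```python
from math import comb, sqrt

n, p = 25, 0.02
mu = n * p                        # E(X) = np = 0.5
sigma = sqrt(n * p * (1 - p))     # sigma = sqrt(np(1-p)) = sqrt(0.49) = 0.7

def binom_pmf(x):
    """b(x; n, p) for the fixed n and p above."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Sum the pmf over all integer values within (mu - 2*sigma, mu + 2*sigma).
lo, hi = mu - 2 * sigma, mu + 2 * sigma          # (-0.9, 1.9)
prob = sum(binom_pmf(x) for x in range(n + 1) if lo <= x <= hi)
print(round(prob, 4))             # P(X <= 1) = p(0) + p(1)
```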

- Characteristics of Hypergeometric Distributions


o Assumptions that lead to a hypergeometric distribution
 The population or set to be sampled consists of N individuals, objects, or
elements (a finite population)
 Each individual can be characterized as a success or failure, and there are
M successes in the population
 A sample of n individuals is selected without replacement in such a way
that each subset of size n is equally likely to be chosen
 The random variable of interest X is the number of successes in the
sample
- PMF of a Hypergeometric Distribution
o If X is the number of successes in a completely random sample of size n drawn
from a population consisting of M successes and N − M failures, the probability
distribution of X is given by:

P(X = x) = h(x; n, M, N) = (M choose x)(N − M choose n − x) / (N choose n)

for x satisfying max(0, n − N + M) ≤ x ≤ min(n, M)
- Mean and Variance

E(X) = n ∙ (M/N)

V(X) = ((N − n)/(N − 1)) ∙ n ∙ (M/N) ∙ (1 − M/N)
Example 80: A bookstore has 15 copies of a particular textbook, of which 6 are first printings
and the other 9 are second printings (later printings provide an opportunity for authors to
correct mistakes). Suppose that 5 of these copies are randomly selected, and let X be the
number of first printings among the selected copies.

n=5; N=15; M=6→ it is hypergeometric

FIND: P(X = 2), P(X ≤ 2), and P(X ≥ 2); E(X) and the standard deviation

P(X = 2) = h(2; 5, 6, 15) = (6 choose 2)(9 choose 3) / (15 choose 5) = 15(84)/3003 = 0.4196

THEN: P(X = 1) = h(1; 5, 6, 15) = (6 choose 1)(9 choose 4) / (15 choose 5) = 6(126)/3003 = 0.2517

THEN: P(X = 0) = h(0; 5, 6, 15) = (6 choose 0)(9 choose 5) / (15 choose 5) = 1(126)/3003 = 0.0420

P(X ≤ 2) = 0.4196 + 0.2517 + 0.0420 = 0.7133

P(X ≥ 2) = 1 − P(X ≤ 1) = 1 − (P(0) + P(1)) = 1 − 0.2937 = 0.7063

THEN: E(X) = 5(6/15) = 2

THEN: V(X) = ((15 − 5)/(15 − 1)) ∙ 5 ∙ (6/15) ∙ (1 − 6/15) = 0.8571

Therefore σ = √0.8571 = 0.9258
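The bookstore example can be verified with a short Python sketch of the hypergeometric pmf (the helper name hyper_pmf is my own):

```python
from math import comb, sqrt

def hyper_pmf(x, n, M, N):
    """h(x; n, M, N) = C(M, x) C(N-M, n-x) / C(N, n)."""
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

n, M, N = 5, 6, 15                     # sample 5 of 15 copies, 6 first printings
p2 = hyper_pmf(2, n, M, N)             # 15(84)/3003
mean = n * M / N                       # E(X) = n * (M/N)
var = (N - n) / (N - 1) * n * (M / N) * (1 - M / N)
print(round(p2, 4), mean, round(sqrt(var), 4))   # 0.4196 2.0 0.9258
```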
