Basic concepts

Basic concepts
Probability axioms
Continuity of probability function
Inclusion-Exclusion formula
Countable sample spaces
Equally likely cases
Simulations
Problems for practice
Sets and Venn diagrams
Axioms
Equally likely
Harder

Basic concepts
A random experiment is an activity whose outcome we cannot predict.
The set of all outcomes is called its sample space. Each element of this
set is called a sample point. By the term event we understand a subset of
the sample space.

EXAMPLE 1: A coin toss is a random experiment. Its sample space is {head, tail}. There are four different events possible here: ϕ, {head}, {tail}, {head, tail}. ■

EXAMPLE 2: Rolling a die is another random experiment. The sample space here is {1, 2, 3, 4, 5, 6}. One possible event is the set of all even numbers, {2, 4, 6}. ■

I give you a coin to toss. Before tossing, you carefully inspect it. You find
no difference at all between the two sides (except for the pictures on
them). So you infer that both head and tail are equally likely (i.e., you
are equally ignorant about both sides). It is common to express this
situation as "50-50" chance, or 50% chance of a head (or a tail), or
1
probability of a head (or a tail) is 2 .

The main idea is that the chance of a head equals the chance of a tail.
We like to express this by first imagining a totality and then halving it.
This totality may be taken as 100 or 100% or 1 or any other positive
number.

In probability theory we take the total as 1. This choice is justified by statistical regularity, as the following example shows.

EXAMPLE 3: Consider rolling a fair die. Let A be the event that we get
a prime number, i.e., A = {2, 3, 5}. Intuitively, the probability of A should be 1/2. We shall use R to perform 5000 trials of this random experiment and check the running proportion of trials in which the event A occurs.

x = sample(6, 5000, replace = TRUE)   # 5000 rolls of a fair die
A = x %in% c(2, 3, 5)                 # TRUE when the roll is a prime
plot(cumsum(A) / (1:5000), ty = 'l')  # running proportion of primes
Notice that the proportions indeed tend towards 1/2.

Similarly, we can get an idea of the probability of B = {1, 3, 4, 5} using
the following R code.

x = sample(6, 5000, replace = TRUE)   # 5000 rolls of a fair die
B = x %in% c(1, 3, 4, 5)
mean(B)                               # proportion of trials where B occurs

■

With each event we associate a probability, which is a number in [0, 1].
In practice it is difficult (impossible?) to get a biased coin (i.e., a coin
which is more likely to show one side than the other). It is very easy to
simulate such a coin though:

x = sample(c('h', 't'), 1000, replace = TRUE, prob = c(0.7, 0.3))  # replace = TRUE is needed to draw 1000 tosses
sum(x == 'h')
sum(x == 't')

The prob=c(0.7,0.3) specifies the probabilities.


Let F be the set of all events in a sample space. For example, if Ω = {head, tail} then F is {ϕ, {head}, {tail}, {head, tail}}.

Side note: If the sample space is countable (finite/infinite), then generally F is just the power set of Ω. If Ω is uncountable, then there may be some "bad" subsets for which probability cannot be defined! These are not called events, and so are not considered as members of F. In this case F is a strict subset of the power set of Ω. Fortunately, we shall not come across such "bad" subsets any time soon.

Then a probability is a function P : F → [0, 1].

Of course, not all such functions can be a probability. A function needs to satisfy certain common sense conditions to be called a probability function.

EXAMPLE 4: A report says that the health condition in a country is so bad that the chance of a newborn baby surviving for at least 1 year is only 50%. However, the chance that he survives for at least 5 years is 90%. Does it sound odd?

SOLUTION: Yes. Any baby surviving for at least 5 years has of course survived the first year as well. So how can the latter chance be larger? ■

This example gives one common sense condition that a probability function must satisfy: if A ⊆ B then we should have P(A) ≤ P(B).

Of course, there are many many such common sense conditions and it is difficult to come up with a complete list. A Russian mathematician named Kolmogorov reduced this list to only 3 conditions, called the probability axioms.

Probability axioms

Probability axioms
Let Ω denote the sample space. Let F denote the set of all events. Then

1. For any event A ∈ F we have P(A) ≥ 0.
2. P(Ω) = 1.
3. If A_1, A_2, ... ∈ F are countably many (finite/infinite) disjoint events, then

   P(∪ A_n) = ∑ P(A_n).

Side note: A collection of sets is called disjoint if the intersection of any two sets from the collection is empty. They are also called mutually exclusive.
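Here is a small sanity check in R (my own sketch, not part of the notes): on the finite sample space of a fair die, define P(A) = |A|/6 and verify Axiom 2 and (a finite case of) Axiom 3 on a pair of disjoint events.

Omega = 1:6
P = function(A) length(A) / length(Omega)   # equally likely points
P(Omega)                                    # Axiom 2: prints 1
A = c(2, 4, 6); B = c(1, 3)                 # two disjoint events
c(P(union(A, B)), P(A) + P(B))              # Axiom 3 (finite case): both are 5/6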
Notice the last axiom. Here the sum may involve infinitely many terms. Such a sum is called an infinite series. You'll learn about them in detail in your analysis course. But for now you may quickly read this crash course on infinite series.

Side note: You'll often see this sentence: "(Ω, F, P) is a probability space". This is a shorthand for: Ω is a nonempty set (sample space), F is the collection of all events, and P : F → ℝ is a probability (i.e., a function satisfying the three probability axioms).

It is a remarkable fact that whatever other common sense condition one has been able to think of so far actually follows as a consequence of these! Also, you cannot drop any of these requirements, in the sense that no two of these imply the remaining one. Can you show this?

It is an interesting exercise to derive various common sense conditions from these axioms. Here is one.

EXAMPLE 5: If A ⊆ B then show that P(A) ≤ P(B).

SOLUTION: Split B as B = A ∪ (A^c ∩ B).
The two events in RHS are disjoint. So by Axiom 3 we have

P(B) = P(A) + P(A^c ∩ B).

The second probability in the RHS is ≥ 0 (by Axiom 1). So done. ■

Try your hand at these:

EXERCISE 1: Show P(A^c) = 1 − P(A).

EXERCISE 2: Show P(ϕ) = 0.

EXERCISE 3: If A_1 ⊆ A_2 ⊆ A_3, then show that

P(A_1 ∪ A_2 ∪ A_3) = P(A_1) + P(A_2 ∩ A_1^c) + P(A_3 ∩ A_2^c).

EXERCISE 4: Show P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

Next we shall prove some common sense properties that will require
more effort.

Continuity of probability function

We know that if f(x) is a continuous function and x_n → a, then f(x_n) → f(a). Any probability function has a similar property.

Definition: Increasing limit


Let A_1, A_2, ... be a sequence of events. Let A be any event. We say that the A_n's increase to A (and write A_n ↗ A) if

A_1 ⊆ A_2 ⊆ A_3 ⊆ ⋯

and

A = ∪_n A_n.

Definition: Decreasing limit


Let A_1, A_2, ... be a sequence of events. Let A be any event. We say that the A_n's decrease to A (and write A_n ↘ A) if

A_1 ⊇ A_2 ⊇ A_3 ⊇ ⋯

and

A = ∩_n A_n.

Theorem
If A_n ↗ A or A_n ↘ A, then P(A_n) → P(A).

Proof: Let's do the A_n ↗ A case first.

Define B_1 = A_1 and for n ≥ 2 let B_n = A_n ∖ A_{n−1}.

Then B_1, B_2, B_3, ... are all disjoint. (Why? Don't just say "obvious!". Write a one line proof.) Also, for n ∈ ℕ, we have

A_n = B_1 ∪ ⋯ ∪ B_n.

So A = B_1 ∪ B_2 ∪ ⋯. (Why?)

Hence, by the third axiom,


P(A) = ∑_{i=1}^{∞} P(B_i) = lim_n ∑_{i=1}^{n} P(B_i) = lim_n P(A_n), as required.

The A_n ↘ A case follows on taking complements.

[QED]
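To see the theorem in action, here is a quick R illustration (my own sketch, with an assumed example): take A_n = the event that at least one head appears in the first n tosses of a fair coin, so that A_1 ⊆ A_2 ⊆ ⋯ and A_n ↗ A, where A is the event that a head appears eventually. The estimated P(A_n) should climb towards P(A) = 1, matching the exact values 1 − (1/2)^n.

est = sapply(1:10, function(n)
  mean(replicate(5000, any(sample(c('h', 't'), n, replace = TRUE) == 'h'))))
round(est, 3)                 # estimated P(A_n) for n = 1, ..., 10
round(1 - (1/2)^(1:10), 3)    # exact P(A_n), for comparison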

EXERCISE 5: (A puzzle) You are approached by a gambler at a casino.


"Hmm, youngster", he remarks as he looks you up and down, "you seem
to be new here. Let me offer you some money." He comes closer, sits
beside you, and continues in a friendly voice, "Here I have a die, a fair
one. You shall roll it again and again. After each roll, we shall do a little
transaction like this: if the die shows six, you pay me some positive
amount, say t. But if it shows any other number I shall pay you ten times
that amount. Does that sound like a good game to you?"

Being freshly admitted to ISI, you are of course proud of your probability
skills, and reply "Yes".

"That's very good for you, very good indeed", exclaims the man in glee,
"but it is not good for me, you see. I just made the offer because I took a
liking to you. I hope that you would keep two requests in return."

You get cautious, but agree to hear them anyway.

"The first request is that I shall dictate the value of t before each roll".
Noticing a cloud of worry upon your face, he adds, "Don't worry, t will
always be positive, and I shall fix the amount before the roll."
You see no harm in that, and ask him to proceed.

"The second favour that I ask for is to call it quits whenever I like. That
means I shall decide when the game will stop. It is only the barest
protection for me, you see. I shall soon become bankrupt, and then I at
least need to have my right to go back home! Surely you would not deny
me that !"

You find the entire offer reasonable enough, and so accept it.

Have you done a wise thing? [Hint: the die is indeed fair, and there is no
word play. It is a pure mathematical puzzle.]

Inclusion-Exclusion formula

Just now we have mentioned the result

P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

We can think of it like this: P(A) + P(B) overestimates P(A ∪ B) because the P(A ∩ B) part is included twice. So we need to exclude it once.

This idea of inclusion and exclusion works for any finite number of
events.

Inclusion-Exclusion formula
Let A_1, ..., A_n be any n events. Let T denote the set of all subsets of {1, ..., n}. Then

P(A_1 ∪ ⋯ ∪ A_n) = ∑_{k=1}^{n} (−1)^{k+1} [ ∑_{α ∈ T, |α| = k} P(A_α) ],

where for any nonempty α ∈ T we define

A_α = ∩_{i ∈ α} A_i.

Proof: The notation is a bit complicated. Let's understand it first with the n = 3 case. Here the first term of the outer sum consists of the sum of all P(A_α) where α ∈ T and |α| = 1 (i.e., α ranges over all singleton subsets of {1, 2, 3}). This sum is simply

P(A_1) + P(A_2) + P(A_3).

Similarly, the next term consists of all P(A_α), where α is a doubleton subset of {1, 2, 3}. Remember that A_{1,2} = A_1 ∩ A_2 and so on. So the second term (for k = 2) becomes

− [P(A_1 ∩ A_2) + P(A_2 ∩ A_3) + P(A_1 ∩ A_3)].

The third term (for k = 3) similarly is P(A_1 ∩ A_2 ∩ A_3). So the entire sum looks like

P(A_1 ∪ A_2 ∪ A_3) = [P(A_1) + P(A_2) + P(A_3)] − [P(A_1 ∩ A_2) + P(A_2 ∩ A_3) + P(A_1 ∩ A_3)] + P(A_1 ∩ A_2 ∩ A_3).

The Venn diagram shows why this formula is correct. But a Venn
diagram cannot be considered as a proof, as it shows only one possible
case. However, a Venn diagram does indicate how to construct a general
proof. Note that A 1 ∪ A 2 ∪ A 3 is made of certain disjoint events:

[Figure: Venn diagram of A_1, A_2, A_3 divided into the disjoint cells B_1, B_2, B_3, B_{12}, B_{13}, B_{23}, B_{123}.]
We have coloured these using blue, green and red. Blue cells consist of points belonging to exactly one A_i. For example, B_1 is the set of points that belong only to A_1. The green cells consist of points belonging to exactly two A_i's, and so on. So

A_1 = B_1 ∪ B_{12} ∪ B_{13} ∪ B_{123},

and similarly for A_2 and A_3. Note the pattern: all the B's with 1 somewhere in the subscript occur in the RHS. Since the events in the RHS are disjoint, we have

P(A_1) = P(B_1) + P(B_{12}) + P(B_{13}) + P(B_{123}).

Now, the first stage (inclusion) is

P(A_1) + P(A_2) + P(A_3) = [P(B_1) + P(B_{12}) + P(B_{13}) + P(B_{123})]
+ [P(B_2) + P(B_{12}) + P(B_{23}) + P(B_{123})]
+ [P(B_3) + P(B_{13}) + P(B_{23}) + P(B_{123})]

Note that here each B with a single subscript occurs once, each B with two subscripts occurs twice, and so on. This is of course natural since, for example, B_{12} occurs once as part of A_1 and then again as part of A_2.

Next, we have

A_{12} = B_{12} ∪ B_{123}.

Here all the B's with 12 in the subscript occur in the RHS. Again, since the B's are all disjoint,

P(A_{12}) = P(B_{12}) + P(B_{123}).

Similarly for A_{23} and A_{13}. Using these, the second stage (exclusion) is

P(A_{12}) + P(A_{23}) + P(A_{13}) = [P(B_{12}) + P(B_{123})]
+ [P(B_{23}) + P(B_{123})]
+ [P(B_{13}) + P(B_{123})]

Note that no B with a single subscript occurs at all. The B's with two subscripts occur once each, while the B with three subscripts occurs thrice. Do you see the pattern? Each B is like B_β, where β is a nonempty subset of {1, 2, 3}. Similarly, each A is like A_α, where α is also a nonempty subset of {1, 2, 3}. Now B_β occurs as a part of A_α if and only if α ⊆ β. So the number of times we see B_{123} in the RHS is the same as the number of subsets of size 2 of {1, 2, 3}, which is \binom{3}{2} = 3.
The same technique will also explain why B_{23}, for example, occurs
only once: the number of subsets of size 2 of \{2,3\} is \binom{2}{2}=1.
In fact, the same approach also explains the absence of B's with single
indices. Each single index B occurs \binom{1}{2}=0 times!

The third stage (inclusion) is similar, though simpler. Here A_{123} = B_{123}, and so P(A_{123}) = P(B_{123}). Our basic pattern holds here also: the 3-index B occurs \binom{3}{3}=1 time. The 2-index B's occur \binom{2}{3}=0 times, and the 1-index B's occur \binom{1}{3}=0 times.

The following table gives a summary of how many times a B with the given number of indices is considered in each stage:

No. of indices of B | Stage 1 (incl) | Stage 2 (excl) | Stage 3 (incl) | Total
1 | \binom{1}{1} | \binom{1}{2} | \binom{1}{3} | 1 - 0 + 0 = 1
2 | \binom{2}{1} | \binom{2}{2} | \binom{2}{3} | 2 - 1 + 0 = 1
3 | \binom{3}{1} | \binom{3}{2} | \binom{3}{3} | 3 - 3 + 1 = 1
Now we head for the general case for any given n.

For any nonempty \alpha\in T define

B_\alpha = the set of all those points that belong to A_i iff i\in\alpha.

Clearly, by definition, the B_\alpha's are all disjoint and A_1\cup\cdots\cup A_n = \cup_{\emptyset\neq\alpha\in T} B_\alpha. Now observe that for any nonempty \alpha\in T, A_\alpha = \cup_{\beta\supseteq\alpha} B_\beta. So P(A_\alpha) = \sum_{\beta\supseteq\alpha} P(B_\beta). So the number of times a k-index B is considered in the r-th stage is \binom{k}{r}. (Click here for more explanation.) More precisely,

\sum_{|\alpha|=r} P(A_\alpha) = \sum_{k=1}^{n} \binom{k}{r} \sum_{|\beta|=k} P(B_\beta).

Hence the total number of times a k-index B gets included is \binom{k}{1}-\binom{k}{2}+\cdots+(-1)^{n+1}\binom{k}{n} = 1-(1-1)^k = 1, using the binomial theorem.

Hence every B is included exactly once in the RHS. Thus, P(\cup_{\emptyset\neq\alpha\in T} B_\alpha) = RHS, as required. [QED]
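As a quick numerical check of the n = 3 case (my own sketch, with assumed events), take a fair die and let A_1 = even numbers, A_2 = numbers greater than 3, A_3 = primes. Both sides of the formula should agree.

Omega = 1:6
A1 = c(2, 4, 6); A2 = c(4, 5, 6); A3 = c(2, 3, 5)
P = function(A) length(A) / length(Omega)    # equally likely cases
lhs = P(union(union(A1, A2), A3))
rhs = P(A1) + P(A2) + P(A3) -
      (P(intersect(A1, A2)) + P(intersect(A2, A3)) + P(intersect(A1, A3))) +
      P(intersect(intersect(A1, A2), A3))
c(lhs, rhs)                                  # both print 5/6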

Notice that the proof has used only the third axiom of probability. So if
we have any function P(\cdot) that satisfies the third axiom, the theorem
is valid for that function as well. Examples of such functions include
length, area, volume, mass, number of elements (for finite sets). In
short, it is true for any measure of size.

Indeed, all functions that satisfy axiom three are called (signed)
measures, and measure theory is the branch of mathematics that deals
with them.

Countable sample spaces


If the sample space S is countable (finite/infinite), say S = \{x_1, x_2, ...\}, then take any sequence p_1, p_2, ... of nonnegative numbers adding up to 1. Defining P(\{x_i\}) = p_i completely specifies a probability. Conversely, any probability can be constructed like this.
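For instance (a small sketch of my own, not from the notes), take S = \{1, 2, 3, ...\} and p_i = (1/2)^i, which are nonnegative and add up to 1. In R we can approximate probabilities of events by summing enough of the p_i:

p = function(i) (1/2)^i
sum(p(1:60))                   # essentially 1; the tail beyond 60 is negligible
sum(p(seq(2, 60, by = 2)))     # P(outcome is even) = 1/4 + 1/16 + ... = 1/3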

Equally likely cases

The simplest special case is when the sample space \Omega is finite (say
|\Omega|=n) and we take p_1=\cdots=p_n=\frac 1n.

In this case, for any A\subseteq \Omega we have P(A) =|A|/|\Omega|.

Many interesting problems fall in this category. They are basically


problems of combinatorics.

One type of problem is the occupancy problem, where we have some boxes, and some balls are distributed over them following various conditions.

EXAMPLE 6: There are three distinct boxes and 10 distinct balls. The balls are dropped randomly among the boxes so that all possible configurations are equally likely. (No ball is outside a box, and each box can hold all the balls.) What is the probability that the first box is empty?

SOLUTION: Each of the 10 balls has 3 possible destinations, irrespective of the other balls. So the total number of configurations is 3^{10}. So |\Omega| = 3^{10}.

Let A be the event that the first box remains empty. Then A occurs if and only if all the balls land in the other 2 boxes. So |A| = 2^{10}.

Since all outcomes are equally likely, P(A) = \frac{|A|}{|\Omega|} = \left(\frac 23\right)^{10}. ■
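A simulation sketch (my own check, not part of the notes): drop the 10 distinct balls into the 3 boxes uniformly at random and estimate the chance that the first box stays empty.

empty1 = replicate(10000, {
  boxes = sample(3, 10, replace = TRUE)   # the box receiving each of the 10 balls
  !any(boxes == 1)                        # TRUE if box 1 received no ball
})
mean(empty1)                              # should be close to (2/3)^10, about 0.017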

EXAMPLE 7: Same problem as above, except that the balls are now
identical. The boxes are still distinct. What is the answer now?

SOLUTION: By the bar-star argument |\Omega| = \binom{12}{2}=66.

Similarly, |A| = \binom{11}{1} = 11.

So the answer is \frac 16. ■
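We can also check this by direct enumeration in R (my own sketch): with identical balls a configuration is just the triple of box counts (n1, n2, n3) with n1 + n2 + n3 = 10.

configs = expand.grid(n1 = 0:10, n2 = 0:10, n3 = 0:10)
configs = configs[rowSums(configs) == 10, ]   # all valid configurations
nrow(configs)                                 # 66, i.e., choose(12, 2)
mean(configs$n1 == 0)                         # 11/66 = 1/6: first box empty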

Certain real-life scenarios may be modelled like this. Here are a few
examples from physics (no need to cram these terms for the exams!).

EXAMPLE 8: There are r (identical/distinct) particles. Each particle


may be in one of n distinct states. We can think of the particles as balls
and the states as boxes. For example, if the states are UP and DOWN,
and there are r=12 identical particles, among which 5 are in UP state and
7 in DOWN, we can visualise this as:

[Figure: the 12 identical particles shown as balls placed in two boxes labelled UP and DOWN.]
If there are r=3 distinct particles and the same two states, then the
picture could be like:

Physicists assume various types of probabilities on these.

Maxwell-Boltzmann distribution: The balls are distinct. Then there are n^r many possible configurations (each of the r balls has n possible destinations). All these n^r configurations are assumed equally likely. No real life particle shows this behaviour.

Bose-Einstein distribution: The balls are identical. So, by the bar-star argument, we know that there are {n+r-1\choose n-1} configurations possible. These are assumed equally likely.

Fermi-Dirac distribution: Here n\geq r and no box can hold more than one ball. Balls are identical. So {n\choose r} distinct configurations are possible (since a configuration is determined completely by which of the boxes have one ball in it). All these configurations are assumed equally likely.
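For concreteness, here is a quick count of the three models in R (a sketch with assumed values r = 3 particles and n = 5 states):

n = 5; r = 3
n^r                          # Maxwell-Boltzmann: 125 equally likely configurations
choose(n + r - 1, n - 1)     # Bose-Einstein: 35
choose(n, r)                 # Fermi-Dirac: 10 (needs n >= r)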

Next we discuss a type of problem that is used in statistical quality control. In this connection, you should know about a few terms:
Simple random sample (SRS): Suppose that you have a finite set (called a population), and you draw one element from it at random (giving equal probability to each). This is called simple random sampling.
Simple random sample without replacement (SRSWOR): Again you start with a finite population (of size n), and draw a bunch of r elements from it at random (giving all the \binom nr samples equal chance). This is called an SRSWOR.
Simple random sample with replacement (SRSWR): Again you start with a finite population (of size n), and draw a single element from it using simple random sampling. You make a note of the element, and return it to the population. You repeat this r times. This is called an SRSWR.
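In R (a sketch, using an assumed population 1:20), sample() draws an SRSWOR by default, and replace = TRUE turns it into an SRSWR:

population = 1:20
sample(population, 5)                   # SRSWOR of size 5
sample(population, 5, replace = TRUE)   # SRSWR of size 5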

EXAMPLE 9: An electronic component is packaged 100 per box. Each


component may be either good or defective. We want to accept a box if
and only if it has no defective component in it. Testing all the 100
components one by one is too time consuming (and also useless if the
test is destructive). So instead we draw a simple random sample without
replacement (SRSWOR) of size 10 and reject the box if any of these 10
turns out to be defective. Find the probability that a box containing
exactly 5 defectives will be rejected.

SOLUTION: Let \Omega = all samples of size 10. What is its size?
Instead of writing {100\choose 10}, we shall follow the steps that a
typical quality control officer would take: pick one, then pick the next,
then the next and so on. This stepwise approach is generally better
(Why?) than jumping into a ^nC_r or ^nP_r formula.

Here the first step may be done in 100 ways, the next in 99 ways, and so on, all the way up to 100-10+1 = 91 ways for the last step.

So |\Omega| = 100\times 99\times\cdots\times 91.


By the given condition, all the outcomes are equally likely.

Let A = the event that the box is rejected.

Thus A is the event that the box contains 1 or more defectives.

There are 5 possible cases here: exactly i defectives, where i=1,...,5.


These are disjoint cases. So we may try to find the probabilities of each
and add up. But we can save the work by working with A^c instead.

Here A^c= the event that the box contains no defective. Again we shall
find |A^c| stepwise.

The first component may be selected in 95 ways (avoiding the 5


defectives). The next in 94 ways, etc up to the 10-th component, which
may be selected in 95-10+1 = 86 ways.

Hence |A^c| = 95\times94\times\cdots\times86.

Thus, P(A^c) = \frac{|A^c|}{|\Omega|} =


\frac{95\times94\times\cdots\times86}
{100\times99\times\cdots\times91}. ■
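A quick check of this answer in R (my own sketch, not part of the notes): compute the exact value, and also estimate P(A^c) by simulating the inspection plan.

prod(95:86) / prod(100:91)                 # exact P(A^c), about 0.584
box = c(rep("good", 95), rep("defective", 5))
accepted = replicate(20000, !any(sample(box, 10) == "defective"))
mean(accepted)                             # simulated estimate of P(A^c)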

Here is one example where we shall need the inclusion-exclusion


formula.

EXAMPLE 10: 10 distinct balls are dropped randomly over 3 (distinct)


boxes. Each box can hold any number of balls. What is the probability
that at least one box will remain empty?

SOLUTION: Such "at least" problems usually mean that our event is the
union of some simpler events. Since the simpler events may not be
disjoint, we need the inclusion-exclusion formula.
Let \Omega= the set of all possible ways of dropping the balls. Then
|\Omega| = 3^{10}.

We assume that all the outcomes are equally likely.

Let A_i=the event that the i-th box remains empty, where i=1,2,3.

Then we are looking for P(A_1 \cup A_2\cup A_3).

By the inclusion-exclusion formula, this is the same as

P(A_1)+P(A_2)+P(A_3) - \big[P(A_1\cap A_2)+P(A_2\cap A_3)+P(A_3\cap A_1)\big] + P(A_1\cap A_2\cap A_3).

Since the labelling of the boxes is arbitrary, we have P(A_1) = P(A_2) = P(A_3).

Similarly, the three probabilities inside the exclusion term are also equal to each other. Thus we are left with 3P(A_1) - 3P(A_1\cap A_2) + P(A_1\cap A_2\cap A_3). Now |A_1| = 2^{10} and |A_1\cap A_2| = 1^{10} (Why?).

Also |A_1\cap A_2\cap A_3| = 0 (Why?) .

So P(A_1) = \frac{|A_1|}{|\Omega|} = \left(\frac 23\right)^{10} and


P(A_1\cap A_2) = \frac{|A_1\cap A_2|}{|\Omega|} = \left(\frac
13\right)^{10}.

Hence the required answer is 3\times \left[ \left(\frac 23\right)^{10}-


\left(\frac 13\right)^{10} \right] = \frac{2^{10}-1}{3^9}.
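A simulation sketch (my own check of this answer): generate the random configuration directly and see how often some box stays empty.

at.least.one.empty = replicate(20000, {
  boxes = sample(3, 10, replace = TRUE)   # the box receiving each of the 10 balls
  length(unique(boxes)) < 3               # TRUE if some box got no ball
})
mean(at.least.one.empty)                  # compare with (2^10 - 1)/3^9, about 0.052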

Simulations

Often we come across events that are easily described in words, but whose probabilities are rather hard to compute. Computer simulation comes in handy in such cases. Computer simulations help in detecting theoretical mistakes too.

EXAMPLE 11: A deck of 10 cards labelled 1,...,10 is shuffled


thoroughly. We shall say that the i-th card is at home, if it is in the i-th
position after the shuffle. Write an R code to estimate the probability
that exactly 3 cards are at home.

SOLUTION:

event = numeric(5000)
for(k in 1:5000) {
  x = sample(10, 10)              # a random shuffle of the cards 1,...,10
  at.home = sum(x == (1:10))      # number of cards at home
  event[k] = (at.home == 3)
}
mean(event)                       # proportion of shuffles with exactly 3 cards at home

■

Problems for practice


Sets and Venn diagrams
EXERCISE 6: Let E,F,G be any three events. Find expressions for the
event that of E,F,G

1. only F occurs
2. both E and F but not G occur
3. at least one event occurs
4. at least two events occur
5. all three events occur
6. none occurs
7. at most one occurs
8. at most two occur
9. exactly two occur
10. exactly one occurs.

[Hint]

EXERCISE 7: Show that E\cap (F\cup G) = (E\cap F)\cup (E\cap G).

[Hint]

EXERCISE 8: Show that (A\cup B)^c = A^c \cap B^c.

[Hint]

EXERCISE 9: State (with proof/counterexample) which of the following


statements is correct/incorrect:

1. (A\cup B)\setminus C = A\cup (B\setminus C).


2. A\cap B\cap C = A\cap (C\cup B).
3. A\cup B\cup C = A\cup (B\setminus (A\cap B))\cup (C\setminus
(A\cap C)).
4. (A\cap B)\cup (B\cap C) \cup (C\cap A)\subseteq A\cup B\cup C.

[Hint]

Axioms

EXERCISE 10: Take \Omega = \{0,1,2\}. Obtain functions P_i:\Omega\rightarrow{\mathbb R} for i=1,2,3 such that P_i violates axiom i, but satisfies the other two.
[Hint]

EXERCISE 11: If P(A)=0.9 and P(B)=0.8, show that P(A\cap B)\geq


0.7. In general, show that P(A\cap B)\geq P(A)+P(B)-1. This is known
as Bonferroni's inequality.

[Hint]

EXERCISE 12: Show that P\left( \cup_1^n A_i\right) \leq \sum_1^n


P(A_i).

[Hint]

Equally likely

EXERCISE 13: An SRSWOR of size 2 is drawn from \{1,2,3,4,5\}. What


is the probability that (a) the first selected digit is odd? (b) the second
selected digit is odd? (c) both are odd? (d) at least one is odd?

[Hint]

EXERCISE 14: A fair coin is tossed 6 times. What is the probability that
the first head occurs (a) at the third toss? (b) not before the third toss?

[Hint]

EXERCISE 15: 10 distinct balls are dropped randomly in 3 distinct


boxes. What is the probability that none of the boxes remain empty?

[Hint]

EXERCISE 16: Two fair dice are tossed. What is the probability that the sum is i, for i=2,3,...,12?
[Hint]

EXERCISE 17: Two cards are randomly selected from a deck of 52


playing cards. What is the probability that they are of the same
denomination?

[Hint]

EXERCISE 18: 10 light bulbs are shining in a row. If lightning strikes, then some (or all or none) of the light bulbs may go out (all possibilities being equally likely). What is the chance that after the lightning at least two consecutive light bulbs are still shining?

[Hint]

EXERCISE 19: We have 4 letters and their respective addressed


envelopes. If the letters are placed randomly in the envelopes, then find
the probability that exactly k letters are in their correct envelopes for
k=1,2,3,4.

[Hint]

EXERCISE 20: The numbers 1,2,...,n are arranged in random order.


Find the probability that the digits 1,2,3 appear as neighbours in this
order.

[Hint]

EXERCISE 21: A throws six dice and wins if he scores at least one ace. B
throws twelve dice and wins if he scores at least two aces. Who has the
greater probability to win?

[Hint]
EXERCISE 22: Find the probability that among three random digits there
appear exactly 1,2 or 3 different digits. Also do the same for four
random digits.

[Hint]

EXERCISE 23: Find, for r=1,2,3,..., the probability p_r that in a sample
of r random digits no two are equal.

[Hint]

EXERCISE 24: If n balls are placed at random among n cells, find the
probability that exactly one cell remains empty.

[Hint]

EXERCISE 25: A man is given n keys of which only one fits his door. He
tries them successively using SRSWOR until he finds the right key. Show
that the probability that he will try k keys is \frac 1n for k=1,...,n.

[Hint]

EXERCISE 26: Suppose that each of n sticks is broken into one long and
one short part. The resulting 2n pieces are combined pairwise in a
random fashion. What is the probability that the original pairings are
restored? What is the probability that each long piece gets a short
partner?

[Hint]

EXERCISE 27: A box contains 90 good and 10 defective screws. If 10


screws are selected at random (SRSWOR), then find the probability that
none of these are defective.
[Hint]

EXERCISE 28: From the set \{a,b,c,d,e\} we draw an SRSWR of size


25. What is the probability that the sample will have 5 occurrences of
each of the letters?

[Hint]

EXERCISE 29: If n men (including A and B) stand in a row in random


order, what is the probability that there will be exactly r men between A
and B?

[Hint]

EXERCISE 30: What is the probability that two throws with three dice
each will show the same configuration if (a) the dice are distinct (e.g.,
(2,3,6) is not the same as (3,2,6)). (b) the dice are identical (e.g., \
{2,3,6\} is the same as \{3,2,6\})?

[Hint]

EXERCISE 31: There are n persons in a room. Assuming that each


person is equally likely to be born in any day of the year (1 year=365
days), find the probability that at least two persons share the same
birthday.

[Hint]

EXERCISE 32: A cereal company is giving a free ball with every box. The ball is either red or green or blue or yellow, each being equally likely. A buyer has bought 10 boxes. What is the probability that (s)he has at least one ball of each colour?

[Hint]
Harder

EXERCISE 33: A town has n+1 people: p_1,...,p_{n+1}. A piece of news is spreading as a rumour in this town as follows. Initially, only p_1 knows the news. He communicates the news to one of the remaining n people randomly. This person again communicates the news to one of the other n persons (maybe p_1 again) randomly, and so on. Find the probability that the rumour spreads r times without returning to p_1. Also, find the probability that the rumour spreads r times without being repeated to any person.

[Hint]

EXERCISE 34:

[Hint]

EXERCISE 35:

[Hint]

EXERCISE 36:

[Hint]

EXERCISE 37:

[Hint]

EXERCISE 38:

[Hint]

EXERCISE 39:

[Hint]
EXERCISE 40:

[Hint]

EXERCISE 41: Let A_1,A_2,A_3 be three events. Let p_1 = \sum P(A_i) and p_2 = \sum_{i<j} P(A_i\cap A_j) and p_3 = P(A_1\cap A_2\cap A_3). Find (in terms of the p_i's) the probability that exactly one of the events A_1,A_2,A_3 has occurred. Generalise to n events. Also find (and prove) a formula (in terms of the p_i's) for the probability that exactly r of the n events have occurred.

[Hint]
