SANAKA EDUCATIONAL TRUST’S GROUP OF
INSTITUTIONS
(A UNIT OF SANAKA EDUCATIONAL TRUST)
VILL+P.O- MALANDIGHI, P.S- KANKSA, DURGAPUR-713212
APPROVED BY AICTE, AFFILIATED TO MAKAUT, WEST BENGAL
NAME : AYAN SINGHA ROY
UNIVERSITY ROLL NO. : 27830824028
REGISTRATION NO. : 242780110213 (2024-25)
STREAM : B.TECH IN CSE(AI&ML)
SEMESTER : 2ND
PAPER CODE : BS-M201
PAPER NAME : MATHEMATICS-IIA
TOPIC NAME : BASIC PROBABILITY
1. Classical definition & axiomatic definition of
probability
CLASSICAL DEFINITION OF PROBABILITY:
The classical definition of probability applies to situations
where all possible outcomes are equally likely. It is based
on the idea that, given a set of equally likely outcomes, the
probability of a specific event occurring is the ratio of the
number of favorable outcomes to the total number of possible
outcomes.
FORMULA-
P(A)= n(A)/n(S)
where:
P(A) is the probability of event A occurring
n(A) is the number of favorable outcomes
n(S) is the total number of possible outcomes in the
sample space
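As a small illustration of the formula (a hypothetical event — rolling an even number on a fair six-sided die — chosen just for this sketch):

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die
S = {1, 2, 3, 4, 5, 6}
# Event A: the roll shows an even number
A = {x for x in S if x % 2 == 0}

# Classical probability: P(A) = n(A) / n(S)
P_A = Fraction(len(A), len(S))
print(P_A)  # 1/2
```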
AXIOMATIC DEFINITION OF PROBABILITY:
The axiomatic definition of probability was introduced by
Andrey Kolmogorov in 1933 and is based on a set of axioms
that define probability in a rigorous mathematical way. These
axioms provide a solid foundation for probability theory.
AXIOMS:
Let S be the sample space and P(A) be the probability of an
event A. The probability function P must satisfy:
1. Non-Negativity: P(A) ≥ 0 for all events A
2. Normalization: P(S) = 1
3. Additivity: If A and B are disjoint, then P(A ∪ B) = P(A) + P(B).
For countably many pairwise disjoint events A1, A2, …,
P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + …
2. Conditional probability & Bayes’ theorem.
CONDITIONAL PROBABILITY:
Let E be a random experiment and A and B be two events of E. Then the
conditional probability of the event A, on the hypothesis that the
event B has already occurred, is denoted by P(A/B) and defined by
P(A/B) = P(AB)/P(B), where P(B) ≠ 0
Similarly, P(B/A) = P(AB)/P(A), where P(A) ≠ 0
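The definition can be checked by counting outcomes. In this sketch the events are hypothetical: two fair dice are rolled, A is "the sum is at least 9" and B is "the first die shows 4":

```python
from fractions import Fraction
from itertools import product

# Sample space: ordered outcomes of two fair dice
S = list(product(range(1, 7), repeat=2))

A = {(x, y) for x, y in S if x + y >= 9}  # event A: sum at least 9
B = {(x, y) for x, y in S if x == 4}      # event B: first die shows 4

def prob(E):
    return Fraction(len(E), len(S))

# P(A/B) = P(AB) / P(B); only (4,5) and (4,6) lie in both events
P_A_given_B = prob(A & B) / prob(B)
print(P_A_given_B)  # 1/3
```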
BAYES’ THEOREM:
Let E be a random experiment and a1, a2, ……, an be a set of n mutually
exclusive and exhaustive events corresponding to E. Now if B is an
arbitrary event with P(B) ≠ 0, then
P(ai/B) = P(ai) P(B/ai) / [ Σj P(aj) P(B/aj) ]
where i = 1, 2, 3, ……, n
& j = 1, 2, 3, ……, n
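The theorem can be sketched numerically. The priors P(ai) and likelihoods P(B/ai) below are hypothetical values (three causes a1, a2, a3 and an observed event B), chosen only to show the computation:

```python
from fractions import Fraction

# Hypothetical priors P(a_i) for three mutually exclusive, exhaustive causes
prior = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]
# Hypothetical likelihoods P(B/a_i)
likelihood = [Fraction(1, 10), Fraction(1, 5), Fraction(1, 2)]

# Total probability: P(B) = sum_j P(a_j) P(B/a_j)
P_B = sum(p * l for p, l in zip(prior, likelihood))

# Bayes' theorem: P(a_i/B) = P(a_i) P(B/a_i) / P(B)
posterior = [p * l / P_B for p, l in zip(prior, likelihood)]

print(posterior)
assert sum(posterior) == 1  # the posteriors form a probability distribution
```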
3. Moments of binomial variate.
The moments of a binomial variate help describe its
distribution. The first moment, the mean, is np,
representing the expected number of successes. The second
central moment is the variance, np(1 - p), which indicates
how much the values deviate from the mean. The third central
moment, related to skewness, is np(1 - p)(1 - 2p) and shows
whether the distribution leans more towards higher or lower
values. The fourth central moment, associated with kurtosis,
gives insight into the shape and peak of the distribution.
These moments are essential in understanding the spread,
symmetry, and overall behavior of a binomially distributed
variable.
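The closed forms above can be verified against the binomial probability mass function by direct summation. The parameters n = 10, p = 3/10 are hypothetical, picked only for the check; exact fractions avoid rounding:

```python
from fractions import Fraction
from math import comb

n, p = 10, Fraction(3, 10)  # hypothetical example parameters

# Binomial pmf: P(X = k) = C(n, k) p^k (1 - p)^(n - k)
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * pk for k, pk in pmf.items())
var = sum((k - mean) ** 2 * pk for k, pk in pmf.items())
mu3 = sum((k - mean) ** 3 * pk for k, pk in pmf.items())

# The directly computed moments match the closed forms from the text
assert mean == n * p                          # np
assert var == n * p * (1 - p)                 # np(1 - p)
assert mu3 == n * p * (1 - p) * (1 - 2 * p)   # np(1 - p)(1 - 2p)
```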
4. Variance and standard deviation of
random variable.
The variance of a random variable measures how
much its values deviate from the mean. If X is a
random variable with expected value E(X) = µ, the
variance is given by:
Var(X) = E[(X - µ)²] = E(X²) - µ²
It represents the average squared deviation from the
mean. A higher variance indicates greater spread in the
data. The standard deviation is the square root of the
variance:
σ = √Var(X)
It measures the dispersion in the same units as the
original variable, making it easier to interpret.
Standard deviation helps in understanding the
consistency of data and is widely used in statistics and
probability.
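Both formulas can be checked on a simple discrete variable. Here X is taken to be the score of a fair die (an illustrative choice, not from the text):

```python
from fractions import Fraction
from math import sqrt

# X = score of a fair six-sided die
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

mu = sum(x * p for x, p in pmf.items())               # E(X) = 7/2
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # E[(X - mu)^2]

# Equivalent shortcut form: Var(X) = E(X^2) - mu^2
assert var == sum(x**2 * p for x, p in pmf.items()) - mu**2

sd = sqrt(var)  # standard deviation, in the same units as X
print(mu, var, sd)  # 7/2, 35/12, and its square root
```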
5. Probability mass function and probability
density function.
• We know that the spectrum of a random variable is either discrete or
continuous, and consequently the corresponding distribution is
known as a discrete probability distribution or a continuous probability
distribution.
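The distinction can be sketched in code: a probability mass function assigns a probability to each discrete point and sums to 1, while a probability density function yields probability only through integration. The pmf values and the uniform density on [0, 1] below are hypothetical examples:

```python
from fractions import Fraction

# Discrete case: a pmf assigns a probability to each point; the values sum to 1
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
assert sum(pmf.values()) == 1

# Continuous case: for the uniform density f(x) = 1 on [0, 1], a single
# value f(x) is NOT a probability; probability comes from integrating f.
# Approximate the total integral with a Riemann sum:
def f(x):
    return 1.0

n = 1000
riemann = sum(f(i / n) * (1 / n) for i in range(n))
assert abs(riemann - 1.0) < 1e-9  # the density integrates to 1
```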