____________________________________________________________________________________________________
Subject ECONOMICS
Paper No and Title 2: Quantitative Methods-II (Statistical Methods)
Module No and Title 16: Methods of Point Estimation
Module Tag ECO_P2_M16
____________________________________________________________________________________________________
TABLE OF CONTENTS
1. Learning Outcomes
2. Introduction
3. Method of Moments
4. Method of maximum likelihood
5. Summary
1. Learning Outcomes
After studying this module, you shall be able to
Know about the method of moments
Know about the method of maximum likelihood
2. Introduction
There can be many different ways of estimating a parameter of a population. Moreover, estimators possess the various desirable properties to varying degrees. Therefore, it is desirable to have some general methods that provide estimators with reasonably desirable properties. Here we discuss two such methods: the method of moments, which is one of the oldest methods, and the method of maximum likelihood. Maximum likelihood estimators are generally preferable to moment estimators since they have certain efficiency properties. However, they generally require significantly more computation than the method of moments.
3. Method of Moments
Let a random sample X₁, X₂, …, Xₙ be taken from a probability mass function or probability density function f(x). Let μ′ₖ = E(Xᵏ) represent the kth population moment of the distribution f(x). We represent the kth sample moment by m′ₖ, where m′ₖ = (1/n) Σᵢ xᵢᵏ.
Thus the first population moment is E(X) = μ, and the first sample moment is x̄ = (1/n) Σᵢ xᵢ. Similarly, the second population and sample moments are E(X²) and (1/n) Σᵢ xᵢ² respectively.
The method of moments involves equating the population moments to their corresponding sample moments. Thus we get as many equations as are required, from which we solve for the unknown parameters of the population. The method of moments thus consists of solving the following system of equations
m′ₖ = μ′ₖ, k = 1, 2, …, p
for the p parameters of the population.
The following example makes the method of moments clearer.
Example 1: If we want to estimate the parameter p of the binomial distribution when n is known, then the equation we have to solve is m′₁ = μ′₁.
Since μ′₁ = np, we have m′₁ = np.
Hence p̂ = m′₁/n.
If both n and p are unknown, then the system of equations we shall have to solve is
m′₁ = μ′₁ and m′₂ = μ′₂
Since μ′₁ = np and μ′₂ = npq + (μ′₁)²
we get
m′₁ = np and m′₂ = npq + (np)²
and solving these two equations for n and p, we get the following estimates of the two parameters of the binomial distribution:
m′₂ = m′₁q + (m′₁)² ⟹ p̂ = 1 − (m′₂ − (m′₁)²)/m′₁
m′₁ = np ⟹ n̂ = m′₁/p̂.
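As a numerical illustration, the sketch below applies the two moment equations for the case where both n and p are unknown. The sample values are hypothetical, chosen only to show the computation.

```python
# Method-of-moments estimates for a binomial(n, p) sample when both
# n and p are unknown (Example 1).  The data below are hypothetical.

def binomial_mom(sample):
    """Return (n_hat, p_hat) from the first two raw sample moments."""
    k = len(sample)
    m1 = sum(sample) / k                  # first sample moment
    m2 = sum(x * x for x in sample) / k   # second sample moment
    p_hat = 1 - (m2 - m1 ** 2) / m1       # from m2 = m1*q + m1**2
    n_hat = m1 / p_hat                    # from m1 = n*p
    return n_hat, p_hat

n_hat, p_hat = binomial_mom([3, 5, 4, 6, 2, 4, 5, 3])
print(n_hat, p_hat)   # -> 6.4 0.625
```

In practice the estimate of n would be rounded to an integer, since n counts trials.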
Example 2: Given a random sample of size 2 from a uniform population with β = 1, use the method of moments to obtain a formula for estimating the parameter α.
Solution: We solve the following equation: m′₁ = μ′₁, where m′₁ = x̄ = (x₁ + x₂)/2 and μ′₁ = (α + β)/2 = (α + 1)/2. Thus x̄ = (α + 1)/2, and we write the estimate of α as α̂ = 2x̄ − 1.
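The resulting formula is a one-liner; the Python sketch below computes it for two hypothetical sample values.

```python
# Method-of-moments estimate of alpha for a uniform(alpha, 1) population
# from a sample of size 2 (Example 2): alpha_hat = 2*xbar - 1.
# The sample values passed in below are hypothetical.

def uniform_alpha_mom(x1, x2):
    xbar = (x1 + x2) / 2      # first sample moment
    return 2 * xbar - 1       # solves xbar = (alpha + 1) / 2

print(uniform_alpha_mom(0.6, 0.8))   # -> approximately 0.4
```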
4. Method of Maximum Likelihood
The method of maximum likelihood involves estimating those values of the unknown
population parameter for which the probability of obtaining the observed random sample
is maximum.
In general, let the observed sample values be x₁, x₂, …, xₙ, and let P(X₁ = x₁, X₂ = x₂, …, Xₙ = xₙ) = f(x₁, x₂, …, xₙ; θ) represent the value of the joint probability distribution of the random variables X₁, X₂, …, Xₙ at the sample point (x₁, x₂, …, xₙ). Since the sample values have been observed and are therefore fixed numbers, we regard f(x₁, x₂, …, xₙ; θ) as a function of the parameter θ. This function of θ is referred to as the likelihood function and is represented by L(θ). Similarly, when the random sample comes from a continuous population, f(x₁, x₂, …, xₙ; θ) is the value of the joint probability density at the sample point (x₁, x₂, …, xₙ). The method of maximum likelihood consists of maximizing
the likelihood function with respect to θ, and we refer to the value of θ which maximizes the likelihood function as the maximum likelihood estimate of θ. To maximize L(θ) = f(x₁, x₂, …, xₙ; θ), we take the derivative of L(θ) with respect to θ and set it equal to zero.
The method is capable of generalization. In case there are several parameters, we take the partial derivatives with respect to each parameter, set them equal to zero, and solve the resulting equations simultaneously. Moreover, a large sample drawn from a well-specified population distribution will yield a maximum likelihood estimate of θ with, approximately, the properties of an MVUE (minimum variance unbiased estimator): it will be approximately unbiased and will approximately have the least variance.
Example 3: Given x "successes" in n trials, find the maximum likelihood estimator of the parameter θ of the binomial distribution.
Solution: To find the value of θ which maximizes
L(θ) = b(x; n, θ) = C(n, x) θˣ(1 − θ)ⁿ⁻ˣ,
where C(n, x) is the binomial coefficient, it will be convenient to make use of the fact that the value of θ which maximizes L(θ) will also maximize
ln L(θ) = ln C(n, x) + x·ln θ + (n − x)·ln(1 − θ)
Thus we get
d ln L(θ)/dθ = x/θ − (n − x)/(1 − θ)
and, equating this derivative to 0 and solving for θ, we find that the likelihood function has a maximum at θ = x/n. Hence the maximum likelihood estimator of the parameter θ of the binomial distribution is θ̂ = X/n.
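The result θ = x/n can be checked numerically. The sketch below evaluates the binomial log-likelihood on a fine grid and picks the maximizer; the values n = 10 and x = 3 are illustrative choices, not from the text.

```python
import math

# Numerical check of Example 3: with x successes in n trials, the
# log-likelihood ln C(n,x) + x*ln(theta) + (n-x)*ln(1-theta) peaks at x/n.

def log_lik(theta, n, x):
    return (math.log(math.comb(n, x))
            + x * math.log(theta) + (n - x) * math.log(1 - theta))

n, x = 10, 3                                 # illustrative values
grid = [i / 1000 for i in range(1, 1000)]    # theta in (0, 1)
best = max(grid, key=lambda t: log_lik(t, n, x))
print(best)   # -> 0.3, i.e. x/n
```

A grid search is used only because it makes the maximization visible; the closed-form answer needs no search.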
Example 4: Suppose that n observations X₁, X₂, …, Xₙ are made from a normally distributed population. Find
(a) the maximum likelihood estimate of the mean if the variance is known but the mean is unknown;
(b) the maximum likelihood estimate of the variance if the mean is known but the variance is unknown.
Solution:
(a) Since f(xₖ; μ) = (1/√(2πσ²)) e^(−(xₖ − μ)²/(2σ²))
we have
(1) L = f(x₁; μ) ⋯ f(xₙ; μ) = (2πσ²)^(−n/2) e^(−Σ(xₖ − μ)²/(2σ²))
Therefore,
(2) ln L = −(n/2) ln(2πσ²) − (1/(2σ²)) Σ(xₖ − μ)²
(3) Taking the partial derivative with respect to μ yields
(1/L) ∂L/∂μ = (1/σ²) Σ(xₖ − μ)
(4) Setting ∂L/∂μ = 0 gives
Σ(xₖ − μ) = 0, i.e. Σxₖ − nμ = 0
(5) Or, μ̂ = (1/n) Σxₖ = x̄
Therefore the maximum likelihood estimate of the mean is the sample mean.
(b) Since f(xₖ; σ²) = (1/√(2πσ²)) e^(−(xₖ − μ)²/(2σ²))
(1) we have L = f(x₁; σ²) ⋯ f(xₙ; σ²) = (2πσ²)^(−n/2) e^(−Σ(xₖ − μ)²/(2σ²))
(2) Therefore, ln L = −(n/2) ln(2πσ²) − (1/(2σ²)) Σ(xₖ − μ)²
(3) Taking the partial derivative with respect to σ² yields
(1/L) ∂L/∂σ² = −n/(2σ²) + (1/(2(σ²)²)) Σ(xₖ − μ)²
Setting ∂L/∂σ² = 0 gives
σ̂² = (1/n) Σ(xₖ − μ)²
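Both closed-form answers are easy to compute directly. The sketch below does so for a small hypothetical sample, with the known mean of part (b) also chosen for illustration.

```python
# Closed-form normal MLEs from Example 4:
#   (a) with sigma^2 known, mu_hat is the sample mean;
#   (b) with mu known, sigma2_hat = sum((x - mu)^2) / n.
# The sample and mu_known below are hypothetical.

def normal_mles(sample, mu_known):
    n = len(sample)
    mu_hat = sum(sample) / n                                   # part (a)
    sigma2_hat = sum((x - mu_known) ** 2 for x in sample) / n  # part (b)
    return mu_hat, sigma2_hat

mu_hat, sigma2_hat = normal_mles([1, 2, 3, 4], mu_known=2.5)
print(mu_hat, sigma2_hat)   # -> 2.5 1.25
```

Note that part (b) divides by n, not n − 1: the MLE of the variance is not the usual unbiased sample variance.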
Example 5: Prove that the maximum likelihood estimate of the parameter α of a population having density function f(x; α) = (2/α²)(α − x), 0 < x < α, for a sample of unit size, is 2x, x being the sample value. Show also that the estimate is biased.
Solution: A sample of unit size means n = 1, so the likelihood function is
L(α) = f(x; α) = (2/α²)(α − x)
log L(α) = log 2 − log α² + log(α − x)
= log 2 − 2 log α + log(α − x)
Differentiating w.r.t. α, we get
d/dα [log L(α)] = −2/α + 1/(α − x)
d²/dα² [log L(α)] = 2/α² − 1/(α − x)²
For a maximum or minimum, d/dα [log L(α)] = 0
∴ −2/α + 1/(α − x) = 0 ⟹ 2/α = 1/(α − x) ⟹ 2α − 2x = α ⟹ α = 2x
When α = 2x,
d²/dα² [log L(α)] = 2/(4x²) − 1/x² = −1/(2x²) < 0, i.e. d²/dα² [log L(α)] is negative.
∴ The maximum likelihood estimator of α is α̂ = 2x.
E(α̂) = E(2X) = 2∫₀^α x·(2/α²)(α − x) dx = (4/α²)[αx²/2 − x³/3]₀^α = (4/α²)(α³/6) = (2/3)α
Since E(α̂) ≠ α, α̂ = 2x is not an unbiased estimate of α.
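The bias result E(2X) = (2/3)α can be verified numerically by approximating the integral. The sketch below uses a midpoint Riemann sum; α = 3 is an illustrative choice, for which (2/3)α = 2.

```python
# Numerical check of Example 5's bias: for f(x) = (2/alpha^2)(alpha - x)
# on 0 < x < alpha, E(2X) = (2/3)*alpha, which differs from alpha.
# alpha = 3 below is illustrative.

def expected_2x(alpha, steps=100_000):
    """Approximate E(2X) = integral of 2x * f(x) dx over (0, alpha)."""
    h = alpha / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h                          # midpoint of sub-interval
        total += 2 * x * (2 / alpha ** 2) * (alpha - x) * h
    return total

print(expected_2x(3))   # -> approximately 2.0, i.e. (2/3)*3, not 3
```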
5. Summary
Point estimation refers to the process of estimating a parameter from a probability
distribution, based on observed data from the distribution.
We have discussed two general methods to derive a point estimate for an unknown parameter: the first is the method of moments and the second is the method of maximum likelihood.
The method of moments involves equating the population moments to their
corresponding sample moments.
The method of maximum likelihood involves estimating those values of the
unknown population parameter for which the probability of obtaining the
observed random sample is maximum.