
INSTITUT SAINS MATEMATIK

UNIVERSITI MALAYA
SQB7009

Tutorial 1

1. Suppose a probability density function for $X$ is given by $p(x) = c\,k(x)$, for some
functions $c$ and $k$, where $c$ does not depend on $x$. Then $k(x)$ is said to be the
kernel of the distribution, and $c^{-1} = \int k(x)\,dx$. (A worked example follows parts (a)-(d) below.)
For each of the following distributions, write down the probability density function and find
the kernel:

(a)
(b)
(c)
(d)
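As an illustration (a hypothetical case, not necessarily one of the four above), the $\mathrm{Beta}(\alpha, \beta)$ density factorises as
$$p(x) = \underbrace{\frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}}_{c}\ \underbrace{x^{\alpha-1}(1-x)^{\beta-1}}_{k(x)}, \qquad 0 < x < 1,$$
so the kernel is $k(x) = x^{\alpha-1}(1-x)^{\beta-1}$, and the constant $c$ is everything that does not involve $x$.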

2. Let $X$ be a random variable with pdf given by . In each of the following, what is the
distribution of $X$?
(a)
(b)
(c)
(d)
(e)
(f)

3. Find the following sums and integrals by identifying the kernel and using properties of pdfs.
(a)

(b)

(c)
(d) , where is an integer
(e)
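For example (an illustrative integral in the same spirit, not necessarily one of the cases above):
$$\int_0^\infty x^{3} e^{-2x}\, dx = \frac{\Gamma(4)}{2^{4}} \int_0^\infty \frac{2^{4}}{\Gamma(4)}\, x^{4-1} e^{-2x}\, dx = \frac{3!}{16} = \frac{3}{8},$$
since the integrand is the kernel of a $\mathrm{Gamma}(4, 2)$ density, and the normalised density integrates to one.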
INSTITUT SAINS MATEMATIK
UNIVERSITI MALAYA
SQB7009

Tutorial 2

1. Suppose we have the example from lectures (section II.2) of patients who survive to six
months after being treated with a new drug. We have $X \mid \theta \sim \mathrm{Binomial}(n, \theta)$, where $\theta$ is the
probability of survival. Various doctors are consulted, and each gives a different prior
distribution for their prior beliefs about the new drug. The trial goes ahead, and we observe
6 out of 15 patients who survive.
For each of the priors below, find the posterior distribution and calculate the posterior mean.

(a)
(b)
(c)
(d)
(e) .

Hint: When finding the posterior mean in part (e), you may find it helpful to know that:

- If then
- If then
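A minimal numerical sketch of the conjugate update used throughout this question, assuming a generic $\mathrm{Beta}(a, b)$ prior (the hyperparameter values below are placeholders, not the priors actually listed in (a)-(e)):

    # Conjugate Beta-Binomial update: a sketch with a generic Beta(a, b) prior.
    a, b = 2.0, 3.0                       # placeholder prior hyperparameters
    x, n = 6, 15                          # 6 survivors observed out of 15 patients
    a_post, b_post = a + x, b + n - x     # posterior is Beta(a + x, b + n - x)
    print(a_post / (a_post + b_post))     # posterior mean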

2. Suppose that $X_1, \dots, X_n$ are exchangeable Normal random variables, with mean $\theta$ and known
variance $\sigma^2$. We are interested in inference about the mean $\theta$. Consider the prior distribution
$$\theta \sim N(\mu_0, \tau_0^2)$$
for known constants $\mu_0$ and $\tau_0^2$.

(a) Show that the posterior distribution is
$$\theta \mid x \sim N(\mu_n, \tau_n^2),$$
where
$$\frac{1}{\tau_n^2} = \frac{1}{\tau_0^2} + \frac{n}{\sigma^2}, \qquad \mu_n = \tau_n^2 \left( \frac{\mu_0}{\tau_0^2} + \frac{n \bar{x}}{\sigma^2} \right).$$
(b) Consider a 95% posterior credible interval for $\theta$, given by $\mu_n \pm 1.96\,\tau_n$. Show that this is
also a 95% Highest Density Region.
(c) Suppose we wish to make predictions about a future observation $X_{n+1}$. Show that the
predictive distribution is
$$X_{n+1} \mid x \sim N(\mu_n, \sigma^2 + \tau_n^2).$$
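A numerical sketch of this update, in the notation assumed above (the data and hyperparameter values are placeholders):

    # Normal-Normal conjugate update: precision-weighted combination of prior and data.
    import numpy as np

    mu0, tau0_sq = 0.0, 4.0                 # prior mean and variance (placeholders)
    sigma_sq = 1.0                          # known sampling variance
    x = np.array([1.2, 0.8, 1.5, 0.9])      # placeholder data
    n = len(x)

    tau_n_sq = 1 / (1/tau0_sq + n/sigma_sq)                # posterior variance
    mu_n = tau_n_sq * (mu0/tau0_sq + n*x.mean()/sigma_sq)  # posterior mean
    half = 1.96 * np.sqrt(tau_n_sq)
    print(mu_n, (mu_n - half, mu_n + half))                # part (b) credible interval
    print(sigma_sq + tau_n_sq)                             # predictive variance, part (c)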

3. Suppose there is a Beta(4,4) prior distribution on the probability that a coin will yield a
‘head’ when spun in a specified manner. The coin is spun ten times; you are not told how
many heads were seen, only that the number is less than 3.
(a) Find the posterior distribution up to proportionality, and show that the normalizing
constant is given by

Hint: consider carefully the distribution of the data; if $X$ is the number of heads
obtained, you need $P(E \mid \theta)$, where $E$ is the event that $X < 3$.

(b) Show that the posterior mean is 0.305.


(c) Given that , and assuming

find an approximate 95% credible interval for $\theta$.

Recall that $\Gamma(n) = (n-1)!$ if $n$ is an integer.
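A numerical check of part (b), exploiting the fact that the posterior here is a mixture of Beta densities (one possible route; direct numerical integration works equally well):

    # Numerical check: the posterior is proportional to
    # theta^3 (1-theta)^3 * sum_{x=0}^{2} C(10,x) theta^x (1-theta)^(10-x),
    # i.e. a mixture of Beta(4+x, 14-x) densities with weights C(10,x)*B(4+x, 14-x).
    from math import comb, gamma

    def beta_fn(p, q):                          # Beta function B(p, q)
        return gamma(p) * gamma(q) / gamma(p + q)

    weights = [comb(10, x) * beta_fn(4 + x, 14 - x) for x in range(3)]
    means = [(4 + x) / 18.0 for x in range(3)]  # mean of Beta(4+x, 14-x)
    post_mean = sum(w * m for w, m in zip(weights, means)) / sum(weights)
    print(round(post_mean, 3))                  # 0.305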


INSTITUT SAINS MATEMATIK
UNIVERSITI MALAYA
SQB7009

Tutorial 3

1. Let $X_1, \dots, X_n$ be exchangeable Poisson random variables, with mean $\theta$. Consider a
prior distribution $\theta \sim \mathrm{Gamma}(a, b)$.
(a) Find the posterior distribution $p(\theta \mid x)$.
(b) Show that the posterior mean can be written as a weighted average of the prior mean,
$a/b$, and the maximum likelihood estimator $\hat{\theta} = S/n$, where $S = \sum_{i=1}^{n} x_i$ is the
sum.
(c) Let $X_{n+1}$ be a future (unobserved) observation. Find the mean and variance of the predictive
distribution $p(x_{n+1} \mid x)$.
(d) The data in the table below are the number of fatal accidents on scheduled airline flights
between 1976 and 1985.

1976 1977 1978 1979 1980 1981 1982 1983 1984 1985
24 25 31 31 22 21 26 20 16 22

Suppose our prior beliefs about airline accidents can be expressed as


. Let $X_{n+1}$ be the number of fatal accidents in 1986.

Assuming

find an approximate 95% predictive interval for the number of accidents in 1986.
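A sketch of the conjugate machinery for part (d), assuming a hypothetical $\mathrm{Gamma}(a, b)$ prior (the hyperparameters below are placeholders, not the prior actually specified in the question):

    # Poisson-Gamma update and approximate predictive interval.
    import numpy as np

    counts = np.array([24, 25, 31, 31, 22, 21, 26, 20, 16, 22])
    a, b = 10.0, 0.5                              # placeholder hyperparameters
    a_n, b_n = a + counts.sum(), b + len(counts)  # posterior is Gamma(a + S, b + n)

    pred_mean = a_n / b_n                         # predictive (negative binomial) mean
    pred_var = pred_mean * (1 + 1 / b_n)          # predictive variance
    half = 1.96 * np.sqrt(pred_var)
    print(pred_mean - half, pred_mean + half)     # approximate 95% predictive interval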

2. Suppose $X_1, X_2, \dots$ are exchangeable. Using the version of de Finetti’s Representation
Theorem given in lectures, show that, for data $x = (x_1, \dots, x_n)$ and new data $y$,
$$p(y \mid x) = \int p(y \mid \theta)\, p(\theta \mid x)\, d\theta,$$
where
$$p(\theta \mid x) \propto p(\theta) \prod_{i=1}^{n} p(x_i \mid \theta).$$
Let $x$ represent data from a first study, from which we have a posterior distribution
$p(\theta \mid x)$. We now conduct a second study, and collect some new data $y$.
Explain how the expression above justifies the approach of using the posterior from the first
analysis as a prior distribution in the second analysis.
3. Suppose that $X_1, \dots, X_n \sim N(\mu, \sigma^2)$, where $\mu$ is known, and the aim is to estimate $\sigma^2$.
(a) Show that $S = \sum_{i=1}^{n} (x_i - \mu)^2$ is a sufficient statistic for $\sigma^2$.
(b) Consider an inverse-gamma prior distribution for $\sigma^2$:
$$p(\sigma^2) = \frac{b^a}{\Gamma(a)} (\sigma^2)^{-(a+1)} e^{-b/\sigma^2}, \qquad \sigma^2 > 0.$$
Show that this corresponds to a gamma distribution for $\phi$, where $\phi = 1/\sigma^2$, the precision
of the $X_i$.
(c) Find the posterior distribution of $\sigma^2$.
(d) Show that . Hence find the , given that
and with a prior distribution .
Hint: Recall that a $\mathrm{Gamma}(k/2, 1/2)$ distribution is the same as a $\chi^2_k$ distribution.
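A quick numerical sketch of the corresponding update on the precision scale, assuming a $\mathrm{Gamma}(a, b)$ prior for $\phi = 1/\sigma^2$ (all numbers are placeholders):

    # Conjugate update for the precision phi with known mean mu.
    import numpy as np

    mu = 0.0
    x = np.array([0.5, -1.2, 0.3, 2.1, -0.7])   # placeholder data
    a, b = 2.0, 1.0                             # placeholder prior hyperparameters
    S = np.sum((x - mu) ** 2)                   # sufficient statistic
    a_n, b_n = a + len(x) / 2, b + S / 2        # posterior is Gamma(a + n/2, b + S/2)
    print(a_n / b_n)                            # posterior mean of the precision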
INSTITUT SAINS MATEMATIK
UNIVERSITI MALAYA
SQB7009

Tutorial 4

1. Let $x_1, \dots, x_n$ be a sample of data. For each of the following distributions for $x_i$, find the
conjugate prior distribution and the corresponding posterior distribution.

(a) .
(b) .
(c) , with known.
(d) , with known.
(e) $x_1, \dots, x_n$ are from the Maxwell distribution, :

which has mean and variance .

2. For each of the distributions in Question 1 find Jeffreys’ prior distribution and the
corresponding posterior distribution.
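As a reminder of the general recipe for Question 2 (illustrated with a Poisson example, which is not necessarily one of the cases above): Jeffreys' prior is
$$p(\theta) \propto \sqrt{I(\theta)}, \qquad I(\theta) = -E\left[ \frac{\partial^2 \log p(x \mid \theta)}{\partial \theta^2} \right],$$
so for $X \mid \theta \sim \mathrm{Poisson}(\theta)$ we have $\partial^2 \log p / \partial \theta^2 = -x/\theta^2$, hence $I(\theta) = E[X]/\theta^2 = 1/\theta$ and $p(\theta) \propto \theta^{-1/2}$.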
INSTITUT SAINS MATEMATIK
UNIVERSITI MALAYA
SQB7009

Tutorial 5

1. Suppose $X_1, \dots, X_n$ are exponentially distributed: $X_i \mid \theta \sim \mathrm{Exp}(\theta)$.

(a) Find the conjugate prior for $\theta$, and the corresponding posterior distribution. Show that
the posterior mean for the failure rate can be written as a weighted average of the prior
mean and the maximum likelihood estimator, $\hat{\theta} = 1/\bar{x}$.
(b) Find Jeffreys’ prior, and express this as a member of the conjugate family. Hence
deduce the posterior distribution.
(c) Consider the transformation $\phi = \log\theta$. Show that Jeffreys’ prior for $\theta$ is equivalent to
a uniform prior for $\phi$. Find the Jeffreys’ prior for $\phi$ and explain how this demonstrates
the property of invariance to reparameterisation.

A water company is interested in the failure rate of water pipes. They ask two groups of
engineers about their prior beliefs about the failure rate. The first group believe the mean
failure rate is around with coefficient of variation 0.3, while the second group believe
the mean is with coefficient of variation 0.5. (The coefficient of variation is given by
the standard deviation divided by the mean.)
A sample of pipes is taken, with the following times to failure:

Let $X$ be the time until a water pipe fails, and assume the model
$X \mid \theta \sim \mathrm{Exp}(\theta)$.

(d) Approximate the two groups’ prior beliefs by appropriate members of the conjugate family.
Find the posterior mean and variance for each, and for the reference prior from part (b).
Approximating each posterior by a normal distribution, estimate also the probability that
the failure rate is less than 0.1. (A numerical sketch follows part (e).)
(e) How do you expect the differences to be reconciled as more data become available?
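A sketch of part (d) under the conjugate Gamma family, showing how a stated mean and coefficient of variation pin down the hyperparameters (since $\mathrm{CV} = 1/\sqrt{a}$ for a $\mathrm{Gamma}(a, b)$ prior); the prior mean and failure-time data below are placeholders:

    # Matching a Gamma(a, b) prior to a stated mean m and CV c: a = 1/c^2, b = a/m.
    import numpy as np
    from scipy import stats

    def gamma_from_mean_cv(m, c):
        a = 1 / c**2
        return a, a / m                        # shape a, rate b

    x = np.array([1.0, 2.5, 0.8, 4.0, 1.7])    # placeholder times to failure
    a, b = gamma_from_mean_cv(0.15, 0.3)       # placeholder prior mean 0.15, CV 0.3
    a_n, b_n = a + len(x), b + x.sum()         # conjugate Gamma posterior
    m_post, v_post = a_n / b_n, a_n / b_n**2   # posterior mean and variance
    print(stats.norm.cdf(0.1, loc=m_post, scale=np.sqrt(v_post)))  # approx P(theta < 0.1)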

2. Suppose that $X_1, \dots, X_n \sim \mathrm{Uniform}(0, \theta)$, where $\theta$ is an unknown parameter to be estimated.

(a) Show that $M = \max_i x_i$ is a sufficient statistic for $\theta$.
(b) Show that the conjugate prior distribution is a Pareto distribution, $\theta \sim \mathrm{Pareto}(a, b)$:
$$p(\theta) = \frac{a b^a}{\theta^{a+1}}, \qquad \theta \ge b.$$
(c) Consider the transformation $\phi = \log\theta$. Assuming a constant prior for $\phi$ on $(-\infty, \infty)$,
find the equivalent prior for $\theta$, and the corresponding posterior distribution.

3. Suppose $X \mid \mu \sim N(\mu, \phi)$ and $Y \mid \mu, \delta \sim N(\mu + \delta, \phi)$, where $\phi$ is known, and $X$ and $Y$ are
conditionally independent.
(a) Find the joint distribution of $X$ and $Y$.
(b) Consider the improper noninformative joint prior distribution:
$$p(\mu, \delta) \propto 1.$$
Find the joint posterior distribution. Are $\mu$ and $\delta$ independent?

(c) Find the marginal posterior distribution $p(\delta \mid x, y)$.
(d) Suppose a future observation is given by . Find the predictive
distribution .
INSTITUT SAINS MATEMATIK
UNIVERSITI MALAYA
SQB7009

Tutorial 6

1. Suppose 𝑋|𝜇 ~ 𝑁(𝜇, 𝜙) and 𝑌|𝜇, 𝛿 ~ 𝑁(𝜇 + 𝛿, 𝜙) from Question 3 on Tutorial 5, where 𝜙 is
known, and consider the improper noninformative joint prior distribution, 𝑝(𝜇, 𝛿) ∝ 1.
(a) Describe how the Gibbs sampler may be used to sample from the posterior distribution,
deriving all required conditional distributions.
(b) Suppose we have samples from the Gibbs sampler $\{\mu^{(t)}, \delta^{(t)}\}$, $t = 0, \dots, N$, where $N$ is
large. Explain how these samples may be used to estimate the marginal mean, $E[\delta \mid x, y]$.
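A Gibbs sampler sketch for this model; the conditional distributions used below are the ones part (a) asks you to derive, and the data values are placeholders:

    # Gibbs sampling for (mu, delta) under the flat prior p(mu, delta) prop. to 1.
    import numpy as np

    rng = np.random.default_rng(0)
    x, y, phi = 1.0, 2.5, 1.0            # placeholder data and known variance
    mu, delta = 0.0, 0.0                 # arbitrary starting values
    deltas = []
    for t in range(10000):
        mu = rng.normal((x + y - delta) / 2, np.sqrt(phi / 2))  # mu | delta, x, y
        delta = rng.normal(y - mu, np.sqrt(phi))                # delta | mu, x, y
        deltas.append(delta)
    print(np.mean(deltas[1000:]))        # part (b): ergodic average estimates E[delta | x, y]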

2. The independence sampler is the Metropolis-Hastings algorithm with proposal distribution
$$q(\theta^* \mid \theta_{t-1}) = q(\theta^*).$$

(a) Describe the Metropolis-Hastings algorithm for this transition probability.


(b) Show that if 𝑞(𝜃) is proportional to the required posterior distribution 𝑝(𝜃|𝐲) then the
Metropolis algorithm reduces to simple Monte Carlo sampling.
(c) Suppose we take 𝑞(𝜃) = 𝑝(𝜃), the prior distribution. Show that the acceptance
probability depends only on the ratio of likelihoods. Under what circumstances will using
the prior distribution as the proposal distribution be a good choice?
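A minimal independence-sampler sketch for part (a); the target and proposal below are placeholder choices for illustration only:

    # Independence sampler: proposals ignore the current state.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def log_post(theta):                       # placeholder unnormalised log posterior
        return stats.norm.logpdf(theta, loc=2.0, scale=0.5)

    q = stats.norm(loc=0.0, scale=2.0)         # fixed proposal q(theta)
    theta, draws = 0.0, []
    for t in range(5000):
        prop = q.rvs(random_state=rng)
        # accept with probability min(1, [p(prop|y) q(theta)] / [p(theta|y) q(prop)])
        log_alpha = log_post(prop) + q.logpdf(theta) - log_post(theta) - q.logpdf(prop)
        if np.log(rng.uniform()) < log_alpha:
            theta = prop
        draws.append(theta)
    print(np.mean(draws))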

3. Suppose 𝑌1 , … , 𝑌𝑛 are normally distributed with 𝑌𝑖 |𝜇, 𝜆 ~ 𝑁(𝜇, 1/𝜆), −∞ < 𝜇 < ∞, 𝜆 > 0 ,
where both 𝜇 and 𝜆 are unknown. Suppose we assume 𝜇 and 𝜆 are independent with priors
𝜇~ 𝑁(𝜇0 , 1/𝜏) and 𝜆 ~ Gamma(𝛼, 𝛽) where 𝜇0 , 𝜏, 𝛼, 𝛽 are known.

(a) Show that the joint posterior density of 𝜇 and 𝜆 can be expressed as
$$p(\mu, \lambda \mid \mathbf{y}) \propto \lambda^{\alpha + \frac{n}{2} - 1} \exp\left\{ -\frac{\lambda}{2} \sum_{i=1}^{n} (y_i - \mu)^2 - \frac{\tau}{2}\mu^2 + \tau\mu_0\mu - \beta\lambda \right\}.$$

(b) Describe how the Gibbs sampler may be used to sample from the joint posterior
distribution, by deriving all required conditional distributions. Give a sensible estimate
of 𝑉𝑎𝑟(𝜆|𝐲).
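A Gibbs sketch for part (b); the full conditionals below follow from the joint density in part (a) (deriving them is the exercise), and the data and hyperparameters are placeholders:

    # Gibbs sampling for (mu, lambda) in the semi-conjugate normal model.
    import numpy as np

    rng = np.random.default_rng(2)
    y = rng.normal(5.0, 2.0, size=50)               # placeholder data
    mu0, tau, alpha, beta = 0.0, 0.01, 2.0, 2.0     # placeholder hyperparameters
    n, ybar = len(y), y.mean()

    mu, lam = ybar, 1.0                             # starting values
    lam_draws = []
    for t in range(10000):
        prec = tau + n * lam                        # mu | lam, y: N((tau*mu0 + n*lam*ybar)/prec, 1/prec)
        mu = rng.normal((tau * mu0 + n * lam * ybar) / prec, np.sqrt(1 / prec))
        lam = rng.gamma(alpha + n / 2,              # lam | mu, y: Gamma(alpha + n/2, beta + SS/2)
                        1 / (beta + 0.5 * np.sum((y - mu) ** 2)))  # numpy uses scale = 1/rate
        lam_draws.append(lam)
    print(np.var(lam_draws[1000:]))                 # sample variance estimates Var(lambda | y)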

4. Show that the Gibbs sampler for sampling from a distribution 𝜋(𝜃) where 𝜃 = (𝜃1 , … , 𝜃𝑑 )
can be viewed as a special case of the Metropolis-Hastings algorithm where each iteration 𝑡
consists of 𝑑 Metropolis-Hastings steps each with an acceptance probability of 1.
INSTITUT SAINS MATEMATIK
UNIVERSITI MALAYA
SQB7009

Tutorial 7

1. Consider the following loss function:

(a) Find an expression for the Bayes estimator of $\theta$.

(b) What is the Bayes estimator of $\theta$ if ?
(c) What is the Bayes estimator of $\theta$ if , where ?
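For reference (standard results, stated without assuming which loss functions appear in Questions 1-3): under squared-error loss $L(\theta, a) = (\theta - a)^2$ the Bayes estimator is the posterior mean $E[\theta \mid x]$; under absolute-error loss $L(\theta, a) = |\theta - a|$ it is the posterior median; and under 0-1 loss it is the posterior mode.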

2. Consider the following loss function:

(a) Find a general expression for the Bayes estimator of $\theta$.

(b) Suppose , with prior distribution . Find the
Bayes estimator for $\theta$ in this case.

3. Consider the loss function

(a) Show that a Bayes estimator of $\theta$ under this loss is given by

(b) Suppose $X_1, \dots, X_n$ are conditionally independent , with improper prior
. Find the Bayes estimator of $\theta$.

4. A telesales company wants to increase the number of days on which target sales are
achieved; currently targets are met or exceeded on of days. All employees are given
a day-long motivational course, and their sales are monitored for the next five days; targets are
met or exceeded on three of these days.
Let $X$ be the number of days on which targets are met and assume $X \mid \theta \sim \mathrm{Binomial}(5, \theta)$, with
prior for $\theta$ given by .
Consider the hypotheses

where .
(a) Find the prior odds and the posterior odds, and calculate the Bayes factor.
(b) Suppose we have the following loss function for this hypothesis test:

                     truth
              new is same/worse   new is better
    use old          0                 1
    use new          k                 0

Find a decision rule, in terms of the Bayes factor, to reject the null hypothesis when
.
(c) Informally, we might say that there is substantial evidence against $H_0$ if the Bayes factor
is less than . Find a loss function which corresponds to this informal rule.
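For reference, the identity behind part (a) is
$$\underbrace{\frac{P(H_0 \mid x)}{P(H_1 \mid x)}}_{\text{posterior odds}} \;=\; \underbrace{\frac{p(x \mid H_0)}{p(x \mid H_1)}}_{\text{Bayes factor}} \;\times\; \underbrace{\frac{P(H_0)}{P(H_1)}}_{\text{prior odds}},$$
and for part (b), reading from the loss table above, the expected posterior loss of "use old" is $P(\text{new is better} \mid x)$ while that of "use new" is $k\,P(\text{new is same/worse} \mid x)$; choosing "use new" is optimal when the former exceeds the latter.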
