MATH70078 Fundamentals of Statistical Inference, 2024-25
Coursework.
Deadline for submission 12 noon (midday) Friday 8 November, via Blackboard.
(1) Let Y_1, . . . , Y_n be independent, identically distributed, with common density f(y; θ) = θe^{−θy}, y > 0.
Suppose θ is given the (non-informative, improper) prior of the form π(θ) ∝ 1/θ, θ > 0.
Find the form of the Bayes estimator of µ = E(Y_i) ≡ 1/θ with respect to this prior and the loss function

L(µ, a) = (µ − a)²/µ². [4]
What is the maximum likelihood estimator µ̂ of µ? [1]
Calculate the risk of the Bayes estimator and compare it with that of µ̂. Is µ̂ admissible? [5]
[A random variable X with the inverse Gamma distribution has density of the form f(x; α, β) = (β^α/Γ(α)) (1/x)^{α+1} e^{−β/x}, x > 0.]
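
[Not part of the assessed questions: a minimal Monte Carlo sketch in Python, assuming numpy, that can be used to estimate the risk E{L(µ, a(Y))} of any candidate estimator in question (1) numerically; the names mc_risk and estimator are illustrative only.]

    import numpy as np

    rng = np.random.default_rng(0)

    def mc_risk(estimator, theta, n=10, reps=200_000):
        # Monte Carlo estimate of E[L(mu, a(Y))] at a fixed theta,
        # with mu = 1/theta and L(mu, a) = (mu - a)^2 / mu^2.
        mu = 1.0 / theta
        y = rng.exponential(scale=mu, size=(reps, n))  # Y_i i.i.d. with density theta * exp(-theta * y)
        a = estimator(y)                               # one estimate per replication
        return np.mean((mu - a) ** 2 / mu ** 2)

    # Example: Monte Carlo risk of the sample mean at theta = 2.
    print(mc_risk(lambda y: y.mean(axis=1), theta=2.0))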
(2) Let Y_1, . . . , Y_p (p > 2) be independent random variables such that Y_i ∼ N(θ_i, 1). Write Y = (Y_1, . . . , Y_p)^T and θ = (θ_1, . . . , θ_p)^T. Let θ̃ ≡ θ̃(Y) = (θ̃_1(Y), . . . , θ̃_p(Y))^T be an estimator of θ, and let g(Y) ≡ (g_1(Y), . . . , g_p(Y))^T = θ̃ − Y.
Let D_i(Y) = ∂g_i(Y)/∂Y_i. Show that

R̂(Y) = p + ∑_{i=1}^p {g_i(Y)² + 2D_i(Y)}

is an unbiased estimator of the risk of θ̃ under squared error loss. [4]
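
[Not part of the assessed questions: a simulation sketch in Python, assuming numpy, illustrating the unbiasedness of R̂(Y) for the linear estimator θ̃ = bY, for which g(Y) = (b − 1)Y and D_i(Y) = b − 1; all names are illustrative only.]

    import numpy as np

    rng = np.random.default_rng(0)
    p, b = 10, 0.7
    theta = rng.normal(size=p)                 # an arbitrary fixed theta
    y = theta + rng.normal(size=(200_000, p))  # each row is a draw of Y ~ N(theta, I_p)

    g = (b - 1.0) * y                          # g(Y) = bY - Y, so D_i(Y) = b - 1
    r_hat = p + np.sum(g ** 2 + 2 * (b - 1.0), axis=1)
    risk = np.mean(np.sum((b * y - theta) ** 2, axis=1))

    print(np.mean(r_hat), risk)                # the two agree up to Monte Carlo error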
Suppose the estimator θ̃ is of the form θ̃ = bY, for b ∈ R. Find the value b* of b that minimises the unbiased risk estimator R̂(Y). Compare the estimator b*Y with the James-Stein estimator. [3]
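[Recall that the James-Stein estimator for this model is θ̃_JS = (1 − (p − 2)/‖Y‖²) Y.]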
The soft threshold estimator is defined by

θ̃_i(Y) = sign(Y_i) max(|Y_i| − λ, 0), i = 1, . . . , p,

where λ > 0 is a constant, to be specified. Show that for this estimator

R̂(Y) = p + ∑_{i=1}^p {min(Y_i², λ²) − 2I(|Y_i| ≤ λ)}. [2]
[Here, I(A) = 1 if A holds, = 0 otherwise.]
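
[Not part of the assessed questions: a direct Python transcription, assuming numpy, of the soft-threshold estimator and its unbiased risk estimate as functions of λ; the names are illustrative only.]

    import numpy as np

    def soft_threshold(y, lam):
        # Componentwise soft thresholding: sign(Y_i) * max(|Y_i| - lambda, 0).
        return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

    def r_hat(y, lam):
        # Unbiased risk estimate for the soft-threshold estimator at this lambda.
        return y.size + np.sum(np.minimum(y ** 2, lam ** 2) - 2.0 * (np.abs(y) <= lam))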
It has been suggested that a suitable choice of the value of λ is that which minimises R̂(Y). Explain why determining this value only requires examining the value of R̂(Y) at a finite number of values of λ. [3]
Find the form of the estimator θ̃^P that minimises the penalized sum of squares

∑_{i=1}^p (Y_i − θ_i)² + λ²J(θ),

with J(θ) = |{θ_i : θ_i ≠ 0}|, the number of non-zero elements of θ.
Calculate the risk of θ̃^P for the case p = 1. When is this estimator preferable to the unbiased estimator θ̃ ≡ Y? Quantify your answer for λ = 1, 2, 5. [8]
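
[Not part of the assessed questions: for p = 1, a Monte Carlo sketch in Python, assuming numpy, with which any candidate estimator can be compared numerically with the unbiased estimator Y at chosen values of θ and λ; the names are illustrative only.]

    import numpy as np

    rng = np.random.default_rng(0)

    def mc_risk(estimator, theta, reps=500_000):
        # Monte Carlo estimate of E[(estimator(Y) - theta)^2] with Y ~ N(theta, 1), p = 1.
        y = theta + rng.normal(size=reps)
        return np.mean((estimator(y) - theta) ** 2)

    # The unbiased estimator Y has risk 1 for every theta:
    print(mc_risk(lambda y: y, theta=0.0))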
[Total 30]