553.420/620 Probability
Assignment #06
1. Let X ∼ Gamma(α, β). In class we showed that E(X) = αβ. By using the same approach, derive E(X²), and then use it to compute Var(X).
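If you want a numerical target to check your algebra against, here is a minimal Monte Carlo sketch (assuming NumPy, the shape/scale parametrization so that E(X) = αβ, and illustrative values of α and β):

    # Monte Carlo sanity check for Gamma(alpha, beta) moments (shape/scale parametrization).
    import numpy as np

    rng = np.random.default_rng(0)
    alpha, beta = 3.0, 2.0                      # illustrative values, not part of the problem
    x = rng.gamma(shape=alpha, scale=beta, size=1_000_000)

    print("simulated E(X)  :", x.mean(), "  vs alpha*beta =", alpha * beta)
    print("simulated E(X^2):", (x**2).mean())   # compare with your derived formula
    print("simulated Var(X):", x.var())         # compare with your derived formula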
2. Use the normalization trick to compute these integrals:
(a) ∫_0^∞ x^{−1/2} e^{−x/2} dx
(b) ∫_0^∞ x^n e^{−nx} dx, where n > 0 is an integer.
Challenge: Label the integral in part (b) by K(n). Does the limit lim_{n→∞} √n e^n K(n) exist? If so, find it.
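For a numerical cross-check of these integrals, here is a sketch assuming SciPy (the split at x = 1 in (a) just isolates the integrable singularity at 0, and for the challenge the factor √n e^n is folded into the integrand so the numbers stay well scaled):

    # Numerically evaluate the integrals in (a) and (b), and peek at sqrt(n) e^n K(n).
    import numpy as np
    from scipy.integrate import quad

    # (a): split at x = 1 because of the integrable singularity at 0
    fa = lambda x: x**(-0.5) * np.exp(-x / 2)
    print("(a):", quad(fa, 0, 1)[0] + quad(fa, 1, np.inf)[0])

    # (b): K(n) for one sample value of n
    n = 3
    print("(b), n = 3:", quad(lambda x: x**n * np.exp(-n * x), 0, np.inf)[0])

    # Challenge: sqrt(n) e^n x^n e^(-n x) = sqrt(n) exp(n*(1 + log(x) - x))
    for n in (5, 20, 80):
        g = lambda x, n=n: np.sqrt(n) * np.exp(n * (1 + np.log(x) - x))
        print("n =", n, " sqrt(n) e^n K(n) ≈", quad(g, 0, np.inf)[0])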
3. The following is a PDF:
f(x) = Γ(α + β)/(Γ(α)Γ(β)) · x^{α−1} (1 − x)^{β−1} for 0 < x < 1.
A continuous random variable X having this PDF is said to have the Beta distribution with parameters α > 0 and β > 0, and we write X ∼ Beta(α, β). Use the fact that the above is a PDF to compute each of the following:
(a) ∫_0^1 x^{−1/2} (1 − x)^{−1/2} dx
(b) E(X)
(c) Var(X).
Hint for part (a): use the normalization trick. This will also work for parts (b) and (c) after you write out the integrals for E(X) and E(X²).
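For numerical targets to check your answers against, here is a sketch assuming SciPy and NumPy (the α = 2, β = 5 used for the moments is just one illustrative choice):

    # Numerical targets: the integral in (a), and simulated Beta moments for (b), (c).
    import numpy as np
    from scipy.integrate import quad

    # (a): split at 1/2 because of the integrable singularities at both endpoints
    fa = lambda x: x**(-0.5) * (1 - x)**(-0.5)
    print("(a):", quad(fa, 0, 0.5)[0] + quad(fa, 0.5, 1)[0])

    # (b), (c): simulate Beta(alpha, beta) for one illustrative choice of parameters
    rng = np.random.default_rng(0)
    alpha, beta = 2.0, 5.0
    x = rng.beta(alpha, beta, size=1_000_000)
    print("simulated E(X)  :", x.mean())   # compare with your formula from (b)
    print("simulated Var(X):", x.var())    # compare with your formula from (c)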
4. A continuous random variable X has the PDF f(x) = x/8 for 0 < x < 4. Compute P(1/√3 ≤ √X < √3).
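A quick simulation check of this probability, as a sketch assuming NumPy (rejection sampling is used so that no step of the analytic solution is assumed):

    # Monte Carlo check of P(1/sqrt(3) <= sqrt(X) < sqrt(3)) for f(x) = x/8 on (0, 4).
    import numpy as np

    rng = np.random.default_rng(0)
    xs = rng.uniform(0, 4, size=4_000_000)
    vs = rng.uniform(0, 0.5, size=4_000_000)    # 0.5 bounds f(x) = x/8 on (0, 4)
    x = xs[vs < xs / 8]                         # accepted draws follow the PDF f

    r = np.sqrt(x)
    print("estimate:", np.mean((r >= 1 / np.sqrt(3)) & (r < np.sqrt(3))))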
5. The (random) rate R a particle moves (in meters per second) has PDF f(r) = 2r e^{−r²} for r > 0.
(a) Compute the probability the particle is slower than 1/2 meter per second.
(b) Compute the probability the particle travels a distance greater than 4 meters in 2 seconds. Hint: distance = rate × time.
(c) Compute E(R), the expected rate. Hint: E(R) = ∫_0^∞ r · 2r e^{−r²} dr; try using integration by parts.
Remark. The PDF in this problem is a special case of the Rayleigh distribution.
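Simulation targets for all three parts, as a sketch assuming NumPy (it leans on the Remark: the given density is the Rayleigh density with scale parameter 1/√2, which NumPy can sample directly):

    # Simulation targets for problem 5; 2r exp(-r^2) is Rayleigh with scale 1/sqrt(2).
    import numpy as np

    rng = np.random.default_rng(0)
    r = rng.rayleigh(scale=1 / np.sqrt(2), size=2_000_000)

    print("(a) P(R < 1/2)            ≈", np.mean(r < 0.5))
    print("(b) P(distance in 2s > 4) ≈", np.mean(2 * r > 4))
    print("(c) E(R)                  ≈", r.mean())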
6. When it exists, the MGF of a random variable X is M_X(θ) = E(e^{θX}) for θ in an open interval that contains θ = 0. Now consider the transformed random variable Y = aX + b. Carefully show that M_Y(θ) = e^{bθ} M_X(aθ).
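A numerical sanity check of the identity, as a sketch assuming NumPy (X is taken to be standard normal purely for illustration, so its known MGF e^{θ²/2} can play the role of M_X):

    # Empirical check that M_Y(theta) = e^{b theta} M_X(a theta) for Y = aX + b,
    # illustrated with X ~ Normal(0, 1), whose MGF is M_X(t) = exp(t^2 / 2).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(2_000_000)
    a, b, theta = -1.5, 2.0, 0.3                # arbitrary illustrative values

    lhs = np.mean(np.exp(theta * (a * x + b)))              # empirical M_Y(theta)
    rhs = np.exp(b * theta) * np.exp((a * theta) ** 2 / 2)  # e^{b theta} M_X(a theta)
    print(lhs, "vs", rhs)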
7. Suppose X is a random variable with MGF M_X(θ), and let µ = E(X) denote its mean and σ = √Var(X) denote its standard deviation. One consequence of problem 6 is that the MGF of the z-score, namely the random variable Z := (X − µ)/σ, is e^{−µθ/σ} M_X(θ/σ).
(a) If X ∼ Poisson(λ), then write down the MGF of its z-score Z = (X − λ)/√λ. You may need to recall the MGF you computed on the previous homework; otherwise, feel free to use the distribution sheet to recall it.
(b) From your answer to part (a), what happens to this MGF as λ tends to ∞? Answer this question by formally computing the limit. It may help to recognize that in the exponent of e there should appear an e^{θ/√λ} − 1; replacing this with its Maclaurin expansion should simplify the problem considerably.
Remark. The resulting MGF is not an accident: it happens to be the MGF of a normal distribution with mean 0 and variance (and standard deviation) 1. This is one manifestation of the central limit theorem.
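Since the Remark already identifies the limiting MGF as that of a standard normal, a quick empirical illustration (a sketch assuming NumPy, with θ fixed at an arbitrary value) is to compare the sample average of e^{θZ} against e^{θ²/2} as λ grows:

    # Empirical MGF of the Poisson z-score Z = (X - lambda)/sqrt(lambda) at fixed theta,
    # compared with the standard normal MGF exp(theta^2 / 2) as lambda grows.
    import numpy as np

    rng = np.random.default_rng(0)
    theta = 0.7
    for lam in (1, 10, 100, 1000):
        x = rng.poisson(lam, size=2_000_000)
        z = (x - lam) / np.sqrt(lam)
        print("lambda =", lam,
              " E[exp(theta*Z)] ≈", np.mean(np.exp(theta * z)),
              " exp(theta^2/2) =", np.exp(theta**2 / 2))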
8. Let X ∼ exp(λ) for some rate λ > 0. Recall this means the PDF of X is f(x) = λe^{−λx} for x > 0. Show that the exponential distribution has this interesting property: for any s, t > 0, P(X > s + t | X > s) = P(X > t). Interpreting X as the lifetime of a component, this property says, in words: given that a component has survived to time s, the (conditional) probability that it survives for another t time units is the same as the probability that a component survives to time t; the component forgets its age! This is called the memoryless property.
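A simulation sketch of the memoryless property, assuming NumPy (the rate λ and the times s, t are illustrative; note NumPy parametrizes the exponential by its scale 1/λ):

    # Empirical check of P(X > s + t | X > s) = P(X > t) for X ~ exp(rate lambda).
    import numpy as np

    rng = np.random.default_rng(0)
    lam, s, t = 0.8, 1.0, 2.0
    x = rng.exponential(scale=1 / lam, size=4_000_000)

    cond = np.mean(x > s + t) / np.mean(x > s)   # P(X > s+t | X > s)
    print("conditional:", cond, "  unconditional P(X > t):", np.mean(x > t))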
9. (The tail probability formula to compute expected values of nonnegative random variables)
Show that if X is a nonnegative continuous rv with PDF f(x), then E(X) = ∫_0^∞ P(X > u) du.
Hint: Since P(X > u) = ∫_u^∞ f(x) dx, it would follow that ∫_0^∞ P(X > u) du = ∫_0^∞ ∫_u^∞ f(x) dx du; now switch the order of integration and pay attention to the region you are integrating over.
Remark. There is a discrete random variable version of this problem when X is nonnegative, but it requires that the random variable also be integer-valued. In this case
E(X) = ∑_{k=1}^∞ P(X ≥ k).
Here's a proof:
E(X) = ∑_{x=1}^∞ x P(X = x) = ∑_{x=1}^∞ ∑_{k=1}^x P(X = x) = ∑_{k=1}^∞ ∑_{x=k}^∞ P(X = x) = ∑_{k=1}^∞ P(X ≥ k),
where the third equality switches the order of summation.
Using this tail probability formula, here's another way to compute the mean of a geometric(p):
Since the tail probabilities are P(X ≥ k) = ∑_{x=k}^∞ p(1 − p)^{x−1} = (1 − p)^{k−1}, we'd have E(X) = ∑_{k=1}^∞ P(X ≥ k) = ∑_{k=1}^∞ (1 − p)^{k−1} = 1/p.
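Both versions can be illustrated numerically; here is a sketch assuming NumPy and SciPy (the Gamma sample for the continuous version and the p = 0.3 for the geometric are illustrative choices):

    # Numerical illustration of the tail-probability formula.
    import numpy as np
    from scipy.integrate import trapezoid

    rng = np.random.default_rng(0)

    # Continuous version: integrate the empirical P(X > u) of a nonnegative sample
    # and compare with the sample mean.
    x = np.sort(rng.gamma(shape=3.0, scale=2.0, size=200_000))
    u = np.linspace(0, x[-1], 5000)
    survival = 1 - np.searchsorted(x, u, side="right") / x.size   # empirical P(X > u)
    print("integral of P(X > u) ≈", trapezoid(survival, u), "  sample mean:", x.mean())

    # Discrete version: sum the geometric(p) tail probabilities (1 - p)^(k - 1).
    p = 0.3
    k = np.arange(1, 200)
    print("sum of tail probabilities:", np.sum((1 - p) ** (k - 1)), "  1/p =", 1 / p)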
10. Suppose X ∼ Normal(µ, σ²). In class we showed using the CDF method that Y = aX + b ∼ Normal(aµ + b, a²σ²) when a > 0. Your job: Show this is also true if a < 0. Hint: Following the CDF method as we did, the support of Y is all reals, so, for real y,
F_Y(y) = P(Y ≤ y) = P(aX + b ≤ y) = P(aX ≤ y − b) = P(X ≥ (y − b)/a) = 1 − ∫_{−∞}^{(y−b)/a} (1/√(2πσ²)) e^{−(x−µ)²/(2σ²)} dx.
Now take derivatives to recover the pdf f_Y(y) of Y.
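A numerical sanity check of the claim for a negative a, as a sketch assuming NumPy and SciPy (the particular µ, σ, a, b are illustrative):

    # Check that Y = aX + b with a < 0 looks like Normal(a*mu + b, a^2 * sigma^2).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    mu, sigma, a, b = 1.0, 2.0, -3.0, 5.0
    y = a * rng.normal(mu, sigma, size=2_000_000) + b

    print("mean:", y.mean(), "  vs a*mu + b  =", a * mu + b)
    print("std :", y.std(), "  vs |a|*sigma =", abs(a) * sigma)

    q = [0.1, 0.5, 0.9]
    print("empirical quantiles:", np.quantile(y, q))
    print("normal quantiles   :", norm.ppf(q, loc=a * mu + b, scale=abs(a) * sigma))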