Math 180B, Winter 2015 Homework 1 Solutions

1. A fair coin is tossed 20 times. Let X be the number of heads thrown in the first 10 tosses, and let Y be the number of heads tossed in the last 10 tosses. Find the conditional probability that X = 5, given that X + Y = 12.

Solution. Evidently, both X and Y have the binomial distribution with parameters n = 10 and p = 1/2, and they are independent. Consequently, X + Y has the binomial distribution with parameters n = 20 and p = 1/2. Therefore

\[
P[X = 5 \mid X + Y = 12]
= \frac{P[X = 5,\ X + Y = 12]}{P[X + Y = 12]}
= \frac{P[X = 5,\ Y = 7]}{P[X + Y = 12]}
= \frac{P[X = 5] \cdot P[Y = 7]}{P[X + Y = 12]}
= \frac{\binom{10}{5}(1/2)^{10} \cdot \binom{10}{7}(1/2)^{10}}{\binom{20}{12}(1/2)^{20}}
= \frac{\binom{10}{5}\binom{10}{7}}{\binom{20}{12}},
\]
which you may recognize as a hypergeometric distribution.

2. Let X1 and X2 be independent Poisson random variables with parameters λ1 and λ2. Show that for every n ≥ 1, the conditional distribution of X1, given X1 + X2 = n, is binomial, and find the parameters of this binomial distribution.

Solution. Just compute, using the fact that X1 + X2 also has the Poisson distribution, with parameter λ1 + λ2: for k = 0, 1, 2, . . . , n,

\[
P[X_1 = k \mid X_1 + X_2 = n]
= \frac{P[X_1 = k,\ X_2 = n-k]}{P[X_1 + X_2 = n]}
= \frac{e^{-\lambda_1}\lambda_1^{k}/k! \cdot e^{-\lambda_2}\lambda_2^{n-k}/(n-k)!}{e^{-\lambda_1-\lambda_2}(\lambda_1+\lambda_2)^n/n!}
= \frac{n!}{k!(n-k)!} \cdot \frac{\lambda_1^{k}\lambda_2^{n-k}}{(\lambda_1+\lambda_2)^n}
= \binom{n}{k}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k}\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k}.
\]
This shows that the conditional distribution of X1, given that X1 + X2 = n, is binomial with parameters n and p = λ1/(λ1 + λ2).

3. An item is selected randomly from a collection labeled 1, 2, . . . , n. Denote its label by X. Now select an integer Y uniformly at random from {1, . . . , X}. Find: (a) E(Y); (b) E(Y²); (c) Var(Y); (d) P(X + Y = 2).

Solution. (a) Observe that for x = 1, 2, . . . , n,

\[
E[Y \mid X = x] = \sum_{y=1}^{x} y \cdot \frac{1}{x} = \frac{1 + 2 + \cdots + x}{x} = \frac{x(x+1)/2}{x} = \frac{x+1}{2},
\]

and likewise E[X] = (n + 1)/2. Therefore, by the Law of the Forgetful Statistician,

\[
E[Y] = E\big[E[Y \mid X]\big] = E[(X+1)/2] = \frac{1}{2}\big(E[X] + 1\big) = \frac{(n+1)/2 + 1}{2} = \frac{n+3}{4}.
\]

(b) In similar fashion,

\[
E[Y^2 \mid X = x] = \sum_{y=1}^{x} y^2 \cdot \frac{1}{x} = \frac{x(x+1)(2x+1)}{6} \cdot \frac{1}{x} = \frac{2x^2 + 3x + 1}{6},
\]

and likewise E[X²] = (2n² + 3n + 1)/6. Therefore

\[
E[Y^2] = E[(2X^2 + 3X + 1)/6] = \frac{1}{6}\Big(2E[X^2] + 3E[X] + 1\Big) = \frac{1}{6}\left(\frac{2n^2 + 3n + 1}{3} + 3\,\frac{n+1}{2} + 1\right) = \frac{4n^2 + 15n + 17}{36}.
\]

(c) Using (a) and (b),

\[
\operatorname{Var}(Y) = E[Y^2] - (E[Y])^2 = \frac{4n^2 + 15n + 17}{36} - \frac{n^2 + 6n + 9}{16} = \frac{16n^2 + 60n + 68 - 9n^2 - 54n - 81}{144} = \frac{7n^2 + 6n - 13}{144}.
\]

(d) Because X + Y = 2 if and only if both X = 1 and Y = 1, we have

\[
P[X + Y = 2] = P[X = 1,\ Y = 1] = P[Y = 1 \mid X = 1] \cdot P[X = 1] = 1 \cdot \frac{1}{n} = \frac{1}{n}.
\]
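The closed forms in (a)–(d) can be sanity-checked by exact enumeration of the joint distribution for a particular n (a sketch in Python; the choice n = 6 and the helper name `check_problem3` are ours):

```python
from fractions import Fraction

def check_problem3(n):
    # Enumerate the joint distribution: X uniform on {1..n}, then Y uniform on {1..X}.
    EY = EY2 = p_sum2 = Fraction(0)
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            p = Fraction(1, n) * Fraction(1, x)   # P[X = x, Y = y]
            EY += p * y
            EY2 += p * y * y
            if x + y == 2:
                p_sum2 += p
    var_Y = EY2 - EY ** 2
    # Compare against the closed forms derived above.
    assert EY == Fraction(n + 3, 4)
    assert EY2 == Fraction(4 * n * n + 15 * n + 17, 36)
    assert var_Y == Fraction(7 * n * n + 6 * n - 13, 144)
    assert p_sum2 == Fraction(1, n)
    return EY, var_Y

print(check_problem3(6))  # (Fraction(9, 4), Fraction(275, 144))
```

Because the arithmetic is done with exact rationals, the assertions confirm the algebra rather than merely approximating it.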
4. Let X, Y, and Z be independent random variables, each with the standard normal distribution. Compute the following: (a) P[X + Y > Z + 2]; (b) Var[3X + 4Y]; (c) P[3X + 4Y < 5].

Solution. (a) Because the random variable Z − X − Y has the normal distribution with mean 0 and variance 3, the r.v. W = (Z − X − Y)/√3 has the standard normal distribution. Therefore, using the normal table for the final equality,

\[
P[X + Y > Z + 2] = P[Z - X - Y < -2] = P[W < -2/\sqrt{3}] \approx P[W < -1.15] = \Phi(-1.15) = 0.1251.
\]
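The table lookup can be reproduced in code via the error function, since Φ(x) = (1 + erf(x/√2))/2 (a sketch; note the exact value Φ(−2/√3) differs slightly from the tabled Φ(−1.15) because the argument was rounded):

```python
from math import erf, sqrt

def Phi(x):
    # Standard normal CDF expressed through the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Z - X - Y ~ N(0, 3), so P[X + Y > Z + 2] = Phi(-2/sqrt(3)).
print(round(Phi(-2 / sqrt(3)), 4))  # ≈ 0.1241; the table value 0.1251 uses the rounded argument -1.15
```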
(b) Var[3X + 4Y] = Var[3X] + Var[4Y] = 9 Var[X] + 16 Var[Y] = 9 · 1 + 16 · 1 = 25.

(c) In view of part (b), the r.v. U = (3X + 4Y)/5 has the standard normal distribution. Therefore P[3X + 4Y < 5] = P[U < 1] = Φ(1) = .8413.

Problem 2.1.2. The joint mass function of X and Y is given by

\[
P[X = x,\ Y = y] = P[Y = y \mid X = x] \cdot P[X = x] = \frac{1}{x} \cdot \frac{1}{N}, \qquad y \in \{1, \ldots, x\},\ x \in \{1, 2, \ldots, N\}.
\]

Consequently, the marginal of Y satisfies

\[
P[Y = y] = \frac{H(N) - H(y-1)}{N}, \qquad y = 1, 2, \ldots, N,
\]

where H(y) denotes the harmonic number 1 + 1/2 + 1/3 + · · · + 1/y. The conditional distribution of X is now seen to be

\[
P[X = x \mid Y = y] = \frac{1}{x\,\big(H(N) - H(y-1)\big)}, \qquad x = y, y+1, \ldots, N;\ y = 1, 2, \ldots, N.
\]

Problem 2.1.5.
Let N be the number of times the nickel comes up heads, so that N has the binomial distribution with parameters 20 and 1/2. The conditional distribution of X, given that N = n, is binomial with parameters n and 1/2. We now compute:

\[
P[X = 0] = \sum_{n=0}^{20} P[X = 0 \mid N = n] \cdot P[N = n]
= \sum_{n=0}^{20} \binom{n}{0}(1/2)^{0}(1/2)^{n} \cdot \binom{20}{n}(1/2)^{n}(1/2)^{20-n}
= (1/2)^{20} \sum_{n=0}^{20} \binom{20}{n}(1/2)^{n}
= (1/2)^{20}\,(1/2 + 1)^{20} = (3/4)^{20} = .00317.
\]
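The collapse of the sum to (3/4)^20 can be confirmed by computing the total-probability sum directly with exact rational arithmetic (a sketch in Python):

```python
from fractions import Fraction
from math import comb

# P[N = n] for N ~ Binomial(20, 1/2), and P[X = 0 | N = n] = (1/2)^n.
total = sum(
    Fraction(1, 2) ** n                          # P[X = 0 | N = n]
    * comb(20, n) * Fraction(1, 2) ** 20         # P[N = n]
    for n in range(21)
)
assert total == Fraction(3, 4) ** 20             # the two expressions agree exactly
print(float(total))  # ≈ 0.00317
```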
Exercise 2.3.1. The r.v. N is uniformly distributed on {1, 2, . . . , 6}, so

\[
\nu = E[N] = \frac{1+2+3+4+5+6}{6} = \frac{21}{6} = \frac{7}{2} = 3.5,
\]

and

\[
E[N^2] = \frac{1+4+9+16+25+36}{6} = \frac{91}{6},
\]

and so

\[
\tau^2 = \operatorname{Var}[N] = \frac{91}{6} - \frac{49}{4} = \frac{35}{12} = 2.91\overline{6}.
\]

Meanwhile, if ξk is the indicator of the event that the kth toss of the coin results in a head, then ξ1, ξ2, . . . are iid Bernoulli random variables with mean µ = 1/2 and variance σ² = 1/4. Of course, the total number of heads can be expressed as Z = ξ1 + ξ2 + · · · + ξN. Feeding this information into the general rule for the mean and variance of a random sum (formula (2.30) on page 59 of the text), we obtain

\[
E[Z] = \mu\nu = \frac{1}{2} \cdot \frac{7}{2} = \frac{7}{4} = 1.75
\]

and

\[
\operatorname{Var}[Z] = \nu\sigma^2 + \mu^2\tau^2 = \frac{7}{2} \cdot \frac{1}{4} + \frac{1}{4} \cdot \frac{35}{12} = \frac{77}{48} = 1.6041\overline{6}.
\]

On the other hand, we can (with a great deal more effort) compute the mass function of Z and from that the mean and variance of Z. By the Law of Total Probability and the given information, the marginal of Z is given by the formula

\[
P[Z = k] = \sum_{n=\max\{k,1\}}^{6} \binom{n}{k}(1/2)^{n}(1/6), \qquad k = 0, 1, 2, \ldots, 6.
\]
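Before evaluating these seven sums by hand, one can also let a machine do the exact arithmetic (a sketch in Python; `pmf`, `EZ`, `EZ2` are our names):

```python
from fractions import Fraction
from math import comb

# P[Z = k] = sum over die outcomes n of C(n, k) (1/2)^n (1/6).
pmf = {
    k: sum(comb(n, k) * Fraction(1, 2) ** n * Fraction(1, 6)
           for n in range(max(k, 1), 7))
    for k in range(7)
}
assert sum(pmf.values()) == 1

EZ = sum(k * p for k, p in pmf.items())
EZ2 = sum(k * k * p for k, p in pmf.items())
assert EZ == Fraction(7, 4)                 # matches the random-sum formula
assert EZ2 - EZ ** 2 == Fraction(77, 48)    # matches the random-sum formula

print([int(p * 384) for p in pmf.values()])  # numerators over 384: [63, 120, 99, 64, 29, 8, 1]
```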
With a little patience these seven sums can be evaluated; the result is

\[
\begin{aligned}
P[Z = 0] &= 63/384, & P[Z = 1] &= 120/384, & P[Z = 2] &= 99/384, & P[Z = 3] &= 64/384,\\
P[Z = 4] &= 29/384, & P[Z = 5] &= 8/384, & P[Z = 6] &= 1/384,
\end{aligned}
\]

which numbers (thankfully) add up to 1. Using these we have

\[
E[Z] = \frac{0 \cdot 63 + 1 \cdot 120 + 2 \cdot 99 + 3 \cdot 64 + 4 \cdot 29 + 5 \cdot 8 + 6 \cdot 1}{384} = \frac{672}{384} = 1.75,
\]

and

\[
E[Z^2] = \frac{0 \cdot 63 + 1 \cdot 120 + 4 \cdot 99 + 9 \cdot 64 + 16 \cdot 29 + 25 \cdot 8 + 36 \cdot 1}{384} = \frac{1792}{384} = \frac{14}{3},
\]

and finally

\[
\operatorname{Var}[Z] = \frac{14}{3} - \frac{49}{16} = \frac{77}{48},
\]

as before. If nothing else, this latter calculation of the mean and variance of Z demonstrates the utility of the Law of the Forgetful Statistician!

Exercise 2.3.5.
This is another exercise in the use of formula (2.30) on page 59. The
Poisson random variable N has mean and variance both equal to 2; that is, ν = 2 = τ². Meanwhile, µ = 3 and σ² = 4, so E[X] = νµ = 6 and Var[X] = νσ² + µ²τ² = 26.

Problem 2.3.2. In this problem, the r.v. N is binomial with parameters M and q while the conditional distribution of Z, given that N = n, is binomial with parameters n and p. (In this problem, q and p are unrelated numbers between 0 and 1.) Thus we can think of Z as

\[
Z = \sum_{k=1}^{N} \xi_k,
\]
where ξ1, ξ2, . . . are iid Bernoulli random variables with success probability p, and the ξk s are independent of N. By the Law of Total Probability, the marginal of Z is

\[
P[Z = k] = \sum_{n=k}^{M} \binom{M}{n} q^{n}(1-q)^{M-n} \binom{n}{k} p^{k}(1-p)^{n-k}, \qquad k = 0, 1, 2, \ldots, M.
\]

Make the change of variable j = n − k in this sum; the new range of summation is therefore 0 ≤ j ≤ M − k. There results

\[
P[Z = k] = \binom{M}{k}(qp)^{k} \sum_{j=0}^{M-k} \binom{M-k}{j} [q(1-p)]^{j} (1-q)^{M-k-j},
\]

which we can sum using the Binomial Theorem, leading to

\[
P[Z = k] = \binom{M}{k}(qp)^{k} \big[q(1-p) + (1-q)\big]^{M-k} = \binom{M}{k}(qp)^{k} [1 - qp]^{M-k},
\]
and the conclusion that Z has the binomial distribution with parameters M and pq.

A more conceptual approach to the Problem is the following. Think of N as the number of Heads when a q-dime is tossed M times. Each time the dime comes up Heads, a p-quarter is then tossed; if the dime comes up Tails the quarter is not tossed. Let Z count the number of times both dime and quarter come up Heads. Clearly Z has the binomial distribution with parameters M and pq, and the conditional distribution of Z, given that N = n, is binomial with parameters n and p.

Problem 2.3.4(a). Because N has the Poisson distribution, E[N] = Var[N] = λ. Using formula (2.30) from page 59 one last time,

\[
E[S_N] = \lambda\mu, \qquad \operatorname{Var}[S_N] = \lambda\sigma^2 + \lambda\mu^2.
\]
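Returning to Problem 2.3.2, the conclusion that Z is binomial with parameters M and pq can be verified exactly for sample parameter values (a sketch; the name `thinned_pmf` and the particular M, q, p are arbitrary choices):

```python
from fractions import Fraction
from math import comb

def thinned_pmf(M, q, p):
    # P[Z = k] = sum_n P[N = n] * P[Z = k | N = n],
    # with N ~ Bin(M, q) and Z | N = n ~ Bin(n, p).
    return [
        sum(comb(M, n) * q**n * (1 - q)**(M - n)
            * comb(n, k) * p**k * (1 - p)**(n - k)
            for n in range(k, M + 1))
        for k in range(M + 1)
    ]

M, q, p = 5, Fraction(1, 3), Fraction(1, 4)
direct = thinned_pmf(M, q, p)
binom = [comb(M, k) * (q * p)**k * (1 - q * p)**(M - k) for k in range(M + 1)]
assert direct == binom  # Z ~ Binomial(M, qp), exactly, term by term
```

Using `Fraction` parameters makes the comparison an exact identity rather than a floating-point approximation.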
Exercise 2.4.3. The joint density of T and U is

\[
f(t, u) = f_{U|T}(u \mid t) \cdot f_T(t) = \frac{1}{t}, \qquad 0 < u \le t \le 1,
\]

and is 0 elsewhere. Now integrate to obtain the required probability:

\[
P[U > 1/2] = \int_{1/2}^{1} \int_{1/2}^{t} \frac{1}{t}\, du\, dt = \int_{1/2}^{1} \big(1 - (2t)^{-1}\big)\, dt = \frac{1}{2}(1 - \log 2) = 0.1534\ldots
\]
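The closed form (1 − log 2)/2 agrees with a brute-force numerical integration of the joint density (a sketch using a midpoint Riemann sum; the grid size is arbitrary):

```python
from math import log

# Midpoint Riemann sum for the double integral of 1/t over
# the region 1/2 < u <= t <= 1.
N = 2000
h = 0.5 / N                       # grid step on [1/2, 1]
total = 0.0
for i in range(N):
    t = 0.5 + (i + 0.5) * h       # midpoint in t
    # inner integral over u from 1/2 to t of (1/t) du equals 1 - 1/(2t)
    total += (1.0 - 1.0 / (2.0 * t)) * h

exact = 0.5 * (1.0 - log(2.0))
print(round(total, 4), round(exact, 4))  # 0.1534 0.1534
assert abs(total - exact) < 1e-6
```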