EE 302 Division 1. Homework 9 Solutions.

Problem 1. You roll a fair 6-sided die, and then you flip a fair coin the number of times shown by the die. Find the mean and the variance of the number of Heads.

Solution. Let H be the number of Heads and F the number of coin flips. The expectation and variance of H satisfy:

E[H] = E[E[H|F]],   (1)
var(H) = E[var(H|F)] + var(E[H|F]).   (2)

Notice that both var(H|F) and E[H|F] are random variables depending on F. When F takes the value f, they equal var(H|F = f) and E[H|F = f], the conditional variance and expectation of H given F = f. This observation gives us a way to determine var(H|F) and E[H|F]: given F = f, the conditional distribution of H is binomial with parameters n = f and p = 1/2, so the conditional expectation and variance are E[H|F = f] = np = f/2 and var(H|F = f) = np(1 − p) = f/4, respectively. Replacing f with F gives:

E[H|F] = F/2,   var(H|F) = F/4.

Combining these results with Eqs. (1, 2) and using the fact that F is a discrete uniformly distributed random variable on {1, 2, 3, 4, 5, 6} (so E[F] = 7/2 and var(F) = 35/12), we get:

E[H] = E[F]/2 = 7/4,
var(H) = E[F]/4 + (1/2)^2 var(F) = 7/8 + 35/48 = 77/48.
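As a sanity check, here is a small Monte Carlo sketch in Python (not part of the original solution; the seed and number of trials are arbitrary choices):

    import random

    # Roll a fair die to get F, then flip F fair coins and count Heads.
    random.seed(0)
    trials = 200_000
    heads = []
    for _ in range(trials):
        f = random.randint(1, 6)
        heads.append(sum(random.random() < 0.5 for _ in range(f)))

    mean = sum(heads) / trials
    var = sum((h - mean) ** 2 for h in heads) / trials
    print(mean, 7 / 4)    # sample mean vs. E[H] = 1.75
    print(var, 77 / 48)   # sample variance vs. var(H) ~ 1.604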

Problem 2. Sam and Pat are dating. All of their dates are scheduled to start at 9pm. Sam always arrives promptly at 9pm. Pat is highly disorganized, and arrives at a time that is uniformly distributed between 8pm and 10pm. Sam gets irritated when Pat is late. Let X be the time in hours between 8pm and the time when Pat arrives. When Pat arrives after 9pm, their date will last for a time that is uniformly distributed between 0 and (3 − X) hours. When Pat arrives before 9pm, their date will last exactly 3 hours. The date starts at the time that they meet. Sam will end the relationship after the second date on which Pat is late by more than 45 minutes. All dates are independent of one another.

(a) What is the expected number of hours Pat waits for Sam to arrive? (Remember that waiting time must be non-negative.)

Solution. Let W be the number of hours Pat waits for Sam to arrive. Given the value of X, W is determined by:

W(X) = 1 − X if 0 ≤ X < 1,
W(X) = 0 if 1 ≤ X ≤ 2.

Therefore we have:

E[W] = E[W(X)] = ∫_0^2 W(x) f_X(x) dx = ∫_0^1 (1 − x) · (1/2) dx = 1/4.
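A quick simulation supports this value (again an illustrative sketch; the trial count is arbitrary):

    import random

    # Pat's arrival time X ~ Uniform(0, 2) hours after 8pm; Pat waits
    # W = max(0, 1 - X) hours for Sam, who arrives exactly at 9pm.
    random.seed(0)
    trials = 200_000
    total = sum(max(0.0, 1.0 - random.uniform(0.0, 2.0)) for _ in range(trials))
    print(total / trials, 1 / 4)  # sample mean vs. E[W] = 1/4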

(b) What is the expected duration of any particular date?

Solution. Let the random variable D be the duration of a date. Given X = x, the conditional expectation of D is:

E[D|X = x] = 3 if 0 ≤ x ≤ 1, because D is the constant 3 in this case;
E[D|X = x] = (3 − x)/2 if 1 ≤ x ≤ 2, because D is uniformly distributed on [0, 3 − x].

Therefore we get:

E[D] = E[E[D|X]] = ∫_0^2 E[D|X = x] f_X(x) dx = ∫_0^1 3 · (1/2) dx + ∫_1^2 (1/2)(3 − x) · (1/2) dx = 3/2 + 3/8 = 15/8.
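The same kind of quick check works here, drawing X uniformly and then the duration from the stated conditional distribution (sketch only; seed and trial count are arbitrary):

    import random

    # Date duration D: exactly 3 hours if Pat arrives by 9pm (X <= 1),
    # otherwise Uniform(0, 3 - X).
    random.seed(0)
    trials = 200_000
    total = 0.0
    for _ in range(trials):
        x = random.uniform(0.0, 2.0)
        total += 3.0 if x <= 1.0 else random.uniform(0.0, 3.0 - x)
    print(total / trials, 15 / 8)  # sample mean vs. E[D] = 1.875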

(c) What is the expected number of dates they will have?

Solution. Let B be the total number of dates, A_1 the number of dates up to and including the first time Pat is late by more than 45 minutes, and A_2 the number of dates after that, so that B = A_1 + A_2. After the first A_1 dates, the dating process starts over and runs until Pat is again late by more than 45 minutes, so A_1 and A_2 have the same distribution and:

E[B] = E[A_1] + E[A_2] = 2E[A_1].

Note that A_1 is a geometric random variable with parameter p = P(X > 7/4) = 1/8 (since X is uniform on [0, 2], P(X > 7/4) = (1/4)/2 = 1/8), and so E[A_1] = 1/p = 8. We therefore have:

E[B] = 2 · 1/(1/8) = 16.
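One more simulation sketch (illustrative only), counting dates until the second time Pat is more than 45 minutes late:

    import random

    # Count dates until Pat has been more than 45 minutes late (X > 7/4)
    # for the second time.
    random.seed(0)
    trials = 100_000
    total = 0
    for _ in range(trials):
        late, dates = 0, 0
        while late < 2:
            dates += 1
            if random.uniform(0.0, 2.0) > 1.75:
                late += 1
        total += dates
    print(total / trials, 16)  # sample mean vs. E[B] = 16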

Problem 3. The wombat club has N members, where N is a random variable with PMF

p_N(n) = p^{n−1}(1 − p) for n = 1, 2, 3, . . . .

On the second Tuesday night of every month, the club holds a meeting. Each wombat member attends the meeting with probability q, independently of all other members. If a wombat attends the meeting, then it brings an amount of money, M, which is a continuous random variable with PDF

f_M(m) = λe^{−λm} for m ≥ 0.

N, M, and whether each wombat attends are all independent. Determine:

(a) The expectation and variance of the number of wombats showing up to the meeting.

Solution. Let X be the number of wombats who attend the meeting. Given the number of members N = n, the conditional distribution of X is binomial with parameters n and q, so the conditional expectation and variance are E[X|N = n] = nq and var(X|N = n) = nq(1 − q), respectively. Replacing n with N gives:

E[X|N] = Nq,   var(X|N) = Nq(1 − q),

and therefore we have:

E[X] = E[E[X|N]] = qE[N],   (3)
var(X) = E[var(X|N)] + var(E[X|N]) = q(1 − q)E[N] + q^2 var(N).   (4)

It is given that the PMF of N is geometric with parameter 1 − p, so E[N] = 1/(1 − p) and var(N) = p/(1 − p)^2. Plugging these into Eqs. (3, 4), we get:

E[X] = q/(1 − p),
var(X) = q(1 − q)/(1 − p) + q^2 p/(1 − p)^2.
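These closed forms can be spot-checked numerically. The sketch below picks illustrative values p = 0.6 and q = 0.3 (the problem leaves p and q symbolic) and simulates N and X directly:

    import random

    # Monte Carlo check with sample parameters p = 0.6, q = 0.3.
    random.seed(0)
    p, q = 0.6, 0.3
    trials = 200_000
    xs = []
    for _ in range(trials):
        n = 1
        while random.random() < p:  # N has PMF p^(n-1)(1-p)
            n += 1
        xs.append(sum(random.random() < q for _ in range(n)))  # attendees

    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / trials
    print(mean, q / (1 - p))                                      # E[X]
    print(var, q * (1 - q) / (1 - p) + q * q * p / (1 - p) ** 2)  # var(X)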

(b) The transform of the PDF for the total amount of money brought to the meeting.

Solution. Let M_i be the amount of money brought by the i-th wombat who attends the meeting, and let T be the total amount. We have:

T = Σ_{i=1}^{X} M_i,

where the M_i are i.i.d. continuous random variables. By the properties of sums of a random number of independent random variables, the moment generating function of T can be found by replacing each occurrence of e^s in M_X(s) with M_M(s). Since the M_i are exponential random variables with parameter λ, we have M_M(s) = λ/(λ − s). It remains to find M_X(s). If we define the K_i to be a sequence of independent Bernoulli random variables with parameter q, i.e.:

p_K(k) = q if k = 1,   1 − q if k = 0,   0 otherwise,

then X can be represented as:

X = Σ_{i=1}^{N} K_i,

which is again a sum of a random number of i.i.d. random variables. Therefore M_X(s) can be obtained by replacing each e^s in M_N(s) with M_K(s). Recall that:

M_K(s) = 1 − q + qe^s,   M_N(s) = (1 − p)e^s / (1 − pe^s).

Then we have:

M_X(s) = (1 − p)(1 − q + qe^s) / (1 − p(1 − q + qe^s)).

Furthermore, if we replace each e^s in the above expression with M_M(s), we get:

M_T(s) = (1 − p)(1 − q + qM_M(s)) / (1 − p(1 − q + qM_M(s)))
       = (1 − p)(1 − q + qλ/(λ − s)) / (1 − p(1 − q + qλ/(λ − s)))
       = (1 − p)[(1 − q)(λ − s) + qλ] / (λ − s − p[(1 − q)(λ − s) + qλ])
       = (1 − p)(λ − s + qs) / (λ − s − p(λ − s + qs)).

Problem 4. Random variables X and Y have the following joint PDF:

f_{X,Y}(x, y) = 0.1 if both −1 ≤ x ≤ 1 and −2 ≤ y ≤ 2,
f_{X,Y}(x, y) = 0.1 if both 1 ≤ x ≤ 2 and −1 ≤ y ≤ 1,
f_{X,Y}(x, y) = 0 otherwise.

(a) Prepare neat, fully labeled sketches of f_{X|Y}(x|y).

Solution. The support of the joint PDF is shown in Figure 1(a). The conditional PDF f_{X|Y}(x|y) is obtained by renormalizing the slice of f_{X,Y}(x, y) at height y, viewed as a function of x. These conditional PDFs are uniform because f_{X,Y}(x, y) is constant on its support; the range depends on the value of y, and the height follows from normalization. The results are depicted in Figure 1(b).

(b) Find E[X|Y = y] and var(X|Y = y).

Solution. From the conditional PDF f_{X|Y}(x|y) and the properties of the uniform distribution, we get:

E[X|Y = y] = 0 if −2 ≤ y < −1 or 1 < y ≤ 2,
E[X|Y = y] = 0.5 if −1 ≤ y ≤ 1;

var(X|Y = y) = 2^2/12 = 1/3 if −2 ≤ y < −1 or 1 < y ≤ 2,
var(X|Y = y) = 3^2/12 = 3/4 if −1 ≤ y ≤ 1.

(c) Find E[X].

Solution. By the total expectation theorem (and using P(−1 ≤ Y ≤ 1) = ∫_{−1}^{1} 0.3 dy = 0.6),

E[X] = E[X | −1 ≤ Y ≤ 1] P(−1 ≤ Y ≤ 1) + E[X | Y > 1 or Y < −1] P(Y > 1 or Y < −1)
     = 0.5 · 0.6 + 0 · P(Y > 1 or Y < −1)
     = 0.3.

[Figure 1: Illustrations to Problem 4. (a) Support of f_{X,Y}(x, y). (b) f_{X|Y}(x|y): uniform with height 1/2 on −1 ≤ x ≤ 1 when −2 ≤ y < −1 or 1 < y ≤ 2, and uniform with height 1/3 on −1 ≤ x ≤ 2 when −1 ≤ y ≤ 1.]
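The conditional means and variances in part (b) can be spot-checked by sampling from the joint PDF, which is a mixture of two uniform rectangles with probability masses 0.8 and 0.2. A rough sketch (the band boundaries and sample count are arbitrary choices):

    import random

    # Sample (X, Y): with probability 0.8 uniform on [-1,1] x [-2,2]
    # (mass 0.1 * 8), with probability 0.2 uniform on [1,2] x [-1,1]
    # (mass 0.1 * 2).
    random.seed(0)
    samples = []
    for _ in range(400_000):
        if random.random() < 0.8:
            samples.append((random.uniform(-1, 1), random.uniform(-2, 2)))
        else:
            samples.append((random.uniform(1, 2), random.uniform(-1, 1)))

    def cond_stats(lo, hi):
        """Empirical mean and variance of X over samples with lo < Y <= hi."""
        xs = [x for x, y in samples if lo < y <= hi]
        m = sum(xs) / len(xs)
        return m, sum((x - m) ** 2 for x in xs) / len(xs)

    print(cond_stats(1, 2))    # expect about (0, 1/3)
    print(cond_stats(-1, 1))   # expect about (0.5, 3/4)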

(d) Find var(X) using the law of conditional variances.

Solution. By the law of conditional variance, we have:

var(X) = E[var(X|Y)] + var(E[X|Y]).   (5)

The first term on the right-hand side is (using f_Y(y) = 0.2 for 1 < |y| ≤ 2 and f_Y(y) = 0.3 for |y| ≤ 1):

E[var(X|Y)] = ∫_{−∞}^{∞} var(X|Y = y) f_Y(y) dy
            = ∫_{−2}^{−1} (1/3) · 0.2 dy + ∫_{1}^{2} (1/3) · 0.2 dy + ∫_{−1}^{1} (3/4) · 0.3 dy
            = 1/15 + 1/15 + 9/20 = 7/12.

The second term on the right-hand side of Eq. (5) is:

var(E[X|Y]) = E[(E[X|Y])^2] − (E[E[X|Y]])^2 = ∫_{−∞}^{∞} (E[X|Y = y])^2 f_Y(y) dy − (E[X])^2
            = ∫_{−1}^{1} 0.25 · 0.3 dy − 0.3^2
            = 0.15 − 0.09 = 0.06.

Plugging these results into Eq. (5) gives:

var(X) = 7/12 + 0.06 ≈ 0.6433.
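The same two-rectangle mixture gives a direct check of E[X] and var(X) (an illustrative sketch only):

    import random

    # Direct check of E[X] and var(X): only the x-coordinate matters here.
    random.seed(0)
    xs = []
    for _ in range(400_000):
        if random.random() < 0.8:
            xs.append(random.uniform(-1, 1))   # rectangle [-1,1] x [-2,2]
        else:
            xs.append(random.uniform(1, 2))    # rectangle [1,2] x [-1,1]

    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    print(mean, 0.3)           # E[X] = 0.3
    print(var, 7 / 12 + 0.06)  # var(X) = 7/12 + 0.06 ~ 0.6433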

[Figure 2: Illustrations to Problem 5. (a) Support of f_{X,Y}(x, y). (b) The optimal estimator g(x) = E[Y|x]: g(x) = 0.5 for 0 ≤ x < 1 and g(x) = x − 0.5 for 1 ≤ x ≤ 2.]

Problem 5. Random variables X and Y have the following joint PDF:

f_{X,Y}(x, y) = C if both 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1,
f_{X,Y}(x, y) = C if both 1 ≤ x ≤ 2 and x − 1 ≤ y ≤ x,
f_{X,Y}(x, y) = 0 otherwise.

The experimental value of X will be revealed to us; we have to design an estimator g(X) of Y that minimizes the conditional expectation E[(Y − g(X))^2 | X = x], for all x, over all possible estimators. Provide a plot of the optimal estimator as a function of its argument.

Solution. The support of the joint PDF is depicted in Figure 2(a). We need to find:

g(X) = E[Y|X].

Because the joint PDF is uniform, the conditional PDF f_{Y|X}(y|x) is uniform for each possible x, and therefore the conditional expectation E[Y|X = x] is just the midpoint of the support of the conditional PDF, as shown in Figure 2(a). The final answer is:

g(x) = E[Y|X = x] = 0.5 if 0 ≤ x < 1,
g(x) = E[Y|X = x] = x − 0.5 if 1 ≤ x ≤ 2,

as depicted in Figure 2(b).
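As an informal check of the estimator, one can sample (X, Y) uniformly over the support (the two regions have equal area, so C = 1/2) and compare the empirical conditional mean of Y near a chosen x against g(x); the window half-width eps below is an arbitrary smoothing choice:

    import random

    # Sample (X, Y) uniformly over the support of Problem 5: the unit
    # square and the diagonal band each have area 1, so each gets half
    # of the samples.
    random.seed(0)
    samples = []
    for _ in range(400_000):
        if random.random() < 0.5:
            samples.append((random.uniform(0, 1), random.uniform(0, 1)))
        else:
            x = random.uniform(1, 2)
            samples.append((x, random.uniform(x - 1, x)))

    def cond_mean_y(x0, eps=0.02):
        """Empirical mean of Y over samples with X within eps of x0."""
        ys = [y for x, y in samples if abs(x - x0) < eps]
        return sum(ys) / len(ys)

    print(cond_mean_y(0.5), 0.5)  # g(0.5) = 0.5
    print(cond_mean_y(1.5), 1.0)  # g(1.5) = 1.5 - 0.5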

Problem 6. The value of a random variable X is transmitted, but what we receive is the value y of Y = X + W. Suppose that X and W are i.i.d. (i.e., independent, identically distributed). Find the least squares estimate X̂ of X based on Y = y, i.e., the estimate of X which minimizes E[(X̂ − X)^2 | Y = y].

Solution. Because of the symmetry of the situation, the least-squares estimates of X and of W based on Y = y must be the same. They should sum to y, and therefore the answer is:

X̂ = y/2.

Here is a rigorous justification of this argument. We are looking for:

X̂ = E[X|Y = y] = ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx.

The conditional PDF can be found as follows:

f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = f_{Y|X}(y|x) f_X(x) / f_Y(y).   (6)

When X = x is given, Y = W + x, so f_{Y|X}(y|x) is just a shifted version of f_W: f_{Y|X}(y|x) = f_W(y − x). Because W and X have the same distribution, we can call both PDFs f: f_X = f_W = f. Substituting these results into Eq. (6) gives:

f_{X|Y}(x|y) = f(y − x) f(x) / f_Y(y).

Notice that the above function of x is symmetric about x = y/2, because:

f_{X|Y}(y/2 + a | y) = f(y/2 − a) f(y/2 + a) / f_Y(y) = f_{X|Y}(y/2 − a | y) for all a ∈ R.

Therefore the conditional expectation is just the center of symmetry of the conditional PDF, and:

X̂ = E[X|Y = y] = y/2.
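The problem leaves the common distribution of X and W unspecified, so a simulation can only check a particular instance. The sketch below takes X and W i.i.d. Exponential(1) (an arbitrary choice) and estimates E[X | Y ≈ y] by averaging over a small window around y:

    import random

    # Check Xhat = y/2 for one concrete case: X, W i.i.d. Exponential(1).
    random.seed(0)
    samples = []
    for _ in range(400_000):
        x = random.expovariate(1.0)
        samples.append((x, x + random.expovariate(1.0)))

    def cond_mean_x(y0, eps=0.02):
        """Empirical mean of X over samples with Y within eps of y0."""
        xs = [x for x, y in samples if abs(y - y0) < eps]
        return sum(xs) / len(xs)

    for y0 in (0.5, 1.0, 2.0):
        print(y0, cond_mean_x(y0), y0 / 2)  # estimate vs. y/2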
