Lecture 21

Mon Mar 1

Bernoulli Random Variable

P(X = 0) = 1 − p and P(X = 1) = p.

E(X) = p and Var(X) = p(1 − p).

You roll a die ten times. Let X be the number of sixes that you roll. For every k between 0 and 10, what is the probability that X = k? If k = 0 then none of the ten rolls was a six. Each roll fails to be a six with probability 5/6, and since the rolls are independent, P(X = 0) = (5/6)^{10}. For a general k we first choose which k of the ten rolls produced a six; there are \binom{10}{k} such choices. For each choice, the probability of sixes on exactly those k rolls and not on the other 10 − k is (1/6)^k (5/6)^{10−k}. Thus

P(X = k) = \binom{10}{k} (1/6)^k (5/6)^{10−k}.

Binomial Random Variable

We say that a random variable X has distribution Bin(n, p) if

P(X = k) = \binom{n}{k} p^k (1 − p)^{n−k}.

Thus in the previous example X is a binomial(10, 1/6) random variable. The binomial random variable is the distribution of the number of successes in n trials, each of which independently has probability p of success. In the example above, X was the sum of 10 independent Bernoulli random variables, each with probability of success (rolling a six) equal to 1/6. In general, if X is a binomial random variable with parameters n and p, then X has the same distribution as X_1 + X_2 + · · · + X_n, where the X_i are independent Bernoulli random variables which are 1 with probability p and 0 with probability 1 − p.

Blue eyes is a recessive trait. When two people both have brown eyes but each has a parent with blue eyes, their kids have blue eyes with probability 1/4 and brown eyes with probability 3/4. Alice and Bob both have brown eyes, but each has a parent with blue eyes. If Alice and Bob have four kids, what is the probability that

1. three of them have blue eyes?
2. three of them have brown eyes?
3. If five such families each have four kids, what is the probability that two of them have exactly three blue-eyed kids?

Let Y be the number of kids with blue eyes. Then Y is a binomial(4, 1/4) random variable. If three of the kids have blue eyes then Y = 3. This has probability

P(Y = 3) = \binom{4}{3} (1/4)^3 (3/4)^1 = 3/64.

If three of the kids have brown eyes then Y = 1. This has probability

P(Y = 1) = \binom{4}{1} (1/4)^1 (3/4)^3 = 27/64.

Let X be the number of the five families with exactly three blue-eyed children. Then X is a binomial(5, 3/64) random variable. If exactly two families have exactly three blue-eyed kids then X = 2 and

P(X = 2) = \binom{5}{2} (3/64)^2 (61/64)^3.

Mean and Variance of Binomial Random Variables

Let X be a Binomial(n, p) random variable. Then E(X) = np and Var(X) = np(1 − p). The first follows by linearity: each trial is a Bernoulli random variable with expected value p, so the expected value of the sum of n such random variables is np. Each of these Bernoulli random variables has variance p(1 − p), and since they are independent, the variance of the sum is the sum of the variances. Thus Var(X) = np(1 − p).

Let Y be a Bin(20, .25) random variable. What are E(Y), Var(Y), and E(Y^2)?
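These computations are easy to check numerically. Below is a minimal Python sketch; the function name `binom_pmf` is mine, not from the notes. It evaluates the die-roll and blue-eyes examples, and answers the Bin(20, .25) question using E(Y^2) = Var(Y) + E(Y)^2.

```python
from math import comb

def binom_pmf(n, k, p):
    """P(X = k) for X ~ Bin(n, p): choose which k trials succeed,
    then multiply the success and failure probabilities."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Ten die rolls, X = number of sixes: X ~ Bin(10, 1/6).
print(binom_pmf(10, 0, 1/6))    # equals (5/6)^10

# Four kids, Y = number with blue eyes: Y ~ Bin(4, 1/4).
print(binom_pmf(4, 3, 1/4))     # 3/64
print(binom_pmf(4, 1, 1/4))     # 27/64

# Y ~ Bin(20, 0.25): E(Y) = np, Var(Y) = np(1-p),
# and E(Y^2) = Var(Y) + E(Y)^2.
n, p = 20, 0.25
mean = n * p
var = n * p * (1 - p)
print(mean, var, var + mean**2)   # 5.0, 3.75, 28.75
```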


Lecture 22

Wed Mar 3

Review of Binomial Random Variables

Let X be a Binomial(n, p) random variable. Then

P(X = k) = \binom{n}{k} p^k (1 − p)^{n−k},

E(X) = np and Var(X) = np(1 − p).

The new lottery game in Washington is Powerball. The lottery chooses 5 of 56 balls and one of 39 “powerballs.” In order to win you must pick all five balls and the powerball correctly. The chance of winning the jackpot is about one in 195 million. If 130 million people play Powerball one week, what is the probability that nobody wins the jackpot? That at least two people win (and have to share) the jackpot?

Let Y be the number of people who choose all six numbers correctly. Then Y is a binomial(130,000,000, 1/195,000,000) random variable, so

P(Y = 0) = (1 − 1/195,000,000)^{130,000,000}

and

P(Y = 1) = 130,000,000 (1 − 1/195,000,000)^{129,999,999} (1/195,000,000).

Thus

P(Y ≥ 2) = 1 − P(Y = 0) − P(Y = 1)
         = 1 − (1 − 1/195,000,000)^{130,000,000} − 130,000,000 (1 − 1/195,000,000)^{129,999,999} (1/195,000,000).

This gives us an exact answer, but it is not very illuminating. To get a better sense for what this value is, we use the following approximation: when n is large,

(1 − 1/n)^n ≈ e^{−1}.
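The exact expressions above can be evaluated directly in Python; a small sketch (all variable names are mine), with the e^{−2/3} approximation alongside for comparison, since np = 130/195 = 2/3:

```python
from math import exp

n = 130_000_000        # number of players
p = 1 / 195_000_000    # each player's chance of winning the jackpot

p0 = (1 - p) ** n                   # P(Y = 0): nobody wins
p1 = n * (1 - p) ** (n - 1) * p     # P(Y = 1): exactly one winner
p_share = 1 - p0 - p1               # P(Y >= 2): the jackpot is shared

print(p0, p1, p_share)

# Compare with the approximation based on np = 2/3:
lam = n * p
print(exp(-lam), lam * exp(-lam), 1 - exp(-lam) * (1 + lam))
```

The exact and approximate values agree to many decimal places, which is the point of the approximation: the answer depends on np, not on n itself.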


So for any λ and large n,

(1 − 1/n)^{λn} = ((1 − 1/n)^n)^λ ≈ (e^{−1})^λ = e^{−λ}.

So P(Y = 0) ≈ e^{−2/3}, P(Y = 1) ≈ (2/3)e^{−2/3}, and P(Y ≥ 2) ≈ 1 − e^{−2/3}(1 + 2/3) ≈ .1443. The approximation depended only on np = 2/3 and did not depend on n.

Poisson Random Variable

Based on this we make the following definition. We say that a random variable X has distribution Poisson(λ) if

P(X = k) = e^{−λ} λ^k / k!

for all integers k ≥ 0. Based on what we showed before, if X_n is a Bin(n, p_n) random variable, Y is a Poisson(λ) random variable, and np_n → λ, then for all integers k ≥ 0,

P(X_n = k) → P(Y = k).

Mean and Variance

If Y is a Poisson(λ) random variable then E(Y) = Var(Y) = λ. To see this we calculate that E(X_n) = np_n → λ, and since p_n → 0 and 1 − p_n → 1,

Var(X_n) = np_n(1 − p_n) → λ.

The book derives the mean and variance of the Poisson random variable from the probability mass function. This is more rigorous but not too enlightening.
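The convergence of Bin(n, p_n) to Poisson(λ) can be seen numerically. A sketch, taking λ = 2/3 as in the Powerball example (the function names are mine): as n grows with np_n held at λ, the pointwise differences between the two pmfs shrink.

```python
from math import comb, exp, factorial

def binom_pmf(n, k, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam = 2 / 3
for n in (10, 100, 10_000):
    pn = lam / n    # chosen so that n * pn = lam
    diffs = [abs(binom_pmf(n, k, pn) - poisson_pmf(k, lam)) for k in range(4)]
    print(n, diffs)    # each column of differences shrinks as n grows
```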



Lecture 23

Fri Mar 5

Review of Poisson Random Variables

If Y is a Poisson(λ) random variable then

P(Y = k) = e^{−λ} λ^k / k!

for all integers k ≥ 0, and E(Y) = Var(Y) = λ.

Example of Data with a Poisson Distribution

In the late 19th century von Bortkiewicz studied 20 units of the Prussian army over the course of 10 years and counted how many deaths in each unit each year were due to horse kicks. There were a total of 122 deaths, or an average of .61 per unit per year. Because of this we compare the data with what we would expect if we sampled 200 independent Poisson(.61) random variables.

k    number of units    200 · P(Poisson(.61) = k)
0    109                108.7
1    65                 66.3
2    22                 20.2
3    3                  4.1
4    1                  .6
5    0                  .1

Poisson Paradigm

If X is the sum of many nearly independent random variables and E(X) = λ, then the distribution of X should be close to that of a Poisson(λ) random variable. In particular, if λ = np and n is large, then the distribution of a Bin(n, p) random variable is close to that of a Poisson(λ) random variable. The following random variables are likely to have a Poisson distribution.

1. the number of winners in a lottery
2. the number of overtime games in a week in the NFL
3. the number of people entering a store between 9:00 and 10:00 on weekdays
4. the number of seismic events in WA each day
5. the number of typos on a page in a book
6. the number of wrong numbers dialed each day in Seattle
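The expected-count column of the horse-kick table can be reproduced in a few lines of Python (a sketch; the dictionary of observed counts is copied from the table above):

```python
from math import exp, factorial

lam = 122 / 200    # 0.61 deaths per unit per year
observed = {0: 109, 1: 65, 2: 22, 3: 3, 4: 1, 5: 0}

for k, count in observed.items():
    # Expected number of the 200 unit-years with exactly k deaths
    expected = 200 * exp(-lam) * lam**k / factorial(k)
    print(k, count, round(expected, 1))
```

The close agreement between the observed and expected columns is what makes this a classic illustration of the Poisson paradigm.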

Change of Scale

Let Y be the number of people entering a store in an hour. A random variable like this can often have a Poisson distribution because there are lots of people independently deciding whether or not to enter the store. If on average λ people enter the store every hour, then the distribution of Y is Poisson(λ). If people are equally likely to enter the store at any time during the hour, then on average λ/2 people enter the store in the first half hour and λ/2 people enter the store in the second half hour. Let Y_1 be the number of people entering the store in the first half hour and Y_2 the number entering in the second half hour. In situations like this it is often reasonable to assume Y_1 and Y_2 are independent Poisson(λ/2) random variables.

About 1 in 200,000 people is diagnosed with ALS each year. There are approximately 600,000 people in Seattle.

1. What is the probability that there are two diagnoses of ALS in Seattle in 2011?
2. What is the probability that there is no diagnosis of ALS in April?
3. In a period of d days there is a 50% chance of someone in Seattle being diagnosed with ALS. What is d?
4. What is the probability that there are two months in 2011 with exactly one diagnosis of ALS?

A book has an average of 3 typos on each page.

1. What is the probability that there are no errors on half a page?
2. What is the probability that there are 7 errors on two pages?
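A sketch of how these exercises could be computed, using the change-of-scale idea above (the helper `poisson_pmf` and all variable names are mine, and the rate of 3 diagnoses per year comes from 600,000 / 200,000):

```python
from math import exp, factorial, log, comb

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam_year = 600_000 / 200_000    # 3 expected ALS diagnoses per year in Seattle

# 1. Exactly two diagnoses in 2011.
print(poisson_pmf(2, lam_year))

# 2. No diagnosis in April (one twelfth of a year, by change of scale).
print(poisson_pmf(0, lam_year / 12))

# 3. d with a 50% chance of at least one diagnosis in d days:
#    1 - exp(-3d/365) = 1/2, so d = 365 ln(2) / 3, about 84 days.
print(365 * log(2) / 3)

# 4. Exactly two of the twelve months have exactly one diagnosis:
#    each month is Poisson(1/4), "exactly one" has probability q,
#    and the number of such months is Bin(12, q).
q = poisson_pmf(1, lam_year / 12)
print(comb(12, 2) * q**2 * (1 - q)**10)

# Typos: 3 per page on average, so half a page is Poisson(1.5)
# and two pages are Poisson(6).
print(poisson_pmf(0, 1.5))    # no errors on half a page
print(poisson_pmf(7, 6))      # 7 errors on two pages
```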
