TABLE OF CONTENTS

1. Probability
   A. Properties of Probability  1
   B. Methods of Enumeration: Selection Without Replacement  7
   C. Methods of Enumeration: Selection with Replacement  15
   D. Conditional Probability  21
   E. Independent Events  25
   F. Bayes' Theorem  29

2. Discrete Distributions
   A. Discrete Probabilities  47
   B. Density Histograms and Sample Percentiles  51
   C. Moments of Discrete Variables  57
   D. Skewness  63
   E. Moment Generating Functions  67
   F. Bernoulli Trials and the Binomial Distribution  71
   G. Poisson Distribution  85
   H. Negative Binomial Distribution  109

3. Continuous Distributions
   A. Probability Density and Distribution Functions of Continuous Variables  115
   B. Modes of Continuous Distributions  123
   C. Percentiles and Medians  127
   D. Moments and Expectations of Continuous Distributions  131
   E. Distributions of Functions of a Random Variable  139
   F. Uniform Distribution  145
   G. Exponential and Gamma Distributions  153
   H. Normal Distribution  169
   I. Normal Approximations for Discrete Distributions  175
   J. Lognormal Distribution  181
   K. Beta, Pareto, and Weibull Distributions  185

4. Multivariate Distributions
   A. Joint Probability Distributions of Two Random Variables  189
   B. Marginal Density Functions  201
   C. Independent Random Variables  207
   D. Conditional Distributions  213
   E. Conditional Moments: Discrete Cases  223
   F. Conditional Moments: Continuous Cases  239
   G. Transformations of Random Variables  247
   H. Order Statistics  251
   I. Multinomial Distribution  255

5. Other Topics
   A. Covariance  257
   B. Correlation Coefficient  267
   C. Bivariate Normal Distribution  271
   D. Sums of Independent Random Variables  273
   E. Chi-Square Distribution  283
   F. Inequalities and the Central Limit Theorem  289

6. Anderson
   A. Deductibles and Limits  291
   B. Inflation and Coinsurance  303

NOTES

Questions and parts of some solutions have been taken from material copyrighted by the Casualty Actuarial Society and the Society of Actuaries. They are reproduced in this study manual with the permission of the CAS and SoA solely to aid students studying for the actuarial exams. Some editing of questions has been done. Students may also request past exams directly from both societies. I am very grateful to these organizations for their cooperation and permission to use this material. They are, of course, in no way responsible for the structure or accuracy of the manual.

Exam questions are identified by numbers in parentheses at the end of each question. CAS questions have four numbers separated by hyphens: the year of the exam, the number of the exam, the number of the question, and the points assigned. SoA or joint exam questions usually lack the number for points assigned. W indicates a written-answer question; for questions of this type, the number of points assigned is also given. A indicates a question from the afternoon part of an exam. MC indicates that a multiple-choice question has been converted into a true/false question.

Page references refer to Michael A. Bean, Probability: The Science of Uncertainty with Applications to Investments, Insurance, and Engineering (2005); Saeed Ghahramani, Fundamentals of Probability with Stochastic Processes (2005); Matthew J. Hassett and Donald G. Stewart, Probability for Risk Management (1999); Robert V. Hogg and Elliot A. Tanis, Probability and Statistical Inference (2001); Irwin Miller and Marylees Miller, John E. Freund's Mathematical Statistics with Applications (2004); Sheldon Ross, A First Course in Probability (2002); and Dennis D. Wackerly, William Mendenhall III, and Richard Scheaffer, Mathematical Statistics with Applications (2002).

Although I have made a conscientious effort to eliminate mistakes and incorrect answers, I am certain some remain. I am very grateful to students who discovered errors in the past and encourage those of you who find others to bring them to my attention. Please check our web site for corrections subsequent to publication. I would also like to thank Jenny Carlson for doing initial solutions for most of the problems in the manual.

Hanover, NH 6/30/11 PJM



PAST CAS AND SoA EXAMINATION QUESTIONS ON MULTIVARIATE DISTRIBUTIONS

A. Joint Probability Distributions of Two Random Variables

A1. Let the joint density function of X and Y be given by the following:
    f(x, y) = x + y for 0 < x < 1 and 0 < y < 1; 0 otherwise
What is P(X < 2Y)?
A. 7/32   B. 1/4   C. 3/4   D. 19/24   E. 7/8   (792–210)

A2. Suppose that the joint density function of X and Y is uniform over the region R = {(x, y) | x + y < 2, x > 0, y > 0}. What is the probability that exactly one of the two events A = {X < 1} and B = {Y > 1} occurs?
A. 1/16   B. 1/4   C. 1/2   D. 5/8   E. 3/4   (792–214)

A3. Suppose X and Y have the following joint density function:
    f(x, y) = x + (10/3)y² for 0 ≤ x ≤ y ≤ 1; 0 otherwise
What is E[XY]?
A. 19/60   B. 1/3   C. 11/30   D. 2/5   E. 31/90   (81F–2–38)

A4. Let the joint density function for X and Y be given by the following:
    f(x, y) = kxy² for 0 < x < y < 1; 0 otherwise
What is the value of k?
A. 1   B. 2   C. 5   D. 6   E. 10   (83S–218)

A5. Let X and Y be continuous random variables with the following joint density function:
    f(x, y) = kx⁻³e^(−y/3) for 1 < x < ∞ and 1 < y < ∞; 0 otherwise
Then k =
A. (1/3)e^(1/3)   B. e^(−1/3)   C. (2/3)e^(1/3)   D. (3/2)e^(−1/3)   E. e^(1/3)   (88S–110–50)

Solutions are based on Bean, pp. 29–35, 104–19, 325–40; Ghahramani, pp. 311–25, 369–75, 378–82; Hassett, pp. 269–70, 274–78, 293–304, 320–29; Hogg, pp. 222–33; Miller, pp. 92–100; Ross, pp. 239–47; and Wackerly, pp. 210–22, 241–49, 255–61.

A1. P(X < 2Y) = ∫_0^1 ∫_{x/2}^1 (x + y) dy dx = ∫_0^1 [xy + y²/2]_{y=x/2}^{y=1} dx = ∫_0^1 (x + 1/2 − 5x²/8) dx
    P(X < 2Y) = [x²/2 + x/2 − (5/24)x³]_0^1 = 19/24
    Answer: D
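A quick numeric spot-check of A1 (not part of the original solution) is sketched below; it assumes SciPy is available and uses scipy.integrate.dblquad, whose integrand takes the inner variable first.

```python
# Illustrative check for A1, assuming SciPy is installed.
from scipy.integrate import dblquad

# Integrate f(x, y) = x + y over 0 < x < 1 and x/2 < y < 1 (the event X < 2Y).
prob, _ = dblquad(lambda y, x: x + y, 0, 1, lambda x: x / 2, lambda x: 1)
print(prob, 19 / 24)  # both print 0.79166...
```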

A2. The region described by the inequalities is a triangle bounded by the lines x + y = 2, x = 0, and y = 0. It can be broken up into two smaller triangles and a unit square. The upper triangle satisfies the additional constraints X < 1 and Y > 1; the lower triangle satisfies X > 1 and Y < 1; and the square satisfies X < 1 and Y < 1. In the upper triangle both events occur, and in the lower triangle neither event occurs, so neither of these regions is included in the desired probability. In the square, which comprises 1/2 of the total area, exactly one of the events occurs (namely X < 1).
    Answer: C

A3. E[XY] = ∫_0^1 ∫_0^y xy[x + (10/3)y²] dx dy = ∫_0^1 [x³y/3 + (5/3)x²y³]_{x=0}^{x=y} dy = ∫_0^1 (y⁴/3 + (5/3)y⁵) dy
    E[XY] = [y⁵/15 + 5y⁶/18]_0^1 = 1/15 + 5/18 = 31/90
    Answer: E
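A hedged numeric cross-check of A3 (not part of the original solution), again assuming SciPy; here the outer variable is y and the inner variable is x.

```python
# Illustrative check for A3, assuming SciPy is installed.
from scipy.integrate import dblquad

# Outer variable y runs over (0, 1); inner variable x runs over (0, y).
val, _ = dblquad(lambda x, y: x * y * (x + (10 / 3) * y**2), 0, 1, lambda y: 0, lambda y: y)
print(val, 31 / 90)  # both print 0.3444...
```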

A4. Since the total probability must equal one, we get:
    1 = ∫_0^1 ∫_0^y kxy² dx dy = ∫_0^1 [kx²y²/2]_{x=0}^{x=y} dy = ∫_0^1 ky⁴/2 dy = [ky⁵/10]_0^1 = k/10
    k = 10
    Answer: E

A5. Since the probabilities must integrate to unity, we get:
    1 = ∫_1^∞ ∫_1^∞ kx⁻³e^(−y/3) dy dx = ∫_1^∞ [−3kx⁻³e^(−y/3)]_{y=1}^{∞} dx = ∫_1^∞ 3e^(−1/3)kx⁻³ dx = [−(3/2)e^(−1/3)kx⁻²]_1^∞ = (3/2)e^(−1/3)k
    1 = (3/2)e^(−1/3)k
    k = (2/3)e^(1/3)
    Answer: C
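A numeric spot-check of the A5 normalization (not part of the original solution), assuming SciPy and NumPy are available; infinite limits are passed directly to dblquad.

```python
# Illustrative check for A5: the unscaled integral should equal 1/k = (3/2)e^(-1/3).
import numpy as np
from scipy.integrate import dblquad

total, _ = dblquad(lambda y, x: x**-3 * np.exp(-y / 3), 1, np.inf, 1, np.inf)
print(total, 1.5 * np.exp(-1 / 3))  # both print 1.0747...
```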


A6. Let X and Y be continuous random variables with the following joint density function:
    f(x, y) = .25 for 0 ≤ x ≤ 2 and x − 2 ≤ y ≤ x; 0 otherwise
What is E[X³Y]?
A. 6/5   B. 4/3   C. 2   D. 4   E. 24/5   (90S–110–2)

A7. Let X and Y be discrete random variables with the following joint probability function:
    f(x, y) = 1/6 for x = 1, 2, 3 and y = 2, 3; 0 otherwise
If U = X + Y, what is the probability function of U?
A. g(u) = 1/4 for u = 3, 4, 5, 6; 0 otherwise
B. g(u) = 2/5 for u = 3; 1/5 for u = 4, 5, 6; 0 otherwise
C. g(u) = 1/6 for u = 3, 6; 1/3 for u = 4, 5; 0 otherwise
D. g(u) = 1/5 for u = 2, 3, 4, 5, 6; 0 otherwise
E. g(u) = 1/4 for u = 2, 3, 4, 5; 0 otherwise
(90S–110–9)

A8. Let X and Y be continuous random variables with the following joint density function:
    f(x, y) = 2(x + y) for 0 < x < y < 1; 0 otherwise
Then E[Y] =
A. 5/12   B. 1/2   C. 3/4   D. 1   E. 7/6   (90S–110–25) (Sample–1–37)

A9. Let X and Y be continuous random variables with joint density function f(x, y) and marginal density functions fX and fY, respectively, that are nonzero only on the interval (0, 1). Which one of the following statements is always true?
A. E[X²Y³] = (∫_0^1 x² dx)(∫_0^1 y³ dy)
B. E[X²] = ∫_0^1 x² f(x, y) dx
C. E[X²Y³] = (∫_0^1 x² f(x, y) dx)(∫_0^1 y³ f(x, y) dy)
D. E[X²] = ∫_0^1 x² fX(x) dx
E. E[Y³] = ∫_0^1 y³ fX(x) dx
(90S–110–38)

A10. Let X and Y be continuous random variables with the following joint density function:
    f(x, y) = xy for 0 ≤ x ≤ 2 and 0 ≤ y ≤ 1; 0 otherwise
What is P(X/2 ≤ Y ≤ X)?
A. 3/32   B. 1/8   C. 1/4   D. 3/8   E. 3/4   (90S–110–49)

A6. E[X³Y] = ∫_0^2 ∫_{x−2}^x .25x³y dy dx = ∫_0^2 (1/8)[x³y²]_{y=x−2}^{y=x} dx = ∫_0^2 (1/8)[x³x² − x³(x − 2)²] dx = ∫_0^2 (x⁴/2 − x³/2) dx
    E[X³Y] = [x⁵/10 − x⁴/8]_0^2 = (2)⁴(1/5 − 1/8) = 6/5
    Answer: A
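A hedged numeric check of A6 (not part of the original solution), assuming SciPy is available.

```python
# Illustrative check for A6: integrate x^3 * y against the density 0.25 on the band x - 2 <= y <= x.
from scipy.integrate import dblquad

val, _ = dblquad(lambda y, x: 0.25 * x**3 * y, 0, 2, lambda x: x - 2, lambda x: x)
print(val, 6 / 5)  # both print 1.2
```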

A7. Construct a joint probability table and sum the probabilities for the different values of U:

    (X, Y)     (1, 2)  (1, 3)  (2, 2)  (2, 3)  (3, 2)  (3, 3)
    U             3       4       4       5       5       6
    P(X, Y)      1/6     1/6     1/6     1/6     1/6     1/6

    g(3) = g(6) = 1/6 and g(4) = g(5) = 1/3
    Answer: C
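The same tally can be reproduced by brute-force enumeration; this sketch (not part of the original solution) uses only the standard library.

```python
# Illustrative check for A7: enumerate the six equally likely pairs and tally U = X + Y.
from collections import Counter
from fractions import Fraction

g = Counter()
for x in (1, 2, 3):
    for y in (2, 3):
        g[x + y] += Fraction(1, 6)
print(dict(g))  # g(3) = g(6) = 1/6, g(4) = g(5) = 1/3
```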

A8. E[Y] = ∫_0^1 ∫_0^y 2y(x + y) dx dy = ∫_0^1 [x²y + 2xy²]_{x=0}^{x=y} dy = ∫_0^1 3y³ dy = [3y⁴/4]_0^1 = 3/4
    Answer: C

A9. For any function u, E[u(X)] = ∫_0^1 u(x) fX(x) dx, where fX is the marginal density of X. In particular,
    E[X²] = ∫_0^1 x² fX(x) dx
    which is statement D. (The other statements either use the wrong density or omit the density altogether.)
    Answer: D

A10. P(X/2 ≤ Y ≤ X) = ∫_0^1 ∫_{x/2}^x xy dy dx + ∫_1^2 ∫_{x/2}^1 xy dy dx = ∫_0^1 [xy²/2]_{y=x/2}^{y=x} dx + ∫_1^2 [xy²/2]_{y=x/2}^{y=1} dx
     P(X/2 ≤ Y ≤ X) = ∫_0^1 (x³/2 − x³/8) dx + ∫_1^2 (x/2 − x³/8) dx = [3x⁴/32]_0^1 + [x²/4 − x⁴/32]_1^2
     P(X/2 ≤ Y ≤ X) = (3/32 − 0) + (1 − 1/2) − (1/4 − 1/32) = 3/8
     Answer: D
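A numeric cross-check of A10 (not part of the original solution), assuming SciPy; using min(x, 1) as the upper y-limit handles both pieces of the region in one call.

```python
# Illustrative check for A10, assuming SciPy is installed.
from scipy.integrate import dblquad

prob, _ = dblquad(lambda y, x: x * y, 0, 2, lambda x: x / 2, lambda x: min(x, 1.0))
print(prob, 3 / 8)  # both print 0.375
```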


A11. Let X and Y be discrete random variables with joint probability function
    f(x, y) = y/(24x) for x = 1, 2, 4; y = 2, 4, 8; x ≤ y; 0 otherwise
What is P(X + Y/2 ≤ 5)?
A. 1/8   B. 7/24   C. 3/8   D. 5/8   E. 17/24   (92S–110–6)

A12. Let X1 and X2 be random variables with joint moment generating function M(t1, t2) = .3 + (.1)exp(t1) + (.2)exp(t2) + (.4)exp(t1 + t2). What is E[2X1 − X2]?
A. .1   B. .4   C. .8   D. .2e + .4e²   E. .3 + (.1)exp(2t1) + (.2)exp(t2) + (.4)exp(2t1 − t2)
(92S–110–16)

A13. Let X and Y be discrete random variables with joint probability function given by the following table:

    (x, y)     (0, 0)  (0, 1)  (1, 0)  (1, 1)  (2, 0)  (2, 1)
    p(x, y)       0      1/5     2/5     1/5     1/5      0

What is the variance of Y − X?
A. 4/25   B. 16/25   C. 26/25   D. 5/4   E. 7/5   (92S–110–40)

A14. Let X and Y be discrete random variables with the following joint probability function:
    p(x, y) = 2^(x+1−y)/9 for x = 1, 2 and y = 1, 2; 0 otherwise
Calculate E[X/Y].
A. 8/9   B. 5/4   C. 4/3   D. 25/18   E. 5/3   (96W–11029)

A15. Let X and Y be random losses with the following joint density function:
    f(x, y) = e^(−(x+y)) for x > 0 and y > 0; 0 otherwise
An insurance policy is written to reimburse (X + Y). Calculate the probability that the reimbursement is less than 1.
A. e⁻²   B. e⁻¹   C. 1 − e⁻¹   D. 1 − 2e⁻¹   E. 1 − 2e⁻²   (Sample–1–4)


A11. The specified outcome consists of the following five pairs: (1, 2), (1, 4), (1, 8), (2, 2), (2, 4). Summing their respective probabilities gives:
     P(Outcome) = 2/24 + 4/24 + 8/24 + 2/48 + 4/48 = 17/24
     Answer: E

A12. E[X1] = ∂M/∂t1 (0, 0) = (.1)exp(0) + (.4)exp(0) = .5
     E[X2] = ∂M/∂t2 (0, 0) = (.2)exp(0) + (.4)exp(0) = .6
     E[2X1 − X2] = 2E[X1] − E[X2] = (2)(.5) − (.6) = .4
     Answer: B

A13. E[Y − X] = (0)(0) + (1)(1/5) + (−1)(2/5) + (0)(1/5) + (−2)(1/5) + (−1)(0) = −3/5
     E[(Y − X)²] = (0)²(0) + (1)²(1/5) + (−1)²(2/5) + (0)²(1/5) + (−2)²(1/5) + (−1)²(0) = 7/5
     Var(Y − X) = E[(Y − X)²] − (E[Y − X])² = 7/5 − (−3/5)² = 26/25
     Answer: C
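The moments of Y − X can also be tallied directly from the table; a small sketch (not part of the original solution) using only the standard library:

```python
# Illustrative check for A13: enumerate the joint probability table.
from fractions import Fraction as F

p = {(0, 0): F(0), (0, 1): F(1, 5), (1, 0): F(2, 5),
     (1, 1): F(1, 5), (2, 0): F(1, 5), (2, 1): F(0)}
m1 = sum(pr * (y - x) for (x, y), pr in p.items())
m2 = sum(pr * (y - x) ** 2 for (x, y), pr in p.items())
print(m2 - m1**2)  # 26/25
```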

A14. E[X/Y] = (1/1)p(1, 1) + (1/2)p(1, 2) + (2/2)p(2, 2) + (2/1)p(2, 1)
     E[X/Y] = (1)(2^(1+1−1))/9 + (1/2)(2^(1+1−2))/9 + (1)(2^(2+1−2))/9 + (2)(2^(2+1−1))/9 = 25/18
     Answer: D

A15. P(X + Y < 1) = P(Y < 1 − X) = ∫_0^1 ∫_0^{1−x} e^(−(x+y)) dy dx = ∫_0^1 [−e^(−(x+y))]_{y=0}^{y=1−x} dx = ∫_0^1 (e^(−x) − e^(−1)) dx
     P(X + Y < 1) = [−e^(−x) − xe^(−1)]_0^1 = (−e^(−1) − e^(−1)) − (−1) = 1 − 2e^(−1)
     Answer: D
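A numeric spot-check of A15 (not part of the original solution), assuming SciPy and NumPy are available.

```python
# Illustrative check for A15: P(X + Y < 1) with density e^-(x+y) on the first quadrant.
import numpy as np
from scipy.integrate import dblquad

prob, _ = dblquad(lambda y, x: np.exp(-(x + y)), 0, 1, lambda x: 0, lambda x: 1 - x)
print(prob, 1 - 2 / np.e)  # both print 0.2642...
```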


A16. Given that the future lifetimes (in months) of two components of a machine have the following joint density function, what is the probability that both components are still functioning twenty months from now?
    f(x, y) = (6/125,000)(50 − x − y) for 0 < x < 50 − y < 50; 0 otherwise
A. (6/125,000) ∫_0^20 ∫_0^20 (50 − x − y) dy dx
B. (6/125,000) ∫_20^30 ∫_20^{50−x} (50 − x − y) dy dx
C. (6/125,000) ∫_20^50 ∫_20^{50−x} (50 − x − y) dy dx
D. (6/125,000) ∫_20^30 ∫_20^{50−x−y} (50 − x − y) dy dx
E. (6/125,000) ∫_20^50 ∫_20^{50−x−y} (50 − x − y) dy dx
(00F–1–20) (Sample–P–89)

A17. An insurance company insures a large number of drivers. Let X be the random variable representing the company's losses under collision insurance and let Y represent the company's losses under liability insurance. X and Y have the following joint density function:
    f(x, y) = (2x + 2 − y)/4 for 0 < x < 1 and 0 < y < 2; 0 otherwise
What is the probability that the total loss is at least 1?
A. .33   B. .38   C. .41   D. .71   E. .75   (00F–1–36) (Sample–P–91)

A18. A device runs until either of two components fails, at which point the device stops running. The joint density function of the lifetimes of the two components, both measured in hours, is:
    f(x, y) = (x + y)/27 for 0 < x < 3 and 0 < y < 3
Calculate the probability that the device fails during its first hour of operation.
A. .04   B. .41   C. .44   D. .59   E. .96   (03S–1–16) (Sample–P–78)

A19. Let X be the age of an insured automobile involved in an accident. Let Y be the length of time the owner has insured the automobile at the time of the accident. X and Y have joint probability density function:
    f(x, y) = (10 − xy²)/64 for 2 ≤ x ≤ 10 and 0 ≤ y ≤ 1; 0 otherwise
Calculate the expected age of an insured automobile involved in an accident.
A. 4.9   B. 5.2   C. 5.8   D. 6.0   E. 6.4   (03S–1–24) (Sample–P–121)

A20. A device runs until either of two components fails, at which point the device stops running. If the joint density function of the lifetimes of the two components, both measured in hours, is the following, calculate the probability that the device fails during its first hour of operation.
    f(x, y) = (x + y)/8 for 0 < x < 2 and 0 < y < 2
A. .125   B. .141   C. .391   D. .625   E. .875   (Sample–P–77)


A16. P(X > 20, Y > 20) = P(20 < X < 30, 20 < Y < 50 − X)
     P(X > 20, Y > 20) = ∫_20^30 ∫_20^{50−x} (6/125,000)(50 − x − y) dy dx = (6/125,000) ∫_20^30 ∫_20^{50−x} (50 − x − y) dy dx
     Answer: B
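Although A16 only asks for the correct expression, the integral in choice B can be evaluated numerically as a sanity check (not part of the original solution), assuming SciPy is available; it works out to 0.008.

```python
# Illustrative evaluation of the choice-B integral from A16.
from scipy.integrate import dblquad

prob, _ = dblquad(lambda y, x: (6 / 125000) * (50 - x - y), 20, 30, 20, lambda x: 50 - x)
print(prob)  # 0.008
```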

A17. P(X + Y ≥ 1) = P(Y ≥ 1 − X) = ∫_0^1 ∫_{1−x}^2 (2x + 2 − y)/4 dy dx = ∫_0^1 [(2x + 2)y − y²/2]_{y=1−x}^{y=2} /4 dx
     P(X + Y ≥ 1) = (1/4) ∫_0^1 {(4x + 4 − 2) − [(2x + 2)(1 − x) − (1 − x)²/2]} dx
     P(X + Y ≥ 1) = ∫_0^1 [(5/2)x² + 3x + 1/2]/4 dx = [(5/6)x³ + (3/2)x² + x/2]/4 |_0^1 = .70833
     Answer: D
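A numeric cross-check of A17 (not part of the original solution), assuming SciPy is available.

```python
# Illustrative check for A17: P(X + Y >= 1) under f(x, y) = (2x + 2 - y)/4.
from scipy.integrate import dblquad

prob, _ = dblquad(lambda y, x: (2 * x + 2 - y) / 4, 0, 1, lambda x: 1 - x, 2)
print(prob, 17 / 24)  # both print 0.70833...
```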

A18. P(X < 1) = P(Y < 1) = ∫_0^3 ∫_0^1 (x + y)/27 dx dy = ∫_0^3 (1/27)[x²/2 + xy]_{x=0}^{x=1} dy
     P(X < 1) = ∫_0^3 (1/27)(1/2 + y) dy = (1/27)[y/2 + y²/2]_0^3 = (1/27)(3/2 + 9/2) = 2/9
     P(X < 1, Y < 1) = ∫_0^1 ∫_0^1 (x + y)/27 dx dy = ∫_0^1 (1/27)[x²/2 + xy]_{x=0}^{x=1} dy
     P(X < 1, Y < 1) = ∫_0^1 (1/27)(1/2 + y) dy = (1/27)[y/2 + y²/2]_0^1 = (1/27)(1/2 + 1/2) = 1/27
     The device fails during its first hour if X < 1 or Y < 1, so by inclusion–exclusion:
     P([X < 1] or [Y < 1]) = P(X < 1) + P(Y < 1) − P(X < 1, Y < 1) = 2/9 + 2/9 − 1/27 = 11/27 = .407
     Answer: B
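The same answer can be reached through the complement (both components survive the first hour); a hedged numeric sketch assuming SciPy:

```python
# Illustrative check for A18: 1 - P(X >= 1 and Y >= 1).
from scipy.integrate import dblquad

both_survive, _ = dblquad(lambda y, x: (x + y) / 27, 1, 3, 1, 3)
print(1 - both_survive, 11 / 27)  # both print 0.4074...
```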

A19. E[X] = ∫_2^10 ∫_0^1 (x/64)(10 − xy²) dy dx = ∫_2^10 (x/64)[10y − xy³/3]_{y=0}^{y=1} dx = ∫_2^10 (x/64)(10 − x/3) dx
     E[X] = (1/64)[5x² − x³/9]_2^10 = (1/64)(500 − 1,000/9 − 20 + 8/9) = 52/9 = 5.78
     Answer: C
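A numeric cross-check of the A19 expectation (not part of the original solution), assuming SciPy is available.

```python
# Illustrative check for A19: E[X] under f(x, y) = (10 - x*y^2)/64.
from scipy.integrate import dblquad

ex, _ = dblquad(lambda y, x: x * (10 - x * y**2) / 64, 2, 10, 0, 1)
print(ex, 52 / 9)  # both print 5.777...
```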

A20. Proceeding as in A18:
     P(X < 1) = P(Y < 1) = ∫_0^2 (1/8)(1/2 + y) dy = (1/8)[y/2 + y²/2]_0^2 = (1/8)(2/2 + 4/2) = 3/8
     P(X < 1, Y < 1) = (1/8)(1/2 + 1/2) = 1/8
     P([X < 1] or [Y < 1]) = 3/8 + 3/8 − 1/8 = 5/8 = .625
     Answer: D


A21. A device has two components. The device fails if either component fails. The joint density function of the lifetimes of the components, measured in hours, is f(s, t), where 0 < s < 1 and 0 < t < 1. What is the probability that the device fails during the first half hour of operation?
A. ∫_0^.5 ∫_0^.5 f(s, t) ds dt
B. ∫_0^1 ∫_0^.5 f(s, t) ds dt
C. ∫_.5^1 ∫_.5^1 f(s, t) ds dt
D. ∫_0^.5 ∫_0^1 f(s, t) ds dt + ∫_0^1 ∫_0^.5 f(s, t) ds dt
E. ∫_0^.5 ∫_.5^1 f(s, t) ds dt + ∫_0^1 ∫_0^.5 f(s, t) ds dt
(Sample–P–79)

A22. Let T1 be the time between a car accident and reporting a claim to the insurance company. Let T2 be the time between the report of the claim and payment of the claim. The joint density function of T1 and T2, f(t1, t2), is constant over the region 0 < t1 < 6, 0 < t2 < 6, t1 + t2 < 10, and 0 otherwise. Determine E[T1 + T2], the expected time between a car accident and payment of the claim.
A. 4.9   B. 5.0   C. 5.7   D. 6.0   E. 6.7   (Sample–P–94)

A23. Let T1 and T2 represent the lifetimes in hours of two linked components in an electronic device. The joint density function for T1 and T2 is uniform over the region defined by 0 ≤ t1 ≤ t2 ≤ L, where L is a positive constant. Determine the expected value of the sum of the squares of T1 and T2.
A. L²/3   B. L²/2   C. 2L²/3   D. 3L²/4   E. L²   (Sample–P–97)

A24. Let X1, X2, X3 be a random sample from a discrete distribution with probability function
    p(0) = 1/3, p(1) = 2/3
Determine the moment generating function, M(t), of Y = X1X2X3.
A. 19/27 + 8e^t/27   B. 1 + 2e^t   C. (1/3 + 2e^t/3)³   D. 1/27 + 8e^(3t)/27   E. 1/3 + 2e^(3t)/3
(Sample–P–98)

A21. The probability the device fails in the first half hour is the sum of the probability that component S fails in the first half hour and component T fails in the second half hour (case 1), the probability that component T fails in the first half hour and component S fails in the second half hour (case 2), and the probability that both fail in the first half hour (case 3).
     A. F – This is the probability that both components fail in the first half hour (case 3) but excludes the probabilities of cases 1 and 2.
     B. F – This is the probability that component S fails in the first half hour (cases 1 and 3). It excludes the probability of case 2.
     C. F – This is the probability that both components fail in the second half hour.
     D. F – This is the sum of the probability that component T fails in the first half hour (cases 2 and 3) and the probability that component S fails in the first half hour (cases 1 and 3), and thus double counts the probability that both fail in the first half hour (case 3).
     E. T – This is the sum of the probability that component T fails in the first half hour and component S fails in the second half hour (case 2) and the probability that component S fails in the first half hour (cases 1 and 3).
     Answer: E

A22. Let c be the constant value of the density over the region. Since the total probability is one:
     1 = ∫∫ f(t1, t2) = ∫_0^4 ∫_0^6 c dt2 dt1 + ∫_4^6 ∫_0^{10−t1} c dt2 dt1 = 24c + ∫_4^6 c(10 − t1) dt1 = 24c + c[10t1 − t1²/2]_4^6 = 24c + c[(10)(6 − 4) − (36 − 16)/2] = 34c
     c = 1/34
     E[T1] = ∫_0^4 ∫_0^6 t1/34 dt2 dt1 + ∫_4^6 ∫_0^{10−t1} t1/34 dt2 dt1 = ∫_0^4 [t1t2/34]_{t2=0}^{t2=6} dt1 + ∫_4^6 [t1t2/34]_{t2=0}^{t2=10−t1} dt1
     E[T1] = ∫_0^4 3t1/17 dt1 + ∫_4^6 t1(10 − t1)/34 dt1 = [3t1²/34]_0^4 + [(15 − t1)t1²/102]_4^6
     E[T1] = 24/17 + 162/51 − 88/51 = 146/51
     By the symmetry of the region, E[T2] = E[T1], so E[T1 + T2] = 2E[T1] = (2)(146)/51 = 5.72549
     Answer: C

A23. 1 = ∫∫ f(t1, t2) = ∫_0^L ∫_0^{t2} c dt1 dt2 = ∫_0^L ct2 dt2 = cL²/2, so c = 2/L²
     E[T1² + T2²] = ∫_0^L ∫_0^{t2} (2/L²)(t1² + t2²) dt1 dt2 = ∫_0^L (2/L²)[t1³/3 + t2²t1]_{t1=0}^{t1=t2} dt2 = ∫_0^L [8/(3L²)]t2³ dt2 = [2/(3L²)]t2⁴ |_0^L = 2L²/3
     Answer: C

A24. P(Y = 1) = (2/3)³ = 8/27
     P(Y = 0) = 1 − P(Y = 1) = 1 − 8/27 = 19/27
     M(t) = P(Y = 0) + P(Y = 1)e^t = 19/27 + 8e^t/27
     Answer: A
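The A24 m.g.f. can also be checked by enumerating the eight equally structured outcomes of (X1, X2, X3); a small sketch (not part of the original solution) using only the standard library, with t = 0.7 as an arbitrary test value:

```python
# Illustrative check for A24: compare the enumerated E[e^(tY)] with 19/27 + 8e^t/27.
from itertools import product
import math

t = 0.7  # arbitrary test point
p = {0: 1 / 3, 1: 2 / 3}
mgf = sum(p[a] * p[b] * p[c] * math.exp(t * a * b * c) for a, b, c in product((0, 1), repeat=3))
print(mgf, 19 / 27 + 8 * math.exp(t) / 27)  # the two values agree
```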


A25. An auto insurance policy will pay for damage to both the policyholder's car and the other driver's car in the event that the policyholder is responsible for an accident. The size of the payment for damage to the policyholder's car (X) has a marginal density function of 1 for 0 < x < 1. Given X = x, the size of the payment for damage to the other driver's car (Y) has conditional density of 1 for x < y < x + 1. If the policyholder is responsible for an accident, what is the probability that the payment for damage to the other driver's car will be greater than .500?
A. 3/8   B. 1/2   C. 3/4   D. 7/8   E. 15/16   (Sample–P–119)

A26. Let X and Y be identically distributed independent random variables such that the moment generating function of (X + Y) is
    M(t) = .09e^(−2t) + .24e^(−t) + .34 + .24e^t + .09e^(2t), for −∞ < t < ∞.
Calculate P[X ≤ 0].
A. .33   B. .34   C. .50   D. .67   E. .70   (Sample–P–137)

A27. A machine consists of two components, whose lifetimes have the joint density function
    f(x, y) = 1/50 for x > 0, y > 0, and x + y < 10; 0 otherwise
The machine operates until both components fail. Calculate the expected operational time of the machine.
A. 1.7   B. 2.5   C. 3.3   D. 5.0   E. 6.7   (Sample–P–138)

A28. A client spends X minutes in an insurance agent's waiting room and Y minutes meeting the agent. The joint density function of X and Y can be modeled by
    f(x, y) = (1/800)e^(−x/40)e^(−y/20) for x > 0, y > 0; 0 otherwise
Which of the following expressions represents the probability that a client spends less than 60 minutes at the agent's office?
A. (1/800) ∫_0^40 ∫_0^20 e^(−x/40)e^(−y/20) dy dx
B. (1/800) ∫_0^40 ∫_0^{20−x} e^(−x/40)e^(−y/20) dy dx
C. (1/800) ∫_0^40 ∫_0^{40−x} e^(−x/40)e^(−y/20) dy dx
D. (1/800) ∫_0^60 ∫_0^60 e^(−x/40)e^(−y/20) dy dx
E. (1/800) ∫_0^60 ∫_0^{60−x} e^(−x/40)e^(−y/20) dy dx
(Sample–P–144)


A25. P(Y > .5) = ∫_0^.5 ∫_.5^{x+1} dy dx + ∫_.5^1 ∫_x^{x+1} dy dx = ∫_0^.5 [y]_{y=.5}^{y=x+1} dx + ∫_.5^1 [y]_{y=x}^{y=x+1} dx
     P(Y > .5) = ∫_0^.5 (x + .5) dx + ∫_.5^1 dx = [x²/2 + .5x]_0^.5 + .5 = .375 + .5 = .875
     Answer: D
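A numeric cross-check of A25 (not part of the original solution), assuming SciPy is available; max(x, 0.5) as the lower y-limit captures the event Y > .5 over the whole strip.

```python
# Illustrative check for A25: joint density 1 on 0 < x < 1, x < y < x + 1.
from scipy.integrate import dblquad

prob, _ = dblquad(lambda y, x: 1.0, 0, 1, lambda x: max(x, 0.5), lambda x: x + 1)
print(prob, 7 / 8)  # both print 0.875
```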

A26. Since X and Y are independent and identically distributed, the given m.g.f. is the square of the m.g.f. of one of the variables. Thus we get:
     MX(t) = .3e^(−t) + .4 + .3e^t
     This is the moment generating function of a discrete random variable with probability distribution
     p(−1) = .3, p(0) = .4, p(1) = .3
     P[X ≤ 0] = .3 + .4 = .7
     Answer: E

A27. [Diagram: the region of f(x, y) is the triangle with vertices (0, 0), (10, 0), and (0, 10), split by the line y = x through (5, 5).]
     The lower two triangles contain points where x > y and the upper two triangles contain points where y > x. Since the distribution is uniform and the area of each set of two triangles equals 25, the conditional density on each half is 1/25. The operational time is the larger of the two lifetimes, so when x > y it is calculated as follows:
     E(X | X > Y) = ∫_0^5 ∫_0^x x/25 dy dx + ∫_5^10 ∫_0^{10−x} x/25 dy dx = ∫_0^5 x²/25 dx + ∫_5^10 (10x − x²)/25 dx
     E(X | X > Y) = [x³/75]_0^5 + [(5x² − x³/3)/25]_5^10 = (1/25){125/3 + [(5)(100) − 1,000/3] − [(5)(25) − 125/3]} = (1/25)(125) = 5
     An identical result occurs in the case where y > x, so the expected operational time is 5.
     Answer: D

A28.

The upper limit for the second integral is the greatest amount of time that can be spent in the meeting. This equals 60 minutes less the amount of time spent in the waiting room, i.e., 60 – x. Answer: E
