Chapter 4 Continuous Random Variables

曾志成, Department of Electrical Engineering, National Ilan University ([email protected])

Figure 4.1: The random pointer on a disk of circumference 1. (Left: the pointer position X; right: the circumference divided into n arcs labeled Y = 1, 2, . . . , n.)

Example 4.1 Problem Suppose we have a wheel of circumference one meter and we mark a point on the perimeter at the top of the wheel. In the center of the wheel is a radial pointer that we spin. After spinning the pointer, we measure the distance, X meters, around the circumference of the wheel going clockwise from the marked point to the pointer position as shown in Figure 4.1. Clearly, 0 ≤ X < 1. Also, it is reasonable to believe that if the spin is hard enough, the pointer is just as likely to arrive at any part of the circle as at any other. For a given x, what is the probability P[X = x]?


Example 4.1 Solution
This problem is surprisingly difficult. However, given that we have developed methods for discrete random variables in Chapter 3, a reasonable approach is to find a discrete approximation to X. As shown on the right side of Figure 4.1, we can mark the perimeter with n equal-length arcs numbered 1 to n and let Y denote the number of the arc in which the pointer stops. Y is a discrete random variable with range SY = {1, 2, . . . , n}. Since all parts of the wheel are equally likely, all arcs have the same probability. Thus the PMF of Y is

    PY(y) = 1/n   for y = 1, 2, . . . , n,   and PY(y) = 0 otherwise.    (4.1)

From the wheel on the right side of Figure 4.1, we can deduce that if X = x, then Y = ⌈nx⌉, where the notation ⌈a⌉ denotes the smallest integer greater than or equal to a. Note that the event {X = x} ⊂ {Y = ⌈nx⌉}, which implies that

    P[X = x] ≤ P[Y = ⌈nx⌉] = 1/n.    (4.2)

[Continued]

Example 4.1 Solution (Continued 2)
We observe this is true no matter how finely we divide up the wheel. To find P[X = x], we consider larger and larger values of n. As n increases, the arcs on the circle decrease in size, approaching a single point. The probability of the pointer arriving in any particular arc decreases until we have, in the limit,

    P[X = x] ≤ lim_{n→∞} P[Y = ⌈nx⌉] = lim_{n→∞} 1/n = 0.    (4.3)

This demonstrates that P[X = x] ≤ 0. The first axiom of probability states that P[X = x] ≥ 0. Therefore, P[X = x] = 0. This is true regardless of the outcome x. It follows that every outcome has probability zero.
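As a quick numerical companion to this limiting argument (an added sketch, not part of the original slides; the point x = 0.37 is an arbitrary choice), the Python snippet below simulates the spinner and estimates P[Y = ⌈nx⌉] for increasing n.

    import numpy as np

    rng = np.random.default_rng(0)
    spins = rng.random(1_000_000)          # simulated pointer positions X, uniform on [0, 1)
    x = 0.37                               # an arbitrary point on the circumference

    for n in [10, 100, 1000, 10000]:
        # Y = ceil(n X) is the arc index; {X = x} is contained in {Y = ceil(n x)},
        # and the probability of that arc is 1/n, which vanishes as n grows.
        arc = np.ceil(n * spins).astype(int)
        print(n, np.mean(arc == int(np.ceil(n * x))))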

Section 4.2

The Cumulative Distribution Function

Definition 4.1 Cumulative Distribution Function (CDF)
The cumulative distribution function (CDF) of random variable X is

    FX(x) = P[X ≤ x].

Theorem 4.1
For any random variable X,

(a) FX(−∞) = 0,

(b) FX(∞) = 1,

(c) P[x1 < X ≤ x2] = FX(x2) − FX(x1).

Definition 4.2 Continuous Random Variable X is a continuous random variable if the CDF FX(x) is a continuous function.


Example 4.2 Problem In the wheel-spinning experiment of Example 4.1, find the CDF of X.


Example 4.2 Solution
We begin by observing that any outcome x ∈ SX = [0, 1). This implies that FX(x) = 0 for x < 0, and FX(x) = 1 for x ≥ 1. To find the CDF for x between 0 and 1, we consider the event {X ≤ x}, with x growing from 0 to 1. Each event corresponds to an arc on the circle in Figure 4.1. The arc is small when x ≈ 0 and it includes nearly the whole circle when x ≈ 1. FX(x) = P[X ≤ x] is the probability that the pointer stops somewhere in the arc. This probability grows from 0 to 1 as the arc increases to include the whole circle. Given our assumption that the pointer has no preferred stopping places, it is reasonable to expect the probability to grow in proportion to the fraction of the circle occupied by the arc {X ≤ x}. This fraction is simply x. To be more formal, we can refer to Figure 4.1 and note that with the circle divided into n arcs,

    {Y ≤ ⌈nx⌉ − 1} ⊂ {X ≤ x} ⊂ {Y ≤ ⌈nx⌉}.    (4.4)

Therefore, the probabilities of the three events are related by

    FY(⌈nx⌉ − 1) ≤ FX(x) ≤ FY(⌈nx⌉).    (4.5)

[Continued]

Example 4.2 Solution (Continued 2)
Note that Y is a discrete random variable with CDF

    FY(y) = 0     for y < 0,
    FY(y) = k/n   for (k − 1)/n < y ≤ k/n, k = 1, 2, . . . , n,
    FY(y) = 1     for y > 1.    (4.6)

Thus for x ∈ [0, 1) and for all n, we have

    (⌈nx⌉ − 1)/n ≤ FX(x) ≤ ⌈nx⌉/n.    (4.7)

In Problem 4.2.3, we ask the reader to verify that lim_{n→∞} ⌈nx⌉/n = x. This implies that as n → ∞, both fractions approach x. The CDF of X is

    FX(x) = 0   for x < 0,
    FX(x) = x   for 0 ≤ x < 1,
    FX(x) = 1   for x ≥ 1.    (4.8)

[Figure: plot of FX(x), rising linearly from 0 at x = 0 to 1 at x = 1.]
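A minimal simulation sketch (added for illustration, not from the slides): the empirical CDF of the simulated spinner should track FX(x) = x on [0, 1).

    import numpy as np

    rng = np.random.default_rng(0)
    spins = rng.random(1_000_000)          # simulated pointer positions X

    for x in [0.1, 0.25, 0.5, 0.9]:
        print(x, np.mean(spins <= x))      # empirical P[X <= x] ≈ FX(x) = x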

Quiz 4.2
The cumulative distribution function of the random variable Y is

    FY(y) = 0     for y < 0,
    FY(y) = y/4   for 0 ≤ y ≤ 4,
    FY(y) = 1     for y > 4.    (4.9)

Sketch the CDF of Y and calculate the following probabilities:

(a) P[Y ≤ −1]

(b) P[Y ≤ 1]

(c) P[2 < Y ≤ 3]

(d) P[Y > 1.5]

Quiz 4.2 Solution
The CDF of Y is

    FY(y) = 0     for y < 0,
    FY(y) = y/4   for 0 ≤ y ≤ 4,
    FY(y) = 1     for y > 4.    (1)

[Figure: plot of FY(y), rising linearly from 0 at y = 0 to 1 at y = 4.]

From the CDF FY(y), we can calculate the probabilities:

(a) P[Y ≤ −1] = FY(−1) = 0

(b) P[Y ≤ 1] = FY(1) = 1/4

(c) P[2 < Y ≤ 3] = FY(3) − FY(2) = 3/4 − 2/4 = 1/4

(d) P[Y > 1.5] = 1 − P[Y ≤ 1.5] = 1 − FY(1.5) = 1 − (1.5)/4 = 5/8

Section 4.3

Probability Density Function


Figure 4.2: The graph of an arbitrary CDF FX(x). (The plot marks increments p1 and p2 of FX over small intervals of width Δ near x1 and x2.)

Definition 4.3 Probability Density Function (PDF)
The probability density function (PDF) of a continuous random variable X is

    fX(x) = dFX(x)/dx.

Example 4.3 Problem Figure 4.3 depicts the PDF of a random variable X that describes the voltage at the receiver in a modem. What are probable values of X?


Example 4.3 Solution Note that there are two places where the PDF has high values and that it is low elsewhere. The PDF indicates that the random variable is likely to be near −5 V (corresponding to the symbol 0 transmitted) and near +5 V (corresponding to a 1 transmitted). Values far from ±5 V (due to strong distortion) are possible but much less likely.


Figure 4.3: The PDF of the modem receiver voltage X, with peaks near x = −5 and x = +5.

Theorem 4.2
For a continuous random variable X with PDF fX(x),

(a) fX(x) ≥ 0 for all x,

(b) FX(x) = ∫_{−∞}^{x} fX(u) du,

(c) ∫_{−∞}^{∞} fX(x) dx = 1.

Proof: Theorem 4.2 The first statement is true because FX(x) is a nondecreasing function of x and therefore its derivative, fX(x), is nonnegative. The second fact follows directly from the definition of fX(x) and the fact that FX(−∞) = 0. The third statement follows from the second one and Theorem 4.1(b).


Theorem 4.3

    P[x1 < X ≤ x2] = ∫_{x1}^{x2} fX(x) dx.

Proof: Theorem 4.3
From Theorem 4.1(c) and Theorem 4.2(b),

    P[x1 < X ≤ x2] = FX(x2) − FX(x1)
                   = ∫_{−∞}^{x2} fX(x) dx − ∫_{−∞}^{x1} fX(x) dx
                   = ∫_{x1}^{x2} fX(x) dx.    (4.13)

Figure 4.4: The PDF and CDF of X; the area under fX(x) between x1 and x2 equals FX(x2) − FX(x1).

Example 4.4 Problem For the experiment in Examples 4.1 and 4.2, find the PDF of X and the probability of the event {1/4 < X ≤ 3/4}.


Example 4.4 Solution
Taking the derivative of the CDF in Equation (4.8), fX(x) = 0 when x < 0 or x ≥ 1. For x between 0 and 1 we have fX(x) = dFX(x)/dx = 1. Thus the PDF of X is

    fX(x) = 1   for 0 ≤ x < 1,
    fX(x) = 0   otherwise.    (4.15)

The fact that the PDF is constant over the range of possible values of X reflects the fact that the pointer has no favorite stopping places on the circumference of the circle. To find the probability that X is between 1/4 and 3/4, we can use either Theorem 4.1 or Theorem 4.3. Thus

    P[1/4 < X ≤ 3/4] = FX(3/4) − FX(1/4) = 1/2,    (4.16)

and equivalently,

    P[1/4 < X ≤ 3/4] = ∫_{1/4}^{3/4} fX(x) dx = ∫_{1/4}^{3/4} dx = 1/2.    (4.17)

Example 4.5 Problem
Consider an experiment that consists of spinning the pointer in Example 4.1 three times and observing Y meters, the maximum value of X in the three spins. In Example 8.3, we show that the CDF of Y is

    FY(y) = 0    for y < 0,
    FY(y) = y³   for 0 ≤ y ≤ 1,
    FY(y) = 1    for y > 1.    (4.18)

Find the PDF of Y and the probability that Y is between 1/4 and 3/4.

Example 4.5 Solution
We apply Definition 4.3 to the CDF FY(y). When FY(y) is piecewise differentiable, we take the derivative of each piece:

    fY(y) = dFY(y)/dy = 3y²   for 0 < y ≤ 1,
    fY(y) = 0                 otherwise.    (4.19)

Note that the PDF has values between 0 and 3. Its integral between any pair of numbers is less than or equal to 1. The graph of fY(y) shows that there is a higher probability of finding Y at the right side of the range of possible values than at the left side. This reflects the fact that the maximum of three spins produces higher numbers than individual spins. Either Theorem 4.1 or Theorem 4.3 can be used to calculate the probability of observing Y between 1/4 and 3/4:

    P[1/4 < Y ≤ 3/4] = FY(3/4) − FY(1/4) = (3/4)³ − (1/4)³ = 13/32,    (4.20)

and equivalently,

    P[1/4 < Y ≤ 3/4] = ∫_{1/4}^{3/4} fY(y) dy = ∫_{1/4}^{3/4} 3y² dy = 13/32.    (4.21)

Note that this probability is less than 1/2, which is the probability of 1/4 < X ≤ 3/4 calculated in Example 4.4 for one spin of the pointer.
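The following simulation sketch (added for illustration) checks both results: P[1/4 < X ≤ 3/4] = 1/2 for one spin and P[1/4 < Y ≤ 3/4] = 13/32 for the maximum of three spins.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.random((1_000_000, 3))               # three independent spins per trial
    y = x.max(axis=1)                            # Y = maximum of the three spins

    # One spin (Example 4.4): P[1/4 < X <= 3/4] = 1/2.
    print(np.mean((x[:, 0] > 0.25) & (x[:, 0] <= 0.75)))
    # Three spins: P[1/4 < Y <= 3/4] = (3/4)^3 - (1/4)^3 = 13/32 = 0.40625.
    print(np.mean((y > 0.25) & (y <= 0.75)))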

Quiz 4.3
Random variable X has probability density function

    fX(x) = c x e^(−x/2)   for x ≥ 0,
    fX(x) = 0              otherwise.    (4.23)

Sketch the PDF and find the following:

(a) the constant c

(b) the CDF FX(x)

(c) P[0 ≤ X ≤ 4]

(d) P[−2 ≤ X ≤ 2]

Quiz 4.3 Solution
(a) First we will find the constant c and then we will sketch the PDF. To find c, we use the fact that

    1 = ∫_{−∞}^{∞} fX(x) dx = ∫_{0}^{∞} c x e^(−x/2) dx.    (1)

We evaluate this integral using integration by parts:

    1 = [−2c x e^(−x/2)]_{0}^{∞} + ∫_{0}^{∞} 2c e^(−x/2) dx
      = 0 + [−4c e^(−x/2)]_{0}^{∞} = 4c.    (2)

Thus c = 1/4 and X has the Erlang (n = 2, λ = 1/2) PDF

    fX(x) = (x/4) e^(−x/2)   for x ≥ 0,
    fX(x) = 0                otherwise.

(b) To find the CDF FX(x), we first note X is a nonnegative random variable so that FX(x) = 0 for all x < 0. For x ≥ 0, [Continued]

Quiz 4.3 Solution (Continued 2)

    FX(x) = ∫_{0}^{x} fX(y) dy = ∫_{0}^{x} (y/4) e^(−y/2) dy
          = [−(y/2) e^(−y/2)]_{0}^{x} + ∫_{0}^{x} (1/2) e^(−y/2) dy
          = 1 − (x/2) e^(−x/2) − e^(−x/2).    (3)

The complete expression for the CDF is

    FX(x) = 1 − (x/2 + 1) e^(−x/2)   for x ≥ 0,
    FX(x) = 0                        otherwise.

(c) From the CDF FX(x),

    P[0 ≤ X ≤ 4] = FX(4) − FX(0) = 1 − 3e^(−2).    (4)

(d) Similarly,

    P[−2 ≤ X ≤ 2] = FX(2) − FX(−2) = 1 − 2e^(−1).    (5)
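A simple numerical check (an added sketch, not part of the quiz) confirms the normalization constant and the probabilities in parts (c) and (d) by integrating the PDF on a fine grid.

    import numpy as np

    # Numerically integrate fX(x) = (x/4) e^{-x/2} to check parts (a), (c), (d).
    dx = 1e-4
    x = np.arange(0, 50, dx)
    f = (x / 4) * np.exp(-x / 2)

    print((f * dx).sum())                                # total area ≈ 1, consistent with c = 1/4
    print((f[x <= 4] * dx).sum(), 1 - 3 * np.exp(-2))    # P[0 <= X <= 4] vs 1 - 3e^{-2}
    print((f[x <= 2] * dx).sum(), 1 - 2 * np.exp(-1))    # P[-2 <= X <= 2] vs 1 - 2e^{-1}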

Section 4.4

Expected Values


Definition 4.4 Expected Value
The expected value of a continuous random variable X is

    E[X] = ∫_{−∞}^{∞} x fX(x) dx.

Example 4.6 Problem
In Example 4.4, we found that the stopping point X of the spinning wheel experiment was a uniform random variable with PDF

    fX(x) = 1   for 0 ≤ x < 1,
    fX(x) = 0   otherwise.    (4.28)

Find the expected stopping point E[X] of the pointer.

Example 4.6 Solution

    E[X] = ∫_{−∞}^{∞} x fX(x) dx = ∫_{0}^{1} x dx = 1/2 meter.    (4.29)

With no preferred stopping points on the circle, the average stopping point of the pointer is exactly halfway around the circle.

Example 4.7
Let X be a uniform random variable with PDF

    fX(x) = 1   for 0 ≤ x < 1,
    fX(x) = 0   otherwise.    (4.30)

Let W = g(X) = 0 if X ≤ 1/2, and W = g(X) = 1 if X > 1/2. W is a discrete random variable with range SW = {0, 1}.

Theorem 4.4
The expected value of a function, g(X), of random variable X is

    E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx.

Theorem 4.5
For any random variable X,

(a) E[X − µX] = 0,

(b) E[aX + b] = a E[X] + b,

(c) Var[X] = E[X²] − µX²,

(d) Var[aX + b] = a² Var[X].

Example 4.8 Problem Find the variance and standard deviation of the pointer position in Example 4.1.


Example 4.8 Solution
To compute Var[X], we use Theorem 4.5(c): Var[X] = E[X²] − µX². We calculate E[X²] directly from Theorem 4.4 with g(X) = X²:

    E[X²] = ∫_{−∞}^{∞} x² fX(x) dx = ∫_{0}^{1} x² dx = 1/3 m².    (4.32)

In Example 4.6, we have E[X] = 1/2. Thus Var[X] = 1/3 − (1/2)² = 1/12, and the standard deviation is σX = √Var[X] = 1/√12 = 0.289 meters.
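A short Monte Carlo sketch (added) reproduces E[X] = 1/2, E[X²] = 1/3, Var[X] = 1/12, and σX ≈ 0.289 for the uniform stopping point.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.random(1_000_000)          # uniform stopping points on [0, 1)

    print(x.mean())                    # ≈ 1/2   (Example 4.6)
    print((x**2).mean())               # ≈ 1/3   (second moment, Eq. 4.32)
    print(x.var(), 1/12)               # ≈ 1/12  (variance)
    print(x.std(), 1/np.sqrt(12))      # ≈ 0.289 (standard deviation)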

Quiz 4.4
The probability density function of the random variable Y is

    fY(y) = 3y²/2   for −1 ≤ y ≤ 1,
    fY(y) = 0       otherwise.    (4.33)

Sketch the PDF and find the following:

(a) the expected value E[Y]

(b) the second moment E[Y²]

(c) the variance Var[Y]

(d) the standard deviation σY

Quiz 4.4 Solution
The PDF of Y is

    fY(y) = 3y²/2   for −1 ≤ y ≤ 1,
    fY(y) = 0       otherwise.    (1)

(a) The expected value of Y is

    E[Y] = ∫_{−∞}^{∞} y fY(y) dy = ∫_{−1}^{1} (3/2) y³ dy = [(3/8) y⁴]_{−1}^{1} = 0.    (2)

Note that the above calculation wasn't really necessary because E[Y] = 0 whenever the PDF fY(y) is an even function, i.e., fY(y) = fY(−y).

(b) The second moment of Y is

    E[Y²] = ∫_{−∞}^{∞} y² fY(y) dy = ∫_{−1}^{1} (3/2) y⁴ dy = [(3/10) y⁵]_{−1}^{1} = 3/5.    (3)

(c) The variance of Y is

    Var[Y] = E[Y²] − (E[Y])² = 3/5.    (4)

(d) The standard deviation of Y is σY = √Var[Y] = √(3/5).

Section 4.5

Families of Continuous Random Variables


Definition 4.5 Uniform Random Variable
X is a uniform (a, b) random variable if the PDF of X is

    fX(x) = 1/(b − a)   for a ≤ x < b,
    fX(x) = 0           otherwise,

where the two parameters are b > a.

Theorem 4.6
If X is a uniform (a, b) random variable,

• The CDF of X is

    FX(x) = 0                 for x ≤ a,
    FX(x) = (x − a)/(b − a)   for a < x ≤ b,
    FX(x) = 1                 for x > b.

• The expected value of X is E[X] = (b + a)/2.

• The variance of X is Var[X] = (b − a)²/12.

Example 4.9 Problem The phase angle, Θ, of the signal at the input to a modem is uniformly distributed between 0 and 2π radians. What are the PDF, CDF, expected value, and variance of Θ?


Example 4.9 Solution
From the problem statement, we identify the parameters of the uniform (a, b) random variable as a = 0 and b = 2π. Therefore the PDF and CDF of Θ are

    fΘ(θ) = 1/(2π)   for 0 ≤ θ < 2π,   and fΘ(θ) = 0 otherwise;

    FΘ(θ) = 0        for θ ≤ 0,
    FΘ(θ) = θ/(2π)   for 0 < θ ≤ 2π,
    FΘ(θ) = 1        for θ > 2π.    (4.34)

The expected value is E[Θ] = b/2 = π radians, and the variance is Var[Θ] = (2π)²/12 = π²/3 rad².

Theorem 4.7
Let X be a uniform (a, b) random variable, where a and b are both integers. Let K = ⌈X⌉. Then K is a discrete uniform (a + 1, b) random variable.

Proof: Theorem 4.7
Recall that for any x, ⌈x⌉ is the smallest integer greater than or equal to x. It follows that the event {K = k} = {k − 1 < X ≤ k}. Therefore,

    P[K = k] = PK(k) = ∫_{k−1}^{k} fX(x) dx
             = 1/(b − a)   for k = a + 1, a + 2, . . . , b,
             = 0           otherwise.    (4.35)

This expression for PK(k) conforms to Definition 3.8 of a discrete uniform (a + 1, b) PMF.
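The sketch below (added for illustration; the endpoints a = 2, b = 7 are arbitrary) demonstrates Theorem 4.7: rounding a uniform (a, b) sample up to the next integer yields a discrete uniform (a + 1, b) PMF.

    import numpy as np

    rng = np.random.default_rng(0)
    a, b = 2, 7                                  # integer endpoints
    x = rng.uniform(a, b, 1_000_000)             # X uniform on [a, b)
    k = np.ceil(x).astype(int)                   # K = ceil(X)

    values, counts = np.unique(k, return_counts=True)
    print(values)                                # a+1, ..., b  ->  3 4 5 6 7
    print(counts / counts.sum())                 # each ≈ 1/(b - a) = 0.2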

Definition 4.6 Exponential Random Variable
X is an exponential (λ) random variable if the PDF of X is

    fX(x) = λ e^(−λx)   for x ≥ 0,
    fX(x) = 0           otherwise,

where the parameter λ > 0.

Example 4.10 Problem
The duration T of a telephone call is often modeled as an exponential (λ) random variable. If λ = 1/3, what is E[T], the expected duration of a telephone call? What are the variance and standard deviation of T? What is the probability that a call duration is within ±1 standard deviation of the expected call duration?

Example 4.10 Solution
From Definition 4.6, T has PDF

    fT(t) = (1/3) e^(−t/3)   for t ≥ 0,
    fT(t) = 0                otherwise.    (4.36)

With fT(t), we use Definition 4.4 to calculate the expected duration of a call:

    E[T] = ∫_{−∞}^{∞} t fT(t) dt = ∫_{0}^{∞} t (1/3) e^(−t/3) dt.    (4.37)

Integration by parts (Appendix B, Math Fact B.10) yields

    E[T] = [−t e^(−t/3)]_{0}^{∞} + ∫_{0}^{∞} e^(−t/3) dt = 3 minutes.    (4.38)

To calculate the variance, we begin with the second moment of T:

    E[T²] = ∫_{−∞}^{∞} t² fT(t) dt = ∫_{0}^{∞} t² (1/3) e^(−t/3) dt.    (4.39)

[Continued]

Example 4.10 Solution (Continued 2)
Again integrating by parts, we have

    E[T²] = [−t² e^(−t/3)]_{0}^{∞} + ∫_{0}^{∞} 2t e^(−t/3) dt = 2 ∫_{0}^{∞} t e^(−t/3) dt.    (4.40)

With the knowledge that E[T] = 3, we observe that ∫_{0}^{∞} t e^(−t/3) dt = 3 E[T] = 9. Thus E[T²] = 6 E[T] = 18 and

    Var[T] = E[T²] − (E[T])² = 18 − 3² = 9 minutes².    (4.41)

The standard deviation is σT = √Var[T] = 3 minutes. The probability that the call duration is within 1 standard deviation of the expected value is

    P[0 ≤ T ≤ 6] = FT(6) − FT(0) = 1 − e^(−2) = 0.865.    (4.42)
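A brief simulation sketch (added) checks E[T] = 3, Var[T] = 9, and P[0 ≤ T ≤ 6] = 1 − e^(−2) ≈ 0.865 for the exponential call duration.

    import numpy as np

    rng = np.random.default_rng(0)
    t = rng.exponential(scale=3.0, size=1_000_000)   # exponential with E[T] = 1/lambda = 3

    print(t.mean(), t.var())                         # ≈ 3 and ≈ 9 (Eqs. 4.38, 4.41)
    print(np.mean(t <= 6), 1 - np.exp(-2))           # P[0 <= T <= 6] ≈ 0.865 (Eq. 4.42)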

Theorem 4.8
If X is an exponential (λ) random variable,

• The CDF of X is

    FX(x) = 1 − e^(−λx)   for x ≥ 0,
    FX(x) = 0             otherwise.

• The expected value of X is E[X] = 1/λ.

• The variance of X is Var[X] = 1/λ².

Theorem 4.9
If X is an exponential (λ) random variable, then K = ⌈X⌉ is a geometric (p) random variable with p = 1 − e^(−λ).

Proof: Theorem 4.9
As in the Theorem 4.7 proof, the definition of K implies PK(k) = P[k − 1 < X ≤ k]. Referring to the CDF of X in Theorem 4.8, we observe

    PK(k) = FX(k) − FX(k − 1)
          = e^(−λ(k−1)) − e^(−λk)         for k = 1, 2, . . . , and 0 otherwise
          = (e^(−λ))^(k−1) (1 − e^(−λ))   for k = 1, 2, . . . , and 0 otherwise.    (4.43)

If we let p = 1 − e^(−λ), we have

    PK(k) = p(1 − p)^(k−1)   for k = 1, 2, . . . ,
    PK(k) = 0                otherwise,    (4.44)

which conforms to Definition 3.5 of a geometric (p) random variable with p = 1 − e^(−λ).

Example 4.11 Problem Phone company A charges $0.15 per minute for telephone calls. For any fraction of a minute at the end of a call, they charge for a full minute. Phone Company B also charges $0.15 per minute. However, Phone Company B calculates its charge based on the exact duration of a call. If T , the duration of a call in minutes, is an exponential (λ = 1/3) random variable, what are the expected revenues per call E[RA] and E[RB ] for companies A and B?


Example 4.11 Solution
Because T is an exponential random variable, we have in Theorem 4.8 (and in Example 4.10) E[T] = 1/λ = 3 minutes per call. Therefore, for phone company B, which charges for the exact duration of a call,

    E[RB] = 0.15 E[T] = $0.45 per call.    (4.45)

Company A, by contrast, collects $0.15⌈T⌉ for a call of duration T minutes. Theorem 4.9 states that K = ⌈T⌉ is a geometric random variable with parameter p = 1 − e^(−1/3). Therefore, the expected revenue for Company A is

    E[RA] = 0.15 E[K] = 0.15/p = (0.15)(3.53) = $0.529 per call.    (4.46)
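The comparison can be checked numerically; the sketch below (added) simulates call durations and the two billing rules.

    import numpy as np

    rng = np.random.default_rng(0)
    t = rng.exponential(scale=3.0, size=1_000_000)    # call durations, E[T] = 3 minutes

    rev_b = 0.15 * t                                  # company B bills the exact duration
    rev_a = 0.15 * np.ceil(t)                         # company A rounds up to whole minutes

    print(rev_b.mean())                               # ≈ $0.45 per call (Eq. 4.45)
    print(rev_a.mean(), 0.15 / (1 - np.exp(-1/3)))    # ≈ $0.529 per call (Eq. 4.46)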

Definition 4.7 Erlang Random Variable
X is an Erlang (n, λ) random variable if the PDF of X is

    fX(x) = λ^n x^(n−1) e^(−λx) / (n − 1)!   for x ≥ 0,
    fX(x) = 0                                otherwise,

where the parameter λ > 0, and the parameter n ≥ 1 is an integer.

Theorem 4.10
If X is an Erlang (n, λ) random variable, then (a) E[X] = n/λ, and (b) Var[X] = n/λ².

Theorem 4.11
Let Kα denote a Poisson (α) random variable. For any x > 0, the CDF of an Erlang (n, λ) random variable X satisfies

    FX(x) = 1 − F_{Kλx}(n − 1) = 1 − Σ_{k=0}^{n−1} (λx)^k e^(−λx) / k!   for x ≥ 0,
    FX(x) = 0                                                           otherwise.
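A small check of Theorem 4.11 (added; the values n = 3, λ = 1/2, x = 4 are arbitrary). It also uses the standard fact that an Erlang (n, λ) random variable is the sum of n independent exponential (λ) random variables, which is not stated on this slide.

    import math
    import numpy as np

    rng = np.random.default_rng(0)
    n, lam, x = 3, 0.5, 4.0

    # Theorem 4.11: the Erlang (n, lam) CDF at x via the Poisson (lam*x) CDF.
    cdf_formula = 1 - sum((lam * x)**k * math.exp(-lam * x) / math.factorial(k) for k in range(n))

    # Cross-check by simulation: sum of n independent exponential (lam) samples.
    samples = rng.exponential(scale=1/lam, size=(1_000_000, n)).sum(axis=1)
    print(cdf_formula, np.mean(samples <= x))    # the two estimates should agree closely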

Quiz 4.5 Continuous random variable X has E[X] = 3 and Var[X] = 9. Find the PDF, fX(x), if

(a) X is an exponential random variable,

(b) X is a continuous uniform random variable,

(c) X is an Erlang random variable.

Quiz 4.5 Solution
(a) When X is an exponential (λ) random variable, E[X] = 1/λ and Var[X] = 1/λ². Since E[X] = 3 and Var[X] = 9, we must have λ = 1/3. The PDF of X is

    fX(x) = (1/3) e^(−x/3)   for x ≥ 0,
    fX(x) = 0                otherwise.    (1)

(b) We know X is a uniform (a, b) random variable. To find a and b, we apply Theorem 4.6 to write

    E[X] = (a + b)/2 = 3,    (2)
    Var[X] = (b − a)²/12 = 9.    (3)

This implies

    a + b = 6,    b − a = ±6√3.    (4)

The only valid solution with a < b is

    a = 3 − 3√3,    b = 3 + 3√3.    (5)

[Continued]

Quiz 4.5 Solution (Continued 2)
The complete expression for the PDF of X is

    fX(x) = 1/(6√3)   for 3 − 3√3 < x < 3 + 3√3,
    fX(x) = 0         otherwise.    (6)

(c) We know that the Erlang (n, λ) random variable has PDF

    fX(x) = λ^n x^(n−1) e^(−λx) / (n − 1)!   for x ≥ 0,
    fX(x) = 0                                otherwise.    (7)

The expected value and variance are E[X] = n/λ and Var[X] = n/λ². This implies

    n/λ = 3,    n/λ² = 9.    (8)

It follows that

    n = 3λ = 9λ².    (9)

Thus λ = 1/3 and n = 1. As a result, the Erlang (n, λ) random variable must be the exponential (λ = 1/3) random variable with PDF

    fX(x) = (1/3) e^(−x/3)   for x ≥ 0,
    fX(x) = 0                otherwise.    (10)

Section 4.6

Gaussian Random Variables


Definition 4.8 Gaussian Random Variable
X is a Gaussian (µ, σ) random variable if the PDF of X is

    fX(x) = (1/√(2πσ²)) e^(−(x−µ)²/(2σ²)),

where the parameter µ can be any real number and the parameter σ > 0.

Theorem 4.12
If X is a Gaussian (µ, σ) random variable, E[X] = µ and Var[X] = σ².

Figure 4.5: Two examples of a Gaussian random variable X with expected value µ and standard deviation σ: (a) µ = 2, σ = 1/2; (b) µ = 2, σ = 2.

Theorem 4.13 If X is Gaussian (µ, σ), Y = aX + b is Gaussian (aµ + b, aσ).


Definition 4.9 Standard Normal Random Variable
The standard normal random variable Z is the Gaussian (0, 1) random variable.

Definition 4.10 Standard Normal CDF
The CDF of the standard normal random variable Z is

    Φ(z) = (1/√(2π)) ∫_{−∞}^{z} e^(−u²/2) du.

Theorem 4.14
If X is a Gaussian (µ, σ) random variable, the CDF of X is

    FX(x) = Φ((x − µ)/σ).

The probability that X is in the interval (a, b] is

    P[a < X ≤ b] = Φ((b − µ)/σ) − Φ((a − µ)/σ).

Example 4.12 Problem Suppose your score on a test is x = 46, a sample value of the Gaussian (61, 10) random variable. Express your test score as a sample value of the standard normal random variable, Z.


Example 4.12 Solution Equation (4.47) indicates that z = (46 − 61)/10 = −1.5. Therefore your score is 1.5 standard deviations less than the expected value.


Figure 4.6: Symmetry properties of the Gaussian (0, 1) PDF: (a) Φ(z) is the area to the left of z and 1 − Φ(z) the area to the right; (b) Φ(−z), the area to the left of −z, equals 1 − Φ(z).

Theorem 4.15 Φ(−z) = 1 − Φ(z).


Example 4.13 Problem If X is a Gaussian (µ = 61, σ = 10) random variable, what is P[51 < X ≤ 71]?


Example 4.13 Solution
Applying Equation (4.47), Z = (X − 61)/10 and

    {51 < X ≤ 71} = {−1 < (X − 61)/10 ≤ 1} = {−1 < Z ≤ 1}.    (4.48)

The probability of this event is

    P[−1 < Z ≤ 1] = Φ(1) − Φ(−1) = Φ(1) − [1 − Φ(1)] = 2Φ(1) − 1 = 0.683.    (4.49)
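The same probability can be computed without tables; the sketch below (added) writes Φ(z) in terms of the error function erf, a standard identity.

    from math import erf, sqrt

    def phi(z):
        # Standard normal CDF Phi(z) expressed with the error function.
        return 0.5 * (1 + erf(z / sqrt(2)))

    mu, sigma = 61, 10
    p = phi((71 - mu) / sigma) - phi((51 - mu) / sigma)   # P[51 < X <= 71]
    print(p, 2 * phi(1) - 1)                              # both ≈ 0.683 (Eq. 4.49)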

Definition 4.11 Standard Normal Complementary CDF
The standard normal complementary CDF is

    Q(z) = P[Z > z] = (1/√(2π)) ∫_{z}^{∞} e^(−u²/2) du = 1 − Φ(z).

Example 4.14 Problem
In an optical fiber transmission system, the probability of a bit error is Q(√(γ/2)), where γ is the signal-to-noise ratio. What is the minimum value of γ that produces a bit error rate not exceeding 10^(−6)?

Example 4.14 Solution
Referring to Table 4.1, we find that Q(z) < 10^(−6) when z ≥ 4.75. Therefore, if √(γ/2) ≥ 4.75, or γ ≥ 45, the probability of error is less than 10^(−6). Although 10^(−6) seems a very small number, most practical optical fiber transmission systems have considerably lower binary error rates.
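A numerical sketch (added) finds the threshold SNR by scanning γ; Q(z) is written via the complementary error function erfc.

    from math import erfc, sqrt

    def q(z):
        # Q(z) = P[Z > z] for a standard normal Z, via the complementary error function.
        return 0.5 * erfc(z / sqrt(2))

    # Scan the SNR gamma until the bit error rate Q(sqrt(gamma/2)) drops below 1e-6.
    gamma = 0.0
    while q(sqrt(gamma / 2)) >= 1e-6:
        gamma += 0.01
    print(round(gamma, 2), q(sqrt(gamma / 2)))   # ≈ 45.2, consistent with gamma ≥ 45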

Quiz 4.6 X is the Gaussian (0, 1) random variable and Y is the Gaussian (0, 2) random variable. Sketch the PDFs fX(x) and fY (y) on the same axes and find:

(a) P[−1 < X ≤ 1],

(b) P[−1 < Y ≤ 1],

(c) P[X > 3.5],

(d) P[Y > 3.5].


Quiz 4.6 Solution
[Figure: the PDFs fX(x) and fY(y) plotted on the same axes; fX(x) is taller and narrower, fY(y) is lower and wider.]

The fact that Y has twice the standard deviation of X is reflected in the greater spread of fY(y). However, it is important to remember that as the standard deviation increases, the peak value of the Gaussian PDF goes down. Each of the requested probabilities can be calculated using the Φ(z) function and Table 4.1, or Q(z) and Table 4.2.

[Continued]

Quiz 4.6 Solution (Continued 2)
(a) Since X is Gaussian (0, 1),

    P[−1 < X ≤ 1] = FX(1) − FX(−1) = Φ(1) − Φ(−1) = 2Φ(1) − 1 = 0.6826.    (1)

(b) Since Y is Gaussian (0, 2),

    P[−1 < Y ≤ 1] = FY(1) − FY(−1) = Φ(1/σY) − Φ(−1/σY) = 2Φ(1/2) − 1 = 0.383.    (2)

(c) Again, since X is Gaussian (0, 1), P[X > 3.5] = Q(3.5) = 2.33 × 10^(−4).

(d) Since Y is Gaussian (0, 2),

    P[Y > 3.5] = Q(3.5/2) = 1 − Φ(1.75) = 0.04.    (3)

Section 4.7

Delta Functions, Mixed Random Variables


Definition 4.12 Unit Impulse (Delta) Function
Let

    dε(x) = 1/ε   for −ε/2 ≤ x ≤ ε/2,
    dε(x) = 0     otherwise.

The unit impulse function is

    δ(x) = lim_{ε→0} dε(x).

Figure 4.7: As ε → 0, dε(x) approaches the delta function δ(x); the figure shows dε(x) for ε = 1, 1/2, 1/4, 1/8, 1/16. For each ε, the area under the curve of dε(x) equals 1.

Theorem 4.16
For any continuous function g(x),

    ∫_{−∞}^{∞} g(x) δ(x − x0) dx = g(x0).

Definition 4.13 Unit Step Function
The unit step function is

    u(x) = 0   for x < 0,
    u(x) = 1   for x ≥ 0.

Theorem 4.17

    ∫_{−∞}^{x} δ(v) dv = u(x).

4.7 Comment: CDF of a Discrete Random Variable
Consider the CDF of a discrete random variable, X. Recall that it is constant everywhere except at points xi ∈ SX, where it has jumps of height PX(xi). Using the definition of the unit step function, we can write the CDF of X as

    FX(x) = Σ_{xi ∈ SX} PX(xi) u(x − xi).    (4.55)

From Definition 4.3, we take the derivative of FX(x) to find the PDF fX(x). Referring to Equation (4.54), the PDF of the discrete random variable X is

    fX(x) = Σ_{xi ∈ SX} PX(xi) δ(x − xi).    (4.56)

Example 4.15
Suppose Y takes on the values 1, 2, 3 with equal probability. The PMF and the corresponding CDF of Y are

    PY(y) = 1/3   for y = 1, 2, 3,   and PY(y) = 0 otherwise;

    FY(y) = 0     for y < 1,
    FY(y) = 1/3   for 1 ≤ y < 2,
    FY(y) = 2/3   for 2 ≤ y < 3,
    FY(y) = 1     for y ≥ 3.    (4.59)

Using the unit step function u(y), we can write FY(y) more compactly as

    FY(y) = (1/3) u(y − 1) + (1/3) u(y − 2) + (1/3) u(y − 3).    (4.60)

The PDF of Y is

    fY(y) = dFY(y)/dy = (1/3) δ(y − 1) + (1/3) δ(y − 2) + (1/3) δ(y − 3).    (4.61)

[Continued]

Example 4.15 (Continued 2)
We see that the discrete random variable Y can be represented graphically either by a PMF PY(y) with bars at y = 1, 2, 3, by a CDF with jumps at y = 1, 2, 3, or by a PDF fY(y) with impulses at y = 1, 2, 3. These three representations are shown in Figure 4.8. The expected value of Y can be calculated either by summing over the PMF PY(y) or integrating over the PDF fY(y). Using the PDF, we have

    E[Y] = ∫_{−∞}^{∞} y fY(y) dy
         = ∫_{−∞}^{∞} (y/3) δ(y − 1) dy + ∫_{−∞}^{∞} (y/3) δ(y − 2) dy + ∫_{−∞}^{∞} (y/3) δ(y − 3) dy
         = 1/3 + 2/3 + 1 = 2.    (4.62)

Figure 4.8: The PMF PY(y), CDF FY(y), and PDF fY(y) of the discrete random variable Y.

Example 4.16
For the random variable Y of Example 4.15,

    FY(2−) = 1/3,    FY(2+) = 2/3.    (4.64)

Theorem 4.18
For a random variable X, we have the following equivalent statements:

(a) P[X = x0] = q

(b) PX(x0) = q

(c) FX(x0+) − FX(x0−) = q

(d) fX(x0) = q δ(0)

Definition 4.14 Mixed Random Variable X is a mixed random variable if and only if fX(x) contains both impulses and nonzero, finite values.


Example 4.17 Problem Observe someone dialing a telephone and record the duration of the call. In a simple model of the experiment, 1/3 of the calls never begin either because no one answers or the line is busy. The duration of these calls is 0 minutes. Otherwise, with probability 2/3, a call duration is uniformly distributed between 0 and 3 minutes. Let Y denote the call duration. Find the CDF FY (y), the PDF fY (y), and the expected value E[Y ].


Example 4.17 Solution
Let A denote the event that the phone was answered. P[A] = 2/3 and P[A^c] = 1/3. Since Y ≥ 0, we know that for y < 0, FY(y) = 0. Similarly, we know that for y > 3, FY(y) = 1. For 0 ≤ y ≤ 3, we apply the law of total probability to write

    FY(y) = P[Y ≤ y] = P[Y ≤ y | A^c] P[A^c] + P[Y ≤ y | A] P[A].    (4.65)

When A^c occurs, Y = 0, so that for 0 ≤ y ≤ 3, P[Y ≤ y | A^c] = 1. When A occurs, the call duration is uniformly distributed over [0, 3], so that for 0 ≤ y ≤ 3, P[Y ≤ y | A] = y/3. So, for 0 ≤ y ≤ 3,

    FY(y) = (1/3)(1) + (2/3)(y/3) = 1/3 + 2y/9.    (4.66)

The complete CDF of Y is

    FY(y) = 0            for y < 0,
    FY(y) = 1/3 + 2y/9   for 0 ≤ y < 3,
    FY(y) = 1            for y ≥ 3.

[Continued]

Example 4.17 Solution (Continued 2)
Consequently, the corresponding PDF fY(y) is

    fY(y) = δ(y)/3 + 2/9   for 0 ≤ y ≤ 3,
    fY(y) = 0              otherwise.

For the mixed random variable Y, it is easiest to calculate E[Y] using the PDF:

    E[Y] = ∫_{−∞}^{∞} (y/3) δ(y) dy + ∫_{0}^{3} (2/9) y dy = 0 + [(2/9)(y²/2)]_{0}^{3} = 1 minute.    (4.67)
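A simulation sketch of the mixed random variable (added): with probability 1/3 the call never begins (Y = 0), otherwise Y is uniform on [0, 3].

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    answered = rng.random(n) < 2/3                     # event A: the call is answered (prob 2/3)
    y = np.where(answered, rng.uniform(0, 3, n), 0.0)  # uniform on [0, 3] if answered, else 0

    print(np.mean(y == 0))            # ≈ 1/3, the jump of FY(y) at y = 0
    print(np.mean(y <= 1.5))          # ≈ FY(1.5) = 1/3 + 2(1.5)/9 = 2/3
    print(y.mean())                   # ≈ 1 minute (Eq. 4.67)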

4.7 Comment: Properties of Random Variables
For any random variable X,

• X always has a CDF FX(x) = P[X ≤ x].

• If FX(x) is piecewise flat with discontinuous jumps, then X is discrete.

• If FX(x) is a continuous function, then X is continuous.

• If FX(x) is a piecewise continuous function with discontinuities, then X is mixed.

• When X is discrete or mixed, the PDF fX(x) contains one or more delta functions.

Quiz 4.7
The cumulative distribution function of random variable X is

    FX(x) = 0           for x < −1,
    FX(x) = (x + 1)/4   for −1 ≤ x < 1,
    FX(x) = 1           for x ≥ 1.    (4.68)

Sketch the CDF and find the following:

(a) P[X ≤ 1]

(b) P[X < 1]

(c) P[X = 1]

(d) the PDF fX(x)

Quiz 4.7 Solution
The CDF of X is

    FX(x) = 0           for x < −1,
    FX(x) = (x + 1)/4   for −1 ≤ x < 1,
    FX(x) = 1           for x ≥ 1.    (1)

The following probabilities can be read directly from the CDF:

(a) P[X ≤ 1] = FX(1) = 1.

(b) P[X < 1] = FX(1−) = 1/2.

(c) P[X = 1] = FX(1+) − FX(1−) = 1/2.

(d) We find the PDF fX(x) by taking the derivative of FX(x). The resulting PDF is

    fX(x) = 1/4          for −1 ≤ x < 1,
    fX(x) = δ(x − 1)/2   at x = 1,
    fX(x) = 0            otherwise.    (2)