Probability Concepts

Author: Erick Warren

Everyday Probability

- 1 in 4 chance
- 30% survival rate
- 1 in 7.1 million
- ½ the time
- 88% on-time

Statistical Probability

- Expressed in terms of 0 – 1
- Often given as 0 – 100%, which comes from multiplying the probability by 100%
- The closer to 0, the less likely the event
- The closer to 1, the more likely the event

[Number line from 0 (less likely) through 0.5 to 1 (more likely)]

Examples

- Rolling a "1" on a fair die is 1/6
- Choosing a heart from a full deck is 13/52
- Tossing "heads" with a fair coin is 0.5
- Drawing a "t" from an alphabet pool is 1/26
- Choosing the correct number between 1 and 10 is 0.10

Types of Probability

There are two types of probability:
1) Objective probability
2) Subjective probability (see Daniel)

Objective Probability

Objective probability measures the likelihood of events based on objective processes. It can be classified as either:
a) classical, or a priori, probability
b) relative frequency, or a posteriori, probability

History of classical probability “Since the beginning of recorded history, gambling – the very essence of risktaking – has been a popular pastime and often an addiction. It was a game of chance that inspired Pascal and Fermat’s revolutionary breakthrough into the laws of probability, not some profound question about the nature of capitalism or visions of the future.”

Games and a priori

- It is not necessary to actually roll a die or deal a card to determine the probability of a certain number being rolled or card being drawn.
- The probabilities are based on reasoning, not on the act itself.
- Previous acts have no influence on the next act.

The Law of Averages “Losing streaks and winning streaks occur frequently in games of chance, as they do in real life. Gamblers respond to these events in asymmetric fashion: they appeal to the law of averages to bring losing streaks to a speedy end. And they appeal to the same law of averages to suspend itself so that winning streaks will go on and on. The law of averages hears neither appeal. The last sequence of throws of the dice conveys absolutely no information about what the next throw will bring. Cards, coins, dice and roulette wheels have no memory.”

Events, E

An event is the occurrence of any defined circumstance:
- a "2" on a roll
- drawing a "king of diamonds"
- "success" or "failure"
- "death"

Mutually Exclusive Events

If two events cannot occur simultaneously, the events are said to be mutually exclusive:
- a 2 and a 4 cannot be observed on the same roll of a die
- when choosing a single card, the selection cannot be both a 2 of hearts and a 4 of diamonds

The probability of E, P(E)

- Let N be the total number of mutually exclusive and equally likely outcomes
- Let m be the number of outcomes in which E occurs
- Let P(E) be "the probability of E"; then:

P(E) = m / N
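As an illustration (not part of the original slides), the classical formula can be computed exactly with Python's `fractions` module, using the die and deck examples given earlier:

```python
from fractions import Fraction

def classical_probability(m: int, N: int) -> Fraction:
    """Classical (a priori) probability: P(E) = m / N for N mutually
    exclusive, equally likely outcomes, m of which belong to E."""
    return Fraction(m, N)

# Rolling a "1" on a fair die: m = 1 favorable outcome out of N = 6
print(classical_probability(1, 6))    # 1/6
# Choosing a heart from a full deck: m = 13 hearts out of N = 52 cards
print(classical_probability(13, 52))  # 1/4
```

Using `Fraction` keeps the probabilities exact, so 13/52 reduces automatically to 1/4.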

Relative Frequency Probability

Often the true probability of an event is not known and can only be estimated. Relative frequency probability estimates the likelihood of an event by counting the number of repetitions of a process and the number of times the event occurs.

The probability of E, P(E)

- Let m be the number of times the event was observed
- Let n be the number of times the process was repeated; then:

P(E) = m / n
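A small simulation sketch (illustrative, with an arbitrary fixed seed) shows the relative-frequency estimate converging toward the true probability as n grows, here for a fair-coin toss:

```python
import random

def relative_frequency(n: int, seed: int = 0) -> float:
    """Estimate P(heads) as m / n: repeat the toss n times and
    count the m times the event is observed."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    m = sum(rng.random() < 0.5 for _ in range(n))
    return m / n

# The estimate approaches the true value 0.5 as n gets large
for n in (10, 1_000, 100_000):
    print(n, relative_frequency(n))
```

With small n the estimate can be far off; by n = 100,000 it is typically within a fraction of a percent of 0.5.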

Known versus Unknown

Unlike games, the true probability of events in science is often unknown and can only be estimated after observing the process. Most of what we do is a posteriori, or relative frequency, probability. As n grows large, the probability is estimated more precisely and is assumed to be "known".

Properties of Probability

1) P(E) ≥ 0
2) P(E1) + P(E2) + … + P(En) = 1, i.e., Σ P(Ei) over i = 1 to n equals 1
3) P(E1 or E2 or … or Ek) = P(E1) + P(E2) + … + P(Ek)

These properties hold only if the events are mutually exclusive (and, for property 2, the events E1, …, En together cover all possible outcomes).
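The three properties can be checked directly for a fair die, whose six faces are mutually exclusive and cover all outcomes (an illustrative sketch in Python):

```python
from fractions import Fraction

# Probabilities for the six mutually exclusive faces of a fair die
faces = {face: Fraction(1, 6) for face in range(1, 7)}

# Property 1: every probability is nonnegative
assert all(p >= 0 for p in faces.values())
# Property 2: probabilities over all outcomes sum to 1
assert sum(faces.values()) == 1
# Property 3: P(1 or 2) = P(1) + P(2) for mutually exclusive events
assert faces[1] + faces[2] == Fraction(1, 3)
print("all three properties hold")
```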

Calculating Probability

We can extend the theory to practical applications for estimating probabilities.

Automotive Survey - subjects were asked:
- What car they purchased
- Why they purchased the car
- Country that produced the car
- Gender, age, and marital status

Buyer Information

Gender              Marital Status
Female  138         Married  196
Male    165         Single   107

Vehicle Information

Age                 Country of Origin    Vehicle Size    Vehicle Purpose
range  18-60        America  115         large    42     family  155
mean   30.7         Europe    40         medium  124     sport   100
sd     5.98         Japan    148         small    37     work     48

Vehicle Purpose by Gender

          Family   Sporty   Work   Total
Female      76       41      21     138
Male        79       59      27     165
Total      155      100      48     303

Q: If we select a person at random from this sample, what is the probability that the person is female?

A: Assume the gender categories are mutually exclusive. There are 303 total subjects and 138 females. Therefore:

P(F) = number of females / total number of subjects = 138 / 303 = 0.4554

Q: Suppose we select a subject and the subject is male (M), what is the probability that this subject purchased a car for work (w)?

A: Given that the subject is male, the denominator of interest is 165; of these, 27 purchased a car for work purposes. Therefore:

P(w | M) = number of work vehicles / total number of males = 27 / 165 = 0.1636
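A sketch of the same calculation in Python, using the counts from the survey table above; restricting the denominator to one gender is exactly what conditioning on gender means:

```python
# Gender-by-purpose counts from the automotive survey table
counts = {
    ("F", "family"): 76, ("F", "sport"): 41, ("F", "work"): 21,
    ("M", "family"): 79, ("M", "sport"): 59, ("M", "work"): 27,
}

def conditional(purpose: str, gender: str) -> float:
    """P(purpose | gender): count within one gender only."""
    within_gender = sum(n for (g, _), n in counts.items() if g == gender)
    return counts[(gender, purpose)] / within_gender

print(round(conditional("work", "M"), 4))  # 27/165 = 0.1636
print(round(conditional("work", "F"), 4))  # 21/138 = 0.1522
```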

Q: For females?

P(w | F) = number of work vehicles / total number of females = 21 / 138 = 0.1522

Joint Probability

Q: What is the probability that a subject picked at random is female and purchased the car for work purposes?

P(F ∩ w) = number of work vehicles purchased by women / total number of subjects = 21 / 303 = 0.0693

The relationship between conditional and joint probabilities can be expressed as:

P(A | B) = P(A ∩ B) / P(B),  provided P(B) ≠ 0

P(w | F) = P(w ∩ F) / P(F) = 0.0693 / 0.4554 = 0.1522

What about "or"?

What is the probability of A or B happening? If they are mutually exclusive, just add P(A) + P(B):

P(F ∪ M) = P(F) + P(M) = 0.4554 + 0.5446 = 1

If A and B are not mutually exclusive:

P(A ∪ B) = P(A) + P(B) - P(A ∩ B)

This is because the count in P(A ∩ B) is also included in both P(A) and P(B), so it must be subtracted out once for the overlap. From the table, P(w) = 48/303 = 0.1584, so:

P(F ∪ w) = P(F) + P(w) - P(F ∩ w) = 0.4554 + 0.1584 - 0.0693 = 0.5445
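A sketch of inclusion-exclusion with the raw survey counts, where 138 is the number of females, 48 the number of work vehicles, and 21 the number of work vehicles bought by women; working with exact counts avoids accumulating rounding error:

```python
# Inclusion-exclusion with counts from the survey table
total = 303
n_female, n_work, n_female_and_work = 138, 48, 21

p_f = n_female / total                 # P(F)     = 138/303
p_w = n_work / total                   # P(w)     =  48/303
p_f_and_w = n_female_and_work / total  # P(F ∩ w) =  21/303

# Subtract the overlap once so it is not double-counted
p_f_or_w = p_f + p_w - p_f_and_w
print(round(p_f_or_w, 4))  # 165/303 = 0.5446
```

The exact fraction 165/303 rounds to 0.5446; summing the individually rounded probabilities gives 0.5445, a harmless rounding difference.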

If you know the joint probabilities of A over all levels of B, then you can find the probability of A:

P(Ai) = Σj P(Ai ∩ Bj)

P(F ∩ f) = 76 / 303 = 0.2508
P(F ∩ s) = 41 / 303 = 0.1353
P(F ∩ w) = 21 / 303 = 0.0693

P(F) = P(F ∩ f) + P(F ∩ s) + P(F ∩ w) = 0.2508 + 0.1353 + 0.0693 = 0.4554
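The marginal-from-joints calculation is a one-line sum in Python (an illustrative sketch using the female joint counts from the table):

```python
# Joint counts of Female with each purchase purpose, from the table
joint_counts = {"family": 76, "sport": 41, "work": 21}
total = 303

# P(F) = sum over purposes of P(F ∩ purpose)
p_f = sum(n / total for n in joint_counts.values())
print(round(p_f, 4))  # 138/303 = 0.4554
```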

Bayes' Theorem

Suppose you know P(A1), …, P(Ak), where the Ai are mutually exclusive and together cover all possible outcomes, and there is an event B for which you know every P(B | Ai). Then:

P(B) = P(B | A1) P(A1) + P(B | A2) P(A2) + … + P(B | Ak) P(Ak)
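This total-probability sum can be sketched with hypothetical numbers (not from the slides): two factories A1 and A2 produce 60% and 40% of all items, with defect rates of 1% and 5% respectively, and B is the event "item is defective":

```python
# P(B) = sum over i of P(B | A_i) * P(A_i), for a mutually exclusive,
# exhaustive set A_1..A_k. All numbers below are illustrative.
p_a = [0.60, 0.40]          # P(A_i): share of items from each factory
p_b_given_a = [0.01, 0.05]  # P(B | A_i): defect rate at each factory

p_b = sum(pb * pa for pb, pa in zip(p_b_given_a, p_a))
print(round(p_b, 3))  # 0.6*0.01 + 0.4*0.05 = 0.026
```

Each term weights a conditional probability by how likely its condition is, so the overall defect rate sits between the two factory rates, closer to the larger factory's rate.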