Transcript Slide 1

7
Probability
Copyright © Cengage Learning. All rights reserved.
7.3
Probability and Probability Models
Probability and Probability Models
Mathematicians tend to avoid the debate over how probability should be interpreted and talk instead about abstract probability, or probability distributions, based purely on the properties of relative frequency.
Specific probability distributions can then be used as
models in real-life situations, such as flipping a coin or
tossing a die, to predict (or model) relative frequency.
3
Probability and Probability Models
Probability Distribution; Probability
A (finite) probability distribution is an assignment of a
number P(si), the probability of si, to each outcome of a
finite sample space S = {s1, s2, . . . , sn}.
The probabilities must satisfy
1. 0 ≤ P(si) ≤ 1
and
2. P(s1) + P(s2) + . . . + P(sn) = 1.
4
Probability and Probability Models
We find the probability of an event E, written P(E), by
adding up the probabilities of the outcomes in E.
If P(E) = 0, we call E an impossible event. The empty
event ∅ is always impossible, since something must
happen.
5
Probability and Probability Models
Quick Examples
1. Let us take S = {H, T} and make the assignments
P(H) = .5, P(T) = .5.
Because these numbers are between 0 and 1 and add
to 1, they specify a probability distribution.
6
Probability and Probability Models
2. Let S = {1, 2, 3, 4, 5, 6} and assign the probabilities
P(1) = .3, P(2) = .3, P(3) = 0, P(4) = .1, P(5) = .2, P(6) = .1.
Because these numbers are between 0 and 1 and add to 1, they specify a probability distribution. It follows that
P({1, 6}) = .3 + .1 = .4
P({2, 3}) = .3 + 0 = .3
P({3}) = 0, so {3} is an impossible event.
7
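To see the two defining conditions and the event-probability calculation together, here is a minimal Python sketch (the dictionary and the helper P are illustrative names, not from the text) that encodes the die distribution from Quick Example 2 and reproduces the probabilities above.

```python
# A finite probability distribution stored as a dict: outcome -> probability.
# These are the values from Quick Example 2 above.
dist = {1: .3, 2: .3, 3: 0, 4: .1, 5: .2, 6: .1}

# The two defining properties of a probability distribution:
assert all(0 <= p <= 1 for p in dist.values())   # 1. each P(s) is between 0 and 1
assert abs(sum(dist.values()) - 1) < 1e-9        # 2. the probabilities add up to 1

def P(event):
    """P(E): add up the probabilities of the outcomes in E."""
    return sum(dist[s] for s in event)

print(P({1, 6}))   # .3 + .1 = .4
print(P({2, 3}))   # .3 + 0  = .3
print(P({3}))      # 0, so {3} is an impossible event
```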
Probability and Probability Models
Probability Models
A probability model for a particular experiment is a
probability distribution that predicts the relative frequency
of each outcome if the experiment is performed a large
number of times.
Just as we think of relative frequency as estimated
probability, we can think of modeled probability as
theoretical probability.
8
Probability and Probability Models
Quick Examples
1. Fair Coin Model: Flip a fair coin and observe the side
that faces up.
Because we expect that heads is as likely to come up as
tails, we model this experiment with the probability
distribution specified by S = {H, T}, P(H) = .5, P(T) = .5.
9
Probability and Probability Models
Figure 4 suggests that the relative frequency of heads
approaches .5 as the number of coin tosses gets large,
so the fair coin model predicts the relative frequency for
a large number of coin tosses quite well.
Figure 4
10
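The claim illustrated by Figure 4 can be checked empirically. The following is a minimal simulation sketch (not from the text): it flips a simulated fair coin increasingly many times and prints the relative frequency of heads, which tends toward the modeled probability .5.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Flip a simulated fair coin n times and report the relative frequency of heads.
for n in (10, 100, 1_000, 10_000, 100_000):
    heads = sum(random.choice("HT") == "H" for _ in range(n))
    print(f"{n:>7} flips: relative frequency of heads = {heads / n:.4f}")
```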
Probability and Probability Models
2. Unfair Coin Model: Take S = {H, T} and P(H) = .2,
P(T) = .8.
We can think of this distribution as a model for the
experiment of flipping an unfair coin that is four times as
likely to land with tails uppermost as with heads.
11
Probability and Probability Models
3. Fair Die Model: Roll a fair die and observe the
uppermost number.
Because we expect to roll each specific number one
sixth of the time, we model the experiment with the
probability distribution specified by
S = {1, 2, 3, 4, 5, 6}, P(1) = 1/6, P(2) = 1/6, . . . , P(6) = 1/6.
This model predicts, for example, that the relative
frequency of throwing a 5 approaches 1/6 as the number
of times you roll the die gets large.
12
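Both of these models can be compared against simulated relative frequencies. This is an illustrative sketch (the sample size and names are arbitrary choices, not from the text): it samples from the unfair coin distribution P(H) = .2, P(T) = .8 and from the fair die, and prints observed relative frequencies, which should come out close to the modeled probabilities.

```python
import random
from collections import Counter

random.seed(2)
N = 100_000  # number of simulated trials (arbitrary choice)

# Unfair coin model: P(H) = .2, P(T) = .8 (tails four times as likely as heads).
coin = Counter(random.choices("HT", weights=[0.2, 0.8], k=N))
print({side: round(coin[side] / N, 3) for side in "HT"})

# Fair die model: each face has modeled probability 1/6, about .167.
die = Counter(random.choices(range(1, 7), k=N))
print({face: round(die[face] / N, 3) for face in range(1, 7)})
```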
Probability and Probability Models
4. Roll a pair of fair dice (recall that there are a total of 36
outcomes if the dice are distinguishable). Then an
appropriate model of the experiment has sample space
S = {(1, 1), (1, 2), . . . , (6, 6)},
with each outcome being assigned a probability of 1/36.
13
Probability and Probability Models
5. In the experiment in Quick Example 4, take E to be the
event that the sum of the numbers that face up is 5, so
E = {(1, 4), (2, 3), (3, 2), (4, 1)}.
By the properties of probability distributions,
P(E) = 1/36 + 1/36 + 1/36 + 1/36 = 4/36 = 1/9.
14
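The same numbers can be obtained by enumerating the sample space directly. Here is a minimal sketch (names are illustrative) that lists all 36 outcomes for a pair of distinguishable dice, picks out the outcomes summing to 5, and forms the ratio of counts.

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes for a pair of distinguishable dice.
S = list(product(range(1, 7), repeat=2))
assert len(S) == 36

# E: the two numbers facing up add to 5.
E = [(a, b) for (a, b) in S if a + b == 5]
print(E)                          # [(1, 4), (2, 3), (3, 2), (4, 1)]
print(Fraction(len(E), len(S)))   # 4/36 = 1/9
```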
Probability and Probability Models
Notice that, in all of the Quick Examples above except for
the unfair coin, all the outcomes are equally likely, and
each outcome s has a probability of
P(s) = 1/n(S) = 1/(Total number of outcomes).
More generally, in the last Quick Example we saw that adding the probabilities of the individual outcomes in an event E amounted to computing the ratio (Number of favorable outcomes)/(Total number of outcomes):
P(E) = n(E)/n(S).
15
Probability and Probability Models
Probability Model for Equally Likely Outcomes
In an experiment in which all outcomes are equally likely,
we model the experiment by taking the probability of an
event E to be
P(E) = n(E)/n(S) = (Number of favorable outcomes)/(Total number of outcomes).
Visualizing Probability for Equally Likely Outcomes
16
Probability and Probability Models
Quick Example
1. Toss a fair coin three times, so S = {HHH, HHT, HTH,
HTT, THH, THT, TTH, TTT}.
There are eight equally likely outcomes and E = {HHT, HTH, THH}, so the probability that we throw exactly two heads is
P(E) = n(E)/n(S) = 3/8.
17
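For equally likely outcomes, P(E) = n(E)/n(S) reduces probability to counting. A short sketch of the three-coin Quick Example (the variable names are illustrative):

```python
from itertools import product
from fractions import Fraction

# Sample space for three tosses of a fair coin: 8 equally likely outcomes.
S = ["".join(t) for t in product("HT", repeat=3)]

# E: exactly two heads.
E = [s for s in S if s.count("H") == 2]
print(E)                          # ['HHT', 'HTH', 'THH']
print(Fraction(len(E), len(S)))   # n(E)/n(S) = 3/8
```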
Example 1 – Sales of Hybrid Vehicles
A total of 1.9 million hybrid vehicles had been sold in the
United States through October of 2011. Of these, 955,000
were Toyota Prii, 205,000 were Honda Civics, 170,000
were Toyota Camrys, 105,000 were Ford Escapes, and the
rest were other makes.
a. What is the probability that a randomly selected hybrid
vehicle sold in the United States was either a Toyota
Prius or a Honda Civic?
b. What is the probability that a randomly selected hybrid
vehicle sold in the United States was not a Toyota
Camry?
18
Example 1(a) – Solution
The experiment suggested by the question consists of
randomly choosing a hybrid vehicle sold in the United
States and determining its make.
We are interested in the event E that the hybrid vehicle was
either a Toyota Prius or a Honda Civic.
So,
S = the set of hybrid vehicles sold;
n(S) = 1,900,000
19
Example 1(a) – Solution
cont’d
E = the set of Toyota Prii and Honda Civics sold;
n(E) = 955,000 + 205,000 = 1,160,000.
Are the outcomes equally likely in this experiment?
Yes, because we are as likely to choose one vehicle as
another. Thus,
P(E) = n(E)/n(S) = 1,160,000/1,900,000 ≈ .61.
20
Example 1(b) – Solution
cont’d
Let the event F consist of those hybrid vehicles sold that
were not Toyota Camrys.
n(F) = 1,900,000 – 170,000 = 1,730,000
Hence,
P(F) = n(F)/n(S) = 1,730,000/1,900,000 ≈ .91.
21
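Both parts of Example 1 are counting ratios, so they are easy to verify. A minimal sketch using the sales figures from the example (the variable names are illustrative choices):

```python
from fractions import Fraction

# Hybrid vehicles sold in the U.S. through October 2011 (figures from Example 1).
total = 1_900_000
prius, civic, camry = 955_000, 205_000, 170_000

# (a) P(Prius or Civic) = n(E)/n(S)
print(float(Fraction(prius + civic, total)))   # about .61

# (b) P(not a Camry) = n(F)/n(S), where n(F) = n(S) - n(Camry)
print(float(Fraction(total - camry, total)))   # about .91
```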
Probability of Unions, Intersections,
and Complements
22
Probability of Unions, Intersections, and Complements
So far, all we know about computing the probability of an
event E is that P(E) is the sum of the probabilities of the
individual outcomes in E.
Suppose, though, that we do not know the probabilities of
the individual outcomes in E but we do know that
E = A ∪ B, where we happen to know P(A) and P(B).
23
Probability of Unions, Intersections, and Complements
How do we compute the probability of A ∪ B?
We might be tempted to say that P(A ∪ B) is P(A) + P(B), but let us look at an example using the die probability distribution in Quick Example 2 above:
For A let us take the event {2, 4, 5}, and for B let us take {2, 4, 6}. A ∪ B is then the event {2, 4, 5, 6}.
24
Probability of Unions, Intersections, and Complements
We know that we can find the probabilities P(A), P(B), and P(A ∪ B) by adding the probabilities of all the outcomes in these events, so
P(A) = P({2, 4, 5}) = .3 + .1 + .2 = .6
P(B) = P({2, 4, 6}) = .3 + .1 + .1 = .5, and
P(A ∪ B) = P({2, 4, 5, 6}) = .3 + .1 + .2 + .1 = .7.
Our first guess was wrong: P(A ∪ B) ≠ P(A) + P(B).
25
Probability of Unions, Intersections, and Complements
Notice, however, that the outcomes in A ∩ B = {2, 4} are counted twice in computing P(A) + P(B), but only once in computing P(A ∪ B):
P(A) + P(B) = P({2, 4, 5}) + P({2, 4, 6})
= (.3 + .1 + .2) + (.3 + .1 + .1)
= 1.1,    P(A ∩ B) counted twice
whereas
P(A ∪ B) = P({2, 4, 5, 6})
= .3 + .1 + .2 + .1
= .7.    P(A ∩ B) counted once
26
Probability of Unions, Intersections, and Complements
Thus, if we take P(A) + P(B) and then subtract the surplus P(A ∩ B), we get P(A ∪ B).
In symbols,
P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
.7 = .6 + .5 – .4.
(see Figure 6).
Figure 6
27
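This numerical check can be reproduced directly from the distribution. A minimal sketch (names are illustrative) that computes P(A), P(B), P(A ∪ B), and P(A ∩ B) and verifies the addition principle:

```python
# The die distribution from Quick Example 2 and the two events A and B.
dist = {1: .3, 2: .3, 3: 0, 4: .1, 5: .2, 6: .1}
A, B = {2, 4, 5}, {2, 4, 6}

def P(event):
    return sum(dist[s] for s in event)

print(P(A), P(B))   # about .6 and .5
print(P(A | B))     # P(A ∪ B), about .7
print(P(A & B))     # P(A ∩ B), about .4

# Addition principle: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-9
```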
Probability of Unions, Intersections, and Complements
We call this formula the addition principle.
One more thing: Notice that our original guess
P(A ∪ B) = P(A) + P(B) would have worked if we had chosen A and B with no outcomes in common; that is, if A ∩ B = ∅.
When A ∩ B = ∅, recall that we say that A and B are
mutually exclusive.
28
Probability of Unions, Intersections, and Complements
Addition Principle
If A and B are any two events, then
P(A ∪ B) = P(A) + P(B) – P(A ∩ B).
Visualizing the Addition Principle
In the figure, the area of the union
is obtained by adding the areas of
A and B and then subtracting the
overlap (because it is counted
twice when we add the areas).
29
Probability of Unions, Intersections, and Complements
Addition Principle for Mutually Exclusive Events
If A ∩ B = ∅, we say that A and B are mutually exclusive, and we have
P(A ∪ B) = P(A) + P(B).
Because P(A ∩ B) = 0
Visualizing the Addition Principle for Mutually
Exclusive Events
If A and B do not overlap, then
the area of the union is obtained
by adding the areas of A and B.
30
Probability of Unions, Intersections, and Complements
This holds true also for more than two events:
If A1, A2, . . . , An are mutually exclusive events (that is, the
intersection of every pair of them is empty), then
P(A1 ∪ A2 ∪ . . . ∪ An)
= P(A1) + P(A2) + . . . + P(An).
Addition principle for
many mutually exclusive
events
31
Probability of Unions, Intersections, and Complements
Quick Example
There is a 10% chance of rain (R) tomorrow, a 20%
chance of high winds (W), and a 5% chance of both. The
probability of either rain or high winds (or both) is
P(R ∪ W) = P(R) + P(W) – P(R ∩ W)
= .10 + .20 – .05
= .25.
32
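The addition principle itself is a one-line computation. A small illustrative helper (the function name p_union is not from the text) covering both the general case and, via the default argument, the mutually exclusive case:

```python
def p_union(p_a, p_b, p_intersection=0.0):
    """Addition principle: P(A ∪ B) = P(A) + P(B) - P(A ∩ B).
    Leaving p_intersection at 0 gives the mutually exclusive case."""
    return p_a + p_b - p_intersection

# Rain/wind Quick Example: P(R) = .10, P(W) = .20, P(R ∩ W) = .05.
print(round(p_union(0.10, 0.20, 0.05), 2))   # .25
```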
Example 4 – School and Work
A survey conducted by the Bureau of Labor Statistics found
that 68% of the high school graduating class of 2010 went
on to college the following year, while 42% of the class was
working. Furthermore, 92% were either in college or
working, or both.
a. What percentage went on to college and work at the
same time?
b. What percentage went on to college but not work?
33
Example 4 – Solution
We can think of the experiment as choosing a member of
the high school graduating class of 2010 at random. The
sample space is the set of all these graduates.
a. We are given information about two events:
A: A graduate went on to college; P(A) = .68.
B: A graduate went on to work; P(B) = .42.
We are also told that P(A ∪ B) = .92. We are asked for the probability that a graduate went on to both college and work, P(A ∩ B).
34
Example 4 – Solution
cont’d
To find P(A ∩ B), we take advantage of the fact that the formula
P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
can be used to calculate any one of the four quantities that appear in it as long as we know the other three.
Substituting the quantities we know, we get
.92 = .68 + .42 – P(A ∩ B)
35
Example 4 – Solution
cont’d
so
P(A ∩ B) = .68 + .42 – .92 = .18.
Thus, 18% of the graduates went on to college and work
at the same time.
36
Example 4 – Solution
cont’d
b. We are asked for the probability of a new event:
C: A graduate went on to college but not work.
C is the part of A outside of A ∩ B, so C ∪ (A ∩ B) = A, and C and A ∩ B are mutually exclusive. (See Figure 7.)
Figure 7
37
Example 4 – Solution
cont’d
Thus, applying the addition principle, we have
P(C) + P(A ∩ B) = P(A).
From part (a), we know that P(A ∩ B) = .18, so
P(C) + .18 = .68
giving
P(C) = .50.
In other words, 50% of the graduates went on to college
but not work.
38
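Example 4 amounts to solving the addition principle for the one unknown quantity. A minimal sketch of that arithmetic (variable names are illustrative):

```python
# Example 4: P(A) = .68 (college), P(B) = .42 (work), P(A ∪ B) = .92.
p_a, p_b, p_a_or_b = 0.68, 0.42, 0.92

# (a) Solve the addition principle for the intersection:
#     P(A ∩ B) = P(A) + P(B) - P(A ∪ B)
p_both = p_a + p_b - p_a_or_b
print(round(p_both, 2))         # .18, i.e., 18% did both

# (b) College but not work: P(C) = P(A) - P(A ∩ B)
print(round(p_a - p_both, 2))   # .5, i.e., 50%
```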
Probability of Unions, Intersections, and Complements
More Principles of Probability Distributions
The following rules hold for any sample space S and any
event A:
P(S) = 1
The probability of something happening
is 1.
P(∅) = 0
The probability of nothing happening
is 0.
P(A) = 1 – P(A).
The probability of A not happening is
1 minus the probability of A.
39
Probability of Unions, Intersections, and Complements
Note
We can also write the third equation as
P(A) = 1 – P(A′)
or
P(A) + P(A′) = 1.
40
Probability of Unions, Intersections, and Complements
Visualizing the Rule for Complements
Think of A as the portion of S outside of A. Adding the two
areas gives the area of all of S, equal to 1.
Sample Space S
41
Probability of Unions, Intersections, and Complements
Quick Example
There is a 10% chance of rain (R) tomorrow. Therefore,
the probability that it will not rain is
P(R) = 1 – P(R)
= 1 – .10
= .90.
42
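These rules are easy to check numerically. A short illustrative sketch (reusing the die distribution from earlier; the names are not from the text) verifying P(S) = 1, P(∅) = 0, and the rule for complements, along with the rain Quick Example:

```python
# Rule for complements applied to the rain Quick Example: P(R′) = 1 - P(R).
p_rain = 0.10
print(1 - p_rain)   # .9

# Sanity checks of P(S) = 1, P(∅) = 0, and P(A′) = 1 - P(A)
# using the die distribution from earlier.
dist = {1: .3, 2: .3, 3: 0, 4: .1, 5: .2, 6: .1}

def P(event):
    return sum(dist[s] for s in event)

S = set(dist)
A = {2, 4, 5}
assert abs(P(S) - 1) < 1e-9                 # P(S) = 1
assert P(set()) == 0                        # P(∅) = 0
assert abs(P(S - A) - (1 - P(A))) < 1e-9    # P(A′) = 1 - P(A)
```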