Transcript: Chapter 2
Chapter 2: Probability
Section 2.1: Basic Ideas
Definition: An experiment is a process that results in
an outcome that cannot be predicted in advance with
certainty.
Examples:
• rolling a die
• tossing a coin
• weighing the contents of a box of cereal
Sample Space
Definition: The set of all possible outcomes of an
experiment is called the sample space for the
experiment.
Examples:
• For rolling a fair die, the sample space is {1, 2, 3, 4, 5, 6}.
• For a coin toss, the sample space is {head, tail}.
• For weighing a cereal box, the sample space is (0, ∞); a more
reasonable sample space for a 16 oz. box is (12, 20).
More Terminology
Definition: A subset of a sample space is called an
event.
• A given event is said to have occurred if the outcome
of the experiment is one of the outcomes in the event.
For example, if a die comes up 2, the events {2, 4, 6}
and {1, 2, 3} have both occurred, along with every
other event that contains the outcome “2”.
Example 1
An electrical engineer has on hand two boxes of
resistors, with four resistors in each box. The
resistors in the first box are labeled 10 ohms, but in
fact their resistances are 9, 10, 11, and 12 ohms. The
resistors in the second box are labeled 20 ohms, but
in fact their resistances are 18, 19, 20, and 21 ohms.
The engineer chooses one resistor from each box and
determines the resistance of each.
Example 1 cont.
Let A be the event that the first resistor has a resistance
greater than 10, let B be the event that the second
resistor has resistance less than 19, and let C be the
event that the sum of the resistances is equal to 28.
1. Find the sample space for this experiment.
2. Specify the subsets corresponding to the events
A, B, and C.
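To make the questions concrete, here is a minimal Python sketch (mine, not part of the original slides; the names box1 and box2 are assumptions) that enumerates the 16 ordered-pair outcomes and the subsets A, B, and C.

```python
# Sketch: enumerating the sample space for Example 1 with itertools.
from itertools import product

box1 = [9, 10, 11, 12]    # actual resistances in the box labeled 10 ohms
box2 = [18, 19, 20, 21]   # actual resistances in the box labeled 20 ohms

# 1. Sample space: all ordered pairs (first resistor, second resistor).
sample_space = set(product(box1, box2))   # 16 outcomes

# 2. Events as subsets of the sample space.
A = {(x, y) for (x, y) in sample_space if x > 10}        # first resistance > 10
B = {(x, y) for (x, y) in sample_space if y < 19}        # second resistance < 19
C = {(x, y) for (x, y) in sample_space if x + y == 28}   # sum of resistances = 28

print(len(sample_space))  # 16
print(sorted(A), sorted(B), sorted(C), sep="\n")
```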
Combining Events
The union of two events A and B, denoted
A ∪ B, is the set of outcomes that belong either
to A, to B, or to both.
In words, A ∪ B means “A or B.” So the event
“A or B” occurs whenever either A or B (or both)
occurs.
Example 2
Let A = {1, 2, 3} and B = {2, 3, 4}.
What is A ∪ B?
Intersections
The intersection of two events A and B, denoted
by A ∩ B, is the set of outcomes that belong to A
and to B. In words, A ∩ B means “A and B.”
Thus the event “A and B” occurs whenever both
A and B occur.
Example 3
Let A = {1, 2, 3} and B = {2, 3, 4}.
What is A ∩ B?
Complements
The complement of an event A, denoted Aᶜ, is
the set of outcomes that do not belong to A. In
words, Aᶜ means “not A.” Thus the event “not
A” occurs whenever A does not occur.
Example 4
Consider rolling a fair six-sided die. Let A be the
event “rolling a six” = {6}.
What is Aᶜ = “not rolling a six”?
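The union, intersection, and complement operations in Examples 2–4 map directly onto Python's built-in set type; the short sketch below is mine, not from the slides.

```python
# Sketch: union, intersection, and complement with Python sets.
S = {1, 2, 3, 4, 5, 6}          # sample space for one roll of a fair die
A = {1, 2, 3}
B = {2, 3, 4}

print(A | B)                    # union A ∪ B        -> {1, 2, 3, 4}
print(A & B)                    # intersection A ∩ B -> {2, 3}
print(S - {6})                  # complement of {6}  -> {1, 2, 3, 4, 5}
```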
Mutually Exclusive Events
Definition: The events A and B are said to be mutually
exclusive if they have no outcomes in common.
More generally, a collection of events A1 , A2 ,..., An
is said to be mutually exclusive if no two of them have
any outcomes in common.
Sometimes mutually exclusive events are referred to as disjoint
events.
Venn Diagrams
Events can be graphically illustrated with Venn
diagrams.
Back to Example 1
• If the experiment with the resistors is performed:
  – Is it possible for events A and B both to occur?
  – How about B and C?
  – A and C?
  – Which pair of events is mutually exclusive?
Probabilities
Definition: Each event in the sample space has a
probability of occurring. Intuitively, the probability
is a quantitative measure of how likely the event is to
occur.
Given any experiment and any event A:
The expression P(A) denotes the probability that the
event A occurs.
P(A) is the proportion of times that the event A would
occur in the long run, if the experiment were to be
repeated over and over again.
Sample Spaces with Equally Likely
Outcomes
If S is a sample space containing N equally
likely outcomes, and if A is an event
containing k outcomes, then:
P(A) = k / N
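As a quick illustration (mine, not from the slides), the rule P(A) = k/N can be checked for the event “roll an even number” on a fair die.

```python
# Sketch: P(A) = k / N for equally likely outcomes.
S = {1, 2, 3, 4, 5, 6}          # N = 6 equally likely outcomes
A = {2, 4, 6}                   # event "roll an even number", k = 3

P_A = len(A) / len(S)
print(P_A)                      # 0.5
```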
Axioms of Probability
1. Let S be a sample space. Then P(S) = 1.
2. For any event A, 0 ≤ P(A) ≤ 1.
3. If A and B are mutually exclusive events, then
P(A ∪ B) = P(A) + P(B). More generally, if
A1, A2, … are mutually exclusive events, then
P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + …
A Few Useful Things
• For any event A,
P(Aᶜ) = 1 – P(A).
• Let ∅ denote the empty set. Then
P(∅) = 0.
• If A is an event, and A = {O1, O2, …, On}, then
P(A) = P(O1) + P(O2) + … + P(On).
• Addition Rule (for when A and B are not mutually
exclusive):
P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
Example 6
In a process that manufactures aluminum cans, the
probability that a can has a flaw on its side is 0.02, the
probability that a can has a flaw on the top is 0.03, and
the probability that a can has a flaw on both the side and
the top is 0.01.
1. What is the probability that a randomly chosen can
has a flaw?
2. What is the probability that it has no flaw?
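A sketch (mine; the variable names are assumptions) of the arithmetic for Example 6: the addition rule gives the probability of at least one flaw, and the complement rule gives the probability of no flaw.

```python
# Sketch: addition rule and complement rule for Example 6.
p_side = 0.02     # P(flaw on side)
p_top = 0.03      # P(flaw on top)
p_both = 0.01     # P(flaw on side and top)

p_flaw = p_side + p_top - p_both    # P(side ∪ top), addition rule
p_no_flaw = 1 - p_flaw              # complement rule

print(round(p_flaw, 2), round(p_no_flaw, 2))   # 0.04 0.96
```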
Section 2.3: Conditional Probability
and Independence
Definition: A probability that is based on part of the
sample space is called a conditional probability.
Let A and B be events with P(B) ≠ 0. The conditional
probability of A given B is
P(A | B) = P(A ∩ B) / P(B).
Back to Example 6
What is the probability that a can will have a
flaw on the side, given that it has a flaw on the
top?
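A small sketch (mine) of the conditional-probability definition applied to the can data: divide the probability of a flaw on both the side and the top by the probability of a flaw on the top.

```python
# Sketch: P(side | top) = P(side ∩ top) / P(top) for Example 6.
p_top = 0.03          # P(flaw on top)
p_both = 0.01         # P(flaw on side and flaw on top)

p_side_given_top = p_both / p_top
print(round(p_side_given_top, 3))   # 0.333
```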
Independence
Definition: Two events A and B are independent if the
probability of each event remains the same whether
or not the other occurs.
If P(A) ≠ 0 and P(B) ≠ 0, then A and B are
independent if P(B|A) = P(B) or, equivalently,
P(A|B) = P(A).
If either P(A) = 0 or P(B) = 0, then A and B are
independent.
The Multiplication Rule
• If A and B are two events and P(B) ≠ 0, then
P(A ∩ B) = P(B)P(A|B).
• If A and B are two events and P(A) ≠ 0, then
P(A ∩ B) = P(A)P(B|A).
• If P(A) ≠ 0 and P(B) ≠ 0, then both of the above
hold.
• If A and B are two independent events, then
P(A ∩ B) = P(A)P(B).
Extended Multiplication Rule
• If A1, A2, …, An are independent events, then for each
collection Aj1, …, Ajm of these events,
P(Aj1 ∩ Aj2 ∩ … ∩ Ajm) = P(Aj1) P(Aj2) … P(Ajm).
• In particular,
P(A1 ∩ A2 ∩ … ∩ An) = P(A1) P(A2) … P(An).
Example 10
Of the microprocessors manufactured by a
certain process, 20% are defective. Five
microprocessors are chosen at random. Assume
they function independently. What is the
probability that they all work?
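A one-line application of the multiplication rule for independent events (a sketch of mine; variable names assumed): each microprocessor works with probability 0.8, so all five work with probability 0.8⁵.

```python
# Sketch: multiplication rule for independent events, Example 10.
p_work = 1 - 0.20        # each microprocessor works with probability 0.8
n = 5

p_all_work = p_work ** n
print(round(p_all_work, 4))   # 0.3277
```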
Law of Total Probability
Law of Total Probability:
If A1,…, An are mutually exclusive and
exhaustive events, and B is any event, then
P(B) = P(A1 ∩ B) + … + P(An ∩ B).
(They are exhaustive, which means their union covers the whole sample
space.)
Equivalently, if P(Ai) ≠ 0 for each Ai,
P(B) = P(B|A1)P(A1)+…+ P(B|An)P(An).
Example 11
Customers who purchase a certain make of car can
order an engine in any of three sizes. Of all cars sold,
45% have the smallest engine, 35% have the medium-size
one, and 20% have the largest. Of cars with the
smallest engine, 10% fail an emissions test within two
years of purchase, while 12% of those with the
medium-size engine and 15% of those with the largest engine
fail. What is the probability that a randomly chosen car
will fail an emissions test within two years?
Solution
Let B denote the event that a car fails an
emissions test within two years. Let A1 denote
the event that a car has a small engine, A2 the
event that a car has a medium-size engine, and A3
the event that a car has a large engine. Then
P(A1) = 0.45, P(A2) = 0.35, and P(A3) = 0.20.
Also, P(B|A1) = 0.10, P(B|A2) = 0.12, and
P(B|A3) = 0.15. What is the probability that a car
fails an emissions test within two years?
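A sketch (mine; variable names assumed) of the law-of-total-probability arithmetic for this example:

```python
# Sketch: law of total probability for Example 11.
p_engine = [0.45, 0.35, 0.20]        # P(A1), P(A2), P(A3)
p_fail_given = [0.10, 0.12, 0.15]    # P(B|A1), P(B|A2), P(B|A3)

p_fail = sum(pa * pb for pa, pb in zip(p_engine, p_fail_given))
print(round(p_fail, 4))              # 0.117
```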
Bayes’ Rule
Bayes’ Rule: Let A1,…, An be mutually exclusive and
exhaustive events, with P(Ai) ≠ 0 for each Ai. Let B be
any event with P(B) ≠ 0. Then
P(Ak | B) = P(B | Ak) P(Ak) / [ P(B | A1) P(A1) + … + P(B | An) P(An) ].
Example 12
The proportion of people in a given community who
have a certain disease is 0.005. A test is available to
diagnose the disease. If a person has the disease, the
probability that the test will produce a positive signal is
0.99. If a person does not have the disease, the
probability that the test will produce a positive signal is
0.01. If a person tests positive, what is the probability
that the person actually has the disease?
Solution
Let D represent the event that a person actually
has the disease, and let + represent the event that
the test gives a positive signal. We wish to find
P(D|+). We know P(D) = 0.005, P(+|D) = 0.99,
and P(+|Dᶜ) = 0.01.
Using Bayes’ rule:
P(D | +) = P(+ | D) P(D) / [ P(+ | D) P(D) + P(+ | Dᶜ) P(Dᶜ) ]
         = 0.99(0.005) / [ 0.99(0.005) + 0.01(0.995) ]
         ≈ 0.332.
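The same computation as a short Python sketch (mine, not from the slides):

```python
# Sketch: Bayes' rule for Example 12.
p_d = 0.005                 # P(D): prevalence of the disease
p_pos_given_d = 0.99        # P(+|D)
p_pos_given_not_d = 0.01    # P(+|Dᶜ)

numerator = p_pos_given_d * p_d
denominator = numerator + p_pos_given_not_d * (1 - p_d)
p_d_given_pos = numerator / denominator
print(round(p_d_given_pos, 3))   # 0.332
```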
Section 2.4: Random Variables
Definition: A random variable assigns a
numerical value to each outcome in a sample
space.
Definition: A random variable is discrete if its
possible values form a discrete set.
Example 13
The number of flaws in a 1-inch length of copper wire
manufactured by a certain process varies from wire to wire.
Overall, 48% of the wires produced have no flaws, 39%
have one flaw, 12% have two flaws, and 1% have three
flaws. Let X be the number of flaws in a randomly selected
piece of wire.
Then P(X = 0) = 0.48, P(X = 1) = 0.39, P(X = 2) = 0.12,
and P(X = 3) = 0.01. The list of possible values 0, 1, 2, and
3, along with the probabilities of each, provides a complete
description of the population from which X was drawn.
Probability Mass Function
• The description of the possible values of X and
the probabilities of each has a name: the
probability mass function.
Definition: The probability mass function
(pmf) of a discrete random variable X is the
function p(x) = P(X = x). The probability mass
function is sometimes called the probability
distribution.
Cumulative Distribution Function
• The probability mass function specifies the
probability that a random variable is equal to a given
value.
• A function called the cumulative distribution
function (cdf) specifies the probability that a random
variable is less than or equal to a given value.
• The cumulative distribution function of the random
variable X is the function F(x) = P(X ≤ x).
More on a Discrete Random Variable
Let X be a discrete random variable. Then
The probability mass function of X is the function
p(x) = P(X = x).
The cumulative distribution function of X is the
function F(x) = P(X ≤ x).
F(x) = Σ_{t ≤ x} p(t) = Σ_{t ≤ x} P(X = t).
Σ_x p(x) = Σ_x P(X = x) = 1, where the sum is over all
the possible values x of X.
Example 14
Recall the example of the number of flaws in a
randomly chosen piece of wire. The following is the
pmf: P(X = 0) = 0.48, P(X = 1) = 0.39, P(X = 2) = 0.12,
and P(X = 3) = 0.01.
For any value x, we compute F(x) by summing the
probabilities of all the possible values of X that are
less than or equal to x.
F(0) = P(X ≤ 0) = 0.48
F(1) = P(X ≤ 1) = 0.48 + 0.39 = 0.87
F(2) = P(X ≤ 2) = 0.48 + 0.39 + 0.12 = 0.99
F(3) = P(X ≤ 3) = 0.48 + 0.39 + 0.12 + 0.01 = 1
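The running-sum structure of the cdf is easy to see in code; the sketch below is mine, not part of the slides.

```python
# Sketch: building the cdf of Example 14 by accumulating the pmf.
from itertools import accumulate

values = [0, 1, 2, 3]
pmf = [0.48, 0.39, 0.12, 0.01]

cdf = list(accumulate(pmf))           # running sums: F(0), F(1), F(2), F(3)
for x, F in zip(values, cdf):
    print(f"F({x}) = {round(F, 2)}")  # 0.48, 0.87, 0.99, 1.0
```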
Mean and Variance for Discrete Random
Variables
• The mean (or expected value) of X is given by
μ_X = Σ_x x P(X = x),
where the sum is over all possible values of X.
• The variance of X is given by
σ_X² = Σ_x (x – μ_X)² P(X = x)
     = Σ_x x² P(X = x) – μ_X².
• The standard deviation is the square root of the
variance.
Example 15
A certain industrial process is brought down for
recalibration whenever the quality of the items produced
falls below specifications. Let X represent the number
of times the process is recalibrated during a week, and
assume that X has the following probability mass
function.
x      0     1     2     3     4
p(x)   0.35  0.25  0.20  0.15  0.05
Find the mean and variance of X.
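A sketch (mine) of the discrete mean and variance formulas applied to this pmf; the printed values can be checked by hand.

```python
# Sketch: mean and variance of the discrete random variable in Example 15.
values = [0, 1, 2, 3, 4]
pmf = [0.35, 0.25, 0.20, 0.15, 0.05]

mean = sum(x * p for x, p in zip(values, pmf))                  # μ_X
variance = sum((x - mean) ** 2 * p for x, p in zip(values, pmf))  # σ_X²

print(round(mean, 2))       # 1.3
print(round(variance, 2))   # 1.51
```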
Continuous Random Variables
• A random variable is continuous if its probabilities
are given by areas under a curve.
• The curve is called a probability density function
(pdf) for the random variable. Sometimes the pdf is
called the probability distribution.
• The function f(x) is the probability density function of
X.
• Let X be a continuous random variable with
probability density function f(x). Then
∫_{–∞}^{∞} f(x) dx = 1.
Computing Probabilities
Let X be a continuous random variable with
probability density function f(x). Let a and b be
any two numbers, with a < b. Then
P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = ∫_a^b f(x) dx.
In addition,
P(X ≤ a) = P(X < a) = ∫_{–∞}^{a} f(x) dx
P(X ≥ a) = P(X > a) = ∫_{a}^{∞} f(x) dx.
More on Continuous Random Variables
• Let X be a continuous random variable with
probability density function f(x). The cumulative
distribution function of X is the function
F(x) = P(X ≤ x) = ∫_{–∞}^{x} f(t) dt.
• The mean of X is given by
μ_X = ∫_{–∞}^{∞} x f(x) dx.
• The variance of X is given by
σ_X² = ∫_{–∞}^{∞} (x – μ_X)² f(x) dx
     = ∫_{–∞}^{∞} x² f(x) dx – μ_X².
Example 17
A hole is drilled in a sheet-metal component, and then a
shaft is inserted through the hole. The shaft clearance is
equal to the difference between the radius of the hole and
the radius of the shaft. Let the random variable X denote the
clearance, in millimeters. The probability density function
of X is
f(x) = 1.25(1 – x⁴) for 0 < x < 1, and f(x) = 0 otherwise.
1. Components with clearances larger than 0.8 mm must be
scrapped. What proportion of components are scrapped?
2. Find the cumulative distribution function F(x).
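A numerical sketch (mine, not from the slides) of both parts, assuming SciPy is available for the integrals; the density and the 0.8 mm cutoff are taken from the example.

```python
# Sketch: Example 17 via numerical integration (SciPy assumed available).
from scipy.integrate import quad

def f(x):
    # probability density function of the clearance, in mm
    return 1.25 * (1 - x**4) if 0 < x < 1 else 0.0

# 1. Proportion scrapped: P(X > 0.8) = integral of f from 0.8 to 1.
p_scrapped, _ = quad(f, 0.8, 1)
print(round(p_scrapped, 4))          # about 0.0819

# 2. Cdf at a point: F(x) = integral of f from 0 up to x.
def F(x):
    if x <= 0:
        return 0.0
    if x >= 1:
        return 1.0
    value, _ = quad(f, 0, x)
    return value

print(round(F(0.5), 4))              # F(0.5) = 1.25*(0.5 - 0.5**5/5) ≈ 0.6172
```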
Median
Let X be a continuous random variable with probability
density function f(x) and cumulative distribution function
F(x).
• The median of X is the point x_m that solves the
equation
F(x_m) = P(X ≤ x_m) = ∫_{–∞}^{x_m} f(x) dx = 0.5.
Percentiles
• If p is any number between 0 and 100, the pth
percentile is the point x_p that solves the equation
F(x_p) = P(X ≤ x_p) = ∫_{–∞}^{x_p} f(x) dx = p/100.
• The median is the 50th percentile.
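As an illustration (mine), the median of the clearance density from Example 17 can be found by solving F(x_m) = 0.5 numerically, assuming SciPy is available.

```python
# Sketch: median of the Example 17 density, F(x) = 1.25*(x - x**5/5) on (0, 1).
from scipy.optimize import brentq

def F(x):
    # cdf of the clearance on 0 < x < 1 (antiderivative of 1.25*(1 - x**4))
    return 1.25 * (x - x**5 / 5)

x_m = brentq(lambda x: F(x) - 0.5, 0.0, 1.0)   # root of F(x) - 0.5 in (0, 1)
print(round(x_m, 3))                            # about 0.402
```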
Section 2.5:
Linear Functions of Random Variables
If X is a random variable, and a and b are
constants, then
μ_{aX+b} = a μ_X + b,
σ_{aX+b}² = a² σ_X²,
σ_{aX+b} = |a| σ_X.
More Linear Functions
If X and Y are random variables, and a and b are
constants, then
μ_{aX+bY} = μ_{aX} + μ_{bY} = a μ_X + b μ_Y.
More generally, if X1, …, Xn are random
variables and c1, …, cn are constants, then the
mean of the linear combination c1X1 + … + cnXn is
given by
μ_{c1X1 + c2X2 + … + cnXn} = c1 μ_{X1} + c2 μ_{X2} + … + cn μ_{Xn}.
Two Independent Random Variables
If X and Y are independent random variables,
and S and T are sets of numbers, then
P(X ∈ S and Y ∈ T) = P(X ∈ S) P(Y ∈ T).
More generally, if X1, …, Xn are independent
random variables, and S1, …, Sn are sets, then
P(X1 ∈ S1, X2 ∈ S2, …, Xn ∈ Sn)
 = P(X1 ∈ S1) P(X2 ∈ S2) … P(Xn ∈ Sn).
Variance Properties
If X1, …, Xn are independent random variables,
then the variance of the sum X1 + … + Xn is given by
σ_{X1 + X2 + … + Xn}² = σ_{X1}² + σ_{X2}² + … + σ_{Xn}².
If X1, …, Xn are independent random variables
and c1, …, cn are constants, then the variance of
the linear combination c1X1 + … + cnXn is given by
σ_{c1X1 + c2X2 + … + cnXn}² = c1² σ_{X1}² + c2² σ_{X2}² + … + cn² σ_{Xn}².
More Variance Properties
If X and Y are independent random variables
with variances σ_X² and σ_Y², then the variance of
the sum X + Y is
σ_{X+Y}² = σ_X² + σ_Y².
The variance of the difference X – Y is
σ_{X–Y}² = σ_X² + σ_Y².
Example 18
A piston is placed inside a cylinder. The clearance is
the distance between the edge of the piston and the wall
of the cylinder and is equal to one-half the difference
between the cylinder diameter and the piston diameter.
Assume the piston diameter has a mean of 80.85 cm and
a standard deviation of 0.02 cm. Assume the cylinder
diameter has a mean of 80.95 cm and a standard
deviation of 0.03 cm. Find the mean clearance.
Assuming that the piston and cylinder are chosen
independently, find the standard deviation of the
clearance.
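A sketch (mine, not from the slides) of the linear-combination rules applied to Example 18; the clearance is treated as 0.5 × (cylinder diameter) - 0.5 × (piston diameter) with independent diameters.

```python
# Sketch: mean and standard deviation of the clearance in Example 18.
import math

mu_piston, sd_piston = 80.85, 0.02      # piston diameter mean and sd (cm)
mu_cylinder, sd_cylinder = 80.95, 0.03  # cylinder diameter mean and sd (cm)

mu_clearance = 0.5 * mu_cylinder - 0.5 * mu_piston
var_clearance = 0.5**2 * sd_cylinder**2 + 0.5**2 * sd_piston**2  # independence
sd_clearance = math.sqrt(var_clearance)

print(round(mu_clearance, 3))   # 0.05 cm
print(round(sd_clearance, 3))   # about 0.018 cm
```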
Independence and Simple Random
Samples
Definition: If X1, …, Xn is a simple
random sample, then X1, …, Xn may be
treated as independent random variables,
all from the same population.
Properties of X̄
If X1, …, Xn is a simple random sample from a
population with mean μ and variance σ², then the
sample mean X̄ is a random variable with
μ_X̄ = μ,
σ_X̄² = σ²/n.
The standard deviation of X̄ is
σ_X̄ = σ/√n.
Example 19
A process that fills plastic bottles with a
beverage has a mean fill volume of 2.013 L and
a standard deviation of 0.005 L. A case contains
24 bottles. Assuming that the bottles in a case
are a simple random sample of bottles filled by
this method, find the mean and standard
deviation of the average volume per bottle in a
case.
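A sketch (mine) of the sample-mean rules applied to Example 19:

```python
# Sketch: mean and standard deviation of the average fill volume per case.
import math

mu, sigma, n = 2.013, 0.005, 24   # fill mean (L), fill sd (L), bottles per case

mu_xbar = mu                      # mean of the average fill volume
sd_xbar = sigma / math.sqrt(n)    # standard deviation of the average

print(mu_xbar)                    # 2.013 L
print(round(sd_xbar, 5))          # about 0.00102 L
```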
Summary
• Probability and rules
• Conditional probability
• Independence
• Random variables: discrete and continuous
• Probability mass functions
Summary Continued
• Probability density functions
• Cumulative distribution functions
• Means and variances for random variables
• Linear functions of random variables
• Mean and variance of a sample mean