Chapter 4 Probability and Probability Distributions
Now that you have learned to describe a data set, how can you
use sample data to draw conclusions about the sampled
populations? The technique involves a statistical tool called
probability. To use this tool correctly, you must first understand
how it works. The first part of this chapter will teach you the new
language of probability, presenting the basic concepts with
simple examples.
©1998 Brooks/Cole Publishing/ITP
The variables that we measured in Chapters 1 and 2 can now
be redefined as random variables, whose values depend on
the chance selection of the elements in the sample. Using
probability as a tool, you can develop probability distributions
that serve as models for discrete random variables, and you
can describe these random variables using a mean and
standard deviation similar to those in Chapter 2.
Specific Topics
1. Experiments and events
2. Relative frequency definition of probability
3. Counting rules (combinations)
4. Intersections, unions, and complements
5. Conditional probability and independence
6. Additive and Multiplicative Rules of Probability
7. Random variables
8. Probability distributions for discrete random variables
9. The mean and standard deviation for a discrete random variable
4.1 and 4.2 The Role of Probability in Statistics; Events and the Sample Space
When a population is known, probability is used to describe the likelihood of observing a particular sample outcome, e.g., the 50% or .5 chance of getting a head (and equally a tail) in a fair toss of a coin.
When the population is unknown and only a sample from that
population is available, probability is used in making statements
about the makeup of the population, that is, in making statistical
inferences.
We use the term experiment to describe either uncontrolled
events in nature or controlled situations in a laboratory.
Definition: An experiment is the process by which an observation
(or measurement) is obtained.
Definition: An event is an outcome of an experiment.
See Example 4.1 for a listing of some events.
Definition: Two events are mutually exclusive if, when one event
occurs, the other cannot, and vice versa, e.g., a head or a tail in
a toss of a coin.
Definition: An event that cannot be decomposed is called a simple
event, e.g., a head or a tail in the toss of a coin.
Definition: The set of all simple events is called the sample space,
e.g., {head, tail} for the toss of a coin.
Definition: An event is a collection of one or more simple events,
e.g., the toss of two heads in a row.
Example 4.1
Experiment: Toss a die and observe the number that appears
on the upper face. List some events of interest.
Solution
Event A : Observe an odd number
Event B : Observe a number less than 4
Event E1: Observe a 1
Event E2: Observe a 2
Event E3: Observe a 3
Event E4: Observe a 4
Event E5: Observe a 5
Event E6: Observe a 6
Figure 4.1 Venn diagram for die tossing
Venn diagram: The outer box represents the sample space,
which contains all of the simple events; the inner circles
represent events and contain simple events.
See Figure 4.1 for a Venn diagram for die tossing.
Also see Examples 4.2 and 4.3 for examples of experiments.
Tree diagram: For displaying the sample space of an
experiment, each successive level of branching in the tree
corresponds to a step required to generate the final outcome.
See Example 4.4 for an example of a tree diagram.
Figure 4.2 Tree diagram for Example 4.4
4.3 Calculating Probabilities Using Simple Events
Relative frequency = Frequency / n
Requirements for Simple-Event Probabilities:
- Each probability must lie between 0 and 1.
- The sum of the probabilities for all simple events in S equals 1.
Definition: The probability of an event A is equal to the sum of the
probabilities of the simple events contained in A.
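To make this definition concrete, here is a minimal Python sketch (our own illustration, not from the text) that sums simple-event probabilities for the die events of Example 4.1:

from fractions import Fraction

# Simple events for one toss of a fair die, each with probability 1/6.
p = {face: Fraction(1, 6) for face in range(1, 7)}

A = {1, 3, 5}   # event A: observe an odd number
B = {1, 2, 3}   # event B: observe a number less than 4

print(sum(p[e] for e in A))   # P(A) = 3/6 = 1/2
print(sum(p[e] for e in B))   # P(B) = 3/6 = 1/2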
Example 4.5
Toss two fair coins and record the outcome. Find the probability
of observing exactly one head in the two tosses.
Solution
The sample space contains four equally likely simple events, HH, HT, TH, and TT, each with probability 1/4. Exactly one head occurs in the simple events HT and TH, so
P(exactly one head) = P(HT) + P(TH) = 1/4 + 1/4 = 1/2
Calculating the probability of an event:
1. List all simple events in the sample space.
2. Assign an appropriate probability to each simple event.
3. Determine which simple events result in the event of interest.
4. Sum the probabilities of the simple events that result in the event of interest.
Be careful to satisfy two conditions in your calculation:
- Include all simple events in the sample space.
- Assign realistic probabilities to the simple events.
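The four steps translate directly into code. A minimal sketch (the two-dice experiment is our own illustration, not one of the text's examples):

from fractions import Fraction
from itertools import product

# Step 1: list all simple events (two tosses of a fair die).
sample_space = list(product(range(1, 7), repeat=2))   # 36 ordered pairs

# Step 2: assign an appropriate probability to each simple event.
p = {e: Fraction(1, 36) for e in sample_space}

# Step 3: determine which simple events result in the event of interest.
event = [e for e in sample_space if sum(e) == 7]   # the two faces total 7

# Step 4: sum the probabilities of those simple events.
print(sum(p[e] for e in event))   # 1/6

Both conditions are satisfied by construction: product() enumerates every simple event, and the 36 equal probabilities are realistic for fair dice.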
Counting Rule for Combinations: The number of distinct combinations of n objects that can be formed by taking them r at a time is
C(n, r) = n! / [r! (n − r)!]
See Examples 4.14 and 4.15 for examples of the counting rules, including the use of counting rules to solve a probability problem.
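As a quick check of the formula, Python's standard-library math.comb (available in Python 3.8 and later) computes the same quantity:

from math import comb, factorial

n, r = 5, 2
assert comb(n, r) == factorial(n) // (factorial(r) * factorial(n - r))
print(comb(n, r))   # 10 ways to choose 2 objects from 5

# One particular pair, when all C(5, 2) pairs are equally likely,
# has probability 1 / C(5, 2).
print(1 / comb(n, r))   # 0.1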
4.5 Event Composition and Event Relations
Compound events can be formed by unions or intersections of
other events.
Definition: The intersection of events A and B, denoted by A ∩ B, is the event that both A and B occur.
Definition: The union of events A and B, denoted by A ∪ B, is the event that A or B or both occur.
See Figures 4.7 and 4.8 for Venn diagrams illustrating union
and intersection.
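Python's built-in set type mirrors these definitions; a small sketch reusing the die events of Example 4.1:

A = {1, 3, 5}   # observe an odd number
B = {1, 2, 3}   # observe a number less than 4

print(A & B)    # intersection A ∩ B: {1, 3}
print(A | B)    # union A ∪ B: {1, 2, 3, 5}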
Figure 4.7 Venn diagram of A ∪ B
Figure 4.8 Venn diagram of A ∩ B
Example 4.16 illustrates the use of a Venn diagram to
determine probabilities.
Definition: When two events A and B are mutually exclusive,
it means that when A occurs, B cannot, and vice versa.
Mutually exclusive events are also referred to as disjoint
events.
Figure 4.9 Two disjoint events
When events A and B are mutually exclusive:
P(A ∩ B) = 0
and
P(A ∪ B) = P(A) + P(B)
If P(A) and P(B) are known, we do not need to break (A ∪ B) down into simple events; we can simply sum P(A) and P(B).
See Example 4.17.
Definition: The complement of an event A, denoted Aᶜ, consists of all the simple events in the sample space S that are not in A.
Figure 4.10 The complement of an event
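Because A and Aᶜ together make up the sample space S, P(A) + P(Aᶜ) = 1, a fact used in Example 4.24 below. A small sketch with the die events (our own illustration):

from fractions import Fraction

S = set(range(1, 7))   # sample space for one toss of a die
A = {1, 3, 5}          # observe an odd number
A_c = S - A            # complement A^c: {2, 4, 6}

def P(E):
    return Fraction(len(E), len(S))   # equally likely simple events

assert P(A) + P(A_c) == 1
print(P(A_c))   # 1/2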
4.6 Conditional Probability and Independence
The conditional probability of A, given that B has occurred, is
denoted as P(A | B), where the vertical bar is read “given” and
the events appearing to the right of the bar are those that you
know have occurred.
Definition: The conditional probability of B, given that A has occurred, is
P(B | A) = P(A ∩ B) / P(A), provided P(A) ≠ 0
The conditional probability of A, given that B has occurred, is
P(A | B) = P(A ∩ B) / P(B), provided P(B) ≠ 0
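As a check on the definition, a small Python sketch (our own) evaluates P(A | B) for the die events of Example 4.1:

from fractions import Fraction

S = set(range(1, 7))
A = {1, 3, 5}   # observe an odd number
B = {1, 2, 3}   # observe a number less than 4

def P(E):
    return Fraction(len(E), len(S))   # equally likely simple events

print(P(A & B) / P(B))   # P(A | B) = P(A ∩ B) / P(B) = (1/3) / (1/2) = 2/3

Knowing that the toss is less than 4 raises the probability of an odd number from 1/2 to 2/3, so A and B are dependent.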
Definition: Two events A and B are said to be independent if and
only if either
P(A | B) = P(A)  or  P(B | A) = P(B)
otherwise, the events are said to be dependent.
Two events are independent if the occurrence or nonoccurrence
of one of the events does not change the probability of the
occurrence of the other event.
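A numerical sketch of this idea (event C below is our own addition for illustration; it does not appear in the text):

from fractions import Fraction

S = set(range(1, 7))

def P(E):
    return Fraction(len(E), len(S))

A = {1, 3, 5}      # observe an odd number
B = {1, 2, 3}      # observe a number less than 4
C = {1, 2, 3, 4}   # observe a number less than 5

print(P(A & B) == P(A) * P(B))   # False: A and B are dependent
print(P(A & C) == P(A) * P(C))   # True:  A and C are independent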
Additive Rule of Probability: Given two events, A and B, the probability of their union, A ∪ B, is
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
See Figure 4.12 for a representation of the Additive Rule.
Figure 4.12
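A quick numerical check of the rule with the die events (note that in Python, "|" and "&" below are the set union and intersection operators, not conditioning):

from fractions import Fraction

S = set(range(1, 7))
A = {1, 3, 5}   # observe an odd number
B = {1, 2, 3}   # observe a number less than 4

def P(E):
    return Fraction(len(E), len(S))

assert P(A | B) == P(A) + P(B) - P(A & B)   # Additive Rule
print(P(A | B))   # 2/3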
Multiplicative Rule of Probability: The probability that both of two events, A and B, occur is
P(A ∩ B) = P(A) P(B | A) = P(B) P(A | B)
If A and B are independent,
P(A ∩ B) = P(A) P(B)
Similarly, if A, B, and C are mutually independent events, then the probability that A, B, and C all occur is
P(A ∩ B ∩ C) = P(A) P(B) P(C)
See Example 4.21 for an example of the Multiplicative Rule of Probability.
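A small numerical sketch (the card-drawing setup is our own illustration, not from the text):

from fractions import Fraction

# Draw two cards from a standard 52-card deck without replacement.
# P(both aces) = P(first ace) * P(second ace | first ace).
print(Fraction(4, 52) * Fraction(3, 51))   # 1/221

# With replacement the draws are independent:
# P(both aces) = P(first ace) * P(second ace).
print(Fraction(4, 52) * Fraction(4, 52))   # 1/169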
Example 4.24
Consider the experiment in which three coins are tossed. Let
A be the event that the toss results in at least one head. Find
P(A).
Solution
Aᶜ is the collection of simple events implying the event "three tails," and because Aᶜ is the complement of A,
P(A) = 1 − P(Aᶜ)
Then, applying the Multiplicative Rule, we have
P(Aᶜ) = P(TTT) = (1/2)(1/2)(1/2) = 1/8
and
P(A) = 1 − P(Aᶜ) = 1 − 1/8 = 7/8
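As an informal check (a simulation sketch, not part of the text), repeated tosses of three fair coins should give a relative frequency of "at least one head" near 7/8 = .875:

import random

random.seed(1)   # fixed seed so the run is repeatable
trials = 100_000
hits = sum(
    any(random.random() < 0.5 for _ in range(3))   # at least one head
    for _ in range(trials)
)
print(hits / trials)   # close to 0.875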
4.8 Discrete Random Variables and Their Probability Distributions
Definition: A variable x is a random variable if the value that it
assumes, corresponding to the outcome of an experiment, is a
chance or random event.
Definition: The probability distribution for a discrete random
variable is a formula, table, or graph that provides p(x), the
probability associated with each of the values of x.
Requirements for a Discrete Probability Distribution:
– 0 ≤ p(x) ≤ 1
– Σ p(x) = 1
See Example 4.25 for an example involving probability
distributions.
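Both requirements are easy to verify mechanically. A minimal sketch with an illustrative table of values (not the distribution of Example 4.25):

from fractions import Fraction

# A candidate probability distribution p(x) for a discrete x.
p = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

assert all(0 <= px <= 1 for px in p.values())   # each p(x) lies in [0, 1]
assert sum(p.values()) == 1                     # the probabilities sum to 1
print("p(x) is a valid discrete probability distribution")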
Definition: Let x be a discrete random variable with probability distribution p(x). The mean or expected value of x is given as
μ = E(x) = Σ x p(x)
where the elements are summed over all values of the random variable x.
Definition: Let x be a discrete random variable with probability distribution p(x) and mean μ. The variance of x is
σ² = E[(x − μ)²] = Σ (x − μ)² p(x)
where the summation is over all values of the random variable x.
Definition: The standard deviation σ of a random variable x is equal to the square root of its variance.
See Examples 4.26, 4.27, and 4.28 for examples of the calculation of the mean, variance, and standard deviation.
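A short sketch putting the three definitions together (the distribution below is our own illustration, not taken from those examples):

from math import sqrt

p = {0: 0.25, 1: 0.50, 2: 0.25}   # illustrative distribution p(x)

mu = sum(x * px for x, px in p.items())                # μ = Σ x p(x)
var = sum((x - mu) ** 2 * px for x, px in p.items())   # σ² = Σ (x − μ)² p(x)
sigma = sqrt(var)                                      # σ = √(σ²)
print(mu, var, sigma)   # 1.0 0.5 0.7071...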