Sets, Combinatorics, Probability, and Number Theory
Mathematical Structures for Computer Science
Chapter 3
Copyright © 2006 W.H. Freeman & Co.
MSCS Slides
Probability
Introduction to Finite Probability
If some action can produce Y different outcomes and
X of those Y outcomes are of special interest, we may
want to know how likely it is that one of the X
outcomes will occur.
For example, what are the chances of:
Getting “heads” when a coin is tossed?
• The probability of getting heads is one out of two, or 1/2.
Getting a 3 with a roll of a die?
• A six-sided die has six possible results. The 3 is exactly one of
these possibilities, so the probability of rolling a 3 is 1/6.
Drawing either the ace of clubs or the queen of
diamonds from a standard deck of cards?
• A standard card deck contains 52 cards, two of which are
successful results, so the probability is 2/52 or 1/26.
Introduction to Finite Probability
The set of all possible outcomes of an action is called the
sample space S of the action.
Any subset of the sample space is called an event.
If S is a finite set of equally likely outcomes, then the
probability P(E) of event E is defined to be:
P(E) = |E| / |S|
For example, the probability of flipping a coin twice and having
both come up as heads is:
E = {HH}, S = {HH, HT, TH, TT}, so P(E) = |E| / |S| = 1/4
Probability involves finding the size of sets, either of the sample
space or of the event of interest.
We may need to use the addition or multiplication principles,
the principle of inclusion and exclusion, or the formula for the
number of combinations of r things from n objects.
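As a quick illustration of the definition (a sketch added here, not part of the slides), the following Python snippet enumerates the two-coin-flip sample space and computes P(E) for the event that both flips come up heads:

```python
from fractions import Fraction
from itertools import product

# Sample space for two coin flips: all ordered pairs of H/T.
sample_space = [''.join(flips) for flips in product('HT', repeat=2)]  # ['HH', 'HT', 'TH', 'TT']

# Event of interest: both flips come up heads.
event = [outcome for outcome in sample_space if outcome == 'HH']

# For equally likely outcomes, P(E) = |E| / |S|.
p_event = Fraction(len(event), len(sample_space))
print(p_event)  # 1/4
```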
Introduction to Finite Probability
Using the definition of P(E) as seen in the previous
slide, we can make some observations for any events
E1 and E2 from a sample space S of equally likely
outcomes:
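For example, the following standard facts follow directly from the definition P(E) = |E| / |S| (summarized here as a sketch; they may not match the slide's own list exactly):

```latex
% Standard consequences of P(E) = |E| / |S| for events E1, E2 drawn from S
\begin{align*}
  0 \le P(E_1) &\le 1 \\
  P(\varnothing) = 0, \quad P(S) &= 1 \\
  P(E_1 \cup E_2) &= P(E_1) + P(E_2) - P(E_1 \cap E_2) \\
  P(E_1') &= 1 - P(E_1)
\end{align*}
```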
Probability Distributions
A way to look at problems where not all outcomes
are equally likely is to assign a probability
distribution to the sample space.
Consider each distinct outcome in the original
sample space as an event and assign it a probability.
If there are k different outcomes in the sample space
and each outcome xi is assigned a probability p(xi),
the following rules apply:
1. 0 ≤ p(xi) ≤ 1
• Because any probability value must fall within this range.
2. p(x1) + p(x2) + ... + p(xk) = 1
• The union of all of these k disjoint outcomes is the sample space S, and the probability of S is 1.
Example
For a fair die, the probability of rolling a three is 1/6.
Sample space S = {1, 2, 3, 4, 5, 6}, event space E =
{3}
Suppose the die is loaded so that 4 comes up three
times more often than any of the other numbers. Now
the sample space S = {1, 2, 3, 4a, 4b, 4c, 5, 6}, so the
probability of rolling a 3 is only 1/8.
The probability distribution for the loaded die is
xi      1     2     3     4     5     6
p(xi)   1/8   1/8   1/8   3/8   1/8   1/8
The sum of p(xi) over all xi above is 1, as we know it should be.
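A small Python sketch of this distribution (a dictionary-based illustration, not from the slides) checks both rules and reads off the individual probabilities:

```python
from fractions import Fraction

# Probability distribution for the loaded die: 4 is three times as likely.
loaded_die = {1: Fraction(1, 8), 2: Fraction(1, 8), 3: Fraction(1, 8),
              4: Fraction(3, 8), 5: Fraction(1, 8), 6: Fraction(1, 8)}

# Rule 1: every probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in loaded_die.values())

# Rule 2: the probabilities sum to 1.
assert sum(loaded_die.values()) == 1

print(loaded_die[3])  # 1/8 -- probability of rolling a 3
print(loaded_die[4])  # 3/8 -- probability of rolling a 4
```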
Conditional Probability
Flip a coin twice, sample space = {HH, HT, TH, TT}. The
chance that we get two tails is 1/4.
Let E1 be the event that the first flip is a tail, so now
E1 = {TH, TT} and P(E1) = | E1 | / |S| = 1/2
Let E2 be the event that the second flip is a tail; E2 = {HT, TT}.
The event of interest, tossing two tails in a row, is E1 ∩ E2, or
{TT}. So
P(E1 ∩ E2) = |E1 ∩ E2| / |S| = 1/4
Now suppose we know that the first toss was a tail; does this
change the total probability? It does, because now S is limited:
S = {TH, TT}.
P(E2 | E1) (the probability of E2 given E1) = |E2 ∩ S| / |S| = |{TT}| / |{TH, TT}| = 1/2
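A short Python sketch (illustrative, not from the slides) reproduces this calculation by restricting the sample space to E1:

```python
from fractions import Fraction
from itertools import product

sample_space = [''.join(f) for f in product('HT', repeat=2)]  # ['HH', 'HT', 'TH', 'TT']
e1 = {s for s in sample_space if s[0] == 'T'}                 # first flip is a tail
e2 = {s for s in sample_space if s[1] == 'T'}                 # second flip is a tail

# Knowing E1 happened restricts the sample space to E1 itself.
restricted = [s for s in sample_space if s in e1]             # ['TH', 'TT']

# P(E2 | E1): proportion of the restricted space in which E2 also occurs.
p_e2_given_e1 = Fraction(sum(1 for s in restricted if s in e2), len(restricted))
print(p_e2_given_e1)  # 1/2
```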
Conditional Probability - Definition
Given events E1 and E2, the conditional probability
of E2 given E1, P(E2 | E1), is:
P(E2 | E1) = P(E1 ∩ E2) / P(E1)
For example, in a drug study of a group of patients,
17% responded positively to compound A, 34%
responded
positively to compound B, and 8%
responded positively to both.
The probability that a patient responded positively to
compound B given that he or she responded positively
to A is:
P(B | A) = P(A ∩ B) / P(A) = 0.08 / 0.17 ≈ 0.47
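The same computation can be sketched in Python (the variable names are illustrative):

```python
# Drug study figures from the slide, expressed as probabilities.
p_a = 0.17        # responded positively to compound A
p_b = 0.34        # responded positively to compound B
p_a_and_b = 0.08  # responded positively to both A and B

# Definition of conditional probability: P(B | A) = P(A ∩ B) / P(A).
p_b_given_a = p_a_and_b / p_a
print(round(p_b_given_a, 2))  # 0.47
```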
Conditional Probability
If P(E2 | E1) = P(E2), then E2 is just as likely to
happen whether E1 happens or not. In this case, E1 and
E2 are said to be independent events.
Then P(E1 ∩ E2) = P(E1) * P(E2)
This can be extended to any finite number of
independent events and can also be used to test
whether events are independent.
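For example, a Python sketch (illustrative, not from the slides) can test independence for the two coin-flip events E1 and E2 defined earlier:

```python
from fractions import Fraction
from itertools import product

sample_space = [''.join(f) for f in product('HT', repeat=2)]
e1 = {s for s in sample_space if s[0] == 'T'}   # first flip is a tail
e2 = {s for s in sample_space if s[1] == 'T'}   # second flip is a tail

def prob(event):
    """P(E) = |E| / |S| for equally likely outcomes."""
    return Fraction(len(event), len(sample_space))

# Independence test: P(E1 ∩ E2) equals P(E1) * P(E2) exactly when the events are independent.
print(prob(e1 & e2) == prob(e1) * prob(e2))  # True
```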
Expected Value
If the values in the sample space are not numerical, we
may find a function X: S → R that associates a
numerical value with each element in the sample
space. Such a function is called a random variable.
Given a sample space S to which a random variable X
and a probability distribution p have been assigned,
the expected value, or weighted average, of the
random variable is:
E(X) = Σ (i = 1 to n) X(xi) * p(xi)
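As a sketch (not from the slides), the expected value of the loaded die from the earlier example, with X simply returning the face value, can be computed in Python:

```python
from fractions import Fraction

# Loaded-die distribution from the earlier example.
loaded_die = {1: Fraction(1, 8), 2: Fraction(1, 8), 3: Fraction(1, 8),
              4: Fraction(3, 8), 5: Fraction(1, 8), 6: Fraction(1, 8)}

# Random variable X maps each outcome to its face value.
def X(outcome):
    return outcome

# E(X) = sum over all outcomes x_i of X(x_i) * p(x_i)
expected_value = sum(X(x) * p for x, p in loaded_die.items())
print(expected_value)  # 29/8, i.e. 3.625
```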
Average Case Analysis of Algorithms
Expected value may help give an average case
analysis of an algorithm, i.e., tell the expected
“average” amount of work performed by an algorithm.
Let the sample space S be the set of all possible inputs
to the algorithm.
We assume that S is finite.
Let the random variable X assign to each member of S
the number of work units required to execute the
algorithm on that input.
And let p be a probability distribution on S.
The expected number of work units is given as:
E(X) = Σ (i = 1 to n) X(xi) * p(xi)
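As an illustrative sketch (this particular example is not from the slides), take sequential search on a list of n items, assume the target is present and equally likely to be at any position, and let the work X(xi) be the i comparisons needed when the target is at position i. The expected work is then:

```python
from fractions import Fraction

def expected_sequential_search_work(n):
    """Expected number of comparisons for sequential search, assuming the
    target is present and equally likely to be at any of the n positions."""
    # Input x_i: "target is at position i"; work X(x_i) = i; p(x_i) = 1/n.
    return sum(i * Fraction(1, n) for i in range(1, n + 1))

print(expected_sequential_search_work(5))    # 3
print(expected_sequential_search_work(100))  # 101/2, i.e. 50.5
```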