Transcript 7 - Rice University

Bayes' Theorem and Review of Concepts Learnt
Krishna V. Palem
Kenneth and Audrey Kennedy Professor of Computing
Department of Computer Science, Rice University
1
Content
 Bayes’ Theorem: Proof and In-class exercise solution
 Conditional Probability : Statistical Independence Revisited
 General Product Rule
 Review of concepts learnt so far
 Discussion of last year’s class projects
2
What is Bayes Theorem?
 Bayes' theorem relates the conditional and unconditional probabilities of
events A and B, where B has a non-zero probability:
P(A|B) = P(B|A) P(A) / P(B)
 Each term in Bayes' theorem has a conventional name:
 P(A) is the prior probability or unconditional probability of A.
 It is "prior" in the sense that it does not take into account any information about
B.
 P(A|B) is the conditional probability of A, given B.
 P(B|A) is the conditional probability of B given A.
 P(B) is the prior or marginal probability of B
3
Alternate Form of Bayes Theorem
 Consider that A can happen in exactly one of two mutually exclusive ways: A1 and A2
 If we want to compute the probability of A1 given B, then
P(A1|B) = P(B|A1) P(A1) / P(B)
 But P(B) can be written as
P(B) = P(B|A1) P(A1) + P(B|A2) P(A2)
 Hence, we get
P(A1|B) = P(B|A1) P(A1) / [ P(B|A1) P(A1) + P(B|A2) P(A2) ]
 More generally, for mutually exclusive and exhaustive events A1, A2, ..., An, Bayes' theorem can be written as
P(Ai|B) = P(B|Ai) P(Ai) / [ P(B|A1) P(A1) + P(B|A2) P(A2) + ... + P(B|An) P(An) ]
4
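A minimal Python sketch of the alternate form above (not part of the original slides; the function and argument names are illustrative choices):

```python
def posterior_a1_given_b(p_b_given_a1, p_a1, p_b_given_a2, p_a2):
    """Alternate form of Bayes' theorem for two mutually exclusive events A1 and A2."""
    numerator = p_b_given_a1 * p_a1
    denominator = p_b_given_a1 * p_a1 + p_b_given_a2 * p_a2
    return numerator / denominator

# Made-up numbers, only to exercise the formula:
# P(B|A1) = 0.5, P(A1) = 0.3, P(B|A2) = 0.2, P(A2) = 0.7
print(posterior_a1_given_b(0.5, 0.3, 0.2, 0.7))   # 0.15 / 0.29, about 0.517
```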
Understanding Bayes Theorem
 Bayes theorem is often used to compute posterior
probabilities given observations.
 For example, a patient may be observed to have certain symptoms.
 Bayes' theorem can be used to compute the probability that a proposed diagnosis
is correct, given that observation.
 Intuitively, Bayes’ theorem in this form describes the way
in which one's beliefs about observing ‘A’ are updated by
having observed ‘B’.
5
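To make the diagnosis example concrete, here is a small hedged sketch; the prevalence, sensitivity, and false-positive rate below are assumptions chosen only for illustration, not figures from the slides:

```python
# Hypothetical numbers, chosen only to illustrate Bayes' theorem.
p_disease = 0.01             # prior P(disease)
p_obs_given_disease = 0.90   # P(observed symptom | disease)
p_obs_given_healthy = 0.05   # P(observed symptom | no disease)

p_obs = (p_obs_given_disease * p_disease
         + p_obs_given_healthy * (1 - p_disease))
p_disease_given_obs = p_obs_given_disease * p_disease / p_obs
print(round(p_disease_given_obs, 3))   # about 0.154: the observation updates, but does not prove, the diagnosis
```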
In class problem in Bayes’ Theorem
 Suppose you walk home and notice that the grass is wet.
 You want to know if the grass is wet because it rained or because your sprinkler was on.
 Looking at the newspaper, you know the probability of rain today.
 Going by your memory, you somehow "estimate" the probability that the sprinkler was on today.
 What is given?
 Probability it rained today
 Probability that your sprinkler was on today
 Probability that the grass is wet given it rained or the sprinkler was on
 Let us ignore the cases: "Sprinkler was on and it rained" and "Sprinkler was off and it did not rain".
6
In class problem in Bayes Theorem
Probability of the sprinkler being on:

Event          Probability
Sprinkler On   0.2
Sprinkler Off  0.8

Probability of rain:

Event          Probability
Raining        0.6
No Rain        0.4

Probability of the grass being wet given different states of the sprinkler and whether it rained:

Sprinkler   Rain      Grass is wet   Grass is dry
Off         Rain      0.9            0.1
On          No Rain   0.7            0.3

Question: Given that the grass is wet, what is the probability that the
sprinkler was on and it was not raining?
7
Solution to the problem
Based on the problem statement, we ignore the cases where it rains and the sprinkler is
on, and where it does not rain and the sprinkler is off.
 Let Sprinkler be represented as S, rain by R and grass by G.
 There are 2 cases of grass being wet:
 A1 - Sprinkler = On and Rain = Off
 A2 - Sprinkler = Off and Rain = On
 B – Grass is wet
 We need to find P(A1|B)
 Applying Bayes theorem from Slide 4, we have
P(A1|B) = P(B|A1) P(A1) / [ P(B|A1) P(A1) + P(B|A2) P(A2) ]
8
Solution to the problem
 From the table, we get
 P(A1) = P(Sprinkler = On and Rain = Off)
= P(Sprinkler = On) * P(Rain = Off)  (as the events are independent)
= 0.2 * 0.4 = 0.08
 P(A2) = P(Sprinkler = Off and Rain = On)
= P(Sprinkler = Off) * P(Rain = On)  (as the events are independent)
= 0.8 * 0.6 = 0.48
 P(B|A1) = 0.7 (given)
 P(B|A2) = 0.9 (given)
 Applying Bayes' theorem, we get
P(A1|B) = (0.08 * 0.7) / [(0.08 * 0.7) + (0.48 * 0.9)]
≈ 0.1147
9
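The arithmetic above can be checked with a few lines of Python (a sketch using only the numbers given in the tables):

```python
# Priors for the two cases of wet grass (sprinkler and rain are independent).
p_a1 = 0.2 * 0.4   # Sprinkler On  and No Rain
p_a2 = 0.8 * 0.6   # Sprinkler Off and Rain

# Likelihoods of wet grass, taken from the table.
p_b_given_a1 = 0.7
p_b_given_a2 = 0.9

p_b = p_b_given_a1 * p_a1 + p_b_given_a2 * p_a2
print(p_b_given_a1 * p_a1 / p_b)   # about 0.1147  (sprinkler on, no rain)
print(p_b_given_a2 * p_a2 / p_b)   # about 0.8853  (rain, sprinkler off) -- matches the in-class exercise below
```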
In-Class Exercise
 Now, consider the same problem statement and attempt the
following question:
 Question: Given that the grass is wet, what is the
probability that it was raining and the sprinkler was not on?
Note: Use Bayes theorem to solve this problem
 Answer: 0.8853
10
Content
 Bayes’ Theorem – Proof and In-class exercise solution
 General Product Rule
 Review of concepts learnt so far
 Discussion of last year’s class projects
11
General Product Rule
 All along, we have been using the product rule as given below
 P(A and B and C and ...) = P(A)P(B)P(C)...
 The above formula is a "special case" of the general Product Rule.
 All the problems we have been dealing with have consisted of "independent" events
 Rolling a pair of dice
 Tossing coins
 Therefore, P(A and B and C and ...) = P(A)P(B)P(C)...
 But what if they were not independent? Will the same formula work?
 NO!!
 So is there a general product rule which can be applied?
 YES!!
12
Product Rule
 Suppose we are interested in the simultaneous occurrence of events A, B and C.
 Suppose these events are all dependent on each other
 P(A and B and C) = P(A)P(B|A)P(C|A,B)
 In general, for n different dependent events A1, A2, A3, ..., An
P(A1 and A2 and ... and An) = P(A1) P(A2|A1) P(A3|A1,A2) ... P(An|A1,A2,...,An-1)
 Can we derive it?
13
Proof for Product Rule
 Let us consider just 2 “Dependent” events A1 and A2
 Definition of conditional probability is
 P(A2| A1) = P(A1 and A2 )/P(A1)
 So P(A1 and A2 ) = P(A2| A1) P(A1)
 Now let us add a third event A3
 We need to represent P(A1 and A2 and A3 ) in terms of P(A1 and
A2 ).
 How about P(A3| A1 A2)?
 P(A3| A1 and A2) = P(A1 and A2 and A3 )/P(A1 and A2)
Rearranging the terms of the equation, we get
 P(A1 and A2 and A3) = P(A3| A1 and A2) P(A1 and A2)
= P(A3| A1 and A2) P(A2| A1) P(A1)
14
General Product Rule
 In general we can extend this to n events
 In general for n different dependent events A1, A2, A3….An
P(A1 and A2 and A3 ... and An) =
P(A1) P(A2|A1) P(A3|A1,A2) P(A4|A1,A2,A3) ... P(An|A1,A2,A3,...,An-1)
15
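As a sanity check of the general product rule, the following Python sketch (not from the slides) compares the chain-rule product with a brute-force enumeration for drawing three aces in a row from a 52-card deck:

```python
from fractions import Fraction
from itertools import permutations

# Deck: 4 aces ('A') and 48 other cards ('x').
deck = ['A'] * 4 + ['x'] * 48

# Chain rule: P(A1 and A2 and A3) = P(A1) P(A2|A1) P(A3|A1,A2)
chain = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)

# Brute force: enumerate all ordered draws of 3 distinct cards.
draws = list(permutations(range(52), 3))
favourable = sum(1 for d in draws if all(deck[i] == 'A' for i in d))
brute = Fraction(favourable, len(draws))

print(chain, brute, chain == brute)   # both 1/5525
```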
Content
 Bayes’ Theorem – Proof and In-class exercise solution
 Conditional Probability : Statistical Independence Revisited
 General Product Rule
 Review of concepts learnt so far
 Discussion of last year’s class projects
16
Summary of topics covered
 The notion of 'chance' dates back to primitive times
 A variety of animals, from bees to primates, embrace risk for a chance at a reward
 Numbers have evolved from crude representations to simpler representations like the Hindu-Arabic numerals
 Need for compactness
 The advent of 'numbers' provided people with an opportunity to quantify chance.
 Numbers laid the foundation for analyzing random experiments and processes.
 This tool to analyze random experiments is probability.
17
Definitions and properties of probability
 Outcome: It’s the result of a single trial of an experiment
 Example: When you roll a die, one of the outcomes is a ‘6’.
 Event: It's a collection of one or more outcomes.
 Example: An event could be rolling 2 ‘6’s consecutively.
 Probability: The likelihood of the event occurring.
 Example: Probability of rolling 2 ‘6’s consecutively is 1/36
 Properties of probability
 0 ≤ P(x) ≤ 1
 P(NULL SET) = 0 & P(SET OF ALL OUTCOMES) = 1
 P(All events except EVENT A) = 1 – P(EVENT A)
18
Random Variable
 Random Variable: A function that maps the outcomes of an experiment to real
numbers
 Example: Event of seeing 2 ‘6’s can be mapped to number 12.
This mapping is done by a random variable
In general, let the variable 'x' represent the event:

x    Event
1    Event 1
2    Event 2
3    Event 3
4    Event 4

Probability(Event i): p(x = i) = p_i, where i = {1, 2, 3, 4}
Here 'x' is called a random variable.
19
Conditional Probability
 Conditional Probability: It’s the probability of an event given
some additional information or given the information that another
event occurred
[Diagram: an experiment produces Event 1, Event 2, Event 3, ... with probabilities p1, p2, p3, ...; the same outcomes can also be grouped into Event Even and Event Odd, with probabilities Peven and Podd]
20
Knowledge of the outcome of the roll of a die in terms of whether it is an even number or
an odd number allows us to predict the actual outcome more precisely
So P(Event 1|Event Even) is the conditional probability of Event 1 given Event Even
Conditional Probability
 Conditional probability is defined as follows
 If event A is dependent on another event B, then the probability
of event A given knowledge about event B is
P(Event A | Event B) = P(Event A and Event B occurring) / P(Event B occurring)
For the die problem
 P(Die rolled a 2 | Die rolled an even number)
= P(Die rolled a 2 and Die rolled an even number) / P(Die rolled an even number)
= (1/6) / (1/2) = 1/3 !!
21
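The same die calculation can be verified by direct enumeration; a minimal Python sketch:

```python
from fractions import Fraction

outcomes = [1, 2, 3, 4, 5, 6]        # equally likely faces of a fair die
event_a = {2}                        # die rolled a 2
event_b = {2, 4, 6}                  # die rolled an even number

p_b = Fraction(len(event_b), len(outcomes))                   # 1/2
p_a_and_b = Fraction(len(event_a & event_b), len(outcomes))   # 1/6
print(p_a_and_b / p_b)               # 1/3
```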
Definitions and properties of probability
 Mutual Exclusivity: Two events are mutually exclusive if
the events cannot occur at the same time.
 Example: Heads and tails in a single toss are mutually exclusive
events
 P(Heads or Tails) = P(Heads) + P(Tails)
 Sum Rule: If X1, X2, ..., Xn are N mutually exclusive events, then
 P(X1 or X2 or ... or Xn) = P(X1) + P(X2) + ... + P(Xn)
 Product Rule: If X1, X2, ..., Xn are N independent events, then
 P(X1 & X2 & ... & Xn) = P(X1) P(X2) ... P(Xn)
22
Independence
 Independence: Two events are independent if occurrence
of one event does not affect the occurrence of the other.
 Example: Getting a 6 on a die does not affect getting a 2 on
the other die
 Formally, P(B|A) = P(B)
 Alternately, “ The probability of two independent events occurring
simultaneously is equal to the product of probability of individual
events”
Random variables X and Y are independent if and only if, for every x and y,
P(X = x, Y = y) = P(X = x) * P(Y = y)
23
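A small sketch (not from the slides) that checks this definition for two fair dice, whose joint distribution is uniform over the 36 ordered pairs:

```python
from fractions import Fraction
from itertools import product

# Joint distribution of two independent fair dice: uniform over 36 pairs.
joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

def marginal_x(x):
    return sum(p for (a, _), p in joint.items() if a == x)

def marginal_y(y):
    return sum(p for (_, b), p in joint.items() if b == y)

independent = all(joint[(x, y)] == marginal_x(x) * marginal_y(y)
                  for x, y in joint)
print(independent)   # True
```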
Definitions and properties of probability
 Distribution: A compact representation of all the events and their probabilities, typically in the form of an equation.
 Example: rolling a fair die repeatedly until a '1' appears

X                   p(X)
1                   1/6
Any other number    5/6

 P(just one throw) = P(getting a 1 in the first throw) = 1/6
 P(just two throws) = P(getting a number other than 1 in the first throw) * P(getting a 1 on the second) = (5/6) * (1/6)
 P(just N throws) = P(getting a number other than 1 in the first N-1 throws) * P(getting a 1 on the Nth throw) = (5/6)^(N-1) * (1/6)
 This is a "geometric distribution"
24
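A short Python sketch of the geometric distribution above, with a Monte Carlo check (the number of trials is an arbitrary choice):

```python
import random

def p_first_one_on_throw(n):
    """P(the first '1' appears on exactly the n-th throw of a fair die)."""
    return (5 / 6) ** (n - 1) * (1 / 6)

# Monte Carlo check for n = 3.
trials = 100_000
hits = 0
for _ in range(trials):
    throws = 0
    while True:
        throws += 1
        if random.randint(1, 6) == 1:
            break
    if throws == 3:
        hits += 1

print(p_first_one_on_throw(3), hits / trials)   # both close to about 0.1157
```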
Bayes Theorem
 Bayes' Theorem: Shows the relationship between a conditional probability and its reverse form
P(A|B) = P(B|A) P(A) / P(B)
 Each term in Bayes' theorem has a conventional name:
 P(A) is the prior probability or unconditional probability of A.
 It is "prior" in the sense that it does not take into account any
information about B.
 P(A|B) is the conditional probability of A, given B.
 P(B|A) is the conditional probability of B given A.
 P(B) is the prior or marginal probability of B
25
Transition Graphs
 Model or Transition Graphs: A set of events can be modeled as a transition graph, with the nodes representing the outcomes and the edges carrying the associated probabilities
 Eg: Snakes and Ladders game
[Figure: a transition graph whose nodes are the die outcomes 1-6, with every edge labeled 1/6]
26
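One way to represent such a transition graph in code is as a mapping from each node to its outgoing edges and their probabilities. The sketch below uses a simplified, made-up 10-square board; the board size and the snake/ladder positions are illustrative assumptions, not taken from the slides:

```python
import random

# Each die outcome 1..6 is an edge with probability 1/6.
die_edges = {face: 1 / 6 for face in range(1, 7)}

# A toy 10-square board: square -> square jumps (assumed for illustration).
jumps = {3: 7,   # a ladder
         9: 2}   # a snake

def step(position):
    """Take one turn: roll the die, move, then follow any snake or ladder."""
    faces = list(die_edges)
    roll = random.choices(faces, weights=[die_edges[f] for f in faces])[0]
    position = min(position + roll, 10)
    return jumps.get(position, position)

pos = 0
while pos < 10:
    pos = step(pos)
print("reached the final square")
```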
Content
 Bayes’ Theorem – Proof and In-class exercise solution
 Conditional Probability : Statistical Independence Revisited
 General Product Rule
 Review of concepts learnt so far
 Discussion of last year’s class projects
27
Field Goal Percentage vs. Free Throw Percentage
A case study of applied probability on sports
By Yung-Seok Kevin Choi
 It is advantageous to quantify different aspects of a game to strategize
 In basketball two of the primary statistics are
 Field Goal Percentage
 ratio of field goals made to field goals attempted
 Free Throw Percentage
 ratio of free throws made to free throws attempted
 Problem Statement: In basketball, is field goal percentage related to free throw percentage, and how does this relationship differ between positions?
 Solution
 Collect the data from a basketball reference website
 MATLAB to analyze and parse data
 The data was organized into different bins
 MATLAB to compute conditional probabilities (e.g., of a field goal percentage bin given a free throw percentage bin)
 Made statements such as
 “given a guard’s free throw percentage in bin 6, we are 85 percent confident that his field
goal percentage falls between bins 6 and 11.”
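The kind of binned conditional probability described above could be estimated roughly as in the following sketch; the data, bin scheme, and variable names here are hypothetical stand-ins, not the project's actual data or MATLAB code:

```python
from collections import Counter

# Hypothetical (ft_bin, fg_bin) pairs, one per player; bins are stand-ins.
players = [(6, 7), (6, 9), (6, 6), (5, 4), (6, 11), (7, 8), (6, 10)]

ft_query = 6
in_ft_bin = [fg for ft, fg in players if ft == ft_query]

# P(fg_bin = b | ft_bin = 6) as a relative frequency.
counts = Counter(in_ft_bin)
cond_prob = {fg_bin: n / len(in_ft_bin) for fg_bin, n in counts.items()}

# P(6 <= fg_bin <= 11 | ft_bin = 6), the kind of statement quoted above.
print(sum(p for fg_bin, p in cond_prob.items() if 6 <= fg_bin <= 11))
```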
Impartiality in Mafia
By Jose Luis Garcia
 There is an interactive social game called Mafia
 The basic idea of the game is that there are two groups (Townspeople and Mafia), each with different advantages, which they attempt to use to incapacitate the rival group and thus win the game.
 Is it the case that one of the two groups has a greater chance of winning than the other?
 Specific Analysis
 What is the probability of being assigned to either mafia or townspeople?
 How many turns on average does it take to finish a game?
 Is there a group favored to win and if so which group?
 How is the win/loss probability related to the number of players?
Problem Statement: Analyze the impartiality of an interactive social game called Mafia
 Solution
 Mathematical model of the game
 Define the variables (number of players, win probability etc.)
 MATLAB script to run multiple trials of the game
 Based on the ratio of townspeople to the mafia
 Define the probability of winning of the townspeople
 A card is chosen at random from a given stack (MATLAB script)
 Compute winning and losing probabilities based on multiple trials
 One such conclusion
 As the number of players increases, the probability of the Mafia winning also increases.
Probability in Currency Exchange
By Thomas Roinesdal
 The values of international currencies fluctuate every day
 Data about trading among different currencies is available
 It would be very advantageous if one could predict the future value of a currency's exchange rate based on
historical trends.
Problem Statement: Is it possible to predict a currency's exchange rate given the historical values of other currencies?
 Solution
 Model
 Consider pairs of currencies
Example: 1 US dollar = 1.4 Singapore dollar
 Collect historical data of trends in the exchange rate
 Compute the relative frequency of different exchange rates
 Calculate conditional probabilities based on this data by applying Bayes Theorem
 Conclusion
 Based on historical data of many currency exchange rates, there was less than a 30% chance of
accurately predicting a future value of a given currency
 But it was shown that some currencies are more correlated than others
Schedule for Help Sessions & Mini Projects
 16 Sept – Help Session 1
 30 Sept – Final Project Proposal
 5 Oct – Help Session 2
 7 Oct – Mini Project 1
31
END
32