Decision Trees: Posterior Probabilities


Incorporating New Information into Decision Trees (Posterior Probabilities)
MGS3100 - Chapter 6, Part 3
Here, we reproduce the last slide of the Sonorola problem from lecture slides part 2. Of the three expected values, choose 12.85, the branch associated with the Basic strategy. This decision is indicated in the TreePlan diagram by the number 2 in the decision node.
Sequential Decisions
• Would you hire a market research group or a consultant (or a psychic) to get more information about the states of nature?
• How would additional information cause you to revise your probabilities of the states of nature occurring?
• Draw a new tree depicting the complete problem.
First, find out the reliability of the source of information (in this case, the marketing research group). Find the conditional probability based on the prior track record:
For two events A and B, the conditional probability P(A|B) is the probability of event A given that event B occurs.
For example, P(E|S) is the conditional probability that marketing gives an encouraging report (E) given that the market is in fact going to be strong (S).
If marketing were perfectly reliable, P(E|S) = 1. However, marketing has the following “track record” in predicting the market:
P(E|S) = 0.6
P(D|S) = 1 - P(E|S) = 0.4
P(D|W) = 0.7
P(E|W) = 1 - P(D|W) = 0.3
Here is the same information displayed in tabular form:

Reliabilities    Strong   Weak
Encouraging       0.6      0.3
Discouraging      0.4      0.7
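As a quick sketch in code (the dictionary layout here is our own, not from the slides), the track record can be stored as P(report | state); each state's report probabilities are complements, so every row of the table sums to 1:

```python
# Marketing's track record: P(report | state of nature).
reliability = {
    "Strong": {"Encouraging": 0.6, "Discouraging": 0.4},
    "Weak":   {"Encouraging": 0.3, "Discouraging": 0.7},
}

# Each state's report probabilities form a distribution,
# e.g., P(D|S) = 1 - P(E|S), so each row must sum to 1.
for state, reports in reliability.items():
    assert abs(sum(reports.values()) - 1.0) < 1e-9, state
```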
Calculating the Posterior Probabilities:
Suppose that marketing has come back with an encouraging report. Knowing this, what is the probability that the market is in fact strong, P(S|E)?
Note that probabilities such as P(S) and P(W) are initial estimates called prior probabilities. Conditional probabilities such as P(S|E) are called posterior probabilities.
The domestic tractor division has already estimated the prior probabilities as P(S) = 0.45 and P(W) = 0.55.
Now, use Bayes' Theorem (see the appendix for a formal description) to determine the posterior probabilities.
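Working through the numbers for the encouraging report, using the reliabilities and priors given above:

P(E) = P(E|S)P(S) + P(E|W)P(W) = (0.6)(0.45) + (0.3)(0.55) = 0.435
P(S|E) = P(E|S)P(S) / P(E) = 0.27 / 0.435 ≈ 0.62
P(W|E) = 1 - P(S|E) ≈ 0.38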
[Spreadsheet screenshot of the Bayes calculation: joint probabilities are computed as reliability × prior (e.g., =B3*B$8), marginal probabilities as sums of the joint probabilities (e.g., =SUM(B12:C12), =SUM(B12:B13)), and posterior probabilities as a joint probability divided by the corresponding marginal (e.g., =B12/$D12).]
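The same calculation can be sketched outside the spreadsheet. Here is a minimal Python version (the variable names and dictionary layout are our own, not part of the original workbook) that reproduces the joint probabilities, the report marginals, and the posteriors:

```python
# Prior probabilities of the states of nature (from the domestic tractor division).
priors = {"Strong": 0.45, "Weak": 0.55}

# Marketing's track record: P(report | state), as in the reliability table above.
reliability = {
    "Strong": {"Encouraging": 0.6, "Discouraging": 0.4},
    "Weak":   {"Encouraging": 0.3, "Discouraging": 0.7},
}

# Joint probabilities: P(report and state) = P(report | state) * P(state).
joint = {
    (report, state): p * priors[state]
    for state, reports in reliability.items()
    for report, p in reports.items()
}

# Marginal probability of each report: P(report) = sum of joints over states.
marginal = {}
for (report, _state), p in joint.items():
    marginal[report] = marginal.get(report, 0.0) + p

# Posterior probabilities via Bayes' Theorem: P(state | report) = joint / marginal.
for (report, state), p in sorted(joint.items()):
    print(f"P({state} | {report}) = {p / marginal[report]:.4f}")
```

Running it prints P(Strong | Encouraging) ≈ 0.6207 and P(Weak | Encouraging) ≈ 0.3793, matching the hand calculation above, along with the posteriors for a discouraging report.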
Appendix
Bayes' Theorem
• Bayes' Theorem is a result in probability theory that gives the conditional probability distribution of a random variable A given B in terms of the conditional probability distribution of B given A and the marginal probability distribution of A alone.
• In the context of Bayesian probability theory and statistical inference, the marginal probability distribution of A alone is usually called the prior probability distribution, or simply the prior. The conditional distribution of A given the "data" B is called the posterior probability distribution, or just the posterior.
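Stated in the notation used in this chapter, the theorem reads:

P(A|B) = P(B|A)P(A) / P(B), where P(B) = P(B|A)P(A) + P(B|not A)P(not A).

In the Sonorola problem, A is a state of nature (S or W) and B is the marketing report (E or D); for example, P(E) = P(E|S)P(S) + P(E|W)P(W), as in the calculation above.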