
Chapter 12.
Probabilistic Reasoning
Fall 2013
Comp3710 Artificial Intelligence
Computing Science
Thompson Rivers University
Course Outline

Part I – Introduction to Artificial Intelligence
Part II – Classical Artificial Intelligence
Part III – Machine Learning
- Introduction to Machine Learning
- Neural Networks
- Probabilistic Reasoning and Bayesian Belief Networks
- Artificial Life: Learning through Emergent Behavior
Part IV – Advanced Topics
Learning Objectives
Unit Outline
1. Introduction
2. Bayesian networks
3. Summary
1. Introduction

- A systematic way, called Bayesian networks, to represent the world probabilistically – independence and conditional independence relationships
- Application areas:
  - consumer help desks,
  - nuclear reactor diagnosis,
  - tissue pathology,
  - pattern recognition,
  - credit assessment,
  - computer network diagnosis,
  - ...
- Topics:
  - Representing knowledge in an uncertain domain
  - Semantics of Bayesian networks
2. Bayesian networks
1. Definitions
2. Example – Weather, Cavity, Catch, and Toothache
3. Example – Burglary net
4. How to answer queries in the Burglary net
5. How to construct Bayesian networks
Definitions

- Bayesian network (also called belief network, probabilistic network, causal network, or knowledge map): a simple, graphical notation for conditional independence assertions and hence for compact specification of full joint distributions
- Syntax:
  - a set of nodes, one per random variable
  - a directed, acyclic graph (a link means "a parent node directly influences its child nodes")
  - a conditional probability distribution for each node given its parents: P(Xi | Parents(Xi))
- In the simplest case, the conditional probability distribution is represented as a conditional probability table (CPT) giving the distribution over Xi for each combination of parent values
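As a concrete illustration of this syntax, here is a minimal sketch in Python of nodes with their CPTs; the variable names and numbers are hypothetical, chosen only to show the data layout.

```python
# A minimal sketch of Bayesian-network nodes with their CPTs.
# Variable names and numbers are hypothetical, for illustration only.

# "Cavity" has no parents: its CPT is a single prior number.
P_cavity = {(): 0.2}             # P(Cavity = T) = 0.2

# "Toothache" has one parent, Cavity: one row per parent value.
P_toothache = {
    (True,):  0.6,               # P(Toothache = T | Cavity = T)
    (False,): 0.1,               # P(Toothache = T | Cavity = F)
}

def prob(cpt, value, parent_values=()):
    """Look up P(X = value | parents); each row stores only P(X = T)."""
    p_true = cpt[tuple(parent_values)]
    return p_true if value else 1.0 - p_true

print(prob(P_cavity, True))               # 0.2
print(prob(P_toothache, False, (True,)))  # 0.4 = 1 - 0.6
```

Each row stores only P(Xi = true), matching the observation above that the number for Xi = false is just 1 - p.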
Example – Weather, Toothache, Cavity, and Catch

- Topology of the network encodes conditional independence assertions:
  - Considering Weather, Toothache, Cavity, and Catch, Weather is independent of the other variables.
  - [Q] What else does not get any influence from others?
  - Cavity influences Toothache and Catch.
  - Toothache and Catch are conditionally independent given Cavity.
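To make the last assertion concrete, here is a small Python check using hypothetical CPT numbers (not from the slides): Toothache and Catch come out dependent unconditionally, but independent once Cavity is fixed.

```python
from itertools import product

# Hypothetical CPT numbers, chosen only to illustrate the assertion.
P_cav = 0.2
P_tooth = {True: 0.6, False: 0.1}   # P(Toothache = T | Cavity)
P_catch = {True: 0.9, False: 0.2}   # P(Catch = T | Cavity)

def joint(cav, tooth, catch):
    """Network semantics: product of the local conditional distributions."""
    p = P_cav if cav else 1 - P_cav
    p *= P_tooth[cav] if tooth else 1 - P_tooth[cav]
    p *= P_catch[cav] if catch else 1 - P_catch[cav]
    return p

def marg(**fixed):
    """Sum the joint over all assignments consistent with `fixed`."""
    total = 0.0
    for cav, tooth, catch in product([True, False], repeat=3):
        a = {"cav": cav, "tooth": tooth, "catch": catch}
        if all(a[k] == v for k, v in fixed.items()):
            total += joint(cav, tooth, catch)
    return total

# Unconditionally dependent: P(T ∧ C) != P(T) P(C)  (0.124 vs 0.068 here)
print(marg(tooth=True, catch=True), marg(tooth=True) * marg(catch=True))

# Conditionally independent given Cavity:
# P(T ∧ C | cav) == P(T | cav) P(C | cav)  (both 0.54 here)
pc = marg(cav=True)
print(marg(tooth=True, catch=True, cav=True) / pc,
      (marg(tooth=True, cav=True) / pc) * (marg(catch=True, cav=True) / pc))
```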
Example - Burglary

- I'm at work, neighbor John calls to say my alarm is ringing, but neighbor Mary doesn't call. Sometimes it's set off by minor earthquakes. Is there a burglar?
- [Q] Random variables?
  - Burglary, Earthquake, Alarm, JohnCalls, MaryCalls
- Network topology reflects "causal" or "influence" knowledge. [Q] In what way?
  - A burglar can set the alarm off.
  - An earthquake can set the alarm off.
  - The alarm can cause Mary to call.
  - The alarm can cause John to call.
- [Q] Can you draw the network then?
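A sketch of this network in Python follows. P(b) = .001, P(e) = .002, P(j | a) = .9, P(m | a) = .7, and P(a | ¬b ∧ ¬e) = .001 can be read off the worked examples later in this unit; the remaining CPT entries (.95, .94, .29, .05, .01) are the standard textbook values for this example and are an assumption here.

```python
# The burglary network's CPTs as plain Python dicts.
# Entries .95, .94, .29, .05, .01 are assumed standard textbook values.
P_B = 0.001
P_E = 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(A = T | B, E)
P_J = {True: 0.90, False: 0.05}                      # P(J = T | A)
P_M = {True: 0.70, False: 0.01}                      # P(M = T | A)

def joint(b, e, a, j, m):
    """Full joint = product of the local conditional distributions."""
    p  = P_B if b else 1 - P_B
    p *= P_E if e else 1 - P_E
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p
```

The edges Burglary → Alarm, Earthquake → Alarm, Alarm → JohnCalls, and Alarm → MaryCalls mirror the four causal statements above.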
What is P(Alarm = T)?

P(a) = P(a ∧ b ∧ e) + P(a ∧ b ∧ ¬e) + P(a ∧ ¬b ∧ e) + P(a ∧ ¬b ∧ ¬e)
     = P(a | b ∧ e) P(b ∧ e) + P(a | b ∧ ¬e) P(b ∧ ¬e) +
       P(a | ¬b ∧ e) P(¬b ∧ e) + P(a | ¬b ∧ ¬e) P(¬b ∧ ¬e)
     = ???

[Figure: the burglary network with its CPTs, including P(a | b ∧ e)]

John calls to say my alarm is ringing. What is P(JohnCalls = T)?

P(j) = P(j ∧ a) + ???
     = ??? + ???
     = ???

[Q] What does it mean?
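Completing both computations by brute-force summation, under the same assumed CPT numbers as in the earlier sketch:

```python
from itertools import product

# Slide-given values plus assumed standard CPT entries (.95, .94, .29, .05).
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}

# P(a) = sum over b, e of P(a | b, e) P(b) P(e)
P_a = sum(P_A[(b, e)]
          * (P_B if b else 1 - P_B)
          * (P_E if e else 1 - P_E)
          for b, e in product([True, False], repeat=2))
print(P_a)            # ≈ 0.002516 with these numbers

# P(j) = P(j | a) P(a) + P(j | ¬a) P(¬a)
P_j = P_J[True] * P_a + P_J[False] * (1 - P_a)
print(P_j)            # ≈ 0.0521 with these numbers
```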
Compactness

- A CPT for Boolean Xi with k Boolean parents has 2^k rows for the combinations of parent values.
- Each row requires one number p for Xi = true (the number for Xi = false is just 1 - p).
- If each variable has no more than k parents, the complete network requires O(n · 2^k) numbers, i.e., it grows linearly with n, vs. O(2^n) for the full joint distribution.
- For the burglary net: 1 + 1 + 4 + 2 + 2 = 10 numbers (vs. 2^5 - 1 = 31).
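The count for the burglary net can be checked mechanically; this snippet just re-does the slide's arithmetic.

```python
# Each Boolean node with k Boolean parents needs 2^k numbers.
parents = {"Burglary": 0, "Earthquake": 0, "Alarm": 2,
           "JohnCalls": 1, "MaryCalls": 1}
bn_numbers = sum(2 ** k for k in parents.values())
full_joint = 2 ** len(parents) - 1
print(bn_numbers, full_joint)    # 10 31
```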
How to answer queries in the Burglary net

The full joint distribution is defined as the product of the local conditional distributions:

P(X1, …, Xn) = ∏_{i=1}^{n} P(Xi | Parents(Xi))

[Q] P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = ???
= P(j | a) P(m | a) P(a | ¬b ∧ ¬e) P(¬b) P(¬e)
= .9 * .7 * .001 * .999 * .998
≈ .00063
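Multiplying the chain-rule factors out (plain arithmetic, nothing assumed beyond the slide's numbers):

```python
# P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) as the product of the five local factors.
p = 0.9 * 0.7 * 0.001 * 0.999 * 0.998
print(p)    # 0.000628... ≈ 0.00063
```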
I'm at work, neighbor John calls to say my alarm is ringing, but neighbor Mary doesn't call. Sometimes it's set off by minor earthquakes. What are the chances that the alarm was ringing and there is a burglar?

P(j ∧ ¬m ∧ a ∧ b) = ???
[Q] How to deal with Earthquake?
= P(j ∧ ¬m ∧ a ∧ b ∧ e) + P(j ∧ ¬m ∧ a ∧ b ∧ ¬e)
= P(j | a) P(¬m | a) P(a | b ∧ e) P(b) P(e) + P(j | a) P(¬m | a) P(a | b ∧ ¬e) P(b) P(¬e)
= P(j | a) (1 - P(m | a)) P(a | b ∧ e) P(b) P(e) + P(j | a) (1 - P(m | a)) P(a | b ∧ ¬e) P(b) P(¬e)
= ???

Mary calls to say the alarm is ringing. What are the chances that there is an earthquake?

P(X1, …, Xn) = ∏_{i=1}^{n} P(Xi | Parents(Xi))
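A sketch of both queries, summing out the unobserved variables over the full joint; the CPT entries not shown on the slides are again the assumed standard values.

```python
from itertools import product

# Slide-given values plus assumed standard CPT entries.
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def joint(b, e, a, j, m):
    """Full joint = product of the local conditional distributions."""
    p  = P_B if b else 1 - P_B
    p *= P_E if e else 1 - P_E
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

# Earthquake is unobserved, so sum it out:
# P(j ∧ ¬m ∧ a ∧ b) = Σ_e P(j ∧ ¬m ∧ a ∧ b ∧ e)
p_query = sum(joint(True, e, True, True, False) for e in [True, False])
print(p_query)          # ≈ 0.000254 with these numbers

# "Mary calls; what are the chances of an earthquake?"  P(e | m):
p_me = sum(joint(b, True, a, j, True)
           for b, a, j in product([True, False], repeat=3))
p_m  = sum(joint(b, e, a, j, True)
           for b, e, a, j in product([True, False], repeat=4))
print(p_me / p_m)       # ≈ 0.036 with these numbers
```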
P(a) = ???
= P(a ∧ b ∧ e) + P(a ∧ b ∧ ¬e) + P(a ∧ ¬b ∧ e) + P(a ∧ ¬b ∧ ¬e)
= P(a | b ∧ e) P(b) P(e) + P(a | b ∧ ¬e) P(b) P(¬e) +
  P(a | ¬b ∧ e) P(¬b) P(e) + P(a | ¬b ∧ ¬e) P(¬b) P(¬e)
= …

P(j) = ???

P(X1, …, Xn) = ∏_{i=1}^{n} P(Xi | Parents(Xi))
How to construct Bayesian networks

1. Choose an ordering of random variables X1, …, Xn.
2. For i = 1 to n:
   - add Xi to the network;
   - select parents from X1, …, Xi-1 such that P(Xi | Parents(Xi)) = P(Xi | X1, ..., Xi-1), i.e., given its parents, Xi is conditionally independent of its other predecessors.

This choice of parents guarantees:

P(X1, …, Xn) = ∏_{i=1}^{n} P(Xi | X1, …, Xi-1)   (by the chain rule)
             = ∏_{i=1}^{n} P(Xi | Parents(Xi))   (by construction)

[Q] Which random variables first?
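Here is a sketch of this construction applied to the burglary distribution: for a chosen ordering it selects the smallest parent set satisfying the condition above, using the full joint (built from the assumed standard CPT entries) as ground truth. The causal ordering B, E, A, J, M recovers the original sparse network; the reversed ordering M, J, A, B, E needs extra edges, which motivates the "root causes first" heuristic on the next slide.

```python
from itertools import product, combinations

# Full joint from the burglary CPTs (slide-given values plus assumed
# standard entries .95, .94, .29, .05, .01).
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

VARS = ["B", "E", "A", "J", "M"]

def joint(v):
    """Joint probability of a full assignment v: dict of variable -> bool."""
    p  = P_B if v["B"] else 1 - P_B
    p *= P_E if v["E"] else 1 - P_E
    pa = P_A[(v["B"], v["E"])]
    p *= pa if v["A"] else 1 - pa
    p *= P_J[v["A"]] if v["J"] else 1 - P_J[v["A"]]
    p *= P_M[v["A"]] if v["M"] else 1 - P_M[v["A"]]
    return p

def prob(fixed):
    """Marginal probability of a partial assignment `fixed`."""
    free = [x for x in VARS if x not in fixed]
    return sum(joint({**fixed, **dict(zip(free, vals))})
               for vals in product([True, False], repeat=len(free)))

def cond(x, given):
    """P(x = T | given), computed from the joint."""
    return prob({**given, x: True}) / prob(given)

def parents(x, predecessors, tol=1e-9):
    """Smallest subset S of predecessors with P(x | S) = P(x | predecessors)."""
    for size in range(len(predecessors) + 1):
        for s in combinations(predecessors, size):
            ok = True
            for vals in product([True, False], repeat=len(predecessors)):
                g = dict(zip(predecessors, vals))
                gs = {k: g[k] for k in s}
                if abs(cond(x, g) - cond(x, gs)) > tol:
                    ok = False
                    break
            if ok:
                return s
    return tuple(predecessors)

for order in (["B", "E", "A", "J", "M"], ["M", "J", "A", "B", "E"]):
    print(order)
    for i, x in enumerate(order):
        print(" ", x, "<-", parents(x, order[:i]))
```

Under the causal ordering this prints B <- (), E <- (), A <- (B, E), J <- (A,), M <- (A,); the reversed ordering yields a denser graph (e.g., A <- (M, J) and E <- (A, B)).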
How to choose a correct order?

- "Root causes" first, [Q] What can be a root?
- then the variables they directly influence,
- and so on, until the leaves.
- Can we use causal relations?
  - For each node, find its child nodes (i.e., the child nodes caused by the node).
  - Example: Weather, Toothache, Catch, and Cavity
Summary

- Bayesian networks provide a natural representation for (causally induced) conditional independence.
- Topology + CPTs = compact representation of the joint distribution.
- Generally easy for domain experts to construct.