Lecture.27 - University of Michigan


Bayesian Networks II:
Dynamic Networks and
Markov Chains
By Peter Woolf ([email protected])
University of Michigan
Michigan Chemical Process Dynamics and Controls Open Textbook
version 1.0
Creative Commons license
Overall modeling workflow:
• Physics, chemistry, and chemical engineering knowledge & intuition
• Existing plant measurements
• Bayesian network models to establish connections
• Patterns of likely causes & influences
• Efficient experimental design to test combinations of causes
• ANOVA & probabilistic models to eliminate irrelevant or uninteresting relationships
• Process optimization (e.g. controllers, architecture, unit optimization, sequencing, and utilization)
• Dynamical process modeling
Static Bayesian Network Example 1:
Car failure diagnosis network
From http://www.norsys.com/netlib/car_diagnosis_2.htm
Static Bayesian Network Example 2:
ALARM network: A Logical Alarm Reduction Mechanism
A medical diagnostic system for patient monitoring with 8 diagnoses, 16 findings, and 13 intermediate values.
From Beinlich, I., Suermondt, H. J., Chavez, R. M., and Cooper, G. F. (1989). "The ALARM monitoring system: A case study with two probabilistic inference techniques for belief networks." In Proc. of the Second European Conf. on Artificial Intelligence in Medicine (London, Aug.), 38, 247-256. Also Tech. Report KSL-88-84, Knowledge Systems Laboratory, Medical Computer Science, Stanford Univ., CA.
Dynamic Bayesian Networks
[Figure: Unrolled network with two time slices, Yesterday (ti-1) and Today (ti); each slice contains the nodes ALT, RBC, procedure, survival, and weight, with dependencies running from the Yesterday slice into the Today slice.]
[Figure: Collapsed network: the same five nodes drawn once in a single time slice.]
These are both examples of Dynamic Bayesian Networks (DBNs).
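As a concrete, though entirely hypothetical, illustration of the two-slice idea, the sketch below represents a DBN over the five nodes above as a dictionary mapping each node at time ti to its parents at time ti-1 and a conditional probability table. Both the structure and the numbers are made up for illustration; this is not the clinical model behind the figure.

```python
# Minimal sketch of a two-slice DBN over the five nodes from the figure.
# The structure and the numbers are hypothetical, chosen only for illustration.
# Each variable is binary; each CPT gives P(node = 1 at t_i | parents at t_{i-1}).

dbn = {
    "ALT":       {"parents": ["ALT", "procedure"],
                  "cpt": {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.7, (1, 1): 0.9}},
    "RBC":       {"parents": ["RBC"],
                  "cpt": {(0,): 0.2, (1,): 0.8}},
    "procedure": {"parents": ["procedure"],
                  "cpt": {(0,): 0.5, (1,): 0.5}},
    "survival":  {"parents": ["ALT", "RBC"],
                  "cpt": {(0, 0): 0.99, (0, 1): 0.95, (1, 0): 0.70, (1, 1): 0.50}},
    "weight":    {"parents": ["weight", "procedure"],
                  "cpt": {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.8, (1, 1): 0.6}},
}

def prob_one(node, previous_slice):
    """P(node = 1 today | the values of its parents yesterday)."""
    parent_values = tuple(previous_slice[p] for p in dbn[node]["parents"])
    return dbn[node]["cpt"][parent_values]

yesterday = {"ALT": 1, "RBC": 0, "procedure": 1, "survival": 1, "weight": 1}
for node in dbn:
    print(node, prob_one(node, yesterday))
```

The collapsed network corresponds to this single dictionary; the unrolled network is what you get by applying it slice after slice.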
[Figure: ALT level plotted against time (0 to 20). The model is derived from past data (time slices ti-3, ti-2, ti-1, ti); the DBN then predicts future responses at ti+1 and ti+2, with observed and predicted values shown for comparison. Below the plot, the unrolled network repeats the nodes ALT, RBC, procedure, survival, and weight in every time slice.]
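A minimal sketch of how such predictions can be generated, continuing the hypothetical dbn/prob_one model from the previous sketch: starting from the last observed slice, sample one slice at a time and average over many samples to estimate future probabilities.

```python
import random

# Continues the hypothetical dbn / prob_one sketch above.

def sample_next_slice(previous_slice, rng=random):
    """Sample a complete time slice given the previous slice."""
    return {node: int(rng.random() < prob_one(node, previous_slice)) for node in dbn}

def predict_alt(last_observed_slice, n_steps, n_samples=1000):
    """Monte Carlo estimate of P(ALT = 1) at each of the next n_steps slices."""
    totals = [0] * n_steps
    for _ in range(n_samples):
        state = dict(last_observed_slice)
        for t in range(n_steps):
            state = sample_next_slice(state)
            totals[t] += state["ALT"]
    return [total / n_samples for total in totals]

# Model derived from data up to t_i; predict t_{i+1} and t_{i+2}.
today = {"ALT": 1, "RBC": 0, "procedure": 1, "survival": 1, "weight": 1}
print(predict_alt(today, n_steps=2))
```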
Dynamic Bayesian Networks:
Predict to explore alternatives
[Figure: Two-slice network, Today (ti) and Tomorrow (ti+1), each containing the nodes ALT, RBC, procedure, survival, and weight. The accompanying plot shows ALT versus time (0 to 20): observed values up to today, followed by predicted trajectories under treatment 1, treatment 2, and treatment 3.]
DBNs provide a suitable environment for MPC (model predictive control)!
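A sketch of that MPC-style use, again continuing the hypothetical model above: clamp the controllable node (here "procedure", standing in for a treatment choice), roll the DBN forward over a short horizon, and compare the predicted outcomes before deciding what to do today.

```python
# Continues the sketch above: an MPC-flavored comparison of candidate treatments.

def expected_survival(today, procedure_value, n_steps=3, n_samples=1000):
    """Predicted probability of survival at the horizon when the controllable
    node 'procedure' is clamped to procedure_value at every future slice."""
    survived = 0
    for _ in range(n_samples):
        state = dict(today)
        for _ in range(n_steps):
            state["procedure"] = procedure_value   # intervention: clamp the control input
            state = sample_next_slice(state)
        survived += state["survival"]
    return survived / n_samples

today = {"ALT": 1, "RBC": 0, "procedure": 0, "survival": 1, "weight": 1}
for treatment in (0, 1):
    print("procedure =", treatment, "-> predicted survival:",
          expected_survival(today, treatment))
```

Picking the clamped value with the best predicted outcome, applying it for one step, and then re-estimating from the new observation is the receding-horizon loop used in MPC.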
DBN: Thermostat example
[Figure: Collapsed network and unrolled network (time slices ti and ti+1) over the nodes N: fluctuations, H: Heater, T: Temperature, G: Temp Set Pt., S: Switch, and V: Value/Cost.]
From http://www.norsys.com/networklibrary.html#
A Dynamic Bayesian Network can be recast as a Markov Network
A Markov network describes how a system will transition from one system state to another.
[Figure: Simplified DBN over two time slices (ti and ti+1) with the nodes N: fluctuations, H: Heater, and T: Temperature, alongside a state-transition diagram over the eight joint states {000}, {001}, {010}, {011}, {100}, {101}, {110}, {111}.]
Assume each variable is binary (has states 1 or 0); thus any configuration can be written as a string such as {010}, meaning N=0, H=1, T=0.
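A short sketch of this state encoding: with three binary variables there are 2^3 = 8 joint states, and each can be written as a bit string and given a row/column index for the transition matrix on the next slide.

```python
from itertools import product

variables = ["N", "H", "T"]   # fluctuations, heater, temperature (all binary)

# All 2**3 = 8 joint states, written as bit strings: "010" means N=0, H=1, T=0
states = ["".join(str(bit) for bit in bits)
          for bits in product([0, 1], repeat=len(variables))]
index = {state: i for i, state in enumerate(states)}

print(states)        # ['000', '001', '010', '011', '100', '101', '110', '111']
print(index["010"])  # row/column position of state {010} in a transition matrix
```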
A Dynamic Bayesian Network can be recast as a Markov Network
A Markov network describes how a system will transition from one system state to another; each edge in the state-transition diagram has a probability associated with it.
[Figure/table: state-transition diagram over the eight joint states {000} through {111}, together with the corresponding 8 x 8 transition probability matrix, with the current ("from") state indexing the rows and the next ("to") state indexing the columns. Most entries are 0; the allowed transitions carry the probabilities p1 through p15.]
Note: all rows of the transition matrix must sum to 1, e.g. p1 + p2 = 1, p5 = 1, etc.
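The sketch below shows how a simplified DBN over N, H, and T induces such a transition matrix. The conditional probabilities are made up for illustration (they are not the p1 through p15 of the slide); the point is that each row of the resulting 8 x 8 matrix sums to 1 and the matrix can then be iterated as a Markov chain.

```python
import numpy as np
from itertools import product

# Hypothetical one-step model for the simplified thermostat DBN:
# P(variable = 1 at t_{i+1} | N, H, T at t_i). The numbers are made up for
# illustration; they are not the p1..p15 of the slide.
def p_next(variable, n, h, t):
    if variable == "N":
        return 0.5                                  # fluctuations are random
    if variable == "H":
        return 0.9 if t == 0 else 0.1               # heater tends to turn on when cold
    return min(1.0, 0.2 + 0.7 * h + 0.1 * n)        # temperature rises when heater is on

states = list(product([0, 1], repeat=3))            # joint states (N, H, T)
P = np.zeros((len(states), len(states)))            # 8 x 8 transition matrix

for i, (n, h, t) in enumerate(states):
    p = {v: p_next(v, n, h, t) for v in "NHT"}
    for j, (n2, h2, t2) in enumerate(states):
        P[i, j] = ((p["N"] if n2 else 1 - p["N"]) *
                   (p["H"] if h2 else 1 - p["H"]) *
                   (p["T"] if t2 else 1 - p["T"]))

assert np.allclose(P.sum(axis=1), 1.0)              # every row must sum to 1

# Iterate the Markov chain: distribution over states after 10 steps from {000}
dist = np.zeros(len(states))
dist[0] = 1.0
for _ in range(10):
    dist = dist @ P
print({"".join(map(str, s)): round(prob, 3) for s, prob in zip(states, dist)})
```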
Case Study 1: Synthetic Study
Situation: Imagine that we are exploring the effect of a DNA-damaging drug and UV light on the expression of 4 genes.
[Figure: Network with the nodes GFP, Gene A, Gene B, and Gene C.]
Case Study 1: Synthetic Study
[Figure: The same network (GFP, Gene A, Gene B, Gene C) shown with the idealized data.]
Case Study 1: Synthetic Study
[Figure: The same network shown with the noisy data.]
Case Study 1: Synthetic Study
[Figure: Side-by-side plots of the noisy data and the idealized data.]
Given idealized or noisy data, can we find any relationships between the drug, UV exposure, GFP, and the gene expression profiles?
See miniTUBA.demodata.xls
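Before a DBN can be learned, continuous measurements like these are usually discretized (binned). The sketch below uses made-up numbers (not the contents of miniTUBA.demodata.xls) to show a simple two-bin, median-threshold discretization of a noisy time course; this is the kind of binning referred to in the observations on the next slide.

```python
import numpy as np

# Made-up noisy expression time course for one gene; this is NOT the contents
# of miniTUBA.demodata.xls, just an illustration of two-bin discretization.
rng = np.random.default_rng(0)
ideal = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0], dtype=float)
noisy = ideal + rng.normal(0.0, 0.2, size=ideal.size)

# Two-bin discretization at the median: 1 = "high" expression, 0 = "low"
binned = (noisy > np.median(noisy)).astype(int)

print(noisy.round(2))
print(binned)   # modest disagreements with `ideal` are the kind of binning error
                # that the model learning step can usually tolerate
```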
Case Study 1: Synthetic Study
Google “miniTUBA” or go to http://ncibi.minituba.org
Case Study 1: Synthetic Data
• Observations:
– Stronger relationships require fewer observations to identify
– Noise in the measurements is okay
– Moderate binning errors are forgivable
– Uncontrolled experiments can be your friend in model learning
Take Home Messages
• Noisy, time-varying processes can be modeled as a Dynamic Bayesian Network (DBN)
• A DBN can be recast as a Markov model of a stochastic system
• DBNs can be learned directly from data using tools such as miniTUBA