2011 ACM Turing Award - Lu Jiaheng's homepage


2011 ACM Turing Award - Judea Pearl
Contributions Recognized by the ACM Turing Award
• Innovations that enabled remarkable advances in the partnership between humans and machines that is the foundation of Artificial Intelligence (AI)
• Developments in probabilistic and causal reasoning
• A computational foundation for processing information under uncertainty
• Developed graphical methods and a symbolic calculus that enable machines to reason about actions and observations, and to assess cause-effect relationships from empirical findings
• Influence that extends beyond artificial intelligence, and even computer science, to human reasoning and the philosophy of science
Biography of Judea Pearl
Judea Pearl (1936- ),
computer scientist and philosopher
Nationality: Israeli-American
• B.S. degree in Electrical Engineering from the Technion, Haifa, Israel, in 1960
• Master's degree in Physics from Rutgers University, New Brunswick, New Jersey, in 1965
• Ph.D. degree in Electrical Engineering from the Polytechnic Institute of Brooklyn, Brooklyn, NY, in 1965
• Worked at RCA Research Laboratories, Princeton, New Jersey, on superconductive parametric and storage devices, and at Electronic Memories, Inc., Hawthorne, California, on advanced memory systems, 1965-1970
• Has worked at UCLA since 1970
About Judea Pearl
Current interests
• Artificial intelligence and knowledge representation
• Probabilistic and causal reasoning
• Nonstandard logics
• Learning strategies
Books
• Heuristics, Addison-Wesley, 1984
• Probabilistic Reasoning in Intelligent Systems, Morgan Kaufmann, 1988
• Causality: Models, Reasoning, and Inference, Cambridge University Press, 2000
• I Am Jewish: Personal Reflections Inspired by the Last Words of Daniel Pearl, Jewish Lights, 2004
Awards and other frills
• Harvey Prize, for "foundational work that has touched a multitude of spheres of modern life" (2012)
• David E. Rumelhart Prize, for contributions to the theoretical foundations of human cognition (2011)
• IEEE AI's Hall of Fame (August 2011)
• Benjamin Franklin Medal in Computers and Cognitive Science, "for creating the first general algorithms for computing and reasoning with uncertain evidence" (2008)
• 2003 Allen Newell Award (2004)
…
Brief Introduction to Bayesian Networks and Causal Networks
Bayesian Networks
A Bayesian network describes a probability distribution (the joint distribution) over a group of random variables Y1, Y2, …, Yn in the joint space V(Y1) × V(Y2) × … × V(Yn).
Conditional Independence
 if ( xi, yj, zk) P(X= xi |Y= yj , Z= zk ) = P(X= xi |Z= zk), we say
The probability distribution of X given Z is independent from Y.
 More general,
P(X1,…,Xl |Y1,…,Ym, Z1,…,Zn)= P(X1,…,Xl | Z1,…,Zn)
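The defining equation P(X | Y, Z) = P(X | Z) can be checked numerically. The following sketch uses a made-up distribution in which X and Y each depend only on a common cause Z, so the identity should hold (the 0.8/0.7 noise levels are illustrative assumptions, not from the slides):

```python
# Minimal numerical check of conditional independence: Z is a fair coin,
# and X and Y each copy Z with independent noise, so X ⫫ Y | Z.
import random

random.seed(0)
counts = {}  # (x, y, z) -> count
N = 200_000
for _ in range(N):
    z = random.random() < 0.5
    x = z if random.random() < 0.8 else not z   # X depends only on Z
    y = z if random.random() < 0.7 else not z   # Y depends only on Z
    counts[(x, y, z)] = counts.get((x, y, z), 0) + 1

def p(pred):
    """Empirical probability of the event described by pred(x, y, z)."""
    return sum(c for k, c in counts.items() if pred(*k)) / N

# P(X=1 | Y=1, Z=1) vs P(X=1 | Z=1): should agree up to sampling noise.
p_x_given_yz = p(lambda x, y, z: x and y and z) / p(lambda x, y, z: y and z)
p_x_given_z  = p(lambda x, y, z: x and z) / p(lambda x, y, z: z)
print(round(p_x_given_yz, 2), round(p_x_given_z, 2))  # both ≈ 0.8
```

Conditioning on Y as well as Z leaves the estimate essentially unchanged, which is exactly what the identity asserts.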
Graphical Representation of Bayesian Networks
[Figure: a DAG with nodes Storm, BusTourGroup, Lightning, Campfire, Thunder, ForestFire]
Joint distribution:
P(y1, y2, …, yn) = ∏i P(yi | Parents(Yi))
To represent the joint distribution, use a group of conditional independence hypotheses (a DAG) together with local conditional probabilities.
Graphical Representation of Bayesian Networks
[Figure: the same DAG, with nodes Storm, BusTourGroup, Lightning, Campfire, Thunder, ForestFire]
• P(S,B,L,C,T,F) = P(S) · P(B) · P(L|S) · P(C|S,B) · P(T|L) · P(F|S,L,C)

Conditional probability table for Campfire:

      S,B    S,~B   ~S,B   ~S,~B
C     0.4    0.1    0.8    0.2
~C    0.6    0.9    0.2    0.8

• P(Campfire = true | Storm = true, BusTourGroup = true) = 0.4
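The factorization and the Campfire CPT above can be exercised in code. This sketch covers only the S, B, C sub-network; the Campfire CPT is taken from the slide, while the priors P(S) and P(B) are made-up placeholders for illustration:

```python
# Sub-network of the slide's Storm / BusTourGroup / Campfire example.
P_S = {True: 0.3, False: 0.7}          # assumed prior, not on the slide
P_B = {True: 0.5, False: 0.5}          # assumed prior, not on the slide
P_C_given_SB = {                       # CPT from the slide
    (True, True): 0.4, (True, False): 0.1,
    (False, True): 0.8, (False, False): 0.2,
}

def p_c(c, s, b):
    """P(Campfire = c | Storm = s, BusTourGroup = b)."""
    p = P_C_given_SB[(s, b)]
    return p if c else 1.0 - p

# The query from the slide: P(C=true | S=true, B=true) = 0.4
print(p_c(True, True, True))  # 0.4

# Joint over the sub-network: P(S, B, C) = P(S) * P(B) * P(C | S, B)
def joint(s, b, c):
    return P_S[s] * P_B[b] * p_c(c, s, b)

# Sanity check: the joint must sum to 1 over all assignments.
total = sum(joint(s, b, c) for s in (True, False)
            for b in (True, False) for c in (True, False))
print(round(total, 10))  # 1.0
```

The same pattern, with one factor per node, reproduces the full six-variable product on the slide.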
Often only some part of the network is observable and we want to infer the rest; the parameters can be learned with algorithms such as gradient ascent and EM.
From Bayesian Networks to Causal Networks
Each child-parent family in the DAG represents a deterministic function
xi = fi(pai, εi),  i = 1, …, n,
where the εi are mutually independent, arbitrarily distributed random disturbances.
• Marginalizing out the disturbances recovers the conditional probabilities P(xi | pai).
• The functional form also permits us to specify how the resulting distribution would change in response to external interventions.
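The functional view xi = fi(pai, εi) can be sketched directly: each variable is a deterministic function of its parents and an independent uniform disturbance. The chain Storm → Lightning → Thunder and all threshold values are illustrative assumptions:

```python
# Structural-equation sketch: deterministic functions of parents plus
# independent disturbances.  Thresholds are illustrative assumptions.
import random

random.seed(1)

def sample():
    eps_s, eps_l, eps_t = random.random(), random.random(), random.random()
    storm     = eps_s < 0.3                    # x_S = f_S(eps_S)
    lightning = storm and (eps_l < 0.9)        # x_L = f_L(storm, eps_L)
    thunder   = lightning and (eps_t < 0.95)   # x_T = f_T(lightning, eps_T)
    return storm, lightning, thunder

samples = [sample() for _ in range(100_000)]

# Marginalizing the disturbances recovers P(x_i | pa_i):
p_l_given_s = (sum(1 for s, l, t in samples if s and l)
               / sum(1 for s, l, t in samples if s))
print(round(p_l_given_s, 2))  # ≈ 0.9, matching the 0.9 threshold above
```

The estimated P(Lightning | Storm) matches the noise threshold built into f_L, illustrating how the disturbance distribution induces the CPT.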
From Bayesian Networks to Causal Networks
A simple example: suppose the value of Xi is set to xi' by some external force Fi.
Transformation of the DAG: add Fi as a new parent of Xi, where
Fi ∈ {set(xi'), idle},  pai' = pai ∪ {Fi}.
From Bayesian Networks to Causal Networks
Now the conditional probability becomes

P(xi | pai') = P(xi | pai)  if Fi = idle
P(xi | pai') = 1            if Fi = set(xi') and xi = xi'
P(xi | pai') = 0            if Fi = set(xi') and xi ≠ xi'

and the joint distribution under the intervention is the truncated factorization

P(x1, …, xn | set(xi')) = ∏_{j ≠ i} P(xj | paj)  if xi = xi', and 0 otherwise.
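Intervening via this transformation differs from conditioning, and the gap can be seen numerically. The sketch below uses an assumed chain S → L → T with made-up CPT numbers: observing L = true raises belief in S, while setting L = true does not, because the factor P(L | S) is dropped:

```python
# Observing vs. setting in a chain S -> L -> T.  CPTs are illustrative.
P_S = {True: 0.3, False: 0.7}
P_L_given_S = {True: 0.9, False: 0.05}   # P(L=true | S)
P_T_given_L = {True: 0.95, False: 0.01}  # P(T=true | L)

def bern(table, cond, value):
    p = table[cond]
    return p if value else 1.0 - p

def joint(s, l, t):
    return P_S[s] * bern(P_L_given_S, s, l) * bern(P_T_given_L, l, t)

def joint_do_L(s, l, t, l_set):
    # set(L = l_set): drop the factor P(L | S); assignments with
    # L != l_set get probability 0 (truncated factorization).
    if l != l_set:
        return 0.0
    return P_S[s] * bern(P_T_given_L, l, t)

# P(S=true | L=true): ordinary conditioning.
p_s_given_l = (sum(joint(True, True, t) for t in (True, False))
               / sum(joint(s, True, t)
                     for s in (True, False) for t in (True, False)))
# P(S=true | set(L=true)): intervention leaves P(S) untouched.
p_s_do_l = (sum(joint_do_L(True, True, t, True) for t in (True, False))
            / sum(joint_do_L(s, l, t, True) for s in (True, False)
                  for l in (True, False) for t in (True, False)))
print(round(p_s_given_l, 3), round(p_s_do_l, 3))  # 0.885 0.3
```

Conditioning propagates information backwards to the cause; the intervention severs that link, so P(S | set(L)) is just the prior P(S).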
From Bayesian Networks to Causal Networks
For a more precise analysis of conditional independence, we introduce the concept of d-separation: a set S d-separates X from Y if S blocks every path between X and Y in the DAG.
From Bayesian Networks to Causal Networks
Example (in the five-node DAG of the original slide, not reproduced here):
{X2} and {X3} are d-separated by {X1}
{X2} and {X3} are not d-separated by {X1, X5}
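The slide's figure is missing, so the sketch below assumes a DAG consistent with both claims: X1 → X2, X1 → X3, X2 → X4, X3 → X4, X4 → X5 (conditioning on X5, a descendant of the collider X4, opens the path X2 → X4 ← X3). It tests d-separation with the standard ancestral-moral-graph method:

```python
# d-separation via the ancestral-moral-graph method, on an ASSUMED DAG
# consistent with the slide's claims (the original figure is missing).
from itertools import combinations

dag = {"X1": [], "X2": ["X1"], "X3": ["X1"],
       "X4": ["X2", "X3"], "X5": ["X4"]}   # node -> list of parents

def ancestors(nodes):
    """Nodes together with all of their ancestors in the DAG."""
    seen, stack = set(), list(nodes)
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(dag[n])
    return seen

def d_separated(xs, ys, zs):
    # 1. Restrict to the ancestral subgraph of X ∪ Y ∪ Z.
    keep = ancestors(set(xs) | set(ys) | set(zs))
    # 2. Moralize: undirected parent-child edges plus edges between
    #    parents that share a child ("marrying" the parents).
    edges = set()
    for child in keep:
        pars = [p for p in dag[child] if p in keep]
        for p in pars:
            edges.add(frozenset((p, child)))
        for p, q in combinations(pars, 2):
            edges.add(frozenset((p, q)))
    # 3. Delete Z and test connectivity between X and Y.
    reach = set(xs) - set(zs)
    frontier = list(reach)
    while frontier:
        n = frontier.pop()
        for e in edges:
            if n in e:
                m = next(iter(e - {n}))
                if m not in zs and m not in reach:
                    reach.add(m)
                    frontier.append(m)
    return not (reach & set(ys))

print(d_separated({"X2"}, {"X3"}, {"X1"}))        # True
print(d_separated({"X2"}, {"X3"}, {"X1", "X5"}))  # False
```

Both outputs match the slide: {X1} blocks all paths between X2 and X3, while adding the collider descendant X5 unblocks one.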
From Bayesian Networks to Causal Networks
Assume we are given a causal network together with non-experimental data on a subset Xo of observed variables in the network, and we want to estimate what effect the intervention set(Xi = xi') would have on some response variable Xj. Let S be any set of variables.
From Bayesian Networks to Causal Networks
If S satisfies the back-door criterion relative to (Xi, Xj), then

P(xj | set(xi')) = Σs P(xj | xi', s) P(s)

If we can find such an S ⊆ Xo, this effect is easy to estimate from the observational data alone.
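The adjustment formula Σs P(xj | xi', s) P(s) can be verified on a toy confounded model S → X, S → Y, X → Y, where S satisfies the back-door criterion for (X, Y). All CPT numbers are illustrative assumptions; the check compares the adjustment estimate against the ground-truth truncated factorization:

```python
# Back-door adjustment on a toy confounded model S -> X, S -> Y, X -> Y.
P_S = {True: 0.4, False: 0.6}
P_X_given_S = {True: 0.8, False: 0.2}                    # P(X=true | S)
P_Y_given_XS = {(True, True): 0.9, (True, False): 0.6,   # P(Y=true | X, S)
                (False, True): 0.5, (False, False): 0.1}

def bern(p, v):
    return p if v else 1.0 - p

def joint(s, x, y):
    return (P_S[s] * bern(P_X_given_S[s], x)
            * bern(P_Y_given_XS[(x, s)], y))

# Ground truth via truncated factorization: drop the factor P(X | S).
def p_y_do_x(x_set):
    return sum(P_S[s] * P_Y_given_XS[(x_set, s)] for s in (True, False))

# Adjustment formula, using only observational quantities:
#   P(Y=true | set(X=x)) = sum_s P(Y=true | x, s) P(s)
def p_y_adjust(x_set):
    total = 0.0
    for s in (True, False):
        p_y_given_xs = (joint(s, x_set, True)
                        / sum(joint(s, x_set, y) for y in (True, False)))
        total += p_y_given_xs * P_S[s]
    return total

print(round(p_y_do_x(True), 3), round(p_y_adjust(True), 3))  # 0.72 0.72
```

The two quantities agree, showing that when an admissible S is observed, the interventional effect is identified without performing the experiment.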
From Bayesian Networks to Causal Networks
• How to find such an S?
Any set S that meets the following back-door criterion works:
1. No node in S is a descendant of Xi.
2. S d-separates Xi from Xj along every path containing an arrow into Xi.
Candidate sets from the slide's example figure: {X3,X4}, {X4,X5}, {X4}?
Thanks!