From logicist to probabilist cognitive science
Nick Chater
Department of Psychology
University College London
[email protected]
Overview
Logic and probability
A competitive perspective
How much deduction is there?
Evidence from human reasoning
A cooperative perspective
Compare and contrast
Logic as a theory of representation
Probability as a theory of uncertain inference
Probability over logically rich representations
The probabilistic mind
Two traditions
George Boole (1815-1864)
“Laws of thought”
Logic (and probability)
Vision of mechanizable
calculus for thought
Boolean algebra developed
and applied to computer
circuits by Shannon
Symbolic AI
Thomas Bayes (1702-1761)
“Bayes' theorem” for calculating inverse probability
Making probability
applicable to perception and
learning
Machine learning
Neither Boole nor Bayes distinguished
between normative and descriptive
principles: Error or insight?
Logic and the consistency of beliefs
When is a set of beliefs consistent?
If the set {A, B, ¬C} is inconsistent, then A, B deductively imply C
Beliefs are consistent if they have a model
A variety of logics provide rules for avoiding inconsistency
Focus on the internal structure of beliefs
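A minimal sketch (my illustration, not from the slides) of consistency as model-existence: a propositional belief set is consistent iff some truth assignment satisfies all of it, and A, B deductively imply C exactly when {A, B, ¬C} has no model.

```python
from itertools import product

def consistent(beliefs, atoms):
    """A belief set is consistent iff it has a model, i.e. some
    truth assignment to the atoms satisfies every belief."""
    return any(all(b(dict(zip(atoms, vals))) for b in beliefs)
               for vals in product([True, False], repeat=len(atoms)))

def entails(premises, conclusion, atoms):
    """Premises deductively imply C iff premises + not-C is inconsistent."""
    return not consistent(premises + [lambda v: not conclusion(v)], atoms)

atoms = ["A", "B", "C"]
A = lambda v: v["A"]
B = lambda v: v["B"]
C = lambda v: v["C"]
A_implies_C = lambda v: (not v["A"]) or v["C"]

print(consistent([A, B, lambda v: not v["C"]], atoms))  # True: {A, B, not-C} has a model
print(entails([A, B, A_implies_C], C, atoms))           # True: {A, B, A->C, not-C} has none
```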
Different depths of representation
John must sing
prop. calculus:  A      John_must_sing
deontic logic:   O(A)   Must(John_sings)
modal logic:     □A     Necessarily(John_sings)
1st order:       F(j)   Must_sing(John)
Probability and the consistency of
subjective degrees of belief
When is a set of subjective degrees of belief consistent?
Defined over formulae of prop. calculus: P, P&Q, ¬Q
Pr(P) = .5
Pr(Q|P) = .5
Pr(P&Q) = .3
Inconsistency: Pr(P&Q) must equal Pr(P) × Pr(Q|P) = .25
To avoid this, one must follow the laws of probability (including Bayes' theorem)
Subjective degrees of belief consistent
if they have a (probabilistic) model
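The slide's numbers can be checked mechanically (a minimal sketch, mine, not from the talk): the chain rule forces Pr(P&Q) = Pr(P) × Pr(Q|P) = .25, so assigning .3 is incoherent.

```python
def coherent(pr_p, pr_q_given_p, pr_p_and_q, tol=1e-9):
    """Degrees of belief in P, Q-given-P and P&Q are coherent only if
    they obey the chain rule: Pr(P & Q) = Pr(P) * Pr(Q|P)."""
    return abs(pr_p * pr_q_given_p - pr_p_and_q) < tol

print(coherent(0.5, 0.5, 0.3))   # False: the slide's values are incoherent
print(coherent(0.5, 0.5, 0.25))  # True: coherent once Pr(P&Q) = .25
```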
So a difference of emphasis
Logic: believe it (or not); a calculus of certain inference
Probability: degrees of belief*; a calculus of uncertain inference
*Tonight’s Evening Session – Chater, Griffiths,
Tenenbaum, Subjective probability and Bayesian
foundations
Overview
Logic and probability
A competitive perspective
How much deduction is there?
Evidence from human reasoning
A cooperative perspective
Compare and contrast
Logic as a theory of representation
Probability as a theory of uncertain inference
Probability over logically rich representations
The probabilistic mind
Uncertainty: a logical invasion?
Philosophy of science (Popper): T implies D; D is false; so T is false
Statistics (!): Fisher (sampling theory)
AI: non-monotonic logic: from "If A then B", A, and no reason to the contrary, infer B
Why the invasion won't work
Methods for certain reasoning fail, because they can only reject, or remain agnostic, not favouring one option over another
No mechanism for gaining confidence in a hypothesis (though cf. Popper's "corroboration" of theories)
A probabilistic counter-attack?
Everyday inference is defeasible
There is no deduction!
So cognitive science should focus on
probability
Conditionals: probability encroaching on logic?
Inference                      Additional premise   Candidate conclusion   Logically valid?   Probabilistic comparison
MP: Modus Ponens               P                    Q                      Y                  Pr(Q|P) vs Pr(Q)
DA: Denial of the Antecedent   not-P                not-Q                  N                  Pr(not-Q|not-P) vs Pr(not-Q)
AC: Affirming the Consequent   Q                    P                      N                  Pr(P|Q) vs Pr(P)
MT: Modus Tollens              not-Q                not-P                  Y                  Pr(not-P|not-Q) vs Pr(not-P)
• Probabilistic predictions are graded
• Depend on Pr(P) and Pr(Q)
• Fit with data on argument endorsements…
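These graded predictions can be computed directly (a sketch; the values Pr(P) = .3, Pr(Q) = .4 and the exception rate eps = .1 are illustrative assumptions, not the parameters fitted in the paper):

```python
def endorsements(a, b, eps):
    """Conditional-probability readings of the four inference forms,
    given Pr(P) = a, Pr(Q) = b and exception rate eps = Pr(not-Q | P).
    Assumes a < b and b < 1 so all four probabilities are well defined."""
    mp = 1 - eps                          # Pr(Q | P)
    da = (1 - b - a * eps) / (1 - a)      # Pr(not-Q | not-P)
    ac = a * (1 - eps) / b                # Pr(P | Q)
    mt = (1 - b - a * eps) / (1 - b)      # Pr(not-P | not-Q)
    return {"MP": mp, "DA": da, "AC": ac, "MT": mt}

# Endorsements are graded, and shift when Pr(P) or Pr(Q) changes:
print(endorsements(a=0.3, b=0.4, eps=0.1))
print(endorsements(a=0.3, b=0.8, eps=0.1))
```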
Varying probabilities in conditional inference
(Oaksford, Chater & Grainger, 2000)
[Figure: four panels crossing Low/High P(p) with Low/High P(q); each bar chart shows Proportion Endorsed (%) for the MP, DA, AC, MT inferences, Data vs Model]
Negations implicitly vary probabilities
(e.g., if Pr(Q) = .1, then Pr(not-Q) = .9)
[Figure: four panels for the rules "If p then q", "If p then not-q", "If not-p then q", "If not-p then not-q"; each bar chart shows Proportion Endorsed (%) for the MP, DA, AC, MT inferences, Data vs Model]
Wason’s Selection task
Each card has P/¬P on one side, Q/¬Q on the other
Test: If P then Q. Which cards should you turn?
P
¬P
Q
¬Q
“logical” Popperian
view: aim for
falsification only:
turn P, ¬Q
But people tend to
‘seek confirmation’
choosing P, Q
Bayesian view: assess expected amount of
information from each card (cf Lindley 1956)
And expected amount of information
(Shannon) depends crucially on Pr(P),
Pr(Q)
Normally most things don't happen, i.e., assume rarity (Pr(P) and Pr(Q) are small)
Fits observed preferences: p > q > ¬q > ¬p
And also fits when the priors Pr(P), Pr(Q) are experimentally manipulated…
This fits with science, where we attempt to confirm hypotheses (and reject them if we fail)
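The expected-information story can be sketched numerically (a simplified model in the spirit of Oaksford & Chater's optimal data selection; the values a = Pr(P) = 0.1, b = Pr(Q) = 0.2 are illustrative rarity assumptions, not fitted parameters):

```python
from math import log2

def entropy(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def expected_gain(visible, a=0.1, b=0.2, prior=0.5):
    """Expected reduction in uncertainty about the rule from turning the
    card showing `visible`, comparing a dependence hypothesis HD
    ('if P then Q', so Pr(Q|P) = 1) with an independence hypothesis HI.
    a = Pr(P), b = Pr(Q); rarity means both are small (assumes b >= a)."""
    # Pr(hidden face is q | p-cards) or Pr(hidden face is p | q-cards)
    likelihoods = {
        "p":     (1.0,               b),   # hidden face: q?
        "not-p": ((b - a) / (1 - a), b),   # hidden face: q?
        "q":     (a / b,             a),   # hidden face: p?
        "not-q": (0.0,               a),   # hidden face: p?
    }
    lhd, lhi = likelihoods[visible]
    gain = entropy(prior)
    for phd, phi in [(lhd, lhi), (1 - lhd, 1 - lhi)]:  # two possible hidden faces
        p_outcome = prior * phd + (1 - prior) * phi
        if p_outcome > 0:
            gain -= p_outcome * entropy(prior * phd / p_outcome)
    return gain

gains = {card: expected_gain(card) for card in ["p", "not-p", "q", "not-q"]}
print(sorted(gains, key=gains.get, reverse=True))  # ['p', 'q', 'not-q', 'not-p']
```

Under rarity the expected-gain ordering reproduces the observed card preferences p > q > ¬q > ¬p.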
Overview
Logic and probability
A competitive perspective
How much deduction is there?
Evidence from human reasoning
A cooperative perspective
Compare and contrast
Logic as a theory of representation
Probability as a theory of uncertain inference
Probability over logically rich representations
The probabilistic mind
Is logic dispensable?
Just a special case of probability? Quantification (when probabilities are 0 and 1)
Not yet! Probability doesn't easily handle:
Objects, predicates, relations
(though see BLOG, Russell, Milch: morning session Monday 16 July)
The bane of confirmation theory: Fa, Fb, Fc, … ∀x.Fx
Pr(∀x.Fx) = ??
Modality
Pr(□Fx) = ??
Why logic is not dispensable: An example
John must sing or dance
? (John must sing) OR (John must dance)
? If ¬(John sings) then (John must dance)
? There is something that John must do
∃P □P(j) (second-order logic, and modals)
From an apparently innocuous sentence to the far
reaches of logical analysis
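The chain of readings above can be written out explicitly (a sketch in standard deontic/modal notation, with O for "must" and j for John):

```latex
\begin{align*}
&\text{Surface form:}         && O(\mathit{Sing}(j) \lor \mathit{Dance}(j))\\
&\text{Distributed reading:}  && O(\mathit{Sing}(j)) \lor O(\mathit{Dance}(j))\\
&\text{Conditional reading:}  && \lnot \mathit{Sing}(j) \rightarrow O(\mathit{Dance}(j))\\
&\text{Second-order reading:} && \exists P\, O(P(j))
\end{align*}
```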
Reconciliation: Logic as representation; probability for belief updating
Logic: what is the meaning of a representation, especially in virtue of its structure?
Probability: how should my beliefs be updated?
Aim: probabilistic models over complex structure, including logical languages
Representation is crucial, not just for natural language
Diseases cause symptoms only in the same person
People can transmit diseases (but it's the same disease)
Effects cannot precede causes
Can try to capture these by "brute force" in, e.g., a Bayesian network
But then there is no representation of "person", or of two people having the same disease, etc…
Cf. e.g., Tenenbaum; Kemp; Goodman; Russell;
Milch and more at this summer school…
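The "brute force" point can be made concrete (a toy illustration with invented names, not any particular system): a purely propositional network must mint a separate node for every person × disease and person × symptom combination, and wire in transmission edge by edge.

```python
from itertools import product

# A propositional network has no notion of "person" or "same disease":
# each combination becomes its own unrelated node.
people = ["ann", "bob", "cat"]
diseases = ["flu", "measles"]
symptoms = ["fever", "rash"]

nodes = ([f"{p}_has_{d}" for p, d in product(people, diseases)] +
         [f"{p}_shows_{s}" for p, s in product(people, symptoms)])

# Transmission must be hand-wired: one edge per ordered pair of people,
# per disease; nothing in the representation says the nodes share "flu".
transmission = [(f"{p1}_has_{d}", f"{p2}_has_{d}")
                for d in diseases
                for p1, p2 in product(people, people) if p1 != p2]

print(len(nodes))         # 12 nodes for just 3 people
print(len(transmission))  # 12 hand-wired edges
```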
Overview
Logic and probability
A competitive perspective
How much deduction is there?
Evidence from human reasoning
A cooperative perspective
Compare and contrast
Logic as a theory of representation
Probability as a theory of uncertain inference
Probability over logically rich representations
The probabilistic mind
Two Probabilistic Minds
Probability as a theory of internal processes of
neural/cognitive calculation
Probability as a meta-language for description of
behaviour
Probability as description is a push-over
The brain deals effectively with a probabilistic world
Probability theory elucidates the challenges the brain
faces…and hence a lot about how the brain behaves
Cf. Vladimir Kramnik vs. Deep Fritz
But this does not imply probabilistic calculation
Indeed, tractability considerations imply that the brain must be using some approximations (e.g., a general assumption in this workshop)
But are they so extreme as not to be recognizably probabilistic at all?
(e.g., Simon; Kahneman & Tversky; Gigerenzer; the Judgment and Decision Making literature; cf. Busemeyer, Wed 25 July)
The paradox of human probabilistic reasoning
Good:
Parsing and classifying complex real-world objects
Learning the causal powers of the everyday world
Commonsense reasoning, resolving conflicting constraints, over a vast knowledge-base
Bad:
Binary classification of simple artificial categories
Associative learning
Multiple disease problems
Explicit probabilistic and 'logical' reasoning
The puzzle
Where it is strong, human probabilistic reasoning far outstrips any Bayesian machine we can build
Spectacular parsing, image interpretation, motor control
Where it is weak, it is hopelessly feeble
e.g., hundreds of trials for simple discriminations; daft reasoning fallacies
Resolving the paradox?
Interface solution: some problems don't allow an interface with the brain's computational powers?
2-factor solution: perhaps there are two aspects to probabilistic reasoning:
the brain is good at one;
but as theorists, we only really understand the other
A speculation
Maybe the key is having the right representations
Not just heavy-duty numerical calculations
The qualitative structure of probabilistic reasoning
Including predication, quantification, causality, modality, …
And note, too, that cognition can learn both from being told (i.e., logic?) and from experience (probability?)
So the fusion of logic and probability may be crucial