
A New Frontier in Computation—
Computation with Information Described in
Natural Language
Lotfi A. Zadeh
Computer Science Division
Department of EECS
UC Berkeley
ISAI
Beijing, China
August 1, 2006
URL: http://www-bisc.cs.berkeley.edu
URL: http://www.cs.berkeley.edu/~zadeh/
Email: [email protected]
PREAMBLE

What is meant by Computation with Information
Described in Natural Language, or NL-Computation,
for short?

Does NL-Computation constitute a new frontier in
computation?

Do existing bivalent-logic-based approaches to
natural language processing provide a basis for NL-Computation?

What are the basic concepts and ideas which
underlie NL-Computation?

These are some of the issues which are addressed
in the following.
LAZ 7/28/2006
A HISTORICAL NOTE

NL-Computation is a culmination of my longstanding interest in exploring what I have
always believed to be a central issue in fuzzy
logic, namely, the relationship between fuzzy
logic and natural languages. My principal
papers on this theme are the following.
TIMELINE
1971: Quantitative Fuzzy Semantics
1973, 1975: Linguistic Variables and Fuzzy If-Then Rules
1978: Theory of Approximate Reasoning
1978: PRUF, a meaning representation language for natural languages
1979: Fuzzy Sets and Information Granularity
1982: Test-Score Semantics for Natural Languages and Meaning Representation via PRUF
1986: Generalized Constraint
1996: Fuzzy Logic = Computing with Words
1999: From Computing with Numbers to Computing with Words, from Manipulation of Measurements to Manipulation of Perceptions
2005: Generalized Theory of Uncertainty (GTU)
BASIC STRUCTURE OF NL-COMPUTATION
Basically, NL-Computation is a system of computation in
which the objects of computation are words and
propositions drawn from a natural language
[Diagram: information p and question q, stated in NL, are precisiated (generalized-constraint-based) into Pre1(p), ..., Pren(p) and Pre1(q), ..., Pren(q); precisiation is the bridge from NL to MATH; reduction to a standard problem (generalized-constraint-based) leads to the final solution ans(q/p).]
KEY IDEAS IN NL-COMPUTATION
FUNDAMENTAL THESIS
• Information = generalized constraint
• proposition is a carrier of information
MEANING POSTULATE
• proposition = generalized constraint
In our approach, NL-Computation is reduced to computation with generalized constraints, that is, to generalized-constraint-based computation. NL-Computation is based on fuzzy logic and is closely related to Computing with Words (CW).
FUZZY LOGIC—KEY POINTS
• "Fuzzy logic" is not fuzzy logic
• Fuzzy logic is a precise logic of imprecision
• The principal distinguishing features of fuzzy logic are:
a) In fuzzy logic everything is, or is allowed to be, graduated, that is, to be a matter of degree or, equivalently, fuzzy
b) In fuzzy logic everything is allowed to be granulated
ANALOGY
• In bivalent logic, one writes with a ballpoint pen
• In fuzzy logic, one writes with a spray pen which has a precisely defined spray pattern
• This simple analogy suggests many mathematical problems, e.g.: What is the maximum value of f? (precisiation/imprecisiation principle)
[Figure: spray-pen trace of a function in the X-Y plane]
EXAMPLE OF NL-COMPUTATION
Trip planning
I am planning to drive from Berkeley to Santa
Barbara, with stopover for lunch in Monterey.
Usually, it takes about two hours to get to
Monterey. Usually it takes about one hour to
have lunch. It is likely that it will take about
six hours to get from Monterey to Santa
Barbara. At what time should I leave Berkeley
to get to Santa Barbara, with high probability,
before about 6 pm?
LOOKAHEAD
• In NL-Computation, computations are for the most part protoformal, that is, the objects of computation are protoforms (deep structures).
example: C5 = C1 + C2 + C3 + C4
• What we have is partial (granular) information about the Ci, expressed as a generalized constraint GC(Ci)
example: usually (C2 is about 2 hours)
A BASIC CONCEPT IN NL-COMPUTATION:
PROTOFORM EQUIVALENCE
protoform = abstracted summary
surface structures: "most Swedes are tall"; "most balls are large"
deep structure (protoform): ΣCount(G[A is B]/G[A]) is Q
A BASIC CONCEPT IN NL-COMPUTATION:
INFORMATION GRANULARITY
• singular value of X: a singleton a in the universe of discourse; X is a
• granular value of X: a granule A; X isr A
• a granule is defined by a generalized constraint
example: X: unemployment; singular value a: 7.3%; granular value A: high
ATTRIBUTES OF A GRANULE
• Probability measure
• Possibility measure
• Verity measure
• Length
• Volume
• Entropy
PRECISIATION 1
X: time of departure
U: travel time from Berkeley to Monterey
V: duration of lunch
W: travel time from Monterey to Santa Barbara
X is a fuzzy variable; U, V, and W are imprecisely
described fuzzy random variables
*a: approximately a
Prob((?X + U + V + W) ≤ *18) is high
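The precisiated trip-planning constraint can be explored numerically via s-precisiation: replace each granular duration by an assumed probability distribution and estimate Prob(arrival before about 6 pm) by Monte Carlo. The Gaussian parameters below are illustrative assumptions, not part of the original formulation.

```python
import random

# Hypothetical s-precisiation of the trip-planning problem: each granular
# duration (hours) is replaced by an assumed Gaussian distribution.
def arrival_ok(depart, trials=10_000, threshold=18.0):
    """Estimate Prob(depart + U + V + W <= ~6 pm)."""
    hits = 0
    for _ in range(trials):
        u = random.gauss(2.0, 0.25)   # "usually about two hours" to Monterey
        v = random.gauss(1.0, 0.15)   # "usually about one hour" for lunch
        w = random.gauss(6.0, 0.5)    # "likely about six hours" to Santa Barbara
        if depart + u + v + w <= threshold:
            hits += 1
    return hits / trials

# Scan departure times for one whose arrival probability is "high"
for depart in [7.0, 7.5, 8.0, 8.5, 9.0]:
    print(f"leave at {depart:4.1f}h -> Prob(arrive before ~6 pm) = {arrival_ok(depart):.2f}")
```

A granular (fuzzy) treatment would instead propagate possibility distributions through the sum, as developed in the slides that follow.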
SIMPLIFIED PROBLEM—LOOKAHEAD
Problem: ?Z = *a + usually(*b)
X = *a, Y = usually(*b): granular values
solution: precisiation followed by granular computing
PRECISIATION OF “approximately a,” *a
[Figures: alternative precisiands of *a]
• s-precisiation: singleton at a
• cg-precisiation: interval containing a
• g-precisiation: probability distribution centered on a; possibility distribution centered on a; fuzzy graph (e.g., over the interval from 20 to 25)
CONTINUED
[Figure: g-precisiation of *a as a bimodal probability distribution p over x]
CONTINUED
*a is A; *b is B; usually is C
precisiation of usually (*b):
∫ µB(v) p(v) dv is usually
where p(v) is the underlying probability density
CONTINUED
µ(p) = µusually(∫ µB(v) p(v) dv)
p*(v) = p(v - u)
µZ(w) = supp,u (µ(p) ∧ µA(u))
subject to: w = v + u
EXTENSION PRINCIPLE (Zadeh 1965, 1975)
Y = f(X): computation with singular values
granulation: Y* = f*(X*): computation with granular values
example:
f(X) is A
g(X) is B
µB(v) = supu µA(f(u))
subject to: v = g(u)
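A minimal numerical sketch of the extension principle, with u and v discretized on grids; the membership function for A and the choices of f and g below are illustrative assumptions.

```python
import numpy as np

# Numerical extension principle:
#   mu_B(v) = sup_u { mu_A(f(u)) : g(u) = v },
# with g(u) = v replaced by a tolerance band on a discrete grid.
def extension_principle(f, g, mu_A, u_grid, v_grid, tol=0.05):
    mu_B = np.zeros_like(v_grid, dtype=float)
    fu = mu_A(f(u_grid))          # mu_A(f(u)) over the u-grid
    gu = g(u_grid)
    for i, v in enumerate(v_grid):
        mask = np.abs(gu - v) <= tol
        if mask.any():
            mu_B[i] = fu[mask].max()
    return mu_B

# Example (assumed): A = "about 2" (triangular), f = identity, g(u) = u**2
mu_A = lambda x: np.clip(1 - np.abs(x - 2), 0, 1)
u = np.linspace(0, 4, 801)
v = np.linspace(0, 16, 161)
mu_B = extension_principle(lambda x: x, lambda x: x**2, mu_A, u, v)
# mu_B peaks near v = 4, i.e. B is "about 4"
```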
MAMDANI
Y = f(X), with f described granularly by fuzzy if-then rules:
f*: if X is Ai then Y is Bi, i = 1, ..., n
f* is Σi Ai × Bi
X is a → Y is Σi µAi(a) ∧ Bi
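A sketch of Mamdani-style granular interpolation for the recurring rule set "if X is small then Y is small; if X is medium then Y is large; if X is large then Y is small"; the triangular membership functions on [0, 10] and the centroid defuzzification step are illustrative assumptions.

```python
import numpy as np

# Triangular membership function with feet a, c and peak b
def tri(t, a, b, c):
    return np.maximum(0.0, np.minimum((t - a) / (b - a), (c - t) / (c - b)))

small  = lambda t: tri(t, -5.0, 0.0, 5.0)
medium = lambda t: tri(t, 0.0, 5.0, 10.0)
large  = lambda t: tri(t, 5.0, 10.0, 15.0)

y = np.linspace(0.0, 10.0, 1001)

def mamdani(x_val):
    rules = [(small, small), (medium, large), (large, small)]
    agg = np.zeros_like(y)
    for A, B in rules:
        w = float(A(np.array([x_val]))[0])          # firing strength mu_Ai(x)
        agg = np.maximum(agg, np.minimum(w, B(y)))  # clip Bi, aggregate by max
    return float((y * agg).sum() / agg.sum())       # centroid (an s-precisiation)
```

For x = 5 only the middle rule fires, so the output granule is "large" and the centroid lands high; at the extremes the output is "small".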
EXAMPLES OF NL-COMPUTATION
Balls-in-box
A box contains about twenty balls of various sizes.
Most are large. What is the number of small balls?
What is the probability that a ball drawn at random is
neither small nor large?
Temperature
Usually the temperature is not very low and not very
high. What is the average temperature?
Tall Swedes
Most Swedes are tall. How many are short? What is
the average height of Swedes?
Flight delay
Usually most United Airlines flights from San
Francisco leave on time. What is the probability that
my flight will be delayed?
CONTINUED
Maximization
f is a function from reals to reals described
as: If X is small then Y is small; if X is
medium then Y is large; if X is large then Y is
small. What is the maximum of f?
Expected value
X is a real valued random variable. Usually, X
is much larger than approximately a, and
much smaller than approximately b, where a
and b are real numbers, with a < b. What is
the expected value of X?
BASIC POINTS
• Much of human knowledge is expressed in natural language
• A natural language is basically a system for describing perceptions
• Perceptions are intrinsically imprecise, reflecting the bounded ability of sensory organs, and ultimately the brain, to resolve detail and store information
• Imprecision of perceptions is passed on to natural languages, resulting in semantic imprecision
• Semantic imprecision of natural languages stands in the way of applying the machinery of natural language processing to computation with information described in natural language
NL-CAPABILITY
• NL-capability = capability to compute with information described in natural language
• Existing scientific theories do not have NL-capability
• In particular, probability theory does not have NL-capability
UNDERSTANDING VS. PRECISIATION
• Understanding precedes precisiation: "I understand what you said, but can you be more precise?"
• Examples: "Beyond reasonable doubt"; "Use with adequate ventilation"
• Precisiation: "Unemployment is high" → "unemployment is over 5%"
• Where do you draw the line? Paraphrase: the US Constitution is an invitation to argue over where to draw the line
• Where to draw the line is a key issue in legal arguments
THE CONCEPT OF α-PRECISION / α-PRECISIATION
• In NL-Computation, precision is a concept with many facets. This perception of precision leads to the concept of α-precision/α-precisiation, where α is an indexical variable whose values are labels of facets of precision
• α = v (value): v-precise, v-precisiation; v-imprecise, v-imprecisiation
• α = m (meaning): m-precise, m-precisiation; m-imprecise, m-imprecisiation
• α = mh + mm + s + g + gc + bl + fl + ...
WHAT IS PRECISE?
PRECISE: v-precise (precise value) | m-precise (precise meaning)
• p: X is a Gaussian random variable with mean m and variance σ². m and σ² are precisely defined real numbers. p is v-imprecise and m-precise
• p: X is in the interval [a, b]. a and b are precisely defined real numbers. p is v-imprecise and m-precise
m-precise = mathematically well-defined
PRECISIATION AND IMPRECISIATION
[Figures: a fuzzy set (m-precise, v-imprecise) is v-precisiated to a singleton at a (v-precise); v-imprecisiation reverses the step. The rule set "If X is small then Y is small; if X is medium then Y is large; if X is large then Y is small" is v-imprecise but m-precise.]
IMPRECISIATION/ SUMMARIZATION OF
FUNCTIONS
[Figure: v-imprecisiation/summarization of a function by the granules S, M, L, yielding the rules:
If X is small then Y is small
If X is medium then Y is large
If X is large then Y is small
mm-precisiation of the rules yields the fuzzy graph: (X, Y) is small × small + medium × large + large × small]
SUMMARIZATION OF T-NORMS
[Table: a t-norm summarized over the granules S, M, L]
      S  M  L
  S   S  S  S
  M   S  M  M
  L   S  M  L
To facilitate the choice of an appropriate t-norm, each t-norm should be associated with a summary
APPROXIMATION VS. SUMMARIZATION
• summarization may be viewed as a form of imprecisiation
[Figures: approximation of a curve y(x) by another curve vs. summarization of y(x) by the granules S, M, L on the x- and y-axes]
V-PRECISIATION
X: variable
• s-precisiation yields a singular value; g-precisiation yields a granular value A
example: unemployment → 7.3% (s-precisiation); unemployment → high (g-precisiation)
• s-precisiation is used routinely in scientific theories and especially in probability theory
• defuzzification may be viewed as an instance of s-precisiation
PRECISIATION/IMPRECISIATION PRINCIPLE
(Zadeh 2005)
*a: approximately a
• simple version: f(*a) = *f(a)
[Figure: the graph of f and its imprecisiated counterpart in the X-Y plane]
PRECISE SOLUTION
[Figure: the precise solution as the undominated points of a level set]
THE CONCEPTS OF PRECISIEND AND PRECISIAND
precisiend: object of precisiation; precisiand: result of precisiation
• X (variable; value of X): v-precisiation yields the v-precisiand of X
• ℓ (lexeme: concept, proposition, question, command): m-precisiation yields the m-precisiand Pre(ℓ), a model of meaning
MODALITIES OF m-PRECISIATION
m-precisiation of a lexeme ℓ has two modalities:
• mh-precisiation: human-oriented
• mm-precisiation: machine-oriented
Either modality yields a precisiand of ℓ, Pre(ℓ)
PRECISIATION AND DISAMBIGUATION
Examples:
• Overeating causes obesity → most of those who overeat become obese → ΣCount(become.obese/overeat) is most
• Obesity is caused by overeating → most of those who are obese were overeating → ΣCount(were.overeating/obese) is most
PRECISIATION/ DISAMBIGUATION
p: most tall Swedes; µP(A) is ? (A: subset of the tall Swedes in POPULATION(Swedes))
mh-precisiation:
P1: most of tall Swedes
P2: mostly tall Swedes
mm-precisiation:
P1 → ΣCount(A/tall.Swedes) is most
P2 → ΣCount(tall.Swedes/A) is most
BASIC STRUCTURE OF DEFINITIONS
definiendum (concept; idea/perception) → mh-precisiation → mh-precisiand (definiens)
definiendum → mm-precisiation → mm-precisiand
cointension = goodness of fit of meaning between definiendum and precisiand
example: bear market
• mh-precisiation: declining market with expectation of further decline
• mm-precisiation: "We classify a bear market as a 30 percent decline after 50 days, or a 13 percent decline after 145 days." (Robert Shuster)
EXAMPLES: MOUNTAIN, CLUSTER, STABILITY
mountain
• mh-precisiation: a natural raised part of the earth's surface, usually rising more or less abruptly, and larger than a hill
• mm-precisiation: ?
CONTINUED
cluster
• mh-precisiation: a number of things of the same sort gathered together or growing together; bunch
• mm-precisiation: ?
• the concepts of mountain and cluster are PF-equivalent, that is, have the same deep structure
stability
• mh-precisiation: the capacity of an object to return to equilibrium after having been displaced
• mm-precisiation: Lyapunov definition; fuzzy stability definition
RATIONALE FOR IMPRECISIATION
IMPRECISIATION PRINCIPLE
p: X is V (X: variable; V: value of X)
X may be: a real-valued variable; an n-ary variable (X1, ..., Xn); a function; a relation; ...
V is v-precise if V is a singleton (singular)
v-imprecisiation: singular → granular
v-IMPRECISIATION
v-imprecisiation may be forced or deliberate
forced: V is not known precisely
deliberate: V need not be known precisely
v-imprecisiation principle: precision carries a cost. If there is a tolerance for imprecision, exploit it by employing v-imprecisiation to achieve lower cost, robustness, tractability, decision-relevance and a higher level of confidence
EXAMPLE: V-IMPRECISIATION
[Figure: deliberate v-imprecisiation maps a v-precise function into the rules "If X is small then Y is small; if X is medium then Y is large; if X is large then Y is small"; forced v-imprecisiation starts from a perception]
GRANULATION REVISITED
• Granulation is a derivative of the v-imprecisiation principle
Age: continuous → quantized (1, 2, 3, 4, 5, ...) → granulated (young + middle-aged + old)
[Figures: membership functions µ over Age for quantized Age vs. granulated Age (young, middle-aged, old)]
granulation = v-imprecisiation / m-precisiation
KEY POINT
• Granulation plays a key role in human cognition
• In human cognition, v-imprecisiation is followed by mh-precisiation. Granulation is mh-precisiation-based
• In fuzzy logic, v-imprecisiation is followed by mm-precisiation. Granulation is mm-precisiation-based
• mm-precisiation-based granulation is a major contribution of fuzzy logic. No other logical system offers this capability
DIGRESSION—EXTENSION VS. INTENSION
extension and intension are concepts drawn from logic and linguistics
basic idea: an object is characterized by a name and by attributes, each attribute having a name and a value
object = (name; (attribute1, value1), ..., (attributen, valuen))
more compactly: object = (name, (attribute, value)), where attribute and value are n-ary
OPERATIONS ON OBJECTS
A function of an object may be given a name-based extensional definition or an attribute-based intensional (algorithmic) definition
object: (Michael, (gender, male), ..., (age, 25))
son(Michael) = Ron
PREDICATE (PROPERTY, CONCEPT, SET, MEMBERSHIP
FUNCTION, INPUT-OUTPUT RELATION)
• A predicate, P, is a truth-valued function of a generic object X in a universe of discourse U
• Denotation of P: D(P) = {X | P(X)}
• Extension of P: Ext(P) = D(P) if P(X) is name-based
• Intension of P: Int(P) = D(P) if P(X) is attribute-based
• P(X): open predicate (X is a free variable)
• P(a): closed predicate (X is a bound variable; P is grounded)
EXAMPLE
U: population; P: bachelor
Ext(bachelor) = {X | bachelor(X)} (= D(bachelor))
Int(bachelor) = {X | gender(X) = man ∧ marital.status(X) = single}
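The extensional/intensional distinction can be made concrete in a few lines; the names and attribute values below are hypothetical.

```python
# Intensional (attribute-based) definition of "bachelor" over a toy
# population; the extension is the set it denotes.
people = {
    "Michael":   {"gender": "man",   "marital_status": "single"},
    "Ron":       {"gender": "man",   "marital_status": "married"},
    "Valentina": {"gender": "woman", "marital_status": "single"},
}

def bachelor(x):
    a = people[x]
    return a["gender"] == "man" and a["marital_status"] == "single"

# Denotation / extension: the set of objects satisfying the predicate
D = {x for x in people if bachelor(x)}
```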
PRINCIPAL MODES OF DEFINITION
• Extension: name-based meaning
• Intension: attribute-based meaning
• Extensional: P = {u1, ..., un} (e-meaning of P)
• Ostensive: P = {uj, uk, ul} (exemplars; o-meaning of P)
• Intensional: P = {u | P(u)} (i-meaning of P)
PROPOSITION (TENTATIVE)
A proposition, p, is a sentence which may be expressed as P(object). Equivalently, p is a closed predicate. Equivalently, p = object is P
very simple example: p: Valentina is young
• extensional: young(Valentina), e-meaning
• intensional: young(Age(Valentina)), i-meaning
example: p: most Swedes are tall
P: most; object: ΣCount(tall.Swedes/Swedes)
p → most(ΣCount(tall.Swedes/Swedes))
the i-meaning of p is associated with the i-meanings of P and of object
Question: D(most.tall.Swedes)?
THE CONCEPT OF COINTENSION
• p, q are predicates or propositions
• CI(p, q): cointension of p and q: degree of match between the i-meanings of p and q
• q is cointensive with respect to p if CI(p, q) is high
• A definition is cointensive if CI(definiendum, definiens) is high
• In practice, CI(p, q) is frequently associated with the o-meaning of p and the i-meaning of q
• The o-meaning of the definiendum is perception-based
THE CONCEPTS OF COINTENSION AND
RESTRICTIVE COINTENSION
U: universe of discourse; D(p), D(q): denotations of p and q; R: restriction
• CI(p, q) = degree of proximity of D(p) and D(q)
• Cointension of q relative to p = degree of subsethood of D(q) in D(p)
• Restricted cointension: U is restricted to R
THE CONCEPT OF COINTENSIVE PRECISIATION
• Precisiation of a concept or proposition, p, is cointensive if Pre(p) is cointensive with p.
Example: bear market. "We classify a bear market as a 30 percent decline after 50 days, or a 13 percent decline after 145 days." (Robert Shuster)
This definition is clearly not cointensive
KEY POINTS
• Precisiand = model of meaning
• In general, p may be precisiated in many different ways, resulting in precisiands Pre1(p), ..., Pren(p), each of which is associated with a degree of cointension CIi of Prei(p), i = 1, ..., n. In general, CIi is context-dependent.
p → precisiation1 → Pre1(p): CI1
p → precisiation2 → Pre2(p): CI2
p → precisiationn → Pren(p): CIn
• Precisiation is necessary but not sufficient
• To serve its purpose, precisiation must be cointensive
• Cointensive precisiation is a key to mechanization of natural language understanding
AN IMPORTANT IMPLICATION FOR SCIENCE
It is a deep-seated tradition in science to
employ the conceptual structure of bivalent
logic and probability theory as a basis for
formulation of definitions of concepts. What
is widely unrecognized is that, in reality,
most concepts are fuzzy rather than bivalent,
and that, in general, it is not possible to
formulate a cointensive definition of a fuzzy
concept within the conceptual structure of
bivalent logic and probability theory.
EXAMPLES OF FUZZY CONCEPTS WHOSE
STANDARD, BIVALENT-LOGIC-BASED DEFINITIONS
ARE NOT COINTENSIVE
• stability
• causality
• relevance
• bear market
• recession
• mountain
• independence
• stationarity
• cluster
• grammar
• risk
• linearity
ANALOGY
system S → modelization → model M(S); lexeme ℓ → precisiation → precisiand Pre(ℓ)
• input-output relation ↔ intension (test-score function)
• system analysis ↔ semantical analysis (Frege's Principle of Compositionality)
• degree of match between M(S) and S ↔ cointension
• In general, it is not possible to construct a cointensive model of a nonlinear system from linear components
CHOICE OF PRECISIAND
• Cointension and tractability are contravariant: as complexity increases, cointension can increase while tractability decreases
• To be tractable, a precisiand should not be complex
• An optimal choice is one which achieves a compromise between tractability and cointension
THE KEY IDEA: MEANING POSTULATE
In NL-computation, a proposition, p, is precisiated by
expressing its meaning as a generalized constraint.
In this sense, the concept of a generalized constraint
serves as a bridge from natural languages to
mathematics.
NL: p → precisiation → Mathematics: p* (GC(p)), a generalized constraint
• The concept of a generalized constraint is the centerpiece of NL-computation
TEST-SCORE SEMANTICS (ZADEH 1982)
Principal Concepts and Ideas
• Test-score semantics has the same conceptual structure as systems analysis
• In test-score semantics, a lexeme, p, is viewed as a composite constraint
• Each constraint is associated with a test-score function which defines the degree to which the constraint is satisfied, given the values of the constraint variables
• Semantic analysis involves computation of the test-score function associated with p in terms of the test-score functions associated with the components of p
• The operation of composition and the resulting test-score function constitute the meaning of p
CONTINUED
• Constraints are represented as relations
• The system of relations associated with p constitutes an explanatory database, ED
• ED may be viewed as a description of a possible world
• Test-score semantics has a much higher expressive power than possible-world semantics
EXAMPLE
p: young men like young women
p*: most young men like most young women
[Diagram: POPULATION split into men (Namei) and women (Namej); ED describes a possible world]
ED (explanatory database):
ED = POPULATION[Name; Gender; Age] + LIKES[Name1; Name2; µ] + YOUNG[Age; µ] + MOST[Proportion; µ]
CONTINUED
P: likes mostly young women
[Diagram: for Namei, the women liked by Namei vs. the young women]
P(Namei): ΣCount((POPULATION[Name; Gender is F; Age is young]) / LIKES[Name is Namei; Name2; Gender is F; Age]) is most
ts(p): ΣCount(POPULATION[Name is P] / POPULATION[Name; Gender is M; Age is young]) is most
GENERALIZED CONSTRAINT (Zadeh 1986)
• Bivalent constraint (hard, inelastic, categorical): X ∈ C (C: constraining bivalent relation)
• Generalized constraint: X isr R
X: constrained variable; R: constraining non-bivalent (fuzzy) relation; r: index of modality (defines semantics)
r: = | ≤ | ⊂ | ... | blank | p | v | u | rs | fg | ps | ...
• primary constraints; bivalent constraints; open and closed constraints
CONTINUED
constrained variable
• X is an n-ary variable, X= (X1, …, Xn)
• X is a proposition, e.g., Leslie is tall
• X is a function of another variable: X=f(Y)
• X is conditioned on another variable, X/Y
• X has a structure, e.g., X= Location
(Residence(Carol))
• X is a generalized constraint, X: Y isr R
• X is a group variable. In this case, there is
a group, G: (Name1, …, Namen), with each
member of the group, Namei, i =1, …, n,
associated with an attribute-value, hi, of
attribute H. hi may be vector-valued.
Symbolically
CONTINUED
G = (Name1, …, Namen)
G[H] = (Name1/h1, …, Namen/hn)
G[H is A] = (µA(h1)/Name1, …, µA(hn)/Namen)
Basically, G[H] is a relation and G[H is A] is a
fuzzy restriction of G[H]
Example: tall Swedes → Swedes[Height is tall]
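A sketch of the group constraint G[H is A] for "tall Swedes"; the membership function for tall and the heights below are assumed for illustration.

```python
# Membership function for "tall" (heights in cm): assumed piecewise-linear,
# 0 below 160 cm, rising to 1 at 185 cm.
def mu_tall(h):
    return min(1.0, max(0.0, (h - 160) / 25))

# G[H]: each member of the group paired with an attribute value
swedes_height = {"Arne": 190, "Bjorn": 172, "Carina": 158}

# G[H is A]: the fuzzy restriction of G[H] induced by A = tall
tall_swedes = {name: mu_tall(h) for name, h in swedes_height.items()}
```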
SIMPLE EXAMPLES

“Check-out time is 1 pm,” is an instance of a
generalized constraint on check-out time

“Speed limit is 100km/h” is an instance of a
generalized constraint on speed

“Vera is a divorcee with two young children,”
is an instance of a generalized constraint on
Vera’s age
GENERALIZED CONSTRAINT—MODALITY r
X isr R
r: =      equality constraint: X = R is an abbreviation of X is= R
r: ≤      inequality constraint: X ≤ R
r: ⊂      subsethood constraint: X ⊂ R
r: blank  possibilistic constraint: X is R; R is the possibility distribution of X
r: v      veristic constraint: X isv R; R is the verity distribution of X
r: p      probabilistic constraint: X isp R; R is the probability distribution of X
Standard constraints: bivalent possibilistic, bivalent veristic and probabilistic
CONTINUED
r: bm  bimodal constraint: X is a random variable; R is a bimodal distribution
r: rs  random set constraint: X isrs R; R is the set-valued probability distribution of X
r: fg  fuzzy graph constraint: X isfg R; X is a function and R is its fuzzy graph
r: u   usuality constraint: X isu R means usually (X is R)
r: g   group constraint: X isg R means that R constrains the attribute-values of the group
PRIMARY GENERALIZED CONSTRAINTS
• Possibilistic: X is R
• Probabilistic: X isp R
• Veristic: X isv R
• Primary constraints are formalizations of three basic perceptions: (a) perception of possibility; (b) perception of likelihood; and (c) perception of truth
• In this perspective, probability may be viewed as an attribute of perception of likelihood
EXAMPLES: POSSIBILISTIC
• Monika is young → Age(Monika) is young (X: Age(Monika); R: young)
• Monika is much younger than Maria → (Age(Monika), Age(Maria)) is much.younger (X: the pair of ages; R: much.younger)
• most Swedes are tall → ΣCount(tall.Swedes/Swedes) is most (X: ΣCount(tall.Swedes/Swedes); R: most)
EXAMPLES: PROBABILISTIC
• X is a normally distributed random variable with mean m and variance σ²
X isp N(m, σ²)
• X is a random variable taking the values u1, u2, u3 with probabilities p1, p2 and p3, respectively
X isp (p1\u1 + p2\u2 + p3\u3)
EXAMPLES: VERISTIC
• Robert is half German, quarter French and quarter Italian
Ethnicity(Robert) isv (0.5|German + 0.25|French + 0.25|Italian)
• Robert resided in London from 1985 to 1990
Reside(Robert, London) isv [1985, 1990]
STANDARD CONSTRAINTS
• Bivalent possibilistic: X ∈ C (crisp set)
• Bivalent veristic: Ver(p) is true or false
• Probabilistic: X isp R
• Standard constraints are instances of generalized constraints which underlie methods based on bivalent logic and probability theory
GENERALIZED CONSTRAINT—SEMANTICS
A generalized constraint, GC, is associated with a
test-score function, ts(u), which associates with
each object, u, to which the constraint is
applicable, the degree to which u satisfies the
constraint. Usually, ts(u) is a point in the unit
interval. However, if necessary, it may be an
element of a semi-ring, a lattice, or more generally,
a partially ordered set, or a bimodal distribution.
example: possibilistic constraint, X is R
X is R
Poss(X=u) = µR(u)
ts(u) = µR(u)
TEST-SCORE FUNCTION
• GC(X): generalized constraint on X
• X takes values in U = {u}
• test-score function ts(u): degree to which u satisfies GC
• ts(u) may be defined (a) directly (extensionally) as a function of u; or (b) indirectly (intensionally) as a function of attributes of u
• intensional definition = attribute-based definition
example: (a) Andrea is tall: 0.9; (b) Andrea's height is 175 cm; µtall(175) = 0.9; Andrea is 0.9 tall
CONSTRAINT QUALIFICATION
• p isr R means: r-value of p is R
in particular:
p isp R → Prob(p) is R (probability qualification)
p isv R → Tr(p) is R (truth (verity) qualification)
p is R → Poss(p) is R (possibility qualification)
examples:
(X is small) isp likely → Prob{X is small} is likely
(X is small) isv very true → Ver{X is small} is very true
(X isu R) → Prob{X is R} is usually
GENERALIZED CONSTRAINT LANGUAGE (GCL)
• GCL is an abstract language
• GCL is generated by combination, qualification, propagation and counterpropagation of generalized constraints
• examples of elements of GCL:
X/Age(Monika) is R/young (annotated element)
(X isp R) and ((X, Y) is S)
((X isr R) is unlikely) and ((X iss S) is likely)
If X is A then Y is B
• the language of fuzzy if-then rules is a sublanguage of GCL
• deduction = generalized constraint propagation and counterpropagation
CONSTRAINTS
generalized constraints ⊃ primary constraints ⊃ standard constraints
generalized: X isr R, r: possibilistic, probabilistic, veristic, random set, usuality, group, ...
primary: possibilistic, probabilistic, veristic
standard: bivalent possibilistic, probabilistic, bivalent veristic
existing scientific theories are based on primary constraints
PRECISIATION = TRANSLATION INTO GCL
BASIC STRUCTURE
NL: p → precisiation/translation → GCL: p* (precisiand of p; GC(p): generalized constraint)
annotation: p → X/A isr R/B (GC-form of p)
example:
p: Carol lives in a small city near San Francisco
X/Location(Residence(Carol)) is R/NEAR[City] ∩ SMALL[City]
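The conjunctive constraint in the Carol example induces a test score on candidate cities; a minimal sketch, where the membership functions mu_near and mu_small are illustrative assumptions (not part of the original slide).

```python
# Assumed membership functions for the two component constraints.
def mu_near(dist_km):            # proximity to San Francisco
    return max(0.0, 1.0 - dist_km / 100.0)

def mu_small(population):        # smallness of a city
    return max(0.0, 1.0 - population / 500_000.0)

# Test score of a candidate city: conjunction (min) of the two scores,
# i.e. the degree to which the city satisfies NEAR[City] ∩ SMALL[City].
def ts(dist_km, population):
    return min(mu_near(dist_km), mu_small(population))
```

A nearby small city scores high; a distant or large one scores low.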
v-PRECISIATION
precisiation of *a (approximately a):
• s-precisiation: conventional (degranulation): *a → a (common practice in probability theory)
• g-precisiation: GCL-based (granulation): *a → X isr R (GC-form)
a proposition p is precisiated in the same way: p → X isr R (GC-form)
• cg-precisiation: crisp granular precisiation
PRECISIATION OF “approximately a,” *a
[Figures: alternative precisiands of *a]
• s-precisiation: singleton at a
• cg-precisiation: interval containing a
• g-precisiation: probability distribution centered on a; possibility distribution centered on a; fuzzy graph (e.g., over the interval from 20 to 25)
CONTINUED
[Figure: g-precisiation of *a as a bimodal probability distribution p over x]
GCL-based (maximal generality): *a → g-precisiation → X isr R (GC-form)
EXAMPLE
• p: Speed limit is 100 km/h
[Figures: alternative precisiands over speed, around 100-110 km/h]
cg-precisiation: r = blank (possibilistic): crisp interval of possibility
g-precisiation: r = blank (possibilistic): fuzzy possibility distribution
g-precisiation: r = p (probabilistic): probability density
CONTINUED
[Figure: g-precisiation with r = bm (bimodal), a probability profile over speed around 100-120 km/h]
If Speed is less than *110, Prob(Ticket) is low
If Speed is between *110 and *120, Prob(Ticket) is medium
If Speed is greater than *120, Prob(Ticket) is high
THE CONCEPT OF GRANULAR VALUE
X is a: singular value (a: singleton)
X is A: granular value (A: granule)
A is defined as a generalized constraint
example: X is small (granular value; small: fuzzy set)
GRANULAR COMPUTING (GrC)
• The objects of computation in granular computing are granular values of variables and parameters
• Granular computing has a position of centrality in fuzzy logic
• Granular computing plays a key role in precisiation and deduction
• Informally: granular computing = ballpark computing
GRANULAR DEFINITION OF A FUNCTION
[Figure: a function f (a perception) summarized by the granules S, M, L into the fuzzy graph f* = small × small + medium × large + large × small]
f*:
if X is small then Y is small
if X is medium then Y is large
if X is large then Y is small
PRECISIATION AND DEDUCTION
• p: most Swedes are tall
p*: ΣCount(tall.Swedes/Swedes) is most
further precisiation:
X(h): height density function (not known)
X(h)dh: fraction of Swedes whose height is in [h, h + dh], a ≤ h ≤ b
∫a^b X(h) dh = 1
PRECISIATION AND CALIBRATION
µtall(h): membership function of tall (known)
µmost(u): membership function of most (known)
[Figures: µtall vs. height; µmost vs. fraction, rising from about 0.5 to 1; height density function X(h) on [a, b]]
CONTINUED
• fraction of tall Swedes: ∫a^b X(h) µtall(h) dh
• constraint on X(h): ∫a^b X(h) µtall(h) dh is most
• granular value: µ(X) = µmost(∫a^b X(h) µtall(h) dh)
DEDUCTION
q: How many Swedes are short?
q*: ∫a^b X(h) µshort(h) dh is ?Q
deduction:
given: ∫a^b X(h) µtall(h) dh is most
needed: ∫a^b X(h) µshort(h) dh is ?Q
• Frege principle of compositionality, precisiated version
• precisiation of a proposition requires precisiations (calibrations) of its constituents
EXTENSION PRINCIPLE (Zadeh 1965, 1975)
f(X) is A
g(X) is B
µB(v) = supu µA(f(u))
subject to: v = g(u)
CONTINUED
deduction:
given: ∫a^b X(h) µtall(h) dh is most
needed: ∫a^b X(h) µshort(h) dh is ?Q
solution:
µQ(v) = supX µmost(∫a^b X(h) µtall(h) dh)
subject to:
v = ∫a^b X(h) µshort(h) dh
∫a^b X(h) dh = 1
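The sup over height densities X(h) can be sketched numerically by restricting X to a simple parametric family, here a two-point family (a fraction w of Swedes at 190 cm, the rest at 155 cm). All membership functions below are illustrative assumptions.

```python
import numpy as np

# Assumed memberships (heights in cm; fractions in [0, 1])
mu_tall  = lambda hgt: float(np.clip((hgt - 160) / 25, 0, 1))
mu_short = lambda hgt: float(np.clip((165 - hgt) / 15, 0, 1))
mu_most  = lambda u: float(np.clip((u - 0.5) / 0.4, 0, 1))

# mu_Q(v) = sup over the family of { mu_most(fraction tall) : fraction short = v }
v_grid = np.round(np.linspace(0, 1, 101), 2)
mu_Q = dict.fromkeys(v_grid.tolist(), 0.0)
for w in np.linspace(0, 1, 1001):
    frac_tall  = w * mu_tall(190)  + (1 - w) * mu_tall(155)
    frac_short = w * mu_short(190) + (1 - w) * mu_short(155)
    score = mu_most(frac_tall)          # compatibility with "most Swedes are tall"
    v = round(frac_short, 2)
    if v in mu_Q:
        mu_Q[v] = max(mu_Q[v], score)
# mu_Q is high for small v and falls off: if most Swedes are tall, few are short
```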
CONTINUED
q: What is the average height of Swedes?
q*: ∫a^b X(h) h dh is ?Q
deduction:
given: ∫a^b X(h) µtall(h) dh is most
needed: ∫a^b X(h) h dh is ?Q
LOOKAHEAD--PROTOFORMAL DEDUCTION
Example: most Swedes are tall → 1/n ΣCount(G[H is R]) is Q (H: Height)
PROTOFORMAL DEDUCTION RULE
1/nCount(G[H is R]) is Q
1/nCount(G[H is S]) is T
i µR(hi) is Q
i µS(hi) is T
µT(v) = suph1, …, hn(µQ(i µR(hi))
subject to
v = i µS(hi)
values of H: h1, …, hn
102 /168
LAZ 7/28/2006
PROTOFORM LANGUAGE
AND
PROTOFORMAL DEDUCTION
THE CONCEPT OF A PROTOFORM
PREAMBLE
• As we move further into the age of machine intelligence and automated reasoning, a daunting problem becomes harder and harder to master. How can we cope with the explosive growth in knowledge, information and data? How can we locate, and infer from, decision-relevant information which is embedded in a large database? Among the many concepts that relate to this issue, those that stand out in importance are search, precisiation and deduction. In relation to these concepts, a basic underlying concept is that of a protoform, a concept which is centered on the confluence of abstraction and summarization
WHAT IS A PROTOFORM?

• protoform = abbreviation of prototypical form
• informally, a protoform, A, of an object, B, written as A = PF(B), is an abstracted summary of B
• usually, B is a lexical entity such as a proposition, question, command, scenario, decision problem, etc.
• more generally, B may be a relation, system, geometrical form or an object of arbitrary complexity
• usually, A is a symbolic expression, but, like B, it may be a complex object
• the primary function of PF(B) is to place in evidence the deep semantic structure of B
CONTINUED

Figure: object space → protoform space. An object p is first summarized, p → S(p), and the summary is then abstracted, S(p) → A(S(p)) = PF(p).

PF(p): abstracted summary of p; deep structure of p

• protoform equivalence
• protoform similarity
PROTOFORMS

• at a given level of abstraction and summarization, objects p and q are PF-equivalent if PF(p) = PF(q); a PF-equivalence class in object space maps to a single point in protoform space

example:

p: Most Swedes are tall     →  ΣCount(A/B) is Q
q: Few professors are rich  →  ΣCount(A/B) is Q
EXAMPLES

instantiation ↔ abstraction

• Monika is young
  Age(Monika) is young
  A(B) is C

• Monika is much younger than Robert
  (Age(Monika), Age(Robert)) is much.younger
  D(A(B), A(C)) is E

• Usually Robert returns from work at about 6:15pm
  Prob{Time(Return(Robert)) is 6:15*} is usually
  Prob{A(B) is C} is D
CONTINUED

EXTENSION VS INTENSION

• Q A’s are B’s
  (attribute-free; extension)

• most Swedes are tall
  (1/n) Count(G[H is A]) is Q
  (attribute-based; intension)
EXAMPLES

Alan has severe back pain. He goes to see a doctor. The doctor tells him that there are two options: (1) do nothing; and (2) do surgery. In the case of surgery, there are two possibilities: (a) surgery is successful, in which case Alan will be pain free; and (b) surgery is not successful, in which case Alan will be paralyzed from the neck down.

Question: Should Alan elect surgery?

Figure: the object (gain Y vs. outcome X for options 1 and 2) and its i-protoform.
PROTOFORMAL DEDUCTION

Figure: NL → GCL → PFL. A proposition p is precisiated into p* (in GCL), then summarized and abstracted into a protoform p** (in PFL); a question q is likewise carried into q**. The protoforms p** and q**, together with the World Knowledge Module (WKM), feed the deduction module (DM), which produces r** and, from it, the answer a.
PROTOFORMAL DEDUCTION

• Rules of deduction in the Deduction Database (DDB) are protoformal

examples:

(a) compositional rule of inference

symbolic:
X is A
(X, Y) is B
——————
Y is A∘B

computational:
µ_{A∘B}(v) = sup_u (µA(u) ∧ µB(u, v))

(b) Extension Principle

symbolic:
X is A
Y = f(X)
——————
Y = f(A)

computational:
µY(v) = sup_u µA(u)
subject to:
v = f(u)
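The compositional rule of inference in (a) can be sketched on finite grids. The fuzzy set A and the fuzzy relation B below are illustrative assumptions; the function itself is a direct transcription of the sup-min formula.

```python
def compose(mu_A, mu_B, U, V):
    """Compositional rule of inference: from 'X is A' and '(X, Y) is B'
    infer 'Y is A o B', where
        mu_{AoB}(v) = sup over u of min(mu_A(u), mu_B(u, v)),
    computed here on finite grids U (for X) and V (for Y)."""
    return {v: max(min(mu_A(u), mu_B(u, v)) for u in U) for v in V}

# Toy relation 'Y is approximately equal to X' (assumed membership function):
mu_eq = lambda u, v: max(0.0, 1.0 - abs(u - v))
mu_A = {1: 0.3, 2: 1.0, 3: 0.5}
result = compose(lambda u: mu_A[u], mu_eq, U=[1, 2, 3], V=[1, 2, 3])
# result[2] = 1.0, attained at u = 2 where both memberships equal 1
```

Max-min composition is used here; other t-norms (e.g. product) drop in by replacing `min`.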
RULES OF DEDUCTION

• Rules of deduction are basically rules governing generalized constraint propagation
• The principal rule of deduction is the extension principle

symbolic:
X is A
f(X) is B

computational:
µB(v) = sup_u µA(u)
subject to:
v = f(u)
GENERALIZATIONS OF THE EXTENSION PRINCIPLE

information = constraint on a variable

f(X) is A     (given information about X)
g(X) is B     (inferred information about X)

µB(v) = sup_u µA(f(u))
subject to:
v = g(u)
CONTINUED

f(X1, …, Xn) is A
g(X1, …, Xn) is B

µB(v) = sup_u µA(f(u))
subject to:
v = g(u),  u = (u1, …, un)

(X1, …, Xn) is A
gj(X1, …, Xn) is Yj,  j = 1, …, n
(Y1, …, Yn) is B

µB(v) = sup_u µA(u)
subject to:
vj = gj(u),  j = 1, …, n
EXAMPLE OF DEDUCTION

p: Most Swedes are much taller than most Italians
q: What is the difference in the average height of Swedes and Italians?

Solution

Step 1. precisiation: translation of p into GCL

S = {S1, …, Sm}: population of Swedes
I = {I1, …, In}: population of Italians
gi = height of Si, g = (g1, …, gm)
hj = height of Ij, h = (h1, …, hn)
µij = µmuch.taller(gi, hj) = degree to which Si is much taller than Ij
CONTINUED

ri = (1/n) Σj µij = Relative Count of Italians in relation to whom Si is much taller

ti = µmost(ri) = degree to which Si is much taller than most Italians

v = (1/m) Σi ti = Relative Count of Swedes who are much taller than most Italians

ts(g, h) = µmost(v)

p → generalized constraint on S and I

q: d = (1/m) Σi gi − (1/n) Σj hj
CONTINUED

Step 2. Deduction via Extension Principle

µq(d) = sup_{g,h} ts(g, h)
subject to:
d = (1/m) Σi gi − (1/n) Σj hj
DEDUCTION PRINCIPLE

1. Precisiate query
2. Precisiate query-relevant information
3. Employ constraint propagation (Extension Principle) to deduce the answer to query

example

q: What is the average height of Swedes?

Assume that P is a population of Swedes, P = (Name1, …, Namen), with hi = Height(Namei), i = 1, …, n.
CONTINUED

q → (1/n)(h1 + ··· + hn)

(qri) I: Most Swedes are tall

I → (1/n)(µtall(h1) + ··· + µtall(hn)) is most

GC(h): µmost((1/n) Σi µtall(hi)),  h = (h1, ···, hn)
CONTINUED

constraint propagation:

µmost((1/n) Σi µtall(hi))

Extension Principle:

Ave(h) = (1/n) Σi hi

µAve(h)(v) = sup_h µmost((1/n) Σi µtall(hi))
subject to:
v = (1/n) Σi hi
DEDUCTION PRINCIPLE—GENERAL FORMULATION

• Point of departure: question, q
• Data: D = (X1/u1, …, Xn/un), where ui is a generic value of Xi
• Ans(q): answer to q
• If we knew the values u1, …, un of the Xi, we could express Ans(q) as a function of the ui:
  Ans(q) = g(u1, …, un),  u = (u1, …, un)
• Our information about the ui, I(u1, …, un), is a generalized constraint on the ui. The constraint is defined by its test-score function
  ts(u) = ts(u1, …, un)
CONTINUED

• Use the extension principle

µAns(q)(v) = sup_u ts(u)
subject to:
v = g(u)
MODULAR DEDUCTION DATABASE

Figure: agent-coordinated modules—POSSIBILITY MODULE, PROBABILITY MODULE, FUZZY ARITHMETIC MODULE, FUZZY LOGIC MODULE, EXTENSION PRINCIPLE MODULE, SEARCH MODULE.
THE CONCEPT OF BIMODAL DISTRIBUTION (ZADEH 1979)

X isbm R     (X: random variable; R: bimodal distribution)

• A bimodal distribution is a collection of ordered pairs of the form
  R: {(P1, A1), …, (Pn, An)}
  or equivalently
  Σi Pi\Ai,  i = 1, …, n
  where the Pi are fuzzy probabilities and the Ai are fuzzy sets
CONTINUED

Special cases:

1. The Pi are crisp; the Ai are fuzzy
2. The Pi are fuzzy; the Ai are crisp
3. The Pi are crisp; the Ai are crisp

• The Dempster-Shafer theory of evidence is basically a theory of crisp bimodal distributions
EXAMPLE: FORD STOCK

I am considering buying Ford stock. I ask my stockbroker, “What is your perception of the near-term prospects for Ford stock?” He tells me, “A moderate decline is very likely; a steep decline is unlikely; and a moderate gain is not likely.” My question is: What is the probability of a large gain?
CONTINUED

• Information provided by my stockbroker may be represented as a collection of ordered pairs:
  Price: ((unlikely, steep.decline), (very.likely, moderate.decline), (not.likely, moderate.gain))
  In this collection, the second element of an ordered pair is a fuzzy event or, equivalently, a possibility distribution, and the first element is a fuzzy probability.
• The importance of the concept of a bimodal distribution derives from the fact that in the context of human-centric systems, most probability distributions are bimodal.
BIMODAL DISTRIBUTIONS

• Bimodal distributions can assume a variety of forms. The principal types are Type 1, Type 2 and Type 3. Type 1, 2 and 3 bimodal distributions have a common framework but differ in important detail.
BIMODAL DISTRIBUTIONS (Type 1, 2, 3)

Figure: events A1, A2, …, An and A as fuzzy subsets of U.

Type 1 (default): X is a random variable taking values in U
A1, …, An, A are events (fuzzy sets)
pi = Prob(X is Ai),  i = 1, …, n
Σi pi is unconstrained
pi is Pi (granular probability)
BMD: bimodal distribution: ((P1, A1), …, (Pn, An))
X isbm (P1\A1 + ··· + Pn\An)

Problem: What is the probability, p, of A? In general, this probability is fuzzy-set-valued, that is, granular.
CONTINUED

Type 2 (fuzzy random set): X is a fuzzy-set-valued random variable with values A1, …, An (fuzzy sets)
pi = Prob(X = Ai),  i = 1, …, n,  Σi pi = 1
BMD: X isrs (p1\A1 + ··· + pn\An)

Problem: What is the probability, p, of A? p is not definable. What are definable are (a) the expected value of the conditional possibility of A given BMD, and (b) the expected value of the conditional necessity of A given BMD.
CONTINUED

Type 3 (augmented random set; Dempster-Shafer):
• X is a set-valued random variable taking the values A1, …, An with respective probabilities p1, …, pn
• Yi is a random variable taking values in Ai, i = 1, …, n
• The probability distribution of Yi in Ai, i = 1, …, n, is not specified
• X isp (p1\A1 + ··· + pn\An)

Problem: What is the probability, p, that Y1 or Y2 … or Yn is in A? Because the probability distributions of the Yi in the Ai are not specified, p is interval-valued. What is important to note is that the concepts of upper and lower probabilities break down when the Ai are fuzzy sets.
IS DEMPSTER-SHAFER COINTENSIVE?

• In applying Dempster-Shafer theory, it is important to check on whether the data fit the Type 3 model.

Figure: an NL description of a problem may be precisiated as a bimodal distribution of Type 1, Type 2, or Type 3 (Dempster-Shafer).

• Caveat: In many cases the cointensive (well-fitting) precisiand (model) of a problem statement is a bimodal distribution of Type 1 rather than Type 3 (Dempster-Shafer).
BASIC BIMODAL DISTRIBUTION (BMD) (Type 1)
(PERCEPTION-BASED PROBABILITY DISTRIBUTION)

X is a real-valued random variable

Figure: granular probabilities P1, P2, P3 over events A1, A2, A3, with probability density g.

BMD: P(X) = Pi(1)\A1 + Pi(2)\A2 + Pi(3)\A3
Prob{X is Ai} is Pj(i)
P(X) = low\small + high\medium + low\large
INTERPOLATION OF A BASIC BIMODAL DISTRIBUTION (TYPE 1)

g(u): probability density of X

Figure: granular probabilities p1, …, pn over events A1, …, An, with a query event A.

pi is Pi: granular value of pi, i = 1, …, n
(Pi, Ai), i = 1, …, n, are given
A is given
(?P, A) is sought
INTERPOLATION MODULE AND PROBABILITY MODULE

Prob{X is Ai} is Pi,  i = 1, …, n
——————————————
Prob{X is A} is Q

µQ(v) = sup_g (µP1(∫U µA1(u) g(u) du) ∧ ··· ∧ µPn(∫U µAn(u) g(u) du))

subject to

v = ∫U µA(u) g(u) du
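A rough numerical sketch of this interpolation: the sup over densities g is approximated by sampling random normalized histograms on a grid. All membership functions below are illustrative assumptions.

```python
import random

def granular_prob(mu_Ps, mu_As, mu_A, U, du, samples=4000):
    """Interpolation of a Type 1 bimodal distribution.
    Given 'Prob{X is Ai} is Pi' (i = 1..n), approximate Q in
    'Prob{X is A} is Q':
        mu_Q(v) = sup over densities g of min_i mu_Pi(int mu_Ai(u) g(u) du)
        subject to v = int mu_A(u) g(u) du.
    The sup over g is approximated by sampling random normalized
    histograms on the grid U; v is binned to 2 decimals."""
    mu_Q = {}
    for _ in range(samples):
        w = [random.random() for _ in U]
        s = sum(w) * du
        g = [x / s for x in w]          # a probability density on the grid
        score = min(mu_P(sum(mA(u) * gi for u, gi in zip(U, g)) * du)
                    for mu_P, mA in zip(mu_Ps, mu_As))
        v = round(sum(mu_A(u) * gi for u, gi in zip(U, g)) * du, 2)
        mu_Q[v] = max(mu_Q.get(v, 0.0), score)
    return mu_Q

# Assumed calibrations on U = [0, 1], purely illustrative:
small = lambda u: max(0.0, 1.0 - 2.0 * u)
large = lambda u: max(0.0, 2.0 * u - 1.0)
about = lambda c: (lambda p: max(0.0, 1.0 - abs(p - c) / 0.2))

U = [i * 0.05 for i in range(21)]
Q = granular_prob([about(0.3), about(0.5)], [small, large],
                  mu_A=large, U=U, du=0.05)
```

Random histograms explore only a slice of the space of densities, so Q is a lower approximation; a parametric family of densities or an optimizer would tighten it.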
EXAMPLE

• Probably it will take about two hours to get from San Francisco to Monterey, and it will probably take about five hours to get from Monterey to Los Angeles. What is the probability of getting to Los Angeles in less than about seven hours?

BMD: (probably, *2) + (probably, *5)

Z = X + Y

pZ(w) = ∫ pX(u) pY(w − u) du
CONTINUED

query: ∫ pZ(w) µ*7(w) dw is ?A

qri:
µpX = µprobably(∫ µ*2(u) pX(u) du)
µpY = µprobably(∫ µ*5(v) pY(v) dv)

µA(t) = sup_{pX, pY} (µpX ∧ µpY)
subject to:
t = ∫ pZ(w) µ*7(w) dw
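The convolution pZ(w) = ∫ pX(u) pY(w − u) du at the heart of this example is easy to discretize. The uniform travel-time densities below are assumptions chosen only to exercise the formula.

```python
def convolve(px, py, du):
    """Discretized density of Z = X + Y for independent X and Y:
        p_Z(w) = int p_X(u) p_Y(w - u) du,
    with px and py sampled at the same uniform spacing du."""
    pz = [0.0] * (len(px) + len(py) - 1)
    for i, a in enumerate(px):
        for j, b in enumerate(py):
            pz[i + j] += a * b * du
    return pz

# Travel-time sketch (assumed densities): X uniform on [1.5, 2.5] h,
# Y uniform on [4.5, 5.5] h, sampled at du = 0.01 h; pz then approximates
# the density of Z = X + Y, supported on [6, 8] h.
du = 0.01
px = [1.0] * 100   # density value 1 over a 1-hour interval
py = [1.0] * 100
pz = convolve(px, py, du)
```

pz integrates to 1 and is triangular with its peak at the midpoint, as expected for the sum of two uniforms.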
TEST PROBLEMS (PROBABILITY THEORY)

• X is a real-valued random variable. What is known about X is: (a) usually X is much larger than approximately a; (b) usually X is much smaller than approximately b, where a and b are real numbers with a < b. What is the expected value of X?

• X and Y are random variables. (X, Y) takes values in the unit circle. Prob(1) is approximately 0.1; Prob(2) is approximately 0.2; Prob(3) is approximately 0.3; Prob(4) is approximately 0.4. What is the marginal distribution of X?

Figure: the unit circle in the (X, Y) plane, divided into quadrants 1, 2, 3, 4.
CONTINUED

• function: if X is small then Y is large + …
  (X is small, Y is large)
• probability distribution: low\small + low\medium + high\large + …
• Count\attribute value distribution: 5*\small + 8*\large + …

PRINCIPAL RATIONALES FOR F-GRANULATION

• detail not known
• detail not needed
• detail not wanted
OPERATIONS ON BIMODAL DISTRIBUTIONS

P(X) defines a possibility distribution of g:

π(g) = µP1(∫U µA1(u) g(u) du) ∧ ··· ∧ µPn(∫U µAn(u) g(u) du)

problem:
(a) what is the expected value of X?
EXPECTED VALUE OF A BIMODAL DISTRIBUTION

E(P*) = ∫U u g(u) du = f(g)

Extension Principle:

µE(P*)(v) = sup_g (µP1(∫U µA1(u) g(u) du) ∧ ··· ∧ µPn(∫U µAn(u) g(u) du))

subject to:

v = ∫U u g(u) du
PERCEPTION-BASED DECISION ANALYSIS

Figure: two bimodal probability distributions, PA and PB, over X.

maximization of expected utility → ranking of bimodal probability distributions → ranking of fuzzy numbers
USUALITY CONSTRAINT PROPAGATION RULE

X: random variable taking values in U
g: probability density of X

X isu A  ↔  Prob{X is A} is usually

X isu A
——————
Prob{X is B} is C

π(g) = µusually(∫U µA(u) g(u) du)

µC(v) = sup_g µusually(∫U µA(u) g(u) du)
subject to:
v = ∫U µB(u) g(u) du
PROBABILITY MODULE (CONTINUED)

X isp P
Y = f(X)
——————
Y isp f(P)

X isp P
(X, Y) is R
——————
Y isrs S

Prob{X is A} is P
——————
Prob{f(X) is B} is Q

X isu A
Y = f(X)
——————
Y isu f(A)
PNL-BASED DEFINITION OF STATISTICAL INDEPENDENCE

contingency table (X and Y granulated into S, M, L):

        X=S    X=M    X=L
Y=L    L/S    L/M    L/L
Y=M    M/S    M/M    M/L
Y=S    S/S    S/M    S/L

ΠC(M/L) = ΣCount(M × L) / ΣCount(L)

• degree of independence of Y from X = degree to which the columns for X = S, M, L are identical

PNL-based definition
WHAT IS A RANDOM SAMPLE?

• In most cases, a sample is drawn from a population which is a fuzzy set, e.g., middle class, young women, adults
• In the case of polls, fuzziness of the population which is polled may reflect the degree of applicability of the question to the person who is polled
• example (Atlanta Constitution 5-29-95)
  Is O.J. Simpson guilty?
  Random sample of 1004 adults polled by phone. 61% said “yes.” Margin of error is 3%.
• to what degree is this question applicable to a person who is n years old?
CONJUNCTION

X is A
X is B
——————
X is A∧B

X isu A
X isu B
——————
X isr A∧B

• determination of r involves interpolation of a bimodal distribution
USUALITY CONSTRAINT

X isu A
X isu B
——————
X isp P
(A∧B) ispv Q

X is A
X is B
——————
X is A∧B

g: probability density function of X
π(g): possibility distribution function of g

π(g) = µusually(∫U g(u) µA(u) du) ∧ µusually(∫U g(u) µB(u) du)
subject to: ∫U g(u) du = 1

µQ(v) = sup_g π(g)
subject to: v = ∫U g(u)(µA(u) ∧ µB(u)) du
USUALITY — QUALIFIED RULES

X isu A
——————
X isun (not A)

X isu A
Y = f(X)
——————
Y isu f(A)

µf(A)(v) = sup_{u | v = f(u)} µA(u)
USUALITY — QUALIFIED RULES

X isu A
Y isu B
Z = f(X, Y)
——————
Z isu f(A, B)

µZ(w) = sup_{u, v | w = f(u, v)} (µA(u) ∧ µB(v))
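The binary rule above is the basis of fuzzy arithmetic. A minimal sketch, with triangular "about c" membership functions as assumed calibrations:

```python
def fuzzy_combine(mu_A, mu_B, f, U, V):
    """Z = f(X, Y) with X constrained by A and Y by B (discretized
    domains U, V):
        mu_Z(w) = sup over (u, v) with w = f(u, v) of
                  min(mu_A(u), mu_B(v))."""
    mu_Z = {}
    for u in U:
        for v in V:
            w = f(u, v)
            s = min(mu_A(u), mu_B(v))
            if s > mu_Z.get(w, 0.0):
                mu_Z[w] = s
    return mu_Z

# Fuzzy addition of 'about 2' and 'about 5' on integer grids
# (assumed triangular membership functions with half-width 2):
about = lambda c: (lambda x: max(0.0, 1.0 - abs(x - c) / 2.0))
Z = fuzzy_combine(about(2), about(5), lambda u, v: u + v,
                  U=range(6), V=range(3, 9))
# Z[7] = 1.0, attained at u = 2, v = 5
```

Replacing `lambda u, v: u + v` with any other function gives the corresponding combined usuality constraint.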
SUMMATION

• The concept of GC-computation is the centerpiece of NL-computation. The point of departure in NL-computation is the key idea of representing the meaning of a proposition drawn from a natural language, p, as a generalized constraint. This mode of representation may be viewed as precisiation of p, with the result of precisiation being a precisiand, p*, of p. Each precisiand is associated with a measure, termed cointension, of the degree to which the intension of p* is a good fit to the intension of p. A principal desideratum of precisiation is that the resulting precisiand be cointensive. The concept of cointensive precisiation is a key to mechanization of natural language understanding.

• The concept of NL-computation has wide-ranging ramifications, especially within human-centric fields such as economics, law, linguistics and psychology.
DEDUCTION
THE BALLS-IN-BOX PROBLEM

Version 1. Measurement-based

• A flat box contains a layer of black and white balls. You can see the balls and are allowed as much time as you need to count them.
• q1: What is the number of white balls?
• q2: What is the probability that a ball drawn at random is white?
• q1 and q2 remain the same in the next version
DEDUCTION

Version 2. Perception-based

You are allowed n seconds to look at the box. n seconds is not enough to allow you to count the balls. You describe your perceptions in a natural language:

p1: there are about 20 balls
p2: most are black
p3: there are several times as many black balls as white balls

PT’s solution?
MEASUREMENT-BASED (version 1)

• a box contains 20 black and white balls
• over seventy percent are black
• there are three times as many black balls as white balls
• what is the number of white balls?
• what is the probability that a ball picked at random is white?

PERCEPTION-BASED (version 2)

• a box contains about 20 black and white balls
• most are black
• there are several times as many black balls as white balls
• what is the number of white balls?
• what is the probability that a ball drawn at random is white?
COMPUTATION (version 2)

measurement-based:
X = number of black balls
Y = number of white balls
X ≥ 0.7 × 20 = 14
X + Y = 20
X = 3Y
X = 15; Y = 5
p = 5/20 = 0.25

perception-based:
X = number of black balls
Y = number of white balls
X = most × 20*
X = several × Y
X + Y = 20*
P = Y/N
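The measurement-based computation is a two-equation linear system plus a consistency check, and can be verified mechanically:

```python
from fractions import Fraction

# Version 1 (measurement-based): X + Y = 20 and X = 3Y give 4Y = 20.
Y = Fraction(20, 4)
X = 3 * Y
# Consistency with 'over seventy percent are black':
assert X >= Fraction(7, 10) * 20
p_white = Y / (X + Y)
print(X, Y, p_white)   # 15 5 1/4
```

The perception-based version replaces 20, 3 and 0.7 with the fuzzy numbers 20*, several and most, turning the same system into the fuzzy integer program sketched on the next slide.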
FUZZY INTEGER PROGRAMMING

Figure: the feasible region in the (X, Y) plane defined by the fuzzy constraints X = most × 20*, X + Y = 20*, X = several × Y.
January 26, 2005

Factual Information About the Impact of Fuzzy Logic

PATENTS

• Number of fuzzy-logic-related patents applied for in Japan: 17,740
• Number of fuzzy-logic-related patents issued in Japan: 4,801
• Number of fuzzy-logic-related patents issued in the US: around 1,700
PUBLICATIONS

Count of papers containing the word “fuzzy” in the title, as cited in the INSPEC and MathSciNet databases. Compiled by Camille Wanat, Head, Engineering Library, UC Berkeley, December 22, 2004.

INSPEC - “fuzzy” in the title
1970-1979: 569
1980-1989: 2,404
1990-1999: 23,207
2000-present: 14,172
Total: 40,352

MathSciNet - “fuzzy” in the title
1970-1979: 443
1980-1989: 2,465
1990-1999: 5,483
2000-present: 3,960
Total: 12,351
JOURNALS (“fuzzy” or “soft computing” in title)

1. Fuzzy Sets and Systems
2. IEEE Transactions on Fuzzy Systems
3. Fuzzy Optimization and Decision Making
4. Journal of Intelligent & Fuzzy Systems
5. Fuzzy Economic Review
6. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems
7. Journal of Japan Society for Fuzzy Theory and Systems
8. International Journal of Fuzzy Systems
9. Soft Computing
10. International Journal of Approximate Reasoning--Soft Computing in Recognition and Search
11. Intelligent Automation and Soft Computing
12. Journal of Multiple-Valued Logic and Soft Computing
13. Mathware and Soft Computing
14. Biomedical Soft Computing and Human Sciences
15. Applied Soft Computing
APPLICATIONS

The range of application-areas of fuzzy logic is too wide for exhaustive listing. Following is a partial list of existing application-areas in which there is a record of substantial activity.

1. Industrial control
2. Quality control
3. Elevator control and scheduling
4. Train control
5. Traffic control
6. Loading crane control
7. Reactor control
8. Automobile transmissions
9. Automobile climate control
10. Automobile body painting control
11. Automobile engine control
12. Paper manufacturing
13. Steel manufacturing
14. Power distribution control
15. Software engineering
16. Expert systems
17. Operations research
18. Decision analysis
19. Financial engineering
20. Assessment of credit-worthiness
21. Fraud detection
22. Mine detection
23. Pattern classification
24. Oil exploration
25. Geology
26. Civil Engineering
27. Chemistry
28. Mathematics
29. Medicine
30. Biomedical instrumentation
31. Health-care products
32. Economics
33. Social Sciences
34. Internet
35. Library and Information Science
Product Information Addendum 1

This addendum relates to information about products which employ fuzzy logic singly or in combination. The information which is presented came from SIEMENS and OMRON. It is fragmentary and far from complete. Such addenda will be sent to the Group from time to time.

SIEMENS:
* washing machines, 2 million units sold
* fuzzy guidance for navigation systems (Opel, Porsche)
* OCS: Occupant Classification System (to determine if a place in a car is occupied by a person or something else, and to control the airbag as well as the intensity of the airbag). Here FL is used in the product as well as in the design process (optimization of parameters).
* fuzzy automobile transmission (Porsche, Peugeot, Hyundai)

OMRON:
* fuzzy logic blood pressure meter, 7.4 million units sold, approximate retail value $740 million

Note: If you have any information about products and/or manufacturing which may be of relevance, please communicate it to Dr. Vesa Niskanen [email protected] and Masoud Nikravesh [email protected].
Product Information Addendum 2

This addendum relates to information about products which employ fuzzy logic singly or in combination. The information which is presented came from Professor Hideyuki Takagi, Kyushu University, Fukuoka, Japan. Professor Takagi is the co-inventor of neurofuzzy systems. Such addenda will be sent to the Group from time to time.

Facts on FL-based systems in Japan (as of 2/06/2004)

1. Sony's FL camcorders
Total camcorder production of all companies in 1995-1998 times Sony's market share is the following. Fuzzy logic was used in all Sony camcorders in at least these four years, i.e., the total production of Sony's FL-based camcorders in these four years is 2.4 million units.
1,228K units X 49% in 1995
1,315K units X 52% in 1996
1,381K units X 50% in 1997
1,416K units X 51% in 1998

2. FL control at Idemitsu oil factories
Fuzzy logic control is running at more than 10 places at 4 oil factories of Idemitsu Kosan Co. Ltd, including not only pure FL control but also the combination of FL and conventional control. They estimate that the effect of their FL control is more than 200 million YEN per year and that it saves more than 4,000 hours per year.
3. Canon
Canon used (uses) FL in their cameras, camcorders, copy machines, and stepper alignment equipment for semiconductor production. But they have a rule not to announce their production and sales data to the public. Canon holds 31 and 31 established FL patents in Japan and the US, respectively.

4. Minolta cameras
Minolta also has a rule not to announce their production and sales data to the public. Their alpha-7xi, whose name in the US market was Maxxum 7xi, used six FL systems in a camera and was put on the market in 1991 at 98,000 YEN (body price without lenses). It was produced at 30,000 units per month in 1991. Its sister cameras, alpha-9xi, alpha-5xi, and their successors used FL systems, too. But the total number of production is confidential.
5. FL plant controllers of Yamatake Corporation
Yamatake-Honeywell (Yamatake's former name) put FUZZICS, a fuzzy software package for plant operation, on the market in 1992. It has been used for more than 10 years at plants in the oil, oil chemical, chemical, pulp, and other industries where it is hard for conventional PID controllers to describe the plant process. They planned to sell 20-30 copies of FUZZICS per year, for a total of 200 million YEN. As this software runs on Yamatake's own control systems, the software package itself is not expensive compared to the hardware control systems.

6. Others
Names of 225 FL systems and products picked up from news articles in 1987-1996 are listed at http://www.adwin.com/elec/fuzzy/note_10.html (in Japanese).

Note: If you have any information about products and/or manufacturing which may be of relevance, please communicate it to Dr. Vesa Niskanen [email protected] and Masoud Nikravesh [email protected], with cc to me.