Semantics: Representations and
Analyses
CS 4705
What kinds of meaning do we want to
capture?
• Categories/entities
– IBM, Jane, a black cat, Pres. Bush
• Events
– running a mile, AS elected governor of CA
• Time
– Oct 30, next week, in 2 years
• Aspect
– Jack knows how to run. Jack is running. Jack ran the
mile in 5 min.
• Beliefs, Desires and Intentions (BDI)
What Can Serve as a Meaning
Representation?
• Anything that allows us to
– Answer questions (What is the tallest building in the
world?)
– Determine truth (Is the blue block on the red block?)
– Draw inferences (If the blue block is on the red block
and the red block is on the tallest building in the world,
then the blue block is on the tallest building in the
world)
Meaning Representations
• All represent ‘linguistic meaning’ of I have a car
and state of affairs in some world
• All consist of structures, composed of symbols
representing objects and relations among them
– FOPC:
∃x, y{Having(x) ∧ Haver(S, x) ∧ HadThing(y, x) ∧ Car(y)}
– Semantic Net: [diagram: a having node with a haver link to speaker and a had-thing link to car]
• Conceptual Dependency Diagram: [diagram: Car linked to Speaker by a Poss-By arrow]
• Frame:
Having
  Haver: S
  HadThing: Car
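To make the parallels concrete, here is a minimal sketch (an assumed encoding, not part of the slides) of the frame and FOPC views of I have a car as plain Python data structures, reusing the slide's Having / Haver / HadThing labels:

```python
# Minimal sketch (assumed encoding, not from the slides): the same
# "I have a car" meaning as a frame and as a list of atomic predications.
frame = {
    "type": "Having",
    "Haver": "S",        # the speaker
    "HadThing": "Car",
}

# FOPC-style view, mirroring ∃x,y Having(x) ∧ Haver(S,x) ∧ HadThing(y,x) ∧ Car(y).
# Each tuple is (predicate, arg1, arg2, ...).
fopc = [
    ("Having", "x"),
    ("Haver", "S", "x"),
    ("HadThing", "y", "x"),
    ("Car", "y"),
]

print(frame["Haver"])                          # S
print([p for p in fopc if p[0] == "Haver"])    # [('Haver', 'S', 'x')]
```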
A Standard Representation: Predicate-Argument Structure
• Represents concepts and relationships among
them
– Nouns as concepts or arguments (red(ball))
– Adjectives, adverbs, verbs as predicates (red(ball))
• Subcategorization (or, argument) frames specify
number, position, and syntactic category of
arguments
– NP likes NP
– NP likes Inf-VP
– NP likes NP Inf-VP
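As a rough illustration of how such frames might be stored (the lexicon format below is an assumption, not the slides' notation), a subcategorization lexicon can map each verb to the argument patterns it licenses:

```python
# Sketch of a subcategorization lexicon (format is an illustrative assumption):
# each verb maps to the argument-frame patterns it allows, mirroring the
# slide's "NP likes NP / NP likes Inf-VP / NP likes NP Inf-VP".
SUBCAT = {
    "likes": [
        ("NP", "NP"),            # NP likes NP
        ("NP", "Inf-VP"),        # NP likes Inf-VP
        ("NP", "NP", "Inf-VP"),  # NP likes NP Inf-VP
    ],
}

def licenses(verb, frame):
    """True if the verb's lexicon entry allows this argument frame."""
    return frame in SUBCAT.get(verb, [])

print(licenses("likes", ("NP", "Inf-VP")))   # True
print(licenses("likes", ("NP",)))            # False
```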
Semantic (Thematic) Roles
• Subcat frames link arguments in surface structure
with their semantic roles
– Agent: George hit Bill. Bill was hit by George.
– Patient: George hit Bill. Bill was hit by George.
• Selectional Restrictions: constraints on the types
of arguments verbs take
George assassinated the senator.
*The spider assassinated the fly.
assassinate: intentional (political?) killing
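A toy way to operationalize selectional restrictions (the feature sets and lexicon below are invented for illustration) is to require that an argument's semantic features include those the verb demands:

```python
# Toy selectional-restriction check (lexicon and feature names are
# illustrative assumptions, not from the slides).
RESTRICTIONS = {
    # verb: (required agent features, required patient features)
    "assassinate": ({"human", "intentional"}, {"human"}),
}
FEATURES = {
    "George":  {"human", "intentional"},
    "senator": {"human", "intentional"},
    "spider":  {"animate"},
    "fly":     {"animate"},
}

def acceptable(verb, agent, patient):
    """True if both arguments satisfy the verb's selectional restrictions."""
    need_agent, need_patient = RESTRICTIONS[verb]
    return need_agent <= FEATURES[agent] and need_patient <= FEATURES[patient]

print(acceptable("assassinate", "George", "senator"))  # True
print(acceptable("assassinate", "spider", "fly"))      # False, i.e. *
```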
First Order Predicate Calculus
• Not ideal as a meaning representation and doesn't
do everything we want -- but better than many…
– Supports the determination of truth
– Supports compositionality of meaning
– Supports question-answering (via variables)
– Supports inference
NL Mapping to FOPC
• Terms: constants, functions, variables
– Constants: objects in the world, e.g. Huey
– Functions: concepts, e.g. sisterof(Huey)
– Variables: x, e.g. sisterof(x)
• Predicates: symbols that refer to relations that hold
among objects in some domain or properties that
hold of some object in a domain
likes(Huey, kibble)
cat(Huey)
• Logical connectives permit compositionality of
meaning
kibble(x) ⇒ likes(Huey, x) “Huey likes kibble”
cat(Vera) ∧ odd(Vera) “Vera is an odd cat”
sleeping(Huey) ∨ eating(Huey) “Huey is either sleeping
or eating”
• Sentences in FOPC can be assigned truth values
– Atomic formulae are T or F based on their presence or
absence in a DB (Closed World Assumption?)
– Composed meanings are inferred from DB and meaning
of logical connectives
– cat(Huey)
– sibling(Huey,Vera)
– cat(Huey) ∧ sibling(Huey,Vera) ⇒ cat(Vera)
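The closed-world lookup and the single inference step above can be sketched in a few lines of Python (an illustrative toy, not a theorem prover): an atomic formula is true iff it is present in the DB, and the cat/sibling rule is applied by one forward-chaining step.

```python
# Toy closed-world evaluator (illustrative sketch, not from the slides).
# Facts are stored as tuples; an atomic formula is True iff it is in the DB.
DB = {("cat", "Huey"), ("sibling", "Huey", "Vera")}

def true_atom(pred, *args):
    return (pred, *args) in DB          # Closed World Assumption

# cat(Huey) ∧ sibling(Huey,Vera) ⇒ cat(Vera): one forward-chaining step.
if true_atom("cat", "Huey") and true_atom("sibling", "Huey", "Vera"):
    DB.add(("cat", "Vera"))

print(true_atom("cat", "Vera"))   # True (inferred)
print(true_atom("sleeping", "Huey") or true_atom("eating", "Huey"))  # False under CWA
```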
• Limitations:
– Do ‘and’ and ‘or’ in natural language really mean ‘∧’
and ‘∨’?
Mary got married and had a baby. And then…
Your money or your life!
– Does ‘⇒’ mean ‘if’?
If you go, I’ll meet you there.
– How do we represent other connectives?
She was happy but ignorant.
• Quantifiers: ∃, ∀
– Existential quantification: There is a unicorn in my
garden. Some unicorn is in my garden.
– Universal quantification: The unicorn is a mythical
beast. Unicorns are mythical beasts.
– Many? A few? Several? A couple?
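Over a finite domain, ∃ and ∀ reduce to Python's any and all; the sketch below (domain and predicates are invented for illustration) also hints at why “many”, “a few”, and “several” have no such direct translation.

```python
# Finite-domain sketch of the two quantifiers (domain/predicates are assumptions).
domain = ["unicorn1", "pegasus1"]
in_my_garden = {"unicorn1"}
mythical     = {"unicorn1", "pegasus1"}

def is_unicorn(x):
    return x.startswith("unicorn")

# Existential: "Some unicorn is in my garden."  ∃x unicorn(x) ∧ in_garden(x)
print(any(is_unicorn(x) and x in in_my_garden for x in domain))   # True

# Universal: "Unicorns are mythical beasts."  ∀x unicorn(x) ⇒ mythical(x)
print(all((not is_unicorn(x)) or x in mythical for x in domain))  # True

# "Many" / "a few" / "several" would need counts or proportions, not just any/all.
```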
Temporal Representations
• How do we represent time and temporal
relationships between events?
Last year Martha Stewart was happy but soon she will be
in prison.
• Where do we get temporal information?
– Verb tense
– Temporal expressions
– Sequence of presentation
• Linear representations: Reichenbach ‘47
– Utterance time (U): when the utterance occurs
– Reference time (R): the temporal point-of-view of the
utterance
– Event time (E): when events described in the utterance
occur
George is eating a sandwich.
-- E,R,U →
George had eaten a sandwich (when he realized…)
-- E – R – U →
George will eat a sandwich.
-- U,R – E →
While George was eating a sandwich his mother arrived.
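A small sketch of the Reichenbach orderings above (the tense names and encoding are assumptions): each tense is a left-to-right timeline of groups of coinciding points.

```python
# Reichenbach-style sketch (encoding is an illustrative assumption):
# each tense is a left-to-right timeline of groups of coinciding points.
TENSES = {
    "present":      [{"E", "R", "U"}],      # George is eating a sandwich.
    "past perfect": [{"E"}, {"R"}, {"U"}],  # George had eaten a sandwich.
    "future":       [{"U", "R"}, {"E"}],    # George will eat a sandwich.
}

def precedes(tense, a, b):
    """True if point a is strictly before point b on the tense's timeline."""
    timeline = TENSES[tense]
    pos = {p: i for i, group in enumerate(timeline) for p in group}
    return pos[a] < pos[b]

print(precedes("past perfect", "E", "R"))  # True: event before reference time
print(precedes("future", "R", "E"))        # True: event after utterance time
```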
Verbs and Event Types: Aspect
• Statives: states or properties of objects at a particular point
in time
I am hungry.
• Activities: events with no clear endpoint
I am eating.
• Accomplishments: events with durations and endpoints
that result in some change of state
I ate dinner.
• Achievements: events that change state but have no
particular duration – they occur in an instant
I got the bill.
Beliefs, Desires and Intentions
• Very hard to represent internal speaker states like
believing, knowing, wanting, assuming, imagining
– Not well modeled by a simple DB lookup approach so..
– Truth in the world vs. truth in some possible world
George imagined that he could dance.
George believed that he could dance.
• Augment FOPC with special modal operators that
take logical formulae as arguments, e.g. believe,
know
Believes(George, dance(George))
Knows(Bill,Believes(George,dance(George)))
• Mutual belief: I believe you believe I believe….
– Practical importance: modeling belief in dialogue
– Clark’s grounding
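One simple way to write such modal formulae down (the nested-tuple encoding is an assumption) is as terms whose arguments may themselves be formulae, so beliefs about beliefs nest naturally:

```python
# Nested-term sketch for modal operators (encoding is an assumption).
# A formula is (operator_or_predicate, arg1, arg2, ...); arguments may
# themselves be formulae.
dance  = ("dance", "George")
belief = ("Believes", "George", dance)
knows  = ("Knows", "Bill", belief)

def depth(f):
    """Nesting depth of the modal/predicate structure."""
    if not isinstance(f, tuple):
        return 0
    return 1 + max(depth(a) for a in f[1:])

print(knows)         # ('Knows', 'Bill', ('Believes', 'George', ('dance', 'George')))
print(depth(knows))  # 3
```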
Compositional Semantics
• The meaning of the whole is made up of the
meaning of its parts (predicates and arguments)
– George cooks. Dan eats. Dan is sick.
– Cook(George) Eat(Dan) Sick(Dan)
– If George cooks and Dan eats, Dan will get sick.
(Cook(George) ∧ Eat(Dan)) ⇒ Sick(Dan)
Sick(Dan) ⇒ Cook(George)
• Syntax tells us how to combine the parts
Syntax-Driven Semantics
[Parse tree for “Dan eats”: S → NP VP; NP → Nom → N → Dan; VP → V → eats; the tree is annotated with the meaning eat(Dan)]
• Task: can we link up syntactic structures to a
corresponding semantic representation to produce
the ‘meaning’ structure of a sentence in the course
of parsing it?
Rule-to-Rule Hypothesis
• We don’t want to have to specify for every
possible parse tree what semantic representation it
maps to
• We want to identify general mappings from parse
trees to semantic representations:
• Hypothesis: A mapping exists between rules of
the grammar and rules of semantic representation
Semantic Attachments
• Extend each grammar rule with instructions on
how to map the components of the rule to a
semantic representation
S → NP VP {VP.sem(NP.sem)}
• Each semantic function is defined in terms of the
semantic representation of choice
• Problem: how to define these functions and how to
specify their composition so we always get the
meaning representation we want from our
grammar?
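A common way to realize attachments like S → NP VP {VP.sem(NP.sem)} is to pair each rule with a function over its children's sem values; the sketch below (the tiny grammar and meaning representation are simplified assumptions) composes “Dan eats” this way.

```python
# Sketch of semantic attachments as functions over children's .sem values
# (grammar and meaning representation are simplified assumptions).
ATTACHMENTS = {
    # rule: function from child semantics to parent semantics
    "S -> NP VP":  lambda np, vp: vp(np),   # VP.sem(NP.sem)
    "VP -> V":     lambda v: v,             # pass the verb's sem up
    "NP -> PropN": lambda propn: propn,     # pass the constant up
}
LEXICON = {
    "Dan":  "Dan",                          # PropN.sem is a constant
    "eats": lambda subj: ("eat", subj),     # V.sem is a one-place predicate
}

np_sem = ATTACHMENTS["NP -> PropN"](LEXICON["Dan"])
vp_sem = ATTACHMENTS["VP -> V"](LEXICON["eats"])
print(ATTACHMENTS["S -> NP VP"](np_sem, vp_sem))   # ('eat', 'Dan')
```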
A ‘Simple’ Example
AyCaramba serves meat.
• Associating constants with arguments
– ProperNoun → AyCaramba {AyCaramba}
– MassNoun → meat {Meat}
• Defining functions to produce these from input
– NP → ProperNoun {ProperNoun.sem}
– NP → MassNoun {MassNoun.sem}
– Assumption: meaning reps of children are passed up to
parents for non-branching constituents
• Verbs here are where the action is
– V → serves {∃e,x,y Isa(e,Serving) ∧ Server(e,x) ∧
Served(e,y)}
– Will every verb need its own distinct representation?
George served in the army
• How do we combine these pieces?
– VP → V NP
– Goal: ∃e,x Isa(e,Serving) ∧ Server(e,x) ∧
Served(e,Meat)
– VP semantics must tell us
• which variables are to be replaced by which arguments
• how this replacement is done
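Continuing that sketch (still an illustrative assumption, not the slides' machinery): if the sem of serves is curried to take the object and then the subject, the VP → V NP and S → NP VP attachments fill the Served and Server slots in turn.

```python
# Sketch of composing "AyCaramba serves meat" (encoding is an assumption):
# the verb's sem takes the object, then the subject, and yields the
# event-style formula from the slide.
def serves_sem(obj):
    return lambda subj: ("exists e",
                         ("Isa", "e", "Serving"),
                         ("Server", "e", subj),
                         ("Served", "e", obj))

vp_sem = serves_sem("Meat")       # VP -> V NP   {V.sem(NP.sem)}
s_sem = vp_sem("AyCaramba")       # S  -> NP VP  {VP.sem(NP.sem)}

print(s_sem)
# ('exists e', ('Isa', 'e', 'Serving'), ('Server', 'e', 'AyCaramba'), ('Served', 'e', 'Meat'))
```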
Non-Compositional Language
• What do we do with language whose meaning
isn’t derived from the meanings of its parts?
– Metaphor:
You’re the cream in my coffee.
She’s the cream in George’s coffee.
The break-in was just the tip of the iceberg.
This was only the tip of Shirley’s iceberg.
– Idioms:
The old man finally kicked the bucket.
The old man finally kicked the proverbial bucket.
Problems with Syntax-Driven Semantics
• Syntactic structures often don’t fit semantic
structures very well
– Important semantic elements often distributed very
differently in trees for sentences that mean ‘the same’
I like soup. Soup is what I like.
– Parse trees contain many structural elements not clearly
important to making semantic distinctions
– Syntax-driven semantic representations are sometimes
pretty bizarre
Sum
• Many hard problems in full semantic
representation:
– Temporal relations: tense, aspect
– BDI
• Current representations impoverished in many
respects
• Next time: Read Ch 16