CS60057 Speech & Natural Language Processing
Autumn 2007
Lecture 12, 22 August 2007

LING 180 / SYMBSYS 138: Intro to Computer Speech and Language Processing
Lecture 13: Grammar and Parsing (I)
October 24, 2006
Dan Jurafsky
Thanks to Jim Martin for many of these slides!
Outline for Grammar/Parsing Week

- Context-Free Grammars and Constituency
- Some common CFG phenomena for English
  - Sentence-level constructions
  - NP, PP, VP
  - Coordination
  - Subcategorization
- Top-down and Bottom-up Parsing
- Earley Parsing
- Quick sketch of probabilistic parsing
Review

- Parts of Speech
  - Basic syntactic/morphological categories that words belong to
- Part-of-Speech Tagging
  - Assigning parts of speech to all the words in a sentence
Syntax

- Syntax: from Greek syntaxis, "setting out together, arrangement"
- Refers to the way words are arranged together, and the relationships between them
- Distinction:
  - Prescriptive grammar: how people ought to talk
  - Descriptive grammar: how they do talk
- The goal of syntax is to model the knowledge that people unconsciously have about the grammar of their native language
Syntax

- Why should you care?
  - Grammar checkers
  - Question answering
  - Information extraction
  - Machine translation
4 key ideas of syntax

- Constituency (we'll spend most of our time on this)
- Grammatical relations
- Subcategorization
- Lexical dependencies
- Plus one part we won't have time for:
  - Movement/long-distance dependency
Context-Free Grammars

- Capture constituency and ordering
  - Ordering: what are the rules that govern the ordering of words and bigger units in the language?
  - Constituency: how words group into units, and how the various kinds of units behave
Constituency

- Noun phrases (NPs):
  - Three parties from Brooklyn
  - A high-class spot such as Mindy's
  - The Broadway coppers
  - They
  - Harry the Horse
  - The reason he comes into the Hot Box
- How do we know these form a constituent?
  - They can all appear before a verb:
    - Three parties from Brooklyn arrive...
    - A high-class spot such as Mindy's attracts...
    - The Broadway coppers love...
    - They sit...
Constituency (II)

- They can all appear before a verb:
  - Three parties from Brooklyn arrive...
  - A high-class spot such as Mindy's attracts...
  - The Broadway coppers love...
  - They sit...
- But individual words can't always appear before verbs:
  - *from arrive...
  - *as attracts...
  - *the is...
  - *spot is...
- Must be able to state generalizations like:
  - Noun phrases occur before verbs
Constituency (III)

- Preposing and postposing:
  - On September 17th, I'd like to fly from Atlanta to Denver
  - I'd like to fly on September 17th from Atlanta to Denver
  - I'd like to fly from Atlanta to Denver on September 17th
- But not:
  - *On September, I'd like to fly 17th from Atlanta to Denver
  - *On I'd like to fly September 17th from Atlanta to Denver
CFG Examples

- S -> NP VP
- NP -> Det NOMINAL
- NOMINAL -> Noun
- VP -> Verb
- Det -> a
- Noun -> flight
- Verb -> left
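
A minimal sketch of this toy grammar in code (assuming the NLTK library is available; this is an illustration, not part of the original slides):

import nltk

toy = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det NOMINAL
NOMINAL -> Noun
VP -> Verb
Det -> 'a'
Noun -> 'flight'
Verb -> 'left'
""")

parser = nltk.ChartParser(toy)
for tree in parser.parse("a flight left".split()):
    print(tree)
# (S (NP (Det a) (NOMINAL (Noun flight))) (VP (Verb left)))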
CFGs

- S -> NP VP
  - This says that there are units called S, NP, and VP in this language
  - That an S consists of an NP followed immediately by a VP
  - Doesn't say that that's the only kind of S
  - Nor does it say that this is the only place that NPs and VPs occur
Generativity

- As with FSAs and FSTs, you can view these rules as either analysis or synthesis machines
  - Generate strings in the language
  - Reject strings not in the language
  - Impose structures (trees) on strings in the language
Derivations

- A derivation is a sequence of rules applied to a string that accounts for that string
  - Covers all the elements in the string
  - Covers only the elements in the string
Derivations as Trees

[Figure: the derivation drawn as a parse tree; not reproduced in the transcript]
CFGs more formally

- A context-free grammar has 4 parameters ("is a 4-tuple"):
  1) A set of non-terminal symbols ("variables") N
  2) A set of terminal symbols Σ (disjoint from N)
  3) A set of productions P, each of the form A -> α, where A is a non-terminal and α is a string of symbols from the infinite set of strings (Σ ∪ N)*
  4) A designated start symbol S
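
The same 4-tuple, written out directly in Python for the toy grammar above (a sketch for illustration, not from the slides):

N = {"S", "NP", "NOMINAL", "VP", "Det", "Noun", "Verb"}   # non-terminals
Sigma = {"a", "flight", "left"}                           # terminals, disjoint from N
P = {                                                     # productions A -> alpha
    "S": [["NP", "VP"]],
    "NP": [["Det", "NOMINAL"]],
    "NOMINAL": [["Noun"]],
    "VP": [["Verb"]],
    "Det": [["a"]],
    "Noun": [["flight"]],
    "Verb": [["left"]],
}
S = "S"                                                   # designated start symbol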
Defining a CF language via derivation

- A string A derives a string B if A can be rewritten as B via some series of rule applications
- More formally:
  - If A -> β is a production of P, and α and γ are any strings in the set (Σ ∪ N)*,
  - then we say that αAγ directly derives αβγ, or αAγ ⇒ αβγ
- Derivation is a generalization of direct derivation:
  - Let α1, α2, ..., αm be strings in (Σ ∪ N)*, m >= 1, such that
    α1 ⇒ α2, α2 ⇒ α3, ..., αm-1 ⇒ αm
  - Then we say that α1 derives αm, or α1 ⇒* αm
- We then formally define the language L(G) generated by grammar G as the set of strings composed of terminal symbols derived from S:
  L(G) = {w | w ∈ Σ* and S ⇒* w}
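
A minimal sketch of derivation-as-generation, using the 4-tuple (N, Sigma, P, S) written out above (illustrative, not from the slides): repeatedly rewrite non-terminals until only terminals remain, yielding a string of L(G).

import random

def generate(symbol="S"):
    # A terminal rewrites to itself; a non-terminal is expanded
    # by a (randomly chosen) production A -> alpha from P.
    if symbol in Sigma:
        return [symbol]
    rhs = random.choice(P[symbol])
    return [t for s in rhs for t in generate(s)]

print(" ".join(generate()))  # -> "a flight left"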
Parsing

- Parsing is the process of taking a string and a grammar and returning a parse tree (or many parse trees) for that string
Context?

- The notion of "context" in CFGs has nothing to do with the ordinary meaning of the word context in language
- All it really means is that the non-terminal on the left-hand side of a rule is out there all by itself (free of context):
  A -> B C
  means that I can rewrite an A as a B followed by a C regardless of the context in which A is found
Key Constituents (English)

- Sentences
- Noun phrases
- Verb phrases
- Prepositional phrases
Sentence-Types

- Declaratives: A plane left
  S -> NP VP
- Imperatives: Leave!
  S -> VP
- Yes-No Questions: Did the plane leave?
  S -> Aux NP VP
- WH Questions: When did the plane leave?
  S -> WH Aux NP VP
NPs

- NP -> Pronoun
  - I came, you saw it, they conquered
- NP -> Proper-Noun
  - Los Angeles is west of Texas
  - John Hennessy is the president of Stanford
- NP -> Det Noun
  - The president
- NP -> Nominal
  Nominal -> Noun Noun
  - A morning flight to Denver
PPs

- PP -> Preposition NP
  - From LA
  - To Boston
  - On Tuesday
  - With lunch
Recursion

- We'll have to deal with rules such as the following, where the non-terminal on the left also appears somewhere on the right (directly):
  NP -> NP PP    [[The flight] [to Boston]]
  VP -> VP PP    [[departed Miami] [at noon]]
Recursion

- Of course, this is what makes syntax interesting:
  flights from Denver
  Flights from Denver to Miami
  Flights from Denver to Miami in February
  Flights from Denver to Miami in February on a Friday
  Flights from Denver to Miami in February on a Friday under $300
  Flights from Denver to Miami in February on a Friday under $300 with lunch
Recursion

- Of course, this is what makes syntax interesting:
  [[flights] [from Denver]]
  [[[Flights] [from Denver]] [to Miami]]
  [[[[Flights] [from Denver]] [to Miami]] [in February]]
  [[[[[Flights] [from Denver]] [to Miami]] [in February]] [on a Friday]]
  Etc.
Implications of recursion and context-freeness

- If you have a rule like
  VP -> V NP
  it only cares that the thing after the verb is an NP. It doesn't have to know about the internal affairs of that NP.
The Point

- VP -> V NP
- "I hate" can combine with any of these NPs:
  flights from Denver
  Flights from Denver to Miami
  Flights from Denver to Miami in February
  Flights from Denver to Miami in February on a Friday
  Flights from Denver to Miami in February on a Friday under $300
  Flights from Denver to Miami in February on a Friday under $300 with lunch
Bracketed Notation

- [S [NP [PRO I]] [VP [V prefer] [NP [Det a] [Nom [N morning] [N flight]]]]]
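
A quick sketch (not from the slides): NLTK's Tree class reads the same bracketed notation, written with round brackets, and can pretty-print the tree.

from nltk import Tree

t = Tree.fromstring(
    "(S (NP (PRO I)) (VP (V prefer) (NP (Det a) (Nom (N morning) (N flight)))))"
)
t.pretty_print()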
Coordination Constructions

- S -> S and S
  - John went to NY and Mary followed him
- NP -> NP and NP
- VP -> VP and VP
- ...
- In fact, the right rule for English is
  X -> X and X
Problems

- Agreement
- Subcategorization
- Movement (for want of a better term)
Agreement

- This dog            *This dogs
- Those dogs          *Those dog
- This dog eats       *This dog eat
- Those dogs eat      *Those dogs eats
Possible CFG Solution

- Old rules:
  - S -> NP VP
  - NP -> Det Nominal
  - VP -> V NP
  - ...
- New rules:
  - SgS -> SgNP SgVP
  - PlS -> PlNP PlVP
  - SgNP -> SgDet SgNom
  - PlNP -> PlDet PlNom
  - SgVP -> SgV NP
  - PlVP -> PlV NP
  - ...
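
A sketch of these split-category rules in NLTK (the grammar fragment and lexical entries below are illustrative assumptions, not from the slides):

import nltk

agr = nltk.CFG.fromstring("""
S -> SgNP SgVP | PlNP PlVP
SgNP -> SgDet SgNom
PlNP -> PlDet PlNom
SgVP -> SgV
PlVP -> PlV
SgDet -> 'this'
PlDet -> 'those'
SgNom -> 'dog'
PlNom -> 'dogs'
SgV -> 'eats'
PlV -> 'eat'
""")

parser = nltk.ChartParser(agr)
print(len(list(parser.parse("this dog eats".split()))))   # 1: grammatical
print(len(list(parser.parse("this dogs eats".split()))))  # 0: agreement violation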
CFG Solution for Agreement

- It works, and stays within the power of CFGs
- But it's ugly
- And it doesn't scale all that well
Subcategorization

- Sneeze: John sneezed
- Find: Please find [a flight to NY]NP
- Give: Give [me]NP [a cheaper fare]NP
- Help: Can you help [me]NP [with a flight]PP
- Prefer: I prefer [to leave earlier]TO-VP
- Said: You said [United has a flight]S
- ...
Subcategorization

- *John sneezed the book
- *I prefer United has a flight
- *Give with a flight
- Subcategorization expresses the constraints that a predicate (a verb, for now) places on the number and syntactic types of arguments it wants to take (occur with)
So?

- So the various rules for VPs overgenerate
  - They permit strings containing verbs and arguments that don't go together
  - For example, given VP -> V NP, "sneezed the book" is a VP, since "sneeze" is a verb and "the book" is a valid NP
Subcategorization

- Sneeze: John sneezed
- Find: Please find [a flight to NY]NP
- Give: Give [me]NP [a cheaper fare]NP
- Help: Can you help [me]NP [with a flight]PP
- Prefer: I prefer [to leave earlier]TO-VP
- Told: I was told [United has a flight]S
- ...
Forward Pointer

- It turns out that verb subcategorization facts will provide a key element for semantic analysis (determining who did what to whom in an event)
Possible CFG Solution

- Old rules:
  - VP -> V
  - VP -> V NP
  - VP -> V NP PP
  - ...
- New rules:
  - VP -> IntransV
  - VP -> TransV NP
  - VP -> TransVwPP NP PP
  - ...
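
A sketch of subcategorized verb classes blocking overgeneration, in NLTK (the lexical entries are illustrative assumptions, not from the slides):

import nltk

subcat = nltk.CFG.fromstring("""
S -> NP VP
NP -> 'John' | 'the' 'book'
VP -> IntransV | TransV NP
IntransV -> 'sneezed'
TransV -> 'found'
""")

parser = nltk.ChartParser(subcat)
print(len(list(parser.parse("John sneezed".split()))))           # 1: fine
print(len(list(parser.parse("John sneezed the book".split()))))  # 0: blocked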
Movement

- Core example:
  - My travel agent booked the flight
Movement

- Core example:
  - [[My travel agent]NP [booked [the flight]NP]VP]S
- I.e., "book" is a straightforward transitive verb. It expects a single NP argument within the VP, and a single NP argument as the subject.
Movement

- What about?
  - Which flight do you want me to have the travel agent book?
- The direct object argument to "book" isn't appearing in the right place. It is in fact a long way from where it's supposed to appear.
- And note that it's separated from its verb by two other verbs.
CFGs: a summary

- CFGs appear to be just about what we need to account for a lot of basic syntactic structure in English
- But there are problems
  - That can be dealt with adequately, although not elegantly, by staying within the CFG framework
- There are simpler, more elegant solutions that take us out of the CFG framework (beyond its formal power)
  - Syntactic theories: HPSG, LFG, CCG, Minimalism, etc.
Other Syntactic Stuff

- Grammatical Relations
  - Subject
    - I booked a flight to New York
    - The flight was booked by my agent
  - Object
    - I booked a flight to New York
  - Complement
    - I said that I wanted to leave
Dependency Parsing

- Word-to-word links instead of constituency
- Based on the European rather than American tradition, but dates back to the Greeks
- The original notions of Subject and Object, and the progenitor of subcategorization (called "valence"), came out of dependency theory
- Dependency parsing is quite popular as a computational model, since relationships between words are quite useful
Parsing

- Parsing: assigning correct trees to input strings
- Correct tree: a tree that covers all and only the elements of the input and has an S at the top
- For now: enumerate all possible trees
  - A further task, disambiguation, means choosing the correct tree from among all the possible trees
Parsing: examples using Rion Snow's visualizer

- Stanford parser
  http://ai.stanford.edu/~rion/parsing/stanford_viz.html
- Minipar parser
  http://ai.stanford.edu/~rion/parsing/minipar_viz.html
- Link grammar parser
  http://ai.stanford.edu/~rion/parsing/linkparser_viz.html
Treebanks

- Parsed corpora in the form of trees
- Examples follow on the next slide
Parsed Corpora: Treebanks

- The Penn Treebank
  - The Brown corpus
  - The WSJ corpus
- Tgrep
  http://www.ldc.upenn.edu/ldc/online/treebank/
Parsing

- As with everything of interest, parsing involves a search, which involves making choices
- We'll start with some basic (meaning bad) methods before moving on to the one or two that you need to know
For Now

- Assume...
  - You have all the words already in some buffer
  - The input isn't POS-tagged
  - We won't worry about morphological analysis
  - All the words are known
Top-Down Parsing

- Since we're trying to find trees rooted with an S (Sentence), start with the rules that give us an S
- Then work your way down from there to the words
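
A minimal sketch of top-down (recursive-descent) parsing for the toy grammar from earlier (an illustration under stated assumptions, not the slides' exact algorithm): expand non-terminals from S downward, checking each expansion against the words.

RULES = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "NOMINAL"]],
    "NOMINAL": [["Noun"]],
    "VP": [["Verb"]],
    "Det": [["a"]], "Noun": [["flight"]], "Verb": [["left"]],
}
TERMINALS = {"a", "flight", "left"}

def parse(symbol, words, pos):
    """Yield every input position reachable after deriving `symbol` from words[pos:]."""
    if symbol in TERMINALS:
        if pos < len(words) and words[pos] == symbol:
            yield pos + 1
        return
    for rhs in RULES[symbol]:
        positions = [pos]
        for sub in rhs:                  # match the right-hand side left to right
            positions = [q for p in positions for q in parse(sub, words, p)]
        yield from positions

sent = "a flight left".split()
print(any(p == len(sent) for p in parse("S", sent, 0)))  # True: sentence accepted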
Top Down Space

[Figure: the top-down search space; not reproduced in the transcript]
Bottom-Up Parsing

- Of course, we also want trees that cover the input words
- So start with trees that link up with the words in the right way
- Then work your way up from there
Bottom-Up Space

[Figure: the bottom-up search space; not reproduced in the transcript]
Control

- Of course, in both cases we left out how to keep track of the search space and how to make choices
  - Which node to try to expand next
  - Which grammar rule to use to expand a node
Top-Down, Depth-First, Left-to-Right Search

[Figure slides: a worked example of the search, expanding the tree step by step; not reproduced in the transcript]
Control

- Does this sequence make any sense?
Top-Down and Bottom-Up

- Top-down
  - Only searches for trees that can be answers (i.e., S's)
  - But also suggests trees that are not consistent with the words
- Bottom-up
  - Only forms trees consistent with the words
  - But suggests trees that make no sense globally
So Combine Them

- There are a million ways to combine top-down expectations with bottom-up data to get more efficient searches
- Most use one kind as the control and the other as a filter
  - As in top-down parsing with bottom-up filtering
Bottom-Up Filtering

[Figure: bottom-up filtering example; not reproduced in the transcript]
Top-Down, Depth-First, Left-to-Right Search (I-IV)

[Figure slides: successive snapshots of the search working down to the word "flight"; not reproduced in the transcript]
Adding Bottom-Up Filtering

[Figure: the same search with bottom-up filtering added; not reproduced in the transcript]
3 problems with the top-down, depth-first, left-to-right parser

- Left-recursion
- Ambiguity
- Inefficient reparsing of subtrees
Left-Recursion

- What happens in the following situation?
  - S -> NP VP
  - S -> Aux NP VP
  - NP -> NP PP
  - NP -> Det Nominal
  - ...
- With a sentence starting with:
  - Did the flight...
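
A minimal sketch of the trap (illustrative, not from the slides): a naive top-down parser expanding the left-recursive rule NP -> NP PP asks for an NP at the same input position before consuming any word, so it descends forever.

def parse_np(words, pos):
    # Try NP -> NP PP first: recurse on NP at the SAME position.
    return parse_np(words, pos)   # no progress is ever made

try:
    parse_np("did the flight".split(), 0)
except RecursionError:
    print("left recursion: the naive top-down parser never terminates")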
Ambiguity

- One morning I shot an elephant in my pajamas. How he got into my pajamas I don't know. (Groucho Marx)
Lots of ambiguity

- VP -> VP PP
- NP -> NP PP
- Show me the meal on flight 286 from SF to Denver
- 14 parses!
Lots of ambiguity

- Church and Patil (1982)
  - The number of parses for such sentences grows at the rate of the number of parenthesizations of arithmetic expressions
  - Which grow with the Catalan numbers:

    $C(n) = \frac{1}{n+1}\binom{2n}{n}$

    PPs   Parses
    1     2
    2     5
    3     14
    4     132
    5     469
    6     1430
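
A quick check of the formula in code (a sketch, not from the slides):

from math import comb

def catalan(n):
    # C(n) = (1 / (n + 1)) * binom(2n, n); the integer division is exact here
    return comb(2 * n, n) // (n + 1)

print([catalan(n) for n in range(1, 9)])
# [1, 2, 5, 14, 42, 132, 429, 1430]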
Avoiding Repeated Work

- Parsing is hard, and slow. It's wasteful to redo stuff over and over and over.
- Consider an attempt to top-down parse the following as an NP:
  A flight from Indianapolis to Houston on TWA
[Figure slides: the same "flight" subtree is rebuilt again and again as the parser backtracks; not reproduced in the transcript]
Dynamic Programming

- We need a method that fills a table with partial results, one that:
  - Does not do (avoidable) repeated work
  - Does not fall prey to left-recursion
  - Can find all the pieces of an exponential number of trees in polynomial time
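
A minimal sketch of the table-filling idea, CKY-style (one standard dynamic-programming parser; Earley, from the outline, is another). The CNF grammar fragment below is a hypothetical toy, not from the slides; the point is that each span of the input is filled exactly once.

from collections import defaultdict

BINARY = {("Det", "Noun"): "NP", ("NP", "PP"): "NP",
          ("P", "NP"): "PP", ("NP", "VP"): "S"}
LEXICON = {"a": {"Det"}, "flight": {"Noun"}, "to": {"P"}, "Boston": {"NP"}}

def cky(words):
    n = len(words)
    chart = defaultdict(set)              # (i, j) -> labels found over words[i:j]
    for i, w in enumerate(words):
        chart[(i, i + 1)] |= LEXICON[w]
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):     # every split point; each span filled once
                for b in chart[(i, k)]:
                    for c in chart[(k, j)]:
                        if (b, c) in BINARY:
                            chart[(i, j)].add(BINARY[(b, c)])
    return chart

print(cky("a flight to Boston".split())[(0, 4)])  # {'NP'}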
Possible improvements

- In bigram POS tagging, we condition a tag only on the preceding tag
- Why not...
  - use more context (e.g., a trigram model)
    - more precise:
      - "is clearly marked" --> verb, past participle
      - "he clearly marked" --> verb, past tense
  - combine trigram, bigram, and unigram models
  - condition on words too
- But with an n-gram approach, this is too costly (too many parameters to model)