Transcript: Syntax
Sudeshna Sarkar
25 Aug 2008
Sentence-Types
Declaratives: A plane left
S -> NP VP
Imperatives: Leave!
S -> VP
Yes-No Questions: Did the plane leave?
S -> Aux NP VP
WH Questions: When did the plane leave?
S -> WH Aux NP VP
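These four rule schemas are easy to write down as data. Below is a minimal sketch (plain Python, no parsing library; the rule table and the sentence_type helper are invented for illustration):

# A minimal sketch: the four sentence-type rules as a CFG fragment.
# (Illustration only; not a full grammar.)
SENTENCE_RULES = {
    "declarative": ["NP", "VP"],               # S -> NP VP       "A plane left"
    "imperative": ["VP"],                      # S -> VP          "Leave!"
    "yes-no question": ["Aux", "NP", "VP"],    # S -> Aux NP VP   "Did the plane leave?"
    "wh question": ["WH", "Aux", "NP", "VP"],  # S -> WH Aux NP VP "When did the plane leave?"
}

def sentence_type(constituents):
    """Return the sentence type whose rule matches a top-level constituent sequence."""
    for stype, rhs in SENTENCE_RULES.items():
        if constituents == rhs:
            return stype
    return None

print(sentence_type(["Aux", "NP", "VP"]))  # "yes-no question"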
Recursion
We'll have to deal with rules such as the following, where the non-terminal on the left also appears somewhere on the right (directly):
Nominal -> Nominal PP    [[flight] [to Boston]]
VP -> VP PP    [[departed Miami] [at noon]]
Recursion
Of course, this is what makes syntax interesting:
flights from Denver
flights from Denver to Miami
flights from Denver to Miami in February
flights from Denver to Miami in February on a Friday
flights from Denver to Miami in February on a Friday under $300
flights from Denver to Miami in February on a Friday under $300 with lunch
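Each line above is one more application of Nominal -> Nominal PP. A minimal sketch of that expansion (plain Python; the PP list is taken from the examples above):

# A minimal sketch of Nominal -> Nominal PP recursion (illustration only).
# Each loop iteration applies the rule once more, nesting the previous Nominal.
pps = ["from Denver", "to Miami", "in February", "on a Friday",
       "under $300", "with lunch"]

nominal = "flights"
for pp in pps:
    nominal = f"{nominal} {pp}"   # Nominal -> Nominal PP
    print(nominal)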
The Point
If you have a rule like
VP -> V NP
it only cares that the thing after the verb is an NP. It doesn't have to know about the internal affairs of that NP.
Conjunctive Constructions
S -> S and S
(John went to NY and Mary followed him)
NP -> NP and NP
VP -> VP and VP
…
In fact, the right rule for English is the schema
X -> X and X
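Since X -> X and X is a schema rather than a single CFG rule, one way to stay within plain CFGs is to instantiate it once per category. A minimal sketch (plain Python; the category list is illustrative, not exhaustive):

# A minimal sketch: instantiate the schema X -> X and X per category.
# (Illustration only; the category list is an assumption.)
categories = ["S", "NP", "VP", "PP", "Nominal"]

conjunction_rules = [(cat, [cat, "and", cat]) for cat in categories]

for lhs, rhs in conjunction_rules:
    print(f"{lhs} -> {' '.join(rhs)}")   # e.g. NP -> NP and NP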
Problems
Agreement
Subcategorization
Movement (for want of a better term)
Agreement
This dog
Those dogs
*This dogs
*Those dog
This dog eats
Those dogs eat
*This dog eat
*Those dogs eats
Agreement
In English:
Subjects and verbs have to agree in person and number.
Determiners and nouns have to agree in number.
Many languages have agreement systems that are far more complex than this.
Subcategorization
Sneeze: John sneezed
Find: Please find [a flight to NY]_NP
Give: Give [me]_NP [a cheaper fare]_NP
Help: Can you help [me]_NP [with a flight]_PP
Prefer: I prefer [to leave earlier]_TO-VP
Told: I was told [United has a flight]_S
…
Subcategorization
*John sneezed the book
*I prefer United has a flight
*Give with a flight
Subcategorization expresses the constraints that a predicate (a verb, for now) places on the number and syntactic types of the arguments it wants to take (occur with).
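A common way to encode these constraints is a subcategorization lexicon mapping each verb to its allowed complement frames. A minimal sketch (plain Python; the frame notation and entries are invented for illustration, following the examples above):

# A minimal sketch of a subcategorization lexicon (illustration only).
# Each verb maps to the complement frames it allows after the verb.
SUBCAT = {
    "sneeze": [[]],              # intransitive: no complements
    "find":   [["NP"]],          # find NP
    "give":   [["NP", "NP"]],    # give NP NP
    "help":   [["NP", "PP"]],    # help NP PP
    "prefer": [["TO-VP"]],       # prefer to-VP
}

def licensed(verb, args):
    """Check whether a verb allows this sequence of complement types."""
    return args in SUBCAT.get(verb, [])

print(licensed("sneeze", []))       # True:  "John sneezed"
print(licensed("sneeze", ["NP"]))   # False: *"John sneezed the book"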
So?
So the various rules for VPs overgenerate: they permit strings containing verbs and arguments that don't go together.
For example, given
VP -> V NP
"sneezed the book" is a VP, since "sneeze" is a verb and "the book" is a valid NP.
So What?
Now overgeneration is a problem for a generative approach: the grammar is supposed to account for all and only the strings in a language.
From a practical point of view... it's not so clear that there's a problem.
Why?
Possible CFG Solution
S -> NP VP
NP -> Det Nominal
VP -> V NP
…
SgS -> SgNP SgVP
PlS -> PlNP PlVP
SgNP -> SgDet SgNom
PlNP -> PlDet PlNom
PlVP -> PlV NP
SgVP -> SgV NP
…
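The splitting is mechanical: mark, per rule, which right-hand-side positions agree with the left-hand side, and emit one copy per feature value. A minimal sketch (plain Python; the per-rule agreement annotations are an assumption for illustration):

# A minimal sketch: mechanically split rules by number (illustration only).
# Only the marked right-hand-side positions get the Sg/Pl prefix
# (the object NP in VP -> V NP does not agree with the verb in English).
base_rules = [
    ("S",  ["NP", "VP"], [0, 1]),      # subject NP and VP agree with S
    ("NP", ["Det", "Nom"], [0, 1]),    # determiner and nominal agree
    ("VP", ["V", "NP"], [0]),          # only the verb agrees
]

for number in ("Sg", "Pl"):
    for lhs, rhs, agreeing in base_rules:
        new_rhs = [number + sym if i in agreeing else sym
                   for i, sym in enumerate(rhs)]
        print(f"{number}{lhs} -> {' '.join(new_rhs)}")  # 3 rules become 6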
CFG Solution for Agreement
It works and stays within the power of CFGs.
But it's ugly.
And it doesn't scale all that well.
Forward Pointer
It turns out that verb subcategorization facts will provide a key element for semantic analysis (determining who did what to whom in an event).
Movement
Core (canonical) example
My travel agent booked the flight
Movement
Core example
[[My travel agent]_NP [booked [the flight]_NP]_VP]_S
I.e., "book" is a straightforward transitive verb: it expects a single NP argument within the VP, and a single NP argument as the subject.
Movement
What about:
Which flight do you want me to have the travel agent book?
The direct object argument to "book" isn't appearing in the right place; it is in fact a long way from where it's supposed to appear.
And note that it's separated from its verb by two other verbs.
The Point
CFGs appear to be just about what we need to account for a lot of basic syntactic structure in English.
But there are problems
that can be dealt with adequately, although not elegantly, by staying within the CFG framework.
There are simpler, more elegant solutions that take us out of the CFG framework (beyond its formal power).
Grammars
Before you can parse you need a grammar.
So where do grammars come from?
Grammar Engineering
– Lovingly hand-crafted, decades-long efforts by humans to write grammars (typically in some particular grammar formalism of interest to the linguists developing the grammar).
TreeBanks
– Semi-automatically generated sets of parse trees for the sentences in some corpus, typically in a generic lowest-common-denominator formalism (of no particular interest to any modern linguist).
TreeBank Grammars
Reading off the grammar…
The grammar is the set of rules (local subtrees) that occur in the annotated corpus.
They tend to avoid recursion (and elegance and parsimony); i.e., they tend to be flat and redundant.
Penn TreeBank (III) has about 17,500 grammar rules under this definition.
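Reading off the grammar is just a walk over each tree collecting local subtrees. A minimal sketch (plain Python, with trees as nested tuples; the example tree is invented for illustration):

# A minimal sketch of "reading off" a treebank grammar (illustration only).
# A tree is (label, child, child, ...); a leaf is a plain word string.
def rules(tree, out=None):
    """Collect every local subtree as a rule LHS -> RHS."""
    if out is None:
        out = set()
    label, children = tree[0], tree[1:]
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    out.add((label, rhs))
    for c in children:
        if not isinstance(c, str):
            rules(c, out)
    return out

tree = ("S", ("NP", ("Det", "a"), ("Noun", "plane")),
             ("VP", ("Verb", "left")))
for lhs, rhs in sorted(rules(tree)):
    print(f"{lhs} -> {' '.join(rhs)}")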
TreeBanks
[figures omitted: sample treebank trees]
Sample Rules
[figure omitted]
Example
[figure omitted]
TreeBanks
TreeBanks provide a grammar (of a sort).
As we'll see, they also provide the training data for various ML approaches to parsing.
But they can also provide useful data for more purely linguistic pursuits.
You might have a theory about whether or not something can happen in a particular language, or a theory about the contexts in which something can happen.
TreeBanks can give you the means to explore those theories, if you can formulate the questions in the right way and get the data you need.
Tgrep
You might, for example, like to grep through a file filled with trees.
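That is what tgrep-style tools do: match structural patterns against every tree in a corpus. As a rough illustration of the idea only (plain Python, reusing the tuple trees from the earlier sketch; this is not tgrep's actual pattern language), here is a search for every NP that immediately dominates a PP:

# A rough illustration of tgrep-style searching (not tgrep's own syntax).
# Trees are (label, child, ...) tuples as in the earlier sketch.
def find(tree, parent_label, child_label):
    """Yield subtrees labelled parent_label that immediately dominate child_label."""
    if isinstance(tree, str):
        return
    label, children = tree[0], tree[1:]
    child_labels = [c[0] for c in children if not isinstance(c, str)]
    if label == parent_label and child_label in child_labels:
        yield tree
    for c in children:
        yield from find(c, parent_label, child_label)

tree = ("NP", ("NP", ("Noun", "flights")),
              ("PP", ("P", "to"), ("NP", ("Noun", "Boston"))))
for match in find(tree, "NP", "PP"):
    print(match)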
TreeBanks
Finally, you should have noted a bit of a circular argument here.
Treebanks provide a grammar because we can read the rules of the grammar out of the treebank.
But how did the trees get in there in the first place?
There must have been a grammar theory in there someplace…
TreeBanks
Typically, not all of the sentences are hand-annotated
by humans.
They're automatically parsed and then hand-corrected.
Parsing
Parsing with CFGs refers to the task of assigning correct trees to input strings.
Correct here means a tree that covers all and only the elements of the input and has an S at the top.
It doesn't actually mean that the system can select the correct tree from among all the possible trees.
Parsing
As with everything of interest, parsing involves a search, which involves making choices.
We'll start with some basic (meaning bad) methods before moving on to the one or two that you need to know.
For Now
Assume…
You have all the words already in some buffer
The input isn’t POS tagged
We won’t worry about morphological analysis
All the words are known
Top-Down Parsing
Since we're trying to find trees rooted with an S (sentences), start with the rules that give us an S.
Then work your way down from there to the words.
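A naive top-down strategy can be written as a backtracking recursive-descent recognizer. A minimal sketch (plain Python; the toy grammar and lexicon are invented for illustration, and it assumes no left-recursive rules, which would make it loop forever):

# A minimal backtracking top-down recognizer (illustration only).
# NOTE: left-recursive rules (e.g. NP -> NP PP) would make this loop forever.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "Noun"]],
    "VP": [["Verb"], ["Verb", "NP"]],
}
LEXICON = {"Det": {"a", "the"}, "Noun": {"plane", "flight"}, "Verb": {"left", "book"}}

def parse(symbol, words, i):
    """Yield every position j such that symbol derives words[i:j]."""
    if symbol in LEXICON:                      # pre-terminal: match one word
        if i < len(words) and words[i] in LEXICON[symbol]:
            yield i + 1
        return
    for rhs in GRAMMAR[symbol]:                # try each expansion of symbol
        positions = [i]
        for sym in rhs:                        # thread end positions through the RHS
            positions = [j2 for j in positions for j2 in parse(sym, words, j)]
        yield from positions

words = "a plane left".split()
print(len(words) in set(parse("S", words, 0)))  # True: it's a sentence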
Top-Down Space
[figure omitted: the top-down search space]
Bottom-Up Parsing
Of course, we also want trees that cover the input words. So start with trees that link up with the words in the right way.
Then work your way up from there.
Bottom-Up Space
[figures omitted: the bottom-up search space]
Control
Of course, in both cases we left out how to keep track of the search space and how to make choices:
Which node to try to expand next
Which grammar rule to use to expand a node
Top-Down and Bottom-Up
Top-down
Only searches for trees that can be answers (i.e. S's)
But also suggests trees that are not consistent with any of the words
Bottom-up
Only forms trees consistent with the words
But suggests trees that make no sense globally
Problems
Even with the best filtering, backtracking methods are doomed if they don't address certain problems:
Ambiguity
Shared subproblems
Ambiguity
[figure omitted]
Shared Sub-Problems
No matter what kind of search (top-down, bottom-up, or mixed) we choose,
we don't want to unnecessarily redo work we've already done.
Shared Sub-Problems
Consider
A flight from Indianapolis to Houston on TWA
Shared Sub-Problems
Assume a top-down parse making bad initial choices on the Nominal rule.
In particular…
Nominal -> Nominal Noun
Nominal -> Nominal PP
Shared Sub-Problems
[figures omitted]
Parsing
CKY
Earley
Both are dynamic programming solutions that run in O(n^3) time.
CKY is bottom-up
Earley is top-down
Sample Grammar
[figure omitted]
Dynamic Programming
DP methods fill tables with partial results and:
Do not do too much avoidable repeated work
Solve exponential problems in polynomial time (sort of)
Efficiently store ambiguous structures with shared sub-parts
CKY Parsing
First we'll limit our grammar to epsilon-free, binary rules (more later).
Consider the rule A -> B C:
If there is an A in the input, then there must be a B followed by a C in the input.
If the A spans from i to j in the input, then there must be some k s.t. i < k < j.
– I.e., the B splits from the C someplace.
CKY
So let's build a table so that an A spanning from i to j in the input is placed in cell [i,j] in the table.
So a non-terminal spanning an entire string will sit in cell [0,n].
If we build the table bottom-up, we'll know that the parts of the A must go from i to k and from k to j.
CKY
Meaning that for a rule like A -> B C we should look for a B in [i,k] and a C in [k,j].
In other words, if we think there might be an A spanning i,j in the input… AND
A -> B C is a rule in the grammar, THEN
there must be a B in [i,k] and a C in [k,j] for some i < k < j.
CKY
So to fill the table, loop over the cell [i,j] values in some systematic way.
What constraint should we put on that?
For each cell, loop over the appropriate k values to search for things to add.
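Putting the pieces together, here is a minimal CKY recognizer sketch (plain Python; the CNF grammar and lexicon are toy examples invented for illustration). It fills the table a column at a time, left to right, bottom to top, matching the fill order discussed in the Note below:

from collections import defaultdict

# A minimal CKY recognizer sketch (illustration only).
# Grammar must be in CNF: binary rules A -> B C plus lexical rules A -> w.
BINARY = {("NP", "VP"): {"S"}, ("Det", "Noun"): {"NP"}, ("Verb", "NP"): {"VP"}}
LEXICAL = {"a": {"Det"}, "the": {"Det"}, "plane": {"Noun"}, "flight": {"Noun"},
           "book": {"Verb"},
           "left": {"Verb", "VP"}}  # VP -> left directly, since CNF bans the unit rule VP -> Verb

def cky(words):
    n = len(words)
    table = defaultdict(set)          # table[i, j] = non-terminals spanning words[i:j]
    for j in range(1, n + 1):         # fill a column at a time, left to right
        table[j - 1, j] = set(LEXICAL.get(words[j - 1], ()))
        for i in range(j - 2, -1, -1):        # bottom to top within the column
            for k in range(i + 1, j):         # every split point i < k < j
                for B in table[i, k]:
                    for C in table[k, j]:
                        table[i, j] |= BINARY.get((B, C), set())
    return "S" in table[0, n]         # a full parse sits in cell [0, n]

print(cky("a plane left".split()))        # True
print(cky("a plane the flight".split()))  # False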
CKY Table
[figure omitted]
CKY Algorithm
[figure omitted]
CKY Parsing
Is that really a parser?
(As described, it's only a recognizer: to recover trees, each table entry also needs back-pointers to the B and C that built it.)
Note
We arranged the loops to fill the table a column at a time, from left to right, bottom to top.
This assures us that whenever we're filling a cell, the parts needed to fill it are already in the table (to the left and below).
Example
[figure omitted]
Other Ways to Do It?
Are there any other sensible ways to fill the table that still guarantee that the cells we need are already filled?
Sample Grammar
[figure omitted]
Problem
What if your grammar isn't binary?
As in the case of the TreeBank grammar?
Convert it to binary… any arbitrary CFG can be rewritten into Chomsky Normal Form automatically.
What does this mean?
The resulting grammar accepts (and rejects) the same set of strings as the original grammar.
But the resulting derivations (trees) are different.
Problem
More specifically, rules have to be of the form
A -> B C
or
A -> w
That is, rules can expand to either two non-terminals or a single terminal.
Binarization Intuition
Eliminate chains of unit productions.
Introduce new intermediate non-terminals into the grammar that distribute rules with length > 2 over several rules. So…
S -> A B C
– turns into
S -> X C
X -> A B
where X is a symbol that doesn't occur anywhere else in the grammar.
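A minimal sketch of that step (plain Python; the fresh-symbol naming scheme is just one illustrative choice, and unit-production elimination is not shown):

# A minimal sketch of binarizing long rules (illustration only).
# S -> A B C becomes S -> X C and X -> A B, with X a fresh symbol.
# Unit-production elimination (the other half of CNF) is not shown.
def binarize(rules):
    out, fresh = [], 0
    for lhs, rhs in rules:
        while len(rhs) > 2:
            fresh += 1
            x = f"X{fresh}"                 # fresh symbol used nowhere else
            out.append((x, rhs[:2]))        # X -> first two symbols
            rhs = [x] + rhs[2:]             # fold X back into the rule
        out.append((lhs, rhs))
    return out

for lhs, rhs in binarize([("S", ["A", "B", "C"])]):
    print(f"{lhs} -> {' '.join(rhs)}")      # X1 -> A B ; S -> X1 C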
CNF Conversion
[figure omitted]
CKY Algorithm
[figure omitted]
Example
[figures omitted: filling column 5, step by step]
END