Dependency structure and cognition
Richard Hudson
Depling2013, Prague
1
The question
• What is syntactic structure like?
– Does it include dependencies between words
(dependency structure)?
– Or does it only contain part-whole links
(phrase structure)?
She looked after him
[Diagram: two analyses of this sentence: direct dependencies between the words, versus part-whole (phrase) structure grouping words into units such as 'after him']
2
Relevant evidence: familiarity
• University courses teach only one approach.
• School grammar sometimes offers one.
– Usually dependency structure
– even in the USA (Reed-Kellogg sentence-diagramming)
– especially in Europe
– and especially in the Czech Republic!
3
What Czech children do at school
[Diagram: a school-style dependency tree over a Czech sentence, glossed with 'kingcups', 'yellow', 'blossomed out', 'near', 'by stream']
Jirka Hana & Barbora Hladká 2012
4
or even …
5
Relevant evidence: convenience
• Dependency structure is popular in
computational linguistics.
• Maybe because of its simplicity:
– few nodes
– little besides orthographic words
• Good for lexical cooccurrence relations
6
Relevant evidence: cognition
• Language competence is memory
• Language processing is thinking
• Memory and thinking are part of cognition
• So what do we know about cognition?
• A. Very generally, cognition is not simple
– so maybe syntactic structures aren't in fact simple?
7
B. Knowledge is a network
[Diagram: a network of nodes for Gretta, Colin, John, Gaynor, 'me', Lucy and Peter, connected by links]
8
C. Links are classified relations
[Diagram: relation labels classified by 'is-a': 'relative' with subtypes 'parent' and 'child', and 'mother' and 'father' under 'parent', alongside the concepts 'person', 'woman' and 'man']
9
D. Nodes are richly related
[Diagram: the same family network (Gretta, Colin, John, Gaynor, 'me', Lucy, Peter), with every node linked to several others by abbreviated relation labels: m, f, s, b, w, h, d, gf]
10
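As an illustration of B–D, here is a minimal Python sketch (not from the talk; the particular relations assigned to the names are illustrative) of a network in which every link is a classified relation and a single node can carry many links:

from collections import defaultdict

class Network:
    """A knowledge network: nodes joined by classified (labelled) links."""
    def __init__(self):
        # node -> relation label -> set of related nodes
        self.links = defaultdict(lambda: defaultdict(set))

    def add(self, node, relation, other):
        self.links[node][relation].add(other)

    def related(self, node, relation):
        return self.links[node][relation]

net = Network()
net.add("me", "mother", "Gaynor")   # links are classified, not bare edges
net.add("me", "father", "Colin")
net.add("me", "wife", "Gretta")     # one node is richly related
net.add("Gretta", "husband", "me")
print(net.related("me", "mother"))  # {'Gaynor'}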
E. Is-a allows default inheritance
• Is-a forms taxonomies.
– e.g. 'linguist is-a person', 'Dick is-a linguist'
• Properties 'inherit' down a taxonomy.
• But only 'by default' – exceptions are ok.
– e.g. birds (normally) fly
– but penguins don't.
11
Penguins
[Diagram: default inheritance: 'bird' is linked to 'flies'; the exemplar robin* inherits 'flies' via 'robin', but 'penguin' carries the exception 'doesn't fly', so the exemplar penguin* 'doesn't fly']
12
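A minimal Python sketch (not from the talk; the class and method names are illustrative) of E: properties inherit down the is-a taxonomy by default, but a more specific node can override them, as in the bird/penguin figure:

class Concept:
    def __init__(self, name, isa=None, **props):
        self.name, self.isa, self.props = name, isa, props

    def get(self, prop):
        # Use the locally stored property if there is one;
        # otherwise inherit from the supertype, by default.
        if prop in self.props:
            return self.props[prop]
        return self.isa.get(prop) if self.isa else None

bird      = Concept("bird", flies=True)
robin     = Concept("robin", isa=bird)                 # inherits 'flies'
penguin   = Concept("penguin", isa=bird, flies=False)  # the exception
robin_t   = Concept("robin*", isa=robin)               # an exemplar of robin
penguin_t = Concept("penguin*", isa=penguin)           # an exemplar of penguin

print(robin_t.get("flies"))    # True: inherited by default
print(penguin_t.get("flies"))  # False: the exception overrides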
Cognitivism
• 'Cognitivism'
– 'Language is an example of ordinary cognition'
• So all our general cognitive abilities are
available for language
– and we have no special language abilities.
• Cognitivism matters for linguistic theory.
13
Some consequences of cognitivism
1. Word-word dependencies are real.
2. 'Deep' and 'surface' properties combine.
3. Mutual dependency is ok.
4. Dependents create new word tokens.
5. Extra word tokens allow raising.
6. But lowering may be ok too.
14
1. Word-word dependencies are real
• Do word-word dependencies exist (in our
minds)?
– Why not?
– Compare social relations between individuals.
• What about phrases?
– Why not?
– But maybe only their boundaries are relevant?
– They're not classified, so no unary branching.
15
Punctuation marks boundaries
• At the end of the road, turn right.
• Not:
– At the end of the, road turn right.
– At the end, of the road turn right.
– At the end of the road turn right,
• How do we learn to punctuate if we can't
recognise boundaries?
16
No unary branching
• If S → NP + VP, then 'Cows moo.' needs the tree S over NP and VP, with unary branches NP over N ('Cows') and VP over V ('moo').
• But if a verb's subject is simply a noun, the analysis is just N ('Cows') depending on V ('moo'), with no unary branching.
17
2. 'Deep' and 'surface' properties
combine.
• Dependencies are relational concepts.
• Concepts record bundles of properties that
tend to coincide
– e.g. 'bird': beak, flying, feathers, two legs, eggs
– 'mother': bearer, carer
• So one dependency has many properties:
– semantic, syntactic, morphosyntactic
– e.g. 'subject' ….
18
'subject'
The typical subject is defined by
• meaning
– typically 'actor' or …
• word order and/or case
– typically before verb and/or nominative
• agreement
– typically the verb agrees with it
• status
– obligatory or optional, according to finiteness
19
So …
• Cognition suggests that 'deep' and 'surface'
properties should be combined
– not separated
• They are in harmony by default
– but exceptionally they may be out of harmony
– this is allowed by default inheritance
20
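A minimal Python sketch (not from the talk; the property names and the inversion example are illustrative) of slides 19–20: the 'subject' dependency as one bundle of default properties, with exceptions overriding the defaults just as in default inheritance:

DEFAULT_SUBJECT = {
    "meaning":   "actor",         # semantic default
    "position":  "before verb",   # word-order default
    "case":      "nominative",    # morphosyntactic default
    "agreement": "verb agrees",   # syntactic default
}

def subject(**overrides):
    """Build a subject token: defaults apply unless overridden."""
    props = dict(DEFAULT_SUBJECT)
    props.update(overrides)       # the exception wins over the default
    return props

# 'She sleeps': everything in harmony, pure defaults.
print(subject())
# 'Is she coming?': word order is out of harmony with the default.
print(subject(position="after verb"))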
3. Mutual dependency is ok.
• Mutual dependency is formally impossible
in standard notation
• And is formally impossible in phrase
structure theory
• So if it exists, we need to
– resist PS theory
– change the standard notation
21
Mutual dependency exists
• I wonder who came?
• Who is subject of came,
– so who depends on came.
• But who depends on wonder
• and came can be omitted:
– e.g. Someone came – I wonder who.
• So came depends on who.
22
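A minimal Python sketch (not from the talk) of the point: if dependencies are stored as a directed graph rather than a dominance tree, the mutual dependency between 'who' and 'came' is representable. The relation labels 'object' and 'complement' are illustrative guesses:

# (head, relation, dependent): a plain directed graph, cycles allowed.
deps = {
    ("wonder", "subject", "I"),
    ("wonder", "object", "who"),    # 'who' depends on 'wonder' (label is a guess)
    ("came", "subject", "who"),     # 'who' is subject of 'came'
    ("who", "complement", "came"),  # ... and 'came' depends on 'who' (label is a guess)
}

def dependents_of(head):
    # All (relation, dependent) pairs whose head is `head`.
    return {(rel, dep) for h, rel, dep in deps if h == head}

print(dependents_of("came"))  # {('subject', 'who')}
print(dependents_of("who"))   # {('complement', 'came')}: mutual with the line above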
Standard notation
[Diagram: in standard notation, if A 'dominates' B then A is drawn above B, so B cannot also 'dominate' A]
23
4. Dependents create new word
tokens.
• General cognition:
– every exemplar needs a mental node.
– no node carries contradictory properties.
– so some exemplars need two nodes.
• E.g. when we re-classify things.
– NB we can remember both classifications
24
What kind of bird?
[Diagram: re-classifying an exemplar: node B is-a 'bird' ('?' what kind); a second node B* is-a 'blackbird'; the figure also shows a 'mate' link]
25
And in language …
[Diagram: 'word', the lexeme LIKE-verb, and a token 'like'; once the subject 'I' is attached, a further token like* is created. NB like* is a token of a token]
26
The effect of a dependent
• When we recognise a dependent for W, we
change W into a new token W*.
• The classification of W* may change.
• W* also has a new meaning
– normally a hyponym of W
– but may be idiomatic
• If we add dependents singly, this gives a
kind of phrase structure!
27
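A minimal Python sketch (not from the talk; the class is illustrative) of this mechanism: each dependent added to a word creates a new token with its own, narrower meaning, as in the 'typical French house' figure that follows:

class Token:
    def __init__(self, word, meaning, dependents=()):
        self.word, self.meaning, self.dependents = word, meaning, list(dependents)

    def add_dependent(self, dep):
        """Return a NEW token of this token, with the dependent attached."""
        return Token(self.word + "*",
                     f"{dep.meaning} {self.meaning}",   # normally a hyponym of W
                     self.dependents + [dep])

house   = Token("house", "house")
french  = Token("French", "French")
typical = Token("typical", "typical")

house1 = house.add_dependent(french)    # house*  : 'French house'
house2 = house1.add_dependent(typical)  # house** : 'typical French house'

print(house2.word, "=", house2.meaning)  # house** = typical French house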
typical French house
[Diagram: the lexeme HOUSE and the word 'house' (meaning 'house'); with the dependent 'French', the token house* means 'French house'; with 'typical' added, the token house** means 'typical French house']
28
Notation
[Diagram: notation for 'typical French house': the tokens house**, house* and house written over the word string 'typical French house']
29
5. Extra word tokens allow raising.
[Diagram: 'it rains': 'it' is the subject of 'rains'. 'it keeps raining': 'it' is the subject of 'keeps', 'raining' is the predicative of 'keeps', and the extra token it* is the subject of 'raining']
30
Raising in the grammar
[Diagram: a shared dependent: the higher parent B takes the token A*, the lower parent C takes A; A* is-a A, so A* wins]
31
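A minimal Python sketch (not from the talk; the class is illustrative) of raising with an extra token, using the 'it keeps raining' example: the higher parent takes 'it' and the lower parent takes the token it*, which is-a 'it':

class Word:
    def __init__(self, form, isa=None):
        self.form, self.isa, self.subject = form, isa, None

it      = Word("it")
it_star = Word("it*", isa=it)   # extra token: it* is-a it
keeps   = Word("keeps")
raining = Word("raining")

keeps.subject   = it            # higher parent gets the original token
raining.subject = it_star       # lower parent gets the raised token

# By default inheritance it* shares the properties of it,
# so both verbs effectively have 'the same' subject.
print(raining.subject.isa is keeps.subject)  # True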
6. But lowering may be ok too.
• Raising is helpful for processing
– the higher parent is nearer to the sentence root.
• But sometimes lowering is helpful too
– e.g. if it allows a new meaning-unit.
• Eine Concorde gelandet ist hier nie.
a Concorde landed has here never.
A-Concorde-landing has never happened here.
32
German Partial VP fronting
[Diagram: 'Eine Concorde gelandet ist hier nie': 'Eine Concorde' depends on its higher parent 'ist', while the lowered token Eine Concorde* depends on the lower parent 'gelandet']
33
Conclusions
• Language is just part of cognition.
• So syntactic dependencies are:
– psychologically real
– rich (combining 'deep' and 'surface' properties)
– complex (e.g. mutual, multiple).
• And dependency combines with
– default inheritance
– multiple tokens
34