Effects of age group and personality on cognitive and affective processing of music
David Ellison, Elvira Brattico, Suvi Saarikallio
Cognitive Brain Research Unit, Institute of Behavioral Sciences, University of Helsinki, Finland
Finnish Center of Excellence in Interdisciplinary Music Research, University of Jyväskylä, Finland
Background
• Event-related potential (ERP)
  • Observed brain response to an event (e.g., auditory stimulus onset)
  • Recorded using electroencephalography (EEG), which measures the electrical potential (voltage) between points on the scalp and a reference
  • Averaged over many trials and across participants (see the averaging sketch after this list)
  • Deflections can be negative or positive in voltage
• Behavioural data – frequencies of certain responses, reaction time (RT)
• Personality test – “Big” Five Factor Model
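Since an ERP is obtained by epoching and averaging, the idea can be illustrated with a minimal NumPy sketch (placeholder data, shapes, and sampling rate; not the study's actual pipeline):

```python
import numpy as np

# Placeholder continuous EEG: (n_channels, n_samples) at an assumed 500 Hz.
fs = 500
eeg = np.random.randn(64, 60 * fs)
onsets = np.arange(fs, 55 * fs, fs)          # placeholder stimulus onsets

# Cut an epoch around each event: 100 ms pre- to 800 ms post-stimulus.
pre, post = int(0.1 * fs), int(0.8 * fs)
epochs = np.stack([eeg[:, s - pre:s + post] for s in onsets])

# Baseline-correct each epoch against its pre-stimulus interval, then
# average over trials: the result is the ERP (n_channels, n_timepoints).
epochs -= epochs[:, :, :pre].mean(axis=2, keepdims=True)
erp = epochs.mean(axis=0)
```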
ERPs to music in adults (early, automatic)
• N100 or N1
  • Fronto-central, 80–120 ms
  • Elicited by the onset of an unpredictable stimulus
  • Followed by P2
• Mismatch negativity (MMN)
  • Fronto-central, 150–250 ms
  • Elicited by deviant auditory stimuli (difference-wave sketch after this list)
    • Physical feature deviance (e.g., frequency, intensity) (phMMN)
    • Abstract (musical) feature deviance (afMMN)
      • Descending vs. ascending intervals (Saarinen et al., 1992)
      • Major vs. minor chords (Virtala et al., 2011)
• Early right anterior negativity (ERAN)
  • Elicited by unexpected chords; music expectancy violations
  • 150–180 ms, anterior and often right-lateralized
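The MMN is typically quantified as a deviant-minus-standard difference wave. A minimal sketch, reusing the placeholder `epochs`, `pre`, and `fs` from the previous snippet, with an assumed fronto-central channel index:

```python
import numpy as np

# Placeholder condition labels: mark e.g. every 5th trial as a deviant.
is_deviant = np.zeros(len(epochs), dtype=bool)
is_deviant[::5] = True

# Difference wave: average deviant response minus average standard response.
mmn_wave = epochs[is_deviant].mean(axis=0) - epochs[~is_deviant].mean(axis=0)

# MMN amplitude: mean of the difference wave in the 150-250 ms window
# at a fronto-central channel (Fz assumed at index 0).
win = slice(pre + int(0.150 * fs), pre + int(0.250 * fs))
mmn_amplitude = mmn_wave[0, win].mean()
```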
ERPs to music in adults (later, attentive)
• P300
  • Peaks parietally ~300 ms after an incongruous note/chord at the end of a melody/cadence
  • Reflects the improbability of task-related stimuli
• P600
  • Syntactic language and music ERP
  • Re-analysis of musical and linguistic sequences following an incongruity
ERPs to music in adults (affect, emotion)
• Late positive potential (LPP)
  • Indexes the emotional arousal of stimuli, usually pictures
  • Starts ∼300–400 ms after picture onset and is sustained throughout picture presentation
• Brattico et al. (2010) – cadence-ending chords
  • 600 ms positivity to negatively-rated cadence resolutions in both correctness and liking tasks
  • 1000–1800 ms slow parietal positivity associated with ‘dislike’ ratings
ERPs to music pre-adulthood
• Enhanced MMN in a chord discrimination task in musically trained 13-year-olds (Virtala et al., 2012)
• Clear ERAN by age 5, albeit with a longer latency than in adults (230–240 ms; Koelsch, Grossmann, et al., 2003; Jentschke et al., 2008)
  • Is this due to neuro-anatomical development, or to the establishment of music-syntactic representations?
• Indistinguishable from the adult ERAN by age 11 (Jentschke et al., 2005)
Individual differences in music processing
• ERP difference factors
  • Age
  • Gender
  • Musical training (ERAN, MMN)
  • Infant temperament (P3a)
  • Emotion regulation deficits (anxiety, depression) modulate LPP
• Personality
  • Music genre preferences
  • Music-listening habits (Chamorro-Premuzic & Furnham, 2007)
    • Openness to experience – intellectual music appreciation
    • Neuroticism, introversion, lower conscientiousness – music as mood regulation
  • Perception of musically expressed emotions (Vuoskoski & Eerola, 2009)
    • Neuroticism – sad & tense ratings
    • Extraversion – positive ratings
• No study of effects on music ERPs
Objectives
How does the brain respond during affective vs. cognitive processing of music in adults and adolescents? Do these groups differ?
What is the role of personality? Does it influence individuals' affective/cognitive brain responses to music?
Experiment
• 58 participants
  • 34 fourteen-year-olds (10 male)
  • 24 adults (9 male)
• Questionnaires: Ten Item Personality Inventory (“Big” Five Factor Model), Music Appreciation Scale, Rosenberg Self-Esteem Scale
• EEG listening experiment with rating task
Stimuli
Task design
• Each trial consisted of a 5-chord cadence plus one of two tasks:
  • Cognitive – does the ending sound correct or incorrect?
  • Affective – does the ending sound happy or sad?
• Trial sequence: task cue → first 4 chords → final chord (major/minor × in-tune/mistuned) → behavioural response
• ERPs from cadence onset
  • Preparation for the cognitive or affective task
  • Averaged by task type
• ERPs from final chord onset
  • Response to the target chord
  • Averaged by behavioural response type (BR) and by stimulus category (see the grouping sketch after this list)
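As a rough sketch of this averaging step (placeholder labels and shapes; the study's actual preprocessing is not described here), final-chord epochs can be grouped by rating or by stimulus category before averaging:

```python
import numpy as np

# Hypothetical per-trial metadata: behavioural rating and stimulus category.
n_trials = 154
epochs = np.random.randn(n_trials, 64, 450)   # placeholder final-chord epochs
ratings = np.random.choice(["happy", "sad", "correct", "incorrect"], n_trials)
categories = np.random.choice(["major_tuned", "minor_tuned",
                               "major_mistuned", "minor_mistuned"], n_trials)

# BR-averaged ERPs: one average waveform per behavioural response type.
br_erps = {r: epochs[ratings == r].mean(axis=0) for r in np.unique(ratings)}

# Stimulus-category-averaged ERPs.
cat_erps = {c: epochs[categories == c].mean(axis=0)
            for c in np.unique(categories)}
```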
Behavioural results
• Cognitive task RT < affective task RT
  • F = 4.130, p = .047, ηp² = .069
• Adolescent RT < adult RT
  • F = 4.284, p = .043, ηp² = .071
• Participant ratings more likely to conform to researcher expectations than to disagree
  • F = 176.641, p < .001, ηp² = .759
  • i.e., major resolutions rated as ‘happy’, chords containing mistuned notes rated as ‘incorrect’
  • No effect of group on this
• Adults on average rated 93 ± 3 items as correct and 61 ± 3 as incorrect
  • F = 6.065, p = .017, ηp² = .098
• Correlation between “correct” and “happy” ratings (computation sketched after this list)
  • r = .425, p = .001
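For the last result, a minimal sketch of how such a correlation could be computed with SciPy (placeholder per-participant counts, not the study's data):

```python
import numpy as np
from scipy import stats

# Placeholder per-participant counts of "correct" and "happy" ratings.
rng = np.random.default_rng(0)
n_correct = rng.integers(50, 100, size=58)
n_happy = n_correct + rng.integers(-10, 10, size=58)

# Pearson correlation (the study reports r = .425, p = .001).
r, p = stats.pearsonr(n_correct, n_happy)
print(f"r = {r:.3f}, p = {p:.3f}")
```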
Task-averaged ERP curves and scalp maps
• Larger negative peaks (N1s) during the affective task than during the cognitive task after the 1st and 2nd chords
• These were larger in adolescents than in adults
BR-averaged ERP curves and scalp maps
Behavioural response ERP averaging
• At around 300 ms
  • Adults: significant MMN/ERAN to negatively-rated stimuli (“sad” or “incorrect”) for both tasks
  • Adolescents: MMN/ERAN to cadence resolutions rated “sad” only from the left and midline of the scalp; MMN/ERAN to cadence resolutions rated “incorrect” only from the right
• At around 500 ms
  • Larger P300 in adults than in adolescents
• At around 700 ms
  • Adolescents: negative ratings (“sad” or “incorrect”) elicited larger LPPs than did positive ratings; for adults, this difference was not quite significant on the right side of the scalp
Stimulus-category-averaged ERP curves and scalp maps
• At 300 ms, larger negativity to mistuned ending chords relative to correctly-tuned ending chords (MMN), and also to minor resolutions relative to major resolutions (ERAN)
• At 500 ms, larger central P300 for mistuned than for tuned chords
• At 700 ms, larger centro-parietal LPPs to minor chords and mistunings
Stimulus-category-averaged ERP curves and scalp maps
• Adolescents showed a larger ERAN than adults to major resolutions but did not differ significantly from adults following minor resolutions.
Correlations with individual difference scales – results and discussion
• Higher extraversion associated with slower reaction times when rating a stimulus as sounding sad (p = .005)
  • Consistent with extraverts' bias toward positive emotional ratings of music (Vuoskoski & Eerola, 2009) and their proneness to feel positive emotions
• Higher music appreciation score associated with higher accuracy on the affective task (p = .008)
• Only in adolescents: conscientiousness associated with smaller ERAN responses to sad-rated stimuli (p = .001)
  • Even though conscientiousness has previously been associated with a larger P300 in an auditory mismatch task (Gurrera et al., 2001; 2005)
• Only in adults: openness to experience associated with larger late posterior negativities (LPN) to sad stimuli, peaking at around 1250 ms (p = .003; amplitude-correlation sketch after this list)
  • An ERP component related to long-term memory
  • Perhaps those more open to experience are more likely to commit certain types of chord progressions to memory.
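A rough sketch of how such trait/ERP correlations can be computed (assumed channel, time window, and placeholder data): take each participant's mean voltage in the component's window and correlate it with the trait score.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects, fs, pre = 24, 500, 50
# Placeholder per-subject ERPs to sad-rated stimuli: (subjects, channels, time).
erps = rng.standard_normal((n_subjects, 64, 1000))
openness = rng.uniform(1, 7, n_subjects)      # placeholder TIPI openness scores

# LPN amplitude: mean voltage in an assumed 1200-1300 ms window around the
# 1250 ms peak, at an assumed posterior channel (Pz at index 30).
win = slice(pre + int(1.200 * fs), pre + int(1.300 * fs))
lpn_amp = erps[:, 30, win].mean(axis=1)

r, p = stats.pearsonr(openness, lpn_amp)
print(f"openness vs LPN amplitude: r = {r:.3f}, p = {p:.3f}")
```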
ERP Discussion
• Enhanced negative responses to cadence chords during the cognitive task, similar to the findings of Brattico et al. (2010)
  • But also enhanced negative responses in adolescents
  • Perhaps related to establishing tonal context
• Enhanced negativity at 300 ms
  • Late negative difference (late Nd), working memory (Singhal & Fowler, 2005)
  • More demanding or important in the cognitive task?
  • Still developing in adolescence?
• Early negativity (MMN/ERAN) to mistuned and minor chords, and to those rated as sad or incorrect
• More ERAN-like response in adolescents (relative to adults) to major chord endings
• Do these harmony/affect-related ERANs indicate unexpected chords?
Future questions
Longitudinal data have been collected from the same adolescent participants at age 16. Will their data differ? How do they prepare for the cognitive/affective tasks?
Will the correlations between personality and ERP amplitude hold up with a more robust measure of personality? What do they mean?
What is the early negative component differentiating major from minor and sad from happy? Is it really the ERAN?
Works cited
Chamorro-Premuzic, T., & Furnham, A. (2007). Personality and music: Can traits explain how people use music in everyday life? British Journal of Psychology, 98(2), 175–185.
Gurrera, R. J., O'Donnell, B. F., Nestor, P. G., Gainski, J., & McCarley, R. W. (2001). The P3 auditory event-related brain potential indexes major personality traits. Biological Psychiatry, 49(11), 922–929.
Gurrera, R. J., Salisbury, D. F., O'Donnell, B. F., Nestor, P. G., & McCarley, R. W. (2005). Auditory P3 indexes personality traits and cognitive function in healthy men and women. Psychiatry Research, 133(2), 215–228.
Jentschke, S., Koelsch, S., & Friederici, A. D. (2005). Investigating the relationship of music and language in children: Influences of musical training and language impairment. Annals of the New York Academy of Sciences, 1060, 231–242.
Jentschke, S., Koelsch, S., Sallat, S., & Friederici, A. D. (2008). Children with specific language impairment also show impairment of music-syntactic processing. Journal of Cognitive Neuroscience, 20(11), 1940–1951.
Koelsch, S., Grossmann, T., Gunter, T. C., Hahne, A., Schröger, E., & Friederici, A. D. (2003). Children processing music: Electric brain responses reveal musical competence and gender differences. Journal of Cognitive Neuroscience, 15, 683–693.
Saarinen, J., Paavilainen, P., Schröger, E., Tervaniemi, M., & Näätänen, R. (1992). Representation of abstract attributes of auditory stimuli in the human brain. NeuroReport, 3, 1149–1151.
Virtala, P., Berg, V., Kivioja, M., Purhonen, J., Salmenkivi, M., Paavilainen, P., & Tervaniemi, M. (2011). The preattentive processing of major vs. minor chords in the human brain: An event-related potential study. Neuroscience Letters, 487(3), 406–410.
Virtala, P., Huotilainen, M., Putkinen, V., Makkonen, T., & Tervaniemi, M. (2012). Musical training facilitates the neural discrimination of major versus minor chords in 13-year-old children. Psychophysiology, 49(8), 1125–1132.
Vuoskoski, J. K., & Eerola, T. (2009). Personality traits moderate the perception of music-mediated emotions. In Frontiers in Human Neuroscience. Conference Abstract: Tuning the Brain for Music.