Brain Potentials Dissociate Emotional and Conceptual Cross-Modal Priming of Environmental Sounds




Cerebral Cortex, Advance Access published June 16, 2011. doi:10.1093/cercor/bhr128

Yan Jing Wu, Stefanos Athanassiou, Dusana Dorjee, Mark Roberts and Guillaume Thierry

School of Psychology, University of Wales, Bangor, LL57 2AS, UK

Address correspondence to Guillaume Thierry, School of Psychology, University of Wales, Bangor, Gwynedd LL57 2AS, UK. Email: [email protected].

Abstract

The attentional effects triggered by emotional stimuli in humans have been substantially investigated, but little is known about the impact of affective valence on the processing of meaning. Here, we used a cross-modal priming paradigm involving visually presented adjective--noun dyads and environmental sounds of controlled affective valence to test the contributions of conceptual relatedness and emotional congruence to priming. Participants undergoing event-related potential recording indicated whether target environmental sounds were related in meaning to adjective--noun dyads presented as primes. We tested spontaneous emotional priming by manipulating the congruence between the affective valence of the adjective in the prime and that of the sound. While the N400 was significantly reduced in amplitude by both conceptual relatedness and emotional congruence, there was no interaction between the 2 factors. The same pattern of results was found when participants judged the emotional congruence between environmental sounds and adjective--noun dyads. These results support the hypothesis that conceptual and emotional processes are functionally independent regardless of the specific cognitive focus of the comprehender.

Keywords: affective environmental sounds, cross-modal priming, emotional congruence, event-related potentials, N400, semantic priming

Introduction

Intuitively, there seems to be a dissociation between subjective emotions and objective knowledge (as the French philosopher Blaise Pascal put it: "The heart has its reasons, of which reason knows nothing."). Indeed, one can consciously analyze a situation or a meaningful event as being rationally acceptable while it is affectively shocking, and vice versa. However, the hypothesis of a relative independence of affective vis-à-vis other conceptual processing lacks scientific evidence, and the question of interactions between our emotional state and our knowledge of the world has seldom been addressed. How is the meaning of external sensory events (e.g., environmental sounds) extracted by the brain in relation to their affective value as compared with nonaffective conceptual content? This study focuses on a peak of event-related brain potentials (ERPs), the N400, to examine differences between conceptual and affective processing in a priming paradigm. ERPs are averaged brain signals recorded from the surface of the scalp and time-locked to the onset of a stimulus of interest. The N400 is a negative-going waveform that peaks around 400 ms after stimulus onset. It is classically viewed as an index of semantic integration of meaningful stimuli within the context of presentation (Kutas and Hillyard 1980, 1984). For example, N400 amplitude is reduced when the target word is preceded

by a semantically related prime word (e.g., doctor—nurse) as compared with an unrelated prime word (e.g., doctor—table). This priming effect has been found across sensory modalities (Kutas and Federmeier 2000) and across verbal/nonverbal coding domains (e.g., Connolly et al. 1995; Grigor et al. 1999; Cummings et al. 2006; Orgs et al. 2006). In order to study affective versus nonaffective conceptual priming in the present experiment, we used environmental sounds as targets in a priming paradigm. N400 modulations by semantic relatedness in word--environmental sound pairs have been reported previously (Van Petten and Rheinfelder 1995; Cummings et al. 2006; Orgs et al. 2008; but see also Chiu and Schacter 1995; Stuart and Jones 1995). Previous studies using emotionally valenced pictures and words have mainly focused on the early posterior negativity (EPN; Kissler et al. 2007; Scott et al. 2009) and on late emotion-related P3 effects (Fischler and Bradley 2006; Herbert et al. 2008). Only a few studies have examined the impact of affective valence on semantic processing using the N400. Research with emotionally valenced spoken words has indicated greater N400 negativity for emotionally incongruent as compared with congruent stimuli. For instance, Schirmer and Kotz (2003) independently manipulated affective prosody (happy, neutral, angry) and affective valence (positive, neutral, negative) of spoken words and asked participants to rate the stimuli on 1 of the 2 dimensions. In the affective valence rating condition, they found an interference effect for incongruent stimuli, which translated into greater N400 amplitudes. Furthermore, they found this effect only in female listeners. However, Schirmer and Kotz (2003) and Schirmer et al. (2002) did not compare the effects of emotional congruence and semantic relatedness. It is therefore unknown whether the evaluation of emotional prosody and that of factual content show relative independence or interact. Other studies have reported that a mismatch between the affective valence of sentence prosody and that of target words increases N400 amplitudes (Schirmer et al. 2002, 2005). Furthermore, within the visual modality, affective words embedded in sentences have been found to enhance N400 amplitude (Holt et al. 2008). But, to our knowledge, none of the previous studies attempted to distinguish semantic effects from affective valence effects. In addition, the effects of emotionally valenced environmental sounds have seldom been investigated. Here, we used a cross-modal priming paradigm to characterize the interplay of conceptual and emotional processing during the comprehension of written words and environmental sounds. Sounds rated on average as unpleasant or pleasant in a preliminary evaluation session were presented in a cross-modal priming context. In each trial, a word pair consisting of an adjective (of controlled affective valence) and an emotionally neutral noun was displayed in standard left/right orientation at the center of a screen before an environmental sound was presented through headphones. The sound was either conceptually related or unrelated to the noun and either emotionally congruent or incongruent with the adjective (see Table 1). We elected to use adjective--noun dyads as primes because manipulating conceptual relatedness and emotional congruence using only nouns in a fully counterbalanced design proved impossible. Two groups of participants were asked to judge whether or not prime words and target sounds were related in meaning (Experiment 1) or congruent with regard to affective valence (Experiment 2), respectively. This design enabled us to test the independent effects of conceptual relatedness and emotional congruence on the amplitude of the N400. Critically, no behavioral or ERP differences could be attributed to spurious perceptual differences between conditions because the same sounds were used in each of the 4 experimental conditions. We predicted that conceptual relatedness would significantly reduce N400 mean amplitude and hypothesized the same effect for emotional congruence. Furthermore, based on the classical philosophical viewpoint regarding the relative independence of conceptual and emotional content evaluation, we hypothesized that the 2 factors would not interact at any point in time, whether participants were focusing on the conceptual or the emotional dimension of the stimuli.



Materials and Methods

Participants

Two groups of 14 participants, matched for gender, mean age, and level of education, gave informed consent to take part in the experiment, which was approved by the ethics committee of Bangor University. All participants were right-handed based on the Edinburgh Handedness Inventory (Oldfield 1971) and had self-reported normal hearing. Participants received course and print credits in partial fulfillment of their degree requirements.

Stimuli

Word stimuli consisted of 40 adjectives of positive valence (mean = 7.5 ± 0.6 on a scale of 1--9, 1 being very negative and 9 being very positive; Bradley and Lang 1999), 40 adjectives of negative valence (mean = 2.9 ± 1), and 40 highly imageable concrete neutral nouns (mean concreteness = 568 ± 67 and mean imageability = 594 ± 44 on a scale from 100 to 700, Coltheart 1981; mean valence = 5.2 ± 1.2). Auditory stimuli were 40 digitized sounds (44.1 kHz sampling rate, 16-bit encoding, mono, DC offset corrected, normalized to the same maximal peak amplitude, mean duration = 1115 ± 386 ms) of positive or negative valence based on a previous rating procedure involving 97 participants (scale of 1--5, 1 being unpleasant and 5 being pleasant; Thierry and Roberts 2007).

Table 1. Experimental design and stimulus examples (emotional congruence between adjective and sound crossed with conceptual relatedness between noun and sound)

  Emotionally congruent (EC), conceptually related (CR): Terrible car—car alarm
  Emotionally congruent (EC), conceptually unrelated (CU): Depressing music—car alarm
  Emotionally incongruent (EI), conceptually related (CR): Adorable car—car alarm
  Emotionally incongruent (EI), conceptually unrelated (CU): Inspiring music—car alarm

Note: The same adjective--noun pairs and the same environmental sounds were presented in all conditions, albeit paired differently. This prevented spurious ERP differences due to perceptual variability between experimental conditions.


Twenty sounds were positive (mean valence = 3.39 ± 0.32) and 20 were negative (mean valence = 1.97 ± 0.35). Both positive and negative sounds were collected from royalty-free internet sound libraries or digitally recorded within our laboratory, and comprised human (e.g., scream, laughter), animal (e.g., meowing, growling), and mechanical (e.g., sleigh bells, machine gun) sounds. All 40 sounds were also rated for recognizability by 33 participants who did not take part in the ERP experiment; the mean rating was 4.3 ± 0.52 on a scale from 1 to 5, 1 being unrecognizable and 5 being very recognizable. Another group of 13 participants rated the conceptual relatedness 1) between adjective--noun dyads and sounds and 2) between single nouns and sounds on a scale from 1 to 5, 1 being unrelated and 5 being strongly related. These ratings showed a main effect of conceptual relatedness, no effect of emotional congruency, and no interaction for overt matching judgments, and this pattern did not differ between adjective--noun dyads and nouns presented in isolation. Adjective--noun dyads and sounds were paired so as to construct 80 trials in each of the 4 experimental conditions (Table 1, Appendix). Note that it was not possible to control for emotional category (e.g., anger, sadness, or disgust) and generate stimulus pairs in sufficient numbers; therefore, this study only aimed at testing affective valence congruency effects. Target sounds were presented twice in each of the experimental conditions, that is, 8 times across the whole experiment, in order to cancel out spurious perceptual differences between conditions. Sounds were either conceptually related or unrelated to the noun and either emotionally congruent or incongruent with the adjective in terms of affective valence. The valence of adjectives and sounds was highly positively correlated for congruent pairings (r = 0.88, P < 0.0001) and highly negatively correlated for incongruent pairings (r = –0.91, P < 0.0001). The high positive correlation in the congruent condition shows that dyads and sounds were strongly emotionally related (i.e., both positive or both negative), whereas the negative correlation for incongruent pairings validated those pairs as genuinely incongruent (i.e., of opposite affective polarity).
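For readers who wish to reproduce the sound preparation, the two normalization steps mentioned above (DC offset correction and normalization to the same maximal peak amplitude) can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' actual processing chain; the soundfile package, the target peak level, and the file names are our own choices for the example.

```python
import numpy as np
import soundfile as sf  # common WAV I/O library; an assumption for this sketch

def prepare_sound(path_in: str, path_out: str, target_peak: float = 0.9) -> None:
    """Remove the DC offset and scale a sound to a fixed maximal peak amplitude."""
    data, rate = sf.read(path_in)           # e.g., 44.1 kHz mono recordings
    data = data - np.mean(data)             # DC offset correction
    peak = np.max(np.abs(data))
    if peak > 0:
        data = data * (target_peak / peak)  # same maximal peak for every sound
    sf.write(path_out, data, rate)

# Hypothetical usage over a list of stimulus files:
# for name in ["car_alarm.wav", "sleigh_bells.wav"]:
#     prepare_sound(name, "norm_" + name)
```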

Design and Procedure

Volume levels were adjusted to a comfortable level for each participant individually prior to the electroencephalography (EEG) recording session. Adjective--noun dyads were presented on a single line at the center of a computer monitor for 500 ms, followed by a fixation cross that remained on the screen until the presentation of the next visual prime. Sounds were presented with a variable interstimulus interval of 400, 500, or 600 ms after the words, within a window of 3500 ms, so as to reduce the impact of the ERP elicited by the preceding prime stimulus. In the first experiment, participants were instructed to press one button when the object described in the dyad was related in meaning to the environmental sound presented as a target (e.g., ANGRY DOG—[dog growling]) and another button when they could not see a relationship (e.g., ANGRY DOG—[kiss]). Note that the adjective never provided a conceptual cue regarding the source of the environmental sound. Participants' responses automatically triggered the next trial after a minimal duration of 400 ms. In case of no response, the next trial was initiated 3500 ms after sound onset. The same procedure was followed in the second experiment, except that participants were instructed to make a congruency judgment regarding the affective valence of dyads and sounds (e.g., congruent: HAPPY CAT—[sleigh bells]; incongruent: HURT CAT—[sleigh bells]). Note that the noun by itself provided no cue regarding the affective valence of the environmental sound. Each of the 4 experimental blocks consisted of 80 trials and contained 2 sound repetitions, which never pertained to the same experimental condition. Conditions were randomized within each block. Block order and response side were fully counterbalanced across participants. Half of the participants pressed a right-hand button for conceptually related/emotionally congruent pairs and a left-hand button for conceptually unrelated/emotionally incongruent pairs, and the other half received the reverse instruction. This conformed to the structure of a 2 × 2 design with conceptual relatedness and emotional congruence as factors.
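The 2 × 2 structure of the design can be made concrete with a short sketch. The following Python snippet is illustrative only: the Sound and Dyad containers and the helper function are our own shorthand, not the authors' materials; it simply classifies a prime--target pairing by conceptual relatedness and emotional congruence, using the condition labels (CR/CU × EC/EI) that appear in Table 1 and Figures 1 and 2.

```python
from dataclasses import dataclass

@dataclass
class Sound:
    label: str         # e.g., "car alarm"
    valence: str       # "positive" or "negative"
    related_noun: str  # noun the sound is conceptually related to, e.g., "car"

@dataclass
class Dyad:
    adjective: str
    adj_valence: str   # "positive" or "negative"
    noun: str

def condition(dyad: Dyad, sound: Sound) -> str:
    """Classify a prime-target pairing within the 2 x 2 design."""
    cr = "CR" if dyad.noun == sound.related_noun else "CU"    # conceptual relatedness
    ec = "EC" if dyad.adj_valence == sound.valence else "EI"  # emotional congruence
    return cr + ec

# The four example pairings from Table 1, all using the car-alarm target:
alarm = Sound("car alarm", "negative", "car")
examples = [
    Dyad("terrible", "negative", "car"),      # -> CREC
    Dyad("depressing", "negative", "music"),  # -> CUEC
    Dyad("adorable", "positive", "car"),      # -> CREI
    Dyad("inspiring", "positive", "music"),   # -> CUEI
]
for d in examples:
    print(d.adjective, d.noun, "->", condition(d, alarm))
```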



ERP Acquisition and Processing

Electrophysiological data were recorded in reference to Cz at a rate of 1 kHz from 64 Ag/AgCl electrodes embedded in an elastic cap (Easycap, Herrsching, Germany) placed according to the 10--20 convention, using 64-channel SynAmp2 amplifiers (Compumedics, Charlotte, NC). Impedances were kept below 7 kΩ. EEG activity was filtered online between 0.01 and 200 Hz and refiltered off-line using a 40 Hz low-pass, zero phase shift digital filter. Eyeblinks were mathematically corrected after modeling (Gratton et al. 1983) and remaining artifacts were manually dismissed. Epochs ranged from –100 to 1000 ms relative to the onset of the sound stimulus. Error trials were dismissed from further analysis. There were more than 30 trials in each of the 4 conditions for each of the participants included in the final analysis (mean number of accepted trials per condition = 56 ± 13). Baseline correction was performed in reference to prestimulus activity, and individual averages were digitally re-referenced to the arithmetic mean of the left and right mastoid channels.

Statistical Analysis

ERP components were determined based on the mean global field power measured across the scalp, which summarizes the contribution of all electrodes in the form of a single vector norm (Picton et al. 2000). This allowed automatic peak detection time-locked to electrodes of maximum amplitude in the following intervals: 80--120 ms (N1), 150--220 ms (P2a), 260--320 ms (N2), 320--370 ms (P2b), and 370--500 ms (N400). Peak amplitudes and latencies were analyzed for each component using a 2 (conceptually related/unrelated) × 2 (emotionally congruent/incongruent) × 62 (electrode) repeated measures analysis of variance (ANOVA). The results of the ANOVAs conducted with 62 levels of electrode were then replicated in a subset of 9 frontocentral electrodes (F3, Fz, F4, FC1, FCz, FC2, C1, Cz, C2). All the main effects and interactions between factors other than electrode found in the 62-electrode analysis were replicated in the 9-electrode analysis. Lateralization effects were tested with an ANOVA involving hemisphere (2 levels) and all electrodes except midline ones (27 levels).
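To make the off-line processing chain concrete, the following Python sketch illustrates the main steps described above: zero phase shift 40 Hz low-pass filtering, epoching from –100 to 1000 ms, baseline correction, mastoid re-referencing, and peak detection on the global field power (computed here as the spatial standard deviation across electrodes at each time point, one common operationalization of the vector-norm measure). This is a minimal sketch under stated assumptions, not the authors' code; ocular correction and artifact rejection are omitted, and channel indices are illustrative.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # sampling rate in Hz (data were recorded at 1 kHz)

def lowpass_40hz(eeg):
    """Zero phase shift 40 Hz low-pass: filtfilt runs the filter forward and backward."""
    b, a = butter(4, 40, btype="low", fs=FS)
    return filtfilt(b, a, eeg, axis=-1)

def epoch(eeg, onsets):
    """Cut epochs from -100 to 1000 ms around sound onsets; eeg is (channels, samples)."""
    pre, post = int(0.100 * FS), int(1.000 * FS)
    return np.stack([eeg[:, t - pre:t + post] for t in onsets])

def baseline_correct(epochs):
    """Subtract the mean of the 100 ms prestimulus interval from every channel."""
    pre = int(0.100 * FS)
    return epochs - epochs[:, :, :pre].mean(axis=2, keepdims=True)

def rereference(avg, left_mastoid=0, right_mastoid=1):
    """Re-reference an average (channels, samples) to the mean of the two mastoids."""
    return avg - (avg[left_mastoid] + avg[right_mastoid]) / 2.0

def gfp(avg):
    """Global field power: spatial standard deviation across electrodes per sample."""
    return avg.std(axis=0)

def gfp_peak_latency(avg, t0_ms, t1_ms):
    """Latency (ms after onset) of the GFP maximum in [t0, t1] ms, e.g., 370-500 (N400)."""
    pre = int(0.100 * FS)
    a, b = pre + int(t0_ms * FS / 1000), pre + int(t1_ms * FS / 1000)
    return (np.argmax(gfp(avg)[a:b]) + a - pre) / FS * 1000
```

Per-participant mean amplitudes in a component window can then be submitted to a 2 × 2 repeated measures ANOVA, for example with statsmodels; the data frame below uses placeholder values, one row per participant and design cell.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

df = pd.DataFrame({
    "subject":    [s for s in range(14) for _ in range(4)],
    "conceptual": ["CR", "CR", "CU", "CU"] * 14,
    "emotional":  ["EC", "EI", "EC", "EI"] * 14,
    "amplitude":  np.random.randn(56),  # placeholder amplitudes
})
# Reports main effects of each factor and their interaction.
print(AnovaRM(df, depvar="amplitude", subject="subject",
              within=["conceptual", "emotional"]).fit())
```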

Results

Behavioral Results

Figure 1. Behavioral results in the conceptual relatedness judgment task (Experiment 1). Mean reaction times (bars, left-hand scale) and error rates (circles, right-hand scale) in the 4 experimental conditions. CR/CU: conceptually related/unrelated; EC/EI: emotionally congruent/incongruent. Error bars depict the standard error of the mean in all cases. Note that the error bars are too small to be visible for mean error rates in conditions CUEC and CUEI.

In the first experiment, no main effect of conceptual relatedness or emotional congruence was found on reaction times, and no interaction was found between the 2 factors. Participants made significantly more errors for conceptually related dyad--sound pairs (mean = 24.8 ± 2.7%) than for unrelated pairs (mean = 2.1 ± 0.4%; F(1,13) = 74.4, P < 0.0001). There was no main effect of emotional congruence and no interaction between the 2 factors on error rates (Fig. 1). In the second experiment, manipulations of conceptual relatedness and emotional congruency had no main effect on reaction times, and there was no interaction between these 2 factors. The task-relevant factor of emotional congruence increased the mean error rate for congruent pairs (mean = 27.1 ± 4.9%) as compared with incongruent pairs (mean = 5.3 ± 1.9%; F(1,13) = 80.6, P < 0.0001). There was no main effect of conceptual relatedness and no interaction between the 2 factors on error rates (Fig. 2). Importantly, the pattern of reaction times did not change over the course of either experiment.

ERP Results

In Experiment 1, ERPs elicited by environmental sounds displayed a sequence of peaks typically associated with the processing of meaningful auditory stimuli: N1, P2a, N2, P2b, and N400. None of the component latencies were affected by the experimental conditions. The N1 peaked at 104 ± 8 ms on average and was maximal at Cz. N1 amplitude was not significantly affected by either experimental factor. The P2a peaked at 189 ± 18 ms on average and was maximal at Cz. Unexpectedly, P2a mean amplitude was significantly reduced for conceptually unrelated as compared with related sounds (F(1,13) = 8.31, P < 0.05), but there was no main effect of emotional congruence and no interaction. The N2 and P2b peaked on average at 290 ± 16 ms and 347 ± 13 ms, respectively, and were maximal at Fz. Their mean amplitudes were significantly affected by conceptual relatedness (N2: F(1,13) = 31.7, P < 0.0001; P2b: F(1,13) = 25, P < 0.0001) in the same direction as in the P2a range (Fig. 3a), but there was no main effect of emotional congruence (Fig. 3b) and no interaction between the 2 factors. The N400 peaked at 462 ± 49 ms on average and was maximal over frontocentral regions. N400 mean amplitudes were significantly affected by conceptual relatedness (F(1,13) = 20.9, P < 0.001, η² = 0.77), such that unrelated targets elicited greater N400 amplitudes than related targets. There was also a significant electrode main effect (F(61,793) = 9.66, P < 0.0001, η² = 0.08), relating to maximal overall N400 amplitudes over frontocentral electrodes (Fig. 3a). In addition, there was a main effect of emotional congruence (F(1,13) = 8.6, P < 0.02, η² = 0.14): emotionally incongruent sounds elicited greater N400 amplitudes than the same sounds presented as emotionally congruent targets (Fig. 3b). However, emotional congruence did not interact with the electrode factor (F(1,13) = 2.1, P > 0.1). Critically, there was no interaction between conceptual relatedness and emotional congruence in the N400 range (F(1,13) = 0.62, P > 0.1).

Consistent with the pattern of results in Experiment 1, early ERP components (i.e., N1 and P1) were not affected by either conceptual relatedness or emotional congruency in the second experiment. Significant priming effects were found in the N400 range (i.e., around 200--500 ms on average), where conceptual relatedness (F(1,13) = 16.1, P < 0.001, η² = 0.35) and emotional congruency (F(1,13) = 19.6, P < 0.001, η² = 0.53) each reduced mean ERP amplitude (Fig. 4a,b). Again, both effects showed a frontocentral distribution that was maximal at Cz. No interaction between these 2 factors was found within this temporal window (F(1,13) = 0.77, P > 0.1).

Figure 2. Behavioral results in the emotional congruence judgment task (Experiment 2). Mean reaction times (bars, left-hand scale) and error rates (circles, right-hand scale) in the 4 experimental conditions. CR/CU: conceptually related/unrelated; EC/EI: emotionally congruent/incongruent. Error bars depict the standard error of the mean in all cases.

Figure 3. Event-related potentials elicited at 9 electrode sites in Experiment 1. (a) Main effect of conceptual relatedness. (b) Main effect of emotional congruence.

Figure 4. Event-related potentials elicited at 9 electrode sites in Experiment 2. (a) Main effect of conceptual relatedness. (b) Main effect of emotional congruence.



Discussion

The aim of this study was to determine whether the emotional processing of visually presented words and environmental sounds is independent from the processing of conceptual information as indexed by classical priming effects. This was achieved by observing the priming effect of the 2 factors on the amplitude of the N400. We found significant modulations of the N400 by conceptual relatedness and emotional congruence, respectively, with no interaction between the 2 factors, whether participants focused on the conceptual or the emotional dimension of the stimuli.

In both Experiments 1 and 2, we found no main effect of conceptual relatedness or emotional congruency on reaction times. Since the behavioral data were derived from correct trials only, the lack of behavioral priming cannot simply be attributed to a disagreement between participants and experimenters on what does or does not constitute a related pair. Moreover, we found that error rates were relatively high for conceptually related or emotionally congruent pairs and low for unrelated or incongruent pairs. Taking into consideration the absence of a reaction time difference between conditions, this pattern of results suggests that both the conceptual and the emotional task were demanding and that we were successful in directing participants' attention to task-relevant properties of the stimuli, since participants were biased toward judging the dyad--sound pairs as unrelated or incongruent. These findings do not affect our interpretation of the ERP data because only trials in which participants made a decision in agreement with experimenters regarding the relationship between words and sounds were included.




However, in both experiments, the task-irrelevant factor (emotional congruency in Experiment 1 and conceptual relatedness in Experiment 2) failed to affect either reaction times or error rates, making it impossible to evaluate the implicit processing of emotional congruence and conceptual relatedness on the basis of behavioral performance in these experiments.

In Experiment 1, the ERP modulation elicited by conceptual relatedness appeared earlier than is traditionally reported in priming studies involving spoken words (e.g., Van Petten and Rheinfelder 1995; Radeau et al. 1998; Thierry et al. 1998; Thierry, Cardebat, et al. 2003). However, our result is consistent with previously reported differences in N400 peak latencies elicited by environmental sounds and words (Cummings et al. 2006). It may be due to a greater immediacy of access to meaning for environmental sounds than for words, and it is compatible with recent demonstrations of semantic effects before the N400 time window (Hauk et al. 2006). In addition, the topography of the N400 elicited by conceptually unrelated sounds tended to be right lateralized, consistent with previously reported N400 topographies elicited by spoken words presented outside a sentence context (Kutas and Iragui 1998; Cummings et al. 2006; but see Van Petten and Rheinfelder 1995; Thierry et al. 1998; Thierry, Cardebat, et al. 2003). This result is reminiscent of classical dichotic listening studies showing a left-ear (right hemisphere) advantage for the interpretation of environmental sounds (Kimura 1967; King and Kimura 1972) and, more recently, of neuroimaging data suggesting right-hemispheric involvement in accessing the meaning of environmental sounds/nonverbal information (Humphries et al. 2001; Thierry, Giraud, et al. 2003; Thierry and Price 2006). Even though there is evidence that "semantic access" is lateralized (Thierry and Price 2006), it must be kept in mind that there is large anatomical overlap in the structures ultimately activated by words and sounds, that is, the "semantic system" (Saygin et al. 2003; Thierry, Giraud, et al. 2003; Cummings et al. 2006).

The main effect of emotional congruence on mean amplitudes in the N400 range is consistent with the hypothesis that emotionally incongruent information increases the difficulty of contextual integration (Bentin et al. 1993; Chwilla et al. 1995). Schirmer et al. (2002) reported similar priming effects of emotional congruence between speech prosody and emotional valence in word processing. However, the sex effects reported by Schirmer and Kotz (2003) and Schirmer et al. (2002) suggest that our results may only apply to female listeners, since 12 of our 14 participants were women (but see Schirmer et al. 2005). Critically, the absence of an interaction between conceptual relatedness and emotional congruence supports our hypothesis that the emotional valence congruency effect reported here is independent from the analysis of other conceptual properties.

The ERP results of Experiment 2, where participants made emotional congruence rather than conceptual relatedness judgments, essentially replicated those of Experiment 1: priming effects were observed for both conceptual relatedness and emotional congruency, and no interaction was found between the 2 factors, indicating that the dissociation between conceptual and emotional processing observed in Experiment 1 is not task specific. However, the priming effects of the conceptual and emotional manipulations in Experiment 2 were of similar magnitude and latency, contrasting with the asymmetry of the priming effects in Experiment 1. This suggests that the relatively smaller effect of emotional priming in the first experiment is due to its implicit nature rather than to the insignificance of the emotional congruence manipulation.



Interestingly, we did not observe the reverse pattern of ERP modulations (i.e., a larger and earlier effect of emotional priming relative to conceptual priming) in Experiment 2. One explanation is that the cross-modal priming paradigm adopted in the present study was more sensitive to conceptual relatedness than to emotional congruence. Another possible interpretation of this result is that the N400 itself is less sensitive to emotional priming than to conceptual priming: it has been shown elsewhere that the N400 for words and sounds is modulated by automatic processing of conceptual features, even in the absence of behavioral priming effects (e.g., see Heil et al. 2004; Orgs et al. 2007, 2008).

Our results therefore show that N400 modulations by conceptual relatedness and emotional congruence are to some extent functionally independent. Given that a number of previous studies have reported atypical N400 topographies in verbal (e.g., Deacon et al. 1995; Rugg et al. 1998; De Diego Balaguer et al. 2005) and nonverbal (e.g., Huddy et al. 2003; Zhou et al. 2004; Kelly et al. 2006) contexts, these 2 processes may involve partially distinct neural substrata.

Conclusion

The present study provides scientific evidence addressing a long-debated philosophical question: the relative independence between conceptual and affective evaluation of meaningful information. In a cross-modal verbal--nonverbal context (word--sound priming), independent modulations of ERPs induced by the processing of emotional congruence and conceptual relatedness suggest that the 2 forms of information processing can be functionally dissociated regardless of task demands. Future studies will determine whether such relative independence can be generalized to the processing of other categories of meaningful stimuli.

Funding

Biotechnology and Biological Sciences Research Council UK (grant # 5/S18007); European Research Council (grant ERC-SG209704 to D.D., Y.J.W., and G.T.).

Notes

The authors wish to thank Bastien Boutonnet for assistance with data collection and Dr Clara Martin for useful discussions. Conflict of Interest: None declared.

Appendix

List of stimuli used in the experiments. Each environmental sound served as the target for 2 adjective--noun dyads sharing the same noun (one positive and one negative adjective).

Dyads paired with negative sounds:
Adorable/Terrible car—car alarm
Good/Dreadful plane—airplane attack
Joyful/Tense alert—alert alarm
Romantic/Rough storm—wind storm
Safe/Toxic tree—chainsaw
Beautiful/Overwhelming warmth—explosion
Happy/Hurt cat—cat scream
Lucky/Timid snake—snake die
Relaxing/Fearful dentist—dentist drill
Impressive/Upset pig—pig scream
Proud/Nasty shotgun—machine gun
Thoughtful/Noisy door—door slam
Devoted/Morbid patient—ambulance
Luscious/Rotten honey—bees in heat
Satisfied/Angry dog—dog growl
Gentle/Bloody hit—punch series
Masterful/Useless hairdryer—hair dryer
Sexy/Boring foam—shave
Pleasurable/Embarrassing clothing—cloth ripping
Astonished/Obnoxious rattle—snake rattle

Dyads paired with positive sounds:
Inspiring/Depressing music—drums
Terrific/Cold lake—duck quack
Outstanding/Hostile trunk—elephant
Nice/Stiff food—microwave
Famous/Nervous horse—horse gallop
Friendly/Hungry cow—cow
Powerful/Delayed engine—helicopter
Tidy/Odd locker—keys
Natural/Narcotic milk—sheep
Lively/Scared people—crowd
Smooth/Severe storm—thunder storm
Wise/Violent owl—barn owl
Loved/Sick kitten—kitten
Merry/Sad Christmas—sleigh bells
Erotic/Regretful kiss—kiss
Useful/Distressed insect—cricket
Jolly/Rejected heart—heart beat
Silly/Terrified bird—bird song
Grateful/Messy puppy—puppy
Untroubled/Evil laughter—laugh

References

Bentin S, Kutas M, Hillyard SA. 1993. Electrophysiological evidence for task effects on semantic priming in auditory word processing. Psychophysiology. 30:161--169.
Bradley MM, Lang PJ. 1999. Affective norms for English words (ANEW). Gainesville (FL): The NIMH Center for the Study of Emotion and Attention, University of Florida.
Chiu C-YP, Schacter DL. 1995. Auditory priming for nonverbal information: implicit and explicit memory for environmental sounds. Conscious Cogn. 4:440--458.
Chwilla DJ, Brown CM, Hagoort P. 1995. The N400 as a function of the level of processing. Psychophysiology. 32:274--285.
Coltheart M. 1981. The MRC psycholinguistic database. Q J Exp Psychol. 33:497--505.
Connolly JF, Byrne JM, Dywan CA. 1995. Assessing adult receptive vocabulary with event-related potentials: an investigation of cross-modal and cross-form priming. J Clin Exp Neuropsychol. 17:548--565.
Cummings A, Ceponiene R, Koyama A, Saygin AP, Townsend J, Dick F. 2006. Auditory semantic networks for words and natural sounds. Brain Res. 1115:92--107.
Deacon D, Mehta A, Tinsley C, Nousak JM. 1995. Variation in the latencies and amplitudes of N400 and NA as a function of semantic priming. Psychophysiology. 32:560--570.
De Diego Balaguer R, Sebastian-Galles N, Diaz B, Rodriguez-Fornells A. 2005. Morphological processing in early bilinguals: an ERP study of regular and irregular verb processing. Brain Res Cogn Brain Res. 25:312--327.
Fischler I, Bradley M. 2006. Event-related potential studies of language and emotion: words, phrases, and task effects. Prog Brain Res. 156:185--203.
Gratton G, Coles MG, Donchin E. 1983. A new method for off-line removal of ocular artifact. Electroencephalogr Clin Neurophysiol. 55:468--484.
Grigor J, Van Toller S, Behan J, Richardson A. 1999. The effect of odour priming on long latency visual evoked potentials of matching and mismatching objects. Chem Senses. 24:137--144.
Hauk O, Davis MH, Ford M, Pulvermüller F, Marslen-Wilson WD. 2006. The time course of visual word recognition as revealed by linear regression analysis of ERP data. Neuroimage. 30:1383--1400.
Heil M, Rolke B, Pecchinenda A. 2004. Automatic semantic activation is no myth: semantic context effects on the N400 in the letter-search task in the absence of response time effects. Psychol Sci. 15:852--857.
Herbert C, Junghöfer M, Kissler J. 2008. Event related potentials to emotional adjectives during reading. Psychophysiology. 45(3):487--498.
Holt DJ, Lynn SK, Kuperberg GR. 2008. Neurophysiological correlates of comprehending emotional meaning in context. J Cogn Neurosci. 21(11):2245--2262.




Huddy V, Schweinberger SR, Jentzsch I, Burton AM. 2003. Matching faces for semantic information and names: an event-related brain potentials study. Brain Res Cogn Brain Res. 17:314--326.
Humphries C, Willard K, Buchsbaum B, Hickok G. 2001. Role of anterior temporal cortex in auditory sentence comprehension: an fMRI study. Neuroreport. 12:1749--1752.
Kelly SD, Ward S, Creigh P, Bartolotti J. 2006. An intentional stance modulates the integration of gesture and speech during comprehension. Brain Lang. 101(3):222--233.
Kimura D. 1967. Functional asymmetry of the brain in dichotic listening. Cortex. 3:163--178.
King FL, Kimura D. 1972. Left-ear superiority in dichotic perception of vocal nonverbal sounds. Can J Psychol. 26:111--116.
Kissler J, Herbert C, Peyk P, Junghöfer M. 2007. Buzzwords: early cortical responses to emotional words during reading. Psychol Sci. 18(6):475--480.
Kutas M, Federmeier KD. 2000. Electrophysiology reveals semantic memory use in language comprehension. Trends Cogn Sci. 4:463--470.
Kutas M, Hillyard SA. 1980. Reading senseless sentences: brain potentials reflect semantic incongruity. Science. 207:203--205.
Kutas M, Hillyard SA. 1984. Brain potentials during reading reflect word expectancy and semantic association. Nature. 307:161--163.
Kutas M, Iragui V. 1998. The N400 in a semantic categorization task across six decades. Electroencephalogr Clin Neurophysiol. 108:456--471.
Oldfield RC. 1971. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia. 9:97--113.
Orgs G, Lange K, Dombrowski JH, Heil M. 2006. Conceptual priming for environmental sounds and words: an ERP study. Brain Cogn. 62:267--272.
Orgs G, Lange K, Dombrowski JH, Heil M. 2007. Is conceptual priming for environmental sounds obligatory? Int J Psychophysiol. 65:162--166.
Orgs G, Lange K, Dombrowski JH, Heil M. 2008. N400-effects to task-irrelevant environmental sounds: further evidence for obligatory conceptual processing. Neurosci Lett. 436(2):133--137.
Picton TW, Bentin S, Berg P, Donchin E, Hillyard SA, Johnson R Jr, Miller GA, Ritter W, Ruchkin DS, Rugg MD, et al. 2000. Guidelines for using human event-related potentials to study cognition: recording standards and publication criteria. Psychophysiology. 37(2):127--152.


Radeau M, Besson M, Fonteneau E, Castro SL. 1998. Semantic, repetition and rime priming between spoken words: behavioral and electrophysiological evidence. Biol Psychol. 48:183--204.
Rugg MD, Mark RE, Walla P, Schloerscheidt AM, Birch CS, Allan K. 1998. Dissociation of the neural correlates of implicit and explicit memory. Nature. 392:595--598.
Saygin AP, Dick F, Wilson SM, Dronkers NF, Bates E. 2003. Neural resources for processing language and environmental sounds: evidence from aphasia. Brain. 126:928--945.
Schirmer A, Kotz SA. 2003. ERP evidence for a sex-specific Stroop effect in emotional speech. J Cogn Neurosci. 15:1135--1148.
Schirmer A, Kotz SA, Friederici AD. 2002. Sex differentiates the role of emotional prosody during word processing. Brain Res Cogn Brain Res. 14:228--233.
Schirmer A, Kotz SA, Friederici AD. 2005. On the role of attention for the processing of emotions in speech: sex differences revisited. Brain Res Cogn Brain Res. 24:442--452.
Scott GG, O'Donnell PJ, Leuthold H, Sereno SC. 2009. Early emotion word processing: evidence from event-related potentials. Biol Psychol. 80:95--104.
Stuart GP, Jones DM. 1995. Priming the identification of environmental sounds. Q J Exp Psychol. 48A:741--761.
Thierry G, Cardebat D, Demonet JF. 2003. Electrophysiological comparison of grammatical processing and semantic processing of single spoken nouns. Brain Res Cogn Brain Res. 17:535--547.
Thierry G, Doyon B, Demonet JF. 1998. ERP mapping in phonological and lexical semantic monitoring tasks: a study complementing previous PET results. Neuroimage. 8:391--408.
Thierry G, Giraud AL, Price C. 2003. Hemispheric dissociation in access to the human semantic system. Neuron. 38:499--506.
Thierry G, Price CJ. 2006. Dissociating verbal and nonverbal conceptual processing in the human brain. J Cogn Neurosci. 18:1018--1028.
Thierry G, Roberts MV. 2007. Event-related potential study of attention capture by affective sounds. Neuroreport. 18:245--248.
Van Petten C, Rheinfelder H. 1995. Conceptual relationships between spoken words and environmental sounds: event-related brain potential measures. Neuropsychologia. 33:485--508.
Zhou S, Zhou W, Chen X. 2004. Spatiotemporal analysis of ERP during Chinese idiom comprehension. Brain Topogr. 17:27--37.
