Facial expressions and complex IAPS pictures: Common and differential networks

Jennifer C. Britton,a,* Stephan F. Taylor,b Keith D. Sudheimer,a and Israel Liberzon b,c

a Department of Neuroscience, University of Michigan, Ann Arbor, MI 48109, USA
b Department of Psychiatry, University of Michigan, Ann Arbor, MI 48109, USA
c Psychiatry Service, Ann Arbor VAMC, Ann Arbor, MI 48105, USA

* Corresponding author. Massachusetts General Hospital, Psychiatry Department, Building 149 Thirteenth Street, Charlestown, MA 02129, USA. Fax: +1 617 726 4078. E-mail address: [email protected] (J.C. Britton).

doi:10.1016/j.neuroimage.2005.12.050

Received 20 July 2005; revised 11 December 2005; accepted 16 December 2005

Neuroimaging studies investigating emotion have commonly used two different visual stimulus formats, facial expressions of emotion or emotionally evocative scenes. However, it remains an important unanswered question whether or not these different stimulus formats entail the same processes. Facial expressions of emotion may elicit more emotion recognition/perception, and evocative pictures may elicit more direct experience of emotion. In spite of these differences, common areas of activation have been reported across different studies, but little work has investigated activations in response to the two stimulus formats in the same subjects. In this fMRI study, we compared BOLD activation patterns to facial expressions of emotion and to complex emotional pictures from the International Affective Picture System (IAPS) to determine if these stimuli would activate similar or distinct brain regions. Healthy volunteers passively viewed blocks of expressive faces and IAPS pictures balanced for specific emotion (happy, sad, anger, fear, neutral), interleaved with blocks of fixation. Eye movement, reaction times, and off-line subjective ratings including discrete emotion, valence, and arousal were also recorded. Both faces and IAPS pictures activated similar structures, including the amygdala, posterior hippocampus, ventromedial prefrontal cortex, and visual cortex. In addition, expressive faces uniquely activated the superior temporal gyrus, insula, and anterior cingulate more than IAPS pictures, despite the faces being less arousing. For the most part, these regions were activated in response to all specific emotions; however, some regions responded only to a subset.
© 2006 Elsevier Inc. All rights reserved.

Introduction

Emotion research utilizes different types of stimuli (e.g. expressive faces and complex evocative pictures) to probe affective processing; however, the two lines of investigation have remained relatively separate. Facial expressions are often viewed as external signals of experienced emotions that communicate information to the observer (Frank and Stennett, 2001). Facial expressions portraying specific emotions (e.g. happy, sad, anger, fear) are universally recognized (Ekman, 1992, 1994; Izard, 1994), and each expression of discrete emotion has meaning, targeting a specific response (Halberstadt and Niedenthal, 1997). Even though facial expressions are used frequently as probes of emotion recognition, some studies have shown that faces can also be inducers of emotion (Hatfield et al., 1992; Wild et al., 2001). Facial expressions have also been shown to evoke physiological changes (Clark et al., 1992; Esteves and Ohman, 1993), and autonomic activity in response to facial expressions has been shown to correlate with neural activation (Williams et al., 2004). Complex pictures from the International Affective Picture System (IAPS), another common emotional probe, depict emotion-laden scenes to induce affective states. The standardized set of IAPS pictures has been rated in terms of the pictures' ability to induce valence (unpleasant/pleasant) and arousal (calm/excited) changes. These measures have also been correlated with viewers' heart rate and skin conductance changes, respectively, providing physiological validity to subjectively reported emotion induction (Lang et al., 1993). However, little work has been done to identify the discrete emotions elicited by these pictures.

Although both emotional faces and IAPS pictures target emotional processing, these two stimulus sets may preferentially engage certain brain structures involved in emotion. In addition, it is not known whether facial expressions and IAPS pictures of specific emotions (happy, sad, anger, and fear) would activate similar or discrete circuits. Studies of expressive faces and IAPS pictures suggest that a similar set of regions is involved in processing both emotional stimulus types. Expressive faces and IAPS pictures activate regions involved in emotion processing, including the amygdala (Breiter et al., 1996; Liberzon et al., 2003; Morris et al., 1996), hippocampus (Gur et al., 2002; Lane et al., 1997c), insula (Phan et al., 2004; Phillips et al., 1997), anterior cingulate (ACC; Killgore and Yurgelun-Todd, 2004; Morris et al., 1998), medial prefrontal cortex (mPFC; Kim et al., 2003; Taylor et al., 2003; Winston et al., 2003), ventromedial prefrontal cortex (vMPFC; Phan et al., 2004)/orbitofrontal cortex (OFC; Blair et al., 1999), and visual cortex (Liberzon et al., 2003; Morris et al., 1998).


Both stimulus types may recruit similar structures due to the underlying emotional processes activated within those regions (e.g. amygdala activation reflecting fear (LeDoux, 2000) or stimulus salience (Liberzon et al., 2003), insula activation reflecting somatic/visceral responses (Damasio, 1999) and disgust perception (Phillips et al., 1997), anterior cingulate activation reflecting attention and self-awareness (Lane et al., 1997a), and medial prefrontal activation reflecting emotion regulation (Davidson et al., 2000)). However, few studies have compared these stimuli directly. In a single study comparing threat-related stimuli, bilateral amygdala activation was found in response to both expressive faces and IAPS pictures (Hariri et al., 2003); however, the low Z scores and the cognitive matching task in that study prevent any definitive conclusions regarding the common and differential emotional networks activated by these emotional stimuli.

Even though expressive faces and complex pictures may activate a similar set of regions, emotional perception of faces is thought to be processed by distinct circuitry, given the role of emotional facial expressions in transacting social behavior (Calder et al., 2001), including the superior temporal gyrus (STG) and amygdala (Adolphs et al., 2002; Winston et al., 2003). Facial expressions of emotion have characteristic profiles (e.g. a protruded tongue when disgusted, contracted eyebrows when angry) (Darwin, 1998), and the STG has been shown to respond to variable aspects of facial expressions (Narumoto et al., 2001). In some studies, the superior temporal gyrus has also been shown to respond preferentially to faces relative to pictures (Geday et al., 2003). Lesion and neuroimaging studies highlight the robustness of the amygdala response to faces. Amygdala lesions have been shown to impair fear recognition (Yang et al., 2002a). Neuroimaging studies have shown increased amygdala activity when viewing fear (Breiter et al., 1996; Hariri et al., 2003; Morris et al., 1996; Phillips et al., 1997; Whalen et al., 2001), angry (Whalen et al., 2001), sad (Blair et al., 1999), and happy facial expressions (Breiter et al., 1996; Dolan et al., 1996). Even though IAPS pictures also activate these regions, emotional information from facial expressions may be processed preferentially by the superior temporal gyrus and amygdala.

In the current study, we aimed to examine the neural correlates of responses to expressive faces and IAPS pictures. Do these emotional probes elicit similar or distinct activation patterns? To compare BOLD responses to expressive faces and IAPS pictures effectively, stimulus properties (e.g. specific emotion, valence, and arousal) had to be balanced, but only a few studies have examined the emotion induction capability of facial expressions (Wild et al., 2001) or the profiles of specific emotions induced by IAPS pictures (Davis et al., 1995). Therefore, a behavioral experiment was conducted to match stimuli on these features. Subsequently, a block-design fMRI study was conducted to examine the neural correlates of processing facial expressions and IAPS pictures balanced on specific emotion. We hypothesized that facial expressions and IAPS pictures would activate a similar emotional network, and that some brain regions (superior temporal gyrus and amygdala) would preferentially respond to facial expressions.

Methods

Participants

Healthy volunteers were recruited from advertisements placed at local universities. Demographics are outlined in Table 1.

Table 1. Demographics

                     Behavioral Group 1   Behavioral Group 2   fMRI
Participants         60                   60                   12
Gender (males)       30                   30                   6
Age (years)          21.6 ± 3.0 (SD)      22.1 ± 2.9           21.4 ± 2.2
Caucasian            43                   44                   9
African American     3                    2                    1
Asian                8                    6                    2
Indian               4                    4                    0
Hispanic             2                    4                    0

All participants were between 18 and 30 years of age, right-handed, English speaking, and had normal or corrected-to-normal visual acuity. Participants did not have a current or prior history of head injury, learning disability, psychiatric illness, medical illness, or substance abuse/dependence (> 6 months). For the fMRI study, a formal screening assessment (Mini SCID) was used (Sheehan et al., 1998). After explanation of the experimental protocol, all participants gave written informed consent, as approved by the University of Michigan Institutional Review Board. Participants were paid for their participation.

Experiment 1: Behavioral study

Stimuli

The image set included 150 facial expressions of specific emotions posed and evoked by actors balanced for gender and ethnicity (Gur et al., 2002) and 200 IAPS pictures (Lang et al., 1997). These images were selected to target the emotions of happiness (babies, Mickey Mouse, sporting events), sadness (funeral scenes/cemeteries, premature babies, wounded bodies), anger (human violence, guns, KKK images), and fear (snakes, spiders, sharks, medical procedures) in equal quantities. In addition, neutral or nonemotional images (mushrooms, household items) were also selected. All images were converted from color to gray scale/black and white using Photoshop 6.0 (Adobe Systems, San Jose, CA) and matched on luminance.

Procedure

Volunteers participated in separate rating-task experiments (Group 1: IAPS rating task; Group 2: face rating task). For both, participants were seated in front of a laptop computer (Dell Inspiron 2650) in a quiet experimental room. After viewing an image for 3 s, participants were prompted to rate it. IAPS pictures were rated only on (1) predominant emotion and (2) emotion intensity, because standardized ratings of valence and arousal for each picture have been published (Lang et al., 1997). Facial expressions were rated on (1) predominant emotion, (2) emotion intensity, and also on (3) valence and (4) arousal. The predominant emotion rating instructions were "Indicate the predominant emotion that is depicted in the image given the following options: happy, neutral, sad, anger, fear, and disgust". The emotion intensity rating instructions were "Indicate the degree/intensity of the selected emotion (1 = not at all, 2 = mildly, 3 = moderately, 4 = strongly, 5 = extremely)". The valence rating instructions were "Rate how unpleasant or pleasant the image makes you feel using a 1-9 scale (1 = very unpleasant, 5 = neutral, 9 = very pleasant)". The arousal rating instructions were "Rate how emotionally intense or arousing the image makes you feel using a 1-9 scale (1 = calm, 5 = somewhat aroused, 9 = excited)".
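For concreteness, the grayscale conversion and mean-luminance matching described under Stimuli could be approximated along the following lines. This is only a sketch of the general operation, not the authors' Photoshop procedure; the file names and the target luminance value are placeholders.

    # Sketch: convert an image to grayscale and shift its mean luminance
    # toward a common target, approximating the matching step described above.
    # TARGET_MEAN and the file names are illustrative assumptions.
    import numpy as np
    from PIL import Image

    TARGET_MEAN = 128.0  # assumed common mean luminance on a 0-255 scale

    def to_matched_grayscale(path_in, path_out, target=TARGET_MEAN):
        gray = np.asarray(Image.open(path_in).convert("L"), dtype=np.float64)
        gray = gray + (target - gray.mean())   # shift mean luminance to target
        gray = np.clip(gray, 0, 255)           # keep values displayable
        Image.fromarray(gray.astype(np.uint8)).save(path_out)

    # to_matched_grayscale("picture_001.jpg", "picture_001_gray.png")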


Analysis

For each image, the frequencies of the two most reported emotions were compared using chi-squared analysis. Images with significant chi-squared values (P < 0.05) were classified according to the predominant emotion. The number of images in each specific emotion category was compiled, and the percentage agreement across participants was calculated. The valence and arousal ratings obtained for the faces and the standardized IAPS picture ratings were compared using t tests. A series of t tests also compared valence and arousal ratings within each specific emotion category.
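The classification rule above can be made concrete with a short sketch. The rating counts are hypothetical; scipy's chisquare compares the two top-ranked emotion frequencies against an even split (the null hypothesis of no predominant emotion).

    # Sketch of the predominant-emotion classification described above.
    from scipy.stats import chisquare

    def classify_image(rating_counts, alpha=0.05):
        """rating_counts maps emotion label -> number of raters choosing it."""
        (top, n1), (runner_up, n2) = sorted(
            rating_counts.items(), key=lambda kv: kv[1], reverse=True)[:2]
        # Null hypothesis: the two most reported emotions are evenly split.
        stat, p = chisquare([n1, n2])
        return top if p < alpha else None   # None = no predominant emotion

    counts = {"happy": 41, "neutral": 10, "sad": 5, "anger": 2, "fear": 1, "disgust": 1}
    print(classify_image(counts))           # -> "happy"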

Results

Using the criteria described above (significance on a chi-squared test), 82.6% of the 150 facial expression stimuli were classified according to specific emotions (happy: 19.3%, neutral: 18.0%, sad: 17.3%, anger: 12.0%, fear: 16.0%). From the set of 200 IAPS pictures, 67.5% were classified according to specific emotions (happy: 19.5%, neutral: 16.0%, sad: 13.5%, anger: 7.5%, fear: 11.0%) (Fig. 1). After assigning the images to specific emotion categories, the percent agreement was analyzed (Fig. 1A). In general, more agreement was detected for the emotional faces (83.7%) than for the IAPS pictures (75.2%). In addition, happy images showed the most agreement (>90%).

Standardized valence and arousal ratings for the IAPS picture set (Lang et al., 1997) were compared to the ratings of facial expressions obtained from our participants (Figs. 1B-D). The IAPS pictures were rated higher on valence (i.e. more pleasant or more unpleasant) than faces in each specific emotion category (post hoc pairwise t tests: P < 0.001) except anger (P > 0.241). Happy and neutral IAPS pictures were rated more positively than happy and neutral facial expressions, respectively. Sad and fear IAPS pictures were rated more negatively than sad and fear faces, respectively. The arousal rating for the faces (3.19 ± 0.06 (SEM)) was lower than for IAPS pictures (5.07 ± 0.09) for all specific emotion categories [t(334.9) = 17.56, P < 0.001; post hoc pairwise t tests: P < 0.001].

Experiment 2: fMRI study

Procedure

Volunteers were placed comfortably within the scanner. A light restraint was used to limit head movement during acquisition. While participants lay inside the scanner, stimuli were presented via MRI-compatible display goggles (VisuaStim XGA, Resonance Technology) mounted on the RF head coil and adjusted to ensure an unobstructed field of view.

Fig. 1. Behavioral ratings. Each stimulus set (expressive faces and complex IAPS pictures) was rated on several dimensions. (A) Percentage of participants agreeing with predominant emotion assigned to each image. (B) Valence (1 = very unpleasant, 5 = neutral, 9 = very pleasant) and arousal (1 = calm, 9 = excited) ratings plotted for each image. (C) Mean and standard error of valence ratings for images within each assigned discrete emotion category. (D) Mean and standard error of arousal ratings for images within each assigned discrete emotion category.

Stimuli were displayed using E-Prime software (Psychology Software Tools, Inc.; Schneider et al., 2002a,b). In addition, E-Prime recorded participants' subjective responses via a right-handed button-glove.

Using a block design, expressive faces and IAPS pictures were interleaved with control periods. Images for each specific emotion block (happy, neutral, sad, anger, fear) were identified using the emotion ratings and emotion intensities obtained in the behavioral experiment. Emotion block order was counterbalanced across the entire scanning session. Four emotional stimuli were presented in each face and IAPS picture block. Each image within a block was shown for 4 s with no interstimulus interval. Two gray scale fixation images were presented during control periods. The sequence of face and picture blocks was repeated eight times within each run, and eight runs were acquired. Each stimulus block was repeated in the second half of the experiment; while the stimuli within each block were maintained, the block order within each run was counterbalanced.

Participants passively viewed each image and responded via button-press with the right index finger to indicate when a new image appeared on the screen. The reaction time of this response was recorded and used to monitor task performance. In addition, eye movements were monitored with an infrared camera within the display goggles that sampled pupil location at 30 Hz with an accuracy of 1.0° of visual arc (ViewPoint Eyetracker, Arrington Research).

Before scanning, participants were introduced to a brief version of the task, consisting of one block of neutral expressive faces and one block of neutral complex pictures interspersed with fixation. The images displayed in this practice session were not repeated during image acquisition. Immediately following scanning, participants completed a self-paced rating task outside the scanner, similar to the procedure of the behavioral experiment. Maintaining image order within each block, participants rated each image on several dimensions: predominant emotion (forced-choice selection among happy, neutral, sad, anger, fear, and disgust), associated emotional intensity (1 = not at all, 5 = extremely), valence (1 = most unpleasant, 5 = neutral, 9 = most pleasant), and arousal (1 = calm, 9 = very excited). The block order was counterbalanced between subjects.

fMRI acquisition

Scanning was performed on a 3.0 T GE Signa system (Milwaukee, WI) using a standard radio frequency coil. A T1-weighted image was acquired for landmark identification to position subsequent scans. After initial acquisition of T1 structural images, functional images were acquired. To minimize susceptibility artifact (Yang et al., 2002b), whole-brain functional scans were acquired using a T2*-weighted reverse spiral sequence with BOLD (blood oxygenation level-dependent) contrast (echo time/TE = 30 ms, repetition time/TR = 2000 ms, frequency of 64 frames, flip angle = 90°, field of view/FOV = 20 cm, 40 contiguous 3 mm oblique axial slices/TR approximately parallel to the AC-PC line). Each run began with 6 'dummy' volumes (subsequently discarded) to allow for T1 equilibration effects. After the 8 functional runs were collected, a high-resolution T1 scan was also acquired to provide precise anatomical localization (3D SPGR, TR = 27 ms, minimum TE, flip angle = 25°, FOV = 24 cm, slice thickness of 1.0 cm, 60 slices/TR).
Images were reconstructed off-line using a gridding approach into a 128 × 128 display matrix with an effective spatial resolution of 3 mm isotropic voxels.
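The block structure described above can be summarized in a short scheduling sketch. The shuffling stands in for the counterbalancing procedure, and the fixation duration (two 4-s fixation images) is an inference from the text rather than a stated parameter.

    # Sketch of one run: face and IAPS blocks of four 4-s images,
    # interleaved with fixation control periods, order shuffled per run.
    import itertools, random

    EMOTIONS = ["happy", "neutral", "sad", "anger", "fear"]
    STIM_DUR, STIMS_PER_BLOCK = 4.0, 4      # seconds per image, images per block

    def build_run(rng):
        blocks = list(itertools.product(["face", "iaps"], EMOTIONS))
        rng.shuffle(blocks)                  # stand-in for counterbalancing
        schedule, t = [], 0.0
        for modality, emotion in blocks:
            schedule.append((t, "fixation"))
            t += STIM_DUR * 2                # two fixation images, assumed 4 s each
            schedule.append((t, f"{modality}/{emotion} block"))
            t += STIM_DUR * STIMS_PER_BLOCK
        return schedule

    for onset, label in build_run(random.Random(0))[:6]:
        print(f"{onset:6.1f} s  {label}")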

Analysis

Participants responded when a new image appeared on the screen in this passive viewing task, allowing on-task performance to be monitored. To test on-task performance, the number of responses and the reaction times were examined. The number of responses to face, IAPS picture, and fixation images was tallied. The reaction times were examined using a 2 (image: face, IAPS picture) × 5 (emotion: happy, neutral, sad, anger, fear) repeated measures ANOVA and post hoc analysis. Separate paired t tests were used to test differences between image types (face, IAPS picture, and fixation). In addition, paired t tests examined differences between the reaction times during the first and last parts of the experiment for each image type.

Preprocessing of the eye movement data occurred offline, beginning with the identification of eye blinks. Linear interpolation was then performed to correct for missing data points. The standard deviation of the eye position in the horizontal and vertical directions was calculated for each stimulus block using MATLAB (MathWorks, Inc., Sherborn, MA). The eye movement data in the horizontal and vertical directions were examined using separate 2 (image: faces, IAPS pictures) × 5 (emotion: happy, neutral, sad, anger, fear) repeated measures ANOVAs. Paired t tests examined the differences between faces, IAPS pictures, and fixation images.

The postscan ratings (valence and arousal) were examined using separate 2 (image type: faces, IAPS pictures) × 5 (emotion: happy, neutral, sad, anger, fear) repeated measures ANOVAs. Post hoc analysis determined significant main effects of image type and emotion. Paired t tests examined the differences between faces and IAPS pictures in each discrete emotion category.
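The eye-position preprocessing reduces to three steps: flag blinks, interpolate linearly across them, and take the standard deviation of gaze position within each stimulus block. A sketch with a simulated gaze trace (in practice the blink flags would come from the eye tracker):

    import numpy as np

    def interpolate_blinks(x, is_blink):
        """Linearly interpolate gaze samples flagged as blinks."""
        x = np.asarray(x, dtype=float).copy()
        good = ~is_blink
        x[is_blink] = np.interp(np.flatnonzero(is_blink),
                                np.flatnonzero(good), x[good])
        return x

    def block_sd(x, block_slices):
        """Standard deviation of eye position within each stimulus block."""
        return [float(np.std(x[sl])) for sl in block_slices]

    fs = 30                                       # ViewPoint sampling rate, Hz
    rng = np.random.default_rng(0)
    x = rng.normal(0, 0.01, fs * 32).cumsum()     # fake horizontal gaze trace
    blinks = np.zeros(x.size, dtype=bool)
    blinks[100:110] = True                        # pretend a blink was flagged here
    x_clean = interpolate_blinks(x, blinks)
    print(block_sd(x_clean, [slice(0, fs * 16), slice(fs * 16, fs * 32)]))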

fMRI analysis

Images were slice-time corrected, realigned, co-registered, normalized, and smoothed according to standard methods. Scans were slice-time corrected using sinc interpolation of the eight nearest neighbors in the time series (Oppenheim and Schafer, 1989) and realigned to the first acquired volume using AIR 3.08 routines (Woods et al., 1998). Additional preprocessing and image analysis of the BOLD signal were performed with Statistical Parametric Mapping (SPM99; Wellcome Institute of Cognitive Neurology, London, UK; www.fil.ion.ucl.ac.uk/spm) implemented in MATLAB. Images were co-registered with the high-resolution SPGR T1 image. This high-resolution image was then spatially normalized to the Montreal Neurological Institute (MNI152) template brain; the transformation parameters were then applied to the co-registered functional volumes, which were resliced and spatially smoothed with an isotropic 6 mm full-width-half-maximum (FWHM) Gaussian kernel to minimize noise and residual differences in gyral anatomy. Each normalized image set was band-pass filtered (high-pass filter = 100 s) to eliminate low-frequency signals (Ashburner et al., 1997; Friston et al., 1995).

The data were analyzed using a general linear model with parameters corresponding to each specific emotion (happy, neutral, sad, anger, and fear) and image type (expressive faces, IAPS pictures, and fixation images), modeling each run separately. Each stimulus block was convolved with a canonical hemodynamic response function (HRF). For each participant, parameter estimates of block-related activity were obtained at each voxel. Contrast images were calculated by applying appropriate linear contrasts to the parameter estimates of each block to produce statistical parametric maps (SPM{t}), which were transformed to a normal distribution (SPM{Z}). Relevant linear contrasts included image type main effects (e.g. Face − Fixation, IAPS − Fixation), specific emotion main effects within each image type (e.g. Happy Face − Fixation, Happy Face − Neutral Face), and emotion × image type interaction effects (e.g. [Emotional Face − Neutral Face] − [Emotional IAPS picture − Neutral IAPS picture], [Happy Face − Neutral Face] − [Happy IAPS picture − Neutral IAPS picture]).

To account for interindividual variability, an additional 6-mm smoothing was performed on the contrast images before incorporating the individual contrasts in a random effects analysis. A second-level random effects analysis used one-sample t tests on the smoothed contrast images obtained in each subject for each comparison of interest, treating subjects as a random variable (Friston, 1998). This analysis estimates the error variance for each condition of interest across subjects, rather than across scans, and therefore provides a stronger generalization to the population from which the data are acquired. In this random effects analysis, the resulting SPM maps (df = 11) were examined in a priori regions (amygdala/sublenticular extended amygdala, hippocampus, STG, insula, ACC, mPFC, vMPFC/OFC).

Whole-brain analysis conducts comparisons in a voxelwise manner, increasing the possibility of false positives unless an appropriate correction for multiple comparisons is used. To restrict the number of comparisons, a small volume correction (SVC) was applied for all activations in a priori regions. SVC was implemented in SPM across two volumes of interest [rectangular box 1: x = 0 ± 70 mm, y = −10 ± 30 mm, z = −5 ± 25 mm; rectangular box 2: x = 0 ± 20 mm, y = 35 ± 35 mm, z = 15 ± 45 mm] defined using the Talairach atlas to isolate central regions (amygdala/SLEA, hippocampus, STG, insula) and anterior midline regions (mPFC, ACC, vMPFC/OFC). Within each SVC, a false discovery rate (FDR) of 0.005 was used to ensure that, on average, no more than 0.5% of activated voxels for each contrast are expected to be false positives (Genovese et al., 2002). In addition, activation foci were required to have a cluster size/extent threshold of greater than 5 contiguous voxels. For activation foci detected between modalities (e.g. Faces > IAPS pictures and IAPS pictures > Faces), regions activated within each modality that fell just below the cluster threshold are also denoted in the tables.
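A sketch of the small-volume FDR logic: keep only voxels inside one of the a priori boxes, then apply a Benjamini-Hochberg threshold at q = 0.005 within that volume. The p-value map and voxel coordinates are simulated here (the actual correction was performed within SPM99), and the box signs follow the reconstruction above.

    import numpy as np

    def bh_fdr_threshold(pvals, q=0.005):
        """Largest p-value surviving Benjamini-Hochberg FDR at level q."""
        p = np.sort(np.asarray(pvals))
        k = np.arange(1, p.size + 1)
        passing = p <= q * k / p.size
        return float(p[passing].max()) if passing.any() else 0.0

    def in_box(coords, center, half):
        """True for coordinates inside a rectangular volume of interest."""
        return np.all(np.abs(coords - np.asarray(center)) <= np.asarray(half),
                      axis=-1)

    rng = np.random.default_rng(0)
    coords = rng.uniform(-90, 90, size=(5000, 3))   # fake voxel coordinates (mm)
    pvals = rng.uniform(size=5000)                  # fake voxelwise p-values
    # Box 1 from the methods: x = 0 +/- 70, y = -10 +/- 30, z = -5 +/- 25 mm.
    mask = in_box(coords, center=(0, -10, -5), half=(70, 30, 25))
    print(mask.sum(), "voxels in volume; FDR threshold p <=",
          bh_fdr_threshold(pvals[mask]))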

Results

On-task performance

Participants responded via button-press to 98% of the images (100% accuracy to faces, 99% accuracy to IAPS pictures, and 96% accuracy to blanks), confirming on-task performance. Reaction times differed depending on modality [F(1,11) = 17.87, P < 0.001]. Reaction times to faces (638.5 ± 91.8 ms (SEM)) were significantly faster than reaction times to IAPS pictures (859.7 ± 142.3 ms; t(11) = 4.23, P < 0.001). No main effect of specific emotion was detected (P > 0.477). The first half of the experiment elicited slower reaction times than the second half for all image types [1st half: 807.4 ± 124.4 ms, 2nd half: 686.0 ± 114.7 ms; paired t test t(11) = 4.91, P < 0.001].

Different lateral eye movement patterns were detected for the different stimulus types [image effect: F(1,8) = 8.01, P < 0.018]. IAPS pictures (SD: 0.339 ± 0.079) elicited more eye movement in the horizontal direction than faces (SD: 0.235 ± 0.065; t(8) = 2.83, P < 0.018) and fixation (SD: 0.210 ± 0.058; t(8) = 1.80, P < 0.101). No differences between specific emotions were detected (P > 0.556). No difference in vertical eye movements between image types was detected (P > 0.319).


Postscan subjective ratings

The stimulus sets were examined to determine the percentage agreement with the predominant-emotion standards determined in the behavioral experiment. In general, more agreement was detected for the emotional faces (83.5%) than for the IAPS pictures (78.9%). In addition, happy images were more consistently identified by participants than any other emotion (Fig. 2).

Fig. 2. fMRI postscan ratings. Each stimulus set (expressive faces and complex IAPS pictures) was rated on several dimensions. (A) Percentage of participants agreeing with the predominant emotion assigned to each image. (B) Mean and standard error of valence (1 = very unpleasant, 5 = neutral, 9 = very pleasant) ratings for images within each assigned discrete emotion category. (C) Mean and standard error of arousal (1 = calm, 9 = excited) ratings for images within each assigned discrete emotion category.

Similar to Experiment 1, emotional IAPS pictures were rated higher on valence (i.e. more pleasant or more unpleasant) for all specific emotions [image type: F(1,11) = 12.46, P < 0.005; paired t tests: P < 0.005]. Happy and neutral IAPS pictures were rated more positively than happy and neutral faces, respectively. Sad, anger, and fear IAPS pictures were rated more negatively than the sad, anger, and fear faces, respectively [emotion: F(5,55) = 76.73, P < 0.001; emotion × image type interaction: F(5,55) = 48.62, P < 0.001]. Emotional IAPS pictures were more arousing than emotional faces for all specific emotions [image type: F(1,11) = 52.04, P < 0.001; paired t tests: P < 0.001; emotion main effect: F(5,55) = 35.94, P < 0.001; image type × emotion interaction: F(5,55) = 13.88, P < 0.001]. Arousal ratings for neutral IAPS pictures and neutral faces were not significantly different [paired t test: t(11) = 1.74, P < 0.110].

fMRI results

Effects of facial expressions and IAPS pictures

Facial expressions analyzed together (contrast: all faces − fixation) and picture stimuli analyzed together (contrast: all IAPS pictures − fixation) activated a similar network: bilateral amygdala, posterior hippocampus, ventromedial prefrontal cortex, and visual cortex (Table 2, Fig. 3). In addition, dorsomedial prefrontal cortex was activated in response to expressive faces [(−3, 57, 33), Z = 3.02, k = 19]. This pattern of activation was consistently present when the specific emotions (happy, sad, anger, fear, and neutral) were analyzed separately (contrast: specific emotion − fixation, e.g. happy face − fixation). The amygdala was activated in response to all emotional facial expressions and to sad and anger IAPS pictures. With the exception of happy facial expressions, the pattern of dorsomedial prefrontal cortex activation was similar to that of the amygdala. The hippocampus was activated in response to all facial expressions (except happy) and also to all IAPS pictures. Ventromedial prefrontal cortex was activated in response to all facial expressions (except happy) and all IAPS pictures (except fear). Visual cortex was activated in response to all facial expressions and all IAPS pictures.

Effects of emotional faces and emotional IAPS pictures

To identify and compare emotionality in these stimulus types, all facial expressions and all IAPS pictures were analyzed relative to neutral (e.g. contrasts: emotional faces − neutral faces, and [emotional faces − neutral faces] − [emotional IAPS pictures − neutral IAPS pictures]). The superior temporal gyrus, insula, and anterior cingulate were activated in response to emotional faces and showed greater activity than for emotional IAPS pictures. Visual cortex was activated in response to emotional pictures and showed greater activity than for facial expressions. Of note, the activations to neutral stimuli did not differ in any region other than the visual cortex [IAPS pictures > faces: (−9, −93, −3), Z = 5.36, k = 656] (Table 3).

Effects of specific emotional faces and emotional IAPS pictures

To identify the effects of each specific emotion, each emotion (happy, sad, anger, and fear) was also analyzed separately (e.g. contrast: happy faces − neutral faces). While amygdala, hippocampus, vMPFC, and visual cortex were commonly activated by faces and pictures when all specific emotions were analyzed together, we observed differential activation in these regions in response to specific emotions, suggesting that some emotions contributed more substantially to the overall results. The amygdala was activated in response to neutral stimuli; however, anger faces showed significantly greater amygdala activity than neutral faces. Similarly, while the hippocampus was activated in response to neutral stimuli, anger and fear stimuli showed significantly greater hippocampal activity than neutral stimuli. Ventromedial prefrontal cortex was activated in response to neutral stimuli, and sad and anger faces and anger IAPS pictures showed greater vMPFC activity than neutral faces and pictures, respectively. Visual cortex was activated in response to neutral stimuli, and happy, sad, and fear IAPS pictures showed greater visual cortical activity than neutral pictures. Additionally, fear and sad IAPS pictures showed greater visual activity than fear and sad faces.

Specific emotions (e.g. contrast: [happy faces − neutral faces] − [happy IAPS pictures − neutral IAPS pictures]) contributed to the overall differences in activation between facial expressions and IAPS pictures in STG, insula, and ACC (Fig. 4). STG was significantly activated in response to all specific emotional faces relative to neutral faces (happy, sad, anger, and fear). All of these activations (except sad) were also significantly larger than the corresponding activations elicited by specific emotional IAPS pictures relative to neutral IAPS pictures. Similarly, the insula was activated in response to all specific emotional faces (happy at a subthreshold level), and all of these activations (except anger) were significantly larger than the corresponding activations elicited by specific emotional IAPS pictures. Anterior cingulate was significantly activated in response to fear and sad (sad at a subthreshold level) facial expressions, and these activations were greater than those to the corresponding IAPS pictures. Anger and sad faces also elicited greater rostral anterior cingulate activity than the corresponding IAPS pictures (Table 4).
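The emotion × image type contrasts reported above reduce to weight vectors over the block regressors of the general linear model. A sketch, ignoring the fixation and run-specific regressors of the actual SPM design:

    # Weight vector for, e.g., [happy faces - neutral faces] -
    # [happy IAPS pictures - neutral IAPS pictures].
    import numpy as np

    conditions = [f"{m}/{e}" for m in ("face", "iaps")
                  for e in ("happy", "neutral", "sad", "anger", "fear")]

    def contrast(plus, minus):
        w = np.zeros(len(conditions))
        for name in plus:
            w[conditions.index(name)] += 1
        for name in minus:
            w[conditions.index(name)] -= 1
        return w

    c = contrast(plus=["face/happy", "iaps/neutral"],
                 minus=["face/neutral", "iaps/happy"])
    print(dict(zip(conditions, c.astype(int))))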

Discussion


In this study, we examined whether expressive faces and IAPS pictures would activate similar brain regions. Analyzed as sets of stimuli, expressive faces and IAPS pictures activated a common pattern of brain regions including the amygdala, posterior hippocampus, ventromedial prefrontal cortex, and visual cortex. These stimuli also activated the superior temporal gyrus, insula, and anterior cingulate differentially, i.e. with more activation in these regions to expressive faces than to IAPS pictures. For the most part, these regions were activated in response to each specific emotion separately; however, some regions responded only to a subset of specific emotions.

Expressive faces and IAPS pictures: common areas of activation

The amygdala, posterior hippocampus, ventromedial prefrontal cortex, and visual cortex were activated by both expressive faces and IAPS pictures analyzed as two sets of emotional stimuli, suggesting that these regions are involved in general emotion processing (i.e. not specific to stimulus type or to a particular process, recognition vs. induction).


Table 2. Emotional faces and IAPS pictures activate a similar network relative to fixation

                                        Faces                             IAPS pictures
Region                                  (x, y, z) a      Z b     k c      (x, y, z) a      Z b     k c
L. amygdala                             (-21, -6, -18)   4.18    36       (-21, -6, -15)   3.81    12
R. amygdala                             (24, -6, -15)    4.05    44       (24, -3, -15)    3.06    6
Hippocampus                             (-24, -30, -3)   3.73    30       (15, -30, -6)    4.57    207
                                        (15, -30, -3)    2.90    12
Ventromedial prefrontal/orbitofrontal
  cortex                                (0, 45, -24)     4.55    49       (3, 45, -21)     3.65    45
Visual                                  (30, -78, -15)   5.68    2559     (-33, -60, -15)  5.79    5607

a Stereotactic coordinates from the MNI atlas, left/right (x), anterior/posterior (y), and superior/inferior (z), respectively. R = right, L = left.
b Z score, significant after Small Volume Correction (SVC thresholded using a false discovery rate [FDR] correction for multiple comparisons of 0.005).
c Spatial extent in cluster size, threshold ≥ 6 voxels.

Consistent with previous findings, negative emotional faces and IAPS pictures activated the amygdala (Hariri et al., 2000, 2003). In addition, we found amygdala activation to happy emotional faces. Amygdala activation has been reported to positive as well as negative facial expressions (Breiter et al., 1996; Morris et al., 1996; Somerville et al., 2004) and IAPS pictures (Liberzon et al., 2003); it is therefore unclear why positive IAPS pictures did not activate the amygdala as well. Emotional faces and IAPS pictures activated the hippocampus, in concert with previous studies (Fried et al., 1997; Lane et al., 1997c). The hippocampus has been shown to be involved in episodic memory and declarative knowledge (Bechara et al., 1995) and, given its extensive connections from extrastriate visual areas including the fusiform gyrus, the hippocampal activation may reflect contextual memory and visual processing triggered by our stimuli. Negative facial expressions and negative IAPS pictures, with the exception of fear, activated the ventromedial prefrontal cortex. The medial prefrontal cortex is thought to be involved in emotional self-awareness (Lane et al., 1997b) and in reexperiencing the 'feelings' of one's emotional past (Damasio, 1999).

Fig. 3. Common regions of activation. SPM t map showing activated visual cortex (visual), ventromedial prefrontal cortex (vMPFC), and amygdala (Amy) to (A) Expressive Faces and (B) IAPS pictures relative to fixation. Posterior hippocampus was also activated (not shown). Activated voxels are displayed with P < 0.005 uncorrected, [k] > 5 voxels threshold.


Table 3. Emotional faces and IAPS pictures activate a different network relative to neutral

                          Faces                           Faces > IAPS pictures           IAPS pictures                   IAPS pictures > Faces
Region                    (x, y, z) a     Z b     k c     (x, y, z) a     Z b     k c     (x, y, z) a     Z b     k c     (x, y, z) a    Z b     k c
Superior temporal gyrus   (60, -39, 6)    4.17    284     (69, -21, 15)   3.22    134
                          (-66, -6, 9)    3.50    113     (-66, -9, 12)   3.64    91
Insular cortex            (-39, 18, -9)   3.56    69      (-27, 6, -12)   3.14    37
Anterior cingulate        (0, 30, 30)     4.10    35      (-3, 30, 30)    3.64    43
Visual cortex                                                                             (12, -75, -12)  3.04    9       (36, -78, 0)   3.81    88
                                                                                          (30, -93, -15)  3.24    25

a Stereotactic coordinates from the MNI atlas, left/right (x), anterior/posterior (y), and superior/inferior (z), respectively.
b Z score, significant after Small Volume Correction (SVC thresholded using a false discovery rate [FDR] correction for multiple comparisons of 0.005).
c Spatial extent in cluster size, threshold ≥ 6 voxels.

In concert, ventromedial prefrontal lesions lead to deficits in recognizing emotion from facial expressions (Hornak et al., 1996). In addition, ventromedial prefrontal cortical activation to IAPS pictures was modulated by the extent of self-association (Phan et al., 2004); thus, the ventromedial prefrontal cortex activation may reflect personal association. Both expressive faces and IAPS pictures activated the visual cortex, which is expected given reports of emotional content modulating visual processing. Two components of emotional processing (arousal and valence) have been shown to contribute to visual cortex activations (Mourao-Miranda et al., 2003), and increased activation in the visual cortex may also reflect the stimulus' significance (Anderson and Phelps, 2001; Pessoa et al., 2002) or increased attention (Lane et al., 1999).

The dorsomedial prefrontal cortex is thought to be involved in general emotional processing (i.e. emotional appraisal/evaluation and emotion regulation) (Phan et al., 2002). Discrete emotions in both types of stimuli activated the dorsomedial prefrontal cortex, but this activation was detected in the main effect of expressive faces and not of IAPS pictures. The less consistent dorsomedial prefrontal cortex activation to these emotional stimuli might have been due to our choice of a passive viewing task. Including a cognitive task (e.g. rating) has been shown to increase dorsomedial prefrontal cortex activation (Taylor et al., 2003), and while the passive viewing task was chosen so as not to bias participants towards emotion recognition or emotion induction, it is possible that subjects were labeling the emotion displayed on each face. Although dMPFC and amygdala activation was consistent across all facial expressions, negative pictures showed dMPFC activation when the amygdala was activated in those conditions as well. Given the anatomical connections, co-activation of these two structures has been hypothesized to reflect a possible influence of cortical inhibitory control (Ongur and Price, 2000). The mPFC has been implicated in emotion regulation (Levesque et al., 2003; Ochsner et al., 2002; Taylor et al., 2003), fear extinction (Milad and Quirk, 2002; Milad et al., 2004; Quirk et al., 2003), and cognitive-emotion interactions (Liberzon et al., 2000; Simpson et al., 2000; Taylor et al., 2003). In this study, sad and anger IAPS pictures showed medial prefrontal cortex and amygdala co-activation, suggesting that dorsomedial prefrontal cortex activation may play a role in the reappraisal of negative emotion (Beauregard et al., 2001; Ochsner et al., 2002; Phan et al., 2005).

Expressive faces and IAPS pictures: differential areas of activation

As a group, expressive faces activated the superior temporal gyrus, insula, and anterior cingulate more than IAPS pictures, despite the fact that expressive faces overall were subjectively rated lower on valence and arousal. Previous studies suggest that facial expressions can evoke the portrayed emotion in the viewer through primitive emotion contagion (Wild et al., 2001). The subjective responses in this study indicate that IAPS pictures are even more potent than facial expressions at inducing changes in the subjective state of emotional valence and arousal. Nevertheless, expressive faces elicited greater activation than IAPS pictures in several regions. The superior temporal gyrus has been shown to be involved in processing variable components of the face such as eye gaze, eyebrows, and mouth gape (Haxby et al., 2000); it is therefore not surprising that expressive faces would activate this region more than IAPS pictures. The insula has been shown to be involved in processing emotional expression in others (Haxby et al., 2002), and insular projections to inferior prefrontal cortex and amygdala may convey motivational and social information from these stimuli (Critchley et al., 2000). The anterior cingulate has been posited to reflect emotional awareness (Lane et al., 1997a) and cognitive-emotion interactions (Bush et al., 2000, 2002).

Generally, the processing differences detected between stimulus types in these regions may partially reflect the fact that faces and IAPS pictures differ in novelty and complexity (Winston et al., 2003). If novelty and complexity of the stimuli do contribute, faster habituation to novelty in faces and slower habituation to novelty in pictures could explain a significant effect in one modality (e.g. expressive faces) and a lack of effect, resulting from sustained activation, in the other (e.g. IAPS pictures). With respect to novelty, faces can be viewed as a relatively unchanging stimulus having consistent facial features (eyes, nose, mouth), despite feature changes (raised brows, gaping mouth, etc.) that depict particular emotional states; each IAPS picture, with its complex contextual scene, is often more unique and novel. Decreased novelty and the resulting habituation of responses to neutral facial expressions may lead to detectable activations (Fischer et al., 2003; Wright et al., 2003); whereas sustained novelty (i.e. similar levels of novelty) between emotional and nonemotional IAPS pictures would lend itself to not detecting activation. Some evidence in the literature supports this idea. Several regions, including superior temporal gyrus, insula, and anterior cingulate, have been shown to respond to novel relative to familiar stimuli (Downar et al., 2002; Tulving et al., 1994). Insula activation to fearful faces was detected during early but not later periods, reflecting initial orienting rather than sustained processing (Williams et al., 2004). Even though IAPS pictures have been shown to habituate with repeated exposure (Phan et al., 2003), IAPS pictures depict emotion-laden scenes portraying a variety of contexts; thus, novelty in IAPS pictures may show reduced habituation (i.e. sustained activation) relative to facial expressions.


Fig. 4. Expressive faces activate anterior cingulate, insula, and superior temporal gyrus more than IAPS pictures. SPM t map showing greater BOLD activity to expressive faces than IAPS pictures in anterior cingulate (ACC), insular cortex (Ins), and superior temporal gyrus (STG). (A) Happy relative to neutral. (B) Sad relative to neutral. (C) Fear relative to neutral. Activated voxels are displayed with P < 0.005 uncorrected, [k] > 5 voxels threshold.

With respect to complexity, faces may be processed more automatically, whereas the complex scenes within IAPS pictures may require additional cognitive processing, leading to sustained activation in all IAPS conditions (including neutral) but not in the face conditions. Neuroimaging studies using masked-face designs elicit emotional networks in the absence of subjective awareness (Whalen et al., 1998), pointing to the automatic processing of facial expressions. In addition, emotions in facial expressions are universally recognized (Ekman, 1992, 1994; Izard, 1994). In support, expressive faces in our study showed more agreement on discrete emotion labels than IAPS pictures. This finding is consistent with studies reporting high agreement on discrete emotion in facial expressions (Carroll and Russell, 1996; Frank and Stennett, 2001). Significant activation may be more easily detected due to automatic, but relatively transient, processing of facial expressions. On the other hand, subjective reports indicate that IAPS pictures have higher valence and arousal, and increasing intensity may introduce ambiguity. The IAPS pictures showed lower percent agreement and longer reaction times than expressive faces, which may reflect increased cognitive demands. Processing the context in relation to past experience and acquired knowledge (memories and associations with the emotional stimulus) may require additional cognitive load.


Table 4. Activations to specific emotions relative to neutral

[The original table lists, for each contrast (Faces; Faces > IAPS pictures; IAPS pictures; IAPS pictures > faces), the stereotactic coordinates (x, y, z), Z score, and cluster extent k of each activation focus, by specific emotion and region.]

Happy: superior temporal gyrus, insular cortex, visual cortex.
Sad: superior temporal gyrus, insular cortex, anterior cingulate, rostral anterior cingulate, ventromedial prefrontal/orbitofrontal cortex, visual cortex.
Anger: R. amygdala, hippocampus, superior temporal gyrus, insular cortex, rostral anterior cingulate, ventromedial prefrontal/orbitofrontal cortex.
Fear: L. hippocampus, superior temporal gyrus, insular cortex, anterior cingulate, visual cortex.

#, part of larger cluster. Stereotactic coordinates from the MNI atlas, left/right (x), anterior/posterior (y), and superior/inferior (z), respectively; R = right, L = left. Z scores significant after Small Volume Correction (SVC thresholded using a false discovery rate [FDR] correction for multiple comparisons of 0.005); subthreshold activations marked d. Spatial extent in cluster size, threshold ≥ 6 voxels.

Right insula and anterior cingulate were activated during an explicit evaluation task, suggesting that these regions may be required to associate personal reflections and memories in order to evaluate a stimulus of increased complexity (Cunningham et al., 2004). As in the case of novelty, a significant effect in one modality (e.g. faces) and a lack of effect in the other (e.g. IAPS pictures) may be due to differences in complexity (i.e. automatic vs. effortful responses, or innate vs. learned associations).

Specific emotions

Specific emotions influenced the subjective ratings and the neuroimaging activation patterns. Subjectively, positive emotions were more easily identified. Few labels for positive emotions exist, whereas increased variability in labeling negative emotions results from the greater number of choices. This interpretation is consistent with the view that positive emotions may be more general, whereas negative emotions may be more specific (Fredrickson, 2001, 2004).


As seen in this study, recognition of anger and fear is often confused for one another. This variability could result from similarly high levels of valence and arousal (Carroll and Russell, 1996; Davis et al., 1995) or from an inability to assign agency. Anger and fear may elicit a more intense reaction, and increasing emotional intensity may result in a more complex profile, illustrating the increased difficulty of distinguishing emotions. Variability in the reports may also result when a complementary emotion is activated rather than mimicked (i.e. an angry face makes the observer fearful).

Some specific emotions contributed more substantially to the regions activated by both expressive faces and IAPS pictures. Amygdala activity has most often been reported in response to fearful stimuli (Breiter et al., 1996; Downar et al., 2001; Hariri et al., 2003; Morris et al., 1996; Phillips et al., 1997; Whalen et al., 2001). In this study, amygdala activity relative to fixation was detected to fearful faces, as to all other specific emotional faces. However, relative to neutral, amygdala activity was detected in response to anger faces only and not to fearful faces. In studies examining anger and fear faces only, amygdala activation was significantly greater for fearful faces than for anger faces; however, these studies did not incorporate additional specific emotions or additional stimulus types (Whalen et al., 2001). Some studies have suggested that this discrepancy may be explained by the inclusion of other conditions that influence activation within the amygdala (Somerville et al., 2004). The hippocampus was activated in response to anger and fear stimuli of both types more significantly than to neutral stimuli. Since this hippocampal activation was present for both faces and pictures, the hippocampus may be more responsive to specific emotion than to stimulus type. Ventromedial prefrontal cortex was activated in response to anger stimuli and sad faces. Since no significant difference between modalities was detected in this region for sad stimuli, the ventromedial prefrontal cortex, like the hippocampus, seems to respond to specific emotion.

All specific emotions contributed to activations in STG and insula in response to faces, and most specific emotions contributed to the differences in activation patterns between expressive faces and IAPS pictures. Happy, anger, and fear contributed to the differences between expressive faces and IAPS pictures in the superior temporal gyrus. This finding may reflect sensitivity for detecting differences between facial expressions presented in isolation, rather than the variability introduced by expressions within a greater context, as presented in IAPS pictures. Insula activation is typically found when recognizing disgust faces (Phillips et al., 1997); however, happy, sad, and fear contributed to the insula differences between modalities in this study. This finding suggests that the insula may play a role in general emotional processing with respect to specific emotion (Phan et al., 2002), but it also points to a preference for processing faces. A subset of negative emotions contributed to the anterior cingulate activation differences between modalities. In the more dorsal regions of the anterior cingulate, the specific emotions of sad and fear showed preferential processing of faces, both activating and showing a significant difference compared to IAPS pictures.
In previous studies, ACC activation was detected when a rating task was compared to a perceptual matching task (Hariri et al., 2003). Thus, this ACC activation may indicate that the negative emotions in faces are evaluated to a greater extent than those in IAPS pictures. In the more rostral regions, sad and anger faces showed greater activity than IAPS pictures.


While the dorsal anterior cingulate showed activation to faces and neither activation nor deactivation to pictures, the rostral anterior cingulate showed a differential activity pattern. From a region of interest (ROI) analysis, we noted that sad and anger faces tended to activate, while sad and anger pictures tended to deactivate, the rostral anterior cingulate cortex. Even though the rACC is implicated in self-induced sadness and depression (Mayberg, 1997; Mayberg et al., 1999), and it may not be surprising to find sadness differentially activated by faces, caution is warranted in interpreting these results given that the findings within each modality are nonsignificant. Overall, though, these findings suggest that an interaction between specific emotion and stimulus type influences activation within anterior cingulate regions.

Several limitations should be noted when interpreting the results of these studies. First, expressive faces and IAPS pictures were balanced in terms of predominant emotion, but valence and arousal varied; expressive faces had lower valence and arousal. This, however, offers an advantage in analyzing the differential results, because the neural activation patterns showed greater activity to expressive faces despite their lower arousal ratings. Second, forced-choice methods for determining specific emotions may reflect response bias and demand characteristics; however, this bias is present for both IAPS pictures and expressive faces. Even though the presence of mixed emotions was minimized, this method may not have completely eliminated the effect. Third, the two stimulus sets may be unbalanced with respect to complexity, intentionality, and sociality. Despite these differences, the critical comparisons attempted to 'subtract out' the effects of stimulus type by comparing each emotional stimulus relative to its neutral counterpart (e.g. emotional face − neutral face and emotional IAPS − neutral IAPS), isolating the contribution of emotionality above and beyond the processing of the stimulus properties contained within each. Even so, the expressive face set may be more balanced, due to the cohesive properties of faces, but more susceptible to habituation effects; whereas the IAPS pictures may have increased variability due to contextual differences but be more resistant to habituation effects. Future investigations are needed to tease apart these components. Additionally, the faces, both emotional and neutral, may be inherently more social than the IAPS pictures, given their role in social communication. Alternatively, IAPS pictures may be characterized by more variable interpersonal interactions. Future studies need to determine how sociality influences these neural activation patterns. Fourth, the limited number of TR volumes collected per emotion condition may have yielded low power, resulting in a failure to detect additional differences. However, it should be noted that even at similar levels of power, the differences between facial expressions and IAPS pictures were detected. Finally, this analysis assumed a canonical hemodynamic response function; emotions elicited by facial expressions and IAPS pictures may have different temporal dynamics (Siegle et al., 2002), and this warrants further exploration.
In summary, this study begins to elucidate the functional regions that are common and different among emotional stimulus types emphasizing emotion recognition or emotion evocation, compared directly in the same subjects. Even though expressive faces may predominantly involve emotion recognition and IAPS pictures may predominantly involve emotion evocation, both recruit similar brain regions, reflected in a common pattern of activation that included amygdala, hippocampus, ventromedial prefrontal cortex, and visual cortex.


This common activation pattern further confirms the role these regions play in general emotional processing. Some brain regions, however, respond preferentially to a particular emotional stimulus type. In this study, a differential pattern of activation was detected in the superior temporal gyrus, insula, and anterior cingulate, with more activation to expressive faces than to IAPS pictures. Inherent properties unique to the specific emotional stimuli (e.g. novelty, complexity, sociality) may have yielded this differential pattern of brain activation. In addition, the effects of specific emotions and their interactions with stimulus type may also contribute to these differential patterns. These findings may aid in determining the optimal stimulus selection for probing general emotion processing, emotion recognition, emotion induction, and specific emotions, although further replication using other emotional stimulus probes (e.g. Ekman faces, evocative films) is needed.

Acknowledgments

We wish to thank Ruben Gur and his colleagues at the University of Pennsylvania for graciously sharing with us their stimulus set of facial expressions, and Margaret Bradley, Peter Lang, and the NIMH Center for the Study of Emotion and Attention (CSEA) at the University of Florida for providing us with the set of IAPS pictures. Supported by the Veterans Education and Research Association of Michigan and a National Institute of Mental Health (NIMH) National Research Service Award (NRSA), F31MH069003, to JCB.

References

Adolphs, R., Baron-Cohen, S., Tranel, D., 2002. Impaired recognition of social emotions following amygdala damage. J. Cogn. Neurosci. 14 (8), 1264–1274.
Anderson, A.K., Phelps, E.A., 2001. Lesions of the human amygdala impair enhanced perception of emotionally salient events. Nature 411 (6835), 305–309.
Ashburner, J., Neelin, P., Collins, D.L., Evans, A., Friston, K., 1997. Incorporating prior knowledge into image registration. NeuroImage 6 (4), 344–352.
Beauregard, M., Levesque, J., Bourgouin, P., 2001. Neural correlates of conscious self-regulation of emotion. J. Neurosci. 21 (18), RC165.
Bechara, A., Tranel, D., Damasio, H., Adolphs, R., Rockland, C., Damasio, A.R., 1995. Double dissociation of conditioning and declarative knowledge relative to the amygdala and hippocampus in humans. Science 269 (5227), 1115–1118.
Blair, R.J., Morris, J.S., Frith, C.D., Perrett, D.I., Dolan, R.J., 1999. Dissociable neural responses to facial expressions of sadness and anger. Brain 122 (Pt. 5), 883–893.
Breiter, H.C., Etcoff, N.L., Whalen, P.J., Kennedy, W.A., Rauch, S.L., Buckner, R.L., et al., 1996. Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17 (5), 875–887.
Bush, G., Luu, P., Posner, M.I., 2000. Cognitive and emotional influences in anterior cingulate cortex. Trends Cogn. Sci. 4 (6), 215–222.
Bush, G., Vogt, B.A., Holmes, J., Dale, A.M., Greve, D., Jenike, M.A., et al., 2002. Dorsal anterior cingulate cortex: a role in reward-based decision making. Proc. Natl. Acad. Sci. U. S. A. 99 (1), 523–528.
Calder, A.J., Burton, A.M., Miller, P., Young, A.W., Akamatsu, S., 2001. A principal component analysis of facial expressions. Vision Res. 41 (9), 1179–1208.
Carroll, J.M., Russell, J.A., 1996. Do facial expressions signal specific emotions? Judging emotion from the face in context. J. Pers. Soc. Psychol. 70 (2), 205–218.

Clark, B.M., Siddle, D.A., Bond, N.W., 1992. Effects of social anxiety and facial expression on habituation of the electrodermal orienting response. Biol. Psychol. 33 (2–3), 211–223.
Critchley, H., Daly, E., Phillips, M., Brammer, M., Bullmore, E., Williams, S., et al., 2000. Explicit and implicit neural mechanisms for processing of social information from facial expressions: a functional magnetic resonance imaging study. Hum. Brain Mapp. 9 (2), 93–105.
Cunningham, W.A., Raye, C.L., Johnson, M.K., 2004. Implicit and explicit evaluation: fMRI correlates of valence, emotional intensity, and control in the processing of attitudes. J. Cogn. Neurosci. 16 (10), 1717–1729.
Damasio, A.R., 1999. The Feeling of What Happens: Body and Emotion in the Making of Consciousness, first ed. Harcourt Brace, New York.
Darwin, C., 1998. The Expression of the Emotions in Man and Animals, third ed. Oxford Univ. Press, New York.
Davidson, R.J., Jackson, D.C., Kalin, N.H., 2000. Emotion, plasticity, context, and regulation: perspectives from affective neuroscience. Psychol. Bull. 126 (6), 890–909.
Davis, W.J., Rahman, M.A., Smith, L.J., Burns, A., Senecal, L., McArthur, D., et al., 1995. Properties of human affect induced by static color slides (IAPS): dimensional, categorical and electromyographic analysis. Biol. Psychol. 41 (3), 229–253.
Dolan, R.J., Fletcher, P., Morris, J., Kapur, N., Deakin, J.F., Frith, C.D., 1996. Neural activation during covert processing of positive emotional facial expressions. NeuroImage 4 (3 Pt. 1), 194–200.
Downar, J., Crawley, A.P., Mikulis, D.J., Davis, K.D., 2001. The effect of task relevance on the cortical response to changes in visual and auditory stimuli: an event-related fMRI study. NeuroImage 14 (6), 1256–1267.
Downar, J., Crawley, A.P., Mikulis, D.J., Davis, K.D., 2002. A cortical network sensitive to stimulus salience in a neutral behavioral context across multiple sensory modalities. J. Neurophysiol. 87 (1), 615–620.
Ekman, P., 1992. Are there basic emotions? Psychol. Rev. 99 (3), 550–553.
Ekman, P., 1994. Strong evidence for universals in facial expressions: a reply to Russell's mistaken critique. Psychol. Bull. 115 (2), 268–287.
Esteves, F., Ohman, A., 1993. Masking the face: recognition of emotional facial expressions as a function of the parameters of backward masking. Scand. J. Psychol. 34 (1), 1–18.
Fischer, H., Wright, C.I., Whalen, P.J., McInerney, S.C., Shin, L.M., Rauch, S.L., 2003. Brain habituation during repeated exposure to fearful and neutral faces: a functional MRI study. Brain Res. Bull. 59 (5), 387–392.
Frank, M.G., Stennett, J., 2001. The forced-choice paradigm and the perception of facial expressions of emotion. J. Pers. Soc. Psychol. 80 (1), 75–85.
Fredrickson, B.L., 2001. The role of positive emotions in positive psychology. The broaden-and-build theory of positive emotions. Am. Psychol. 56 (3), 218–226.
Fredrickson, B.L., 2004. The broaden-and-build theory of positive emotions. Philos. Trans. R. Soc. London, Ser. B Biol. Sci. 359 (1449), 1367–1378.
Fried, I., MacDonald, K.A., Wilson, C.L., 1997. Single neuron activity in human hippocampus and amygdala during recognition of faces and objects. Neuron 18 (5), 753–765.
Friston, K.J., 1998. Generalisability, random effects and population inference. NeuroImage 7, S754.
Friston, K.J., Holmes, A.P., Worsley, K.J., Poline, J.B., Frith, C.D., Frackowiak, R.S., 1995. Statistical parametric maps in functional imaging: a general linear approach. Hum. Brain Mapp. 2, 189–210.
Geday, J., Gjedde, A., Boldsen, A.S., Kupers, R., 2003. Emotional valence modulates activity in the posterior fusiform gyrus and inferior medial prefrontal cortex in social perception. NeuroImage 18 (3), 675–684.
Genovese, C.R., Lazar, N.A., Nichols, T., 2002. Thresholding of statistical maps in functional neuroimaging using the false discovery rate. NeuroImage 15 (4), 870–878.
Gur, R.C., Schroeder, L., Turner, T., McGrath, C., Chan, R.M., Turetsky, B.I., et al., 2002. Brain activation during facial emotion processing. NeuroImage 16 (3 Pt. 1), 651–662.
Halberstadt, J.B., Niedenthal, P.M., 1997. Emotional state and the use of stimulus dimensions in judgment. J. Pers. Soc. Psychol. 72 (5), 1017–1033.
Hariri, A.R., Bookheimer, S.Y., Mazziotta, J.C., 2000. Modulating emotional responses: effects of a neocortical network on the limbic system. NeuroReport 11 (1), 43–48.
Hariri, A.R., Mattay, V.S., Tessitore, A., Fera, F., Weinberger, D.R., 2003. Neocortical modulation of the amygdala response to fearful stimuli. Biol. Psychiatry 53 (6), 494–501.
Hatfield, E., Cacioppo, J.T., Rapson, R.L., 1992. Primitive emotional contagion. Rev. Person. Soc. Psychol. 14, 151–177.
Haxby, J.V., Petit, L., Ungerleider, L.G., Courtney, S.M., 2000. Distinguishing the functional roles of multiple regions in distributed neural systems for visual working memory. NeuroImage 11 (5 Pt. 1), 380–391.
Haxby, J.V., Hoffman, E.A., Gobbini, M.I., 2002. Human neural systems for face recognition and social communication. Biol. Psychiatry 51 (1), 59–67.
Hornak, J., Rolls, E.T., Wade, D., 1996. Face and voice expression identification in patients with emotional and behavioural changes following ventral frontal lobe damage. Neuropsychologia 34 (4), 247–261.
Izard, C.E., 1994. Innate and universal facial expressions: evidence from developmental and cross-cultural research. Psychol. Bull. 115 (2), 288–299.
Killgore, W.D., Yurgelun-Todd, D.A., 2004. Activation of the amygdala and anterior cingulate during nonconscious processing of sad versus happy faces. NeuroImage 21 (4), 1215–1223.
Kim, H., Somerville, L.H., Johnstone, T., Alexander, A.L., Whalen, P.J., 2003. Inverse amygdala and medial prefrontal cortex responses to surprised faces. NeuroReport 14 (18), 2317–2322.
Lane, R.D., Fink, G.R., Chau, P.M., Dolan, R.J., 1997a. Neural activation during selective attention to subjective emotional responses. NeuroReport 8 (18), 3969–3972.
Lane, R.D., Reiman, E.M., Ahern, G.L., Schwartz, G.E., Davidson, R.J., 1997b. Neuroanatomical correlates of happiness, sadness, and disgust. Am. J. Psychiatry 154 (7), 926–933.
Lane, R.D., Reiman, E.M., Bradley, M.M., Lang, P.J., Ahern, G.L., Davidson, R.J., et al., 1997c. Neuroanatomical correlates of pleasant and unpleasant emotion. Neuropsychologia 35 (11), 1437–1444.
Lane, R.D., Chua, P.M., Dolan, R.J., 1999. Common effects of emotional valence, arousal and attention on neural activation during visual processing of pictures. Neuropsychologia 37 (9), 989–997.
Lang, P.J., Greenwald, M.K., Bradley, M.M., Hamm, A.O., 1993. Looking at pictures: affective, facial, visceral, and behavioral reactions. Psychophysiology 30 (3), 261–273.
Lang, P.J., Bradley, M.M., Cuthbert, B.N., 1997. International Affective Picture System (IAPS): Technical Manual and Affective Ratings. NIMH Center for the Study of Emotion and Attention, University of Florida, Gainesville, FL.
LeDoux, J.E., 2000. Emotion circuits in the brain. Annu. Rev. Neurosci. 23, 155–184.
Levesque, J., Joanette, Y., Mensour, B., Beaudoin, G., Leroux, J.M., Bourgouin, P., et al., 2003. Neural correlates of sad feelings in healthy girls. Neuroscience 121 (3), 545–551.
Liberzon, I., Taylor, S.F., Fig, L.M., Decker, L.R., Koeppe, R.A., Minoshima, S., 2000. Limbic activation and psychophysiologic responses to aversive visual stimuli. Interaction with cognitive task. Neuropsychopharmacology 23 (5), 508–516.
Liberzon, I., Phan, K.L., Decker, L.R., Taylor, S.F., 2003. Extended amygdala and emotional salience: a PET activation study of positive and negative affect. Neuropsychopharmacology 28 (4), 726–733.
Mayberg, H.S., 1997. Limbic-cortical dysregulation: a proposed model of depression. J. Neuropsychiatry Clin. Neurosci. 9 (3), 471–481.
Mayberg, H.S., Liotti, M., Brannan, S.K., McGinnis, S., Mahurin, R.K., Jerabek, P.A., et al., 1999. Reciprocal limbic-cortical function and negative mood: converging PET findings in depression and normal sadness. Am. J. Psychiatry 156 (5), 675–682.


Milad, M.R., Quirk, G.J., 2002. Neurons in medial prefrontal cortex signal memory for fear extinction. Nature 420 (6911), 70–74.
Milad, M.R., Vidal-Gonzalez, I., Quirk, G.J., 2004. Electrical stimulation of medial prefrontal cortex reduces conditioned fear in a temporally specific manner. Behav. Neurosci. 118 (2), 389–394.
Morris, J.S., Frith, C.D., Perrett, D.I., Rowland, D., Young, A.W., Calder, A.J., 1996. A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383 (6603), 812–815.
Morris, J.S., Friston, K.J., Buchel, C., Frith, C.D., Young, A.W., Calder, A.J., et al., 1998. A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121 (Pt. 1), 47–57.
Mourao-Miranda, J., Volchan, E., Moll, J., de Oliveira-Souza, R., Oliveira, L., Bramati, I., et al., 2003. Contributions of stimulus valence and arousal to visual activation during emotional perception. NeuroImage 20 (4), 1955–1963.
Narumoto, J., Okada, T., Sadato, N., Fukui, K., Yonekura, Y., 2001. Attention to emotion modulates fMRI activity in human right superior temporal sulcus. Brain Res. Cogn. Brain Res. 12 (2), 225–231.
Ochsner, K.N., Bunge, S.A., Gross, J.J., Gabrieli, J.D., 2002. Rethinking feelings: an fMRI study of the cognitive regulation of emotion. J. Cogn. Neurosci. 14 (8), 1215–1229.
Ongur, D., Price, J.L., 2000. The organization of networks within the orbital and medial prefrontal cortex of rats, monkeys and humans. Cereb. Cortex 10 (3), 206–219.
Oppenheim, A., Schafer, R., 1989. Discrete-Time Signal Processing. Prentice Hall, Englewood Cliffs, NJ.
Pessoa, L., Kastner, S., Ungerleider, L.G., 2002. Attentional control of the processing of neural and emotional stimuli. Brain Res. Cogn. Brain Res. 15 (1), 31–45.
Phan, K.L., Wager, T., Taylor, S.F., Liberzon, I., 2002. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. NeuroImage 16 (2), 331–348.
Phan, K.L., Liberzon, I., Welsh, R.C., Britton, J.C., Taylor, S.F., 2003. Habituation of rostral anterior cingulate cortex to repeated emotionally salient pictures. Neuropsychopharmacology 28 (7), 1344–1350.
Phan, K.L., Taylor, S.F., Welsh, R.C., Ho, S.H., Britton, J.C., Liberzon, I., 2004. Neural correlates of individual ratings of emotional salience: a trial-related fMRI study. NeuroImage 21 (2), 768–780.
Phan, K.L., Fitzgerald, D.A., Nathan, P.J., Moore, G.J., Uhde, T.W., Tancer, M.E., 2005. Neural substrates for voluntary suppression of negative affect: a functional magnetic resonance imaging study. Biol. Psychiatry 57 (3), 210–219.
Phillips, M.L., Young, A.W., Senior, C., Brammer, M., Andrew, C., Calder, A.J., et al., 1997. A specific neural substrate for perceiving facial expressions of disgust. Nature 389 (6650), 495–498.
Quirk, G.J., Likhtik, E., Pelletier, J.G., Pare, D., 2003. Stimulation of medial prefrontal cortex decreases the responsiveness of central amygdala output neurons. J. Neurosci. 23 (25), 8800–8807.
Schneider, W., Eschman, A., Zuccolotto, A., 2002a. E-Prime Reference Guide. Psychology Software Tools, Inc., Pittsburgh.
Schneider, W., Eschman, A., Zuccolotto, A., 2002b. E-Prime User's Guide. Psychology Software Tools, Inc., Pittsburgh.
Sheehan, D., Janavs, J., Baker, R., Harnett-Sheehan, K., Knapp, E., Sheehan, M., 1998. Mini International Neuropsychiatric Interview, English Version 5.0.0, DSM-IV.
Siegle, G.J., Steinhauer, S.R., Thase, M.E., Stenger, V.A., Carter, C.S., 2002. Can't shake that feeling: event-related fMRI assessment of sustained amygdala activity in response to emotional information in depressed individuals. Biol. Psychiatry 51 (9), 693–707.
Simpson, J.R., Ongur, D., Akbudak, E., Conturo, T.E., Ollinger, J.M., Snyder, A.Z., et al., 2000. The emotional modulation of cognitive processing: an fMRI study. J. Cogn. Neurosci. 12 (Suppl. 2), 157–170.
Somerville, L.H., Kim, H., Johnstone, T., Alexander, A.L., Whalen, P.J., 2004. Human amygdala responses during presentation of happy and neutral faces: correlations with state anxiety. Biol. Psychiatry 55 (9), 897–903.


Taylor, S.F., Phan, K.L., Decker, L.R., Liberzon, I., 2003. Subjective rating of emotionally salient stimuli modulates neural activity. NeuroImage 18 (3), 650–659.
Tulving, E., Markowitsch, H.J., Kapur, S., Habib, R., Houle, S., 1994. Novelty encoding networks in the human brain: positron emission tomography data. NeuroReport 5 (18), 2525–2528.
Whalen, P.J., Rauch, S.L., Etcoff, N.L., McInerney, S.C., Lee, M.B., Jenike, M.A., 1998. Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J. Neurosci. 18 (1), 411–418.
Whalen, P.J., Shin, L.M., McInerney, S.C., Fischer, H., Wright, C.I., Rauch, S.L., 2001. A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion 1 (1), 70–83.
Wild, B., Erb, M., Bartels, M., 2001. Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: quality, quantity, time course and gender differences. Psychiatry Res. 102 (2), 109–124.
Williams, L.M., Brown, K.J., Das, P., Boucsein, W., Sokolov, E.N., Brammer, M.J., 2004. The dynamics of cortico-amygdala and autonomic activity over the experimental time course of fear perception. Brain Res. Cogn. Brain Res. 21 (1), 114–123.
Winston, J.S., O'Doherty, J., Dolan, R.J., 2003. Common and distinct neural responses during direct and incidental processing of multiple facial emotions. NeuroImage 20 (1), 84–97.
Woods, R.P., Grafton, S.T., Watson, J.D., Sicotte, N.L., Mazziotta, J.C., 1998. Automated image registration: II. Intersubject validation of linear and nonlinear models. J. Comput. Assist. Tomogr. 22 (1), 153–165.
Wright, C.I., Martis, B., Schwartz, C.E., Shin, L.M., Fischer, H.H., McMullin, K., et al., 2003. Novelty responses and differential effects of order in the amygdala, substantia innominata, and inferior temporal cortex. NeuroImage 18 (3), 660–669.
Yang, T.T., Menon, V., Eliez, S., Blasey, C., White, C.D., Reid, A.J., et al., 2002a. Amygdalar activation associated with positive and negative facial expressions. NeuroReport 13 (14), 1737–1741.
Yang, Y., Gu, H., Zhan, W., Xu, S., Silbersweig, D.A., Stern, E., 2002b. Simultaneous perfusion and BOLD imaging using reverse spiral scanning at 3T: characterization of functional contrast and susceptibility artifacts. Magn. Reson. Med. 48 (2), 278–289.
