Explicit and Incidental Facial Expression Processing: An fMRI Study


NeuroImage 14, 465–473 (2001). doi:10.1006/nimg.2001.0811. Available online at http://www.idealibrary.com

Maria Luisa Gorno-Tempini,*,† Samanta Pradelli,* Marco Serafini,‡ Giuseppe Pagnoni,§ Patrizia Baraldi,§ Carlo Porro,¶ Roberto Nicoletti,∥ Carlo Umiltà,** and Paolo Nichelli*

*Dipartimento di Patologia Neuropsicosensoriale, Università di Modena e Reggio Emilia, Modena, Italy; †Wellcome Department of Cognitive Neurology, Institute of Neurology, London, United Kingdom; ‡ASL Modena, Modena, Italy; §Dipartimento di Scienze Biomediche, Università di Modena e Reggio Emilia, Modena, Italy; ¶Dipartimento di Scienze e Tecnologie Biomediche, Università di Udine, Udine, Italy; ∥Dipartimento di Psicologia dello Sviluppo e della Socializzazione, Padova, Italy; and **Dipartimento di Psicologia Generale, Università di Padova, Padova, Italy

Received September 22, 2000

Considerable evidence indicates that processing facial expression involves both subcortical (amygdala and basal ganglia) and cortical (occipito-temporal, orbitofrontal, and prefrontal cortex) structures. However, the specificity of these regions for single types of emotions and for the cognitive demands of expression processing is still unclear. This functional magnetic resonance imaging (fMRI) study investigated the neural correlates of incidental and explicit processing of the emotional content of faces expressing either disgust or happiness. Subjects were examined while they viewed neutral, disgusted, or happy faces. The incidental task required subjects to decide about face gender, the explicit task to decide about face expression. In the control task subjects were requested to detect a white square in a greyscale mosaic stimulus. Results showed that the left inferior frontal cortex and the bilateral occipito-temporal junction responded equally to all face conditions. Several cortical and subcortical regions were modulated by task type and by facial expression. The right neostriatum and left amygdala were activated when subjects made explicit judgements of disgust, the bilateral orbitofrontal cortex when they made judgements of happiness, and the right frontal and insular cortex when they made judgements about either emotion. © 2001 Academic Press

Key Words: fMRI; disgust; happiness; facial expression; emotion; explicit processing.

INTRODUCTION

Converging evidence indicates that both subcortical structures, such as the amygdala and the basal ganglia, and cortical regions, such as prefrontal and occipito-temporal areas, are involved in processing facial expressions (Adolphs et al., 1999; Blair et al., 1999, 2000; Hornak et al., 1996; Morris et al., 1996, 1998a; Nakamura et al., 1999; Phillips et al., 1997; Sprengelmeyer et al., 1996). However, the contribution of these regions to the recognition of different facial expressions, and their specific role in emotional processing, are still unresolved issues. For instance, a number of clinical (Broks et al., 1998) and neuroimaging studies have shown that the amygdala is involved in the recognition of fearful expressions. Yet its role in emotions other than fear is still debated. Patient data suggest that this structure could also be involved in processing other expressions denoting negative emotions, such as anger (Adolphs et al., 1999; Calder et al., 1996; Scott et al., 1997), disgust (Adolphs et al., 1999), and sadness (Sprengelmeyer et al., 1999). Functional imaging data have not provided conclusive evidence on this matter. Amygdalar activity was enhanced by increasing intensity of sad, but not angry (Blair et al., 1999) or disgusted, expressions (Phillips et al., 1997). On the other hand, the amygdala was modulated even by unconscious exposure to conditioned angry expressions (Morris et al., 1998b). Furthermore, one functional imaging study has suggested its involvement in processing happy expressions as well (Breiter et al., 1996).

The role of the basal ganglia is also a matter of controversy. Huntington's disease patients (Gray et al., 1997; Sprengelmeyer et al., 1996) are especially impaired in recognising the facial expression of disgust, suggesting a role of the caudate nucleus in processing this emotion. More recently, Calder et al. (2000) reported a patient who showed a selective deficit in recognising disgust signals from multiple modalities after a left hemisphere infarct involving the insula, putamen, globus pallidus, and the head of the caudate nucleus. However, the sole functional neuroimaging study that has investigated this question (Phillips et al., 1997) found that the insula, but not the caudate, was activated in response to increasingly disgusted faces. Furthermore, functional neuroimaging has shown that multiple regions within the prefrontal cortex are activated by perceiving facial expressions. Yet their specific role in facial expression recognition and, more generally, in processing emotions remains controversial.


generally, in processing emotions remains controversial. While there is considerable evidence suggesting that the orbitofrontal cortex might play a role in emotional and social behaviour (Blair et al., 1999, 2000; Petit and Haxby, 1999; Rolls, 2000), the specificity of the more lateral prefrontal regions is less clear. These lateral areas have been activated in studies of facial expression processing (George et al., 1993; Nakamura et al., 1999). However, they have also been repeatedly involved in more general memory functions (Courtney et al., 1998; Fletcher et al., 1998; Lepage et al., 2000; Haxby et al., 2000; Henson et al., 1999), suggesting that their activation might not be specific to emotional processing. In summary, while the network of brain regions sustaining facial expression processing is well established, the relative contribution of its components is still unclear. Inconsistency across neuroimaging studies could be due to differences in the cognitive task utilised. Incidental tasks, such as gender decision, may be appropriate in revealing responses to emotions that denote great survival value, such as fear (Morris et al., 1996), but may be less effective for other types of expressions. Furthermore, cortical responses in regions such as the occipital and prefrontal lobe are likely to be greatly influenced by demands that different cognitive tasks place on perceptual and memory retrieval processes. We investigated the effect of task modulation on brain responses to facial expression with functional magnetic resonance imaging (fMRI). Ten subjects were requested to process faces expressing disgust or happiness while performing either an incidental task (gender decision) or an explicit emotional judgement (disgust or happiness). We hypothesized that structures showing a modulation due to both task and type of emotion, are specifically involved in emotional processing. On the other hand, areas modulated by task demands, but not by emotion type, could be involved in more general cognitive functions. MATERIALS AND METHODS Subjects Ten right-handed subjects, five males and five females, between 25 and 30 years of age, participated in the experiment. Handedness was assessed by means of the Edinburgh questionnaire (Oldfield, 1971). Exclusion criteria included a history of past or present neurological or psychiatric illness. All subjects gave informed written consent after the nature of the experiment was explained. The Ethic Committee of the University of Modena and Reggio Emilia approved the study.

Scanning Parameters

A General Electric Signa Horizon High Speed 77 system at 1.5 Tesla was used to acquire both structural T1 and gradient echoplanar T2* BOLD-contrast images. Echoplanar images were collected using a single-shot, blipped, gradient echo echoplanar pulse sequence developed by Peter Jezzard at the National Institutes of Health (Bethesda, MD). To maximize field homogeneity, fine manual prescan and localized shimming were performed at the beginning of the first session. Each BOLD echoplanar volume consisted of 16 transverse slices (in-plane matrix 64 × 64; voxel size 3.75 × 3.75 × 5 mm; TE = 40 ms; TR = 3380 ms). Sixty-three volumes were collected in each scanning session and each subject underwent four sessions, for a total of 252 volumes. A blocked design was used (see following section) and nine volumes were acquired in each block. A T1-weighted high-resolution MRI of each subject was acquired to facilitate anatomical localization.

Stimuli and Experimental Design

Pictures of 10 individuals (6 males, 4 females) were used in this study. Each individual showed a disgusted, happy, or neutral expression in different pictures. Faces were black and white photographs taken from a standard set of expressions of emotion (Ekman and Friesen, 1976). According to the data from Ekman and Friesen, the mean percentage of emotion recognition was 99.10 (SD = 2.51) for happy faces and 93.10 (SD = 5.2) for disgusted faces. Control stimuli were prepared by applying a 512-pixel Adobe Photoshop mosaic filter to the face pictures, thus obtaining greyscale images formed of 8 × 11 squares, no longer recognisable as faces.

The experiment used a blocked paradigm with a 2 × 2 factorial design plus a control condition. One factor was type of emotion: either disgust (D) or happiness (H). The second factor was task: subjects performed either explicit (E) expression recognition (i.e., disgust/neutral or happiness/neutral discrimination) or gender decision (incidental processing, I), i.e., male/female discrimination. Four experimental conditions were thus created: explicit recognition of disgusted or happy faces (ED and EH) in the emotion recognition task and incidental processing of the same facial expressions (ID and IH) in the gender decision task. In the control condition (C), subjects were asked to detect a white square in the center of the control stimuli. Stimuli were presented one at a time for 2500 ms with an interstimulus interval of 3800 ms. Six disgusted or happy faces and two neutral faces were presented in each 30-s block. Faces of the same individual were not repeated within one block; however, each face was presented expressing each of the three emotions within the experiment. To maintain the same proportion of motor responses in the gender decision task, six faces belonged to one sex and two to the other.
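As a concrete illustration of this block structure, the sketch below composes one 30-s block under the constraints just described: eight different identities, six of one sex and two of the other, six faces showing the block's emotion and two neutral. This is hypothetical code for illustration only, not the authors' stimulus-delivery software, and the identity labels are invented.

```python
import random

# Hypothetical stimulus set: 10 Ekman-Friesen identities (6 male, 4 female),
# each available with disgusted, happy, and neutral expressions.
MALES = [f"m{i}" for i in range(1, 7)]
FEMALES = [f"f{i}" for i in range(1, 5)]

def make_block(emotion, seed=None):
    """Compose one 30-s block: eight different identities (six of one sex,
    two of the other), six showing `emotion` and two neutral, in random order."""
    rng = random.Random(seed)
    identities = rng.sample(MALES, 6) + rng.sample(FEMALES, 2)
    rng.shuffle(identities)
    stimuli = [(face, emotion) for face in identities[:6]]
    stimuli += [(face, "neutral") for face in identities[6:]]
    rng.shuffle(stimuli)
    return stimuli

# Example: one block of disgusted faces, as used in the ED and ID conditions.
print(make_block("disgust", seed=0))
```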


Before the beginning of each block, written instructions were presented to inform subjects of the task they had to perform. Control blocks were interleaved between the experimental blocks, so that each scanning session comprised seven blocks, e.g., ED, C, IH, C, EH, C, ID. Each subject underwent four scanning sessions. The presentation order of faces within a block was randomized, while conditions were counterbalanced within and between subjects. Responses were given using a two-position button. Accuracy and reaction time data were collected during the scanning sessions.

Image Processing and Statistical Analysis

Data were preprocessed and analyzed using SPM 97 (Wellcome Department of Cognitive Neurology, London, UK; http://www.fil.ion.ucl.ac.uk; Friston et al., 1995a). All functional volumes for each subject were realigned to the first volume acquired. Images were then spatially normalised (Friston et al., 1995b) to the Montreal Neurological Institute (MNI) standard brain (Cocosco et al., 1997) in the space of Talairach and Tournoux (1988) and resampled to a voxel size of 4 × 4 × 4 mm. All volumes were then smoothed with an 8-mm full width at half maximum isotropic Gaussian kernel.

The group statistical analysis modeled each of the four sessions from each subject separately, yielding an analysis comprising 40 sessions. The experimental conditions were modeled as boxcar functions convolved with the hemodynamic response function. Condition effects were estimated according to the general linear model, and regionally specific effects were compared using linear contrasts. Each contrast produced a statistical parametric map of the t statistic for each voxel, which was subsequently transformed to the unit normal Z distribution. We performed contrasts that enabled us to investigate which brain regions showed common or differential effects of the type of emotion, of the task performed, or of the interaction between emotion and task (see Results). We report activations that reached a level of significance of P < 0.001, uncorrected for multiple comparisons, in areas of interest and of P < 0.05, corrected for the entire volume, in other regions. The choice of the areas of interest was based on previous findings from the neuroimaging and neuropsychological studies reviewed in the Introduction; they comprised the amygdala, basal ganglia, and fusiform, orbitofrontal, and insular cortex. Anatomical localisation of the areas of interest was obtained by comparing the group activations, superimposed on the MNI standard brain, with the Duvernoy (1999) and Talairach and Tournoux (1988) atlases.
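The analysis steps described above (boxcar regressors convolved with a hemodynamic response function, a general linear model fit, and linear contrasts converted to t statistics) can be sketched for a single session and a single voxel as follows. This is a simplified Python illustration with hypothetical block onsets and a placeholder time series, not the SPM 97 code used in the study.

```python
import numpy as np
from scipy.stats import gamma

TR = 3.38            # s, as in the acquisition protocol
n_scans = 63         # volumes per session
block_len = 9        # volumes per block

def canonical_hrf(tr, duration=32.0):
    """Double-gamma haemodynamic response function (SPM-like shape)."""
    t = np.arange(0.0, duration, tr)
    h = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0   # peak near 6 s, undershoot near 16 s
    return h / h.sum()

def boxcar(onsets, n_scans, block_len):
    """1 during task blocks, 0 elsewhere (units of scans)."""
    x = np.zeros(n_scans)
    for onset in onsets:
        x[onset:onset + block_len] = 1.0
    return x

# Hypothetical block onsets (in scans) for one session ordered ED, C, IH, C, EH, C, ID;
# the control blocks form the implicit baseline.
onsets = {"ED": [0], "ID": [54], "EH": [36], "IH": [18]}

hrf = canonical_hrf(TR)
X = np.column_stack(
    [np.convolve(boxcar(o, n_scans, block_len), hrf)[:n_scans] for o in onsets.values()]
    + [np.ones(n_scans)]                     # constant term
)

# Fit the GLM at a single voxel and test a linear contrast, e.g. ED - EH.
y = np.random.randn(n_scans)                 # placeholder time series, for illustration only
beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
c = np.array([1.0, 0.0, -1.0, 0.0, 0.0])     # columns: ED, ID, EH, IH, constant
dof = n_scans - np.linalg.matrix_rank(X)
sigma2 = np.sum((y - X @ beta) ** 2) / dof
t_stat = (c @ beta) / np.sqrt(sigma2 * (c @ np.linalg.pinv(X.T @ X) @ c))
print("t =", float(t_stat))
```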

FIG. 1. Mean and standard error of the reaction times of each experimental condition.

RESULTS

Behavioral Results

Accuracy was greater than 98% in all conditions. Differences between reaction times in the four experimental conditions were investigated using a 2 × 2 factorial analysis of variance (ANOVA) with type of emotion (D and H) as one factor and task (E and I) as the other, testing for main effects and interactions (Fig. 1). The results showed only a significant effect of task (P < 0.005). A post hoc Scheffé test showed that subjects were slower in the explicit than in the incidental task (P < 0.01).

Neuroimaging Results

1. Areas activated by all stimuli versus control (Table 1). To identify regions common to all experimental conditions versus control, the main effect of stimuli versus control was inclusively masked with each simple main effect (ED-C, ID-C, EH-C, and IH-C) at P < 0.001. Bilateral activation of the posterior middle temporal gyrus (BA 37/21) was found. In the left hemisphere this activation spread to the superior temporal gyrus (BA 22) and to the temporo-parietal junction (BA 39). In the right hemisphere the activation extended posteriorly and inferiorly into the middle occipital gyrus (BA 19). In addition, a cluster of activation comprising the more posterior portion of the left amygdala, spreading to the anterior hippocampus, was present for all face stimuli compared to the controls, regardless of type of expression and task. We found this region activated in each of the four experimental conditions; however, we noted a trend toward greater activation in the explicit recognition of disgust (see below).
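The inclusive masking used here amounts to requiring that a voxel pass threshold both in the overall faces-versus-control contrast and in every individual condition-versus-control contrast. A minimal sketch with hypothetical Z maps, not the SPM 97 implementation:

```python
import numpy as np

# Hypothetical Z maps (one value per voxel) for the main effect of all faces
# versus control and for the four simple main effects versus control.
shape = (40, 48, 40)
rng = np.random.default_rng(1)
z_main = rng.normal(size=shape)
z_simple = {name: rng.normal(size=shape) for name in ("ED-C", "ID-C", "EH-C", "IH-C")}

z_thresh = 3.09   # roughly P < 0.001, one-tailed

# Inclusive masking: keep only voxels where the main effect is above threshold
# AND every simple main effect versus control is above threshold as well.
common = z_main > z_thresh
for z_map in z_simple.values():
    common &= z_map > z_thresh

print("voxels surviving the masked analysis:", int(common.sum()))
```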


[TABLE 1. Areas commonly activated by all four experimental conditions versus controls. Peak coordinates and Z values are given for the main effect and for each simple main effect of the experimental conditions (ED, ID, EH, IH) versus control, in the left middle/superior temporal cortex (BA 37/21/22), left angular gyrus (BA 39), left inferior frontal cortex (BA 44/45), right middle occipitotemporal cortex (BA 19/37/21), and left posterior amygdala/hippocampus.]

2. Effects of type of emotion and interaction with task. (a) Faces expressing disgust (Table 2 and Fig. 2): A conjunction analysis was performed to identify brain regions that were more active when viewing disgusted compared to happy faces in both the explicit and the incidental tasks: (ED-EH) + (ID-IH). This analysis identifies regions showing a main effect of disgust versus happiness in the absence of any significant interaction with task. No significant effect was found, because all regions that showed a main effect of disgust were also qualified by an interaction, (ED-ID) - (EH-IH), and showed a greater effect in the explicit task (see Table 2). The right head of the caudate nucleus, the right thalamus, and the left amygdala showed significantly greater activation for explicit recognition of disgust relative to happiness: both a significant interaction and a simple main effect of ED versus EH were found (see Table 2 and Fig. 2). These effects did not reach a corrected level of significance but were nevertheless considered because they fell within a priori areas of interest.

(b) Faces expressing happiness (Table 3 and Fig. 3): Conjunction and interaction analyses were performed as described above (see section 2a). No significant cluster of activation was found in the conjunction analysis of happy versus disgusted faces irrespective of task. However, the bilateral orbitofrontal cortex (BA 11/47) showed a marked effect of happiness versus disgust that was greater in the explicit task.

3. Effects of task. A conjunction analysis was performed to identify areas that were more involved in explicit compared to incidental recognition of emotional faces: (ED-ID) + (EH-IH) (Table 4 and Fig. 4). This analysis identifies areas involved in performing the explicit task irrespective of the type of emotion to be recognized. The most significant effect was found in the right precentral sulcus. Furthermore, we found an extensive activation of the right prefrontal cortex, comprising the middle (BA 46) and the inferior (BA 44/45) frontal gyri. The activation of the inferior frontal gyrus also spread inferiorly and medially to include the most anterior region of the insula. A small region in the right fusiform gyrus (medial and anterior to the occipital area common to all the face conditions) was also modulated by task.
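With the four condition regressors ordered [ED, ID, EH, IH], the pooled and interaction effects described in sections 2 and 3 correspond to simple contrast weights. The sketch below is an illustration under that assumed ordering; note that in SPM a conjunction is tested by requiring each component contrast to be significant, rather than by the pooled contrast shown here.

```python
import numpy as np

# Assumed column order of the condition regressors: ED, ID, EH, IH.
conditions = ["ED", "ID", "EH", "IH"]

# Disgust vs happiness pooled over tasks: (ED - EH) + (ID - IH).
disgust_vs_happiness = np.array([1, 1, -1, -1])

# Emotion-by-task interaction: (ED - ID) - (EH - IH); positive where the
# disgust-vs-happiness difference is larger in the explicit task.
emotion_by_task = np.array([1, -1, -1, 1])

# Simple main effect of ED vs EH (explicit task only).
ed_vs_eh = np.array([1, 0, -1, 0])

# Explicit vs incidental pooled over emotions: (ED - ID) + (EH - IH).
explicit_vs_incidental = np.array([1, -1, 1, -1])

# Example: apply the contrasts to hypothetical condition betas at one voxel.
betas = np.array([2.0, 1.1, 0.8, 0.9])  # ED, ID, EH, IH
for name, c in [("disgust vs happiness", disgust_vs_happiness),
                ("emotion x task interaction", emotion_by_task),
                ("ED vs EH", ed_vs_eh),
                ("explicit vs incidental", explicit_vs_incidental)]:
    print(f"{name}: {float(c @ betas):+.2f}")
```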

TABLE 2
Disgust versus Happiness

R striatum, putamen: I 24, 0, 4 (4.2); SM 20, −8, 0 (3.7)
R striatum, caudate: I 8, 20, 8 (3.8); SM 8, 20, 8 (3.1)
R thalamus: I 8, −4, 4 (3.3); SM 8, −12, 4 (3.3)
L amygdala: I −28, 4, −20 (3.3); SM −28, 0, −20 (4.0)

Note. Coordinates and Z values (in parentheses) are reported for the interaction (I) between type of emotion and task and for the simple main effect (SM) of ED versus EH.


TABLE 3
Happiness versus Disgust

R orbitofrontal (BA 11/47): I 36, 28, −24 (4.2) and 24, 32, −16 (3.2); SM 36, 28, −24 (5.9) and 28, 32, −20 (4.4)
L orbitofrontal (BA 11/47): I −40, 32, −20 (4.0); SM −36, 32, −16 (5.5)

Note. Coordinates and Z values (in parentheses) are reported for the interaction (I) between type of emotion and task and for the simple main effect (SM) of EH versus ED.

DISCUSSION

To investigate how differential task demands modulate brain activation related to perceiving facial expressions, we examined the neural substrates of processing disgusted and happy faces under two task conditions: gender decision and explicit emotional recognition. We obtained a widespread profile of activation of both cortical and subcortical regions previously implicated in processing faces and emotionally relevant stimuli. These regions include the occipito-temporal and prefrontal cortex, the amygdala, and the basal ganglia. A number of these areas showed a modulation due to type of emotion and task, whereas others were commonly activated by all the experimental conditions.

Most of the regions that showed a common effect for all stimuli compared to controls are likely to be involved in visuo-perceptual (right temporo-occipital junction) and semantic (left temporal and left inferior frontal cortex) analysis of faces. However, the activation of the left amygdala could be attributed to its role in facial expression processing. As discussed in the Introduction, the role of the amygdala in the recognition of expressions other than fear is not yet established. One neuroimaging study found that it was activated by sad faces (Blair et al., 1999). Other human studies have shown that it is more activated by happy than by neutral faces (Breiter et al., 1996), that it is proportionally deactivated by increasingly happy expressions (Morris et al., 1996), and that it is modulated by conscious and unconscious exposure to conditioned angry faces (Morris et al., 1998b). On the other hand, animal studies have demonstrated that a group of neurons in the primate amygdala responds primarily to faces (Rolls, 1981; Leonard et al., 1985). In our study we observed that the left amygdala was activated when viewing facial expressions was compared to a low-level visual baseline task. This finding supports the notion that the amygdala plays a general role in extracting the emotional relevance of faces (Rolls, 1999). Yet its greater activation for disgusted compared to happy faces (see below) argues in favour of a preferential response to negative expressions and emotions.

A number of brain regions exhibited a modulation due to task manipulation, with greater activation in the explicit recognition condition. This effect was found in structures that showed differential activation for disgusted or happy expressions, as well as in regions that were equally responsive to both types of emotions. Explicit recognition of the facial expression of disgust activated the right striatum, including the caudate nucleus. Furthermore, the left amygdala showed a greater response to explicit disgust than to any other condition, supporting the notion of its involvement in processing negative expressions in general, rather than fear alone (Adolphs et al., 1999). The fact that the left, but not the right, amygdala was activated is consistent with the findings of Morris et al. (1998b), who demonstrated left amygdala involvement in processing supraliminal stimuli and right amygdala activation in processing subliminal stimuli. The caudate activation is consistent with the defective recognition of disgusted faces in Huntington's disease patients (Gray et al., 1997; Sprengelmeyer et al., 1996). Two features might explain why the amygdala and caudate were activated in the present study but not in Phillips et al.'s (1997) imaging study. First, by comparing disgust with happiness, we chose the two emotions that were, respectively, most impaired and most spared in Huntington's disease (Sprengelmeyer et al., 1996). Second, Phillips et al. (1997) used a gender decision task and therefore did not direct subjects to explicitly process the emotional content of faces. Indeed, we have shown that the amygdala and caudate activations were significantly greater in the explicit than in the incidental task.

Bilateral orbitofrontal cortex was the only region that responded to explicit recognition of happy faces more than to any other condition.


FIG. 2. Subcortical structures that showed greater activation for explicit recognition of disgusted faces. Activations are superimposed on axial [(a) z = 4; (b) z = 8] and coronal [(c) y = 0] sections of the standard Montreal Neurological Institute brain (Cocosco et al., 1997).

This region has been implicated in establishing stimulus-reinforcement associations, especially with reward expectation (Rolls, 2000). Its lesion can cause severe behavioral disturbances and sometimes a deficit in recognizing facial expressions (Hornak et al., 1996; Blair et al., 2000). Functional neuroimaging studies have shown its activation when a behavioural decision is based on the reward value of the response (Elliott et al., 2000, for a review) and when pictures of pleasant stimuli are presented (Paradiso et al., 1999). Our results confirm the hypothesis of a specific role of the orbitofrontal cortex in emotional processing, possibly indicating that pleasant facial expressions act as important cues of social reward.

The precentral sulcus, the middle and inferior frontal gyri, the posterior fusiform gyrus, and the anterior insula of the right hemisphere showed a greater response in the explicit recognition task regardless of the type of facial expression.

[TABLE 4. Explicit versus incidental processing. Peak coordinates and Z values are given for the conjunction of (ED-ID) and (EH-IH) and for each simple main effect of the four experimental conditions versus control, in the right precentral sulcus (BA 6/8), right insula/inferior frontal cortex (BA 44/45), right middle frontal cortex (BA 46), and right fusiform gyrus (BA 19).]


FIG. 3. Bilateral orbitofrontal regions that showed greater activation for explicit recognition of happy faces. Activations are superimposed on sections [(a) x = −24; (b) x = −16] of the standard Montreal Neurological Institute brain (Cocosco et al., 1997).

The strong right lateralisation of the activations during explicit emotional recognition is also in agreement with clinical findings that link conscious emotional processing with the right hemisphere (Adolphs et al., 2000; Borod et al., 1998; Bowers et al., 1985). We argue that these activations can be attributed to general cognitive processes, such as face perception, memory retrieval, and monitoring functions, rather than to specific emotion-related processes. For instance, the right fusiform gyrus has been repeatedly implicated in perceptual processing of faces (Gorno-Tempini et al., 1998; Kanwisher et al., 1997; Sergent et al., 1992), and our results demonstrate that its function can be modulated by task demands. The greater fusiform activation in explicit expression recognition than in gender decision suggests that a deeper perceptual analysis is necessary to explicitly extract emotional information from a face. Similarly, the activation of the precentral sulcus, close to an area recently identified as the human frontal eye fields, could be explained by greater visual "scanning" in the explicit than in the incidental task (Luna et al., 1998; Petit and Haxby, 1999).

The differential role of ventral and dorsal prefrontal regions is still a matter of debate (for a review see Owen et al., 1999). The currently prevailing view postulates that the right ventral prefrontal and anterior insular cortices are implicated in retrieving memory traces from long-term memory, while dorsal regions are implicated in response monitoring (Henson et al., 1999; Fletcher et al., 1998). Accordingly, greater activation of the right ventral prefrontal and anterior insular cortices in the explicit task might indicate a memory retrieval effort to match facial features with previously experienced representations. The task dependency of the effect in the lateral prefrontal cortex can explain the inconsistency of this activation in previous studies of facial expression processing.

FIG. 4. Right frontal and fusiform activations related to the explicit task regardless of emotion type. Activations are superimposed on a 3-D rendering image of the standard Montreal Neurological Institute brain (Cocosco et al., 1997).


In fact, a right ventral prefrontal activation was found only when an explicit emotional judgement was required, as when performing an emotional matching task (George et al., 1993) or a facial attractiveness judgement (Nakamura et al., 1999). There was no prefrontal activation when emotional processing was incidental (Morris et al., 1996).

Taken together, the results of the present study indicate that the network of regions involved in processing facial expression can be greatly modulated by task demands. Explicit recognition of facial expressions not only increases the demands on nonspecific cognitive processes such as perceptual analysis and memory retrieval, but also modulates the response of regions that respond differently to different emotions. Responses in the left amygdala and in the right neostriatum were enhanced by explicit recognition of disgusted faces, while the bilateral orbitofrontal cortex was specifically associated with recognising happy faces.

ACKNOWLEDGMENTS

This study was funded by a special grant from the Azienda Policlinico di Modena. M.L.G.T. was funded by the Wellcome Trust. Travelling expenses were covered by a grant from the British Council-Ministero dell'Università e della Ricerca Scientifica e Tecnologica. We thank Ray Dolan, John Morris, and Cathy Price for their comments.

REFERENCES

Adolphs, R., Damasio, H., Tranel, D., Cooper, G., and Damasio, A. R. 2000. A role of somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J. Neurosci. 20: 2683–2690.
Adolphs, R., Tranel, D., Hamann, S., Young, A. W., Calder, A. J., Phelps, E. A., Anderson, A., Lee, G. P., and Damasio, A. R. 1999. Recognition of facial emotion in nine individuals with bilateral amygdala damage. Neuropsychologia 37: 1111–1117.
Blair, R. J., and Cipolotti, L. 2000. Impaired social response reversal. A case of "acquired sociopathy." Brain 123: 1122–1141.
Blair, R. J., Morris, J. S., Frith, C. D., Perrett, D. I., and Dolan, R. J. 1999. Dissociable neural responses to facial expressions of sadness and anger. Brain 122: 883–893.
Borod, J. C., Obler, L. K., Erhan, H. M., Grunwald, I. S., Cicero, B. A., Welkowitz, J., Santschi, R. M., and Whalen, J. R. 1998. Right hemisphere emotional perception: Evidence across multiple channels. Neuropsychology 12: 446–458.
Bowers, D., Bauer, R. M., Coslett, H. M., and Heilman, K. M. 1985. Processing of faces by patients with unilateral hemisphere lesions. Brain Cogn. 4: 258–272.
Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., and Rosen, B. R. 1996. Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17: 875–887.
Calder, A. J., Keane, F., Manes, F., Antoun, N., and Young, A. W. 2000. Impaired recognition and experience of disgust following brain injury. Nat. Neurosci. 3: 1077–1078.
Calder, A. J., Young, A. W., Rowland, D., Perret, D. A., Hodges, J. R., and Etcoff, N. L. 1996. Facial emotion recognition after bilateral amygdala damage: Differentially severe impairment of fear. Cogn. Neuropsychol. 13: 699–745.

Cocosco, C. A., Kollokian, V., Kwan, R., and Evans, A. C. 1997. Brainweb: Online interface to a 3D MRI simulated brain database. NeuroImage 5: S425.
Courtney, S. M., Petit, L., Maisog, J. M., Ungerleider, L. G., and Haxby, J. V. 1998. An area specialized for spatial working memory in human frontal cortex. Science 279: 1347–1351.
Duvernoy, H. M. 1999. The Human Brain: Surface, Three-Dimensional Sectional Anatomy with MRI, and Blood Supply, 2nd ed. Springer-Verlag, Wien.
Ekman, P., and Friesen, W. V. 1976. Pictures of Facial Affect. Consulting Psychologists Press, Palo Alto.
Elliott, R., Dolan, R. J., and Frith, C. D. 2000. Dissociable functions in the medial and lateral orbitofrontal cortex: Evidence from human neuroimaging studies. Cereb. Cortex 10: 308–317.
Fletcher, P. C., Shallice, T., Frith, C. D., Frackowiak, R. S., and Dolan, R. J. 1998. The functional roles of prefrontal cortex in episodic memory. II. Retrieval. Brain 121: 1249–1256.
Friston, K. J., Ashburner, J., Frith, C. D., Poline, J. B., Heather, J. D., and Frackowiak, R. S. J. 1995a. Spatial registration and normalization of images. Hum. Brain Mapp. 2: 165–189.
Friston, K. J., Holmes, A., Worsley, K. J., Poline, J.-B., Frith, C. D., and Frackowiak, R. S. J. 1995b. Statistical parametric maps in functional imaging: A general linear approach. Hum. Brain Mapp. 2: 189–210.
George, M. S., Ketter, T. A., Gill, D. S., Haxby, J. V., Ungerleider, L. G., Herscovitch, P., and Post, R. M. 1993. Brain regions involved in recognizing facial emotion or identity: An oxygen-15 PET study. J. Neuropsychiatry Clin. Neurosci. 5: 384–394.
Gorno-Tempini, M. L., Price, C. J., Josephs, O., Vandenberghe, R., Cappa, S. F., Kapur, N., Frackowiak, R. S., and Tempini, M. L. 1998. The neural systems sustaining face and proper-name processing [published errata appear in Brain 1998, 121(Pt 12): 2402, and Brain 2000, 123(Pt 2): 419]. Brain 121: 2103–2118.
Gray, J. M., Skolnick, B. E., and Gur, R. E. 1997. Impaired recognition of disgust in Huntington's disease gene carriers. Brain 120: 2029–2038.
Haxby, J. V., Petit, L., Ungerleider, L. G., and Courtney, S. M. 2000. Distinguishing the functional roles of multiple regions in distributed neural systems for visual working memory. NeuroImage 11: 145–156.
Henson, R. N., Shallice, T., and Dolan, R. J. 1999. Right prefrontal cortex and episodic memory retrieval: A functional MRI test of the monitoring hypothesis. Brain 122: 1367–1381.
Hornak, J., Rolls, E. T., and Wade, D. 1996. Face and voice expression identification in patients with emotional and behavioural changes following ventral frontal lobe damage. Neuropsychologia 34: 247–261.
Kanwisher, N., McDermott, J., and Chun, M. M. 1997. The fusiform face area: A module in human extrastriate cortex specialized for face perception. J. Neurosci. 17: 4302–4311.
Leonard, C. M., Rolls, E. T., Wilson, F. A. W., and Baylis, G. C. 1985. Neurons in the amygdala of the monkey with responses selective for faces. Behav. Brain Res. 15: 159–176.
Lepage, M., Ghaffar, O., Nyberg, L., and Tulving, E. 2000. Prefrontal cortex and episodic memory retrieval mode. Proc. Natl. Acad. Sci. USA 97: 506–511.
Luna, B., Thulborn, K. R., Strojwas, M. H., McCurtain, B. J., Berman, R. A., Genovese, C. R., and Sweeney, J. A. 1998. Dorsal cortical regions subserving visually guided saccades in humans: An fMRI study. Cereb. Cortex 8: 40–47.
Morris, J. S., Friston, K. J., Buchel, C., Frith, C. D., Young, A. W., Calder, A. J., and Dolan, R. J. 1998a. A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121: 47–57.

Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., and Dolan, R. J. 1996. A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383: 812–815.
Morris, J. S., Ohman, A., and Dolan, R. J. 1998b. Conscious and unconscious emotional learning in the human amygdala. Nature 393: 467–470.
Nakamura, K., Kawashima, R., Ito, K., Sugiura, M., Kato, T., Nakamura, A., Hatano, K., Nagumo, S., Kubota, K., Fukuda, H., and Kojima, S. 1999. Activation of the right inferior frontal cortex during assessment of facial emotion. J. Neurophysiol. 82: 1610–1614.
Oldfield, R. C. 1971. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 9: 97–113.
Owen, A. M., Herrod, N. J., Menon, D. K., Clark, J. C., Downey, S. P., Carpenter, T. A., Minhas, P. S., Turkheimer, F. E., Williams, E. J., Robbins, T. W., Sahakian, B. J., Petrides, M., and Pickard, J. D. 1999. Redefining the functional organization of working memory processes within human lateral prefrontal cortex. Eur. J. Neurosci. 11: 567–574.
Paradiso, S., Johnson, D. L., Andreasen, N. C., O'Leary, D. S., Watkins, G. L., Ponto, L. L., and Hichwa, R. D. 1999. Cerebral blood flow changes associated with attribution of emotional valence to pleasant, unpleasant, and neutral visual stimuli in a PET study of normal subjects. Am. J. Psychiatry 156: 1618–1629.
Petit, L., and Haxby, J. V. 1999. Functional anatomy of pursuit eye movements in humans as revealed by fMRI. J. Neurophysiol. 82: 463–471.
Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C., Gray, J. A., and David, A. S. 1997. A specific neural substrate for perceiving facial expressions of disgust. Nature 389: 495–498.
Rolls, E. T. 1981. Responses of the amygdaloid neurons in the primate. In The Amygdaloid Complex (Y. Ben-Ari, Ed.), pp. 383–393. Elsevier, Amsterdam.
Rolls, E. T. 1999. The Brain and Emotion. Oxford Univ. Press, Oxford.
Rolls, E. T. 2000. The orbitofrontal cortex and reward. Cereb. Cortex 10: 284–294.
Scott, S. K., Young, A. W., Calder, A. J., Hellawell, D. J., Aggleton, J. P., and Johnson, M. 1997. Impaired auditory recognition of fear and anger following bilateral amygdala lesions. Nature 385: 254–257.
Sergent, J., Ohta, S., and MacDonald, B. 1992. Functional neuroanatomy of face and object processing. A positron emission tomography study. Brain 115(Pt 1): 15–36.
Sprengelmeyer, R., Young, A. W., Calder, A. J., Karnat, A., Lange, H., Homberg, V., Perrett, D. I., and Rowland, D. 1996. Loss of disgust. Perception of faces and emotions in Huntington's disease. Brain 119: 1647–1665.
Sprengelmeyer, R., Young, A. W., Schroeder, U., Grossenbacher, P. G., Federlein, J., Büttner, T., and Przuntek, H. 1999. Knowing no fear. Proc. R. Soc. Lond. B 266: 2451–2456.
Talairach, J., and Tournoux, P. 1988. Co-planar Stereotaxic Atlas of the Human Brain. Thieme, New York.
Taylor, S. F., Liberzon, I., and Koeppe, R. A. 2000. The effect of graded aversive stimuli on limbic and visual activation. Neuropsychologia 38: 1415–1425.
