Consciousness and Emotional Facial Expression Recognition


© 2007 Federation of European Psychophysiology Societies. M. Balconi & C. Lucchiari: Consciousness and Emotional Facial Expression Recognition. Journal of Psychophysiology 2007; Vol. 21(2):100–108. DOI 10.1027/0269-8803.21.2.100. Hogrefe & Huber Publishers.

Consciousness and Emotional Facial Expression Recognition: Subliminal/Supraliminal Stimulation Effect on N200 and P300 ERPs

Michela Balconi¹ and Claudio Lucchiari²

¹Department of Psychology, Catholic University of Milan, Italy
²Department of Neurology, Neurological National Hospital "C. Besta," Milan, Italy

Abstract. In this study we analyze whether facial expression recognition is marked by specific event-related potential (ERP) correlates and whether conscious and unconscious elaboration of emotional facial stimuli are qualitatively different processes. ERPs elicited by supraliminal and subliminal (10 ms) stimuli were recorded while subjects were viewing emotional facial expressions of four emotions or neutral stimuli. Two ERP effects (N2 and P3) were analyzed in terms of their peak amplitude and latency variations. An emotional specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). Unaware information processing proved to be quite similar to aware processing in terms of peak morphology but not of latency. A major result of this research was that unconscious stimulation produced a more delayed peak variation than conscious stimulation did. Also, a more posterior distribution of the ERP was found for N2 as a function of the emotional content of the stimulus. On the contrary, cortical lateralization (right/left) was not correlated with conscious/unconscious stimulation. The functional significance of our results is discussed in terms of the subliminal effect and emotion recognition.

Keywords: facial expressions, emotion, subliminal, ERP

Facial expressions of emotions are social and communicative tools. They serve a communicative role, since humans use facial expressions to interpret the intentions of others. When we see a face we need to infer two main types of information. First, the face has to be identified as a specific stimulus belonging to a unique individual, taking into account changes in appearance, aging, etc. Second, the facial expression has to be interpreted for its emotional content, which sets the modality of the social interaction (Ekman, 1993). The dissociation between facial identity and facial expression processing, as well as between facial expression and the structural features of the facial stimulus, has been well documented by the cognitive model of face recognition proposed by Bruce and Young (1986, 1998). This model supposes that there are seven distinct types of information that can be derived from the face, such as structural, expression, and identity information. These types of information, which differ in terms of the cognitive and functional subprocesses implicated, are called "codes." An important question raised by this explicative model is whether the cognitive processes involved in distinct aspects of face processing may be topographically separated in specific brain regions (Bentin & Deouell, 2000; Gur, Schroeder, Turner, McGrath, & Chan, 2002). An increasing number of studies have analyzed the

cognitive and neuropsychological features of face recognition, and they have offered relevant evidence supporting a functional specificity of the brain mechanisms responsible for emotional face processing (Posamentier & Abdi, 2003). Specifically, PET (Bernstein, Beig, Siegenthaler, & Grady, 2002; Haxby, Hoffman, & Gobbini, 2000), fMRI (Adolphs, Tranel, & Damasio, 1998; Grelotti, Gauthier, & Schultz, 2002; Kanwisher, McDermott, & Chun, 1997), and ERP studies (Eimer & McCarthy, 1999; Herrmann et al., 2002) have underlined the brain specificity of emotion decoding. ERP studies in humans have provided evidence consistent with this view. In particular, they have revealed the early emergence of recognition processes and their distinctiveness from other cognitive processes. The aforementioned cognitive model of face recognition suggests that the structural and semantic features of the face are processed independently, and that the brain regions involved in the distinct aspects of face processing should be topographically separated (Holmes, Vuilleumier, & Eimer, 2003). In particular, findings of studies using ERP measures show the neural correlate of detecting a face (the N170 ERP variation) as being larger for faces than for many other stimuli (Bentin & Deouell, 2000). This component is not affected by face familiarity, facial expressions, or other identity


factors, such as race (Caldara et al., 2003). Thus, this ERP effect may be the marker of the structural encoding process, its product being an abstract sensory representation of the face, independent of context or viewpoint. Here we focused on the semantic encoding process, that is, on the emotion-specific ERP correlates. Although most ERP studies of emotion have analyzed the later endogenous components (Holmes et al., 2003; Streit, Wölwer, Brinkmeyer, Ihl, & Gaebel, 2000), there is also evidence that emotional processing can be differentiated in earlier time windows. A very early positive peak, the P1 effect, was observed at about 100 ms poststimulus (Pizzagalli, Koenig, Regard, & Lehmann, 1999) and related to the emotional valence of the facial stimulus. This effect, although specific for emotional content, occurs at an early latency of face processing; since in the present research we were exploring the later wave variations, we chose not to consider this component. In addition, Streit et al. (2000) evaluated differences in ERPs between emotional and structural face processing. They found an early negative deflection at about 240 ms poststimulus (the N2 effect), larger for emotional than for neutral faces in an explicit task (recognition of emotional expression). A similar ERP effect was found with an implicit task (Sato, Takanori, Sakiko, & Michikazu, 2000). Two theoretical positions have been proposed to explain this ERP effect: The first supposes that N2 could be a cognitive marker of the complexity and relevance of the stimulus (Carretié & Iglesias, 1995). Nevertheless, this position is in contrast with much of the experimental evidence (Marinkovic & Halgren, 1998). A second position underlines the emotional-face specificity of N2 (Balconi & Pozzoli, 2003; Herrmann et al., 2002).
A subsequent positive ERP deflection (P300) has been reported by some authors after emotional stimulation, although it does not seem to be exclusive to faces, since it was observed even in response to adjectives or objects with an emotional content (Bernat, Bunce, & Shevrin, 2001). Thus, the P3 effect seems to be a component representing a decisional aspect of processing, independent of the nature of the stimuli, since this effect is viewed as reflecting decision or cognitive closure of the recognition process. In addition, it could be a marker of stimulus complexity (Brázdil, Rektor, Daniel, Dufek, & Jurák, 2001; Iragui, Kutas, Mitchiner, & Hillyard, 1993). On the other hand, for facial stimuli, previous results on P3 are contradictory: While in some studies neutral faces evoked lower amplitudes than emotional ones (Carretié & Iglesias, 1995), others have found that neutral stimuli evoked the highest peak (Vanderploeg, Brown, & Marsh, 1987). Therefore, some important questions remain to be answered. First of all, the cognitive nature of these ERP variations must be clarified in order to analyze their specificity for emotional facial expression decoding. The comparison of facial expressions with a neutral condition (neutral facial expression) might be useful in order to characterize the emotional value of these two peak variations. In the second place, previous paradigms of analysis have focused on conscious elaboration of facial stimuli, leaving out other interesting cognitive processes below the level of consciousness. For example, there is now considerable evidence supporting the notion that significant affective processing happens outside conscious awareness (Bunce, Bernat, Wong, & Shevrin, 1999; LeDoux, 1996; Shevrin, Bond, Brakel, Hertel, & Williams, 1996). An obvious example, well known from experimental psychology, is the phenomenon of subliminal perception. From the viewpoint of cognitive neurophysiology, subliminal perception has been studied in only a limited number of cases (Shevrin & Fritzler, 1968; Wong, Shevrin, & Williams, 1994). Some investigations applied the classical oddball paradigm (Bernat et al., 2001; Brázdil et al., 2001) and found a P3 ERP effect for unconscious stimuli similar to that in supraliminal conditions. ERPs have been shown to be sensitive to the unconscious affective perception of words (Cacioppo, Crites, & Gardner, 1996; Chapman, McCrary, Chapman, & Martin, 1980; Skrandies & Weber, 1996), faces (Kayser et al., 1997), and pictures (Johnston, Miller, & Burleson, 1986; Yee & Miller, 1987). These results suggest that ERPs can index unconscious mental processes (Shevrin, 2001). Nevertheless, no previous study has explored the relationship between unconscious elaboration of emotional faces and specific ERP effects such as N2 and P3. Therefore, a second purpose of this study was to elucidate the relationship between conscious and unconscious decoding of emotional facial expressions.

Experiment 1

Method

Participants

A total of 20 healthy volunteers took part in the study (11 of them women; age range 19–25, M = 23.47) after giving informed consent. They were students of psychology at the Catholic University of Milan, all right-handed and with normal or corrected-to-normal vision. They were recruited for a cognitive task of stimulus elaboration and they were not aware that the investigation of emotion was the purpose of the experiment.

Material and Procedure

Stimulus materials were taken from the set of pictures of Ekman and Friesen (1976). They were black-and-white pictures of male and female actors, presenting, respectively, a happy, sad, angry, fearful, or neutral face. Each face was presented 20 times, for a total of 100 stimuli. Pictures were presented in randomized order in the center of a computer monitor, with a horizontal visual angle of 4° and a vertical angle of 6° (STIM 4.2 software). The interstimulus fixation point (a white point on a black background) was projected at the center of the screen. In this supraliminal condition subjects consciously saw the stimulus, which was presented for 500 ms with an interstimulus interval (ISI) of 1500 ms. Subjects were seated comfortably in a moderately lighted room with the monitor screen positioned approximately 100 cm in front of their eyes. During the examination, they were requested to keep their eyes continuously on the small fixation point and to minimize blinking. The subjects were told to observe the faces carefully and to decide whether they expressed an emotional value. An explicit response to the emotional features of the stimulus (i.e., by stimpad) was not required. This was done for several reasons: to avoid confounding motor potentials with brain potentials; to avoid making subjects more attentive to the emotional stimuli than to the neutral ones; and, finally, because asking for a response to stimuli that are not consciously seen would be nonsensical to participants. Prior to recording ERPs, each subject was familiarized with the overall procedure in a training session, in which every subject saw, in random order, all the emotional stimuli presented in the subsequent experimental session (a block of 10 trials, each expression repeated twice).

EEG Data and Registration Parameters

The EEG was recorded with a 32-channel DC amplifier (SYNAMPS system) and acquisition software (NEUROSCAN 4.0) at 12 electrodes (four midline: Fz, Cz, Pz, Oz; eight lateral: F2, F3, T2, T3, P2, P3, O1, O2; international 10–20 system, Jasper, 1958), with reference electrodes at the mastoids. Electrooculograms (EOG) were recorded from electrodes lateral and superior to the left eye. The signal (sampled at 256 Hz) was amplified, band-pass filtered from .01 to 50 Hz, and recorded in continuous mode. Impedance was controlled and maintained below 5 kΩ. An averaged waveform was obtained off-line from about 20 artifact-free individual target stimuli for each type of emotion (trials exceeding 50 μV in amplitude were excluded from the averaging process). The noise in the signal was low (6% of epochs were rejected). To evaluate differences in ERP response to the five facial expressions, data analysis focused on two time windows: 180–300 ms and 300–380 ms. In this experiment we did not consider earlier peak variations such as P100. Peak amplitude was quantified relative to the 100-ms prestimulus baseline. A second dependent variable, peak latency, was also analyzed.
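The averaging and peak-measurement steps described above can be sketched as follows. This is an illustrative reconstruction in numpy (array shapes and function names are ours), not the authors' actual NEUROSCAN pipeline:

```python
import numpy as np

def average_erp(epochs, sfreq=256, baseline_ms=100, reject_uv=50.0):
    """Baseline-corrected average of artifact-free epochs.

    epochs: (n_trials, n_samples) array in microvolts; each epoch is assumed
    to start baseline_ms before stimulus onset (100-ms prestimulus baseline).
    Trials whose absolute amplitude exceeds reject_uv are discarded.
    """
    base_n = int(baseline_ms * sfreq / 1000)          # prestimulus samples
    keep = np.abs(epochs).max(axis=1) <= reject_uv    # artifact rejection
    clean = epochs[keep]
    clean = clean - clean[:, :base_n].mean(axis=1, keepdims=True)
    return clean.mean(axis=0), int(keep.sum())

def peak_in_window(erp, win_ms, sfreq=256, baseline_ms=100, negative=True):
    """Peak amplitude and latency (ms poststimulus) within a time window,
    e.g. (180, 300) for N2 or (300, 380) for P3."""
    onset = int(baseline_ms * sfreq / 1000)
    i0 = onset + int(win_ms[0] * sfreq / 1000)
    i1 = onset + int(win_ms[1] * sfreq / 1000)
    seg = erp[i0:i1]
    idx = int(seg.argmin() if negative else seg.argmax())
    return seg[idx], (i0 + idx - onset) * 1000.0 / sfreq
```

For each subject, electrode, and expression type this yields one amplitude and one latency value, i.e., the two dependent variables entered into the analyses that follow.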

Data Analysis

Behavioral Data

In a postexperimental phase all the subjects were asked to analyze the facial expressions and to identify their emotional value. They correctly recognized the emotional significance of the stimuli for joy (96%), sadness (94%), anger (95%), fear (96%), and the neutral face (94%; in this case subjects evaluated "no emotion").

N2 Effect

For the first time window, a two-way repeated-measures analysis of variance (ANOVA) with Type (4) × Electrode site (12) was applied to the peak amplitude measure. Type I errors associated with inhomogeneity of variance were controlled by decreasing the degrees of freedom using the Greenhouse-Geisser ε. The analysis showed significant differences only for the main effect of type, F(4, 19) = 9.71, p = .001, η² = .43, as shown by Figure 1, but not for site, F(11, 19) = 1.28, p = .38, η² = .11. A higher negative peak was revealed for each emotional face compared to the neutral face, as shown by successive paired comparisons (anger, F(1, 19) = 7.11, p = .001, η² = .40; fear, F(1, 19) = 7.02, p = .001, η² = .38; joy, F(1, 19) = 6.49, p = .001, η² = .35; and sadness, F(1, 19) = 6.12, p = .001, η² = .33). The wave profiles of Figure 1 compare emotional and neutral faces. For the site effect, a Type × Site interaction, F(44, 19) = 12.04, p = .001, η² = .52, was observed, with a more postero-temporal distribution of the peak for emotional stimuli. In fact, post hoc comparisons (Tukey test) revealed a significant difference between emotional and neutral stimuli in T1 (fear, F(1, 19) = 10.08, p = .001, η² = .40; anger, F(1, 19) = 8.12, p = .001, η² = .36; joy, F(1, 19) = 7.45, p = .001, η² = .32; sadness, F(1, 19) = 6.65, p = .001, η² = .30) and in T2 (F(1, 19) = 8.81, p = .001, η² = .38; F(1, 19) = 8.32, p = .001, η² = .38; F(1, 19) = 8.16, p = .001, η² = .37; and F(1, 19) = 7.72, p = .001, η² = .34, respectively). The wave profiles for temporal sites T1 and T2 are represented in Figures 2a and 2b as a function of type of stimulus (average of emotional vs. neutral faces). No other comparison was significant. Left/right brain differences as a function of type of stimulus were explored in a second ANOVA (Type × Lateralization); to assess lateralization, a lateral-electrode factor (F2, T2, P2, O2 vs. F3, T3, P3, O1) was created. In this case, no main effect was significant: type, F(4, 19) = 1.03, p = .38, η² = .10; lateralization, F(1, 19) = 0.90, p = .39, η² = .08; nor their interaction, F(1, 19) = 1.01, p = .33, η² = .16. The ANOVA applied to the second dependent measure (latency) showed no significant main effect of type, F(4, 19) = 1.06, p = .27, η² = .19, or site, F(9, 19) = 1.18, p = .26, η² = .08, nor any of their interactions.

Figure 1. Grand-average ERPs (all sites) elicited by neutral and emotional facial expressions (supraliminal).

Figure 2a. Grand-average ERPs elicited by neutral and emotional facial expressions in T1 (supraliminal).
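For illustration, the one-way repeated-measures F ratio and the Greenhouse-Geisser ε used in analyses of this kind can be computed directly. The following numpy sketch is our own reconstruction of the textbook formulas, not the authors' analysis software:

```python
import numpy as np

def rm_anova_oneway(data):
    """One-way repeated-measures ANOVA.

    data: (n_subjects, n_conditions) array, e.g. one N2 peak amplitude per
    subject for each facial-expression type. Returns (F, df_effect, df_error).
    """
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()    # between conditions
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()    # between subjects
    ss_err = ((data - grand) ** 2).sum() - ss_cond - ss_subj  # residual
    df_c, df_e = k - 1, (k - 1) * (n - 1)
    return (ss_cond / df_c) / (ss_err / df_e), df_c, df_e

def gg_epsilon(data):
    """Greenhouse-Geisser epsilon; the sphericity correction multiplies
    both df_effect and df_error by this value (1/(k-1) <= eps <= 1)."""
    k = data.shape[1]
    S = np.cov(data, rowvar=False)
    d_bar = np.trace(S) / k          # mean of diagonal entries
    s_bar = S.mean()                 # grand mean of the covariance matrix
    row = S.mean(axis=1)             # row means
    num = (k * (d_bar - s_bar)) ** 2
    den = (k - 1) * ((S ** 2).sum() - 2 * k * (row ** 2).sum() + k ** 2 * s_bar ** 2)
    return num / den
```

A significant F with ε well below 1 indicates both a condition effect and a violation of sphericity, which is why the corrected degrees of freedom are reported.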

P3 Effect

Two successive repeated-measures ANOVAs were applied to the peak dependent variable for the second time window (300–380 ms). As shown in Figure 1, no significant effects were revealed for type, F(4, 19) = .69, p = .41, η² = .08, or site, F(11, 19) = .51, p = .42, η² = .07, nor for their interaction. Therefore, the P3 ERP variation seems to be similar as a function of the type of stimulus (emotional or neutral). In addition, the cortical distribution of the ERP is undifferentiated with respect to the anterior/posterior position. Second, the lateralization effect was entered into a specific ANOVA (Type × Lateralization). No significant differences were observed for type, F(4, 19) = .38, p = .57, η² = .10, or lateralization, F(1, 19) = 1.22, p = .33, η² = .12. The latency dependent variable was not differentiated as a function of type of stimulus or site. Therefore, the P3 peak variation was homogeneous for neutral and emotional faces.


Figure 2b. Grand-average ERPs elicited by neutral and emotional facial expressions in T2 (supraliminal).

Discussion

First, the differences observed between the N2 and P3 effects, their specificity for comprehension of facial expressions of emotion, and, more generally, their cognitive value were analyzed. The second main point of discussion concerns the cortical localization of the peaks, which showed a somewhat heterogeneous distribution. No previous study has compared the two ERP effects to analyze their differences in terms of functional significance for emotional face recognition. The data support the view that emotion discrimination occurs at the first stage of conceptual stimulus processing, and the current results indicate that emotional facial expressions induced greater activation of the posterior temporal areas, with a latency of about 200 ms from stimulus onset. Second, based on our results we can suppose a different cognitive significance for the two ERP effects: The N2 profile was specific to the emotional value of the stimulus (emotional or neutral), whereas no P3 difference between neutral and emotional expressions emerged. The N2 deflection appears more strictly related to the emotional value of faces (Streit et al., 2000), and it could represent an explicit marker of emotional facial expressions rather than a generic signal of face encoding. On the contrary, P3 might be related to the complexity of the (facial) stimulus and represent a general memory-updating function (Posamentier & Abdi, 2003), but it does not seem to be an emotion-specific index. Thus, these data suggest that the P3 effect resulted from global processing of the facial stimuli, whereas N2 was more directly related to the comprehension of the emotional value of the face. Nevertheless, since in the present study we did not directly compare different tasks (such as face identity vs. emotional identity) but only different facial expressions (i.e., emotional vs.
neutral), we cannot draw a wider conclusion about the task modulation effect on P3, and future studies should consider this direction. The cognitive significance of N2 and P3 is also pointed out by the different distribution of the ERP correlates on the scalp surface. As previously underlined, differences in localization were found for N2 and P3. The first, negative variation was heterogeneously distributed on the scalp, and a temporal (both right and left) lateralization emerged. In line with previous results, the posterior sites were much more involved in emotional facial expressions than in neutral stimuli (Sato et al., 2001). On the contrary, the positive deflection seemed to be homogeneously localized across cortical sites.

Experiment 2

In the second experiment, we adopted a model of analysis that postulated quite homogeneous cognitive processes for both subliminal and supraliminal stimulation. Specifically, we hypothesized that:
– subliminal ERPs have a component structure similar to conventional supraliminal ERPs, and
– subliminal ERP components have psychological properties similar to those of supraliminal ERPs.

In line with this model, we supposed that the wave profiles elicited by conscious and unconscious decoding might be similar with respect to their peak profiles. A third interesting point concerns the time of emergence of ERP variations as a function of consciousness level. As previously observed, delayed unaware information processing represents a distinctive feature of implicit visual perception (Bernat et al., 2001; Junghöfer, Bradley, Elbert, & Lang, 2001), and it has been interpreted as a consequence of a more complex cognitive process underlying unconscious stimulation. Nevertheless, some other studies have reported an anticipated peak variation for subthreshold stimuli compared to suprathreshold ones (Brázdil et al., 2001). Thus, this theoretical point has to be explored more exhaustively. Moreover, the cortical localization effect was analyzed. We expected a more posterior distribution of the negative peak variation (N2), as revealed by previous studies (Sato et al., 2001). In fact, according to some experimental results, emotional stimuli should elicit higher activation of visual areas, covering a broad range of the occipito-temporal cortices, relative to neutral stimuli. For this reason, we expected a quite differentiated cortical distribution of N2. On the contrary, a more heterogeneous localization has been observed for the P3 ERP effect: Whereas some studies found right temporal activity related to emotional expressions (Krolak-Salmon, Fischer, Vighetto, & Mauguière, 2001), other studies have pointed to a more central (vertex) distribution (Batty & Taylor, 2003). In addition to the localization effect for emotions, we analyzed the lateralization effect as a function of level of consciousness. Specifically, Gazzaniga (1993) suggested that the left hemisphere is crucial for consciousness, its dominance in response to conscious stimuli being reflected in the term "left-brain decoder." Nevertheless, some discrepancies from this theoretical perspective have been observed, and an opposite, right lateralization has been indicated for conscious awareness (Brázdil et al., 2001; Henke, Landis, & Markowitsch, 1993).

Method

Participants

A total of 20 subjects (different from those of Experiment 1), students of psychology at the Catholic University of Milan, took part in Experiment 2 (nine were males; age range 21–25, M = 23.12, SD = 0.29).

Material

The same facial stimuli (100 in total) and the same experimental procedure as in Experiment 1 were used.

Subliminal Stimulation

In the subliminal condition subjects saw subthreshold stimuli. Stimulus duration was 10 ms, with an ISI of 1500 ms. Subliminal stimuli in the current study met objective detection-threshold criteria (Bernat et al., 2001). On the basis of signal detection theory (SDT), above-chance detection sensitivity implies that conscious perception may have occurred; conversely, when detection sensitivity is at chance, it is unlikely that there is conscious awareness of the stimulus (Macmillan, 1986; Snodgrass, 2000). We verified the congruence of the subjective perception of subliminal stimuli with the objective threshold selected. After performing the first part of the experiment, conscious discrimination in the subthreshold condition was assessed, testing the inability to consciously report the content of the stimuli previously viewed. In a detection task, each subject was asked to distinguish the stimuli previously viewed (target, T) from a set of new stimuli (nontarget, NT), for a total of 100 stimuli (50 T, 50 NT) presented in random sequence. Detection did not differ from the chance mean of 50 (M = 47.13), and all subliminal participants lay within an expectable chance distribution, t(19) = 0.86, p = .36 (one-tailed).
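The chance-level check amounts to a one-sample t test of per-subject hit counts against the 50% chance level. A minimal sketch with hypothetical hit counts (not the authors' data):

```python
import numpy as np
from scipy import stats

# Hypothetical correct detections out of 100 trials (50 targets, 50 nontargets)
# for 20 subjects; chance performance corresponds to 50 correct responses.
hits = np.array([48, 52, 49, 51, 50, 47, 53, 50, 49, 51,
                 46, 54, 50, 48, 52, 49, 51, 50, 47, 53])

t_stat, p_val = stats.ttest_1samp(hits, 50)
# A nonsignificant result (p above the chosen alpha) is consistent with
# detection at chance, i.e., with an objective-threshold criterion for
# calling the stimulation subliminal.
```

Note that a nonsignificant test does not prove the absence of awareness; it only fails to detect it, which is why objective-threshold criteria are debated in the SDT literature the authors cite.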

Data Analysis

The same time windows as in the supraliminal condition were considered, after evaluating the morphological similarity of the two wave profiles. To allow a direct comparison of the supraliminal and subliminal conditions, a bivariate correlation between the two condition grand averages was used to describe this similarity in structure numerically, R = 0.90, p = .001. Correlations between the individual electrodes were similarly positive and sizable (Fz, R = .63; Cz, R = .90; Pz, R = .70; Oz, R = .54; F2, R = .55; F3, R = .63; T2, R = .62; T3, R = .80; P2, R = .66; P3, R = .72; O1, R = .63; O2, R = .48; p = .001 for all). These correlations indicate that positive and negative peaks in the ERPs to supraliminal and subliminal stimuli tend to occur at the same latency and to have the same form.
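The morphology comparison reduces to correlating the two grand-average time series. A minimal sketch with synthetic waveforms (our own construction, for illustration only):

```python
import numpy as np

def waveform_similarity(erp_a, erp_b):
    """Pearson r between two grand-average ERP waveforms
    sampled on the same time base."""
    return float(np.corrcoef(erp_a, erp_b)[0, 1])

# Synthetic example: an N2-like trough followed by a P3-like peak,
# and a copy delayed by 5 samples (~20 ms at 256 Hz).
t = np.arange(154)
wave = -8 * np.exp(-((t - 81) / 15.0) ** 2) + 5 * np.exp(-((t - 115) / 20.0) ** 2)
delayed = np.roll(wave, 5)
r = waveform_similarity(wave, delayed)
```

As the example shows, a waveform that reproduces the same morphology with a small delay still correlates highly, so a high R establishes structural similarity but cannot by itself rule out latency differences between conditions.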

N2 Effect

Repeated-measures ANOVAs were applied to the peak and latency dependent variables (see Experiment 1). The first analysis (Type × Site) showed significant main effects of type, F(4, 19) = 8.83, p = .001, η² = .33, and site, F(11, 19) = 11.12, p = .001, η² = .53. Specifically, N200 was higher in amplitude for emotional faces than for neutral faces. Interaction effects were not significant, except for Type × Site, F(44, 19) = 9.42, p = .001, η² = .35. Figure 3 shows the peak profiles of N2 as a function of type (emotional vs. neutral faces). As shown by the post hoc analysis, a more posterior (Pz) scalp distribution of the peak was observed, compared to the frontal (Fz), F(1, 19) = 10.12, p = .001, η² = .40, central (Cz), F(1, 19) = 8.82, p = .001, η² = .37, and occipital (Oz) sites, F(1, 19) = 7.80, p = .001, η² = .30. Figure 4 shows the scalp distribution of N2. The Type (4) × Lateralization interaction effect was then analyzed: As observed in the supraliminal condition (see Experiment 1), no significant differences were revealed for the right/left cortical sides. A third ANOVA (latency dependent variable) showed a homogeneous temporal distribution of N2 across type and site, since no main effect was significant, nor were their interactions.

Figure 4. Grand-average ERPs elicited by emotional facial expressions in Fz, Cz, Oz and Pz sites (subliminal).

P3 Effect

Two orders of data (peak and latency) were entered into two repeated-measures ANOVAs. The first analysis did not reveal significant main effects or interactions: Neither type, F(3, 19) = 1.32, p = .30, η² = .09, nor site, F(11, 19) = 1.10, p = .38, η² = .10, influenced the mean values of the peak dependent variable. No significant differences characterized the lateralization effect in a second ANOVA. Finally, the latency measure showed a trend quite similar to the peak data: Not only did type produce no temporal variation of peak emergence, but there were also no anterior/posterior differences in latency.

Supraliminal/Subliminal Comparison

The supraliminal and subliminal conditions were entered together into the statistical analysis to directly compare the condition effect. Two mixed-design ANOVAs, with Condition as between-subjects factor and Type (2: emotional vs. neutral) and Localization (4: Fz, Cz, Oz, and Pz) as within-subjects factors, were applied to the peak and latency dependent measures for each of the two ERP variations, N2 and P3.

N2 Effect

Figure 3. Grand-average ERPs (all sites) elicited by neutral and emotional facial expressions (subliminal).

For the first negative deflection, N2, the peak variable was differentiated as a function of the main effects of Type, F(1, 39) = 14.52, p = .001, η² = .47, and Localization, F(3, 39) = 8.31, p = .001, η² = .38, and of the two-way Localization × Type interaction, F(3, 39) = 6.62, p = .001, η² = .35. On the contrary, the two supraliminal/subliminal grand-averaged waves were quite similar (no significant effect of condition, F(1, 39) = 1.02, p = .39, η² = .07). No other effect or interaction was significant. Figure 5 reports the peak profiles as a function of the condition effect. The latency dependent variable did not show differences as a function of the main effects of type and localization, but only of condition, F(1, 39) = 5.83, p = .001, η² = .30. In fact, as shown in Figure 5, the N2 peak deflection was delayed in the subliminal condition compared to the supraliminal one: Whereas the subliminal N2 appeared at about 230 ms (M = 228 ms), the supraliminal ERP variation peaked at about 210 ms poststimulus.

Figure 5. Grand-average ERPs elicited by emotional facial expressions in the supraliminal and subliminal conditions.

P3 Effect

The P3 effect was then analyzed for the peak and latency measures. The first mixed-design ANOVA (Condition × Type × Localization) showed no significant main or interaction effects: The means were undifferentiated as a function of type, F(1, 39) = 1.06, p = .38, η² = .14, condition, F(1, 39) = 0.63, p = .47, η² = .06, and localization, F(3, 39) = 0.84, p = .30, η² = .10. In parallel, the latency of the positive ERP effect was not affected by the independent factors or by their interactions.

Discussion

Similar ERP effects were observed for both N2 and P3 in the supraliminal/subliminal conditions, with analogous morphological peak profiles, except for the latency of the N2 variation. Based on these results, similarities in processing between supraliminal and subliminal stimulation can also be assessed. First, substantial analogies in the subliminal and supraliminal ERP component structure were well founded, suggesting that similar neural activity is involved (Shevrin, 2001; Snodgrass, 2000). Moreover, these similarities between the supraliminal N2/P3 and their subliminal analogs suggest that they represent similar cognitive processes, regardless of the conditions of stimulation. More generally, it seems evident that information presented to a subject under subliminal conditions may be perceived and processed at a higher level even if the subject is not aware of this information (Balconi, 2003). In fact, results from studies that have examined psychophysiological responses to emotional stimuli show that such stimuli are effective both in capturing attention and in eliciting autonomic responses (Öhman & Wiens, 2003; Regan & Howard, 1995). Subliminal processes appear to have a preattentive origin, because they can be observed in response to stimuli that are prevented from reaching conscious recognition. Nevertheless, temporal retardation of the peak appears to distinguish subliminal from supraliminal information processing. Peak latencies of the corresponding deflections elicited by subliminal stimuli were clearly distinct from the latency of the supraliminal N2, since subliminally the peak emerged later. This is a major topic of our research, since the latency effect can be of interest in explaining the cognitive mechanisms underlying supraliminal versus subliminal face processing. To explain the temporal effect observed, we hypothesized that the time required to elaborate the emotional information might be conditioned by the type of stimulation. The differences found between the two experimental conditions would depend on the type of processing and, more specifically, on the level of complexity and the attentional effort of the two cognitive processes implicated. As suggested by Brázdil, Rektor, Dufek, Jurák, and Daniel (1998), who found different patterns of mutual time relations in part of their sample (in some cases delayed and in others anticipated), attention-level differences between subjects could be a valid explanation of the heterogeneous results.
As ERP provides information regarding the temporal sequence of human information processing, further analysis of the time relations between the supraliminal and the subliminal N2/P3 peak latencies might be useful.

General Discussion

The negative deflection N2 was elicited by the emotional faces and, to a lesser degree, by the neutral face. In addition, as revealed by the localization sites of the wave peaks, it has a specific cortical localization, being distributed more over the posterior than the frontal areas of the scalp. The possibility that a cognitive "cortical code" for emotion-expression recognition exists is indicated by our results on this negative ERP component, since it seems to be strictly related to the emotional value of the facial stimuli. This pattern suggests a possible dissociation between a specific visual mechanism responsible for the encoding of faces and a "higher level" mechanism for associating the facial representation with semantic information about the emotion the face represents, as previously suggested by Bruce and Young (1998). On the contrary, no differences between the emotional and neutral conditions were found for P3. Thus, our results suggest an unspecific role of P3 in emotional face processing, although it could be a decisional, response-related marker that does not depend on the semantic content of the stimulus. In order to test this interpretation of the ERP, future research should take into account the task effect, comparing, for example, different aspects of face processing (i.e., identity, familiarity of the face, etc.).

The second topic explored here is the effect of conscious/unconscious stimulation on emotional facial expression processing. On the whole, the initial two-fold question (structural and psychological similarity of subliminal and supraliminal ERP profiles) can be answered in favor of the structural similarity of the two ERPs. However, the psychological homogeneity of subliminal and supraliminal stimulation remains a main point to consider, since we can infer that subliminal components are elicited by cognitive processes similar to those occurring supraliminally. Moreover, even if the present research yielded no results in favor of a lateralization of conscious processing, the question of a possible lateralization of awareness to the right side emerges. In fact, some authors suggest a crucial role of the right hemisphere for consciousness, whereas others favor the left hemisphere. The lateralization findings for the supraliminal ERPs call attention to possible qualitative differences between conscious and unconscious affective processing, and this topic should be explored systematically in the future.

References

Adolphs, R., Tranel, D., & Damasio, A.R. (1998). The human amygdala in social judgment. Nature, 393, 470–474.
Balconi, M. (2003). Conscious and unconscious in emotional face processing. Journal of the International Neuropsychological Society, 2, 304–305.
Balconi, M., & Pozzoli, U. (2003). Face-selective processing and the effect of pleasant and unpleasant emotional expressions on ERP correlates. International Journal of Psychophysiology, 49, 67–74.
Batty, M., & Taylor, M.J. (2003). Early processing of the six basic facial emotional expressions. Cognitive Brain Research, 17, 613–620.
Bentin, S., & Deouell, L.Y. (2000). Structural encoding and identification in face processing: ERP evidence for separate mechanisms. Cognitive Neuropsychology, 17, 35–54.
Bernat, E., Bunce, S., & Shevrin, H. (2001). Event-related brain potentials differentiate positive and negative mood adjectives during both supraliminal and subliminal visual processing. International Journal of Psychophysiology, 42, 11–34.
Bernstein, L.J., Beig, S., Siegenthaler, A.L., & Grady, C.L. (2002). The effect of encoding strategy on the neural correlates of memory for faces. Neuropsychologia, 40, 86–98.
Brázdil, M., Rektor, I., Daniel, P., Dufek, M., & Jurák, P. (2001). Intracerebral event-related potentials to subthreshold target stimuli. Clinical Neurophysiology, 112, 650–661.
Brázdil, M., Rektor, I., Dufek, M., Jurák, P., & Daniel, P. (1998). Effect of subthreshold stimuli on event-related potentials. Electroencephalography and Clinical Neurophysiology, 107, 64–68.
Bruce, V., & Young, A.W. (1986). Understanding face recognition. British Journal of Psychology, 77, 305–327.
Bruce, V., & Young, A.W. (1998). A theoretical perspective for understanding face recognition. In A.W. Young (Ed.), Face and mind (pp. 96–130). Oxford: Oxford University Press.
Bunce, S., Bernat, E., Wong, P.S., & Shevrin, H. (1999). Further evidence for unconscious learning: Preliminary support for the conditioning of facial EMG to subliminal stimuli. Journal of Psychiatric Research, 33, 341–347.
Cacioppo, J.T., Crites, S.L., & Gardner, W.L. (1996). Attitudes to the right: Evaluative processing is associated with lateralized late positive event-related brain potentials. Personality and Social Psychology Bulletin, 22, 1205–1219.
Caldara, R., Thut, G., Servoir, P., Michel, C.M., Bovet, P., & Renault, B. (2003). Face versus nonface object perception and the "other-race" effect: A spatio-temporal event-related potential study. Clinical Neurophysiology, 114, 515–528.
Carretié, L., & Iglesias, J. (1995). An ERP study on the specificity of facial expression processing. International Journal of Psychophysiology, 19, 183–192.
Chapman, R.M., McCrary, J.W., Chapman, J.A., & Martin, J.K. (1980). Behavioral and neural analysis of connotative meaning: Word classes and rating scales. Brain and Language, 11, 319–339.
Eimer, M., & McCarthy, R.A. (1999). Prosopagnosia and structural encoding of faces: Evidence from event-related potentials. Neuroreport, 10, 255–259.
Ekman, P. (1993). Facial expression and emotion. American Psychologist, 48, 384–392.
Ekman, P., & Friesen, W.V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Gazzaniga, M.S. (1993). Brain mechanisms and conscious experience. In Experimental and theoretical studies of consciousness (CIBA Foundation Symposium 174) (pp. 247–262). Chichester: Wiley.
Grelotti, D.J., Gauthier, I., & Schultz, R.T. (2002). Social interest and the development of cortical face specialization: What autism teaches us about face processing. Developmental Psychobiology, 40, 213–225.
Gur, R.C., Schroeder, T., Turner, T., McGrath, R.M., & Chan, L.I. (2002). Brain activation during facial emotion processing. Neuroimage, 16, 651–662.
Haxby, J.V., Hoffman, E.A., & Gobbini, I.M. (2000). The distributed human neural system for face perception. Trends in Cognitive Sciences, 4, 223–233.
Henke, K., Landis, T., & Markowitsch, H.J. (1993). Subliminal perception of pictures in the right hemisphere. Consciousness and Cognition, 2, 225–236.
Herrmann, M.J., Aranda, D., Ellgring, H., Mueller, T.J., Strik, W.K., Heidrich, A., et al. (2002). Face-specific event-related potential in humans is independent from facial expression. International Journal of Psychophysiology, 45, 241–244.
Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174–184.
Iragui, V.J., Kutas, M., Mitchiner, M.R., & Hillyard, S.A. (1993). Effect of aging on event-related brain potentials and reaction times in an auditory oddball task. Psychophysiology, 30, 10–22.
Jasper, H.H. (1958). The 10-20 electrode system of the International Federation. Electroencephalography and Clinical Neurophysiology, 10, 371–375.
Johnston, V.S., Miller, D.R., & Burleson, M.H. (1986). Multiple P3s to emotional stimuli and their theoretical significance. Psychophysiology, 23, 684–693.
Junghöfer, M., Bradley, M.M., Elbert, T.R., & Lang, P.J. (2001). Fleeting images: A new look at early emotion discrimination. Psychophysiology, 38, 175–178.
Kanwisher, N., McDermott, J., & Chun, M.M. (1997). The fusiform face area: A module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17, 4302–4311.
Kayser, J., Tenke, C., Nordby, H., Hammerborg, D., Hugdahl, K., & Erdmann, G. (1997). Event-related potential (ERP) asymmetries to emotional stimuli in a visual half-field paradigm. Psychophysiology, 34, 414–426.
Krolak-Salmon, P., Fischer, C., Vighetto, A., & Mauguière, F. (2001). Processing of facial emotional expression: Spatio-temporal data as assessed by scalp event-related potentials. European Journal of Neuroscience, 13, 987–994.
LeDoux, J.E. (1996). The emotional brain: The mysterious underpinnings of emotional life. New York: Simon & Schuster.
Macmillan, N. (1986). The psychophysics of subliminal perception. Behavioral and Brain Sciences, 9, 38–39.
Marinkovic, K., & Halgren, E. (1998). Human brain potentials related to the emotional expression, repetition, and gender of faces. Psychobiology, 26, 348–356.
Öhman, A., & Wiens, S. (2003). On the automaticity of autonomic responses in emotion: An evolutionary perspective. In R.J. Davidson, K.R. Scherer, & H.H. Goldsmith (Eds.), Handbook of affective sciences (pp. 256–276). New York: Oxford University Press.
Pizzagalli, D., Koenig, T., Regard, M., & Lehmann, D. (1999). Affective attitudes to face images associated with intracerebral EEG source location before face viewing. Brain Research, 7, 371–377.
Posamentier, M.T., & Abdi, H. (2003). Processing faces and facial expressions. Neuropsychology Review, 13, 113–143.
Regan, M., & Howard, R. (1995). Fear conditioning, preparedness, and the contingent negative variation. Psychophysiology, 32, 208–214.
Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2000). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. Neuroreport, 12, 709–714.
Shevrin, H. (2001). Event-related markers of unconscious processes. International Journal of Psychophysiology, 42, 209–218.
Shevrin, H., Bond, J.A., Brakel, L.A.W., Hertel, R.K., & Williams, W.J. (1996). Conscious and unconscious processes: Psychodynamic, cognitive, and neurophysiological convergences. New York: Guilford.
Shevrin, H., & Fritzler, D. (1968). Visual evoked response correlates of unconscious mental processes. Science, 161, 295–298.
Skrandies, W., & Weber, P. (1996). Dimensionality of semantic meaning and segments of evoked potential field. In C. Ogura, Y. Koga, & M. Shimokochi (Eds.), Recent advances in event-related brain potential research: Proceedings of the 11th International Conference on Event-Related Potentials (EPIC) (pp. 113–118). New York: Elsevier.
Snodgrass, J.M. (2000). Unconscious perception: Theory, method, and evidence. Amsterdam: John Benjamins.
Streit, M., Wölwer, W., Brinkmeyer, J., Ihl, R., & Gaebel, W. (2000). Electrophysiological correlates of emotional and structural face processing in humans. Neuroscience Letters, 278, 13–16.
Vanderploeg, R.D., Brown, W.S., & Marsh, J.T. (1987). Judgments of emotion in words and faces: ERP correlates. International Journal of Psychophysiology, 5, 193–205.
Wong, P.S., Shevrin, H., & Williams, W.J. (1994). Conscious and nonconscious processes: An ERP index of an anticipatory response in a conditioning paradigm using visual masked stimuli. Psychophysiology, 31, 87–101.
Yee, C.M., & Miller, G.A. (1987). Affective valence and information processing. In R. Johnson, Jr., J.W. Rohrbaugh, & R. Parasuraman (Eds.), Current trends in event-related potential research (EEG Supplement 40) (pp. 300–307). Amsterdam: Elsevier.

Accepted for publication: April 20, 2007

Michela Balconi
Department of Psychology
Catholic University of Milan
L.go Gemelli, 1
I-20123 Milan
Italy
Tel. +39 02 7234-2586
Fax +39 02 7234-2280
E-mail [email protected]
