How do eye gaze and facial expression interact?

VISUAL COGNITION, 2008, 16 (6), 708–733

How do eye gaze and facial expression interact?

Markus Bindemann and A. Mike Burton
Department of Psychology, University of Glasgow, UK

Stephen R. H. Langton
Department of Psychology, University of Stirling, UK

Previous research has demonstrated an interaction between eye gaze and selected facial emotional expressions, whereby the perception of anger and happiness is impaired when the eyes are horizontally averted within a face, but the perception of fear and sadness is enhanced under the same conditions. The current study reexamined these claims over six experiments. In the first three experiments, the categorization of happy and sad expressions (Experiments 1 and 2) and angry and fearful expressions (Experiment 3) was impaired when eye gaze was averted, in comparison to direct gaze conditions. Experiment 4 replicated these findings in a rating task, which combined all four expressions within the same design. Experiments 5 and 6 then showed that previous findings, that the perception of selected expressions is enhanced under averted gaze, are stimulus- and task-bound. The results are discussed in relation to research on facial expression processing and visual attention.

In recent years, the perception of eye gaze direction has emerged as a critical component of face processing, particularly in tasks requiring visual attention. For example, Senju and Hasegawa (2005) showed that the detection of a peripheral target is delayed when observers are first required to fixate a central face that is looking directly at them, compared to faces with averted or closed eyes. Thus, direct eye gaze appears to capture or to hold attention on a face (see also Bindemann, Burton, Hooge, Jenkins, & de Haan, 2005).
Averted eyes, on the other hand, are capable of rapidly shifting an observer's visual attention, leading to faster target classification in the direction of a seen gaze than when the target appears in the opposite direction (e.g., Driver et al., 1999; Friesen & Kingstone, 1998; Friesen, Moore, & Kingstone, 2005; Kingstone, Tipper, Ristic, & Ngan, 2004; Langton & Bruce, 1999; Langton, Watt, & Bruce, 2000).

Please address all correspondence to Markus Bindemann, Department of Psychology, University of Glasgow, Glasgow G12 8QQ, UK. E-mail: markus@psy.gla.ac.uk. This work was supported by a Wellcome Grant (GR072308) to Mike Burton, Stefan R. Schweinberger and Stephen Langton. © 2007 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business. http://www.psypress.com/viscog DOI: 10.1080/13506280701269318

A functional distinction between direct and averted gaze is also evident in tasks requiring sex judgements to faces and in face identification. Thus, observers are faster to classify faces as male or female when they are looking straight at the viewer than when eye gaze is averted (Macrae, Hood, Milne, Rowe, & Mason, 2002). Similarly, recognition memory is better for faces with direct eye gaze relative to faces displaying averted gaze. In fact, this advantage is evident in adults and children (Hood, Macrae, Cole-Davies, & Dias, 2003), both during face encoding and at face retrieval (Hood et al., 2003; Mason, Hood, & Macrae, 2004). Taken together, these findings indicate that face processing is generally less efficient when eye gaze is averted.

However, there may be circumstances when there is an advantage for averted eye gaze during face processing. Adams and Kleck (2003) asked participants to categorize angry- and fearful-looking full-face images, in which the eye gaze was either directed straight at the viewer or horizontally averted. Angry faces showed the typical influence of gaze direction, with slower responses to faces with averted eye gaze. Intriguingly, for fearful expressions the opposite pattern was found. Thus, responses were slower to fearful faces displaying direct gaze than to the same fearful faces in which the eyes were averted. This remarkable pattern was replicated in a second experiment with happy and sad facial expressions, so that sad faces were classified faster with averted gaze and happy faces were classified faster when eye gaze was direct. These findings suggest that eye gaze selectively affects the perception of facial expression in a reaction time paradigm.

Moreover, these effects persisted in a subsequent study, in which participants attributed emotional expressions to neutral faces, ambiguous expression morphs, and prototypical expressions in nonspeeded ratings tasks (Adams & Kleck, 2005). However, Adams and Kleck's (2003) finding with a speeded task in particular raises the intriguing possibility that eye gaze and expression information are integrated into a single percept at an early visual processing stage, suggesting "a process more primary than that found at the decision-making level" (Adams & Kleck, 2005, p. 10). And yet, despite this evidence, it is difficult to envisage a perceptual system in which averted eye gaze rapidly shifts an observer's attention (e.g., Friesen & Kingstone, 1998; Friesen et al., 2005) and consistently impairs the processing of sex and identity-related face information (Hood et al., 2003; Macrae et al., 2002; Mason et al., 2004), but facilitates the processing of selected emotional expressions. Indeed, if eye gaze and expression are perceptually intertwined in this manner at primary processing stages, then one might expect in return that the perception of eye gaze direction is also affected by the emotional expression on a face. Contrary to this prediction, gaze cueing appears unaffected by facial expression (Hietanen & Leppänen, 2003). For these reasons, it is important to provide further evidence for interactions between eye gaze and the perception of facial expression. This was the aim of the present study.

EXPERIMENT 1

The aim of the first experiment was to replicate Adams and Kleck's (2003) finding with happy and sad facial expressions in a speeded categorization task. Subjects responded to the emotional expression of a central target, which was shown in full-face format, and with either direct gaze or with the eyes averted horizontally. According to Adams and Kleck, responses should be faster to happy faces with direct gaze than to the same faces with averted gaze, and the reverse pattern should be found for sad expressions.

Method

Subjects. Twenty-two students from the University of Glasgow participated in the experiment for a small fee or for course credits. All had normal or corrected-to-normal vision.

Apparatus and stimuli. A Macintosh computer was used to present stimuli and record responses using PsyScope 1.2.5 software. Viewing distance was fixed at 60 cm by a chinrest. A set of 30 faces was used as stimuli. This was derived from 30 photographs, six of each of five male Caucasian models showing happy and sad expressions, and posing with direct, left-averted, and right-averted eye gaze. All faces were cropped to remove extraneous background, converted to greyscale, and measured maximally 5.8 cm × 7.0 cm (5.68° × 6.88° of visual angle at a distance of 60 cm). Prior to the experiment, eight participants verified these facial expressions in a forced-choice design including the same number of fearful, angry, neutral, and surprised faces. On average, happy expressions were identified correctly on 100% of trials and sad expressions on 69% of trials, and these two expressions were never confused with each other (see Figure 1 for examples).
This is considerably better than chance performance (17%) and consistent with previous categorization tasks (e.g., Calder et al., 1996; Ekman & Friesen, 1976).

Procedure. A trial began with a central fixation cross for 750 ms, followed by a face stimulus, which was displayed until response. Participants made speeded judgements concerning the emotional expression of the target face (i.e., happy vs. sad). Responses were made with the right hand, using the numeric button pad on the right side of a standard computer keyboard, pressing the "3" key for happy and the "." key for sad faces. Throughout the experiment, happy and sad expressions, and averted and direct eye gaze, occurred with equal frequency. Participants completed 20 practice trials and three experimental blocks of 80 randomly ordered trials, and could take short breaks between blocks.

Figure 1. Example stimuli from Experiment 1 showing happy (a) and sad (b) facial expressions with averted and direct eye gaze.

Results and discussion

Table 1 shows the means of the median correct RTs and percentage errors for all conditions.

TABLE 1
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates as a function of experimental condition in Experiment 1

                      RTs              SE            % errors
Expression      Direct  Averted  Direct  Averted  Direct  Averted
Happy             515     527      13      17        4       6
Sad               560     567      19      19        5       4

A 2 (direct vs. averted eye gaze) × 2 (happy vs. sad expression) within-subject analysis of variance (ANOVA) of the RT data showed a main effect of gaze, F(1, 21) = 4.63, p < .05, due to faster responses to faces with direct gaze than to faces with averted gaze. A main effect of expression was also found, F(1, 21) = 29.98, p < .01, reflecting faster responses for happy faces. However, the interaction between gaze and expression was not significant, F(1, 21) < 1.

Error rates were analysed in the same way as the RT data. Percentage errors to happy expressions were marginally higher in the averted gaze than in the direct gaze condition, but the reverse trend was found for sad expressions. ANOVA showed no main effect of gaze, F(1, 21) < 1, or expression, F(1, 21) < 1, but an almost significant interaction between these factors, F(1, 21) = 3.82, p = .06. To discount any possibility that the differences between RTs and errors might reflect a speed-accuracy tradeoff, inverse efficiency scores were calculated to provide a combined measure of RTs and accuracy (Townsend & Ashby, 1983). These were: happy direct-gaze 544 ms, averted-gaze 567 ms; sad direct-gaze 584 ms, averted-gaze 588 ms. Consistent with the RT data, analysis of the inverse efficiency data showed main effects of gaze, F(1, 21) = 8.1, p < .01, and expression, F(1, 21) = 12.18, p < .01, but the gaze × expression interaction was not significant, F(1, 21) = 3.12, p = .10.¹

In this experiment, response times were consistently slower to faces with averted eye gaze, both for happy and sad emotional expressions. This contradicts Adams and Kleck's (2003) finding of an interaction between these expressions and eye gaze direction in a speeded categorization task. There is, however, a possible confound in this experiment, as there were twice as many photographs with averted eye gaze (i.e., left and right gaze stimuli) as with direct gaze. An advantage of this manipulation is that, unlike Adams and Kleck's faces, in which gaze direction was manipulated digitally, the present stimuli reflect natural variations in eye gaze direction. However, there is a corresponding disadvantage, in that our stimuli may vary slightly in ways other than gaze, even within the same identity. Increasing the number of averted-gaze faces relative to direct-gaze faces may thus result in a slight increase in expression information in the former conditions. If this results in an increase in task difficulty, then one might predict slower expression decisions when the eyes are averted, for reasons relating to the experimental design, rather than reflecting a direct effect of eye gaze.
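The inverse efficiency correction used above simply divides each condition's RT by its proportion of correct responses, so a condition that buys speed with errors is penalized (Townsend & Ashby, 1983). A minimal sketch with invented trial data; the study's own per-subject computation may differ in detail:

```python
# Inverse efficiency (IE) = median correct RT / proportion correct.
# Trial data below are invented for illustration only.
from statistics import median

def inverse_efficiency(trials):
    """trials: list of (rt_ms, correct) pairs for one condition."""
    correct_rts = [rt for rt, ok in trials if ok]
    accuracy = sum(ok for _, ok in trials) / len(trials)
    return median(correct_rts) / accuracy

# A slower-but-accurate condition vs. a faster-but-error-prone one:
cond_a = [(520, True), (540, True), (510, True), (530, False)]
cond_b = [(480, True), (470, False), (490, True), (475, False)]
print(round(inverse_efficiency(cond_a)))  # 520 / 0.75 = 693
print(round(inverse_efficiency(cond_b)))  # 485 / 0.50 = 970
```

On these invented numbers the nominally faster condition comes out worse once its errors are counted, which is exactly the speed-accuracy pattern the correction is designed to expose.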
This was addressed in the next experiment.

EXPERIMENT 2

In Experiment 2, happy and sad faces from the Ekman and Friesen (1976) series were used as stimuli. This is a photographic set of prototypical full-face expressions with direct eye gaze, which was digitally edited to produce a complementary set with left-averted and right-averted eyes. In this way, the emotional content of the face targets was preserved across the experimental conditions, except for the critical eye gaze information.

¹ Inverse efficiency scores were calculated for all experiments (except Experiment 4, which did not produce error data). Analysis of these scores consistently produced the same outcome as reaction time analysis and is not reported further.

Method

Subjects. Thirty new students from the University of Glasgow participated in the experiment for a small fee or for course credits. All had normal or corrected-to-normal vision.

Stimuli and procedure. The stimuli and procedure were the same as those in Experiment 1, except for the following changes. Happy and sad pictures of five male models from the Ekman and Friesen (1976) series were used to create a new stimulus set of 30 faces. This set comprised the 10 original images, all of which displayed direct eye gaze, and 20 new images of these faces in which the pupils were digitally moved to the left and right side of both eyes (for example stimuli see Figure 2). As before, these images were cropped to remove any extraneous background and measured maximally 5.8 cm × 7.0 cm. Participants completed an example block of 20 trials and three experimental blocks of 80 randomly intermixed trials.

Results and discussion

The means of the median correct RTs and percentage errors are shown in Table 2 as a function of experimental condition. As before, a 2 (direct vs. averted gaze) × 2 (happy vs. sad expression) ANOVA of the RTs showed a main effect of gaze, F(1, 29) = 7.06, p < .05, reflecting faster responses to faces with direct eye gaze than with averted eye gaze, and a main effect of expression, F(1, 29) = 14.97, p < .01, showing faster responses to happy than to sad faces. No interaction between these factors was found, F(1, 29) < 1. Error rates were analysed in the same way as the RT data. Averted gaze resulted in a slight increase in errors for happy faces and a slight decrease for sad faces.
ANOVA showed no main effect of gaze, F(1, 29) < 1, or of expression, F(1, 29) = 2.90, and no interaction between these factors, F(1, 29) = 1.71.

Experiment 2 replicates the important aspects of Experiment 1 with a stimulus set that was designed to minimize variation in expression information across the eye gaze conditions. Thus, participants responded more slowly to faces with horizontally averted gaze relative to the direct gaze conditions, independent of the displayed facial expression.
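The RT summaries in these experiments are means of median correct RTs: a median is taken over each subject's correct trials in a condition (robust to occasional outlier responses), and those medians are then averaged across subjects. A sketch of this two-step aggregation, with invented per-subject data:

```python
# Mean of per-subject median RTs: the summary statistic used in the tables.
# The per-subject RT lists below are invented for illustration.
from statistics import mean, median

def mean_of_medians(rts_by_subject):
    """rts_by_subject: one list of correct RTs (ms) per subject."""
    return mean(median(rts) for rts in rts_by_subject)

direct  = [[500, 520, 515], [540, 530, 560], [505, 495, 510]]
averted = [[515, 530, 540], [555, 545, 570], [520, 505, 525]]
print(mean_of_medians(direct), mean_of_medians(averted))
```

Taking the median within subject before averaging keeps a single very slow trial from dominating a subject's score, which is why this scheme is common in speeded categorization tasks.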

Figure 2. Example faces for Experiments 2–4 showing happy (a), sad (b), angry (c), and fearful (d) expressions with averted and direct gaze.

EXPERIMENT 3

Experiments 1 and 2 show an increase in response times to both happy and sad expressions when the eyes are horizontally averted in these face stimuli.

TABLE 2
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates as a function of experimental condition in Experiment 2

                      RTs              SE            % errors
Expression      Direct  Averted  Direct  Averted  Direct  Averted
Happy             539     550      12      14        5       6
Sad               572     586      15      18        7       6

The next experiment examines whether these findings can be extended to angry- and fearful-looking faces, which Adams and Kleck (2003) used as an analogue for happy and sad expressions. Similar to the latter expressions, they found that the perception of anger (like happiness) was slowed by averted gaze, whereas fear (like sadness) was categorized faster in this condition. However, these effects were particularly pronounced for angry and fearful stimuli. Despite the absence of an interaction between eye gaze and expression in the previous two experiments, it is therefore possible that Adams and Kleck's findings might persist with angry and fearful faces in Experiment 3.

Method

Subjects. Twenty-four students from the University of Glasgow participated for a small fee or course credits. All reported normal or corrected-to-normal vision.

Stimuli and procedure. The stimuli and procedure were the same as in previous experiments, except as follows. The happy and sad stimuli were replaced with an equivalently prepared set of angry and fearful faces of the same male identities from the Ekman and Friesen (1976) series (see Figure 2 for examples). Participants now made speeded judgements concerning whether the target face was carrying an angry or a fearful expression, using the same two-choice keypress response ("3" vs. ".") of previous experiments. Participants completed 20 practice trials and three experimental blocks of 80 randomly ordered trials.

Results and discussion

Table 3 shows the means of the median correct RTs and percentage errors for all conditions. A 2 (direct vs. averted gaze) × 2 (angry vs. fearful expression) ANOVA of the RTs showed a main effect of gaze, F(1, 23) = 6.89, p < .05, reflecting faster responses to faces with direct eye gaze, and a main effect of expression, F(1, 23) = 4.97, p < .05, reflecting slower responses to fearful than to angry facial expressions. However, as for Experiments 1 and 2, no interaction between gaze and expression was found, F(1, 23) = 1.14. Analogous analysis of the error data revealed a main effect of gaze, F(1, 23) = 5.15, p < .05, reflecting a slight increase in errors for both angry faces and fearful faces when eye gaze was averted. No main effect of expression, F(1, 23) < 1, and no interaction between these factors was found, F(1, 23) = 1.04.

These results converge with the pattern found in the preceding experiments. That is, response times to happy and sad faces (Experiments 1 and 2) and angry and fearful faces (Experiment 3) were consistently slower for full-face targets looking to the side than for targets in which the eyes were directed at the viewer.

EXPERIMENT 4

None of the previous experiments have reproduced Adams and Kleck's (2003) findings with emotional expression and eye gaze. However, each of these experiments compared only two expressions with a limited set of stimuli, which may have enabled observers to utilize superficial cues for classification. For example, merely detecting the teeth in a smile may have been sufficient to distinguish happy and sad expressions, without requiring any further face processing beyond this simple cue. Consequently, it is conceivable that eye gaze affects expression judgements in a different way when the target faces are processed in greater detail. Experiment 4 explored this possibility with a speeded rating task, in which participants judged the perceived intensity (e.g., angry vs. very angry) of all four expressions. It was anticipated that this would require finer discriminations within facial expressions, perhaps revealing differential effects of eye gaze on emotional expressions where none were previously found.
TABLE 3
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates as a function of experimental condition in Experiment 3

                      RTs              SE            % errors
Expression      Direct  Averted  Direct  Averted  Direct  Averted
Angry             679     704      23      24        9      12
Fearful           720     732      29      31       11      12
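The gaze main effects in these 2 × 2 analyses each have a single degree of freedom, and a one-degree-of-freedom within-subject effect is equivalent to a paired t-test on the per-subject condition means, with F(1, n − 1) = t². A sketch of that equivalence as a quick sanity check; the per-subject RTs below are invented, not the study's data:

```python
# One-df within-subject effect via a paired t-test: F(1, n-1) = t^2.
# Per-subject mean RTs (ms) below are invented for illustration.
from math import sqrt
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t statistic for two matched samples."""
    diffs = [a - b for a, b in zip(x, y)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

direct  = [515, 540, 500, 525, 530]
averted = [530, 548, 512, 540, 535]
t = paired_t(direct, averted)
print(round(t * t, 2))  # this squared t is the F(1, 4) for the gaze effect
```

The negative t simply reflects the direction of the difference (direct faster than averted); F is its square either way.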

EYE GAZE AND EXPRESSION 717 Method Subject. Twenty students from the University of Glasgow participated for a small fee or for course credits. All reported normal or corrected to normal vision. Stimuli and procedure. The stimuli consisted of the face sets that were used in Experiments 2 and 3, but rather than categorizing these faces according to their emotional expression, subjects were now asked to rate them according to their intensity on a 3-point rating scale (where 1not very angry/fearful/happy/sad, 3very angry/fearful/happy/sad, and 2 somewhere in between). Each trial began with a fixation cross for 750 ms, and was followed by a face that was displayed until a response was made. Subjects responded with the 1, 2, and 3 keys on the button pad of a standard computer keyboard, and were instructed to make use of the full range of intensity ratings. Subjects were told to respond as accurately as possible, but within approximately 2 s of stimulus onset, and the next trial was initiated if no response was obtained within 2.5 s of target presentation. Participants completed 40 practice trials and four experimental blocks of 80 randomly ordered trials. Results and discussion Participants failed to register a response on less than 1% of all trials. The means of the median RTs and intensity ratings are shown in Table 4 as a function of experimental condition. A 4 (angry, fearful, happy, sad)2 (averted vs. direct gaze) ANOVA of the RT data showed a main effect of expression, F(3, 57)42.92, pb.01, reflecting faster responses to happy faces than for each of the other expressions (happy vs. angry, happy vs. fearful, happy vs. 
sad, all Tukey TABLE 4 Mean reaction times (RTs, in ms), standard errors (SE), and intensity ratings as a function of experimental condition in Experiment 4 Type of emotional expression RTs SE Intensity ratings Direct gaze Averted gaze Direct gaze Averted gaze Direct gaze Averted gaze Happy 688 705 23 27 2.77 2.77 Sad 855 893 30 34 1.45 1.52 Angry 868 902 31 36 2.10 2.11 Fearful 846 940 31 37 2.37 2.15

718 BINDEMANN, BURTON, LANGTON HSD, pb.01). As before, a main effect of gaze was also found, F(1, 19) 34.01, pb.01, reflecting faster responses to faces with direct than with averted eye gaze. Unlike previous experiments, the main effect of gaze was qualified by an interaction with expression, F(3, 57)4.98, pb.01. Simple main effect analysis revealed significant gaze effects for angry faces, F(1, 19)4.66, pb.05, fearful faces, F(1, 19)35.79, pb.01, and sad faces, F(1, 19)6.05, pb.05, but not for happy faces, F(1, 19)1.13. For the intensity ratings, ANOVA showed a main effect of expression, F(3, 57)126.51, pb.01, reflecting significant differences between each of the facial expressions (all Tukey HSD, pb.01), except anger and fear. The main effect of gaze did not reach significance, F(1, 19)3.01, p B.10, but an interaction between expression and gaze was found, F(3, 57)17.28, pb.01. As Table 4 suggests, significant effects of eye gaze were found only for fearful faces, Tukey HSD, pb.01. Experiment 4 examined how eye gaze affects expressions judgements when participants are required to process facial expression in greater detail than in previous experiments, in an intensity rating task. Importantly, although this task did not require explicit identification of the emotional expressions, participants nonetheless responded to the individual expressions, as can be seen from the intensity ratings across these conditions. However, only fearful faces also showed an effect of eye gaze, with more fearful ratings in the direct gaze than in the averted gaze condition. Participants response times*the data of primary interest*were slower than in previous experiments. However, this appeared to have little effect on the relation between eye gaze and expression processing, as responses were again significantly slower in the averted gaze than in the direct gaze conditions during the classification of angry, fearful, and sad faces. 
As before, the same pattern was also found for happy expressions, although this difference did not reach significance in this experiment. EXPERIMENT 5 So far, all of the experiments have failed to replicate Adams and Kleck s (2003) findings with emotional expression and eye gaze, despite several experimental manipulations. These included facial stimuli displaying naturally averted eye gaze (Experiment 1) and digitally averted eye gaze (Experiments 24), our own set of posed expressions (Experiment 1) and prototypical expressions (Experiments 24), a two-choice categorization task (Experiments 13), and a rating task, which combined all four facial expressions within the same design (Experiment 4). However, none of these experiments have employed the same design and the same stimuli as Adams and Kleck (2003). This was addressed in the next experiment, which

EYE GAZE AND EXPRESSION 719

employed Adams and Kleck's set of angry and fearful faces in a two-choice expression task.

Method

Subjects. Eighteen new subjects, all students from the University of Glasgow, participated in the experiment for a small fee or course credits. All reported normal or corrected-to-normal vision.

Stimuli. The stimuli were identical to Adams and Kleck (2003, Exp. 1), and consisted of 15 male and 15 female faces, which displayed angry and fearful facial expressions, and direct gaze, left-averted gaze, and right-averted gaze. These stimuli were sourced from face sets developed by Beaupré, Cheung, and Hess (2000), Ekman and Friesen (1976), and Kirouac and Doré (1984), and some of Adams and Kleck's (2003) own faces. All faces were of Caucasian descent and depicted with direct eye gaze, which was digitally edited by Adams and Kleck to produce complementary sets for the averted gaze conditions. In the experiment, averted-gaze faces were only presented once, but each direct-gaze face was presented twice to provide an equal number of stimuli for each level of eye gaze. This resulted in a total of 240 experimental trials. As in Adams and Kleck (2003), the experiment also included 64 trials of face blends, which were intermingled with the 240 pure expression trials during the experiment. Blends were created by morphing angry and fearful expressions of 16 face identities in equal proportions. As for the pure expressions, blended faces were presented with direct, left-averted, and right-averted gaze, but direct-gaze blends were presented twice to balance the design.

Procedure. Participants made speeded judgements concerning whether the target face was carrying an angry or a fearful expression, using a two-choice keypress response (the "3" vs. "." keys). A trial consisted of a fixation cross for 750 ms, followed by a face target, which was displayed until a response was made.
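The trial balancing described above (averted-gaze faces shown once, direct-gaze faces shown twice) can be sketched as follows. This is a minimal illustration, not the authors' code; the numeric identity labels are placeholders:

```python
from itertools import product

def build_pure_expression_trials(n_identities=30, expressions=("angry", "fearful")):
    # Each identity/expression pairing contributes two averted-gaze trials
    # (left, right) and two direct-gaze trials (the same direct-gaze image
    # repeated), so direct and averted gaze occur equally often overall.
    trials = []
    for identity, expression in product(range(n_identities), expressions):
        trials.append((identity, expression, "left-averted"))
        trials.append((identity, expression, "right-averted"))
        trials.extend([(identity, expression, "direct")] * 2)
    return trials

trials = build_pure_expression_trials()
# 30 identities x 2 expressions x 4 presentations = 240 pure-expression trials
```

This balancing ensures that direct and averted gaze are equally frequent, so neither gaze level is predictive of the correct expression response by base rate alone.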
As in Adams and Kleck (2003), participants were not given practice trials, and completed 304 randomly intermixed experimental trials.

Results and discussion

Pure expressions. Table 5a shows the means of the median correct RTs and percentage errors for the pure expression conditions. A 2 (direct vs. averted gaze) × 2 (angry vs. fearful expression) ANOVA of these RT data revealed a main effect of expression, F(1, 17) = 10.45, p < .01, reflecting faster categorization for angry faces. A main effect of gaze was not found, F(1, 17) < 1, but an interaction between gaze and expression, F(1, 17) = 11.68,

p < .01. Analysis of simple main effects revealed significant effects of gaze for angry expressions, F(1, 17) = 5.08, p < .05, with faster responses to angry faces with direct gaze, and the opposite pattern for fearful faces, F(1, 17) = 12.43, p < .01. In contrast to RTs, the error data showed no main effect of expression, F(1, 17) < 1, but a main effect of gaze, F(1, 17) = 8.69, p < .01, due to overall higher errors in the averted gaze conditions. This effect was qualified by an interaction between gaze and expression, F(1, 17) = 20.24, p < .01, reflecting higher errors for angry faces with averted gaze, compared to the direct gaze condition, F(1, 17) = 37.43, p < .01, and the reverse trend for fearful expressions, F(1, 17) = 3.81, p = .07 (see Table 5a).

Blended expressions. Response times to the ambiguous face blends showed a similar pattern to the pure expressions, with slower responses to direct gaze blends when they were classified as fearful than to their averted-gaze counterparts, and the opposite response pattern for blends that were perceived as angry (see Table 5b). ANOVA revealed a main effect of gaze, F(1, 17) = 4.57, p < .05, due to overall higher RTs to direct gaze faces, but showed no main effect of expression, F(1, 17) < 1, and no interaction between gaze and expression, F(1, 17) = 2.63, p = .12. In addition to the RT data, expression blends with averted gaze were given more "fear" labels than blends with direct gaze (binomial test, N = 15/18, p < .01), and, conversely, were given more "angry" labels when displaying direct gaze than when displaying averted gaze (binomial test, N = 15/18, p < .01).

TABLE 5a
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates for pure emotional expressions in Experiment 5

                    RTs              SE            % errors
Expression    Direct  Averted  Direct  Averted  Direct  Averted
Angry           843     868      37      40       10      17
Fearful         952     913      56      49       15      13
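The binomial (sign) tests reported for the blend labels can be reproduced with an exact calculation. This is a sketch assuming a two-sided test against chance (p = .5), the conventional analysis for such counts, with 15 of 18 cases in the predicted direction:

```python
from math import comb

def binomial_two_sided(k, n):
    # Exact two-sided binomial test at the symmetric chance level p = .5:
    # double the probability of observing k or more "successes" out of n.
    upper_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * upper_tail)

# 15 of 18 showed more "fear" labels for averted-gaze than direct-gaze blends.
p_value = binomial_two_sided(15, 18)
```

The resulting p-value is well below .01, consistent with the significance level reported above.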
Experiment 5 fails to replicate the findings of the four previous experiments, which showed that expression processing is generally impaired when eye gaze is averted, independent of the displayed facial expression. In contrast, this experiment reveals an interaction between eye gaze and expression processing, which indicates that the perception of angry expressions is weakened by averted eye gaze, whereas the perception of fear is actually enhanced under the same conditions. Experiment 5 thus reproduces Adams and Kleck's (2003) findings, with an identical design and their own set of facial expressions.
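The simple main effect tests used throughout these experiments are single-df contrasts between the direct and averted gaze conditions, which are equivalent to paired t-tests on per-participant means, since F(1, n-1) = t². A sketch with made-up RTs, not the authors' data:

```python
import math
from statistics import mean, stdev

def simple_main_effect_f(direct_rts, averted_rts):
    # Paired t-test on per-participant condition means; squaring t gives
    # the equivalent F ratio with 1 and n-1 degrees of freedom.
    diffs = [a - d for d, a in zip(direct_rts, averted_rts)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t ** 2, (1, n - 1)

# Hypothetical data for six participants (ms), mimicking the direct-gaze
# advantage for angry faces in Table 5a.
direct = [840, 850, 830, 860, 845, 835]
averted = [870, 880, 850, 900, 865, 860]
F, df = simple_main_effect_f(direct, averted)
```

With these illustrative numbers every participant responds faster to direct gaze, so the contrast is large; the reported analyses follow the same logic on the real condition means.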

TABLE 5b
Mean reaction times (RTs, in ms) and standard errors (SE) for blended emotional expressions in Experiment 5, and the average number of responses (no. of labels) that were made to the blends as a function of expression

                    RTs              SE           No. of labels
Label         Direct  Averted  Direct  Averted  Direct  Averted
Angry          1094    1115      85     113       17      11
Fearful        1226     997     184      87       15      21

And yet, this discrepancy between the first four experiments and Experiment 5 is puzzling. Experiments 1-4 employed several different stimulus sets, with posed and digitally averted eye gaze, and replicated the same response pattern across different tasks. Similarly, Experiment 5 supports the validity of Adams and Kleck's (2003) findings, by producing an orthogonal interaction in eye gaze and expression processing. On closer inspection, however, overall response times and error rates were noticeably higher in this experiment than in Experiment 3 (cf. Tables 3 and 5a; see also Tables 1 and 2), which compared the same two emotional expressions in a similar design. One possibility is that these differences simply reflect the larger, and hence perhaps more varied, stimulus set of Experiment 5. However, anger and fear belong to a set of prototypical expressions (see, e.g., Ekman & Friesen, 1976), and as long as reasonable set sizes are used to rule out image-based explanations (as was the case in Experiments 1-4), these primary expressions should be classified reliably and relatively independently of set size, particularly in a rather trivial two-choice response task. On the other hand, one might expect longer response times as well as higher error rates if ambiguity between fearful and angry expressions exists. This explanation holds some merit for a heterogeneous face set that was assembled from different sources and includes clearly ambiguous expression blends (see Experiment 5, Stimuli section).
If such expression information is presented combined with variations in gaze direction, then participants might try to guess the purpose of the experiment, and adjust their behaviour to take gaze into account when making expression decisions. This possibility, that top-down strategies might account for the findings of Experiment 5, is examined in the final experiment.

EXPERIMENT 6

This experiment combined all four expressions from Adams and Kleck's (2003) facial displays of emotion in a speeded four-choice categorization

task. If gaze was paired strategically with anger and fear in Experiment 5, then these gaze cues should become less effective in a task with more than two facial expressions, as each level of eye gaze (i.e., averted, direct) can now be associated with more than a single expression. In this case, Experiment 6 should not produce the gaze-expression interaction of the previous task. On the other hand, if averted eye gaze facilitates the processing of fearful (Adams & Kleck, 2003, Exp. 1) and sad expressions (Adams & Kleck, 2003, Exp. 2) in this task, then this would provide further evidence for the perceptual integration of specific gaze states and selected emotional expressions.

Method

Subjects. Thirty-two new subjects, all students from the University of Glasgow, participated in the experiment for a small fee or course credits. All reported normal or corrected-to-normal vision.

Stimuli. In addition to the angry and fearful faces of Experiment 5 (sourced from Adams & Kleck, 2003, Exp. 1), the task now also included happy and sad facial expressions (Adams & Kleck, 2003, Exp. 2). These faces were obtained from stimulus sets developed by Beaupré et al. (2000), Ekman and Friesen (1976), and Kirouac and Doré (1984), and some of Adams and Kleck's (2003) own pictures. In all stimuli, direct gaze was digitally moved by Adams and Kleck to produce complementary sets for the averted gaze conditions. For each expression, the design comprised 30 faces with left-averted gaze, 30 faces with right-averted gaze, and 30 faces with direct gaze. To balance the design, each averted-gaze face appeared once but direct-gaze faces appeared twice, giving a total of 480 experimental trials. Note that blends of happy and sad expressions do not yield suitable stimuli (see Adams & Kleck, 2003). Consequently, fear-anger blends were also omitted from this experiment.

Procedure.
Each trial began with a fixation cross for 750 ms, followed by a face stimulus, which was displayed until a response was made. Responses were made using the D, F, K, and L keys on a standard computer keyboard for angry, fearful, happy, and sad expressions, respectively. Participants were instructed to use index and middle fingers for keypresses, and to respond as quickly and as accurately as possible. At the start of the experiment, participants were given 80 practice trials to learn this response layout, in which they were only shown the printed names of the four target emotions. This was immediately followed by eight experimental blocks of the face stimuli.
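The dependent measure reported throughout these experiments, means of the median correct RTs, can be sketched as follows. This is a minimal illustration with hypothetical trial records, not the authors' data:

```python
from statistics import mean, median

def mean_of_median_correct_rts(participants):
    # For each participant, take the median RT over correct trials only
    # (errors are excluded; the median resists outlier RTs), then average
    # these per-participant medians to give the reported condition mean.
    medians = []
    for records in participants.values():
        correct_rts = [rt for rt, correct in records if correct]
        medians.append(median(correct_rts))
    return mean(medians)

# Two hypothetical participants: (rt_ms, correct) tuples for one condition.
data = {
    "p1": [(800, True), (900, True), (2500, True), (700, False)],
    "p2": [(1000, True), (1100, True), (1200, True)],
}
condition_mean = mean_of_median_correct_rts(data)
```

Taking the median within each participant before averaging is a common guard against skewed RT distributions; the long 2500 ms trial above barely moves the result.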

Results and discussion

Table 6 shows the means of the median correct RTs and percentage errors for all conditions. A 2 (direct vs. averted gaze) × 4 (angry vs. fearful vs. happy vs. sad expression) ANOVA of the RT data revealed a main effect of expression, F(3, 93) = 81.62, p < .01. Tukey HSD tests showed significant differences between each of the facial expressions (all ps < .05), except angry and sad expressions. In addition, a main effect of gaze was found, F(1, 31) = 15.06, p < .01, and an interaction between gaze and expression, F(3, 93) = 2.84, p < .05. Analysis of simple main effects revealed significant effects of gaze for angry faces, F(1, 31) = 15.25, p < .01, and fearful faces, F(1, 31) = 5.00, p < .05, but not for happy and sad expressions, both Fs < 1. The error data also showed a main effect of expression, F(3, 93) = 29.89, p < .01, reflecting more errors to fearful, angry, and sad faces than to happy facial expressions (all Tukey HSD, p < .01), and significantly fewer errors to fearful than to sad faces (p < .05). In addition, a main effect of gaze was found, F(1, 31) = 20.41, p < .01, and an interaction between both factors, F(3, 93) = 9.64, p < .01. Simple main effect analysis revealed an effect of gaze on angry expressions, F(1, 31) = 9.46, p < .01, and sad expressions, F(1, 31) = 34.95, p < .01, but not for fearful and happy faces, both Fs < 1.

Experiment 6 fails to replicate the orthogonal interaction between gaze and expression that was observed in Experiment 5. In contrast, a consistent effect of eye gaze was found, resulting from slower expression RTs across the averted gaze conditions. Notably, this effect was most pronounced for angry and fearful faces, the two expressions that produced a strikingly different pattern in Experiment 5, but the same numerical effects were also found for happy and sad expressions.
TABLE 6
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates as a function of experimental condition in Experiment 6

                    RTs              SE            % errors
Expression    Direct  Averted  Direct  Averted  Direct  Averted
Happy           799     809      18      19        3       4
Sad             997    1012      22      23       15      21
Angry           993    1054      27      33       15      19
Fearful        1057    1093      24      27       13      12

To some extent this result is consistent with the preceding experiments, in which angry and fearful faces generally showed larger gaze effects than happy and sad faces (cf. Tables 1-4). Indeed, similar differences in magnitude were also observed by Adams and Kleck (2003),

who obtained larger gaze effects for angry and fearful faces. This suggests that there are differences in the degree to which eye gaze affects the processing of these facial expressions. These differences could reflect the extent to which the eye regions code an expression. For example, although the eyebrows are of importance for recognizing sad expressions, the eye regions appear much less important for recognizing happy and sad expressions than for recognizing anger and fear (Smith, Cottrell, Gosselin, & Schyns, 2005). Therefore, one factor that might determine the effect of gaze direction in these experiments could be the extent to which an expression is coded by areas around the eyes.

More importantly, Experiment 6 suggests that the response pattern of Experiment 5 does not survive a change in task demands, even when this change is seemingly as trivial as adding two emotional expressions to a categorization task. However, it is worth noting that these results were obtained without the fear-anger expression blends that were used in Experiment 5, which were omitted because we did not possess an analogous set of happy-sad blends (see Adams & Kleck, 2003). This raises the question of whether stimulus blends are essential for producing the response pattern that was found for angry and fearful faces in Experiment 5. Adams and Kleck (2003) observed a pattern similar to Experiment 5 with happy and sad expressions in a task that did not encompass face blends, which indicates that these gaze effects do not depend entirely on face blends. Similar to Experiment 5, however, that experiment employed a simple binary decision task (i.e., happy vs. sad), which may have encouraged associations between each of the expressions and a particular gaze state (i.e., averted vs. direct). We suggest that Experiment 6 made it more difficult to utilize eye gaze in this fashion, because each gaze direction could be paired with more than one expression in this task.
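The gaze effects discussed here, and summarized per block in Figure 3, are simply the averted minus direct RT differences. Computed from the Table 6 means, they are positive (an averted-gaze cost) for every expression and largest for angry faces:

```python
# RT means (ms) from Table 6: (direct gaze, averted gaze) per expression.
table6_rts = {
    "happy": (799, 809),
    "sad": (997, 1012),
    "angry": (993, 1054),
    "fearful": (1057, 1093),
}

# Gaze effect = averted minus direct; positive values mean averted gaze
# slowed expression categorization.
gaze_effects = {expr: averted - direct
                for expr, (direct, averted) in table6_rts.items()}
# -> happy: 10, sad: 15, angry: 61, fearful: 36 (all positive)
```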
We return to a fuller discussion of these findings in the general discussion.

GENERAL DISCUSSION

Adams and Kleck (2003) recently reported that the perception of angry and happy facial expressions is impaired in speeded categorization tasks when the eyes of these stimuli are horizontally averted, whereas the perception of fearful and sad expressions is actually enhanced under the same conditions. The present experiments re-examined these claims across different stimulus sets and tasks. Experiment 1 found that the processing of happy and sad expressions is impaired when eye gaze is averted, compared to when a face target is looking straight at a viewer. Unlike previous research, this pattern was observed with faces displaying naturally averted gaze and with verified, posed expressions (Experiment 1).

Subsequent experiments replicated these effects with happy and sad faces (Experiment 2) and with angry and fearful faces with digitally altered gaze (Experiment 3), and with a rating paradigm (Experiment 4), which combined these four expressions within the same task. In contrast to these findings, Experiment 5 then reproduced Adams and Kleck's (2003) results, but with the original angry and fearful stimuli that were employed in that study. Remarkably though, these effects did not survive a change in task demands in Experiment 6, in which all of Adams and Kleck's faces were combined in a four-choice (angry, happy, fearful, and sad) categorization paradigm.

Overall, these results suggest that the perception of emotional expression is impaired in a speeded classification task when the eyes of a face stimulus are averted. One straightforward explanation for this finding is that eye gaze is analysed faster than expression, and then influences the allocation of visual attention to the target face. Thus, when eye gaze is averted, an observer's attention is shifted temporarily in the direction of the seen gaze (e.g., Driver et al., 1999; Friesen et al., 2005), making the ongoing expression analysis less efficient than if the face was held at the focus of attention. Consistent with this notion, some face processing resources are scarcer than was previously thought, and are not shared across the visual field (Bindemann, Burton, & Jenkins, 2005). Thus, when eye gaze induces an attention shift away from a face target, ongoing gaze-direction processing may briefly leave insufficient resources to analyse facial expression in the unattended target location. Furthermore, Lewis and Edmonds (2003) showed that face detection is impaired disproportionately more when the eyes are occluded than when the forehead, nose, mouth, or chin are obscured.
Observers also fixate the eyes prior to any other facial region (Althoff & Cohen, 1999), and look at the eyes more frequently during face processing (Henderson, Williams, & Falk, 2005; Schyns, Jentsch, Johnson, Schweinberger, & Gosselin, 2003; Smith, Gosselin, & Schyns, 2004). In addition, when variations in eye gaze are salient, as was the case in the present experiments, gaze is judged faster than the emotional expression on a face (see, e.g., Ganel, Goshen-Gottstein, & Goodale, 2005), and gaze perception appears entirely unaffected by variations in facial expression (Hietanen & Leppänen, 2003). This suggests that the eyes play a crucial early role in face perception and are in a privileged position to influence, and impair, other ongoing face processes, such as expression analysis. However, this should not be interpreted as evidence that eye gaze and expression analysis are functionally integrated in early face processing. Rather, we suggest that eye gaze affects expression analysis via an intermediate process, namely the allocation of visual attention to a face target.

How can this explanation be reconciled with Experiment 5, in which the perception of fear was enhanced in the averted gaze condition? One possibility is that participants employed a gaze-based strategy to disambiguate angry and fearful faces that did not contain a clear expression signal, by pairing anger with direct gaze and fear with averted gaze. Although gaze following is highly reliable in visual orienting tasks (see, e.g., Driver et al., 1999; Friesen & Kingstone, 1998; Friesen et al., 2005), these gaze-following effects could be superseded if expression responses are delayed by a strategic switch to gaze-related information. Several observations support this notion. First, responses were notably faster and fewer errors were made in Experiments 1-3 than in Experiment 5, suggesting that it was more difficult to distinguish the emotional expressions in the latter experiment, despite the use of the same simple binary decision task (cf. Tables 1-3 and 5a). In addition, a breakdown of the RT data for these experiments shows initially slower responses for all expressions in the averted gaze conditions (see Figure 3), demonstrating clearly comparable effects of gaze direction. Whereas this pattern continues to hold for Experiments 1-3 throughout the task, an interaction emerges in Experiment 5 that is indicative of the acquisition of a gaze-based strategy.2 It is also likely that the emotionally ambiguous face blends in Experiment 5 encouraged the use of additional facial information as part of the decision-making process. Eye gaze is the most obvious candidate, since the face blends were invariant in other respects. Finally, Experiment 6 combined eye gaze with four emotional expressions, which makes it more difficult to use the eyes strategically, as averted and direct gaze could be paired with more than one expression.
And importantly, in Experiment 6 averted gaze generally impaired expression judgements, which further supports the notion that the findings of Experiment 5 represent strategic task effects.

To pursue this issue further, another point to note here is that the pattern of results in Experiment 5 is consistent with studies in which participants are required to make somewhat challenging expression decisions. For example, Adams and Kleck (2005) showed that emotionally neutral faces can be rated as angry, fearful, happy, or sad as a function of gaze direction, in a similar manner to the reaction time pattern in

2 This interpretation receives some support from the statistical analysis. 3 × 2 × 2 ANOVAs of block, expression, and gaze did not reveal any significant interactions in Experiment 1 or Experiment 3. In Experiment 2, only a block × expression interaction was found, F(2, 58) = 4.03, p < .05, due to faster responses to happy faces than sad faces in Block 1, F(1, 29) = 10.65, p < .01 (all other interactions ns). In contrast, Experiment 5 showed interactions of expression and gaze, F(1, 18) = 7.62, p < .01, and, marginally, of block and gaze, F(2, 34) = 3.04, p = .06. An advantage for direct-gaze expressions was found for Block 1, F(1, 17) = 4.92, p < .05, but not for Blocks 2 or 3 (all other interactions ns).

Figure 3. Summary of the effects of eye gaze (in ms, averted gaze minus direct gaze) on expression for the experimental blocks in Experiments 1-3 (a-c) and Experiment 5 (d), and overall effect sizes.

Experiment 5.3 In a subsequent experiment, Adams and Kleck (2005) also presented participants with pairs of faces in a ratings task. Both faces in a pair were identical (i.e., same identity, expression) except for eye gaze direction, and, similarly again to Experiment 5, expression ratings to these faces varied as a function of eye gaze. Remarkably, however, a majority of participants initially gave the same ratings to both faces in a pair; the interaction with eye gaze was only obtained after the task was adjusted so that participants were forced to provide different ratings for the two faces (Adams & Kleck, 2005, Footnote 1). These findings with neutral faces and emotional face pairs strongly suggest that the pattern of Experiment 5 is not tied to the actual emotional content of a face, but emerges when a degree of ambiguity or difficulty in classifying a facial expression exists. Crucially, Adams and Kleck argue that these findings reflect primary processing stages prior to any decision making, because similar effects are obtained with speeded reaction time tasks (2005, p. 10). Contrary to these conclusions, however, the present study indicates that averted eye gaze generally slows expression categorization in speeded tasks. Therefore, we suggest that averted gaze initially impairs expression processing across the board; the use of different eye gaze directions is a strategic component that is acquired subsequently. On a more cautious note, we cannot explain why anger and fear were consistently bound to a specific gaze signal in Adams and Kleck's (2003, 2005) studies and in Experiment 5. For example, if some participants used averted gaze to classify fear, but others used direct gaze, then any effect of gaze should average out over the course of these experiments.
Perhaps, given that eye gaze was edited manually to produce the averted gaze conditions, the stimulus set contained some inconsistencies in eye gaze information that systematically biased expression decisions. Or perhaps, when expression is difficult to categorize and the decision is binary, participants use the surface area of the sclera to resolve ambiguity between angry and fearful faces. If this is done by associating the stimuli with the most exposed sclera, that is, the averted-gaze faces, with the characteristically gaping scleral surface of the fearful expressions (see Figure 2), then this could result in the eye gaze by expression interaction that was observed in Experiment 5. The comparatively high error percentage

3 We obtained a similar result in a reaction time study with happy, sad, and neutral faces (N = 19), to which observers made forced-choice happy/sad decisions. As in Experiments 1 and 2, responses to happy and sad expressions were slower in the averted than in the direct gaze conditions (happy: averted gaze, 704 ms vs. direct gaze, 667 ms; sad: averted gaze, 899 ms vs. direct gaze, 877 ms). For neutral faces these effects varied as a function of eye gaze and response. Neutral faces classified as happy showed a cost for averted gaze trials (averted gaze, 1181 ms vs. direct gaze, 1145 ms), but this effect reversed when these faces were classified as sad (averted gaze, 900 ms vs. direct gaze, 910 ms). However, this interaction did not approach significance.

for angry faces in the averted gaze condition in Experiment 5, which indicates that these faces were frequently mistaken for fearful expressions, provides some tentative support for this notion. Alternatively, it is possible that some gaze directions are naturally associated with particular emotional expressions, leading to enhanced processing when both types of information are combined within the same face. According to Adams and Kleck (2003, 2005), this could reflect a system that combines facial information with a shared signal value, according to whether this information signals the behavioural motivation to approach or to avoid a conspecific (see, e.g., Argyle & Cook, 1976; Davidson & Hugdahl, 1995; Harmon-Jones & Sigelman, 2001). On this account, facial signals such as direct gaze and positive emotions (and anger) are all associated with approach motivation, which may lead to more efficient expression processing when any of these signals are combined within the same stimulus. Similarly, the processing of negative, avoidance-related emotions (except anger) may be boosted when these are combined with averted eye gaze, which also signals avoidance. Evidence on how this theory applies to human emotion perception is still relatively sparse. The present experiments suggest that eye gaze and expression do not combine in this manner at any primary processing stage, prior to any decision making. At the same time, it seems plausible that an approach/avoidance model can account for eye gaze and expression integration at subsequent stages. At present, however, it is unresolved how an approach/avoidance theory maps onto existing psychological data, including previous studies with fearful faces. For instance, an approach/avoidance model associates averted gaze with the perception of fear.
At the same time, there is extensive evidence that the perception of fear is intimately linked with direct eye gaze, characterized, in this expression, by a wide scleral contrast above the iris, which disappears as the eyes are turned sideways. This evidence includes neuropsychological studies, which demonstrate that the human amygdala responds more strongly to fear than to other facial expressions (e.g., Breiter et al., 1996; Calder, Lawrence, & Young, 2001; Calder et al., 1996; Morris et al., 1996; Thomas et al., 2001), more strongly to direct than to averted eye gaze (e.g., George, Driver, & Dolan, 2001; Kawashima et al., 1999; for converging evidence from primate studies, see Brothers & Ring, 1993; Brothers, Ring, & Kling, 1990), and particularly to direct fearful gaze (Morris, de Bonis, & Dolan, 2002; Whalen et al., 2004; for a contrary claim, see Adams, Gordon, Baird, Ambady, & Kleck, 2003; but see also Sato, Yoshikawa, Kochiyama, & Matsumura, 2004). Consistent with this notion, fearful faces were rated as more fearful-looking in the present study when they displayed direct gaze than when they displayed averted gaze (Experiment 4). Experiment 4 also revealed no effects of gaze on the ratings of the other expressions, which is also consistent with the idea that gaze forms a specific component of the fearful face signal only (see, e.g., Adolphs et al., 2005;