How do eye gaze and facial expression interact?


VISUAL COGNITION, 2008, 16 (6)

Markus Bindemann and A. Mike Burton
Department of Psychology, University of Glasgow, UK

Stephen R. H. Langton
Department of Psychology, University of Stirling, UK

Previous research has demonstrated an interaction between eye gaze and selected facial emotional expressions, whereby the perception of anger and happiness is impaired when the eyes are horizontally averted within a face, but the perception of fear and sadness is enhanced under the same conditions. The current study reexamined these claims over six experiments. In the first three experiments, the categorization of happy and sad expressions (Experiments 1 and 2) and angry and fearful expressions (Experiment 3) was impaired when eye gaze was averted, in comparison to direct gaze conditions. Experiment 4 replicated these findings in a rating task, which combined all four expressions within the same design. Experiments 5 and 6 then showed that previous findings, that the perception of selected expressions is enhanced under averted gaze, are stimulus- and task-bound. The results are discussed in relation to research on facial expression processing and visual attention.

In recent years, the perception of eye gaze direction has emerged as a critical component of face processing, particularly in tasks requiring visual attention. For example, Senju and Hasegawa (2005) showed that the detection of a peripheral target is delayed when observers are first required to fixate a central face that is looking directly at them, compared to faces with averted or closed eyes. Thus, direct eye gaze appears to capture or to hold attention on a face (see also Bindemann, Burton, Hooge, Jenkins, & de Haan, 2005).
Averted eyes, on the other hand, are capable of rapidly shifting an observer's visual attention, leading to faster target classification in the direction of a seen gaze than when the target appears in the opposite direction (e.g., Driver et al., 1999; Friesen & Kingstone, 1998; Friesen, Moore, & Kingstone, 2005; Kingstone, Tipper, Ristic, & Ngan, 2004; Langton & Bruce, 1999; Langton, Watt, & Bruce, 2000). A functional distinction between direct and averted gaze is also evident in tasks requiring sex judgements to faces and in face identification. Thus, observers are faster to classify faces as male or female when they are looking straight at the viewer than when eye gaze is averted (Macrae, Hood, Milne, Rowe, & Mason, 2002). Similarly, recognition memory is better for faces with direct eye gaze relative to faces displaying averted gaze. In fact, this advantage is evident in adults and children (Hood, Macrae, Cole-Davies, & Dias, 2003), both during face encoding and at face retrieval (Hood et al., 2003; Mason, Hood, & Macrae, 2004). Taken together, these findings indicate that face processing is generally less efficient when eye gaze is averted.

However, there may be circumstances when there is an advantage for averted eye gaze during face processing. Adams and Kleck (2003) asked participants to categorize angry- and fearful-looking full-face images, in which the eye gaze was either directed straight at the viewer or horizontally averted. Angry faces showed the typical influence of gaze direction, with slower responses to faces with averted eye gaze. Intriguingly, for fearful expressions the opposite pattern was found. Thus, responses were slower to fearful faces displaying direct gaze than to the same fearful faces in which the eyes were averted. This remarkable pattern was replicated in a second experiment with happy and sad facial expressions, so that sad faces were classified faster with averted gaze and happy faces were classified faster when eye gaze was direct. These findings suggest that eye gaze selectively affects the perception of facial expression in a reaction time paradigm.

Please address all correspondence to Markus Bindemann, Department of Psychology, University of Glasgow, Glasgow G12 8QQ, UK. markus@psy.gla.ac.uk
This work was supported by a Wellcome Grant (GR072308) to Mike Burton, Stefan R. Schweinberger and Stephen Langton.
© 2007 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business
Moreover, these effects persisted in a subsequent study, in which participants attributed emotional expressions to neutral faces, ambiguous expression morphs, and prototypical expressions in nonspeeded rating tasks (Adams & Kleck, 2005). However, Adams and Kleck's (2003) finding with a speeded task in particular raises the intriguing possibility that eye gaze and expression information are integrated into a single percept at an early visual processing stage, suggesting "a process more primary than that found at the decision-making level" (Adams & Kleck, 2005, p. 10). And yet, despite this evidence, it is difficult to envisage a perceptual system in which averted eye gaze rapidly shifts an observer's attention (e.g., Friesen & Kingstone, 1998; Friesen et al., 2005) and consistently impairs the processing of sex- and identity-related face information (Hood et al., 2003; Macrae et al., 2002; Mason et al., 2004), but facilitates the processing of selected emotional expressions. Indeed, if eye gaze and expression are perceptually intertwined in this manner at primary processing stages, then one might expect in return that the perception of eye gaze direction is also affected by the emotional expression on a face. Contrary to this prediction, gaze cueing appears unaffected by facial expression (Hietanen & Leppänen, 2003). For these reasons, it is important to provide further evidence for interactions between eye gaze and the perception of facial expression. This was the aim of the present study.

EXPERIMENT 1

The aim of the first experiment was to replicate Adams and Kleck's (2003) finding with happy and sad facial expressions in a speeded categorization task. Subjects responded to the emotional expression of a central target, which was shown in full-face format, and with either direct gaze or with the eyes averted horizontally. According to Adams and Kleck, responses should be faster to happy faces with direct gaze than to the same faces with averted gaze, and the reverse pattern should be found for sad expressions.

Method

Subjects. Twenty-two students from the University of Glasgow participated in the experiment for a small fee or for course credits. All had normal or corrected-to-normal vision.

Apparatus and stimuli. A Macintosh computer was used to present stimuli and record responses using PsyScope software. Viewing distance was fixed at 60 cm by a chinrest. A set of 30 faces was used as stimuli. This was derived from 30 photographs, six of each of five male Caucasian models showing happy and sad expressions, and posing with direct, left-averted, and right-averted eye gaze. All faces were cropped to remove extraneous background, converted to greyscale, and measured maximally 5.8 cm × 7.0 cm ( of visual angle at a distance of 60 cm). Prior to the experiment, eight participants verified these facial expressions in a forced-choice design including the same number of fearful, angry, neutral, and surprised faces. On average, happy expressions were identified correctly on 100% of trials and sad expressions on 69% of trials, and these two expressions were never confused with each other (see Figure 1 for examples).
This is considerably better than chance performance (17%) and consistent with previous categorization tasks (e.g., Calder et al., 1996; Ekman & Friesen, 1976).

Procedure. A trial began with a central fixation cross for 750 ms, followed by a face stimulus, which was displayed until response. Participants made speeded judgements concerning the emotional expression of the target face (i.e., happy vs. sad). Responses were made with the right hand, using the numeric button pad on the right side of a standard computer keyboard, pressing the '3' key for happy and the '.' key for sad faces. Throughout the experiment, happy and sad expressions, and averted and direct eye gaze, occurred with equal frequency. Participants completed 20 practice trials and three experimental blocks of 80 randomly ordered trials, and could take short breaks between blocks.

Figure 1. Example stimuli from Experiment 1 showing happy (a) and sad (b) facial expressions with averted and direct eye gaze.

Results and discussion

Table 1 shows the means of the median correct RTs and percentage errors for all conditions. A 2 (direct vs. averted eye gaze) × 2 (happy vs. sad expression) within-subject analysis of variance (ANOVA) of the RT data showed a main effect of gaze, F(1, 21) = 4.63, p < .05, due to faster responses to faces with direct gaze than to faces with averted gaze. A main effect of expression was also found, F(1, 21) = 29.98, p < .01, reflecting faster responses for happy faces. However, the interaction between gaze and expression was not significant, F(1, 21) < 1.

Error rates were analysed in the same manner as the RT data. Percentage errors to happy expressions were marginally higher in the averted gaze than in the direct gaze condition, but the reverse trend was found for sad expressions. ANOVA showed no main effect of gaze, F(1, 21) < 1, or expression, F(1, 21) < 1, but an almost significant interaction between these factors, F(1, 21) = 3.82, p = .06. To discount any possibility that the differences between RTs and errors might reflect a speed-accuracy trade-off, inverse efficiency scores were calculated to provide a combined measure of RTs and accuracy (Townsend & Ashby, 1983). These were: Happy, direct-gaze 544 ms, averted-gaze 567 ms; sad, direct-gaze 584 ms, averted-gaze 588 ms. Consistent with the RT data, analysis of the inverse efficiency data showed main effects of gaze, F(1, 21) = 8.1, p < .01, and expression, F(1, 21) = 12.18, p < .01, but the gaze × expression interaction was not significant, F(1, 21) = 3.12.¹

TABLE 1
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates as a function of experimental condition in Experiment 1

In this experiment, response times were consistently slower to faces with averted eye gaze, both for happy and sad emotional expressions. This contradicts Adams and Kleck's (2003) finding of an interaction between these expressions and eye gaze direction in a speeded categorization task. There is, however, a possible confound in this experiment, as there were twice as many photographs with averted eye gaze (i.e., left and right gaze stimuli) as with direct gaze. An advantage of this manipulation is that, unlike Adams and Kleck's faces, in which gaze direction was manipulated digitally, the present stimuli reflect natural variations in eye gaze direction. However, there is a corresponding disadvantage, in that our stimuli may vary slightly in ways other than gaze, even within the same identity. Increasing the number of averted-gaze faces relative to direct-gaze faces may thus result in a slight increase in expression information in the former conditions. If this results in an increase in task difficulty, then one might predict slower expression decisions when the eyes are averted, for reasons relating to the experimental design rather than reflecting a direct effect of eye gaze. This was addressed in the next experiment.
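An inverse efficiency score (Townsend & Ashby, 1983) divides a condition's mean correct RT by the proportion of correct responses in that condition, so that a condition cannot look fast merely because participants rushed and made errors. A minimal sketch of the calculation; the function name and the example numbers are illustrative, not taken from the experiment:

```python
def inverse_efficiency(mean_rt_ms, error_rate):
    """Inverse efficiency score (Townsend & Ashby, 1983): mean correct RT
    divided by the proportion of correct responses. Higher = less efficient."""
    proportion_correct = 1.0 - error_rate
    if proportion_correct <= 0.0:
        raise ValueError("accuracy must be greater than zero")
    return mean_rt_ms / proportion_correct

# Illustrative values only: a 530 ms mean RT with a 5% error rate.
print(round(inverse_efficiency(530, 0.05), 1))  # 557.9
```

With a zero error rate the score simply equals the mean RT, which is why the inverse efficiency analysis tends to agree with the RT analysis whenever accuracy is uniformly high.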
¹ Inverse efficiency scores were calculated for all experiments (except Experiment 4, which did not produce error data). Analysis of these scores consistently produced the same outcome as the reaction time analysis and is not reported further.

EXPERIMENT 2

In Experiment 2, happy and sad faces from the Ekman and Friesen (1976) series were used as stimuli. This is a photographic set of prototypical full-face expressions with direct eye gaze, which was digitally edited to produce a complementary set with left-averted and right-averted eyes. In this way, the emotional content of the face targets was preserved across the experimental conditions, except for the critical eye gaze information.

Method

Subjects. Thirty new students from the University of Glasgow participated in the experiment for a small fee or for course credits. All had normal or corrected-to-normal vision.

Stimuli and procedure. The stimuli and procedure were the same as those in Experiment 1, except for the following changes. Happy and sad pictures of five male models from the Ekman and Friesen (1976) series were used to create a new stimulus set of 30 faces. This set comprised the 10 original images, all of which displayed direct eye gaze, and 20 new images of these faces in which the pupils were digitally moved to the left and right side of both eyes (for example stimuli see Figure 2). As before, these images were cropped to remove any extraneous background and measured maximally 5.8 cm × 7.0 cm. Participants completed an example block of 20 trials and three experimental blocks of 80 randomly intermixed trials.

Results and discussion

The means of the median correct RTs and percentage errors are shown in Table 2 as a function of experimental condition. As before, a 2 (direct vs. averted gaze) × 2 (happy vs. sad expression) ANOVA of the RTs showed a main effect of gaze, F(1, 29) = 7.06, p < .05, reflecting faster responses to faces with direct eye gaze than with averted eye gaze, and a main effect of expression, F(1, 29) = 14.97, p < .01, showing faster responses to happy than to sad faces. No interaction between these factors was found, F(1, 29) < 1. Error rates were analysed in the same manner as the RT data. Averted gaze resulted in a slight increase in errors for happy faces and a slight decrease for sad faces.
ANOVA showed no main effect of gaze, F(1, 29) < 1, or of expression, F(1, 29) = 2.90, and no interaction between these factors, F(1, 29) = 1.71.

Experiment 2 replicates the important aspects of Experiment 1 with a stimulus set that was designed to minimize variation in expression information across the eye gaze conditions. Thus, participants responded more slowly to faces with horizontally averted gaze relative to the direct gaze conditions, independent of the displayed facial expression.
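For 2 × 2 within-subject designs like those in Experiments 1 and 2, each reported F(1, n−1) can be obtained from per-subject difference scores, using the identity that a one-degree-of-freedom repeated-measures F equals the squared paired t. A sketch under that identity; the per-subject median RTs below are hypothetical, not the study's data:

```python
import math
import statistics

def f_from_differences(diffs):
    """For a two-level within-subject contrast, the ANOVA F(1, n-1) equals
    the squared paired t computed on the per-subject difference scores."""
    n = len(diffs)
    t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
    return t * t

# Hypothetical per-subject median RTs (ms), one tuple per subject, in the
# order: direct-happy, averted-happy, direct-sad, averted-sad.
data = [(540, 565, 580, 590), (555, 570, 585, 600), (548, 572, 579, 595),
        (560, 580, 590, 605), (542, 566, 577, 588)]

# Main effect of gaze: per-subject direct mean minus averted mean.
gaze_diffs = [(d_h + d_s) / 2 - (a_h + a_s) / 2 for d_h, a_h, d_s, a_s in data]
# Gaze x expression interaction: gaze effect for happy minus gaze effect for sad.
interaction_diffs = [(d_h - a_h) - (d_s - a_s) for d_h, a_h, d_s, a_s in data]

print(round(f_from_differences(gaze_diffs), 2))         # F(1, 4) for gaze
print(round(f_from_differences(interaction_diffs), 2))  # F(1, 4) for interaction
```

This shortcut only holds for single-degree-of-freedom contrasts; the four-level expression factors in Experiments 4 and 6 require a full repeated-measures ANOVA.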

Figure 2. Example faces for Experiments 2–4 showing happy (a), sad (b), angry (c), and fearful (d) expressions with averted and direct gaze.

EXPERIMENT 3

Experiments 1 and 2 show an increase in response times to both happy and sad expressions when the eyes are horizontally averted in these face stimuli.

TABLE 2
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates as a function of experimental condition in Experiment 2

The next experiment examines whether these findings can be extended to angry- and fearful-looking faces, which Adams and Kleck (2003) used as an analogue for happy and sad expressions. Similar to the latter expressions, they found that the perception of anger (like happiness) was slowed by averted gaze, whereas fear (like sadness) was categorized faster in this condition. However, these effects were particularly pronounced for angry and fearful stimuli. Despite the absence of an interaction between eye gaze and expression in the previous two experiments, it is therefore possible that Adams and Kleck's findings might persist with angry and fearful faces in Experiment 3.

Method

Subjects. Twenty-four students from the University of Glasgow participated for a small fee or course credits. All reported normal or corrected-to-normal vision.

Stimuli and procedure. The stimuli and procedure were the same as in previous experiments, except as follows. The happy and sad stimuli were replaced with an equivalently prepared set of angry and fearful faces of the same male identities from the Ekman and Friesen (1976) series (see Figure 2 for examples). Participants now made speeded judgements concerning whether the target face was carrying an angry or a fearful expression, using the same two-choice keypress response ('3' vs. '.') of previous experiments. Participants completed 20 practice trials and three experimental blocks of 80 randomly ordered trials.

Results and discussion

Table 3 shows the means of the median correct RTs and percentage errors for all conditions. A 2 (direct vs. averted gaze) × 2 (angry vs. fearful expression) ANOVA of the RTs showed a main effect of gaze, F(1, 23) = 6.89, p < .05, reflecting faster responses to faces with direct eye gaze, and a main effect of expression, F(1, 23) = 4.97, p < .05, reflecting slower responses to fearful than to angry facial expressions. However, as for Experiments 1 and 2, no interaction between gaze and expression was found, F(1, 23) = 1.14. Analogous analysis of the error data revealed a main effect of gaze, F(1, 23) = 5.15, p < .05, reflecting a slight increase in errors for both angry faces and fearful faces when eye gaze was averted. No main effect of expression, F(1, 23) < 1, and no interaction between these factors was found, F(1, 23) = 1.04.

These results converge with the pattern found in the preceding experiments. That is, response times to happy and sad faces (Experiments 1 and 2) and angry and fearful faces (Experiment 3) were consistently slower for full-face targets looking to the side than for targets in which the eyes were directed at the viewer.

EXPERIMENT 4

None of the previous experiments reproduced Adams and Kleck's (2003) findings with emotional expression and eye gaze. However, each of these experiments compared only two expressions with a limited set of stimuli, which may have enabled observers to utilize superficial cues for classification. For example, merely detecting the teeth in a smile may have been sufficient to distinguish happy and sad expressions, without requiring any further face processing beyond this simple cue. Consequently, it is conceivable that eye gaze affects expression judgements in a different way when the target faces are processed in greater detail. Experiment 4 explored this possibility with a speeded rating task, in which participants judged the perceived intensity (e.g., angry vs. very angry) of all four expressions. It was anticipated that this would require finer discriminations within facial expressions, perhaps revealing differential effects of eye gaze on emotional expressions where none were previously found.
TABLE 3
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates as a function of experimental condition in Experiment 3

Method

Subjects. Twenty students from the University of Glasgow participated for a small fee or for course credits. All reported normal or corrected-to-normal vision.

Stimuli and procedure. The stimuli consisted of the face sets that were used in Experiments 2 and 3, but rather than categorizing these faces according to their emotional expression, subjects were now asked to rate them according to their intensity on a 3-point rating scale (where 1 = not very angry/fearful/happy/sad, 3 = very angry/fearful/happy/sad, and 2 = somewhere in between). Each trial began with a fixation cross for 750 ms, and was followed by a face that was displayed until a response was made. Subjects responded with the '1', '2', and '3' keys on the button pad of a standard computer keyboard, and were instructed to make use of the full range of intensity ratings. Subjects were told to respond as accurately as possible, but within approximately 2 s of stimulus onset, and the next trial was initiated if no response was obtained within 2.5 s of target presentation. Participants completed 40 practice trials and four experimental blocks of 80 randomly ordered trials.

Results and discussion

TABLE 4
Mean reaction times (RTs, in ms), standard errors (SE), and intensity ratings as a function of experimental condition in Experiment 4

Participants failed to register a response on less than 1% of all trials. The means of the median RTs and intensity ratings are shown in Table 4 as a function of experimental condition. A 4 (angry, fearful, happy, sad) × 2 (averted vs. direct gaze) ANOVA of the RT data showed a main effect of expression, F(3, 57) = 42.92, p < .01, reflecting faster responses to happy faces than to each of the other expressions (happy vs. angry, happy vs. fearful, happy vs. sad, all Tukey HSD, p < .01). As before, a main effect of gaze was also found, F(1, 19) = 34.01, p < .01, reflecting faster responses to faces with direct than with averted eye gaze. Unlike previous experiments, the main effect of gaze was qualified by an interaction with expression, F(3, 57) = 4.98, p < .01. Simple main effect analysis revealed significant gaze effects for angry faces, F(1, 19) = 4.66, p < .05, fearful faces, F(1, 19) = 35.79, p < .01, and sad faces, F(1, 19) = 6.05, p < .05, but not for happy faces, F(1, 19) = 1.13.

For the intensity ratings, ANOVA showed a main effect of expression, F(3, 57) = 126.51, p < .01, reflecting significant differences between each of the facial expressions (all Tukey HSD, p < .01), except anger and fear. The main effect of gaze did not reach significance, F(1, 19) = 3.01, p < .10, but an interaction between expression and gaze was found, F(3, 57) = 17.28, p < .01. As Table 4 suggests, significant effects of eye gaze were found only for fearful faces, Tukey HSD, p < .01.

Experiment 4 examined how eye gaze affects expression judgements when participants are required to process facial expression in greater detail than in previous experiments, in an intensity rating task. Importantly, although this task did not require explicit identification of the emotional expressions, participants nonetheless responded to the individual expressions, as can be seen from the intensity ratings across these conditions. However, only fearful faces also showed an effect of eye gaze, with more fearful ratings in the direct gaze than in the averted gaze condition. Participants' response times, the data of primary interest, were slower than in previous experiments. However, this appeared to have little effect on the relation between eye gaze and expression processing, as responses were again significantly slower in the averted gaze than in the direct gaze conditions during the classification of angry, fearful, and sad faces.
As before, the same pattern was also found for happy expressions, although this difference did not reach significance in this experiment.

EXPERIMENT 5

So far, all of the experiments have failed to replicate Adams and Kleck's (2003) findings with emotional expression and eye gaze, despite several experimental manipulations. These included facial stimuli displaying naturally averted eye gaze (Experiment 1) and digitally averted eye gaze (Experiments 2–4), our own set of posed expressions (Experiment 1) and prototypical expressions (Experiments 2–4), a two-choice categorization task (Experiments 1–3), and a rating task, which combined all four facial expressions within the same design (Experiment 4). However, none of these experiments employed the same design and the same stimuli as Adams and Kleck (2003). This was addressed in the next experiment, which employed Adams and Kleck's set of angry and fearful faces in a two-choice expression task.

Method

Subjects. Eighteen new subjects, all students from the University of Glasgow, participated in the experiment for a small fee or course credits. All reported normal or corrected-to-normal vision.

Stimuli. The stimuli were identical to Adams and Kleck (2003, Exp. 1), and consisted of 15 male and 15 female faces, which displayed angry and fearful facial expressions, and direct gaze, left-averted gaze, and right-averted gaze. These stimuli were sourced from face sets developed by Beaupré, Cheung, and Hess (2000), Ekman and Friesen (1976), and Kirouac and Doré (1984), and from some of Adams and Kleck's (2003) own faces. All faces were Caucasian and were originally depicted with direct eye gaze, which was digitally edited by Adams and Kleck to produce complementary sets for the averted gaze conditions. In the experiment, averted-gaze faces were presented only once, but each direct-gaze face was presented twice to provide an equal number of stimuli for each level of eye gaze. This resulted in a total of 240 experimental trials. As in Adams and Kleck (2003), the experiment also included 64 trials of face blends, which were intermingled with the 240 pure expression trials during the experiment. Blends were created by morphing angry and fearful expressions of 16 face identities in equal proportions. As for the pure expressions, blended faces were presented with direct, left-averted, and right-averted gaze, but direct-gaze blends were presented twice to balance the design.

Procedure. Participants made speeded judgements concerning whether the target face was carrying an angry or a fearful expression, using a two-choice keypress response ('3' vs. '.'). A trial consisted of a fixation cross for 750 ms, followed by a face target, which was displayed until a response was made.
As in Adams and Kleck (2003), participants were not given practice trials, and completed 304 randomly intermixed experimental trials.

Results and discussion

Pure expressions. Table 5a shows the means of the median correct RTs and percentage errors for the pure expression conditions. A 2 (direct vs. averted gaze) × 2 (angry vs. fearful expression) ANOVA of these RT data revealed a main effect of expression, F(1, 17) = 10.45, p < .01, reflecting faster categorization for angry faces. A main effect of gaze was not found, F(1, 17) < 1, but there was an interaction between gaze and expression, F(1, 17) = 11.68, p < .01. Analysis of simple main effects revealed significant effects of gaze for angry expressions, F(1, 17) = 5.08, p < .05, with faster responses to angry faces with direct gaze, and the opposite pattern for fearful faces, F(1, 17) = 12.43, p < .01. In contrast to the RTs, the error data showed no main effect of expression, F(1, 17) < 1, but a main effect of gaze, F(1, 17) = 8.69, p < .01, due to overall higher errors in the averted gaze conditions. This effect was qualified by an interaction between gaze and expression, F(1, 17) = 20.24, p < .01, reflecting higher errors for angry faces with averted gaze, compared to the direct gaze condition, F(1, 17) = 37.43, p < .01, and the reverse trend for fearful expressions, F(1, 17) = 3.81, p = .07 (see Table 5a).

TABLE 5a
Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates for pure emotional expressions in Experiment 5

Blended expressions. Response times to the ambiguous face blends showed a similar pattern to the pure expressions, with slower responses to direct-gaze blends when they were classified as fearful than to their averted-gaze counterparts, and the opposite response pattern for blends that were perceived as angry (see Table 5b). ANOVA revealed a main effect of gaze, F(1, 17) = 4.57, p < .05, due to overall higher RTs to direct gaze faces, but showed no main effect of expression, F(1, 17) < 1, and no interaction between gaze and expression, F(1, 17) = 2.63, p = .12. In addition to the RT data, expression blends with averted gaze were given more fear labels than blends with direct gaze (binomial test, N = 15/18, p < .01), and, conversely, were given more angry labels when displaying direct gaze than when displaying averted gaze (binomial test, N = 15/18, p < .01).
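The binomial tests on the blend labels ask how likely it is that 15 or more of 18 participants would show the effect if each were equally likely to go either way. A self-contained sketch of an exact two-tailed binomial test under that chance (p = .5) assumption; the function name is illustrative, not from the paper:

```python
from math import comb

def binomial_p_two_tailed(successes, n):
    """Exact two-tailed binomial test against chance (p = .5): twice the
    one-tailed probability of an outcome at least as extreme as observed."""
    k = max(successes, n - successes)  # count in the more extreme direction
    one_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * one_tail)

# 15 of 18 participants gave more "fear" labels to averted-gaze blends:
print(binomial_p_two_tailed(15, 18) < .01)  # True
```

Doubling the one-tailed probability is valid here because the null distribution is symmetric at p = .5; for other null proportions a different two-tailed rule is needed.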
Experiment 5 fails to replicate the findings of the four previous experiments, which showed that expression processing is generally impaired when eye gaze is averted, independent of the displayed facial expression. In contrast, this experiment reveals an interaction between eye gaze and expression processing, which indicates that the perception of angry expressions is weakened by averted eye gaze, whereas the perception of fear is actually enhanced under the same conditions. Experiment 5 thus reproduces Adams and Kleck's (2003) findings, with an identical design and their own set of facial expressions.

TABLE 5b
Mean reaction times (RTs, in ms) and standard errors (SE) for blended emotional expressions in Experiment 5, and the average number of responses (no. of labels) that were made to the blends as a function of expression

And yet, this discrepancy between the first four experiments and Experiment 5 is puzzling. Experiments 1–4 employed several different stimulus sets, with posed and digitally averted eye gaze, and replicated the same response pattern across different tasks. Similarly, Experiment 5 supports the validity of Adams and Kleck's (2003) findings, by producing an orthogonal interaction in eye gaze and expression processing. On closer inspection, however, overall response times and error rates were noticeably higher in this experiment than in Experiment 3 (cf. Tables 3 and 5a; see also Tables 1 and 2), which compared the same two emotional expressions in a similar design. One possibility is that these differences simply reflect the larger, and hence perhaps more varied, stimulus set of Experiment 5. However, anger and fear belong to a set of prototypical expressions (see, e.g., Ekman & Friesen, 1976), and as long as reasonable set sizes are used to rule out image-based explanations (as was the case in Experiments 1–4), these primary expressions should be classified reliably and relatively independently of set size, particularly in a rather trivial two-choice response task. On the other hand, one might expect longer response times as well as higher error rates if ambiguity between fearful and angry expressions exists. This explanation holds some merit for a heterogeneous face set that was assembled from different sources and that includes clearly ambiguous expression blends (see Experiment 5, Stimuli section).
If such expression information is presented combined with variations in gaze direction, then participants might try to guess the outcome of the experiment, and adjust their behaviour to take gaze into account when making expression decisions. This possibility, that top-down strategies might account for the findings of Experiment 5, is examined in the final experiment. EXPERIMENT 6 This experiment combined all four expressions from Adams and Kleck s (2003) facial displays of emotion in a speeded four-choice categorization

15 722 BINDEMANN, BURTON, LANGTON task. If gaze was paired strategically with anger and fear in Experiment 5, then these gaze cues should become less effective in a task with more than two facial expressions, as each level of eye gaze (i.e., averted, direct) can now be associated with more than a single expression. In this case, Experiment 6 should not produce the gaze-expression interaction of the previous task. On the other hand, if averted eye gaze facilitates the processing of fearful (Adams & Kleck, 2003, Exp. 1) and sad expressions (Adams & Kleck, 2003, Exp. 2) in this task, then this would provide further evidence for the perceptual integration of specific gaze states and selected emotional expressions. Method Subjects. Thirty-two new subjects, all students from the University of Glasgow, participated in the experiment for a small fee or course credits. All reported normal or corrected to normal vision. Stimuli. In addition to the angry and fearful faces of Experiment 5 (sourced from Adams & Kleck, 2003, Exp. 1), the task now also included happy and sad facial expressions (Adams & Kleck, 2003, Exp. 2). These faces were obtained from stimulus sets developed by Beaupré et al. (2000), Ekman and Friesen (1976), Kirouac and Doré (1984), and some of Adams and Kleck s (2003) own pictures. In all stimuli, direct gaze was digitally moved by Adams and Kleck to produce complementary sets for the averted gaze conditions. For each expression, the design comprised 30 faces with leftaverted gaze, 30 faces with right-averted gaze, and 30 faces with direct gaze. To balance the design, each averted-gaze face appeared once but direct gaze faces appeared twice, giving a total of 480 experimental trials. Note that blends of happy and sad expressions do not yield suitable stimuli (see Adams & Kleck, 2003). Consequently, fearanger blends were also omitted from this experiment. Procedure. 
Each trial began with a fixation cross for 750 ms, followed by a face stimulus, which was displayed until a response was made. Responses were made using the D, F, K, and L keys on a standard computer keyboard for angry, fearful, happy, and sad expressions, respectively. Participants were instructed to use their index and middle fingers for keypresses, and to respond as quickly and as accurately as possible. At the start of the experiment, participants were given 80 practice trials to learn this response layout, in which they were shown only the printed names of the four target emotions. This was immediately followed by eight experimental blocks of the face stimuli.
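The stimulus balancing described above (30 left-averted, 30 right-averted, and 30 direct-gaze faces per expression, with each direct-gaze face shown twice) can be sketched as a trial-list constructor. This is an illustrative reconstruction under stated assumptions, not the authors' actual experiment code; the condition labels and tuple layout are hypothetical.

```python
import random

# Hypothetical labels; illustrative reconstruction of the Experiment 6 design.
EXPRESSIONS = ["angry", "fearful", "happy", "sad"]
FACES_PER_EXPRESSION = 30

def build_trial_list(seed=0):
    """Per expression: each of 30 faces contributes one left-averted,
    one right-averted, and two direct-gaze trials, so that averted and
    direct gaze occur equally often (60 + 60 = 120 trials per expression)."""
    trials = []
    for expression in EXPRESSIONS:
        for face in range(FACES_PER_EXPRESSION):
            trials.append((expression, face, "left-averted"))
            trials.append((expression, face, "right-averted"))
            trials.append((expression, face, "direct"))
            trials.append((expression, face, "direct"))  # direct faces shown twice
    random.Random(seed).shuffle(trials)
    return trials

trials = build_trial_list()
print(len(trials))  # 480 experimental trials in total
```

With four expressions this yields the reported 480 trials, with averted and direct gaze equally frequent overall.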

Results and discussion

Table 6 shows the means of the median correct RTs and percentage errors for all conditions. A 2 (direct vs. averted gaze) × 4 (angry vs. fearful vs. happy vs. sad expression) ANOVA of the RT data revealed a main effect of expression, F(3, 93) = 81.62, p < .01. Tukey HSD tests showed significant differences between each of the facial expressions (all p < .05), except angry and sad expressions. In addition, a main effect of gaze was found, F(1, 31) = 15.06, p < .01, and an interaction between gaze and expression, F(3, 93) = 2.84, p < .05. Analysis of simple main effects revealed significant effects of gaze for angry faces, F(1, 31) = 15.25, p < .01, and fearful faces, F(1, 31) = 5.00, p < .05, but not for happy and sad expressions, both F < 1. The error data also showed a main effect of expression, F(3, 93) = 29.89, p < .01, reflecting more errors to fearful, angry, and sad faces than to happy facial expressions (all Tukey HSD, p < .01), and significantly fewer errors to fearful than to sad faces (p < .05). In addition, a main effect of gaze was found, F(1, 31) = 20.41, p < .01, and an interaction between both factors, F(3, 93) = 9.64, p < .01. Simple main effect analysis revealed an effect of gaze on angry expressions, F(1, 31) = 9.46, p < .01, and sad expressions, F(1, 31) = 34.95, p < .01, but not for fearful and happy faces, both F < 1.

Experiment 6 fails to replicate the orthogonal interaction between gaze and expression that was observed in Experiment 5. In contrast, a consistent effect of eye gaze was found, resulting from slower expression RTs across the averted gaze conditions. Notably, this effect was most pronounced for angry and fearful faces, the two expressions that produced a strikingly different pattern in Experiment 5, but the same numerical effects were also found for happy and sad expressions.
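The dependent measure analysed above is the mean, across participants, of each participant's median correct RT per gaze-by-expression cell. A minimal sketch of that aggregation, assuming a hypothetical trial-record layout (the field names are not from the original):

```python
from statistics import median, mean
from collections import defaultdict

def condition_medians(trials):
    """Per participant, collect correct-trial RTs by (gaze, expression)
    cell and take the median; error trials are excluded, as in the
    reported RT analysis."""
    cells = defaultdict(list)
    for t in trials:
        if t["correct"]:
            cells[(t["participant"], t["gaze"], t["expression"])].append(t["rt"])
    return {key: median(rts) for key, rts in cells.items()}

def cell_means(medians):
    """Mean of the per-participant medians for each (gaze, expression)
    cell, i.e., the kind of values that would populate Table 6."""
    by_cell = defaultdict(list)
    for (_participant, gaze, expression), m in medians.items():
        by_cell[(gaze, expression)].append(m)
    return {cell: mean(values) for cell, values in by_cell.items()}
```

These per-participant medians would then feed the 2 × 4 repeated-measures ANOVA; the ANOVA itself is omitted here.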
TABLE 6. Mean reaction times (RTs, in ms), standard errors (SE), and percentage error rates as a function of experimental condition in Experiment 6: direct vs. averted gaze for happy, sad, angry, and fearful expressions.

To some extent this result is consistent with the preceding experiments, in which angry and fearful faces generally showed larger gaze effects than happy and sad faces (cf. Tables 1-4). Indeed, similar differences in magnitude were also observed by Adams and Kleck (2003),

who obtained larger gaze effects for angry and fearful faces. This suggests that there are differences in the degree to which eye gaze affects the processing of these facial expressions. These differences could reflect the extent to which the eye regions code an expression. For example, although the eyebrows are important for recognizing sad expressions, the eye regions appear much less important for recognizing happy and sad expressions than for recognizing anger and fear (Smith, Cottrell, Gosselin, & Schyns, 2005). Therefore, one factor that might determine the effect of gaze direction in these experiments could be the extent to which an expression is coded by areas around the eyes. More importantly, Experiment 6 suggests that the response pattern of Experiment 5 does not survive a change in task demands, even when this change is seemingly as trivial as adding two emotional expressions to a categorization task. However, it is worth noting that these results were obtained without the fear-anger expression blends that were used in Experiment 5, which were omitted because we did not possess an analogous set of happy-sad blends (see Adams & Kleck, 2003). This raises the question of whether stimulus blends are essential for producing the response pattern that was found for angry and fearful faces in Experiment 5. Adams and Kleck (2003) observed a pattern similar to Experiment 5 with happy and sad expressions in a task that did not encompass face blends, which indicates that these gaze effects do not depend entirely on face blends. Similar to Experiment 5, however, this experiment employed a simple binary decision task (i.e., happy vs. sad), which may have encouraged associations between each of the expressions and a particular gaze state (i.e., averted vs. direct).
We suggest that Experiment 6 made it more difficult to utilize eye gaze in this fashion, because each gaze direction could be paired with more than one expression in this task. We return to a fuller discussion of these findings in the general discussion.

GENERAL DISCUSSION

Adams and Kleck (2003) recently reported that the perception of angry and happy facial expressions is impaired in speeded categorization tasks when the eyes of these stimuli are horizontally averted, whereas the perception of fearful and sad expressions is actually enhanced under the same conditions. The present experiments reexamined these claims across different stimulus sets and tasks. Experiment 1 found that the processing of happy and sad expressions is impaired when eye gaze is averted, compared to when a face target is looking straight at a viewer. Unlike previous research, this pattern was observed with faces displaying naturally averted gaze and with verified, posed expressions (Experiment 1).

Subsequent experiments replicated these effects with happy and sad faces (Experiment 2) and angry and fearful faces with digitally altered gaze (Experiment 3), and with a rating paradigm (Experiment 4), which combined these four expressions within the same task. In contrast to these findings, Experiment 5 then reproduced Adams and Kleck's (2003) results, but with the original angry and fearful stimuli that were employed in that study. Remarkably though, these effects did not survive a change in task demands in Experiment 6, in which all of Adams and Kleck's faces were combined in a four-choice (angry, happy, fearful, and sad) categorization paradigm. Overall, these results suggest that the perception of emotional expression is impaired in a speeded classification task when the eyes of a face stimulus are averted. One straightforward explanation for this finding is that eye gaze is analysed faster than expression, and then influences the allocation of visual attention to the target face. Thus, when eye gaze is averted, an observer's attention is shifted temporarily in the direction of the seen gaze (e.g., Driver et al., 1999; Friesen et al., 2005), making the ongoing expression analysis less efficient than if the face was held at the focus of attention. Consistent with this notion, some face processing resources are scarcer than previously thought, and are not shared across the visual field (Bindemann, Burton, & Jenkins, 2005). Thus, when eye gaze induces an attention shift away from a face target, this ongoing gaze-direction processing may briefly leave insufficient resources to analyse facial expression in the unattended target location. Furthermore, Lewis and Edmonds (2003) showed that face detection is impaired disproportionately when the eyes are occluded, compared to when the forehead, nose, mouth, or chin is obscured.
Observers also fixate the eyes before any other facial region (Althoff & Cohen, 1999), and look at the eyes more frequently during face processing (Henderson, Williams, & Falk, 2005; Schyns, Jentsch, Johnson, Schweinberger, & Gosselin, 2003; Smith, Gosselin, & Schyns, 2004). In addition, when variations in eye gaze are salient, as was the case in the present experiments, gaze is judged faster than the emotional expression on a face (see e.g., Ganel, Goshen-Gottstein, & Goodale, 2005), and gaze perception appears entirely unaffected by variations in facial expression (Hietanen & Leppänen, 2003). This suggests that the eyes play a crucial early role in face perception and are in a privileged position to influence, and impair, other ongoing face processes, such as expression analysis. However, this should not be interpreted as evidence that eye gaze and expression analysis are functionally integrated in early face processing. Rather, we suggest that eye gaze affects expression analysis via an intermediate process, namely the allocation of visual attention to a face target.

How can this explanation be reconciled with Experiment 5, in which the perception of fear was enhanced in the averted gaze condition? One possibility is that participants employed a gaze-based strategy to disambiguate angry and fearful faces that did not contain a clear expression signal, by pairing anger with direct gaze and fear with averted gaze. Although gaze following is highly reliable in visual orienting tasks (see e.g., Driver et al., 1999; Friesen & Kingstone, 1998; Friesen et al., 2005), these gaze-following effects could be superseded if expression responses are delayed by a strategic switch to gaze-related information. Several observations support this notion. First, responses were notably faster and fewer errors were made in Experiments 1-3 than in Experiment 5, suggesting that it was more difficult to distinguish the emotional expressions in the latter experiment, despite using the same simple binary decision task (cf. Tables 1-3 and 5a). In addition, a breakdown of the RT data for these experiments shows initially slower responses for all expressions in the averted gaze conditions (see Figure 3), demonstrating clearly comparable effects of gaze direction. Whereas this pattern continues to hold for Experiments 1-3 throughout the task, an interaction emerges in Experiment 5 that is indicative of the acquisition of a gaze-based strategy.² It is also likely that the emotionally ambiguous face blends in Experiment 5 encouraged the use of additional facial information as part of the decision-making process. Eye gaze is the most obvious candidate, since face blends were invariant in other aspects. Finally, Experiment 6 combined eye gaze with four emotional expressions, which makes it more difficult to use the eyes strategically, as averted and direct gaze could be paired with more than one expression.
Importantly, in Experiment 6 averted gaze generally impaired expression judgements, which further supports the notion that the findings of Experiment 5 represent strategic task effects. To pursue this issue further, note that the pattern of results in Experiment 5 is consistent with studies in which participants are required to make somewhat challenging expression decisions. For example, Adams and Kleck (2005) showed that emotionally neutral faces can be rated as angry, fearful, happy, or sad as a function of gaze direction, in a manner similar to the reaction time pattern in

² This interpretation receives some support from the statistical analysis. 3 × 2 × 2 ANOVAs of block, expression, and gaze did not reveal any significant interactions in Experiment 1 or Experiment 3. In Experiment 2, only a block × expression interaction was found, F(2, 58) = 4.03, p < .05, due to faster responses to happy faces than sad faces in Block 1, F(1, 29) = 10.65, p < .01 (all other interactions, ns). In contrast, Experiment 5 showed interactions of expression and gaze, F(1, 18) = 7.62, p < .01, and, marginally, of block and gaze, F(2, 34) = 3.04, p = .06. An advantage for direct-gaze expressions was found for Block 1, F(1, 17) = 4.92, p < .05, but not for Blocks 2 or 3 (all other interactions, ns).

Figure 3. Summary of the effects of eye gaze (in ms, averted gaze − direct gaze) on expression for the experimental blocks in Experiments 1-3 (a-c) and Experiment 5 (d), and overall effect sizes.

Experiment 5.³ In a subsequent experiment, Adams and Kleck (2005) also presented participants with pairs of faces in a ratings task. Both faces in a pair were identical (i.e., same identity, expression) except for eye gaze direction, and, similarly again to Experiment 5, expression ratings to these faces varied as a function of eye gaze. Remarkably, however, a majority of participants initially gave the same ratings to both faces in a pair; the interaction with eye gaze was only obtained after the task was adjusted so that participants were forced to provide different ratings for the two faces (Adams & Kleck, 2005, footnote 1). These findings with neutral faces and emotional face pairs strongly suggest that the pattern of Experiment 5 is not tied to the actual emotional content of a face, but emerges when a degree of ambiguity or a difficulty in classifying a facial expression exists. Crucially, Adams and Kleck argue that these findings reflect primary processing stages prior to any decision making, because similar effects are obtained with speeded reaction time tasks (2005, p. 10). Contrary to these conclusions, however, the present study indicates that averted eye gaze generally slows expression categorization in speeded tasks. Therefore, we suggest that averted gaze initially impairs expression processing generally; the use of different eye gaze directions is a strategic component that is acquired subsequently. On a more cautious note, we cannot explain why anger and fear were consistently bound to a specific gaze signal in Adams and Kleck's (2003, 2005) studies and in Experiment 5. For example, if some participants used averted gaze to classify fear, but others used direct gaze, then any effect of gaze should average out over the course of these experiments.
Perhaps, considering that eye gaze was edited manually to produce the averted gaze conditions, the stimulus set contained some inconsistencies in eye gaze information that systematically biased expression decisions. Or perhaps, when expression is difficult to categorize and the decision is binary, participants use the surface area of the sclera to resolve the ambiguity between angry and fearful faces. If this is done by associating the stimuli with the most exposed sclera, that is, the averted-gaze faces, with the characteristically gaping scleral surface of the fearful expressions (see Figure 2), then this could result in the eye gaze by expression interaction that was observed in Experiment 5.

³ We obtained a similar result in a reaction time study with happy, sad, and neutral faces (N = 19), in which observers made forced-choice happy/sad decisions. As in Experiments 1 and 2, responses to happy and sad expressions were slower in the averted than the direct gaze conditions (happy: averted gaze, 704 ms vs. direct gaze, 667 ms; sad: averted gaze, 899 ms vs. direct gaze, 877 ms). For neutral faces these effects varied as a function of eye gaze and response. Neutral faces classified as happy showed a cost for averted gaze trials (averted gaze, 1181 ms vs. direct gaze, 1145 ms), but this effect reversed when these faces were classified as sad (averted gaze, 900 ms vs. direct gaze, 910 ms). However, this interaction did not approach significance.

The comparatively high error percentage

for angry faces in the averted gaze condition in Experiment 5, which indicates that these faces are frequently mistaken for fearful expressions, provides some tentative support for this notion. Alternatively, it is possible that some gaze directions are naturally associated with particular emotional expressions, leading to enhanced processing when both types of information are combined within the same face. According to Adams and Kleck (2003, 2005), this could reflect a system that combines facial information with a shared signal value, according to whether this information signals the behavioural motivation to approach or to avoid a conspecific (see e.g., Argyle & Cook, 1976; Davidson & Hugdahl, 1995; Harmon-Jones & Sigelman, 2001). On this account, facial information such as direct gaze and positive emotions (and anger) are all associated with approach motivation, which may lead to more efficient expression processing when any of these signals are combined within the same stimulus. Similarly, the processing of negative, avoidance emotions (except anger) may be boosted when these are combined with averted eye gaze, which also signals avoidance. Evidence on how this theory applies to human emotion perception is still relatively sparse. The present experiments suggest that eye gaze and expression do not combine in this manner at any primary processing stage, prior to decision making. At the same time, it seems plausible that an approach/avoidance model can account for eye gaze and expression integration at subsequent stages. At present, however, it is unresolved how an approach/avoidance theory maps onto existing psychological data, including previous studies with fearful faces. For instance, an approach/avoidance model associates averted gaze with the perception of fear.
At the same time, there is extensive evidence that the perception of fear is intimately linked with direct eye gaze, characterized, in this expression, by a wide scleral contrast above the iris, which disappears as the eyes are turned sideways. This evidence includes neuropsychological studies, which demonstrate that the human amygdala responds more strongly to fear than to other facial expressions (e.g., Breiter et al., 1996; Calder, Lawrence, & Young, 2001; Calder et al., 1996; Morris et al., 1996; Thomas et al., 2001), more strongly to direct than to averted eye gaze (e.g., George, Driver, & Dolan, 2001; Kawashima et al., 1999; for converging evidence from primate studies, see Brothers & Ring, 1993; Brothers, Ring, & Kling, 1990), and particularly to direct fearful gaze (Morris, de Bonis, & Dolan, 2002; Whalen et al., 2004; for a contrary claim, see Adams, Gordon, Baird, Ambady, & Kleck, 2003; but see also Sato, Yoshikawa, Kochiyama, & Matsumura, 2004). Consistent with this notion, fearful faces were rated as more fearful-looking in the present study when they displayed direct gaze than when they displayed averted gaze (Experiment 4). Experiment 4 also revealed no effects of gaze on the ratings of the other expressions, which is also consistent with the idea that gaze forms a specific component of the fearful face signal only (see, e.g., Adolphs et al., 2005;


Cultural Differences in Cognitive Processing Style: Evidence from Eye Movements During Scene Processing Cultural Differences in Cognitive Processing Style: Evidence from Eye Movements During Scene Processing Zihui Lu (zihui.lu@utoronto.ca) Meredyth Daneman (daneman@psych.utoronto.ca) Eyal M. Reingold (reingold@psych.utoronto.ca)

More information

Satiation in name and face recognition

Satiation in name and face recognition Memory & Cognition 2000, 28 (5), 783-788 Satiation in name and face recognition MICHAEL B. LEWIS and HADYN D. ELLIS Cardiff University, Cardiff, Wales Massive repetition of a word can lead to a loss of

More information

Sequential similarity and comparison effects in category learning

Sequential similarity and comparison effects in category learning Sequential similarity and comparison effects in category learning Paulo F. Carvalho (pcarvalh@indiana.edu) Department of Psychological and Brain Sciences, Indiana University 1101 East Tenth Street Bloomington,

More information

Predictive Gaze Cues and Personality Judgments Should Eye Trust You?

Predictive Gaze Cues and Personality Judgments Should Eye Trust You? PSYCHOLOGICAL SCIENCE Research Article Predictive Gaze Cues and Personality Judgments Should Eye Trust You? Andrew P. Bayliss and Steven P. Tipper Centre for Cognitive Neuroscience, School of Psychology,

More information

Object Substitution Masking: When does Mask Preview work?

Object Substitution Masking: When does Mask Preview work? Object Substitution Masking: When does Mask Preview work? Stephen W. H. Lim (psylwhs@nus.edu.sg) Department of Psychology, National University of Singapore, Block AS6, 11 Law Link, Singapore 117570 Chua

More information

doi: /brain/awp255 Brain 2010: 133; Integration of gaze direction and facial expression in patients with unilateral amygdala damage

doi: /brain/awp255 Brain 2010: 133; Integration of gaze direction and facial expression in patients with unilateral amygdala damage doi:10.1093/brain/awp255 Brain 2010: 133; 248 261 248 BRAIN A JOURNAL OF NEUROLOGY Integration of gaze direction and facial expression in patients with unilateral amygdala damage Chiara Cristinzio, 1,2

More information

Rachael E. Jack, Caroline Blais, Christoph Scheepers, Philippe G. Schyns, and Roberto Caldara

Rachael E. Jack, Caroline Blais, Christoph Scheepers, Philippe G. Schyns, and Roberto Caldara Current Biology, Volume 19 Supplemental Data Cultural Confusions Show that Facial Expressions Are Not Universal Rachael E. Jack, Caroline Blais, Christoph Scheepers, Philippe G. Schyns, and Roberto Caldara

More information

Moralization Through Moral Shock: Exploring Emotional Antecedents to Moral Conviction. Table of Contents

Moralization Through Moral Shock: Exploring Emotional Antecedents to Moral Conviction. Table of Contents Supplemental Materials 1 Supplemental Materials for Wisneski and Skitka Moralization Through Moral Shock: Exploring Emotional Antecedents to Moral Conviction Table of Contents 2 Pilot Studies 2 High Awareness

More information

(Visual) Attention. October 3, PSY Visual Attention 1

(Visual) Attention. October 3, PSY Visual Attention 1 (Visual) Attention Perception and awareness of a visual object seems to involve attending to the object. Do we have to attend to an object to perceive it? Some tasks seem to proceed with little or no attention

More information

The Perception of Gender in Human Faces Samantha Haseltine Gustavus Adolphus College Faculty Advisor: Patricia Costello. Perception of Gender 1

The Perception of Gender in Human Faces Samantha Haseltine Gustavus Adolphus College Faculty Advisor: Patricia Costello. Perception of Gender 1 The Perception of Gender in Human Faces Samantha Haseltine Gustavus Adolphus College Faculty Advisor: Patricia Costello Perception of Gender 1 Perception of Gender 2 Abstract: When we look at a face we

More information

Effect of Positive and Negative Instances on Rule Discovery: Investigation Using Eye Tracking

Effect of Positive and Negative Instances on Rule Discovery: Investigation Using Eye Tracking Effect of Positive and Negative Instances on Rule Discovery: Investigation Using Eye Tracking Miki Matsumuro (muro@cog.human.nagoya-u.ac.jp) Kazuhisa Miwa (miwa@is.nagoya-u.ac.jp) Graduate School of Information

More information

Balancing Cognitive Demands: Control Adjustments in the Stop-Signal Paradigm

Balancing Cognitive Demands: Control Adjustments in the Stop-Signal Paradigm Journal of Experimental Psychology: Learning, Memory, and Cognition 2011, Vol. 37, No. 2, 392 404 2010 American Psychological Association 0278-7393/10/$12.00 DOI: 10.1037/a0021800 Balancing Cognitive Demands:

More information

HOW DOES PERCEPTUAL LOAD DIFFER FROM SENSORY CONSTRAINS? TOWARD A UNIFIED THEORY OF GENERAL TASK DIFFICULTY

HOW DOES PERCEPTUAL LOAD DIFFER FROM SENSORY CONSTRAINS? TOWARD A UNIFIED THEORY OF GENERAL TASK DIFFICULTY HOW DOES PERCEPTUAL LOAD DIFFER FROM SESORY COSTRAIS? TOWARD A UIFIED THEORY OF GEERAL TASK DIFFICULTY Hanna Benoni and Yehoshua Tsal Department of Psychology, Tel-Aviv University hannaben@post.tau.ac.il

More information

Stimulus set size modulates the sex emotion interaction in face categorization

Stimulus set size modulates the sex emotion interaction in face categorization Atten Percept Psychophys (2015) 77:1285 1294 DOI 10.3758/s13414-015-0849-x Stimulus set size modulates the sex emotion interaction in face categorization Ottmar V. Lipp & Fika Karnadewi & Belinda M. Craig

More information

Perceptual and Motor Skills, 2010, 111, 3, Perceptual and Motor Skills 2010 KAZUO MORI HIDEKO MORI

Perceptual and Motor Skills, 2010, 111, 3, Perceptual and Motor Skills 2010 KAZUO MORI HIDEKO MORI Perceptual and Motor Skills, 2010, 111, 3, 785-789. Perceptual and Motor Skills 2010 EXAMINATION OF THE PASSIVE FACIAL FEEDBACK HYPOTHESIS USING AN IMPLICIT MEASURE: WITH A FURROWED BROW, NEUTRAL OBJECTS

More information

The Effect of Contextual Information and Emotional Clarity on Emotional Evaluation

The Effect of Contextual Information and Emotional Clarity on Emotional Evaluation American International Journal of Social Science Vol. 6, No. 4, December 2017 The Effect of Contextual Information and Emotional Clarity on Emotional Evaluation Fada Pan*, Leyuan Li, Yanyan Zhang, Li Zhang

More information

Differential Viewing Strategies towards Attractive and Unattractive Human Faces

Differential Viewing Strategies towards Attractive and Unattractive Human Faces Differential Viewing Strategies towards Attractive and Unattractive Human Faces Ivan Getov igetov@clemson.edu Greg Gettings ggettin@clemson.edu A.J. Villanueva aaronjv@clemson.edu Chris Belcher cbelche@clemson.edu

More information

Selective bias in temporal bisection task by number exposition

Selective bias in temporal bisection task by number exposition Selective bias in temporal bisection task by number exposition Carmelo M. Vicario¹ ¹ Dipartimento di Psicologia, Università Roma la Sapienza, via dei Marsi 78, Roma, Italy Key words: number- time- spatial

More information

1 Visual search for emotional expressions: Effect of stimulus set on anger and happiness. superiority

1 Visual search for emotional expressions: Effect of stimulus set on anger and happiness. superiority 1 Visual search for emotional expressions: Effect of stimulus set on anger and happiness superiority Ruth A. Savage 1, Stefanie I. Becker 1 & Ottmar V. Lipp 2 1 School of Psychology, University of Queensland,

More information

Selective attention and asymmetry in the Müller-Lyer illusion

Selective attention and asymmetry in the Müller-Lyer illusion Psychonomic Bulletin & Review 2004, 11 (5), 916-920 Selective attention and asymmetry in the Müller-Lyer illusion JOHN PREDEBON University of Sydney, Sydney, New South Wales, Australia Two experiments

More information

Comment on McLeod and Hume, Overlapping Mental Operations in Serial Performance with Preview: Typing

Comment on McLeod and Hume, Overlapping Mental Operations in Serial Performance with Preview: Typing THE QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY, 1994, 47A (1) 201-205 Comment on McLeod and Hume, Overlapping Mental Operations in Serial Performance with Preview: Typing Harold Pashler University of

More information

Looking at You or Looking Elsewhere: The Influence of Head Orientation on the Signal Value of Emotional Facial Expressions

Looking at You or Looking Elsewhere: The Influence of Head Orientation on the Signal Value of Emotional Facial Expressions Motiv Emot (2007) 31:137 144 DOI 10.1007/s11031-007-9057-x ORIGINAL PAPER Looking at You or Looking Elsewhere: The Influence of Head Orientation on the Signal Value of Emotional Facial Expressions Ursula

More information

the remaining half of the arrays, a single target image of a different type from the remaining

the remaining half of the arrays, a single target image of a different type from the remaining 8 the remaining half of the arrays, a single target image of a different type from the remaining items was included. Participants were asked to decide whether a different item was included in the array,

More information

Are Retrievals from Long-Term Memory Interruptible?

Are Retrievals from Long-Term Memory Interruptible? Are Retrievals from Long-Term Memory Interruptible? Michael D. Byrne byrne@acm.org Department of Psychology Rice University Houston, TX 77251 Abstract Many simple performance parameters about human memory

More information

Differences of Face and Object Recognition in Utilizing Early Visual Information

Differences of Face and Object Recognition in Utilizing Early Visual Information Differences of Face and Object Recognition in Utilizing Early Visual Information Peter Kalocsai and Irving Biederman Department of Psychology and Computer Science University of Southern California Los

More information

Positive emotion expands visual attention...or maybe not...

Positive emotion expands visual attention...or maybe not... Positive emotion expands visual attention...or maybe not... Taylor, AJ, Bendall, RCA and Thompson, C Title Authors Type URL Positive emotion expands visual attention...or maybe not... Taylor, AJ, Bendall,

More information

Limitations of Object-Based Feature Encoding in Visual Short-Term Memory

Limitations of Object-Based Feature Encoding in Visual Short-Term Memory Journal of Experimental Psychology: Human Perception and Performance 2002, Vol. 28, No. 2, 458 468 Copyright 2002 by the American Psychological Association, Inc. 0096-1523/02/$5.00 DOI: 10.1037//0096-1523.28.2.458

More information

A model of parallel time estimation

A model of parallel time estimation A model of parallel time estimation Hedderik van Rijn 1 and Niels Taatgen 1,2 1 Department of Artificial Intelligence, University of Groningen Grote Kruisstraat 2/1, 9712 TS Groningen 2 Department of Psychology,

More information

University of Groningen. Imitation of emotion Velde, Sytske Willemien van der

University of Groningen. Imitation of emotion Velde, Sytske Willemien van der University of Groningen Imitation of emotion Velde, Sytske Willemien van der IMPORTANT NOTE: You are advised to consult the publisher's version (publisher's PDF) if you wish to cite from it. Please check

More information

Scientific Research. The Scientific Method. Scientific Explanation

Scientific Research. The Scientific Method. Scientific Explanation Scientific Research The Scientific Method Make systematic observations. Develop a testable explanation. Submit the explanation to empirical test. If explanation fails the test, then Revise the explanation

More information

Eye movements during emotion recognition in faces

Eye movements during emotion recognition in faces Journal of Vision (2014) 14(13):14, 1 16 http://www.journalofvision.org/content/14/13/14 1 Eye movements during emotion recognition in faces Department of Psychological & Brain Sciences, M. W. Schurgin

More information

What Matters in the Cued Task-Switching Paradigm: Tasks or Cues? Ulrich Mayr. University of Oregon

What Matters in the Cued Task-Switching Paradigm: Tasks or Cues? Ulrich Mayr. University of Oregon What Matters in the Cued Task-Switching Paradigm: Tasks or Cues? Ulrich Mayr University of Oregon Running head: Cue-specific versus task-specific switch costs Ulrich Mayr Department of Psychology University

More information

University of Alberta. The SNARC effect as a tool to Examine Crosstalk during Numerical Processing in a PRP paradigm. Shawn Tan

University of Alberta. The SNARC effect as a tool to Examine Crosstalk during Numerical Processing in a PRP paradigm. Shawn Tan University of Alberta The SNARC effect as a tool to Examine Crosstalk during Numerical Processing in a PRP paradigm by Shawn Tan A thesis submitted to the Faculty of Graduate Studies and Research in partial

More information

Archived at the Flinders Academic Commons:

Archived at the Flinders Academic Commons: Archived at the Flinders Academic Commons: http://dspace.flinders.edu.au/dspace/ This is the authors version of an article published in Psychologica Belgica. The original publication is available by subscription

More information

SOCIAL JUDGMENTS ARE INFLUENCED BY BOTH FACIAL EXPRESSION AND DIRECTION OF EYE GAZE

SOCIAL JUDGMENTS ARE INFLUENCED BY BOTH FACIAL EXPRESSION AND DIRECTION OF EYE GAZE Social Cognition, Vol. 9, No. 4, 011, pp. 415 49 SOCIAL JUDGMENTS ARE INFLUENCED BY BOTH FACIAL EXPRESSION AND DIRECTION OF EYE GAZE Megan L. Willis Macquarie University, Sydney, Australia; University

More information

The Simon Effect as a Function of Temporal Overlap between Relevant and Irrelevant

The Simon Effect as a Function of Temporal Overlap between Relevant and Irrelevant University of North Florida UNF Digital Commons All Volumes (2001-2008) The Osprey Journal of Ideas and Inquiry 2008 The Simon Effect as a Function of Temporal Overlap between Relevant and Irrelevant Leslie

More information

The Color of Similarity

The Color of Similarity The Color of Similarity Brooke O. Breaux (bfo1493@louisiana.edu) Institute of Cognitive Science, University of Louisiana at Lafayette, Lafayette, LA 70504 USA Michele I. Feist (feist@louisiana.edu) Institute

More information

Brain and Cognition, 48(2-3), (2002) Evaluation of nonverbal emotion in face and voice: some preliminary findings on a new battery of tests

Brain and Cognition, 48(2-3), (2002) Evaluation of nonverbal emotion in face and voice: some preliminary findings on a new battery of tests Brain and Cognition, 48(2-3), 499-514 (2002) Evaluation of nonverbal emotion in face and voice: some preliminary findings on a new battery of tests Marc David Pell McGill University, Montréal Abstract

More information

Reflexive Spatial Attention to Goal-Directed Reaching

Reflexive Spatial Attention to Goal-Directed Reaching Reflexive Spatial Attention to Goal-Directed Reaching Alexis A. Barton (aabarton@indiana.edu) Bennett I. Bertenthal (bbertent@indiana.edu) Samuel Harding (hardinsm@indiana.edu) Department of Psychological

More information

Understanding emotions from standardized facial expressions in autism and normal development

Understanding emotions from standardized facial expressions in autism and normal development Understanding emotions from standardized facial expressions in autism and normal development autism 2005 SAGE Publications and The National Autistic Society Vol 9(4) 428 449; 056082 1362-3613(200510)9:4

More information

Gaze cues evoke both spatial and object-centered shifts of attention

Gaze cues evoke both spatial and object-centered shifts of attention Journal Perception & Psychophysics 2006,?? 68 (?), (2),???-??? 310-318 Gaze cues evoke both spatial and object-centered shifts of attention ANDREW P. BAYLISS and STEVEN P. TIPPER University of Wales, Bangor,

More information

Viewpoint dependent recognition of familiar faces

Viewpoint dependent recognition of familiar faces Viewpoint dependent recognition of familiar faces N. F. Troje* and D. Kersten *Max-Planck Institut für biologische Kybernetik, Spemannstr. 38, 72076 Tübingen, Germany Department of Psychology, University

More information

Is Face Distinctiveness Gender Based?

Is Face Distinctiveness Gender Based? Journal of Experimental Psychology: Human Perception and Performance 2006, Vol. 32, No. 4, 789 798 Copyright 2006 by the American Psychological Association 0096-1523/06/$12.00 DOI: 10.1037/0096-1523.32.4.789

More information

Reflexive Visual Orienting in Response to the Social Attention of Others

Reflexive Visual Orienting in Response to the Social Attention of Others Reflexive Visual Orienting in Response to the Social Attention of Others Stephen R. H. Langton and Vicki Bruce Department of Psychology, University of Stirling Running Head: VISUAL ORIENTING AND SOCIAL

More information

Enhanced discrimination in autism

Enhanced discrimination in autism THE QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY, 2001, 54A (4), 961 979 Enhanced discrimination in autism Michelle O Riordan and Kate Plaisted University of Cambridge, Cambridge, UK Children with autism

More information

The Meaning of the Mask Matters

The Meaning of the Mask Matters PSYCHOLOGICAL SCIENCE Research Report The Meaning of the Mask Matters Evidence of Conceptual Interference in the Attentional Blink Paul E. Dux and Veronika Coltheart Macquarie Centre for Cognitive Science,

More information

Short article Detecting objects is easier than categorizing them

Short article Detecting objects is easier than categorizing them THE QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY 2008, 61 (4), 552 557 Short article Detecting objects is easier than categorizing them Jeffrey S. Bowers and Keely W. Jones University of Bristol, Bristol,

More information

PSYCHOLOGICAL SCIENCE. Research Report

PSYCHOLOGICAL SCIENCE. Research Report Research Report CHANGING FACES: A Detection Advantage in the Flicker Paradigm Tony Ro, 1 Charlotte Russell, 2 and Nilli Lavie 2 1 Rice University and 2 University College London, London, United Kingdom

More information

Parallel response selection in dual-task situations via automatic category-to-response translation

Parallel response selection in dual-task situations via automatic category-to-response translation Attention, Perception, & Psychophysics 2010, 72 (7), 1791-1802 doi:10.3758/app.72.7.1791 Parallel response selection in dual-task situations via automatic category-to-response translation SANDRA A J. THOMSON,

More information

Running head: PERCEPTUAL GROUPING AND SPATIAL SELECTION 1. The attentional window configures to object boundaries. University of Iowa

Running head: PERCEPTUAL GROUPING AND SPATIAL SELECTION 1. The attentional window configures to object boundaries. University of Iowa Running head: PERCEPTUAL GROUPING AND SPATIAL SELECTION 1 The attentional window configures to object boundaries University of Iowa Running head: PERCEPTUAL GROUPING AND SPATIAL SELECTION 2 ABSTRACT When

More information

Black 1 White 5 Black

Black 1 White 5 Black PSYCHOLOGICAL SCIENCE Research Report Black 1 White 5 Black Hypodescent in Reflexive Categorization of Racially Ambiguous Faces Destiny Peery and Galen V. Bodenhausen Northwestern University ABSTRACT Historically,

More information

Spatially Diffuse Inhibition Affects Multiple Locations: A Reply to Tipper, Weaver, and Watson (1996)

Spatially Diffuse Inhibition Affects Multiple Locations: A Reply to Tipper, Weaver, and Watson (1996) Journal of Experimental Psychology: Human Perception and Performance 1996, Vol. 22, No. 5, 1294-1298 Copyright 1996 by the American Psychological Association, Inc. 0096-1523/%/$3.00 Spatially Diffuse Inhibition

More information

Journal of Experimental Psychology: General

Journal of Experimental Psychology: General Journal of Experimental Psychology: General Internal Representations Reveal Cultural Diversity in Expectations of Facial Expressions of Emotion Rachael E. Jack, Roberto Caldara, and Philippe G. Schyns

More information

Prime display offset modulates negative priming only for easy-selection tasks

Prime display offset modulates negative priming only for easy-selection tasks Memory & Cognition 2007, 35 (3), 504-513 Prime display offset modulates negative priming only for easy-selection tasks Christian Frings Saarland University, Saarbrücken, Germany and Peter Wühr Friedrich

More information

An investigation of basic facial expression recognition in autism spectrum disorders

An investigation of basic facial expression recognition in autism spectrum disorders Cognition and Emotion ISSN: 0269-9931 (Print) 1464-0600 (Online) Journal homepage: http://www.tandfonline.com/loi/pcem20 An investigation of basic facial expression recognition in autism spectrum disorders

More information

Focusing on fear: Attentional disengagement from emotional faces

Focusing on fear: Attentional disengagement from emotional faces Europe PMC Funders Group Author Manuscript Published in final edited form as: Vis cogn. 2005 ; 12(1): 145 158. doi:10.1080/13506280444000076. Focusing on fear: Attentional disengagement from emotional

More information

Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters. Qiong Wu St. Bayside, NY Stuyvesant High School

Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters. Qiong Wu St. Bayside, NY Stuyvesant High School Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters Qiong Wu 58-11 205 St. Bayside, NY 11364 Stuyvesant High School 345 Chambers St. New York, NY 10282 Q. Wu (2001) Are faces special?

More information

SELECTIVE ATTENTION AND CONFIDENCE CALIBRATION

SELECTIVE ATTENTION AND CONFIDENCE CALIBRATION SELECTIVE ATTENTION AND CONFIDENCE CALIBRATION Jordan Schoenherr, Craig Leth-Steensen, and William M. Petrusic psychophysics.lab@gmail.com, craig_leth_steensen@carleton.ca, bpetrusi@carleton.ca Carleton

More information

Why Does Similarity Correlate With Inductive Strength?

Why Does Similarity Correlate With Inductive Strength? Why Does Similarity Correlate With Inductive Strength? Uri Hasson (uhasson@princeton.edu) Psychology Department, Princeton University Princeton, NJ 08540 USA Geoffrey P. Goodwin (ggoodwin@princeton.edu)

More information

The Associability Theory vs. The Strategic Re-Coding Theory: The Reverse Transfer Along a Continuum Effect in Human Discrimination Learning

The Associability Theory vs. The Strategic Re-Coding Theory: The Reverse Transfer Along a Continuum Effect in Human Discrimination Learning The Associability Theory vs. The Strategic Re-Coding Theory: The Reverse Transfer Along a Continuum Effect in Human Discrimination Learning Mizue Tachi (MT334@hermes.cam.ac.uk) R.P. McLaren (RPM31@hermes.cam.ac.uk)

More information

What matters in the cued task-switching paradigm: Tasks or cues?

What matters in the cued task-switching paradigm: Tasks or cues? Journal Psychonomic Bulletin & Review 2006,?? 13 (?), (5),???-??? 794-799 What matters in the cued task-switching paradigm: Tasks or cues? ULRICH MAYR University of Oregon, Eugene, Oregon Schneider and

More information