Perceived Realism 1

Running Head: PERCEIVED REALISM OF DYNAMIC FACIAL EXPRESSIONS

Perceived Realism of Dynamic Facial Expressions of Emotion: Optimal Durations for the Presentation of Emotional Onsets and Offsets

Holger Hoffmann, Harald C. Traue, Franziska Bachmayr and Henrik Kessler

University Clinic for Psychosomatic Medicine and Psychotherapy, Medical Psychology Section, Ulm, Germany

Correspondence concerning this article should be addressed to Holger Hoffmann, University Clinic for Psychosomatic Medicine and Psychotherapy, Medical Psychology Section, Am Hochsträß 8, D Ulm, Germany, or by e-mail to holger.hoffmann@uni-ulm.de. This research was supported in part by grants from the Transregional Collaborative Research Centre SFB/TRR 62 "Companion-Technology for Cognitive Technical Systems" funded by the German Research Foundation (DFG).

Abstract

The presentation of facial displays of emotions is an important method in emotion recognition studies in various basic and applied settings. This study makes a methodological contribution and investigates the perceived realism of dynamic facial expressions for six emotions (fear, sadness, anger, happiness, disgust, and surprise). We presented dynamic displays of faces evolving from a neutral to an emotional expression (onsets) and faces evolving from an emotional expression to a neutral one (offsets). Participants rated the perceived realism of stimuli of different durations ( ms) and adjusted the duration of each sequence until they perceived it as maximally realistic. Durations perceived as most realistic are reported for each emotion, providing an important basis for the construction of dynamic facial stimuli in future research.

Perceived Realism of Dynamic Facial Expressions of Emotion: Optimal Durations for the Presentation of Emotional Onsets and Offsets

The presentation of facial displays of emotions is an important method in emotion recognition studies used in basic as well as clinical research. Research in this area was propelled by the introduction of standardized stimuli, primarily the Japanese and Caucasian Facial Expressions of Emotion image set (JACFEE; Matsumoto & Ekman, 1988) and the portrait pictures of facial affect that were produced using Ekman's Facial Action Coding System (FACS; Ekman & Friesen, 1978). Although facial expressions do not necessarily reflect an internal affective state (Kappas, 2003), especially if they are posed, as in the case of the JACFEE pictures, for the purpose of this article we refer to an emotional facial expression whenever the face looks to an observer at least as if it were displaying an emotion.

One of the major drawbacks of research on facial expressions is that static displays are used in most studies (e.g. Calder et al., 2003; Hall et al., 2004), which are clearly different from real-life conditions and therefore lack ecological validity (Carroll & Russell, 1997). The use of dynamic facial displays of emotion that evolve from a neutral to an emotional expression over time, and vice versa, offers a solution to this problem. The change from a neutral to a full-blown emotional expression is the onset, the time the full-blown expression stays on the face is called the apex, and the change back to a neutral expression is defined as the offset. In addition to increased ecological validity, the use of dynamic stimuli offers additional advantages.
First, recent neuroimaging research (Kilts, Egan, Gideon, Ely, & Hoffman, 2003; LaBar, Crupain, Voyvodic, & McCarthy, 2003; Sato, Kochiyama, Yoshikawa, Naito, & Matsumura, 2004) and computer modelling studies (Haxby, Hoffman, & Gobbini, 2002) have shown that more brain areas are active in reaction to dynamic than to static facial emotions, indicating that the use of dynamic stimuli may provide additional information that helps to elucidate underlying recognition mechanisms. Second, there is empirical evidence showing that participants recognize emotions better when displayed

dynamically than when displayed as static or multi-static pictures (Ambadar, Schooler, & Cohn, 2005; Harwood, Hall, & Shinkfield, 1999; Wehrle, Kaiser, Schmidt, & Scherer, 2000; Weyers, Muhlberger, Hefele, & Pauli, 2006).

Ecological validity may become problematic, however, when morphed sequences are used. For example, unnaturally slow morphed sequences have been used in one study (Blair, Colledge, Murray, & Mitchell, 2001), but the perceived realism of those stimuli has never been assessed (LaBar et al., 2003). Although the importance of dynamic information in facial-emotion recognition was pointed out by Bassili (1978) and Ekman and Friesen (1982) several decades ago, detailed studies on the perceived realism of facial expressions with different durations have only been conducted recently. To our knowledge, only three studies to date report on the perceived realism of different durations for different facial expressions (Kamachi et al., 2001; Sato & Yoshikawa, 2004; Pollick, Hill, Calder, & Paterson, 2003). We will compare the results of those three studies with our own data in the discussion.

The major aim of the present study is to determine at which duration morphed dynamic emotional expressions have to be shown in order to be perceived as realistic. Although the nature of this investigation is exploratory rather than hypothesis-driven, we assume that distinct emotions need to be displayed at different durations to appear realistic (see Kamachi et al., 2001; Sato & Yoshikawa, 2004; and Pollick et al., 2003). This study was designed to measure the perception of the temporal characteristics of both onsets and offsets of facial displays of emotion. Participants were shown dynamic displays of faces either evolving from a neutral to an emotional expression (onsets) or evolving back from an emotional expression to a neutral one (offsets) and were asked to adjust the duration of the sequences until they perceived them as maximally realistic.

Method

Participants

A total of n = 124 volunteers participated in this experiment. The onset condition was rated by n = 84 participants (aged years, M = 21.93 years, 71% female) and the offset condition by n = 40 participants (aged years, M = years, 53% female). Written consent was obtained from participants before the experiment, which was approved by the University ethics board.

Stimuli

Sequences used in this experiment were generated using static images from the JACFEE image set (Matsumoto & Ekman, 1988). In this image set, fifty-six different actors each portray one of six emotions (anger, disgust, fear, happiness, sadness, surprise); half of them are male and half female, and half are of Japanese and half of Caucasian origin. Throughout our experiments, this balance was maintained in our sub-sets of stimuli in order to enhance ecological validity. Every actor displays only one emotional and one neutral expression. Except for happiness, which was photographed while the actors were spontaneously smiling, all emotional expressions were posed. Various studies have shown the reliability and validity of the JACFEE set in displaying the intended emotions (e.g., Biehl et al., 1997).

Morph sequences were generated using the FEMT (Facial Expression Morphing Tool; Kessler et al., 2005). This newly developed software uses state-of-the-art morphing algorithms to produce intermediate frames between two images. To optimize the production of facial morphs, additional techniques were implemented. First, to minimize distracting facial information, sequences were generated using multiple layers in order to morph only the important features of the face. Second, the use of multiple layers and special smoothing algorithms allowed us to create realistic transitions from closed to open mouths. Third,

transition control allowed us to apply different rates of warping and color blending to different facial areas across the sequence. Since no empirical data are available on the actual transition of facial features in emotional expressions, we heuristically applied global warping with an S-curve, while color blending was done in a linear manner. The FEMT generates video files whose duration is specified by the number of frames morphed between the first and the last image and by the number of frames per second (in this study, a constant 25 frames per second). Since each frame was presented for a constant 40 ms and the duration of sequences differed, the degree of morphing per frame changed accordingly. Regardless of the absolute length of the video clips, at a frame rate of 25 frames per second all sequences were perceived as fluid. All stimuli used in this study were in color. Morph generation was done on a regular Win2K/XP system.

We selected pictures for the six emotions (anger, disgust, fear, happiness, sadness and surprise) from the JACFEE set; each emotion was portrayed by eight different actors. This resulted in a total of 48 different actors portraying the emotions. A neutral picture of each actor was also selected. Next we generated emotional sequences, in which the expression changed from a neutral face to an emotional face (onset) or from an emotional expression to a neutral one (offset), using both pictures from the JACFEE set and synthesizing the intermediate images with the FEMT. Sequences were generated at various durations and consisted of 6 to 76 frames (in increments of 5 frames), using a frame rate of 25 frames per second. We thus obtained 15 video clips for each actor, lasting between 240 ms and 3040 ms (in steps of 200 ms), for a total of 720 sequences (48 actors × 15 video clips) per condition.
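The frame-count arithmetic above can be sketched as follows. The paper does not specify the exact S-curve used by the FEMT, so the smoothstep function below is purely an illustrative assumption; the clip durations, by contrast, follow directly from the stated parameters (6 to 76 frames in steps of 5 at 25 frames per second).

```python
# Sketch of the morph-timeline arithmetic described above.
# Assumption: the exact S-curve used by the FEMT is not specified in the
# text; smoothstep (3t^2 - 2t^3) stands in here as one common example.

FPS = 25                   # frames per second (constant in the study)
FRAME_MS = 1000 // FPS     # 40 ms per frame

def clip_durations(min_frames=6, max_frames=76, step=5):
    """Durations (ms) of the 15 morph clips: 6 to 76 frames in steps of 5."""
    return [n * FRAME_MS for n in range(min_frames, max_frames + 1, step)]

def smoothstep(t):
    """Example S-curve warp schedule on t in [0, 1] (an assumption)."""
    return 3 * t**2 - 2 * t**3

def blend_weights(n_frames):
    """Per-frame (warp, color) weights: S-curved warping, linear color blend."""
    ts = [i / (n_frames - 1) for i in range(n_frames)]
    return [(smoothstep(t), t) for t in ts]

durations = clip_durations()
print(len(durations), durations[0], durations[-1])  # 15 clips, 240 ms .. 3040 ms
```

Note how the 5-frame increment at 40 ms per frame yields exactly the 200 ms step between clip durations reported in the text.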
We used a 6 (emotion: anger, disgust, fear, happiness, sadness, surprise) × 2 (actor's sex) × 2 (actor's ethnicity: Caucasian, Japanese) × 2 (participant's sex) mixed design (a linear mixed model accounting for the dependency structure of the data).

Apparatus

The experiment was run using our own computer software written in Delphi 6.0 on a Win2K/XP system. The stimuli were presented on 19" TFT monitors at a resolution of pixels and 24-bit color depth. The viewing distance was approximately 60 cm, which corresponds to a viewing angle of 11.4° horizontally and 15.6° vertically ( pixels).

Procedure

Participants were tested individually using our laboratory software, with each session lasting approximately 15 minutes. Subjects saw 48 emotion sequences (8 actors for each emotion), showing the development from a neutral to an emotional face in the onset sample or the fading from an emotional to a neutral face in the offset sample. They were instructed to adjust the duration of each sequence until they perceived the sequence as maximally realistic. This was done by repeatedly pressing faster/slower buttons. Participants could use six different buttons to control the presentation: four to change the duration of the current video clip (in steps of +/- 200 or 600 ms), one to repeat the current sequence, and one to select the current duration once they had decided it was maximally realistic. Since in a preliminary test subjects could barely differentiate between two sequences of longer durations (e.g. 1.8 and 2.0 seconds), we decided to present sequences in steps of 200 ms. Each emotion sequence was preceded by a 1000 ms neutral (onset) or emotional (offset) face and followed by a 300 ms full-blown emotional (onset) or neutral (offset) face. Sequences were presented in random order across all emotions. The duration of the first sequence displayed in every trial was also randomized ( ms). The name of the emotion displayed was written at the top of the screen as the sequence unfolded. Furthermore, participants were allowed to repeat a sequence as often as they wanted, until they were confident that the duration was as realistic as possible.
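The adjustment procedure above amounts to a simple bounded search over the 15 pre-generated durations. A minimal sketch follows; clamping to the available range (240 to 3040 ms) and snapping to the 200 ms grid are assumptions where the text is silent on edge cases, not a description of the actual Delphi implementation.

```python
# Minimal sketch of the faster/slower adjustment procedure described above.
# Assumptions: button presses are clamped to the available clip range
# (240-3040 ms) and always land on the 200 ms grid of generated clips.

MIN_MS, MAX_MS, STEP_MS = 240, 3040, 200

def adjust(current_ms, delta_ms):
    """Apply one button press (+/- 200 or +/- 600 ms), snapping the
    result to the nearest generated clip duration."""
    target = max(MIN_MS, min(MAX_MS, current_ms + delta_ms))
    return MIN_MS + round((target - MIN_MS) / STEP_MS) * STEP_MS

# Example session: start at 840 ms, press "slower" twice (+600, then +200)
d = adjust(840, +600)   # 1440 ms
d = adjust(d, +200)     # 1640 ms
```

The participant would loop over such presses, replaying the clip as often as desired, until pressing the "select" button to record the current duration.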

Results

Onset condition

Due to the skewed distribution of the selected time frames, values were transformed logarithmically for use in a linear mixed model. An analysis of variance revealed that the duration selected as most realistic differed significantly across emotions, F(5,3923) = , p < . The main effect of participant's sex was also significant (F(1,82) = 4.24, p < .05), as was the interaction between participant's sex and type of emotion, F(5,3923) = 7.05, p < . The main effect of actor's sex was also significant (F(1,3923) = 4.36, p < .05), but the main effect of actor's ethnicity was not. No other interactions were significant. Table 1 shows descriptive statistics for the preferred duration of each emotion and the estimated values ("model estimates") based upon our statistical model. Sadness was considered most realistic when shown for relatively long durations, followed by anger, happiness and disgust; surprise and fear were seen as most realistic when shown for relatively short durations (see also Figure 1, left side). Further, male participants (M = 806 ms, SD = 571 ms) tended to perceive the sequences as more realistic when presented at shorter durations than female participants did (M = 947 ms, SD = 598 ms), especially for the emotions disgust, fear and sadness (p < .05).

Offset condition

The analysis of the selected time frames for each emotion revealed that preferred durations again differed across emotions (F(5,1855) = 35.20, p < .001), whereas there was only a trend for participant's sex (F(1,38) = 3.43, p = .072). No significant effects were found for actor's sex or ethnicity, and no significant interaction effects were found either. Table 1 shows descriptive statistics and the model estimates for each emotion, indicating which offset duration was perceived as most realistic by participants. The most realistic offset duration for
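The log transform mentioned above can be illustrated with synthetic data. The durations below are drawn from a log-normal distribution (a common model for adjustment and reaction times) and are not the study's data; the point is only that logging a right-skewed variable yields a roughly symmetric one, which suits the normality assumptions of the linear mixed model.

```python
# Illustration of the logarithmic transform applied before the mixed model.
# The data here are synthetic (log-normal), NOT the study's actual ratings.
import math
import random

random.seed(1)
durations_ms = [math.exp(random.gauss(6.7, 0.5)) for _ in range(1000)]

def skewness(xs):
    """Sample skewness: E[(x - mean)^3] / sd^3."""
    n = len(xs)
    m = sum(xs) / n
    sd = (sum((x - m) ** 2 for x in xs) / n) ** 0.5
    return sum((x - m) ** 3 for x in xs) / (n * sd ** 3)

raw_skew = skewness(durations_ms)                         # clearly positive
log_skew = skewness([math.log(x) for x in durations_ms])  # near zero
```

The mixed-model fit itself (with emotion, sex, and ethnicity as factors and the dependency structure of repeated measures) is not reproduced here.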

sadness was longer than for all other emotions. Again, surprise was perceived as most realistic when presented for a shorter duration than all other emotions. The most realistic offset durations for anger, disgust, fear, and happiness ranged between 1200 and 1700 ms (see also Figure 1, right side).

[Table 1 to be included here]

[Figure 1 to be included here]

When comparing perceived realism between the offset and onset conditions, a 2 × 2 analysis of variance (ANOVA) with type of presentation condition (onset, offset) and participant's sex showed that the conditions differed significantly (F(1,5948) = , p < .0001). The main effect of participant's sex was also significant (F(1,5948) = , p < .0001), as was the interaction between participant's sex and condition (F(1,5948) = 18.75, p < .0001). The means show that female participants chose longer durations on average, particularly in the offset condition.

Discussion

This study contributes new data to the question of how long dynamic facial displays of emotion have to be shown in order to be perceived as realistic. Such empirically obtained time frames allow the creation of realistic dynamic facial expressions. The results confirm that facial displays have different onset durations at which they are perceived as most realistic. Durations vary from 480 ms to 1120 ms according to the estimates of our statistical model. Sadness ( ms), surprise ( ms) and fear ( ms) are the emotions for which the longest and the shortest onset durations, respectively, are perceived as being realistic. The most realistic onset durations for anger, disgust, and happiness range between 670 ms and 890 ms. We suggest that these temporal characteristics be taken into account when presenting dynamic facial expression stimuli. As with onsets, preferred offset durations differed across emotions. Surprise ( ms) and sadness ( ms), the

two displays with the shortest and longest onset durations, also had the shortest and longest offset durations. Offset durations for happiness, disgust, fear, and anger ranged between 1040 ms and 1510 ms. The mean offset duration was significantly longer than the mean onset duration across all emotions presented. Participants may have chosen longer durations for offsets than for onsets because real-life emotion expressions may appear more quickly than they fade away.

Our study went beyond the earlier work of Kamachi et al. (2001) and Sato and Yoshikawa (2004) in several respects. We used six emotions and drew upon a reliable, valid and widely used stimulus set (JACFEE; see above) to enhance ecological validity. Yet our data are comparable to those from other studies conducted with different methods. In line with our own findings, Kamachi et al. (2001) reported that there were specific durations for different facial emotion film clips at which recognition rates were highest. The durations for sadness (3.4 s) and surprise (.2 s) that they reported as optimal for the recognition of an emotion were in the same relative range as in our study. The duration for anger (.9 s) was almost exactly the same. Moreover, participants easily perceived a facial expression as surprise when it was shown for a brief duration. Sato and Yoshikawa (2004) asked participants to rate the naturalness of the onsets of dynamic facial expressions of six emotions presented in different time frames. Although they offered only four possible durations (255, 510, 1020 and 2040 ms), their findings are comparable to ours. Surprise and, to a lesser extent, fear were rated most natural when displayed briefly (255 ms), whereas sadness was rated most natural when shown for long durations (1020 ms). An interesting study on real dynamic facial expressions was conducted by Pollick et al. (2003).
They recorded posed facial expressions of actors with 3-D point-light techniques and reported mean durations for the onsets of anger, happiness, sadness and surprise. Supporting our findings, they also found surprise (858 ms) to be the emotion with the shortest

duration, sadness (1067 ms) with the longest duration, and anger (933 ms) to lie in the intermediate range. Happiness (1100 ms), by contrast, was displayed for longer than the duration our subjects perceived as realistic (888 ms). Interestingly, Pollick et al. (2003) reported a higher variability of durations between actors than between emotions. Since in our stimulus set all sequences were morphed and therefore had no temporal variability between posers, we could not contribute data to that issue. With this high variability in mind, one should be cautious when providing time data for emotion expressions across different actors. For instance, actors' gender was a significant main effect in our study. The detailed interplay between emotion expression and the displaying person remains an open issue for further research.

The gender difference found in this study was unexpected. Since the study was not designed to clarify this issue, interpretations of this effect are speculative, and further studies should provide more detailed information. However, as the differences were significant, we decided to provide data for optimal time frames separated by gender.

Limitations of the Study

Since all participants were of Caucasian origin, the issue of possible cultural differences remains open. Sato and Yoshikawa (2004) tested Asian subjects and their results were similar to ours, but this question can only be investigated further in cross-cultural studies. Some methodological concerns remain. Although we heuristically applied the S-curve warping procedure, morphed sequences do not fully reflect real-life conditions. We are currently extracting the optical flow of the most important facial areas by analyzing facial emotions from videos. Research has shown, for example, that different facial movements in the expression of happiness have specific temporal characteristics (Schmidt, Cohn, & Tian, 2003).
Our morphing algorithms will include this information to create sequences that develop in a non-linear manner in the future. The realism of these morphs will then need to be

tested again. Additionally, longer sequences theoretically contain a larger amount of information because they have more frames. The confound between sequence duration and amount of information cannot be ruled out with our data. Another potential limitation (in the offset condition) concerns the limited range of durations that subjects could choose for the film clips. Almost all displays of emotion (except surprise) were often perceived as realistic at 3040 ms, the longest available duration. Therefore it is not certain that the chosen range ( ms) of durations was optimal for this experiment. Future research should test longer durations with a larger sample of participants in order to confirm that the offset characteristics reported here can be recommended as optimal durations.

The last, and probably most profound, criticism concerns the rationale behind our methodological approach. We implicitly assume that the durations perceived as most realistic by our subjects do in fact mirror real-life dynamic facial expressions of emotion. From an epistemological standpoint, however, there is no necessary link between the perception and the production of facial expressions. This is further complicated by the fact that our stimuli were artificially morphed sequences based on real human faces. We do not know whether the morphed (and thus intrinsically unnatural) stimuli had any effect on the judgements of our subjects. The so-called uncanny valley effect (Mori, 1970) could, for instance, have influenced subjects' ratings of realism: artificial agents (robots, facial avatars, etc.) are perceived as increasingly realistic the more natural they look, but paradoxically, when they come very close to a natural appearance they are perceived as uncanny and unrealistic. Since our subjects only made relative judgements, this problem cannot be resolved with the present design.

References

Ambadar, Z., Schooler, J. W., & Cohn, J. F. (2005). Deciphering the enigmatic face: The importance of facial dynamics in interpreting subtle facial expressions. Psychological Science, 16(5),

Bassili, J. N. (1978). Facial motion in the perception of faces and of emotional expression. Journal of Experimental Psychology: Human Perception and Performance, 4(3),

Biehl, M., Matsumoto, D., Ekman, P., Hearn, V., Heider, K., Kudoh, T., & Ton, V. (1997). Matsumoto and Ekman's Japanese and Caucasian Facial Expressions of Emotion (JACFEE): Reliability data and cross-national differences. Journal of Nonverbal Behavior, 21,

Blair, R. J., Colledge, E., Murray, L., & Mitchell, D. G. (2001). A selective impairment in the processing of sad and fearful expressions in children with psychopathic tendencies. Journal of Abnormal Child Psychology, 29(6),

Calder, A. J., Keane, J., Manly, T., Sprengelmeyer, R., Scott, S., Nimmo-Smith, I., & Young, A. W. (2003). Facial expression recognition across the adult life span. Neuropsychologia, 41(2),

Carroll, J. M., & Russell, J. A. (1997). Facial expressions in Hollywood's portrayal of emotion. Journal of Personality & Social Psychology, 72(1),

Ekman, P., & Friesen, W. V. (1978). Facial Action Coding System. Palo Alto: Consulting Psychologists Press.

Ekman, P., & Friesen, W. V. (1982). Felt, false, and miserable smiles. Journal of Nonverbal Behavior, 6(4),

Hall, J. A., & Matsumoto, D. (2004). Gender differences in judgments of multiple emotions from facial expressions. Emotion, 4(2),

Harwood, N. K., Hall, L. J., & Shinkfield, A. J. (1999). Recognition of facial emotional expressions from moving and static displays by individuals with mental retardation. American Journal of Mental Retardation, 04(3),

Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2002). Human neural systems for face recognition and social communication. Biological Psychiatry, 51(1),

Kamachi, M., Bruce, V., Mukaida, S., Gyoba, J., Yoshikawa, S., & Akamatsu, S. (2001). Dynamic properties influence the perception of facial expressions. Perception, 30(7),

Kappas, A. (2003). What facial activity can and cannot tell us about emotions. In M. Katsikitis (Ed.), The human face: Measurement and meaning (pp. ). Dordrecht: Kluwer Academic Publishers.

Kessler, H., Hoffmann, H., Bayerl, P., Neumann, H., Basic, A., Deighton, R. M., & Traue, H. C. (2005). Die Messung von Emotionserkennung mittels Computer-Morphing: Neue Methoden für Forschung und Klinik [Measuring emotion recognition with computer morphing: New methods for research and clinical practice]. Nervenheilkunde, 24,

Kilts, C. D., Egan, G., Gideon, D. A., Ely, T. D., & Hoffman, J. M. (2003). Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage, 18(1),

LaBar, K. S., Crupain, M. J., Voyvodic, J. T., & McCarthy, G. (2003). Dynamic perception of facial affect and identity in the human brain. Cerebral Cortex, 13(10),

Matsumoto, D., & Ekman, P. (1988). Japanese and Caucasian Facial Expressions of Emotion (JACFEE) and Neutral Faces (JACNeuF) [Slides]. San Francisco: Dr. Paul Ekman, Department of Psychiatry, University of California, San Francisco, 401 Parnassus, San Francisco, CA.

Mori, M. (1970). The uncanny valley. Energy, 7(4),

Pollick, F. E., Hill, H., Calder, A., & Paterson, H. (2003). Recognising facial expression from spatially and temporally modified movements. Perception, 32(7),

Sato, W., Kochiyama, T., Yoshikawa, S., Naito, E., & Matsumura, M. (2004). Enhanced neural activity in response to dynamic facial expressions of emotion: An fMRI study. Cognitive Brain Research, 20(1),

Sato, W., & Yoshikawa, S. (2004). The dynamic aspects of emotional facial expressions. Cognition and Emotion, 18(5),

Schmidt, K. L., Cohn, J. F., & Tian, Y. (2003). Signal characteristics of spontaneous facial expressions: Automatic movement in solitary and social smiles. Biological Psychology, 65(1),

Wehrle, T., Kaiser, S., Schmidt, S., & Scherer, K. R. (2000). Studying the dynamics of emotional expression using synthesized facial muscle movements. Journal of Personality and Social Psychology, 78(1),

Weyers, P., Muhlberger, A., Hefele, C., & Pauli, P. (2006). Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology, 43(5),

Table 1

Descriptive statistics for the durations perceived as being realistic in the onset and offset condition

Columns per condition: n, M (SD) in ms, Mdn in ms, Model Estimates. The first two parenthesized values in each row belong to the Onset Condition, the last two to the Offset Condition.

Male participants
Surprise    (419) (14)    (414) (36)
Fear        (356) (21)    (566) (53)
Happiness   (419) (9)     (597) (38)
Disgust     (351) (24)    (567) (18)
Anger       (448) (29)    (606) (35)
Sadness     (456) (17)    (676) (52)

Female participants
Surprise    (377) (15)    (725) (47)
Fear        (431) (29)    (683) (73)
Happiness   (353) (9)     (728) (50)
Disgust     (389) (32)    (723) (23)
Anger       (419) (32)    (670) (42)
Sadness     (419) (22)    (723) (58)

Note. Mean values can be used as optimized time frames for the presentation of emotional sequences (all values in ms). Model estimates are calculated values for each duration (details in the Method section). Standard deviations in parentheses.

Figure 1

Box plots of the selected durations for each emotion for male and female participants

Note. The x-axis displays the different emotion categories; the y-axis shows the durations between 240 and 3040 ms. The left graph shows the results for emotional onsets, the right graph those for emotional offsets. Bars represent minimum and maximum values.


More information

General and specific abilities to recognise negative emotions, especially disgust, as portrayed in the face and the body

General and specific abilities to recognise negative emotions, especially disgust, as portrayed in the face and the body COGNITION AND EMOTION 2005, 19 3), 397±412 General and specific abilities to recognise negative emotions, especially disgust, as portrayed in the face and the body Paul Rozin, Cory Taylor, Lauren Ross,

More information

Running head: FACIAL EXPRESSION AND SKIN COLOR ON APPROACHABILITY 1. Influence of facial expression and skin color on approachability judgment

Running head: FACIAL EXPRESSION AND SKIN COLOR ON APPROACHABILITY 1. Influence of facial expression and skin color on approachability judgment Running head: FACIAL EXPRESSION AND SKIN COLOR ON APPROACHABILITY 1 Influence of facial expression and skin color on approachability judgment Federico Leguizamo Barroso California State University Northridge

More information

Comparison of Multisensory Display Rules. in Expressing Complex Emotions between Cultures

Comparison of Multisensory Display Rules. in Expressing Complex Emotions between Cultures ISCA Archive http://www.isca-speech.org/archive FAAVSP - The 1 st Joint Conference on Facial Analysis, Animation, and Auditory-Visual Speech Processing Vienna, Austria, September 11-13, 2015 Comparison

More information

MPEG-4 Facial Expression Synthesis based on Appraisal Theory

MPEG-4 Facial Expression Synthesis based on Appraisal Theory MPEG-4 Facial Expression Synthesis based on Appraisal Theory L. Malatesta, A. Raouzaiou, K. Karpouzis and S. Kollias Image, Video and Multimedia Systems Laboratory, National Technical University of Athens,

More information

Research Article PSYCHOLOGICAL SCIENCE

Research Article PSYCHOLOGICAL SCIENCE PSYCHOLOGICAL SCIENCE Research Article Haptic Recognition of Static and Dynamic Expressions of Emotion in the Live Face S.J. Lederman, 1 R.L. Klatzky, 2 A. Abramowicz, 1 K. Salsman, 1 R. Kitada, 1 and

More information

Evaluating the emotional content of human motions on real and virtual characters

Evaluating the emotional content of human motions on real and virtual characters Evaluating the emotional content of human motions on real and virtual characters Rachel McDonnell 1 Sophie Jörg 1 Joanna McHugh 2 Fiona Newell 2 Carol O Sullivan 1 1 Graphics Vision & Visualisation Group

More information

The innate hypothesis

The innate hypothesis The innate hypothesis DARWIN (1872) proposed that the facial expression of emotion evolved as part of the actions necessary for life: Anger: Frowning (to protect eyes in anticipation of attack) Surprise:

More information

Scalar Ratings of Contempt 1 Running Head: Scalar Ratings of Contempt. Scalar Ratings of Contempt Expressions. David Matsumoto

Scalar Ratings of Contempt 1 Running Head: Scalar Ratings of Contempt. Scalar Ratings of Contempt Expressions. David Matsumoto 1 Running Head: Scalar Ratings of Contempt Scalar Ratings of Contempt Expressions David Matsumoto San Francisco State University 2 Abstract This article reports two studies examining the recognition of

More information

American-Japanese cultural differences in judgements of emotional expressions of different intensities

American-Japanese cultural differences in judgements of emotional expressions of different intensities COGNITION AND EMOTION, 2002, 16 (6), 721 747 American-Japanese cultural differences in judgements of emotional expressions of different intensities David Matsumoto and Theodora Consolacion San Francisco

More information

Culture and Emotion THE EVOLUTION OF HUMAN EMOTION. Outline

Culture and Emotion THE EVOLUTION OF HUMAN EMOTION. Outline Outline Culture and Emotion The Evolution of Human Emotion Universality in Emotion- The Basic Emotions Perspective Cultural Differences in Emotion Conclusion Chapter 8 THE EVOLUTION OF HUMAN EMOTION Emotion:

More information

The Effect of Gender and Age Differences on the Recognition of Emotions from Facial Expressions

The Effect of Gender and Age Differences on the Recognition of Emotions from Facial Expressions The Effect of Gender and Age Differences on the Recognition of Emotions from Facial Expressions Daniela Schneevogt University of Copenhagen d.schneevogt@googlemail.com Patrizia Paggio University of Copenhagen

More information

Valence and Gender Effects on Emotion Recognition Following TBI. Cassie Brown Arizona State University

Valence and Gender Effects on Emotion Recognition Following TBI. Cassie Brown Arizona State University Valence and Gender Effects on Emotion Recognition Following TBI Cassie Brown Arizona State University Knox & Douglas (2009) Social Integration and Facial Expression Recognition Participants: Severe TBI

More information

Recognition and discrimination of prototypical dynamic expressions of pain and emotions

Recognition and discrimination of prototypical dynamic expressions of pain and emotions Pain xxx (2007) xxx xxx www.elsevier.com/locate/pain Recognition and discrimination of prototypical dynamic expressions of pain and emotions Daniela Simon a,e, *, Kenneth D. Craig b, Frederic Gosselin

More information

Drive-reducing behaviors (eating, drinking) Drive (hunger, thirst) Need (food, water)

Drive-reducing behaviors (eating, drinking) Drive (hunger, thirst) Need (food, water) Instinct Theory: we are motivated by our inborn automated behaviors that generally lead to survival. But instincts only explain why we do a small fraction of our behaviors. Does this behavior adequately

More information

Comparison of Deliberate and Spontaneous Facial Movement in Smiles and Eyebrow Raises

Comparison of Deliberate and Spontaneous Facial Movement in Smiles and Eyebrow Raises J Nonverbal Behav (2009) 33:35 45 DOI 10.1007/s10919-008-0058-6 ORIGINAL PAPER Comparison of Deliberate and Spontaneous Facial Movement in Smiles and Eyebrow Raises Karen L. Schmidt Æ Sharika Bhattacharya

More information

Using simulated body language and colours to express emotions with the Nao robot

Using simulated body language and colours to express emotions with the Nao robot Using simulated body language and colours to express emotions with the Nao robot Wouter van der Waal S4120922 Bachelor Thesis Artificial Intelligence Radboud University Nijmegen Supervisor: Khiet Truong

More information

Facial expression recognition with spatiotemporal local descriptors

Facial expression recognition with spatiotemporal local descriptors Facial expression recognition with spatiotemporal local descriptors Guoying Zhao, Matti Pietikäinen Machine Vision Group, Infotech Oulu and Department of Electrical and Information Engineering, P. O. Box

More information

Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters. Qiong Wu St. Bayside, NY Stuyvesant High School

Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters. Qiong Wu St. Bayside, NY Stuyvesant High School Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters Qiong Wu 58-11 205 St. Bayside, NY 11364 Stuyvesant High School 345 Chambers St. New York, NY 10282 Q. Wu (2001) Are faces special?

More information

Understanding Emotions. How does this man feel in each of these photos?

Understanding Emotions. How does this man feel in each of these photos? Understanding Emotions How does this man feel in each of these photos? Emotions Lecture Overview What are Emotions? Facial displays of emotion Culture-based and sex-based differences Definitions Spend

More information

Judgment of perceived exertion by static and dynamic facial expression

Judgment of perceived exertion by static and dynamic facial expression Judgment of perceived exertion by static and dynamic facial expression Ding Hau Huang a, Wen Ko Chiou a, Bi Hui Chen b a Graduate Institute of Business and Management, College of Management, Chung Gung

More information

Journal of Experimental Psychology: General

Journal of Experimental Psychology: General Journal of Experimental Psychology: General Internal Representations Reveal Cultural Diversity in Expectations of Facial Expressions of Emotion Rachael E. Jack, Roberto Caldara, and Philippe G. Schyns

More information

Looking at You or Looking Elsewhere: The Influence of Head Orientation on the Signal Value of Emotional Facial Expressions

Looking at You or Looking Elsewhere: The Influence of Head Orientation on the Signal Value of Emotional Facial Expressions Motiv Emot (2007) 31:137 144 DOI 10.1007/s11031-007-9057-x ORIGINAL PAPER Looking at You or Looking Elsewhere: The Influence of Head Orientation on the Signal Value of Emotional Facial Expressions Ursula

More information

Running head: INDIVIDUAL DIFFERENCES FOR EMOTION EXPRESSIONS 1. Individual Differences Are More Important

Running head: INDIVIDUAL DIFFERENCES FOR EMOTION EXPRESSIONS 1. Individual Differences Are More Important Running head: INDIVIDUAL DIFFERENCES FOR EMOTION EXPRESSIONS 1 1 2 Individual Differences Are More Important Than The Emotional Category For The Perception Of Emotional Expressions 3 4 5 6 7 Elena Moltchanova,

More information

Emotions and Deception Detection Skills

Emotions and Deception Detection Skills Emotions and Deception Detection Skills Smarter Procurement using science-based people skills Based on the Science of Dr Paul Ekman Alan Hudson, Managing Director, EI Asia Pacific Who Are We? EI Asia Pacific

More information

This paper is in press (Psychological Science) Mona Lisa s Smile Perception or Deception?

This paper is in press (Psychological Science) Mona Lisa s Smile Perception or Deception? This paper is in press (Psychological Science) Mona Lisa s Smile Perception or Deception? Isabel Bohrn 1, Claus-Christian Carbon 2, & Florian Hutzler 3,* 1 Department of Experimental and Neurocognitive

More information

Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial Expressions Across Cultures

Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial Expressions Across Cultures 736270JCCXXX10.1177/0022022117736270Journal of Cross-Cultural PsychologyFang et al. research-article2017 Article Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial

More information

Who Needs Cheeks? Eyes and Mouths are Enough for Emotion Identification. and. Evidence for a Face Superiority Effect. Nila K Leigh

Who Needs Cheeks? Eyes and Mouths are Enough for Emotion Identification. and. Evidence for a Face Superiority Effect. Nila K Leigh 1 Who Needs Cheeks? Eyes and Mouths are Enough for Emotion Identification and Evidence for a Face Superiority Effect Nila K Leigh 131 Ave B (Apt. 1B) New York, NY 10009 Stuyvesant High School 345 Chambers

More information

Emotion perception from dynamic and static body expressions in point-light and full-light displays

Emotion perception from dynamic and static body expressions in point-light and full-light displays Perception, 2004, volume 33, pages 717 ^ 746 DOI:10.1068/p5096 Emotion perception from dynamic and static body expressions in point-light and full-light displays Anthony P Atkinsonô Department of Psychology,

More information

Motivation represents the reasons for people's actions, desires, and needs. Typically, this unit is described as a goal

Motivation represents the reasons for people's actions, desires, and needs. Typically, this unit is described as a goal Motivation What is motivation? Motivation represents the reasons for people's actions, desires, and needs. Reasons here implies some sort of desired end state Typically, this unit is described as a goal

More information

HAS ANGER 1000 FACES? HOW CONSISTENT ARE THE FACIAL EMG PATTERNS OF DIFFERENT METHODS TO ELICIT FACIAL ANGER EXPRESSION OF VARYING INTENSITIES?

HAS ANGER 1000 FACES? HOW CONSISTENT ARE THE FACIAL EMG PATTERNS OF DIFFERENT METHODS TO ELICIT FACIAL ANGER EXPRESSION OF VARYING INTENSITIES? JURNAL PSIKOLOGI 2002, NO. 1, 14-27 HAS ANGER 1000 FACES? HOW CONSISTENT ARE THE FACIAL EMG PATTERNS OF DIFFERENT METHODS TO ELICIT FACIAL ANGER EXPRESSION OF VARYING INTENSITIES? Claudia Rolko, Jan Eichstaedt,

More information

Durham Research Online

Durham Research Online Durham Research Online Deposited in DRO: 09 May 2014 Version of attached le: Accepted Version Peer-review status of attached le: Peer-reviewed Citation for published item: Dobs, K. and B ultho, I. and

More information

R Jagdeesh Kanan* et al. International Journal of Pharmacy & Technology

R Jagdeesh Kanan* et al. International Journal of Pharmacy & Technology ISSN: 0975-766X CODEN: IJPTFI Available Online through Research Article www.ijptonline.com FACIAL EMOTION RECOGNITION USING NEURAL NETWORK Kashyap Chiranjiv Devendra, Azad Singh Tomar, Pratigyna.N.Javali,

More information

The final publication is available り本文ファイルは に公開.

The final publication is available り本文ファイルは に公開. Title Impaired overt facial mimicry in re expressions in high-functioning aut Author(s) Yoshimura, Sayaka; Sato, Wataru; Uo Motomi Citation Journal of autism and developmental 1318-1328 Issue Date 2015-05-01

More information

A framework for the Recognition of Human Emotion using Soft Computing models

A framework for the Recognition of Human Emotion using Soft Computing models A framework for the Recognition of Human Emotion using Soft Computing models Md. Iqbal Quraishi Dept. of Information Technology Kalyani Govt Engg. College J Pal Choudhury Dept. of Information Technology

More information

Culture, Display Rules, and Emotion Judgments

Culture, Display Rules, and Emotion Judgments ΨΥΧΟΛΟΓΙΑ, 2018, 23 (1) 1-17 PSYCHOLOGY, 2018, 23 (1) 1-17 Culture, Display Rules, and Emotion Judgments David Matsumoto 1, Jungwook Choi 2, Satoko Hirayama 3 Akihiro Domae 4 & Susumu Yamaguchi 5 This

More information

CROSS-CULTURAL SIMILARITIES AND DIFFERENCES

CROSS-CULTURAL SIMILARITIES AND DIFFERENCES Chapter CROSS-CULTURAL SIMILARITIES AND DIFFERENCES IN THE PERCEPTION AND RECOGNITION OF FACIAL EXPRESSIONS Xiaoqian Yan, Andrew W. Young and Timothy J. Andrews * Department of Psychology, University of

More information

Michael L. Kimbarow, Ph.D. Wendy Quach, Ph.D. Marion D. Meyerson, Ph.D. San Jose State University San Jose, CA

Michael L. Kimbarow, Ph.D. Wendy Quach, Ph.D. Marion D. Meyerson, Ph.D. San Jose State University San Jose, CA Michael L. Kimbarow, Ph.D. Wendy Quach, Ph.D. Marion D. Meyerson, Ph.D. San Jose State University San Jose, CA Fear Anger Sadness Disgust Happiness Expressions are universal (Ekman et al, 1987; Levensen,Eikman,Heider

More information

PSYC 222 Motivation and Emotions

PSYC 222 Motivation and Emotions PSYC 222 Motivation and Emotions Session 6 The Concept of Emotion Lecturer: Dr. Annabella Osei-Tutu, Psychology Department Contact Information: aopare-henaku@ug.edu.gh College of Education School of Continuing

More information

Evaluating the emotional content of human motions on real and virtual characters

Evaluating the emotional content of human motions on real and virtual characters Evaluating the emotional content of human motions on real and virtual characters Rachel McDonnell 1 Sophie Jörg 1 Joanna McHugh 2 Fiona Newell 2 Carol O Sullivan 1 1 Graphics Vision&VisualisationGroup

More information

Spotting Liars and Deception Detection skills - people reading skills in the risk context. Alan Hudson

Spotting Liars and Deception Detection skills - people reading skills in the risk context. Alan Hudson Spotting Liars and Deception Detection skills - people reading skills in the risk context Alan Hudson < AH Business Psychology 2016> This presentation has been prepared for the Actuaries Institute 2016

More information

Frank Tong. Department of Psychology Green Hall Princeton University Princeton, NJ 08544

Frank Tong. Department of Psychology Green Hall Princeton University Princeton, NJ 08544 Frank Tong Department of Psychology Green Hall Princeton University Princeton, NJ 08544 Office: Room 3-N-2B Telephone: 609-258-2652 Fax: 609-258-1113 Email: ftong@princeton.edu Graduate School Applicants

More information

Department of Psychology, University of Virginia, 102 Gilmer Hall, P.O. Box. Department of Neurology, University of Lübeck, Lübeck, Germany

Department of Psychology, University of Virginia, 102 Gilmer Hall, P.O. Box. Department of Neurology, University of Lübeck, Lübeck, Germany When in infancy does the fear bias develop? Tobias Grossmann 1 & Sarah Jessen 2 1 Department of Psychology, University of Virginia, 102 Gilmer Hall, P.O. Box 400400, Charlottesville, VA 22904, U.S.A. 2

More information

Human Classification for Web Videos

Human Classification for Web Videos Human Classification for Web Videos Aaron Manson Final Report for COMP4650 Advanced Computing Research Project Bachelor of Advanced Computing (Honours) Research School of Computer Science, College of Engineering

More information

THE TIMING OF FACIAL MOTION IN POSED AND SPONTANEOUS SMILES

THE TIMING OF FACIAL MOTION IN POSED AND SPONTANEOUS SMILES THE TIMING OF FACIAL MOTION IN POSED AND SPONTANEOUS SMILES J.F. COHN* and K.L.SCHMIDT University of Pittsburgh Department of Psychology 4327 Sennott Square, 210 South Bouquet Street Pittsburgh, PA 15260,

More information

Perceiving emotion in crowds: the role of dynamic body postures on the perception of emotion in crowded scenes

Perceiving emotion in crowds: the role of dynamic body postures on the perception of emotion in crowded scenes Exp Brain Res (2010) 204:361 372 DOI 10.1007/s00221-009-2037-5 RESEARCH ARTICLE Perceiving emotion in crowds: the role of dynamic body postures on the perception of emotion in crowded scenes Joanna Edel

More information

UvA-DARE (Digital Academic Repository)

UvA-DARE (Digital Academic Repository) UvA-DARE (Digital Academic Repository) Seeing mixed emotions: The specificity of emotion perception from static and dynamic facial expressions across cultures Fang, X.; Sauter, D.A.; van Kleef, G.A. Published

More information

Analysis of Dynamic Characteristics of Spontaneous Facial Expressions

Analysis of Dynamic Characteristics of Spontaneous Facial Expressions nalysis of Dynamic Characteristics of Spontaneous Facial Expressions Masashi Komori (komori@oecu.jp) Yoshitaro Onishi (mi3a@oecu.jp) Division of Information and Computer Sciences, Osaka Electro-Communication

More information

Affective Game Engines: Motivation & Requirements

Affective Game Engines: Motivation & Requirements Affective Game Engines: Motivation & Requirements Eva Hudlicka Psychometrix Associates Blacksburg, VA hudlicka@ieee.org psychometrixassociates.com DigiPen Institute of Technology February 20, 2009 1 Outline

More information

Facial Behavior as a Soft Biometric

Facial Behavior as a Soft Biometric Facial Behavior as a Soft Biometric Abhay L. Kashyap University of Maryland, Baltimore County 1000 Hilltop Circle, Baltimore, MD 21250 abhay1@umbc.edu Sergey Tulyakov, Venu Govindaraju University at Buffalo

More information

Trait Perceptions of Dynamic and Static Faces as a Function of Facial. Maturity and Facial Expression

Trait Perceptions of Dynamic and Static Faces as a Function of Facial. Maturity and Facial Expression Trait Perceptions of Dynamic and Static Faces as a Function of Facial Maturity and Facial Expression Master s Thesis Presented to The Faculty of the Graduate School of Arts and Sciences Brandeis University

More information

Viewpoint dependent recognition of familiar faces

Viewpoint dependent recognition of familiar faces Viewpoint dependent recognition of familiar faces N. F. Troje* and D. Kersten *Max-Planck Institut für biologische Kybernetik, Spemannstr. 38, 72076 Tübingen, Germany Department of Psychology, University

More information

The effects of subthreshold synchrony on the perception of simultaneity. Ludwig-Maximilians-Universität Leopoldstr 13 D München/Munich, Germany

The effects of subthreshold synchrony on the perception of simultaneity. Ludwig-Maximilians-Universität Leopoldstr 13 D München/Munich, Germany The effects of subthreshold synchrony on the perception of simultaneity 1,2 Mark A. Elliott, 2 Zhuanghua Shi & 2,3 Fatma Sürer 1 Department of Psychology National University of Ireland Galway, Ireland.

More information

Temporal Context and the Recognition of Emotion from Facial Expression

Temporal Context and the Recognition of Emotion from Facial Expression Temporal Context and the Recognition of Emotion from Facial Expression Rana El Kaliouby 1, Peter Robinson 1, Simeon Keates 2 1 Computer Laboratory University of Cambridge Cambridge CB3 0FD, U.K. {rana.el-kaliouby,

More information

Recognition and Understanding of Emotions in Persons with Mild to Moderate Mental Retardation

Recognition and Understanding of Emotions in Persons with Mild to Moderate Mental Retardation J. Psychosoc. Rehabil. Ment. Health (2015) 2(1):59 66 DOI 10.1007/s40737-014-0019-9 ORIGINAL ARTICLE Recognition and Understanding of Emotions in Persons with Mild to Moderate Mental Retardation Lini Joseph

More information

CULTURAL SIMILARITY S CONSEQUENCES A Distance Perspective on Cross-Cultural Differences in Emotion Recognition

CULTURAL SIMILARITY S CONSEQUENCES A Distance Perspective on Cross-Cultural Differences in Emotion Recognition 10.1177/0022022102239157 JOURNAL Elfenbein, Ambady OF CROSS-CULTURAL / DISTANCE PERSPECTIVE PSYCHOLOGY CULTURAL SIMILARITY S CONSEQUENCES A Distance Perspective on Cross-Cultural Differences in Emotion

More information

Emotion Perception in Emotionless Face Images Suggests a Norm-based Representation

Emotion Perception in Emotionless Face Images Suggests a Norm-based Representation Emotion Perception in Emotionless Face Images Suggests a Norm-based Representation Donald Neth and Aleix M. Martinez Dept. Electrical and Computer Engineering Dept. Biomedical Engineering The Ohio State

More information

Does scene context always facilitate retrieval of visual object representations?

Does scene context always facilitate retrieval of visual object representations? Psychon Bull Rev (2011) 18:309 315 DOI 10.3758/s13423-010-0045-x Does scene context always facilitate retrieval of visual object representations? Ryoichi Nakashima & Kazuhiko Yokosawa Published online:

More information

To What Extent Can the Recognition of Unfamiliar Faces be Accounted for by the Direct Output of Simple Cells?

To What Extent Can the Recognition of Unfamiliar Faces be Accounted for by the Direct Output of Simple Cells? To What Extent Can the Recognition of Unfamiliar Faces be Accounted for by the Direct Output of Simple Cells? Peter Kalocsai, Irving Biederman, and Eric E. Cooper University of Southern California Hedco

More information

ERI User s Guide. 2. Obtaining the ERI for research purposes

ERI User s Guide. 2. Obtaining the ERI for research purposes ERI User s Guide 1. Goal and features of the ERI The Emotion Recognition Index (Scherer & Scherer, 2011) is a rapid screening instrument measuring emotion recognition ability. The ERI consists of a facial

More information

Statistical and Neural Methods for Vision-based Analysis of Facial Expressions and Gender

Statistical and Neural Methods for Vision-based Analysis of Facial Expressions and Gender Proc. IEEE Int. Conf. on Systems, Man and Cybernetics (SMC 2004), Den Haag, pp. 2203-2208, IEEE omnipress 2004 Statistical and Neural Methods for Vision-based Analysis of Facial Expressions and Gender

More information

Assessing Naturalness and Emotional Intensity: A Perceptual Study of Animated Facial Motion

Assessing Naturalness and Emotional Intensity: A Perceptual Study of Animated Facial Motion Assessing Naturalness and Emotional Intensity: A Perceptual Study of Animated Facial Motion Jennifer Hyde1, Elizabeth J. Carter, Sara Kiesler, Jessica K. Hodgins1, Computer Science Department1, Robotics

More information

CHANG YUN, ZHIGANG DENG, and MERRILL HISCOCK University of Houston

CHANG YUN, ZHIGANG DENG, and MERRILL HISCOCK University of Houston Can Local Avatars Satisfy A Global Audience? A Case Study of High-Fidelity 3D Facial Avatar Animation in Subject Identification and Emotion Perception by US and International Groups CHANG YUN, ZHIGANG

More information

Introduction to affect computing and its applications

Introduction to affect computing and its applications Introduction to affect computing and its applications Overview What is emotion? What is affective computing + examples? Why is affective computing useful? How do we do affect computing? Some interesting

More information

Chang Yun, Zhigang Deng, and Merrill Hiscock 1. Computer Science Department University of Houston Houston, TX, 77204, USA

Chang Yun, Zhigang Deng, and Merrill Hiscock 1. Computer Science Department University of Houston Houston, TX, 77204, USA Can Local Avatars Satisfy Global Audience? A Case Study of High-Fidelity 3D Facial Avatar Animation in Subject Identification and Emotion Perception by US and International Groups. * Chang Yun, Zhigang

More information

CPSC81 Final Paper: Facial Expression Recognition Using CNNs

CPSC81 Final Paper: Facial Expression Recognition Using CNNs CPSC81 Final Paper: Facial Expression Recognition Using CNNs Luis Ceballos Swarthmore College, 500 College Ave., Swarthmore, PA 19081 USA Sarah Wallace Swarthmore College, 500 College Ave., Swarthmore,

More information

Human and computer recognition of facial expressions of emotion. University of California, San Diego

Human and computer recognition of facial expressions of emotion. University of California, San Diego Human and computer recognition of facial expressions of emotion J.M. Susskind 1, G. Littlewort 2, M.S. Bartlett 2, J. Movellan 2 and A.K. Anderson 1 1 Department of Psychology, University of Toronto 2

More information

PSYC 221 Introduction to General Psychology

PSYC 221 Introduction to General Psychology PSYC 221 Introduction to General Psychology Session 1 Definitions, perspectives and research methods in psychology Lecturer: Dr. Joana Salifu Yendork, Psychology Department Contact Information: jyendork@ug.edu.gh

More information

HARRISON ASSESSMENTS DEBRIEF GUIDE 1. OVERVIEW OF HARRISON ASSESSMENT

HARRISON ASSESSMENTS DEBRIEF GUIDE 1. OVERVIEW OF HARRISON ASSESSMENT HARRISON ASSESSMENTS HARRISON ASSESSMENTS DEBRIEF GUIDE 1. OVERVIEW OF HARRISON ASSESSMENT Have you put aside an hour and do you have a hard copy of your report? Get a quick take on their initial reactions

More information

Outline. Emotion. Emotions According to Darwin. Emotions: Information Processing 10/8/2012

Outline. Emotion. Emotions According to Darwin. Emotions: Information Processing 10/8/2012 Outline Emotion What are emotions? Why do we have emotions? How do we express emotions? Cultural regulation of emotion Eliciting events Cultural display rules Social Emotions Behavioral component Characteristic

More information

Emotion October 16th, 2009 : Lecture 11

Emotion October 16th, 2009 : Lecture 11 Lecture Overview October 16th, 2009 : Lecture 11 Finishing up Groups s Jury Decision Making Jury Decision Making Group Polarization and Group Think Group Decision Making and Juries Value of Unanimity 12

More information

A Possibility for Expressing Multi-Emotion on Robot Faces

A Possibility for Expressing Multi-Emotion on Robot Faces The 5 th Conference of TRS Conference 26-27 May 2011, Bangkok, Thailand A Possibility for Expressing Multi-Emotion on Robot Faces Trin Veerasiri 1*, Djitt Laowattana 2 Institute of Field robotics, King

More information

Emotional Body Language Displayed by Artificial Agents

Emotional Body Language Displayed by Artificial Agents Emotional Body Language Displayed by Artificial Agents ARYEL BECK, STRI & School of Computer Science, University of Hertfordshire, UK BRETT STEVENS, School of Creative Technologies, University of Portsmouth,

More information

Person Perception. Forming Impressions of Others. Mar 5, 2012, Banu Cingöz Ulu
