Face Emotions and Short Surveys during Automotive Tasks

Lee Quintanar, Pete Trujillo, and Jeremy Watson
March 2016
J.D. Power | A Global Marketing Information Company | jdpower.com

Introduction

Facial expressions are a daily occurrence in the communication of human emotions. In 1872, Charles Darwin described facial expressions as signals of specific emotions (Darwin, 1872/1998), a claim later tested by Paul Ekman and Wallace Friesen (Ekman & Friesen, 1987). Their team conducted a cross-cultural study demonstrating that interpretations of facial expressions appear to be universal across cultures. While minor cultural variations appeared in ratings of emotional intensity, agreement on the emotions conveyed was high across cultures. Moreover, the Facial Action Coding System (FACS) was designed to classify human facial movements by their appearance on the face, based on a system originally developed by a Swedish anatomist (Hjortsjö, 1969). Ekman, Friesen, and Hager later published a significant update to FACS (Ekman et al., 2002), and variants of the system have emerged in modern technologies for computer-based detection of facial expressions.

The purpose of our research is to better understand the relationship between human facial expressions (face emotions) and consumer attitudes toward products and services. J.D. Power's Voice of the Customer research measures customer satisfaction with products and services based on consumer surveys. Survey research asks respondents what they think or how they feel about products/services and then extrapolates that data to the actual attitudes that drive consumer behavior. Biometrics is an alternative that infers attitudes from observations of bodily behaviors corresponding to human emotions. Specifically, this research evaluated how human facial expressions compare with survey responses as measures of attitudes and behaviors. Early research by Cacioppo, Quintanar, Petty, and Snyder (1981, 1979) evaluated the relationship among facial expressions, emotions, and attitudes, and an expansion of this assessment using modern computer technologies seemed promising.
Biometrics takes physical information from a human body and makes it quantifiable, and biometrics technologies are becoming increasingly refined. A variety of bodily measurements can be evaluated, including facial expressions, heart rate, eye tracking, pupil dilation, galvanic skin response (arousal), and voice modulations. How effective is biometrics in assessing emotions and attitudes? The focus of this study is on facial expressions as an assessment of respondents' attitudes.

The automation of methods to recognize the emotional content of facial expressions has been evolving in parallel with psychological research. Research by Bartlett et al. (2003, 2006) began with prototypes for automatic recognition of spontaneous facial actions. Littlewort et al. (2006) explored the dynamics of facial expression extracted automatically from video. Pantic and Bartlett (2007) went further with machine analysis of facial expressions. Wu, Bartlett, and Movellan (2010) applied Gabor motion energy filters to the recognition of facial expressions. Computerized methods improved when Littlewort, Whitehill, Wu, Fasel, Frank, Movellan, and Bartlett (2011) developed the Computer Expression Recognition Toolbox (CERT), an end-to-end system for fully automated facial expression recognition that operates in real time. To assess the affective nature of facial expressions, points on the face are scanned by computer analysis to recognize emotions. The results of this J.D. Power study are expected to help researchers better understand how facial expressions/face emotions can accurately assess consumer attitudes and, in turn, predict their behaviors.

Research Design

Methodology: The research paradigm consisted of a website evaluation scenario in which participants evaluated a site's usability. Three 2015 automotive websites (Honda, Kia, and Hyundai) were presented to participants in randomized order. Asian automakers were chosen to reduce preference bias that might emerge with European or U.S.-based automakers. Participants were asked to use each website's Build a Car price tool and afterward complete a short survey rating Appearance, Navigation, Speed, and Overall satisfaction. A webcam captured video of participants' faces as they used the Build a Car tool. Although eye-tracking information was also collected, it was not used in this analysis.

Figure 1: Build a Car Websites Were Utilized in Research Procedure. Note: Participants completed short surveys after using Build a Car websites.

Facial encoding: The technology platform used for emotion recognition from facial expressions was the imotions Biometric Research Platform (Release 5, 2015). Detailed specifications can be found on the imotions product website (2016a), which includes additional resources, e.g., a facial emotions publications list (2016b) and a guide for facial emotions analysis (2016c).
The underlying technology involved three steps: (1) face detection; (2) feature detection; and (3) classification. The position, orientation, and information encompassing key facial features were input into classification algorithms that translated the features into emotional states and affective metrics. These technologies rested on methods of image processing, edge detection, Gabor filters, and statistical comparisons with normative databases provided by facial expression software engines. This can be imagined as a kind of invisible virtual mesh covering the face of a respondent: whenever the face moves or changes expression, the face model adapts, follows, and classifies emotions.

The face video collected during each Build a Car website session was analyzed using the imotions biometrics platform, which was also used to set up the PC-based experimental procedure: the sequence and timing of stimulus events, the baseline gray screen, the website presentation, and the online survey questions and ratings. Different baseline screens were also tested but yielded no differences from a gray screen for establishing an emotions baseline. The imotions platform digitizes facial expressions to measure emotions by using encoding algorithms based on the FACET scoring methods. Facial encoding scans the various points on a face and then interprets the patterns into measurable emotion events. Face emotion index scores were collected simultaneously for nine indices: Joy, Anger, Surprise, Fear, Sadness, Disgust, Contempt, Confusion, and Frustration. Overall sentiment scores were also collected for generalized positive and negative facial expressions.

Sample: The sample consisted of 30 college-educated participants, 50% male and 50% female. Ages ranged from 21 to 67 (mean 37), with a racial mix of 79% white and 7% each black, Asian, and Hispanic. The average length of each session was 35 minutes.

Figure 2: Facial Encoding for Emotions. Note: Webcam face videos were collected and computer analyzed for emotions; an eye tracker was also present, but its data were not used in this analysis.
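The three-step pipeline (detect the face, extract features, classify emotions) can be sketched in skeleton form. Everything below, including the function names, the placeholder center-crop "detector," and the untrained linear "classifier," is hypothetical scaffolding for illustration, not the imotions/FACET implementation:

```python
import numpy as np

# The nine emotion indices named in the study.
EMOTIONS = ["Joy", "Anger", "Surprise", "Fear", "Sadness",
            "Disgust", "Contempt", "Confusion", "Frustration"]

def detect_face(frame):
    """Step 1: face detection. Return a bounding box (x, y, w, h).
    A real system uses a trained detector; this placeholder just
    returns a centered crop."""
    h, w = frame.shape
    return (w // 4, h // 4, w // 2, h // 2)

def extract_features(frame, box):
    """Step 2: feature detection. Real systems apply edge detection
    and Gabor filters; this placeholder summarizes the face patch
    with two crude statistics."""
    x, y, w, h = box
    patch = frame[y:y + h, x:x + w].astype(float)
    return np.array([patch.mean() / 255.0, patch.std() / 255.0])

def classify(features, weights):
    """Step 3: classification. Map the feature vector to per-emotion
    evidence scores (here, an untrained linear model)."""
    return dict(zip(EMOTIONS, weights @ features))

# Push one synthetic grayscale frame through the pipeline.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(120, 160))
weights = rng.normal(size=(len(EMOTIONS), 2))   # stand-in model weights
scores = classify(extract_features(frame, detect_face(frame)), weights)
```

In a real deployment, step 3 would compare the features against the normative databases mentioned above rather than apply random weights.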

Analysis

Face emotions from video recordings: On average, face videos were recorded for 35 minutes for each of the 30 participants at a frame rate of 30 frames per second. Each frame was computer analyzed for the 11 emotions listed above, resulting in a data set of nearly 2 million observations (records): 35 min x 60 sec/min x 30 frames/sec x 30 participants ≈ 1.9 million. Consequently, proper data aggregation and transformation methods were required for effective analysis; this big data needed to be reduced to a measurable Emotion Index.

Figure 3: Data Example. Note: Biometric measures require big data solutions for analysis.

J.D. Power Emotion Index: Emotion Index scores were calculated by deriving the percentage of emotion change between the highest and lowest ranges for each participant across all conditions. For each face emotion, a minimum and maximum were determined for each person across all conditions (within-subjects). An Emotion Index score was then derived by (a) converting scores to positive values; (b) calculating each score's percentage of the minimum-to-maximum range; and (c) converting to a 1,000-point scale. This indexing approach created a percent Emotion Index (from zero to 1,000) based on the distance between within-subject minimum and maximum end points. Among the alternatives evaluated, such as difference scores from baseline, threshold binary scores, square roots, and logarithm scores, this was found to be the most emotion-sensitive data transform: raw data scores were low and variable across participants, while the other transforms distorted the scale and/or data distribution.
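As a concrete sketch of steps (a) through (c), the following minimal function (a hypothetical illustration with made-up scores, not J.D. Power's production code) rescales one participant's raw scores for a single emotion to the 0-1,000 index:

```python
import numpy as np

def emotion_index(raw_scores):
    """Within-subject min-max index: (a) shift scores to non-negative
    values, (b) express each as a fraction of the min-to-max range,
    (c) scale to 1,000 points."""
    x = np.asarray(raw_scores, dtype=float)
    x = x - x.min()                    # (a) shift so the minimum is 0
    if x.max() == 0:                   # flat signal: no emotion change
        return np.zeros_like(x)
    return (x / x.max()) * 1000.0      # (b) fraction of range, (c) x 1,000

# Example: frame-level raw evidence scores for one emotion, one participant
idx = emotion_index([-0.8, -0.2, 0.1, 1.2])   # approx. [0, 300, 450, 1000]
```

Because the range is computed within-subject, a stoic participant's small expressions and an animated participant's large ones both map onto the same 0-1,000 scale.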

Results

Core Emotional Dimensions: A factor analysis of the 11 face emotions was performed using principal components analysis with orthogonal varimax rotation (see Kim & Mueller, 1978a, 1978b), followed by an oblique Procrustes rotation (SAS ROTATE=PROMAX) with the varimax output as the target matrix. An oblique rotation was used because the simultaneously measured emotions were expected to be interrelated. The number of factors retained was determined by the solution that best satisfied the following criteria: the percentage of variance explained by each factor; the outcome of a scree test; the size of the eigenvalue differences between factors; the number of high loadings on each factor; the persistence of factors over each of the possible rotations; and the meaningfulness of the factor structures over different rotations. As shown in Figure 4, a rotation of three factors, accounting for 93.1% of the variance, was selected as the best estimate of the primary emotional judgmental dimensions utilized during the automotive task evaluation. After examining the pattern of factor loadings, these factors were labeled Enjoyment, Dislike, and Perplexed. Factor scores were also calculated for use in later analyses. This method of collapsing the data matrix by stringing out across conditions to assess dimensionality and derive factor scores for group comparisons has precedent in prior work (Quintanar, 1982; Osgood, May, & Miron, 1975).

Figure 4: Factor Analysis of Core Emotional Dimensions. Three factors emerged as the primary emotional dimensions utilized during this automotive task evaluation: Enjoyment, Dislike, and Perplexed. Note: Factor loadings are multiplied by 100 and rounded to integers; those > 40 are flagged with asterisks. Scree plots and eigenvalues indicate three primary factors.
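The extraction-plus-rotation sequence can be sketched in a few lines of plain numpy. The synthetic data and this SVD-based varimax are illustrative assumptions (the study used SAS, and followed varimax with an oblique Promax rotation, omitted here):

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a factor-loading matrix (SVD form)."""
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)
    crit_old = 0.0
    for _ in range(max_iter):
        rotated = L @ R
        # SVD of the gradient of the varimax criterion
        u, s, vt = np.linalg.svd(
            L.T @ (rotated ** 3
                   - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p))
        R = u @ vt
        crit = s.sum()
        if crit - crit_old < tol:
            break
        crit_old = crit
    return L @ R

# Synthetic stand-in for the observations x 11-emotion index matrix
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 11))
corr = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]              # sort factors by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
k = 3                                          # factors retained (scree test)
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
rotated = varimax(loadings)                    # pattern used for labeling
```

Because varimax is an orthogonal rotation, each variable's communality (row sum of squared loadings) is unchanged; only the distribution of loadings across factors is simplified for interpretation.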

Face Emotions: Analysis of variance (repeated-measures ANOVA) was used to compare automaker websites on face emotions. The strongest emotions appeared during the first 2 minutes of the Build a Car website sessions. Kia's car build had higher levels of Confusion and Disgust and lower levels of Joy. An analysis of factor scores also showed that Kia had higher Perplexed and lower Enjoyment scores.

Figure 5: Face Emotions during Initial Impressions (Confusion, Disgust, and Joy for Honda, Hyundai, and Kia). Note: Kia's car build had higher levels of Confusion and Disgust and lower levels of Joy.

The level of Confusion in Kia's Build a Car evaluation persisted throughout the remainder of the Web session after the initial impressions. Surprise also emerged at higher levels during Kia's car build session.

Figure 6: Face Emotions during Latter Session (Confusion and Surprise for Honda, Hyundai, and Kia). Note: Confusion in Kia's car build persisted throughout the remainder of the Web session.
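A one-way repeated-measures ANOVA of this kind can be computed directly from a subjects x websites score matrix. The sketch below uses synthetic Emotion Index scores in which a hypothetical third site runs higher (made-up numbers, not the study's data) to show the standard partition of sums of squares:

```python
import numpy as np

def rm_anova(data):
    """One-way repeated-measures ANOVA.
    data: (n subjects) x (k conditions) array of scores.
    Returns F, df_condition, df_error."""
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()
    ss_total = ((data - grand) ** 2).sum()
    ss_error = ss_total - ss_cond - ss_subj    # condition x subject residual
    df_cond, df_error = k - 1, (n - 1) * (k - 1)
    F = (ss_cond / df_cond) / (ss_error / df_error)
    return F, df_cond, df_error

# Synthetic Confusion indices: 30 subjects x 3 websites, with a per-subject
# baseline (the "repeated measures" part) and the third site shifted upward.
rng = np.random.default_rng(1)
scores = rng.normal(500, 50, size=(30, 1)) + rng.normal(0, 20, size=(30, 3))
scores[:, 2] += 100                            # made-up site effect
F, df1, df2 = rm_anova(scores)
```

Removing the subject term from the error is what makes the design repeated-measures: each participant serves as their own control, so stable individual differences in expressiveness do not inflate the error term.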

Emotion indices are also available on a second-by-second basis, which can be useful for comparing emotions when key events occur during a session; fluctuations and overall slope can also be observed. An example with Confusion is shown in Figure 7.

Figure 7: Confusion across 10 Minutes of Using Build a Car Tool. Note: Confusion in car builds is shown in second-by-second plots.

Survey Ratings: An analysis of the short survey ratings completed after each Build a Car session found that the Honda car build scored highest in Navigation, Speed, and Overall satisfaction, with Hyundai a close second. Kia's car build scored lowest in Appearance, Navigation, and Speed. As shown in Figure 8, these ratings are consistent with the face emotion results.

Figure 8: Ranking Car Builds by Survey Ratings. Note: Highest ratings are marked in green and lowest in orange.

When website satisfaction ratings were divided into low/high categories, Appearance, Navigation, and Speed were, as expected, highest when Overall satisfaction was high (see Figure 9).

Figure 9: Attributes during Low/High Satisfaction. Note: Appearance, Navigation, and Speed receive the highest ratings when satisfaction is high.

Face Emotions and Survey Ratings: How do face emotions relate to participant evaluations and satisfaction levels? Face emotions were found to be aligned with survey ratings; that is, there is a directional relationship between them. Negative face emotions were at higher levels when satisfaction ratings were lower.

Figure 10: Face Emotions during Low/High Satisfaction Ratings. Note: Face emotions are aligned with ratings, with negative emotions higher when satisfaction ratings are lower.

A correlational analysis also showed that negative emotions increased as survey ratings decreased. Why were negative emotions more prominent than positive ones? It may be that this website evaluation task felt more like work than fun.

Conclusions

Overall, face emotions were an accurate measure of participant reactions during the Build a Car website sessions as measured by the J.D. Power Emotion Index (zero to 1,000). Initial impressions (first 2 minutes) showed higher levels of Confusion (a mean index score of 745) and Disgust (609), and lower levels of Joy (237), in Kia's car build. During the latter part of the website session, higher Confusion (739) persisted throughout Kia's car build, with Surprise emerging (444). Honda and Hyundai did not exhibit these issues.

Factor analysis of the 11 emotions revealed three core underlying emotional dimensions used by participants during this automotive task: (1) Enjoyment; (2) Dislike; and (3) Perplexed. Further analysis showed that Kia scored highest on Perplexed and lowest on Enjoyment (more confusion) during initial impressions.

Correlations were found between attribute ratings and emotions: negative face emotions were high when overall website satisfaction was low. These results are corroborated by Kia's 2015 decision to dismiss its Web design firm in order to pursue a redesign, and the findings are also supported by the J.D. Power manufacturer website evaluation studies. Overall, the Honda and Hyundai car builds were straightforward and allowed users to easily build and explore car options. The Kia car build was attractive (nice photos and car views) but more complicated and harder to search and navigate.
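The correlational check described above can be sketched with synthetic data (hypothetical numbers, constructed so that a negative-emotion index rises as satisfaction ratings fall):

```python
import numpy as np

# Hypothetical session-level data: a 1-10 satisfaction rating and a
# 0-1,000 negative-emotion index built to move inversely to it.
rng = np.random.default_rng(2)
ratings = rng.integers(1, 11, size=90).astype(float)
neg_emotion = 1000 - 80 * ratings + rng.normal(0, 60, size=90)

# Pearson correlation between ratings and negative emotion
r = np.corrcoef(ratings, neg_emotion)[0, 1]    # strongly negative here
```

A strongly negative r is the signature reported in the study: as survey ratings decrease, negative face emotions increase.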
Moreover, participants' initial impressions seemed affected when a large pop-up panel window appeared at the very start of Kia's car build and required a ZIP code to continue. Although the Hyundai car build also asked for a ZIP code, it did so via a small pop-up panel described as needed for the latest rebates and prices. Honda did not ask for a ZIP code until the end of its car build, and only as an optional item.

Future Research and Applications

There are many applications of this research in providing nonobtrusive evaluations of human emotions to predict consumer behavior and attitudes toward products and services. Customer video feeds can be used to evaluate consumer reactions in automotive, retail, travel, hospitality, or similar environments. Face emotions can be gathered from video sources such as webcams, allowing digital-comfortable consumers (e.g., Millennials) to leave video-based service feedback or product reviews. There might also be security-based applications for assessing strongly polarized emotions. Moreover, it is possible to survey branches/facilities in order to assess and improve customer service.

There is an abundance of video opportunities for recognizing face emotions, so much so that questions of privacy and legal permission to record and process such video for evaluation of personal emotions may need to be addressed. One strategy might be to obtain approval for recordings similar to what is currently done when contacting a call center, where a request for approval to record for the purpose of improving customer service is made up front.

Future research is expected to investigate more thoroughly the core judgmental dimensions used in product evaluation and to assess how they persist across various industries. Empirical assessments of emotion indexing methods and data aggregation strategies are also important for hardening research paradigms and tools. Other opportunities include utilizing scenarios that elicit stronger emotions about products and services. Future research is also expected to delve deeper into the comparison of face emotions with survey-measured attitudes, with special attention to the persistence of these feelings and their power to predict consumer behavior. For example, are face emotions transitory and reflective of the moment, rather than the more enduring attitudes that predict consumer behavior? Perhaps face emotions are additive, and consistent reactions contribute to the formation of enduring attitudes. Perhaps facial expressions are inherent in the processing of emotions and always involved with attitudes at all levels. There are many research opportunities for evaluating more effective ways to blend biometrics, consumer attitudes, and the prediction of consumer behavior. We see many applications for nonobtrusive evaluations of emotions to predict consumer behavior and attitudes toward products and services.

Authors

Lee Quintanar, Ph.D., Director, Marketing Science, J.D. Power
Pete Trujillo, Senior Manager, J.D. Power
Jeremy Watson, Ph.D., Senior Statistician, J.D. Power

References

J.D. Power (2015). Biometrics Research Study (SM).
Bartlett, M. S., Littlewort, G., Braathen, B., Sejnowski, T. J., & Movellan, J. R. (2003). A prototype for automatic recognition of spontaneous facial actions. In S. Becker, S. Thrun, & K. Obermayer (Eds.), Advances in Neural Information Processing Systems, Vol. 15. MIT Press.
Bartlett, M. S., Littlewort, G. C., Frank, M. G., Lainscsek, C., Fasel, I., & Movellan, J. R. (2006). Automatic recognition of facial actions in spontaneous expressions. Journal of Multimedia, 1(6).
Cacioppo, J. T., Quintanar, L. R., Petty, R. E., & Snyder, C. W. (1981). Electroencephalographic, facial EMG, and cardiac changes during equivocal and less equivocal attitudinal processing [Abstract]. Psychophysiology, 18.

Cacioppo, J. T., Quintanar, L. R., Petty, R. E., & Snyder, C. W. (1979). Changes in cardiac and facial EMG activity during the forewarning, anticipation, and presentation of proattitudinal, counterattitudinal, and neutral communications [Abstract]. Psychophysiology, 16, 194.
Darwin, C. (1998). The Expression of the Emotions in Man and Animals. New York: Philosophical Library. (Original work published 1872.)
Ekman, P., et al. (1987). Universals and cultural differences in the judgments of facial expressions of emotion. Journal of Personality & Social Psychology, 53(4).
Ekman, P., Friesen, W. V., & Hager, J. C. (Eds.). (2002). Facial Action Coding System [e-book]. Salt Lake City, UT: Research Nexus.
Ekman, P., Friesen, W. V., & O'Sullivan, M. (1988). Smiles when lying. Journal of Personality and Social Psychology, 54.
Hjortsjö, C. H. (1969). Man's Face and Mimic Language. Lund, Sweden: Studentlitteratur.
imotions (2016a). Biometric Research Platform. Product website.
imotions (2016b). Publications resources for Facial Expressions Analysis.
imotions (2016c). Facial Expression Analysis: Everything You Need to Know to Elevate Your Research with Emotion Analytics: The Definitive Guide.
Kim, J., & Mueller, C. W. (1978a). Introduction to Factor Analysis (Sage University Paper Series on Quantitative Applications in the Social Sciences). Beverly Hills and London: Sage Publications.
Kim, J., & Mueller, C. W. (1978b). Factor Analysis: Statistical Methods and Practical Issues (Sage University Paper Series on Quantitative Applications in the Social Sciences). Beverly Hills and London: Sage Publications.
Littlewort, G., Bartlett, M., Fasel, I., Susskind, J., & Movellan, J. (2006). Dynamics of facial expression extracted automatically from video. Image and Vision Computing, 24(6).
Littlewort, G., Whitehill, J., Wu, T., Fasel, I., Frank, M., Movellan, J., & Bartlett, M. (2011). The Computer Expression Recognition Toolbox (CERT). Proc. IEEE International Conference on Automatic Face and Gesture Recognition.
Osgood, C. E., May, W. H., & Miron, M. S. (1975). Cross-Cultural Universals of Affective Meaning. University of Illinois Press.
Pantic, M., & Bartlett, M. S. (2007). Machine analysis of facial expressions. In K. Delac & M. Grgic (Eds.), Face Recognition. Vienna, Austria: I-Tech Education and Publishing.
Quintanar, L. R. (1982). The interactive computer as a social stimulus in computer-managed instruction: A theoretical and empirical analysis of the social psychological processes evoked during human-computer interaction (Doctoral dissertation). University of Notre Dame, Notre Dame, Indiana.

Wu, T., Bartlett, M. S., & Movellan, J. (2010). Facial expression recognition using Gabor motion energy filters. IEEE CVPR Workshop on Computer Vision and Pattern Recognition for Human Communicative Behavior Analysis.

Presented at the 2016 Council of American Survey Research Organizations (CASRO) Digital Conference, March 2016.

More information

Making a psychometric. Dr Benjamin Cowan- Lecture 9

Making a psychometric. Dr Benjamin Cowan- Lecture 9 Making a psychometric Dr Benjamin Cowan- Lecture 9 What this lecture will cover What is a questionnaire? Development of questionnaires Item development Scale options Scale reliability & validity Factor

More information

An assistive application identifying emotional state and executing a methodical healing process for depressive individuals.

An assistive application identifying emotional state and executing a methodical healing process for depressive individuals. An assistive application identifying emotional state and executing a methodical healing process for depressive individuals. Bandara G.M.M.B.O bhanukab@gmail.com Godawita B.M.D.T tharu9363@gmail.com Gunathilaka

More information

Detection of Facial Landmarks from Neutral, Happy, and Disgust Facial Images

Detection of Facial Landmarks from Neutral, Happy, and Disgust Facial Images Detection of Facial Landmarks from Neutral, Happy, and Disgust Facial Images Ioulia Guizatdinova and Veikko Surakka Research Group for Emotions, Sociality, and Computing Tampere Unit for Computer-Human

More information

IMPLEMENTATION OF AN AUTOMATED SMART HOME CONTROL FOR DETECTING HUMAN EMOTIONS VIA FACIAL DETECTION

IMPLEMENTATION OF AN AUTOMATED SMART HOME CONTROL FOR DETECTING HUMAN EMOTIONS VIA FACIAL DETECTION IMPLEMENTATION OF AN AUTOMATED SMART HOME CONTROL FOR DETECTING HUMAN EMOTIONS VIA FACIAL DETECTION Lim Teck Boon 1, Mohd Heikal Husin 2, Zarul Fitri Zaaba 3 and Mohd Azam Osman 4 1 Universiti Sains Malaysia,

More information

Automatic Facial Expression Recognition Using Boosted Discriminatory Classifiers

Automatic Facial Expression Recognition Using Boosted Discriminatory Classifiers Automatic Facial Expression Recognition Using Boosted Discriminatory Classifiers Stephen Moore and Richard Bowden Centre for Vision Speech and Signal Processing University of Surrey, Guildford, GU2 7JW,

More information

International Journal of Research in Science and Technology. (IJRST) 2018, Vol. No. 8, Issue No. IV, Oct-Dec e-issn: , p-issn: X

International Journal of Research in Science and Technology. (IJRST) 2018, Vol. No. 8, Issue No. IV, Oct-Dec e-issn: , p-issn: X CLOUD FILE SHARING AND DATA SECURITY THREATS EXPLORING THE EMPLOYABILITY OF GRAPH-BASED UNSUPERVISED LEARNING IN DETECTING AND SAFEGUARDING CLOUD FILES Harshit Yadav Student, Bal Bharati Public School,

More information

A Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection

A Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection A Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection Tobias Gehrig and Hazım Kemal Ekenel Facial Image Processing and Analysis Group, Institute for Anthropomatics Karlsruhe

More information

ANALYSIS OF FACIAL FEATURES OF DRIVERS UNDER COGNITIVE AND VISUAL DISTRACTIONS

ANALYSIS OF FACIAL FEATURES OF DRIVERS UNDER COGNITIVE AND VISUAL DISTRACTIONS ANALYSIS OF FACIAL FEATURES OF DRIVERS UNDER COGNITIVE AND VISUAL DISTRACTIONS Nanxiang Li and Carlos Busso Multimodal Signal Processing (MSP) Laboratory Department of Electrical Engineering, The University

More information

Running head: FACIAL EXPRESSION AND SKIN COLOR ON APPROACHABILITY 1. Influence of facial expression and skin color on approachability judgment

Running head: FACIAL EXPRESSION AND SKIN COLOR ON APPROACHABILITY 1. Influence of facial expression and skin color on approachability judgment Running head: FACIAL EXPRESSION AND SKIN COLOR ON APPROACHABILITY 1 Influence of facial expression and skin color on approachability judgment Federico Leguizamo Barroso California State University Northridge

More information

FACIAL EXPRESSION RECOGNITION FROM IMAGE SEQUENCES USING SELF-ORGANIZING MAPS

FACIAL EXPRESSION RECOGNITION FROM IMAGE SEQUENCES USING SELF-ORGANIZING MAPS International Archives of Photogrammetry and Remote Sensing. Vol. XXXII, Part 5. Hakodate 1998 FACIAL EXPRESSION RECOGNITION FROM IMAGE SEQUENCES USING SELF-ORGANIZING MAPS Ayako KATOH*, Yasuhiro FUKUI**

More information

TWO HANDED SIGN LANGUAGE RECOGNITION SYSTEM USING IMAGE PROCESSING

TWO HANDED SIGN LANGUAGE RECOGNITION SYSTEM USING IMAGE PROCESSING 134 TWO HANDED SIGN LANGUAGE RECOGNITION SYSTEM USING IMAGE PROCESSING H.F.S.M.Fonseka 1, J.T.Jonathan 2, P.Sabeshan 3 and M.B.Dissanayaka 4 1 Department of Electrical And Electronic Engineering, Faculty

More information

SmileMaze: A Tutoring System in Real-Time Facial Expression Perception and Production in Children with Autism Spectrum Disorder

SmileMaze: A Tutoring System in Real-Time Facial Expression Perception and Production in Children with Autism Spectrum Disorder SmileMaze: A Tutoring System in Real-Time Facial Expression Perception and Production in Children with Autism Spectrum Disorder Jeff Cockburn 1, Marni Bartlett 2, James Tanaka 1, Javier Movellan 2, Matt

More information

Music Recommendation System for Human Attention Modulation by Facial Recognition on a driving task: A Proof of Concept

Music Recommendation System for Human Attention Modulation by Facial Recognition on a driving task: A Proof of Concept Music Recommendation System for Human Attention Modulation by Facial Recognition on a driving task: A Proof of Concept Roberto Avila - Vázquez 1, Sergio Navarro Tuch 1, Rogelio Bustamante Bello, Ricardo

More information

Social Context Based Emotion Expression

Social Context Based Emotion Expression Social Context Based Emotion Expression Radosław Niewiadomski (1), Catherine Pelachaud (2) (1) University of Perugia, Italy (2) University Paris VIII, France radek@dipmat.unipg.it Social Context Based

More information

The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression

The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression Patrick Lucey 1,2, Jeffrey F. Cohn 1,2, Takeo Kanade 1, Jason Saragih 1, Zara Ambadar 2 Robotics

More information

D4.10 Demonstrator 4 EoT Application

D4.10 Demonstrator 4 EoT Application Horizon 2020 PROGRAMME ICT-01-2014: Smart Cyber-Physical Systems This project has received funding from the European Union s Horizon 2020 research and innovation programme under Grant Agreement No 643924

More information

Report. Automatic Decoding of Facial Movements Reveals Deceptive Pain Expressions

Report. Automatic Decoding of Facial Movements Reveals Deceptive Pain Expressions Current Biology 24, 738 743, March 31, 2014 ª2014 Elsevier Ltd All rights reserved http://dx.doi.org/10.1016/j.cub.2014.02.009 Automatic Decoding of Facial Movements Reveals Deceptive Pain Expressions

More information

Subjective randomness and natural scene statistics

Subjective randomness and natural scene statistics Psychonomic Bulletin & Review 2010, 17 (5), 624-629 doi:10.3758/pbr.17.5.624 Brief Reports Subjective randomness and natural scene statistics Anne S. Hsu University College London, London, England Thomas

More information

Facial Expression Biometrics Using Tracker Displacement Features

Facial Expression Biometrics Using Tracker Displacement Features Facial Expression Biometrics Using Tracker Displacement Features Sergey Tulyakov 1, Thomas Slowe 2,ZhiZhang 1, and Venu Govindaraju 1 1 Center for Unified Biometrics and Sensors University at Buffalo,

More information

Pupil Dilation as an Indicator of Cognitive Workload in Human-Computer Interaction

Pupil Dilation as an Indicator of Cognitive Workload in Human-Computer Interaction Pupil Dilation as an Indicator of Cognitive Workload in Human-Computer Interaction Marc Pomplun and Sindhura Sunkara Department of Computer Science, University of Massachusetts at Boston 100 Morrissey

More information

Spotting Liars and Deception Detection skills - people reading skills in the risk context. Alan Hudson

Spotting Liars and Deception Detection skills - people reading skills in the risk context. Alan Hudson Spotting Liars and Deception Detection skills - people reading skills in the risk context Alan Hudson < AH Business Psychology 2016> This presentation has been prepared for the Actuaries Institute 2016

More information

To What Extent Can the Recognition of Unfamiliar Faces be Accounted for by the Direct Output of Simple Cells?

To What Extent Can the Recognition of Unfamiliar Faces be Accounted for by the Direct Output of Simple Cells? To What Extent Can the Recognition of Unfamiliar Faces be Accounted for by the Direct Output of Simple Cells? Peter Kalocsai, Irving Biederman, and Eric E. Cooper University of Southern California Hedco

More information

Reveal Relationships in Categorical Data

Reveal Relationships in Categorical Data SPSS Categories 15.0 Specifications Reveal Relationships in Categorical Data Unleash the full potential of your data through perceptual mapping, optimal scaling, preference scaling, and dimension reduction

More information

Assessment of Impact on Health and Environmental Building Performance of Projects Utilizing the Green Advantage LEED ID Credit

Assessment of Impact on Health and Environmental Building Performance of Projects Utilizing the Green Advantage LEED ID Credit July 26, 2009 Assessment of Impact on Health and Environmental Building Performance of Projects Utilizing the Green Advantage LEED ID Credit This study, undertaken collaboratively by the s Powell Center

More information

CS-E Deep Learning Session 4: Convolutional Networks

CS-E Deep Learning Session 4: Convolutional Networks CS-E4050 - Deep Learning Session 4: Convolutional Networks Jyri Kivinen Aalto University 23 September 2015 Credits: Thanks to Tapani Raiko for slides material. CS-E4050 - Deep Learning Session 4: Convolutional

More information

CHAPTER NINE INTERPERSONAL DETERMINANTS OF CONSUMER BEHAVIOR

CHAPTER NINE INTERPERSONAL DETERMINANTS OF CONSUMER BEHAVIOR CHAPTER NINE INTERPERSONAL DETERMINANTS OF CONSUMER BEHAVIOR CHAPTER OBJECTIVES Differentiate between customer behavior and consumer behavior Explain how marketers classify behavioral influences on consumer

More information

Affective Game Engines: Motivation & Requirements

Affective Game Engines: Motivation & Requirements Affective Game Engines: Motivation & Requirements Eva Hudlicka Psychometrix Associates Blacksburg, VA hudlicka@ieee.org psychometrixassociates.com DigiPen Institute of Technology February 20, 2009 1 Outline

More information

Viewpoint Dependence in Human Spatial Memory

Viewpoint Dependence in Human Spatial Memory From: AAAI Technical Report SS-96-03. Compilation copyright 1996, AAAI (www.aaai.org). All rights reserved. Viewpoint Dependence in Human Spatial Memory Timothy P. McNamara Vaibhav A. Diwadkar Department

More information

Implementation of image processing approach to translation of ASL finger-spelling to digital text

Implementation of image processing approach to translation of ASL finger-spelling to digital text Rochester Institute of Technology RIT Scholar Works Articles 2006 Implementation of image processing approach to translation of ASL finger-spelling to digital text Divya Mandloi Kanthi Sarella Chance Glenn

More information

Contrastive Analysis on Emotional Cognition of Skeuomorphic and Flat Icon

Contrastive Analysis on Emotional Cognition of Skeuomorphic and Flat Icon Contrastive Analysis on Emotional Cognition of Skeuomorphic and Flat Icon Xiaoming Zhang, Qiang Wang and Yan Shi Abstract In the field of designs of interface and icons, as the skeuomorphism style fades

More information

7 Grip aperture and target shape

7 Grip aperture and target shape 7 Grip aperture and target shape Based on: Verheij R, Brenner E, Smeets JBJ. The influence of target object shape on maximum grip aperture in human grasping movements. Exp Brain Res, In revision 103 Introduction

More information

Statistical and Neural Methods for Vision-based Analysis of Facial Expressions and Gender

Statistical and Neural Methods for Vision-based Analysis of Facial Expressions and Gender Proc. IEEE Int. Conf. on Systems, Man and Cybernetics (SMC 2004), Den Haag, pp. 2203-2208, IEEE omnipress 2004 Statistical and Neural Methods for Vision-based Analysis of Facial Expressions and Gender

More information

Recognising Emotions from Keyboard Stroke Pattern

Recognising Emotions from Keyboard Stroke Pattern Recognising Emotions from Keyboard Stroke Pattern Preeti Khanna Faculty SBM, SVKM s NMIMS Vile Parle, Mumbai M.Sasikumar Associate Director CDAC, Kharghar Navi Mumbai ABSTRACT In day to day life, emotions

More information

Exam Review Day One. Please sign in up front!

Exam Review Day One. Please sign in up front! Exam Review Day One Please sign in up front! Today... We will be covering: Thinking and Problem Solving, Motivation, Emotion, and Intelligence. Thinking and Problem Solving Thinking and Problem Solving

More information

FUSE TECHNICAL REPORT

FUSE TECHNICAL REPORT FUSE TECHNICAL REPORT 1 / 16 Contents Page 3 Page 4 Page 8 Page 10 Page 13 Page 16 Introduction FUSE Accuracy Validation Testing LBD Risk Score Model Details FUSE Risk Score Implementation Details FUSE

More information

EXTRACTION OF RETINAL BLOOD VESSELS USING IMAGE PROCESSING TECHNIQUES

EXTRACTION OF RETINAL BLOOD VESSELS USING IMAGE PROCESSING TECHNIQUES EXTRACTION OF RETINAL BLOOD VESSELS USING IMAGE PROCESSING TECHNIQUES T.HARI BABU 1, Y.RATNA KUMAR 2 1 (PG Scholar, Dept. of Electronics and Communication Engineering, College of Engineering(A), Andhra

More information

CASME Database: A Dataset of Spontaneous Micro-Expressions Collected From Neutralized Faces

CASME Database: A Dataset of Spontaneous Micro-Expressions Collected From Neutralized Faces CASME Database: A Dataset of Spontaneous Micro-Expressions Collected From Neutralized Faces Wen-Jing Yan, Qi Wu, Yong-Jin Liu, Su-Jing Wang and Xiaolan Fu* Abstract Micro-expressions are facial expressions

More information

Beyond AI: Bringing Emotional Intelligence to the Digital

Beyond AI: Bringing Emotional Intelligence to the Digital Beyond AI: Bringing Emotional Intelligence to the Digital World Emotions influence every aspect of our lives how we live, work and play to the decisions we make We are surrounded by hyper-connected devices,

More information

Finding the Truth: Interview and Interrogation Training Simulations. Ron Punako Senior Software Engineer

Finding the Truth: Interview and Interrogation Training Simulations. Ron Punako Senior Software Engineer Finding the Truth: Interview and Interrogation Training Simulations Ron Punako Senior Software Engineer 1 CTC Overview 501(c)(3) nonprofit established in 1987 Staff of 1,400+ professionals More than 50

More information

Drive-reducing behaviors (eating, drinking) Drive (hunger, thirst) Need (food, water)

Drive-reducing behaviors (eating, drinking) Drive (hunger, thirst) Need (food, water) Instinct Theory: we are motivated by our inborn automated behaviors that generally lead to survival. But instincts only explain why we do a small fraction of our behaviors. Does this behavior adequately

More information

Squid: Exercise Effectiveness and. Muscular Activation Tracking

Squid: Exercise Effectiveness and. Muscular Activation Tracking 1 Squid: Exercise Effectiveness and Muscular Activation Tracking Design Team Trevor Lorden, Adam Morgan, Kyle Peters, Joseph Sheehan, Thomas Wilbur Interactive Media Alexandra Aas, Alexandra Moran, Amy

More information

Facial Expression Analysis for Estimating Pain in Clinical Settings

Facial Expression Analysis for Estimating Pain in Clinical Settings Facial Expression Analysis for Estimating Pain in Clinical Settings Karan Sikka University of California San Diego 9450 Gilman Drive, La Jolla, California, USA ksikka@ucsd.edu ABSTRACT Pain assessment

More information

Perceived similarity and visual descriptions in content-based image retrieval

Perceived similarity and visual descriptions in content-based image retrieval University of Wollongong Research Online Faculty of Informatics - Papers (Archive) Faculty of Engineering and Information Sciences 2007 Perceived similarity and visual descriptions in content-based image

More information

REAL-TIME SMILE SONIFICATION USING SURFACE EMG SIGNAL AND THE EVALUATION OF ITS USABILITY.

REAL-TIME SMILE SONIFICATION USING SURFACE EMG SIGNAL AND THE EVALUATION OF ITS USABILITY. REAL-TIME SMILE SONIFICATION USING SURFACE EMG SIGNAL AND THE EVALUATION OF ITS USABILITY Yuki Nakayama 1 Yuji Takano 2 Masaki Matsubara 3 Kenji Suzuki 4 Hiroko Terasawa 3,5 1 Graduate School of Library,

More information

CONSUMERS PREFERENCE ON SCOOTER DESIGN WITH GENDER- NEUTRAL STYLE

CONSUMERS PREFERENCE ON SCOOTER DESIGN WITH GENDER- NEUTRAL STYLE CONSUMERS PREFERENCE ON SCOOTER DESIGN WITH GENDER- NEUTRAL STYLE Chun-Chih Chen 1 and I-Jen Sung 2 1 Department of Industrial Design, National Kaohsiung Normal University, Kaohsiung City, Taiwan 2 Department

More information

From Dials to Facial Coding: Automated Detection of Spontaneous Facial Expressions for Media Research

From Dials to Facial Coding: Automated Detection of Spontaneous Facial Expressions for Media Research From Dials to Facial Coding: Automated Detection of Spontaneous Facial Expressions for Media Research Evan Kodra, Thibaud Senechal, Daniel McDuff, Rana el Kaliouby Abstract Typical consumer media research

More information

Generalization of a Vision-Based Computational Model of Mind-Reading

Generalization of a Vision-Based Computational Model of Mind-Reading Generalization of a Vision-Based Computational Model of Mind-Reading Rana el Kaliouby and Peter Robinson Computer Laboratory, University of Cambridge, 5 JJ Thomson Avenue, Cambridge UK CB3 FD Abstract.

More information

Development of novel algorithm by combining Wavelet based Enhanced Canny edge Detection and Adaptive Filtering Method for Human Emotion Recognition

Development of novel algorithm by combining Wavelet based Enhanced Canny edge Detection and Adaptive Filtering Method for Human Emotion Recognition International Journal of Engineering Research and Development e-issn: 2278-067X, p-issn: 2278-800X, www.ijerd.com Volume 12, Issue 9 (September 2016), PP.67-72 Development of novel algorithm by combining

More information

Understanding Emotions. How does this man feel in each of these photos?

Understanding Emotions. How does this man feel in each of these photos? Understanding Emotions How does this man feel in each of these photos? Emotions Lecture Overview What are Emotions? Facial displays of emotion Culture-based and sex-based differences Definitions Spend

More information

training & research for academic newcomers A project of the King Baudouin Foundation

training & research for academic newcomers A project of the King Baudouin Foundation training & research for academic newcomers A project of the King Baudouin Foundation Communication and Presentation skills Hanna Mamzer, Bernd-Friedrich Voigt, Thomas Gebhardt Structure of the day: 9.00-10.45

More information

WHITE PAPER. Efficient Measurement of Large Light Source Near-Field Color and Luminance Distributions for Optical Design and Simulation

WHITE PAPER. Efficient Measurement of Large Light Source Near-Field Color and Luminance Distributions for Optical Design and Simulation Efficient Measurement of Large Light Source Near-Field Color and Luminance Distributions for Optical Design and Simulation Efficient Measurement of Large Light Source Near-Field Color and Luminance Distributions

More information

On Shape And the Computability of Emotions X. Lu, et al.

On Shape And the Computability of Emotions X. Lu, et al. On Shape And the Computability of Emotions X. Lu, et al. MICC Reading group 10.07.2013 1 On Shape and the Computability of Emotion X. Lu, P. Suryanarayan, R. B. Adams Jr., J. Li, M. G. Newman, J. Z. Wang

More information

Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition

Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition , pp.131-135 http://dx.doi.org/10.14257/astl.2013.39.24 Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition SeungTaek Ryoo and Jae-Khun Chang School of Computer Engineering

More information

When fitting patients with hearing aids, the

When fitting patients with hearing aids, the COVER STORY MarkeTrak VIII Patients report improved quality of life with hearing aid usage By Sergei Kochkin, PhD This is the fourth installment from the MarkeTrak VIII database exploring customer satisfaction

More information

It takes 2 to Tango: The role of emotion in service interactions

It takes 2 to Tango: The role of emotion in service interactions It takes 2 to Tango: The role of emotion in service interactions March 19, 2015 Véronique TRAN Tango, you said tango? https://www.youtube.com/watch?v=gssfp8nrvyg 2 Emotion and business are much more connected

More information

SmileTracker: Automatically and Unobtrusively Recording Smiles and their Context

SmileTracker: Automatically and Unobtrusively Recording Smiles and their Context SmileTracker: Automatically and Unobtrusively Recording Smiles and their Context Natasha Jaques * MIT Media Lab 75 Amherst St. Cambridge, MA 02142 USA jaquesn@mit.edu * Both authors contributed equally

More information

What is Emotion? Emotion is a 4 part process consisting of: physiological arousal cognitive interpretation, subjective feelings behavioral expression.

What is Emotion? Emotion is a 4 part process consisting of: physiological arousal cognitive interpretation, subjective feelings behavioral expression. What is Emotion? Emotion is a 4 part process consisting of: physiological arousal cognitive interpretation, subjective feelings behavioral expression. While our emotions are very different, they all involve

More information

Mammogram Analysis: Tumor Classification

Mammogram Analysis: Tumor Classification Mammogram Analysis: Tumor Classification Term Project Report Geethapriya Raghavan geeragh@mail.utexas.edu EE 381K - Multidimensional Digital Signal Processing Spring 2005 Abstract Breast cancer is the

More information

Measuring Focused Attention Using Fixation Inner-Density

Measuring Focused Attention Using Fixation Inner-Density Measuring Focused Attention Using Fixation Inner-Density Wen Liu, Mina Shojaeizadeh, Soussan Djamasbi, Andrew C. Trapp User Experience & Decision Making Research Laboratory, Worcester Polytechnic Institute

More information

Employee Recruitment: Question Formation for Employment Interviews based on Facial Cue Analytics

Employee Recruitment: Question Formation for Employment Interviews based on Facial Cue Analytics Employee Recruitment: Question Formation for Employment Interviews based on Facial Cue Analytics A.T. Rupasinghe* (tharuka@compsoc.lk) N.L. Gunawardena (nadeesha.lg@gmail.com) S. Shujan (msshuji@gmail.com)

More information

A Deep Learning Approach for Subject Independent Emotion Recognition from Facial Expressions

A Deep Learning Approach for Subject Independent Emotion Recognition from Facial Expressions A Deep Learning Approach for Subject Independent Emotion Recognition from Facial Expressions VICTOR-EMIL NEAGOE *, ANDREI-PETRU BĂRAR *, NICU SEBE **, PAUL ROBITU * * Faculty of Electronics, Telecommunications

More information

Enhanced Facial Expressions Recognition using Modular Equable 2DPCA and Equable 2DPC

Enhanced Facial Expressions Recognition using Modular Equable 2DPCA and Equable 2DPC Enhanced Facial Expressions Recognition using Modular Equable 2DPCA and Equable 2DPC Sushma Choudhar 1, Sachin Puntambekar 2 1 Research Scholar-Digital Communication Medicaps Institute of Technology &

More information

Active User Affect Recognition and Assistance

Active User Affect Recognition and Assistance Active User Affect Recognition and Assistance Wenhui Liao, Zhiwe Zhu, Markus Guhe*, Mike Schoelles*, Qiang Ji, and Wayne Gray* Email: jiq@rpi.edu Department of Electrical, Computer, and System Eng. *Department

More information

Temporal Context and the Recognition of Emotion from Facial Expression

Temporal Context and the Recognition of Emotion from Facial Expression Temporal Context and the Recognition of Emotion from Facial Expression Rana El Kaliouby 1, Peter Robinson 1, Simeon Keates 2 1 Computer Laboratory University of Cambridge Cambridge CB3 0FD, U.K. {rana.el-kaliouby,

More information

PSYC 222 Motivation and Emotions

PSYC 222 Motivation and Emotions PSYC 222 Motivation and Emotions Session 6 The Concept of Emotion Lecturer: Dr. Annabella Osei-Tutu, Psychology Department Contact Information: aopare-henaku@ug.edu.gh College of Education School of Continuing

More information

Efficient Measurement of Large Light Source Near-field Color and Luminance Distributions for Optical Design and Simulation

Efficient Measurement of Large Light Source Near-field Color and Luminance Distributions for Optical Design and Simulation Efficient Measurement of Large Light Source Near-field Color and Luminance Distributions for Optical Design and Simulation Hubert Kostal*, Douglas Kreysar, Ronald Rykowski Radiant Imaging, Inc., 22908

More information

Examining the Psychometric Properties of The McQuaig Occupational Test

Examining the Psychometric Properties of The McQuaig Occupational Test Examining the Psychometric Properties of The McQuaig Occupational Test Prepared for: The McQuaig Institute of Executive Development Ltd., Toronto, Canada Prepared by: Henryk Krajewski, Ph.D., Senior Consultant,

More information