Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition

pp. 131-135, http://dx.doi.org/10.14257/astl.2013.39.24

SeungTaek Ryoo and Jae-Khun Chang
School of Computer Engineering, Hanshin University, Osan City, South Korea
{stryoo, jchang}@hs.ac.kr

Abstract. Emotion recognition is one of the most important factors in emotional ICT. In this paper, we propose an emotion affective color transfer system. To recognize human emotion, we use feature-based facial expression recognition. The proposed system modifies image contents according to the individual viewer's emotion. To transfer image contents emotionally, we analyze color, shape, and stroke information. The emotion affective system can be applied to various multimedia contents such as music, movies, and games.

Keywords: Human Emotion, Facial Expression, Non-photorealistic Rendering.

1 Introduction

While there is extensive research on emotion in the workplace and on information and communication technology (ICT) implementation, the emotionality of ICT implementation and of change management more generally has been largely ignored, even though the emotional experience of such processes is critical to their success. Emotion recognition is one of the most important factors in emotional ICT. In most state-of-the-art approaches to emotion recognition, features are extracted from images, speech, and bio-signals and classified into specific emotional categories using pre-trained recognition models.

In this paper, we propose an emotion affective color transfer system. To recognize human emotion, we use feature-based facial expression recognition. The proposed system modifies image contents according to the individual viewer's emotion. To transfer image contents emotionally, we analyze color, shape, and stroke information. The emotion affective system can be applied to various multimedia contents such as music, movies, and games.

2 Previous Works

One of the means of showing emotion is through changes in facial expressions. Apart from the six basic emotions (love, joy, surprise, anger, sadness, fear), the human face is capable of displaying expressions for a variety of other emotions. In 2000, Parrott identified 136 emotional states that humans are capable of displaying and categorized them into separate classes and subclasses [1]. In more recent years, there have been attempts at recognizing expressions other than the six basic ones. One technique for recognizing non-basic expressions is to automatically recognize individual facial action units (AUs), which in turn helps in recognizing finer changes in expression. An example of such a system is Tian et al.'s AFA system [2]. Most of the developed methods attempt to recognize the basic expressions, and some attempt to recognize non-basic expressions. However, there have been very few attempts at recognizing the temporal dynamics of the face. Temporal dynamics refers to the timing and duration of facial activities; the important terms used in connection with temporal dynamics are onset, apex, and offset [3].

Having discussed emotions and the associated facial expressions, let us now look at facial features. Facial features can be classified as permanent or transient. Permanent features, such as the eyes, lips, brows, and cheeks, are always present. Transient features include facial lines, brow wrinkles, and deepened furrows that appear with changes in expression and disappear on a neutral face. Tian et al.'s AFA system uses the recognition and tracking of permanent and transient features to automatically detect AUs.

3 Emotion Affective Image Transfer

Fig. 1. The overview of emotion affective image transfer

The proposed system consists of human emotion modeling, facial expression based emotion recognition, and emotion affective image transfer. To define human emotion, we use a 2D emotional model (pleasant-unpleasant and arousal-relaxation). An individual's emotion is detected using feature-based facial expression recognition, and the color of the input image is transferred to an emotion affective image using an emotional color wheel.

3.1 Human Emotion Modeling

Emotion can be differentiated from a number of similar constructs within the field of affective neuroscience. Feelings are best understood as a subjective representation of emotions, private to the individual experiencing them. Moods are diffuse affective states that generally last much longer than emotions and are usually less intense. Affect is an encompassing term used to describe emotions, feelings, and moods together, even though it is commonly used interchangeably with emotion.

In order to model personal emotions, we first classify individual personal sensibilities. Based on Russell's 2D emotional model [4], we locate an emotion on two axes, pleasant-unpleasant and arousal-relaxation. Each axis takes a value from -5 to 5, and Russell's 2D emotion model is divided into nine quadrants with a standard sensitivity assigned to each, so that emotional data can be represented by the quadrant in which the measured values fall.
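To make the quadrant scheme concrete, the short sketch below maps a (pleasant-unpleasant, arousal-relaxation) pair in [-5, 5] to one of the nine quadrants. The dead-zone threshold and the emotion label attached to each quadrant are illustrative assumptions; the paper specifies only the axis range and the nine-quadrant division.

```python
# Minimal sketch of the nine-quadrant emotion model of Sec. 3.1.
# The +/-1.5 dead zone and the label per quadrant are illustrative assumptions;
# the paper only states that each axis ranges from -5 to 5 and that Russell's
# 2D model is divided into nine quadrants.

def quadrant(pleasantness: float, arousal: float, dead_zone: float = 1.5) -> tuple[int, int]:
    """Map each axis value in [-5, 5] to -1, 0, or +1 (nine quadrants in total)."""
    def bucket(v: float) -> int:
        if v > dead_zone:
            return 1
        if v < -dead_zone:
            return -1
        return 0
    return bucket(pleasantness), bucket(arousal)

# Hypothetical labels for the nine quadrants (not taken from the paper).
QUADRANT_LABELS = {
    ( 1,  1): "excited",  ( 0,  1): "aroused",  (-1,  1): "distressed",
    ( 1,  0): "pleased",  ( 0,  0): "neutral",  (-1,  0): "displeased",
    ( 1, -1): "relaxed",  ( 0, -1): "calm",     (-1, -1): "depressed",
}

if __name__ == "__main__":
    print(QUADRANT_LABELS[quadrant(4.0, -3.0)])  # -> "relaxed"
```

In the full system, these axis values would be produced by the facial expression classifier described in Sect. 3.2.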

3.2 Facial Expression Recognition

To recognize human emotion, we use feature-based facial expression recognition. The proposed emotion detection system consists of facial area detection, feature point extraction, and emotional facial expression recognition. To detect the facial area, we use Haar-like features with the AdaBoost learning procedure [5]. Facial feature points are extracted using Active Shape Models (ASM) [6], and the number of extracted features is reduced using Principal Component Analysis (PCA) [7]. To recognize emotion from the extracted features, we use a Support Vector Machine (SVM) [8], a standard machine learning classifier.
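A rough sketch of this pipeline is shown below, using OpenCV's Haar cascade detector and scikit-learn's PCA and SVM, which correspond to the components cited above. Because there is no single standard ASM implementation, the sketch substitutes a resized face crop for the ASM feature points; actual landmark coordinates would feed the same PCA and SVM stages. The crop size, PCA dimensionality, and training data are assumptions, not values from the paper.

```python
# Sketch of the face detection -> features -> PCA -> SVM pipeline of Sec. 3.2.
# The ASM feature-point step [6] is replaced here by a resized face crop for
# simplicity; real landmark coordinates would plug into the same pipeline.
import cv2
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Haar-like features + AdaBoost cascade face detector, as in [5].
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_features(path: str, size: int = 32) -> np.ndarray:
    """Detect the largest face in an image and return a flat feature vector."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError(f"no face found in {path}")
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    crop = cv2.resize(gray[y:y + h, x:x + w], (size, size))
    return crop.flatten().astype(np.float32) / 255.0

# PCA reduces the feature dimensionality [7]; an SVM classifies emotions [8].
model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))

def train(image_paths, emotion_labels):
    """image_paths and emotion_labels are assumed, pre-labeled training data."""
    X = np.vstack([face_features(p) for p in image_paths])
    model.fit(X, emotion_labels)

def predict_emotion(path: str):
    return model.predict(face_features(path).reshape(1, -1))[0]
```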

3.3 Emotional Color Transfer

We transfer the color of the input image to an emotion affective image using an emotional color wheel [9]. The color palette of the input image is extracted from its color distribution. According to the personal emotion classified from the facial expression, an emotional color palette is computed from the emotional color wheel. The emotion affective color image is then produced by mapping the color palette of the input image to the emotional color palette. Figure 2 shows an example result of the proposed emotional color transfer method.

Fig. 2. The result of emotional color transfer
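The sketch below shows one plausible realization of this palette mapping: the input palette is obtained by k-means clustering of the pixel colors, and each palette color is shifted toward a target hue chosen per emotion. The emotion-to-hue table, cluster count, and blending weight are assumptions for illustration; the paper relies on an emotional color wheel [9] whose exact palette values are not given here.

```python
# Illustrative palette-based emotional color transfer (Sec. 3.3).
# The emotion -> hue table and the blending weight are assumptions; only the
# overall flow (extract palette, derive emotional palette, remap colors)
# follows the paper.
import cv2
import numpy as np

# Hypothetical target hues (OpenCV HSV hue range 0-179) per recognized emotion.
EMOTION_HUE = {"joy": 22, "anger": 0, "sadness": 110, "fear": 140, "surprise": 45, "love": 160}

def extract_palette(img_bgr: np.ndarray, k: int = 5):
    """Cluster pixel colors with k-means; return (palette, per-pixel labels)."""
    pixels = img_bgr.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(pixels, k, None, criteria, 3, cv2.KMEANS_PP_CENTERS)
    return centers, labels.flatten()

def emotional_palette(palette_bgr: np.ndarray, emotion: str, strength: float = 0.6) -> np.ndarray:
    """Shift each palette color's hue toward the emotion's target hue."""
    hsv = cv2.cvtColor(palette_bgr.astype(np.uint8).reshape(-1, 1, 3),
                       cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[:, 0, 0] = (1.0 - strength) * hsv[:, 0, 0] + strength * EMOTION_HUE[emotion]
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR).reshape(-1, 3)

def transfer(img_bgr: np.ndarray, emotion: str) -> np.ndarray:
    palette, labels = extract_palette(img_bgr)
    new_palette = emotional_palette(palette, emotion)
    # Replace every pixel by its (recolored) palette entry.
    return new_palette[labels].reshape(img_bgr.shape).astype(np.uint8)

if __name__ == "__main__":
    image = cv2.imread("input.jpg")  # assumed input file
    cv2.imwrite("output.jpg", transfer(image, "joy"))
```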

4 Conclusion

In this paper, we propose an emotion affective color transfer system. To recognize human emotion, we define a 2D emotional model and use feature-based facial expression recognition with a machine learning classifier. To transfer image contents emotionally, we convert the color of the input image to an emotion affective image using an emotional color wheel. However, the proposed system transfers only the color information of the image. In future work, we will develop techniques for emotional shape and stroke transfer using shape and stroke information, and the emotion affective system can be extended to various multimedia contents such as music, movies, and games.

Acknowledgments. This research was supported by a Hanshin University Research Grant.

References

1. W.G. Parrott, Emotions in Social Psychology, Psychology Press, Philadelphia, October 2000.
2. Y. Tian, T. Kanade and J. Cohn, Recognizing Action Units for Facial Expression Analysis, IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 23, no. 2, pp. 97-115, 2001.
3. P. Ekman and E.L. Rosenberg, What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS), Illustrated Edition, Oxford University Press, 1997.
4. J. Russell, Pancultural Aspects of the Human Conceptual Organization of Emotions, J. of Personality and Social Psychology, vol. 45, pp. 1281-1288, 1983.
5. P. Viola and M. Jones, Robust Real-Time Face Detection, International Journal of Computer Vision, vol. 57, pp. 137-154, 2004.
6. T.F. Cootes, C.J. Taylor, D.H. Cooper and J. Graham, Active Shape Models - Their Training and Application, Computer Vision and Image Understanding, vol. 61, pp. 38-59, 1995.
7. H. Abdi and L.J. Williams, Principal Component Analysis, Wiley Interdisciplinary Reviews: Computational Statistics, pp. 433-459, 2010.
8. C. Cortes and V.N. Vapnik, Support-Vector Networks, Machine Learning, vol. 20, 1995.
9. R. Plutchik, The Nature of Emotions, American Scientist, vol. 89, no. 4, p. 344, 2001.