, pp.131-135 http://dx.doi.org/10.14257/astl.2013.39.24

Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition

SeungTaek Ryoo and Jae-Khun Chang
School of Computer Engineering, Hanshin Univ., Osan City, South Korea
{stryoo, jchang}@hs.ac.kr

Abstract. Emotion recognition is one of the most important factors in emotional ICT. In this paper, we suggest an emotion affective color transfer system. To recognize human emotion, we use feature-based facial expression recognition. The suggested system adapts image contents to an individual's emotion. To emotionally transfer image contents, we analyze color, shape and stroke information. The emotion affective system can be applied to various multimedia contents such as music, movies and games.

Keywords: Human Emotion, Facial Expression, Non-photorealistic Rendering.

1 Introduction

While there is extensive research on emotion in the workplace and on information and communication technology (ICT) implementation, the emotionality of ICT implementation, and of change management more generally, has largely been ignored, even though the emotional experience of such processes is critical to their success. Emotion recognition is one of the most important factors in emotional ICT. In most state-of-the-art emotion recognition technologies, features are extracted from images, speech and bio-signals and are classified into specific emotional categories based on pre-trained recognition models.

In this paper, we suggest an emotion affective color transfer system. To recognize human emotion, we use feature-based facial expression recognition. The suggested system adapts image contents to an individual's emotion. To emotionally transfer image contents, we analyze color, shape and stroke information. The emotion affective system can be applied to various multimedia contents such as music, movies and games.

2 Previous Works

One of the means of showing emotion is through changes in facial expressions.
Apart from the six basic emotions (love, joy, surprise, anger, sadness, fear), the human face is capable of displaying expressions for a variety of other emotions. In 2000, Parrott identified 136 emotional states that humans are capable of displaying and categorized them into separate classes and subclasses [1].

ISSN: 2287-1233 ASTL Copyright 2013 SERSC

In more recent years, there have been attempts at recognizing expressions other than the six basic ones. One technique for recognizing non-basic expressions is to automatically recognize the individual Action Units (AUs), which in turn helps in recognizing finer changes in expression. An example of such a system is Tian et al.'s AFA system [2]. Most of the developed methods attempt to recognize the basic expressions, and some attempt to recognize non-basic expressions. However, there have been very few attempts at recognizing the temporal dynamics of the face. Temporal dynamics refers to the timing and duration of facial activities. The important terms used in connection with temporal dynamics are onset, apex and offset [3].

Having discussed emotions and the associated facial expressions, let us now take a look at facial features. Facial features can be classified as permanent or transient. Permanent features, such as the eyes, lips, brows and cheeks, are always present. Transient features include facial lines, brow wrinkles and deepened furrows that appear with changes in expression and disappear on a neutral face. Tian et al.'s AFA system uses recognition and tracking of permanent and transient features in order to automatically detect AUs.

3 Emotion Affective Image Transfer

Fig. 1. The overview of emotion affective image transfer
The suggested system consists of human emotion modeling, facial expression based emotion recognition and emotion affective image transfer. To define human emotion, we use a 2D emotional model (pleasant-unpleasant and arousal-relaxation). An individual's emotion is detected using feature-based facial expression recognition. We then transfer the colors of the input image to an emotion affective image using an emotional color wheel.

3.1 Human Emotion Modeling

Emotion can be differentiated from a number of similar constructs within the field of affective neuroscience. Feelings are best understood as a subjective representation of emotions, private to the individual experiencing them. Moods are diffuse affective states that generally last much longer than emotions and are usually less intense. Affect is an encompassing term used to describe emotions, feelings and moods together, even though it is commonly used interchangeably with emotion.

In order to model personal emotions, we first classify individual personal sensibilities. Based on Russell's 2D emotional model [4], we locate each emotion on two axes: pleasant-unpleasant and arousal-relaxation. The emotional data express sensitivity by classifying it along the pleasant-unpleasant and arousal-relaxation axes. Each axis takes a value from -5 to 5; with these values, we represent emotional data by dividing Russell's 2D emotion model into nine quadrants and setting a standard sensitivity for each quadrant.

3.2 Facial Expression Recognition

To recognize human emotion, we use feature-based facial expression recognition. The suggested emotion detection system consists of facial area detection, feature point extraction and emotional facial expression recognition. To detect the facial area, we use Haar-like features and the AdaBoost learning procedure [5]. Facial feature points are extracted using Active Shape Models (ASM) [6].
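The nine-quadrant discretization of Russell's model in Section 3.1 can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper does not give the band boundaries that separate the quadrants, so the neutral band used here is an assumption.

```python
def emotion_quadrant(pleasant: int, arousal: int) -> tuple:
    """Map a (pleasant-unpleasant, arousal-relaxation) pair, each in
    [-5, 5], to one of the nine quadrants of the 2D emotional model.
    The neutral band (|v| < 2) is an assumed threshold, not taken
    from the paper."""
    def bucket(v):
        if v < -5 or v > 5:
            raise ValueError("axis value must be in [-5, 5]")
        if v <= -2:
            return -1   # unpleasant / relaxed end of the axis
        if v >= 2:
            return 1    # pleasant / aroused end of the axis
        return 0        # neutral band
    return (bucket(pleasant), bucket(arousal))

# Example: strongly pleasant and strongly aroused -> quadrant (1, 1)
print(emotion_quadrant(4, 5))   # (1, 1)
print(emotion_quadrant(0, -3))  # (0, -1)
```

With 3 x 3 = 9 quadrants, each quadrant can then be assigned a standard sensitivity, as described above.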
The number of extracted features is reduced using Principal Component Analysis (PCA) [7]. To recognize emotion from the extracted features, we use a Support Vector Machine (SVM) [8], a standard machine learning classifier.

3.3 Emotional Color Transfer

We transfer the colors of the input image to an emotion affective image using an emotional color wheel [9]. The color palette of the input image is extracted from its color distribution. According to the personal emotion classified from the facial expression, an emotional color palette is calculated using the emotional color wheel. The emotion affective color image is then produced by mapping the color palette of the input image to the emotional color palette. Figure 2 shows an example result of the proposed emotional color transfer method.
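The palette mapping above can be sketched as a per-pixel hue shift toward a target hue drawn from an emotional color wheel. This is a hedged illustration, not the authors' implementation: the `EMOTION_HUES` table and the interpolation strength are assumptions, and a real system would map the extracted input palette to the emotional palette rather than shifting raw pixels.

```python
import colorsys

# Hypothetical emotional color-wheel targets (hue in [0, 1]). The paper
# builds on Plutchik's wheel [9]; these specific hues are assumptions.
EMOTION_HUES = {"joy": 0.14, "anger": 0.0, "sadness": 0.62, "fear": 0.33}

def transfer_pixel(rgb, emotion, strength=0.6):
    """Shift one RGB pixel's hue toward the emotion's target hue,
    keeping lightness and saturation. Components are floats in [0, 1];
    strength=0 leaves the color unchanged, strength=1 reaches the target."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    target = EMOTION_HUES[emotion]
    # Interpolate hue along the shorter arc of the color wheel.
    d = (target - h + 0.5) % 1.0 - 0.5
    h = (h + strength * d) % 1.0
    return colorsys.hls_to_rgb(h, l, s)

def emotional_color_transfer(image, emotion, strength=0.6):
    """Apply the hue shift to a whole image, given as a list of rows
    of (r, g, b) tuples; a stand-in for mapping the input palette
    onto the emotional palette."""
    return [[transfer_pixel(px, emotion, strength) for px in row]
            for row in image]
```

Because lightness and saturation are preserved, the transferred image keeps the structure of the input while its dominant hues move toward the emotional palette.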
Fig. 2. The result of emotional color transfer

4 Conclusion

In this paper, we have suggested an emotion affective color transfer system. To recognize human emotion, we define a 2D emotional model and use feature-based facial expression recognition based on machine learning. To emotionally transfer image contents, we convert the colors of the input image to an emotion affective image using an emotional color wheel. However, the suggested system transfers only the color information of the image contents. In future work, we will develop techniques for emotional shape and stroke transfer using shape and stroke information. The suggested emotion affective system can also be extended to various multimedia contents such as music, movies and games.

Acknowledgments. This research was supported by a Hanshin University Research Grant.

References

1. W.G. Parrott, Emotions in Social Psychology, Psychology Press, Philadelphia, October 2000.
2. Y. Tian, T. Kanade and J. Cohn, Recognizing Action Units for Facial Expression Analysis, IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 23, no. 2, pp. 97-115, 2001.
3. P. Ekman and E.L. Rosenberg, What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS), Illustrated Edition, Oxford University Press, 1997.
4. J. Russell, Pancultural Aspects of the Human Conceptual Organization of Emotions, J. of Personality and Social Psychology, vol. 45, pp. 1281-1288, 1983.
5. P. Viola and M. Jones, Robust Real-Time Face Detection, International Journal of Computer Vision, vol. 57, pp. 137-154, 2004.
6. T.F. Cootes, C.J. Taylor, D.H. Cooper and J. Graham, Active Shape Models - Their Training and Application, Computer Vision and Image Understanding, vol. 61, pp. 38-59, 1995.
7. H. Abdi and L.J. Williams, Principal Component Analysis, Wiley Interdisciplinary Reviews: Computational Statistics, pp. 433-459, 2010.
8. C. Cortes and V.N. Vapnik, Support-Vector Networks, Machine Learning, vol. 20, 1995.
9. R. Plutchik, The Nature of Emotions, American Scientist, vol. 89, no. 4, p. 344, 2001.