An Adaptable Fuzzy Emotion Model for Emotion Recognition

Natascha Esau, Lisa Kleinjohann, Bernd Kleinjohann
C-LAB, Fuerstenallee 11, D-33102 Paderborn, Germany
e-mail: {nesau, lisa, bernd}@c-lab.de

Abstract

Existing emotion recognition applications usually distinguish between a small number of emotions. However, this set of so-called basic emotions varies from one application to another, depending on their respective needs. In order to support such differing application needs, an adaptable emotion model based on the fuzzy hypercube is presented. In addition to existing models, it also supports the recognition of derived emotions, which are combinations of basic emotions. We show the application of this model in a prosody-based fuzzy emotion recognition system.

Keywords: Fuzzy emotion model, fuzzy hypercube, fuzzy emotion recognition, basic emotion.

1 Introduction

Emotions are an evident part of interactions between human beings. However, emotions also play a major role in interactions of humans with computer systems, since humans can never entirely switch off their emotions. In recent years, interest in emotions has increased considerably in various domains of computer-based systems. Examples are robots or virtual agents that show emotions, or human-computer interfaces that consider human emotions in their interaction capabilities. In Japan an entire stream called KANSEI information processing [10] deals with subjective human feelings when interacting with IT systems. These few examples already reveal two major tasks of emotion processing in IT systems: the recognition of human emotions and the (re)production of artificial emotions.

Whereas robots or virtual agents often show emotions themselves, for many other IT systems the recognition of human emotions and appropriate reactions suffice to improve the system's performance or acceptance. Imagine, for instance, a user who is angry because the IT system does not behave in the expected way, or because she has tried several times to accomplish a task without success. In such a situation it would be very helpful, and would increase the system's acceptance, if the IT system could recognize this emotion and react accordingly. Another example is speech recognition. According to investigations at MIT, a doubling of the word error rate to about 32% was observed when people talk in an angry way [4]. In such a case an appropriate system reaction would be to redirect the user to a human operator or to give hints on how she could decrease the error rate.

Depending on the intended application domain, different emotions are relevant for emotion recognition. In line with the observation described above, the speech recognition system Mercury distinguishes only two classes of emotions: frustration and neutral. For other applications such as personal or entertainment robots, the recognition of further emotions like happiness or sadness would certainly be interesting in order to react with appropriate robot behavior. Since psychologists themselves have not yet agreed upon a set of basic emotions (see Section 2), it is unlikely that a single set of emotions can be identified that is appropriate for all computer-based emotion recognition (CBER) systems. Therefore, in this paper we propose an emotion model for emotion recognition that is easily adaptable to the set of basic emotions selected for the CBER problem at hand (see Section 3).

However, humans do not only feel basic emotions in their pure form but also more complex or derived emotions [16]. An example is curiosity, which according to experiments by Plutchik is a combination of acceptance and surprise. Accounting for this observation, our emotion model supports not only basic emotions but also the representation of such derived emotions or blends. Furthermore, different intensities or degrees of emotions can be observed [16]. Fuzzy logic is a very useful approach for modelling different degrees of membership to different classes in a classification system, in many application domains including emotion recognition. Therefore we developed our adaptable emotion model using the fuzzy hypercube as its basis (see Section 3). We show the applicability of our approach with a system for emotion recognition from the prosody of natural speech in Section 4. Afterwards we compare our approach with related work and give a short conclusion.

2 Emotion Models in Psychology

Psychologists have tried to explain the nature of human emotions for decades or even centuries. Nevertheless, no unique established emotion model exists. However, emotion is now usually seen as a dynamic process that involves several modalities like motoric expression, physiological arousal and subjective feeling [13, 9]. For computer-based emotion recognition (CBER), however, models that help with the classification of emotions are more important. Among these, two major types of emotion models can be distinguished (mixtures of both types are also found): models that rely on basic emotions, and models that classify emotions according to different dimensions like valence, potency, arousal, intensity, etc. The first type has a major advantage for CBER, since it considerably decreases recognition complexity due to the small number of basic emotions to which the CBER can be restricted.

A well-known earlier model of basic emotions is the work of Plutchik [16]. He uses basic emotions as a kind of building block for derived, so-called secondary emotions. Plutchik even distinguishes ternary emotions that are combinations of secondary derived emotions. Like many others, his model also describes the concept of emotion intensity, which represents the strength with which an emotion is felt. Although emotion models have existed for several decades, even now there is no general agreement among psychologists on how many basic emotions exist and what they are. This is shown in Table 1, which is an excerpt from [15].

Table 1: Basic emotions distinguished by psychologists

  Plutchik: acceptance, anger, anticipation, disgust, joy, fear, sadness, surprise
  Ekman, Friesen, Ellsworth: anger, disgust, fear, joy, sadness, surprise
  Frijda: desire, happiness, interest, surprise, wonder, sorrow
  Izard: anger, contempt, disgust, distress, fear, guilt, interest, joy, shame, surprise
  James: fear, grief, love, rage
  Mowrer: pain, pleasure
  Oatley and Johnson-Laird: anger, disgust, anxiety, happiness, sadness

Due to the variety of basic emotions described in the literature, it seems reasonable to develop an emotion model for emotion recognition that is easily adaptable to the set of basic emotions selected for the CBER problem.

3 Fuzzy Emotion Model

As already stated, according to psychologists like Plutchik, humans do not only feel a single basic emotion but have more complex emotional states, in which more than one basic emotion is involved with varying strength or intensity. Therefore we propose a fuzzy classification of emotional states using fuzzy hypercubes [12]. Furthermore, we assume that the intensity of an emotion can be mapped to the interval [0,1]. First we define a fuzzy set corresponding to an emotional state and then show how it is represented in a fuzzy emotion hypercube.
Fuzzy set for emotional state. Let $BE$ be a finite base set of $n$ basic emotions $e_1, e_2, \ldots, e_n$ and $\{\mu_{FE_j} : BE \to [0,1],\ j = 1, 2, \ldots\}$ an infinite set of fuzzy membership functions. Then each

$FE_j := \{(e_i, \mu_{FE_j}(e_i)) \mid e_i \in BE\}, \quad j = 1, 2, \ldots$

defines a fuzzy set corresponding to one emotional state $E_j$.

Fuzzy emotion hypercube. If $BE$, $\mu_{FE_j}$ and $FE_j$ are defined as described above, we use the membership vector

$(\mu_{FE_j}(e_1), \mu_{FE_j}(e_2), \ldots, \mu_{FE_j}(e_n)) =: (\mu_{FE_j}(e_i))$

to denote a point in an $n$-dimensional hypercube. Each axis of the hypercube corresponds to one basic emotion $e_i$. Thus a membership vector $(\mu_{FE_j}(e_i))$ corresponds to one emotional state $E_j$ and can be interpreted psychologically as a vector of emotion intensities $(I_{e_i}) := (I_{e_1}, I_{e_2}, \ldots, I_{e_n})$. The number of distinguished emotions depends on the psychological theory or, in the case of computer-based emotion recognition, on the intended application. If, for instance, the three basic emotions happiness $h$, anger $a$ and surprise $s$ shall be distinguished, a three-dimensional unit cube as depicted in Figure 1 is needed for modelling emotional states.

Figure 1: Fuzzy unit cube for the three emotions happiness, surprise and anger

The corners of the unit cube describe crisp memberships (0 or 1) for all emotions; the edges describe crisp memberships for two emotions while the third varies from 0 to 1. For example, the point $E_1 = (1.0, 0.2, 0.3)$ corresponding to the fuzzy set $FE_1 = \{(h, 1.0), (a, 0.2), (s, 0.3)\}$ represents a happy emotional state. The point $E_2 = (0.2, 1.0, 0.9)$ corresponding to $FE_2 = \{(h, 0.2), (a, 1.0), (s, 0.9)\}$ clearly represents an emotional state for an emotion derived from anger and surprise. The point $(0, 0, 0)$ represents the entirely neutral state where no emotion is present.

Figure 2: Subdivisions of the unit cube representing basic and derived emotions

Figure 2 shows how the unit cube can be further divided in order to represent basic emotions and their mixtures. In the subcubes denoted by a single emotion, the membership function of this emotion takes values in the interval [0.5, 1.0], whereas the membership values of the other emotions, i.e. their intensities, are below 0.5. It is therefore reasonable to associate such a subcube with this basic emotion. In the subcubes denoted by a sum of two emotions (e.g. Surprise + Happiness), the memberships of these emotions are in the interval [0.5, 1.0], whereas the membership of the third emotion is below 0.5. Hence an emotion derived from these two basic emotions (e.g. surprise and happiness) is assumed. The subcube where the membership values of all basic emotions are between 0.5 and 1.0 is denoted by the sum of all three emotions (Anger + Surprise + Happiness). If a general $n$-dimensional emotion hypercube is regarded, certainly not all combinations of up to $n$ emotions make sense. However, whether a combination is reasonable or not is a psychological question. If a combination that does not make sense is recognized by a CBER system, this could for instance indicate an error.
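A minimal Python sketch of this subcube interpretation (illustrative only; the helper name classify_state and the 0.2 neutral threshold are assumptions of the sketch, the model itself only places the neutral region near the origin):

```python
from typing import Dict, List


def classify_state(memberships: Dict[str, float],
                   active_threshold: float = 0.5,
                   neutral_threshold: float = 0.2) -> str:
    """Map a point of the fuzzy emotion hypercube to a subcube label.

    memberships: membership value (= emotion intensity) per basic emotion,
                 each in [0, 1]; the keys span the chosen set of basic emotions.
    Returns the name of a basic emotion, a '+'-joined blend of basic emotions,
    or 'neutral' for points near the origin.
    """
    # Neutral state: all intensities are small, the point lies near (0, ..., 0).
    if all(v < neutral_threshold for v in memberships.values()):
        return "neutral"

    # Active emotions: a membership in [0.5, 1.0] selects the subcube axis/axes.
    active: List[str] = [e for e, v in memberships.items() if v >= active_threshold]

    if not active:
        # No membership reaches 0.5: fall back to the strongest basic emotion.
        return max(memberships, key=memberships.get)
    if len(active) == 1:
        return active[0]                  # subcube of a pure basic emotion
    return " + ".join(sorted(active))     # subcube of a derived emotion (blend)


# The two example points from Figure 1:
print(classify_state({"happiness": 1.0, "anger": 0.2, "surprise": 0.3}))  # happiness
print(classify_state({"happiness": 0.2, "anger": 1.0, "surprise": 0.9}))  # anger + surprise
```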
4 Application

This section deals with the application of our adaptable emotion model in the fuzzy rule based emotion recognition system PROSBER [2].

4.1 Overview of PROSBER

PROSBER recognizes emotions from the prosody of natural speech. It takes single sentences as input and classifies them into the emotion categories happiness, sadness, anger and fear. Furthermore, a neutral emotional state is distinguished. PROSBER automatically generates the fuzzy models for emotion recognition. Accordingly, two working modes are distinguished, training and recognition, as depicted in Figure 3.

Figure 3: Architecture of PROSBER

During training, training samples with well-known emotion values are used to create the fuzzy models for the individual emotions. For that purpose, sequences of acoustic parameters such as fundamental frequency or jitter are extracted. PROSBER extracts about twenty parameters that have shown their relevance for emotion recognition in psychological studies or in other speech-based emotion recognition systems. The sequences of these acoustic parameters are summarized by statistical analysis steps performed during feature calculation. The fuzzy model generation is based on a fuzzy grid approach [11]. It performs the following three steps on the training database: first, the membership functions for every feature are generated; afterwards, for each emotion up to six most significant features are selected; and then the fuzzy rule system for each emotion is generated. These fuzzy models are used in the emotion recognition process to classify unknown audio data. A detailed description of PROSBER can be found in [2].
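The fuzzy-grid rule construction can be illustrated by the following simplified sketch. It is not the PROSBER implementation: the sketch assumes uniformly spaced triangular membership functions, product as the AND operator, and a consequent chosen as the intensity label closest to the compatibility-weighted mean of the training intensities, whereas PROSBER additionally tunes the membership function coordinates and selects the most significant features during training.

```python
from itertools import product
from typing import Dict, Sequence, Tuple

LABELS = ["verylow", "low", "medium", "high", "veryhigh"]
CENTERS = dict(zip(LABELS, [0.0, 0.25, 0.5, 0.75, 1.0]))  # assumed uniform grid


def triangular(x: float, center: float, width: float = 0.25) -> float:
    """Triangular membership function with its peak at `center`."""
    return max(0.0, 1.0 - abs(x - center) / width)


def generate_rules(samples: Sequence[Tuple[Sequence[float], float]],
                   n_features: int) -> Dict[Tuple[str, ...], str]:
    """Simplified fuzzy-grid rule generation for one emotion.

    samples: (feature vector scaled to [0,1], known emotion intensity) pairs.
    Returns a mapping from an antecedent label combination (one rule per grid
    cell, combined only with AND) to the consequent intensity label.
    """
    rules: Dict[Tuple[str, ...], str] = {}
    for antecedent in product(LABELS, repeat=n_features):
        weighted, total = 0.0, 0.0
        for features, intensity in samples:
            # Compatibility of the sample with this grid cell (AND as product).
            w = 1.0
            for f, label in zip(features, antecedent):
                w *= triangular(f, CENTERS[label])
            weighted += w * intensity
            total += w
        if total > 0.0:
            mean_intensity = weighted / total
            # Consequent: the intensity label whose center is closest to the mean.
            rules[antecedent] = min(LABELS, key=lambda l: abs(CENTERS[l] - mean_intensity))
    return rules


# Tiny illustrative training set with two features per sample:
toy = [([0.9, 0.8], 0.9), ([0.1, 0.2], 0.1), ([0.5, 0.6], 0.5)]
rules = generate_rules(toy, n_features=2)
print(rules[("veryhigh", "veryhigh")])  # -> 'veryhigh'
```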

4.2 Fuzzy Emotion Recognition in PROSBER

In order to describe the application of our emotion model, we shall now have a closer look at the fuzzy classification. In principle it is structured as depicted in Figure 4: the feature vectors are fuzzified, one rule base per emotion is evaluated by fuzzy inference, and defuzzification yields one intensity per emotion, from which the maximum is selected.

Figure 4: Principal structure of the fuzzy classification

For each basic emotion $e_i$, $i = 1, \ldots, n$, a separate rule set is generated by an adapted fuzzy grid method. Each rule takes the fuzzified features $f_j$, $j = 1, \ldots, K$, $K \le 6$, as input and produces a fuzzy emotion value $I_{e_i}$ as output. We represent each feature $f_j$ and each emotion intensity $I_{e_i}$ by five triangular membership functions verylow, low, medium, high and veryhigh, as schematically depicted in Figure 5. However, the actual start and end coordinates as well as the maximum coordinates are generated automatically during the training phase. This representation is simple enough to support real-time emotion recognition, yet allows distinguishing the degree to which a feature or emotion is present in the current input sentence. Furthermore, it is in line with the approaches of psychologists, who often use two to ten levels for characterizing psychological phenomena like emotion intensities.

Figure 5: Membership functions for features and emotions

The rule set for the emotion $e_i$ is generated by a fuzzy grid approach [11]. Since this approach uses only the AND connector, it generates $5^{K+1}$ rules of the following form:

IF $f_1$ IS verylow AND ... AND $f_K$ IS verylow THEN $I_{e_i}$ IS veryhigh
IF $f_1$ IS verylow AND ... AND $f_K$ IS low THEN $I_{e_i}$ IS veryhigh
IF $f_1$ IS verylow AND ... AND $f_K$ IS medium THEN $I_{e_i}$ IS medium
...

The number of rules could be reduced if the OR connector or rule pruning could be used. However, neither feature is yet supported by the fuzzy library we use. For the defuzzification of emotion values we use the center of gravity (COG) method. By projecting the COG onto the x-axis we calculate the corresponding emotion intensity. Hence a four-dimensional vector $(I_h, I_s, I_a, I_f) = (\mu_E(h), \mu_E(s), \mu_E(a), \mu_E(f))$ containing the intensities of the four emotions happiness $h$, sadness $s$, anger $a$, and fear $f$ is generated. This vector represents the membership values for each emotion and hence determines a point in the four-dimensional emotion hypercube.
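The defuzzification step can be read as in the following sketch (an illustration under the assumptions of Mamdani-style max-min aggregation and uniformly placed triangular output sets; the actual membership function coordinates in PROSBER are generated during training):

```python
import numpy as np

LABELS = ["verylow", "low", "medium", "high", "veryhigh"]
CENTERS = dict(zip(LABELS, [0.0, 0.25, 0.5, 0.75, 1.0]))  # assumed peak positions


def triangular(x: np.ndarray, center: float, width: float = 0.25) -> np.ndarray:
    """Triangular membership function with its peak at `center` (vectorised)."""
    return np.maximum(0.0, 1.0 - np.abs(x - center) / width)


def cog_intensity(rule_activations: dict) -> float:
    """Defuzzify an emotion intensity by the centre-of-gravity (COG) method.

    rule_activations: maximum firing strength per output label, e.g. the result
    of evaluating the rule base of one emotion for a fuzzified feature vector
    (Mamdani-style min/max inference is assumed in this sketch).
    """
    x = np.linspace(0.0, 1.0, 1001)
    aggregated = np.zeros_like(x)
    for label, strength in rule_activations.items():
        # Clip each output set at its firing strength and aggregate by max.
        aggregated = np.maximum(aggregated,
                                np.minimum(strength, triangular(x, CENTERS[label])))
    if aggregated.sum() == 0.0:
        return 0.0
    # Projection of the centre of gravity onto the x-axis = emotion intensity.
    return float((x * aggregated).sum() / aggregated.sum())


# Example: the rules for one emotion fire mostly on 'high' and a little on 'medium'.
print(round(cog_intensity({"high": 0.7, "medium": 0.2}), 2))
```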

Presently, PROSBER recognizes a single basic emotion. In order to select this emotion, we determine the emotion $e_{rec} \in \{h, s, a, f\}$ with maximum intensity $I_{e_{rec}} = \max\{I_h, I_s, I_a, I_f\}$. If the maximum cannot be determined unambiguously because two or more intensity values are maximal, the emotion that was recognized for the previous sentence is selected. The neutral emotional state is identified by the part of the hypercube near the origin, as depicted in Figure 2. We plan to extend PROSBER to the recognition of combined emotions as described above.
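The selection step can be summarized by the following sketch (illustrative only; the function name select_emotion is ours, and the neutral region near the origin is approximated here by a simple threshold on all intensities):

```python
from typing import Dict, Optional


def select_emotion(intensities: Dict[str, float],
                   previous: Optional[str] = None,
                   neutral_threshold: float = 0.2,
                   eps: float = 1e-9) -> str:
    """Pick a single emotion label from the defuzzified intensity vector.

    intensities: e.g. {'happiness': I_h, 'sadness': I_s, 'anger': I_a, 'fear': I_f}.
    previous: emotion recognised for the previous sentence, used as tie-breaker.
    The neutral part of the hypercube near the origin is approximated here by a
    simple threshold on all intensities (an assumption of this sketch).
    """
    # Neutral state: the point lies near the origin of the emotion hypercube.
    if all(v < neutral_threshold for v in intensities.values()):
        return "neutral"

    best = max(intensities.values())
    candidates = [e for e, v in intensities.items() if abs(v - best) <= eps]

    if len(candidates) > 1 and previous in candidates:
        return previous   # ambiguous maximum: keep the previously recognised emotion
    return candidates[0]


print(select_emotion({"happiness": 0.8, "sadness": 0.1, "anger": 0.8, "fear": 0.0},
                     previous="anger"))   # -> 'anger'
```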
5 Related Work

Up to now, a variety of emotion models have been described in the literature. They are mainly dedicated to computer-based emotion (re)production or simulation in different application domains. A broad application domain are virtual agents that show (pseudo)emotional behavior in their communication with humans [3, 8, 5, 7]. They rely on a dimensional model of emotions based on the event-appraisal emotion model of Ortony et al. [14]. They usually distinguish the dimensions pleasure, arousal and dominance and try to maintain their dynamics over time. The PETEEI system [7] for simulating a pet's evolving emotional intelligence uses fuzzy sets for emotion representation, similarly to our system. However, this system differs from our approach since it associates certain types of events with positive or negative feelings in order to react with corresponding emotions, whereas our approach is dedicated to emotion recognition from certain features (of speech, facial expression, etc.).

Emotion recognition as investigated in our approach is to some extent covered by Kismet [6]. However, Kismet recognizes intentions rather than emotions. The emotional system developed for AIBO and SDR [1], like our model, uses basic emotions. Since it is intended for the production of emotional behavior, it uses the dimensions pleasure and arousal as mentioned above, but the dominance dimension is replaced by a confidence dimension representing the certainty of recognized external stimuli. The models described above only deal with single emotions and, unlike our approach, do not allow the representation of combinations or blends of emotions. The Cathexis model [17] supports this feature and is also adaptable to different sets of basic emotions, or emotion families as they are called there, by supporting the coexistence of several active so-called emotion proto-specialists representing different emotion families. However, Cathexis is also dedicated to the (re)production of emotional behavior in synthetic agents, whereas our adaptable emotion model is intended for emotion recognition.

6 Conclusion and Outlook

This paper presented an adaptable emotion model for emotion recognition. It uses the concept of an $n$-dimensional fuzzy hypercube to represent emotional states made up of $n$ basic emotions. In contrast to other approaches, this allows not only the representation and recognition of a fixed set of basic emotions but also supports the handling of derived emotions. We showed the application of this model using the fuzzy prosody-based emotion recognition system PROSBER. As a first step we proposed a division of the unit hypercube into equally sized subcubes to distinguish basic emotions and their combinations or blends. An interesting point for further investigation is whether this subdivision corresponds to human recognition. This could for instance be done using a learning approach that automatically finds such subdivisions and compares them with human interpretations of the corresponding emotional states.

References

[1] R. C. Arkin, M. Fujita, T. Takagi, and R. Hasegawa. An ethological and emotional basis for human-robot interaction. Robotics and Autonomous Systems, vol. 42, no. 3, pages 191-201. Elsevier Science, 2003.
[2] A. Austermann, N. Esau, L. Kleinjohann, and B. Kleinjohann. Prosody based emotion recognition for MEXI. In Proceedings of the IEEE/RSJ Int. Conference on Intelligent Robots and Systems (IROS 2005), Edmonton, Alberta, Canada, August 2005.
[3] J. Bates. The role of emotion in believable agents. Communications of the ACM, 37(7), pages 122-125, 1992.
[4] A. Boozer. Characterization of Emotional Speech in Human-Computer Dialogues. M.Sc. Thesis, MIT, 2003.
[5] C. Breazeal. Affective interaction between humans and robots. In Proc. of ECAL 01, pages 582-591, Prague, 2001.
[6] C. Breazeal and L. Aryananda. Recognition of affective communicative intent in robot-directed speech. Autonomous Robots, 12, pages 83-104. Kluwer Academic Publishers, 2002.
[7] M. S. El-Nasr, J. Yen, and T. Ioerger. FLAME - a fuzzy logic adaptive model of emotions. Autonomous Agents and Multi-Agent Systems, 3, pages 219-257, 2000.
[8] C. Elliot. The Affective Reasoner: A Process Model of Emotions in a Multi-Agent System. Ph.D. Thesis, Institute for the Learning Sciences, Northwestern University, Evanston, IL, 1992.
[9] N. H. Frijda. The Emotions. Cambridge University Press, 1986.
[10] S. Hashimoto. Kansei as the third target of information processing and related topics. In Proceedings of the Intl. Workshop on Kansei Technology of Emotion, pages 101-104, 1997.
[11] H. Ishibuchi and T. Nakashima. A study on generating fuzzy classification rules using histograms. In Knowledge-Based Intelligent Electronic Systems, vol. 1. Prentice Hall, 1998.
[12] B. Kosko. Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence. Prentice Hall, Englewood Cliffs, NJ, 1992.
[13] R. S. Lazarus. Emotion and Adaptation. Oxford University Press, 1991.
[14] A. Ortony, G. Clore, and A. Collins. The Cognitive Structure of Emotions. Cambridge University Press, 1988.
[15] A. Ortony and W. Turner. What's basic about basic emotions? Psychological Review, pages 315-331, 1990.
[16] R. Plutchik. The Emotions. University Press of America, Inc., revised edition, 1990.
[17] J. Velasquez. Modeling emotions and other motivations in synthetic agents. In Proceedings of the AAAI Conference, pages 10-15, Providence, RI, 1997.