An Empathic Virtual Dialog Agent to Improve Human-Machine Interaction

Magalie Ochs (Université Paris 6, LIP6 / Orange Labs France), Catherine Pelachaud (Université Paris 8 / INRIA Rocquencourt), David Sadek (Orange Labs France)

ABSTRACT

Recent research has shown that virtual agents expressing empathic emotions toward users have the potential to enhance human-machine interaction. To identify the circumstances under which a virtual agent should express empathic emotions, we have analyzed real human-machine dialog situations that led users to express emotion. The results of this empirical study have been combined with theoretical descriptions of emotions to construct a model of empathic emotions. Based on this model, a module of emotions has been implemented as a plug-in for JSA agents. It determines the empathic emotions (their types and intensity) of such agents in real time. It has been used to develop a demonstrator in which users interact with an empathic dialog agent to obtain information on their e-mails. An evaluation of this agent has enabled us both to validate the proposed model of empathic emotions and to highlight users' positive perception of the virtual agent.

1. INTRODUCTION

A growing interest in using virtual agents as interfaces to computational systems has been observed in recent years, motivated by an attempt to enhance human-machine interaction. Humanoid-like agents are generally used to embody roles typically performed by humans, such as a tutor [13]. The expression of emotions can increase their believability by creating an illusion of life [3]. Recent research has shown that virtual agents' expressions of empathic emotions enhance users' satisfaction [14, 26], engagement [14], performance in task achievement [22], and the perception of the virtual agent [5, 23]. In our research, we are particularly interested in the use of virtual dialog agents in information systems.
Users interact using natural language to find out information on a specific domain. We aim to give such agents the capability to express empathic emotions toward users while dialoging, in order to improve the interaction [14, 26, 5, 23, 22]. Empathy is commonly defined as the capacity to put yourself in someone else's shoes to understand her emotions [20]. Being empathic assumes one is able to evaluate the emotional dimension of a situation for another person. To achieve this goal, a virtual agent should know in which circumstances which emotions may be felt. One way to endow a virtual dialog agent with empathic capabilities is to provide it with a representation of the conditions under which users' emotions are elicited during dialog. The agent can then deduce the emotions potentially felt by the user during the interaction. Several computational models of emotions include a representation of emotion elicitation conditions (for instance [7, 28]), which makes it possible to determine which emotions of the virtual agent are triggered during an interaction. Generally, researchers [7, 28, 10] use a specific cognitive psychological theory of emotion (mainly the OCC model [19]) to define the agent's emotions. In this approach, the authors assume that the emotions that may appear during interaction with one or several agents, and their conditions of elicitation, correspond to those described in the chosen theory. For an empathic virtual dialog agent, the emotions that should be modeled are those that may be felt by the user during the dialog.

[Cite as: An Empathic Virtual Dialog Agent to Improve Human-Machine Interaction, M. Ochs, C. Pelachaud and D. Sadek, Proc. of 7th Int. Conf. on Autonomous Agents and Multiagent Systems (AAMAS 2008), Padgham, Parkes, Müller and Parsons (eds.), May 2008, Estoril, Portugal. Copyright © 2008, International Foundation for Autonomous Agents and Multiagent Systems. All rights reserved.]
Our computational model of empathic emotions is based on a combined empirical and theoretical approach. An exploratory analysis of real human-machine dialogs that led users to express emotions has been carried out to identify the characteristics of emotional dialogic situations. Combined with the descriptions of emotions in cognitive psychological theories, the types and the conditions of elicitation of the emotions that may be triggered during human-machine dialogs have been defined. Our model of empathic emotions has been implemented as a plug-in for JSA agents (Jade Semantics Agents [15]). The JSA agent technology has been used to develop an empathic dialog agent, and a subjective evaluation of this agent has been performed. It enabled us both to validate the model of empathic emotions and to highlight the positive impact of an empathic virtual agent on human-machine interaction. The paper is structured as follows. After giving an overview of existing empathic virtual agent models (Section 2), we present our model of empathic emotions (Section 3). Section 4 describes the implementation of the empathic dialog virtual agent, followed by its evaluation (Section 5).

2. EXISTING EMPATHIC VIRTUAL AGENTS

Empathy in human-machine interaction can be considered in two ways: a user can feel empathic emotions toward a virtual agent (for instance in FearNot! [21]), or a virtual agent can express empathic emotions toward a user [7, 16, 28, 25]. In our research, we focus on the empathy of a virtual agent toward users. Most empathic virtual agents are based on the OCC model [19]. Consequently, only two types of empathic emotions are considered: happy-for and sorry-for. However, research in psychology suggests that the type of an empathic emotion toward a person is similar to the type of the emotion of that person [11]. Indeed, by empathy, someone may, for instance, feel fear for another person. Therefore, there exist as many types of empathic emotions as types of non-empathic ones: an empathic virtual agent should feel an empathic emotion of frustration for the user if it thinks the user is frustrated. In [7], the happy-for (respectively sorry-for) emotion is elicited by the empathic agent when a goal of another agent (virtual agent or user) is achieved (respectively fails). The empathic virtual agent has a representation of the other agent's goals, which it deduces from their emotional reactions. Consequently, the agent knows the other's goals only if they have been involved in an emotion elicitation, so the representation of the other agent's goals might be incomplete. In [28], the virtual agent expresses a happy-for (respectively sorry-for) emotion only if it detects a positive (respectively negative) emotional expression of its interlocutor. The agent's empathic emotions are in this case elicited by the perception of the expression of an emotion of another agent. Similarly, in [25], the virtual agent expresses empathy according to the user's emotions (frustration, calm or joy) recognized through physiological sensors. However, an empathic emotion can be elicited even if this emotion is not felt or expressed by the interlocutor [24]. Another approach consists in observing real interpersonal mediated interactions in order to identify the circumstances under which an individual expresses empathy and how it is displayed.
The system CARE (Companion Assisted Reactive Empathizer) was constructed to analyze users' empathic behavior during a treasure hunt game in a virtual world [16]. The results of this study are domain-dependent: the conditions of empathic emotion elicitation in the context of a game may not be transposable to another context (for example, one in which a user interacts with a virtual agent to find out information on a specific domain). Our method to create an empathic virtual agent is based on both a theoretical and an empirical approach. It consists in identifying, through cognitive psychological theories of emotion and through the study of real human-machine emotional dialogs, the situations that may elicit users' emotions. In the next section, we present our model of empathic emotions.

3. A MODEL OF EMPATHIC EMOTIONS

An empathic virtual dialog agent should express empathic emotions in situations in which the user potentially feels an emotion. The agent should therefore know the conditions of elicitation, the types, and the intensity of users' emotions during the dialog.

3.1 Theoretical Foundations

According to cognitive appraisal theories [31], emotions are triggered by a subjective interpretation of an event. This interpretation corresponds to the evaluation of a set of variables (called appraisal variables). When an event occurs (or is anticipated) in the environment, the individual evaluates it through this set of variables, whose values determine the type and the intensity of the elicited emotion. In our work, we focus on goal-based emotions [19]. We consider the following appraisal variables (extracted from [30]):

The consequences of the event on the individual's goals: an event may trigger an emotion only if the person thinks that it affects one of her goals. The consequences of the event on the individual's goals determine the elicited emotion.
For instance, fear is triggered when a survival goal is threatened or risks being threatened. Generally, failed or threatened goals elicit negative emotions, whereas achieved goals trigger positive ones.

The causes of the event: the causes of an event that leads to emotion elicitation may influence the type of the elicited emotion. For instance, a goal failure caused by another agent may trigger anger.

The consistency of the consequences with the expectations: the elicited emotion depends on the consistency between the current situation (i.e. the consequences of the occurred event on the individual's goals) and the situation expected by the individual.

The potential to cope with the consequences: the coping potential represents the capacity of an individual to deal with a situation that has led to a threatened or failed goal. It may influence the elicited emotion.

The interpretation of an event (i.e. the evaluation of the appraisal variables, and hence the elicited emotion) depends principally on the individual's goals and beliefs (about the event, its causes, its real and expected consequences, and her coping potential). This explains why distinct individuals react emotionally in different ways when faced with the same situation. In a dialog context, an event corresponds to a communicative act. Consequently, according to the appraisal theory of emotion [31], a communicative act may trigger a user's emotion if it affects one of her goals. To identify more precisely the dialogical situations that may lead a user to feel an emotion, we have analyzed real human-machine dialogs that led users to express emotions. We present the results of this study in the next section.

3.2 The Analysis of the Elicitation of Users' Emotions in Human-Machine Interaction

The analyzed dialogs come from two vocal applications in which users interact orally with a virtual dialog agent to find out information on a specific domain (the stock exchange, or restaurants in Paris).
First, the dialogs were annotated with the label "negative emotion" by two annotators. (Unfortunately, the dialog corpus did not cover situations that led users to express positive emotions.) The annotations were based on vocal and semantic cues of the user's emotions. Secondly, these dialogs were annotated with a particular coding scheme in order to highlight the characteristics of the dialogical situations that may elicit emotions in a human-machine context (for more details on the coding scheme, see [18]). The analysis of the annotated dialogs has enabled us to identify more precisely the characteristics of a situation that may lead to the elicitation of a negative emotion in human-machine interaction. Concerning the appraisal variable consequence of the event, a communicative act may trigger a user's negative emotion when it involves the failure of a user's intention. The cause of the event that seems to elicit a user's negative emotion is in some cases the dialog agent itself, because of a belief conflict on an intention (the agent thinks the user has an intention different from her actual one). In the negative emotional situations, the user's expectations seem to be inconsistent with the situation she observes. After the failure of her intention, the user sometimes tries to achieve it in another way (coping potential); in some cases, the user seems unable to cope with the situation. The results of this study on human-machine emotional dialogs are not sufficient on their own to construct a model of empathic emotions. Consequently, these results have been combined with the descriptions of emotions from appraisal theory in order to deduce the type and the intensity of the emotions that a user may experience during human-machine dialogs.

3.3 The Virtual Agent's Empathic Emotions

To identify the types of emotions a user may feel during human-machine interaction, we have explored the work of Scherer [30] and have tried to correlate his descriptions of the conditions of elicitation of each emotion type with the characteristics of emotional dialogical situations introduced above. We have chosen Scherer's approach because his model is constructed from an analysis of the consensual appraisal variables coming from different theories. In the OCC model, only a few appraisal variables are considered; for instance, the coping potential is not taken into account. A positive emotion is generally triggered when a goal is completed. More precisely, if the goal achievement was expected, an emotion of satisfaction is elicited, while if it was not expected, an emotion of joy appears [30]. In the human-machine dialogs, a user's goal achievement corresponds to the successful completion of her intention.
Generally, the user expects that the intentions underlying her communicative acts will be achieved. Therefore, we consider only the emotion of satisfaction: we suppose that the user may experience satisfaction when one of her intentions is completed. A goal failure generally triggers a negative emotion. If a situation does not match an individual's expectations, an emotion of frustration is elicited [30]. Consequently, the user may experience frustration when one of her intentions fails. An emotion of sadness appears when the individual cannot cope with the situation; if, on the other hand, she can cope with it, an emotion of irritation is elicited [30]. The user may feel sadness when she does not know any other action that enables her to carry out her failed intention. If an action can be performed by the user to complete her intention, she may experience irritation. When the goal failure is caused by another person, an emotion of anger may be elicited. In the dialog analysis described above, this situation may correspond to a user's intention failure caused by the dialog agent due to a belief conflict. The user may experience anger toward the agent when a belief conflict with the dialog agent has led to a goal failure. (In the human-machine dialogs we studied, we observed the user's and the agent's intentions with particular attention. An intention is defined as a persistent goal; for more details, see [29].) Based on research on emotion intensity [19, 9, 7, 28], we suppose that the intensity of the elicited emotion is positively correlated with the unexpectedness of the situation, with the effort invested by the user to try to carry out her intention, and with the importance for the user of achieving her intention (in the case of a positive elicited emotion) or of not having her intention fail (in the case of a negative elicited emotion), and negatively correlated with the user's coping potential.
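As a concrete illustration, the elicitation rules and intensity correlations above can be sketched as follows. This is a minimal sketch, not the authors' formalization: the situation attributes, their 0-1 scaling, and the equal weighting of the intensity factors are our own illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class DialogSituation:
    """Illustrative features of a dialog situation after a communicative act."""
    intention_achieved: bool           # did the user's intention succeed?
    achievement_expected: bool = True  # users generally expect success
    belief_conflict: bool = False      # failure caused by the agent's belief conflict
    can_cope: bool = False             # another action can still fulfil the intention
    unexpectedness: float = 0.5        # 0..1
    effort: float = 0.5                # 0..1, effort already invested by the user
    importance: float = 0.5            # 0..1, importance of the intention
    coping_potential: float = 0.5      # 0..1


def elicited_emotions(s: DialogSituation) -> list[str]:
    """Emotion types potentially felt by the user, following the rules above."""
    if s.intention_achieved:
        # Expected achievement -> satisfaction; unexpected -> joy [30].
        return ["satisfaction"] if s.achievement_expected else ["joy"]
    emotions = ["frustration"]        # failed intention vs. expectations
    if s.belief_conflict:
        emotions.append("anger")      # failure caused by the agent
    emotions.append("irritation" if s.can_cope else "sadness")
    return emotions


def intensity(s: DialogSituation) -> float:
    """Intensity: + unexpectedness, + effort, + importance, - coping potential."""
    return (s.unexpectedness + s.effort + s.importance
            + (1.0 - s.coping_potential)) / 4.0
```

For instance, a failed intention that the user cannot recover from would yield frustration and sadness; per the model, the empathic agent would then trigger empathic emotions of the same types toward the user.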
Of course, we cannot deduce the exact emotion felt by the user from this description of emotions. Other elements (such as mood, personality, and current emotions) influence the elicitation of an emotion. However, this approach enables us to provide the virtual agent with information on the dialogical situations that may trigger a user's emotion. Indeed, from these rules on emotion elicitation, the virtual agent can deduce the emotion potentially felt by the user during the dialog. This information is used to elicit the agent's empathic emotions: when the virtual dialog agent thinks the user potentially feels an emotion, an empathic emotion may be triggered toward her. For instance, if the agent thinks the user feels frustration in the current situation, the agent triggers an empathic emotion of frustration toward the user. The elicitation of empathic emotions and their intensity depend on different factors, for example the relation between the user and the agent. To capture this, we introduce a degree of empathy. It is positively correlated with the similarity and the relationship between the user and the virtual dialog agent. Indeed, as highlighted in [21], people experience more empathic emotions with persons with whom they share some similarities (for example the same age) or a particular relationship (such as friendship). The degree of empathy also depends on the degree to which the user deserves the situation or not: generally, people tend to be more pleased (respectively displeased) for others if they think the situation is deserved (respectively not deserved) [19]. The conditions of elicitation of empathic emotions have been formalized in terms of mental attitudes: they are represented by particular mental states, i.e. combinations of beliefs, uncertainties, and intentions.
This formalization enables a BDI-like agent [27, 29], which reasons and acts according to its mental state, to infer its empathic emotions during the interaction (for more details on the formalization, see [17]).

4. IMPLEMENTATION OF THE MODEL OF EMPATHIC EMOTIONS

Based on the model of empathic emotions described above, a module of emotions has been developed. It is a plug-in for JSA agents (Jade Semantics Agents), which are implemented within the JSA framework [15], itself a plug-in of the JADE platform (Java Agent DEvelopment Framework). The JSA framework, which is open-source [12], enables one to implement BDI-like dialog agents. The module of emotions we have implemented provides empathic capabilities to these agents: it dynamically identifies the empathic emotions (type and intensity) of the JSA agent toward its interlocutor.

4.1 A Module of Emotions

JSA agents are able to interpret the meaning of a received message and to respond to it automatically. The interpretation and reasoning processes are implemented through rules called SIPs (Semantic Interpretation Principles) [15]. The module of emotions is composed of a set of Java classes to represent emotions, two SIPs for emotion elicitation, specific methods to compute and update the intensity of emotions, and a graphic interface to visualize the agent's emotions and their intensity dynamics (Figure 1). Based on speech act theory [1], JSA agents use a model of communicative acts [29] to infer the user's beliefs and intentions concerning the dialog. For instance, when the user asks for information, the dialog agent deduces that the user has the intention to know the information, and the intention that the agent knows her intention to know the information. The agent also infers that the user believes that her intentions will be achieved following the enunciation of the communicative act. From these mental attitudes of the user, and given the rules on the elicitation of empathic emotions, the agent computes its empathic emotions toward the user. For instance, if the agent believes that one of the user's intentions has just failed, it infers that the user potentially feels an emotion of frustration, and an empathic emotion of frustration is triggered. If it thinks that no other action enables the user to achieve this intention, an empathic emotion of sadness is elicited. The intensity of an emotion is computed according to the values of the intensity variables introduced in the previous section. The intensity decreases when no emotion is elicited; the updating of the intensity when an emotion is triggered is inspired by the dynamic model of emotion proposed in [32].

4.2 EDAMS: An Empathic Dialog Agent in a Mail System

Using the JSA framework and the module of emotions introduced above, a demonstrator of an empathic dialog agent, called EDAMS (Empathic Dialog Agent in a Mail System), has been implemented. The user interacts with EDAMS to obtain information on her e-mails. She selects predefined sentences on the interface to dialog with the agent (Figure 2).
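The intensity dynamics described at the end of Section 4.1, decay when nothing is elicited and reinforcement when an emotion is triggered, can be sketched as below. The exponential decay rate and the saturating additive update are illustrative assumptions on our part; the actual module follows the dynamic model of [32].

```python
class EmotionIntensity:
    """Minimal sketch of one emotion's intensity over discrete time steps."""

    def __init__(self, decay_rate: float = 0.5):
        self.value = 0.0
        self.decay_rate = decay_rate  # fraction of intensity lost per step

    def tick(self) -> None:
        """No emotion elicited this step: intensity decays toward zero."""
        self.value *= 1.0 - self.decay_rate

    def trigger(self, elicited: float) -> None:
        """An emotion of the same type is elicited: reinforce, saturate at 1."""
        self.value = min(1.0, self.value + elicited)
```

For example, after `trigger(0.8)` the intensity is 0.8; two silent `tick()` steps later it has decayed to 0.2, and a further strong elicitation saturates it at 1.0.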
In order to display the empathic emotions, a 3D talking head is used (Figure 2). During the dialog, the module of emotions transfers to the talking head the type and intensity of the empathic emotion to display, and the talking head adopts the corresponding facial expression (Figure 3). (Little research has been done on the expression of empathy [6]; in our work, we suppose that the facial expression corresponding to an empathic emotion is similar to that of a non-empathic emotion of the same type.) To give EDAMS the capability to answer the user's requests, various pieces of information on the user's messages (type of message, sender, level of urgency, content, etc.) are added to the EDAMS database. A module has been developed to translate the user's natural-language request into FIPA-ACL [8], the language used by JSA agents to reason. Some values have to be fixed by the programmer to enable EDAMS to compute the intensity of emotions: the user's degree of certainty of achieving her intentions, the importance for the user that her intention is achieved and does not fail, and the agent's degree of empathy. These values may depend on the application context, the type of the intentions, and the user's characteristics. Let us illustrate our system through some examples of emotional interactions between a user and EDAMS. When EDAMS is started, if the user asks the system to close her mailbox (by selecting the sentence "Je voudrais fermer ma messagerie" (I would like to close my mailbox), Figure 2), an empathic emotion of frustration is expressed, because the user's intention cannot be achieved (the mailbox must first be opened). If the user asks EDAMS to connect her with another person (by selecting, for instance, the sentence "Peux-tu me mettre en contact avec Bobby?" (Can you put me in contact with Bobby?), Figure 2), EDAMS expresses an empathic emotion of sadness, because it cannot satisfy the user's request and thinks that no other action enables the user to complete this intention. If the user asks EDAMS to tell her the new messages (by selecting the sentence "Pourrais-tu me lire mes nouveaux messages?" (Could you read me my new messages?)), the talking head expresses an emotion of satisfaction and reads the user's new messages. When the user selects the sentence "No, it is not what I want", the agent supposes that a belief conflict on the user's intention has just occurred and displays an emotion of anger, to express the fact that it is angry with itself (because it supposes that the user is angry with it). (The most appropriate emotion to express when the agent thinks the user is angry with it is unclear; another possibility would be, for example, the expression of sadness, as the agent is sorry it could not help the user appropriately.) EDAMS has been used to evaluate the impact of an empathic virtual agent on human-machine interaction, and more precisely on the user's perception of the agent. In the next section, we present the experimental protocol and the results of the evaluation.

5. THE IMPACT OF AN EMPATHIC VIRTUAL AGENT ON THE USER'S PERCEPTION

Recent research has shown that virtual agents which express empathic emotions toward a user enhance human-machine interaction [5, 14, 22, 23, 26]. However, these few experiments with emotional agents have mostly been conducted in the context of games [5, 14, 22, 26]. Moreover, the results seem to depend on the culture of the subjects [4, 5], and they are sometimes contradictory [2, 22]. As highlighted by Becker et al. [4], the agent's expressions of emotions may even be harmful to the interaction when they are incongruous with the situation. To date, no research seems to have explored, in a French cultural context, the impact on the interaction of an empathic dialog agent used as an information system. Therefore, an evaluation of EDAMS had to be done to assess the effect of the agent's expressions of empathic emotions on the interaction. The evaluation of EDAMS was performed in order to test the following hypotheses: (1) when a user interacts with a virtual agent to find out information in a particular domain, she perceives the agent more positively when the latter expresses empathic emotions; (2) when a user interacts with a virtual agent to find out information in a particular domain, she perceives the agent more negatively when the latter expresses emotions incongruous with the dialog situations.

5.1 Design of the Experiment

In order to test our hypotheses, three versions of EDAMS have been developed:

Figure 1: Screenshot of the graphic interface of the module of emotions.
Figure 2: Screenshot of the interface of EDAMS.

1. the non-emotional version, used as a control condition, in which the virtual agent does not express any emotion;

2. the empathic version, in which the virtual dialog agent expresses empathic emotions through its facial expressions during the interaction with the user. This version corresponds to the one described in the previous section: the conditions of emotion expression are those defined in our model of empathic emotions;

3. the non-congruent emotional version, in which the virtual dialog agent expresses, through its facial expressions, emotions incongruous with the situations of the interaction. More precisely, the valence of the emotions expressed is the opposite of that in the empathic version. For instance, if, in a given situation, the virtual agent expresses an emotion of sadness in the empathic version, then, in the non-congruent emotional version, the agent expresses an emotion of satisfaction. In other words, in this version, the agent expresses a positive (respectively negative) emotion when the user potentially feels a negative (respectively positive) emotion.

In the three versions, the interface of the system (Figure 2), the verbal behavior of the agent, and its facial expressions are the same; only the conditions of emotion elicitation vary. Eighteen subjects (nine men and nine women) participated in the experiment. The average age was 35 (standard deviation = 11.86). No participant knew the research subject or EDAMS beforehand. During the test, each subject performed three sequences of four or five requests, one for each version of EDAMS. To achieve a request, the subject asked the virtual agent to execute an action by clicking on the corresponding sentence displayed on the interface (for instance "Can you read me my new messages?").
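The valence inversion used in the non-congruent condition can be sketched as follows. Since satisfaction is the only positive emotion in the model, every negative emotion maps to it; the choice of sadness as the counterpart of satisfaction follows the example in the text, but the mapping as a whole is our illustrative assumption.

```python
# Valence of the emotion types in the model (Section 3.3).
POSITIVE = {"satisfaction"}
NEGATIVE = {"frustration", "sadness", "irritation", "anger"}


def non_congruent(emotion: str) -> str:
    """Emotion of opposite valence displayed in the non-congruent version."""
    if emotion in POSITIVE:
        return "sadness"        # assumption: a representative negative emotion
    if emotion in NEGATIVE:
        return "satisfaction"   # the model's only positive emotion
    raise ValueError(f"unknown emotion type: {emotion}")
```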
The requests to achieve appeared on a screen to the right of the screen on which the EDAMS interface was displayed. The order of the sequences and of the versions was counterbalanced. The instructions for the test were presented to the participants at the beginning of the evaluation session; they described the aim of the test as a study of the participants' perception of the agent's facial expressions. After each sequence of tests, the subjects filled in a questionnaire that enabled us to collect information on their perception of the agent. The questionnaire evaluating the user's perception of the virtual agent is composed of 15 statements: 11 regarding the virtual agent (for example "She was pleasant") and 4 concerning more precisely her facial expressions (for example "I liked her facial expressions").

Figure 3: Facial expressions of the agent's empathic emotions.

Finally, the perception of the following aspects of the virtual agent was measured: pleasant, irritating, strange, compassionate, expressive, cold, jovial, boring, strict, cheerful, stressful, appreciation of the facial expressions, naturalness of the facial expressions, their perturbing aspect, and their exaggerated aspect. The participants indicated their agreement or disagreement with each statement by checking the box corresponding to their opinion on a 7-point Likert scale (from 1, "not agree at all", to 7, "fully agree"). At the end of the test, each participant received a gift token worth 15 euros. The test did not exceed 40 minutes per participant.

5.2 Results

The results for each of the 15 quality factors evaluated were analyzed separately. The distributions of the results are normal. A repeated-measures ANOVA and a Tukey HSD post-hoc test were applied. In the following, these abbreviations are used to describe the different versions: NE for the Non-Emotional version (no emotion displayed), E for the Empathic version, and NCE for the Non-Congruent Emotional version. The results are presented in Tables 1, 2 and 3. The first column indicates the quality factor studied and the first row the versions compared; each cell (the intersection of a row and a column) indicates the version in which the quality factor of the agent was rated significantly higher (n.s. means non-significant; *: p < .05, **: p < .01, ***: p < .001). For instance, in Table 1, the notation E at the intersection of the row jovial and the column NE-E means that, in the empathic version, the virtual agent was perceived as significantly more jovial (with p < .01) than in the non-emotional one.

User's perception of the virtual agent's positive quality factors.
The analysis of the results shows an effect of the version on the user's perception of the pleasant (F(2,34)=20.597, p<.001), compassionate (F(2,34)=7.44, p<.01), expressive (F(2,34)=4.6790, p<.05), jovial (F(2,34)=12.246, p<.001), and cheerful (F(2,34)=7.7887, p<.01) aspects of the virtual agent (no definition of these adjectives was provided to the subjects). The significant differences appear mainly between the empathic version and the non-emotional one, and between the non-congruent version and the empathic one. When the virtual agent expresses empathic emotions (positive and negative), it is perceived as more jovial, expressive, and cheerful than when it does not express any emotion. Moreover, when the same emotions are displayed in incongruous situations, the virtual agent is perceived as less pleasant, compassionate, expressive, jovial, and cheerful than when they are expressed by empathy (Table 1). In the tables below, each cell names the version perceived as higher on the corresponding factor; n.s. indicates no significant difference.

                   NE-E    NE-NCE    E-NCE
    pleasant       n.s.    NE        E
    jovial         E       n.s.      E
    expressive     E       n.s.      E
    cheerful       E       n.s.      E
    compassionate  n.s.    n.s.      E

Table 1: Comparison of the user's perception of the agent's positive quality factors in the different EDAMS versions.

User's perception of the virtual agent's negative quality factors. The results reveal an effect of the version on the user's perception of the irritating (F(2,34)=15.409, p<.001), strange (F(2,34)=12.518, p<.001), cold (F(2,34)=5.1405, p<.05), and stressful (F(2,34)=11.679, p<.001) quality factors of the virtual agent. Significant differences appear between the non-emotional version NE and the non-congruent one NCE, and between the empathic version E and the non-congruent one NCE. The virtual agent is perceived as more irritating, strange, cold, and stressful when it expresses emotions in incongruous situations than when it displays empathic emotions or no emotion (Table 2).

                   NE-E    NE-NCE    E-NCE
    irritating     n.s.    NCE       NCE
    strange        n.s.    NCE       NCE
    cold           n.s.    NCE       NCE
    boring         n.s.    n.s.      n.s.
    strict         n.s.    n.s.      n.s.
    stressful      n.s.    NCE       NCE

Table 2: Comparison of the user's perception of the agent's negative quality factors in the different versions.
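The F statistics reported above come from analyses of variance over the three versions of the agent (non-emotional NE, empathic E, non-congruent NCE). As a rough, self-contained illustration of how an F(2,34) value of this kind is computed (the raw ratings and exact design are not reproduced in the paper; the data and group sizes below are invented), a one-way ANOVA can be sketched as:

```python
# Illustrative only: a one-way ANOVA F statistic of the kind reported
# above, e.g. F(2,34) when three groups together contain 37 ratings.
# All ratings below are invented; they are not the study's data.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of numeric samples."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-group sum of squares: group means vs. the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: each score vs. its own group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    df_between, df_within = k - 1, n_total - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Invented 7-point "jovial" ratings for the three agent versions.
ne  = [3, 2, 4, 3, 3, 2, 4, 3, 3, 2, 3, 4, 3]   # non-emotional (13 raters)
e   = [5, 6, 5, 4, 6, 5, 5, 6, 4, 5, 6, 5]      # empathic (12 raters)
nce = [3, 2, 3, 4, 2, 3, 3, 2, 4, 3, 2, 3]      # non-congruent (12 raters)

f_stat, df1, df2 = one_way_anova([ne, e, nce])
print(f"F({df1},{df2}) = {f_stat:.2f}")
```

A significant F only indicates that at least one pair of versions differs; the pairwise entries in the tables (NE, E, NCE vs. n.s.) come from follow-up comparisons between version pairs.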

User's perception of the virtual agent's facial expressions. The results of the experiment show a significant effect of the version on the user's appreciation of the agent's facial expressions (F(2,34)=19.324, p<.001), and on her perception of the natural (F(2,34)=11.666, p<.001), perturbing (F(2,34)=14.880, p<.001), and exaggerated (F(2,32)=18.522, p<.001) aspects of the agent's facial expressions. The main significant differences appear between the non-emotional version and the non-congruent one, and between the empathic version and the non-congruent one. The facial expressions of emotions incongruous with the dialog situations are less appreciated than non-emotional ones or those expressed by empathy. The same facial expressions of emotion are perceived as more natural, and as less perturbing and exaggerated, when they are displayed in empathic situations than in incongruous ones (Table 3).

                   NE-E    NE-NCE    E-NCE
    appreciation   n.s.    NE        E
    natural        n.s.    NE        E
    perturbing     n.s.    NCE       NCE
    exaggerated    n.s.    NCE       NCE

Table 3: Comparison of the user's perception of the agent's facial expressions in the different EDAMS versions.

5.3 Discussion

Firstly, the evaluation enables us to compare the user's perception of the non-emotional virtual agent with that of the empathic one. The results show that empathic expressions of emotions do not impair the user's perception of the agent. Indeed, it does not appear more irritating, strange, cold, or stressful when it expresses empathic emotions than when it displays no emotion. Moreover, the facial expressions of emotions are not perceived as less natural, or as more perturbing or exaggerated, than non-emotional ones. Some significant differences have been observed: the virtual agent appears more expressive, jovial, and cheerful when it expresses both positive and negative empathic emotions than when it displays no emotion.
These results allow us to confirm our first hypothesis: when a user interacts with a virtual agent to find out information in a particular domain, she perceives the agent more positively when the latter expresses empathic emotions. Moreover, the results reveal that emotions expressed in incongruous dialog situations have a negative effect on the user's perception of the agent. Indeed, she perceives the virtual agent as less pleasant and as more irritating, strange, cold, and stressful than when the agent expresses no emotion. The facial expressions of emotions in this case seem more exaggerated and perturbing, and less natural, in comparison with neutral facial expressions. These results confirm our second hypothesis: when a user interacts with a virtual agent to find out information in a particular domain, she perceives the agent more negatively when the latter expresses emotions incongruous with the dialog situations. By comparing the user's perception across the conditions under which the agent expresses emotions, it appears that the global perception of the agent depends on the congruency between the dialog situations and the expressions of emotions. Indeed, when the emotional expressions are not congruent with the dialog situations, the agent is perceived more negatively. Moreover, the same facial expressions of emotions are perceived differently depending on the conditions of their expression: they seem less natural, and more exaggerated and perturbing, when they are not congruent with the dialog situations than when they are displayed by empathy. To conclude, the results of the evaluation show that a virtual agent which expresses emotions in incongruous situations is perceived more negatively than one that does not express any emotion. Inversely, when the agent expresses emotions under the conditions described in our model of empathic emotions, it is perceived more positively than when it does not express any emotion.
The expressions of emotions are, therefore, in this case, appropriate to the dialog situations. These results validate the conditions of empathic emotion elicitation defined in our model. They are relevant for determining which empathic emotions the agent should express in which circumstances in order to enhance the user's perception of the virtual agent.

6. CONCLUSION

In this paper, we have presented a model of empathic emotions for a virtual agent based on both empirical and theoretical approaches. It enables one to provide empathic capabilities to virtual agents, and more particularly to BDI-like agents. This model has been implemented as a plug-in for JSA agents (BDI-like dialog agents). By coupling a JSA agent with the module of emotions and a talking head that displays the empathic emotions, a demonstrator of an empathic dialog agent used as an information system has been developed in order to evaluate the impact of such an agent on the interaction with the user. The results of the evaluation show that the agent and its emotional facial expressions are perceived differently depending on the congruency between the displayed emotions and the dialog situations. Moreover, it appears that a virtual agent which expresses emotions by empathy (under the conditions described in the proposed model of empathic emotions) enhances the user's perception of the agent. These results, consistent with previous evaluations in other contexts [14, 26, 22, 5, 23], tend to promote the use of empathic virtual agents to improve human-machine interaction. However, to create a more complete model of empathic emotions, other user goals have to be taken into account. Indeed, in the research presented in this paper, only the user's intentions related to her communicative acts are used to model the conditions of emotion elicitation. Moreover, the model of empathic emotions and the module of emotions implemented are a priori domain-independent.
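The module of emotions computes, at run time, the type and intensity of the agent's empathic emotion from the dialog state. Purely as an illustrative sketch of what an appraisal-based plug-in interface of this kind could look like (every name, type, and rule below is invented for this example and is not the actual JSA implementation), one might write:

```python
# Hypothetical sketch of an empathy module for a BDI-like dialog agent.
# The paper's actual JSA plug-in is not reproduced here; all names and
# appraisal rules below are invented for illustration.

from dataclasses import dataclass

@dataclass
class DialogEvent:
    """Appraisal-relevant facts about the current dialog situation."""
    user_goal_achieved: bool   # did the user's communicative intention succeed?
    goal_importance: float     # 0..1, how much the goal matters to the user
    effort: float              # 0..1, effort the user has already invested

@dataclass
class EmpathicEmotion:
    kind: str                  # e.g. "joy", "sadness"
    intensity: float           # 0..1

def appraise(event: DialogEvent) -> EmpathicEmotion:
    # Toy appraisal rule: a congruent positive emotion when the user's
    # goal succeeds, a negative one otherwise; intensity grows with the
    # goal's importance and the effort invested.
    intensity = min(1.0, 0.5 * event.goal_importance + 0.5 * event.effort)
    if event.user_goal_achieved:
        return EmpathicEmotion("joy", intensity)
    return EmpathicEmotion("sadness", intensity)

emotion = appraise(DialogEvent(user_goal_achieved=False,
                               goal_importance=0.8, effort=0.6))
print(emotion.kind, round(emotion.intensity, 2))   # sadness 0.7
```

In this sketch the dialog manager would build a DialogEvent after each user act and pass the resulting emotion to the talking head for display.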
However, the model proposed here has been constructed based on an empirical analysis of human-machine dialogs in a specific context (in which the agent is used as an information system). The model should be evaluated in other contexts to ensure its domain independence.

7. ACKNOWLEDGMENT

We are very grateful to Thierry Martinez, Vincent Louis, and Gaspard Breton for their help with the implementation of the model of emotions in JSA agents, and to Valérie Maffiolo, Valérie Botherel, Morgane Roger, and Anne Batistello for their valuable help with the evaluation of EDAMS.

8. REFERENCES

[1] J. Austin. How to do things with words. Oxford University Press, London.
[2] C. Bartneck. eMuu - An Embodied Emotional Character for the Ambient Intelligent Home. PhD thesis, Eindhoven University of Technology.
[3] J. Bates. The role of emotion in believable agents. Communications of the ACM (CACM), 37(7).
[4] C. Becker, I. Wachsmuth, H. Prendinger, and M. Ishizuka. Evaluating affective feedback of the 3D agent Max in a competitive cards game. In J. Tao, T. Tan, and R. W. Picard, editors, Proceedings of the International Conference on Affective Computing and Intelligent Interaction (ACII), Beijing, China. Springer.
[5] S. Brave, C. Nass, and K. Hutchinson. Computers that care: Investigating the effects of orientation of emotion exhibited by an embodied computer agent. International Journal of Human-Computer Studies, 62.
[6] N. Chovil. Social determinants of facial displays. Journal of Nonverbal Behavior, 15.
[7] C. Elliot. The Affective Reasoner: A process model of emotions in a multi-agent system. PhD thesis, Northwestern University.
[8] FIPA-ACL.
[9] N. Frijda, A. Ortony, J. Sonnemans, and G. L. Clore. The complexity of intensity: Issues concerning the structure of emotion intensity. In Emotion: Review of Personality and Social Psychology. Sage.
[10] J. Gratch and S. Marsella. A domain-independent framework for modeling emotion. Journal of Cognitive Systems Research, 5(4).
[11] J. Hakansson. Exploring the phenomenon of empathy. PhD thesis, Department of Psychology, Stockholm University.
[12] JADE.
[13] W. Johnson, J. Rickel, and J. Lester. Animated pedagogical agents: Face-to-face interaction in interactive learning environments. International Journal of Artificial Intelligence in Education, 11:47-78.
[14] J. Klein, Y. Moon, and R. Picard. This computer responds to user frustration. In Proceedings of the Conference on Human Factors in Computing Systems, Pittsburgh, USA. ACM Press.
[15] V. Louis and T. Martinez. JADE Semantics Framework. In Developing Multi-agent Systems with JADE. John Wiley and Sons Inc.
[16] S. McQuiggan and J. Lester. Learning empathy: A data-driven framework for modeling empathetic companion agents. In P. Stone and G. Weiss, editors, Proceedings of the International Joint Conference on Autonomous Agents and Multi-Agent Systems (AAMAS), Hakodate, Japan.
[17] M. Ochs, C. Pelachaud, and D. Sadek. An empathic rational dialog agent. In International Conference on Affective Computing and Intelligent Interaction (ACII), Lisbon, Portugal.
[18] M. Ochs, C. Pelachaud, and D. Sadek. Emotion elicitation in an empathic virtual dialog agent. In Proceedings of the Second European Cognitive Science Conference (EuroCogSci).
[19] A. Ortony, G. Clore, and A. Collins. The cognitive structure of emotions. Cambridge University Press, United Kingdom.
[20] E. Pacherie. L'empathie et ses degrés. In L'empathie. Odile Jacob.
[21] A. Paiva, J. Dias, D. Sobral, S. Woods, and L. Hall. Building empathic lifelike characters: the proximity factor. In A. Paiva, R. Aylett, and S. Marsella, editors, Proceedings of the International Joint Conference on Autonomous Agents and Multi-Agent Systems (AAMAS), Workshop on Empathic Agents, New York, USA.
[22] T. Partala and V. Surakka. The effects of affective interventions in human-computer interaction. Interacting with Computers, 16.
[23] R. Picard and K. Liu. Relative subjective count and assessment of interruptive technologies applied to mobile monitoring of stress. International Journal of Human-Computer Studies, 65.
[24] I. Poggi. Emotions from mind to mind. In A. Paiva, R. Aylett, and S. Marsella, editors, Proceedings of the International Joint Conference on Autonomous Agents and Multi-Agent Systems (AAMAS), Workshop on Empathic Agents, New York, USA.
[25] H. Prendinger and M. Ishizuka. The empathic companion: A character-based interface that addresses users' affective states. International Journal of Applied Artificial Intelligence, 19.
[26] H. Prendinger, J. Mori, and M. Ishizuka. Using human physiology to evaluate subtle expressivity of a virtual quizmaster in a mathematical game. International Journal of Human-Computer Studies, 62.
[27] A. S. Rao and M. Georgeff. Modeling rational agents within a BDI-architecture. In Proceedings of the International Conference on Principles of Knowledge Representation and Reasoning (KR).
[28] S. Reilly. Believable Social and Emotional Agents. PhD thesis, Carnegie Mellon University.
[29] D. Sadek. Attitudes mentales et interaction rationnelle: vers une théorie formelle de la communication. PhD thesis, Université Rennes I.
[30] K. Scherer. Criteria for emotion-antecedent appraisal: A review. In V. Hamilton, G. Bower, and N. Frijda, editors, Cognitive Perspectives on Emotion and Motivation. Kluwer, Dordrecht.
[31] K. Scherer. Emotion. In M. Hewstone and W. Stroebe, editors, Introduction to Social Psychology: A European Perspective. Blackwell Publishers, Oxford.
[32] E. Tanguy, J. Bryson, and W. P. A dynamic emotion representation model within a facial animation system. Technical report, Department of Computer Science, University of Bath, England.


More information

10/9/2015 EMOTIONAL INTELLIGENCE EMOTIONAL INTELLIGENCE

10/9/2015 EMOTIONAL INTELLIGENCE EMOTIONAL INTELLIGENCE EMOTIONAL INTELLIGENCE Presented by: Judith A. Gissy PCC, LICDC, NCAC II, SAP CONCERN is a part of the Corporate Health division of the TriHealth Healthcare System. We Provide a range of services, including

More information

MEASUREMENT, SCALING AND SAMPLING. Variables

MEASUREMENT, SCALING AND SAMPLING. Variables MEASUREMENT, SCALING AND SAMPLING Variables Variables can be explained in different ways: Variable simply denotes a characteristic, item, or the dimensions of the concept that increases or decreases over

More information

Emotion Models and Affective Computing

Emotion Models and Affective Computing Intelligent User Interfaces affective computing Emotion Models and Affective Computing OCC emotion model Emotion recognition from bio-signals Research of the R. Picard s Affective Computing Group Empathic

More information

Emotional Intelligence

Emotional Intelligence Cornell Municipal Clerks Institute 2015 Emotional Intelligence Understanding yourself, caring enough to understand others, managing yourself because you care Session goals Explore the concept of emotional

More information

Cambridge Public Schools SEL Benchmarks K-12

Cambridge Public Schools SEL Benchmarks K-12 Cambridge Public Schools SEL Benchmarks K-12 OVERVIEW SEL Competencies Goal I: Develop selfawareness Goal II: Develop and Goal III: Develop social Goal IV: Demonstrate Goal V: Demonstrate skills to demonstrate

More information

Outline. Emotion. Emotions According to Darwin. Emotions: Information Processing 10/8/2012

Outline. Emotion. Emotions According to Darwin. Emotions: Information Processing 10/8/2012 Outline Emotion What are emotions? Why do we have emotions? How do we express emotions? Cultural regulation of emotion Eliciting events Cultural display rules Social Emotions Behavioral component Characteristic

More information

Emotion in Intelligent Virtual Agents: the Flow Model of Emotion

Emotion in Intelligent Virtual Agents: the Flow Model of Emotion Emotion in Intelligent Virtual gents: the Flow Model of Emotion Luís Morgado 1,2 and Graça Gaspar 2 1 Instituto Superior de Engenharia de Lisboa Rua Conselheiro Emídio Navarro, 1949-014 Lisboa, Portugal

More information

Social Context Based Emotion Expression

Social Context Based Emotion Expression Social Context Based Emotion Expression Radosław Niewiadomski (1), Catherine Pelachaud (2) (1) University of Perugia, Italy (2) University Paris VIII, France radek@dipmat.unipg.it Social Context Based

More information

Brain Mechanisms Explain Emotion and Consciousness. Paul Thagard University of Waterloo

Brain Mechanisms Explain Emotion and Consciousness. Paul Thagard University of Waterloo Brain Mechanisms Explain Emotion and Consciousness Paul Thagard University of Waterloo 1 1. Why emotions matter 2. Theories 3. Semantic pointers 4. Emotions 5. Consciousness Outline 2 What is Emotion?

More information

PLAN FOR TODAY. What is Emotional Intelligence/EQ? Why it Matters An Overview of the EQ Model Lots of ideas for improving your EQ

PLAN FOR TODAY. What is Emotional Intelligence/EQ? Why it Matters An Overview of the EQ Model Lots of ideas for improving your EQ PLAN FOR TODAY What is Emotional Intelligence/EQ? Why it Matters An Overview of the EQ Model Lots of ideas for improving your EQ EMOTIONAL INTELLIGENCE IS NOT Being Emotional Always Agreeable Optimistic

More information

BarOn Emotional Quotient Inventory. Resource Report. John Morris. Name: ID: Admin. Date: December 15, 2010 (Online) 17 Minutes 22 Seconds

BarOn Emotional Quotient Inventory. Resource Report. John Morris. Name: ID: Admin. Date: December 15, 2010 (Online) 17 Minutes 22 Seconds BarOn Emotional Quotient Inventory By Reuven Bar-On, Ph.D. Resource Report Name: ID: Admin. Date: Duration: John Morris December 15, 2010 (Online) 17 Minutes 22 Seconds Copyright 2002 Multi-Health Systems

More information

Developing Empirically Based Student Personality Profiles for Affective Feedback Models

Developing Empirically Based Student Personality Profiles for Affective Feedback Models Developing Empirically Based Student Personality Profiles for Affective Feedback Models Jennifer Robison 1, Scott McQuiggan 2 and James Lester 1 1 Department of Computer Science, North Carolina State University

More information

Nothing in biology makes sense except in the light of evolution Theodosius Dobzhansky Descent with modification Darwin

Nothing in biology makes sense except in the light of evolution Theodosius Dobzhansky Descent with modification Darwin Evolutionary Psychology: Emotion, Cognition and Intelligence Bill Meacham, Ph.D. APDG, 11 May 2015 www.bmeacham.com Evolution Nothing in biology makes sense except in the light of evolution Theodosius

More information

Perception Lie Paradox: Mathematically Proved Uncertainty about Humans Perception Similarity

Perception Lie Paradox: Mathematically Proved Uncertainty about Humans Perception Similarity Perception Lie Paradox: Mathematically Proved Uncertainty about Humans Perception Similarity Ahmed M. Mahran Computer and Systems Engineering Department, Faculty of Engineering, Alexandria University,

More information

ADHD and Challenging Behaviour

ADHD and Challenging Behaviour ADHD and Challenging Behaviour 10 th of November 2017 Fintan O Regan 07734 715 378 www.fintanoregan.com Fjmoregan@aol.com Publications Cooper P and O Regan F (2001) EDUCATING children with ADHD: Routledge

More information

Caring for Agents and Agents that Care: Building Empathic Relations with Synthetic Agents

Caring for Agents and Agents that Care: Building Empathic Relations with Synthetic Agents Caring for Agents and Agents that Care: Building Empathic Relations with Synthetic Agents Ana Paiva, João Dias, Daniel Sobral Instituto Superior Técnico and INESC-ID Av. Prof. Cavaco Silva, IST, Taguspark

More information

Investigating the Appraisal Patterns of Regret and Disappointment 1

Investigating the Appraisal Patterns of Regret and Disappointment 1 Motivation and Emotion, Vol. 26, No. 4, December 2002 ( C 2002) Investigating the Appraisal Patterns of Regret and Disappointment 1 Wilco W. van Dijk 2,4 and Marcel Zeelenberg 3 Regret and disappointment

More information

Towards Formal Modeling of Affective Agents in a BDI Architecture

Towards Formal Modeling of Affective Agents in a BDI Architecture Towards Formal Modeling of Affective Agents in a BDI Architecture Bexy Alfonso, Emilio Vivancos, and Vicente Botti Universitat Politècnica de València, Departamento de Sistemas Informáticos y Computación,

More information

RELYING ON TRUST TO FIND RELIABLE INFORMATION. Keywords Reputation; recommendation; trust; information retrieval; open distributed systems.

RELYING ON TRUST TO FIND RELIABLE INFORMATION. Keywords Reputation; recommendation; trust; information retrieval; open distributed systems. Abstract RELYING ON TRUST TO FIND RELIABLE INFORMATION Alfarez Abdul-Rahman and Stephen Hailes Department of Computer Science, University College London Gower Street, London WC1E 6BT, United Kingdom E-mail:

More information

Using progressive adaptability against the complexity of modeling emotionally influenced virtual agents

Using progressive adaptability against the complexity of modeling emotionally influenced virtual agents Using progressive adaptability against the complexity of modeling emotionally influenced virtual agents Ricardo Imbert Facultad de Informática Universidad Politécnica de Madrid rimbert@fi.upm.es Angélica

More information

Affective computing with primary and secondary emotions in a virtual human

Affective computing with primary and secondary emotions in a virtual human Noname manuscript No. (will be inserted by the editor) Affective computing with primary and secondary emotions in a virtual human Christian Becker-Asano Ipke Wachsmuth Received: date / Accepted: date Abstract

More information

This full text version, available on TeesRep, is the post-print (final version prior to publication) of:

This full text version, available on TeesRep, is the post-print (final version prior to publication) of: This full text version, available on TeesRep, is the post-print (final version prior to publication) of: Peinado, F., Cavazza, M. O. and Pizzi, D. (2008) 'Revisiting character-based affective storytelling

More information

Unifying Emotion and Cognition: The Flow Model Approach

Unifying Emotion and Cognition: The Flow Model Approach Unifying Emotion and Cognition: The Flow Model Approach Luís Morgado,2 and Graça Gaspar 2 Instituto Superior de Engenharia de Lisboa Rua Conselheiro Emídio Navarro, 949-04 Lisboa, Portugal lm@isel.ipl.pt

More information

Comparing Three Computational Models of Affect

Comparing Three Computational Models of Affect Comparing Three Computational Models of Affect Tibor Bosse 1, Jonathan Gratch 3, Johan F. Hoorn 2, Matthijs Pontier 1,2 and Ghazanfar F. Siddiqui 1,2,4 1 VU University, Department of Artificial Intelligence

More information

Spotting Liars and Deception Detection skills - people reading skills in the risk context. Alan Hudson

Spotting Liars and Deception Detection skills - people reading skills in the risk context. Alan Hudson Spotting Liars and Deception Detection skills - people reading skills in the risk context Alan Hudson < AH Business Psychology 2016> This presentation has been prepared for the Actuaries Institute 2016

More information

Towards Human Affect Modeling: A Comparative Analysis of Discrete Affect and Valence-Arousal Labeling

Towards Human Affect Modeling: A Comparative Analysis of Discrete Affect and Valence-Arousal Labeling Towards Human Affect Modeling: A Comparative Analysis of Discrete Affect and Valence-Arousal Labeling Sinem Aslan 1, Eda Okur 1, Nese Alyuz 1, Asli Arslan Esme 1, Ryan S. Baker 2 1 Intel Corporation, Hillsboro

More information

The Emotional Machine

The Emotional Machine The Emotional Machine A Machine Learning Approach to Online Prediction of User s Emotion and Intensity Amine Trabelsi, Claude Frasson Département d informatique et de recherche opérationnelle Université

More information

draft Big Five 03/13/ HFM

draft Big Five 03/13/ HFM participant client HFM 03/13/201 This report was generated by the HFMtalentindex Online Assessment system. The data in this report are based on the answers given by the participant on one or more psychological

More information

Emotionally Augmented Storytelling Agent

Emotionally Augmented Storytelling Agent Emotionally Augmented Storytelling Agent The Effects of Dimensional Emotion Modeling for Agent Behavior Control Sangyoon Lee 1(&), Andrew E. Johnson 2, Jason Leigh 2, Luc Renambot 2, Steve Jones 3, and

More information

Learn how to more effectively communicate with others. This will be a fun and informative workshop! Sponsored by

Learn how to more effectively communicate with others. This will be a fun and informative workshop! Sponsored by Assertiveness Training Learn how to more effectively communicate with others. This will be a fun and informative workshop! Sponsored by Lack of Assertiveness Examples Allowing others to coerce you into

More information

Bodily Expression: Selfperception, internal and external effects and their meaning for communication and interaction

Bodily Expression: Selfperception, internal and external effects and their meaning for communication and interaction Bodily Expression: Selfperception, internal and external effects and their meaning for communication and interaction Dear audience, dear students and colleagues, In my lecture I shall introduce the complex

More information

Simulating Affective Communication with Animated Agents

Simulating Affective Communication with Animated Agents Simulating Affective Communication with Animated Agents Helmut Prendinger & Mitsuru Ishizuka Department of Information and Communication Engineering School of Engineering, University of Tokyo 7-3-1 Hongo,

More information

Assertive Communication/Conflict Resolution In Dealing With Different People. Stephanie Bellin Employer Services Trainer

Assertive Communication/Conflict Resolution In Dealing With Different People. Stephanie Bellin Employer Services Trainer Assertive Communication/Conflict Resolution In Dealing With Different People Stephanie Bellin Employer Services Trainer The Passive Communicator Often complain and feel they are being treated unfairly.

More information

How do Robotic Agents Appearances Affect People s Interpretations of the Agents Attitudes?

How do Robotic Agents Appearances Affect People s Interpretations of the Agents Attitudes? How do Robotic Agents Appearances Affect People s Interpretations of the Agents Attitudes? Takanori Komatsu Future University-Hakodate. 116-2 Kamedanakano. Hakodate, 041-8655 JAPAN komatsu@fun.ac.jp Seiji

More information

Animating Idle Gaze in Public Places

Animating Idle Gaze in Public Places Animating Idle Gaze in Public Places Angelo Cafaro 1, Raffaele Gaito 1 and Hannes Högni Vilhjálmsson 2, 1 Università degli Studi di Salerno, Italy angelcaf@inwind.it, raffaele@cosmopoli.it 2 Center for

More information

Dialectical Behaviour Therapy in Secure Services Calverton Hill & Priory Hospital East Midlands Priory Group

Dialectical Behaviour Therapy in Secure Services Calverton Hill & Priory Hospital East Midlands Priory Group Context Our Dialectical Behaviour Therapy (DBT) team is a large multi-site team offering a standard DBT programme to patients who present with complex, severe, and enduring mental illness, personality

More information