Emotionally Augmented Storytelling Agent
Emotionally Augmented Storytelling Agent: The Effects of Dimensional Emotion Modeling for Agent Behavior Control

Sangyoon Lee 1, Andrew E. Johnson 2, Jason Leigh 2, Luc Renambot 2, Steve Jones 3, and Barbara Di Eugenio 2

1 Connecticut College, New London, CT, USA. james.lee@conncoll.edu
2 College of Engineering, University of Illinois at Chicago, Chicago, IL, USA. {ajohnson,spiff,renambot,bdieugen}@uic.edu
3 College of Liberal Arts and Sciences, University of Illinois at Chicago, Chicago, IL, USA. sjones@uic.edu

Abstract. The study presented in this paper applies a dimensional theory of emotion to augment an agent's nonverbal behavior, including emotional facial expressions and head gestures, in order to evaluate subtle differences across fine-grained conditions in the context of emotional storytelling. In a user study, participants rated the perceived naturalness of seven conditions and showed a significantly higher preference for the augmented facial expression, whereas the head gesture model received mixed ratings: a significant preference in high arousal cases (happy) but no significant difference in low arousal cases (sad).

Keywords: Virtual humans, Intelligent agents, Dimensional theory, PAD

1 Introduction

Recently, many research efforts toward natural and affective virtual agent capabilities have been undertaken [1]. Several studies have shown that a naturally behaving agent capable of interaction becomes more effective. For example, Burgoon and Hoobler pointed out that more than 50% of the social meaning in our communication is carried via nonverbal cues [2]. Although the positive effects of emotional agent behavior have been established, it remains challenging to design a computational model for such behaviors. Dimensional theories of emotion have motivated many prior studies to build computational models that assess an agent's emotional state and provide a framework to map human emotion onto one or more dimensions.
To this end, the Pleasure-Arousal-Dominance (PAD) model [3] provides a well-formed computational foundation. The significance of the PAD model is that continuous changes in emotional state can support a smooth transition between discrete emotion-eliciting stimuli. In particular, we investigate the capability of a dimensional model to control agent behavior in the context of storytelling.

Springer International Publishing Switzerland 2015. W.-P. Brinkman et al. (Eds.): IVA 2015, LNAI 9238.

Given our interests, we have limited our exploration to evaluating the agent's ability to augment emotional behavior for stories
that dominantly elicit two categorical emotions, happy and sad. The presented model is compared to our previous system, which shows a fixed intensity of facial expression and head gesture.

2 Approach: System Model

The nonverbal behavior processor (NVBP) in the system is composed of three components: the Affect Analyzer, the Emotion Processor, and the Gesture Processor. The Affect Analyzer takes an utterance and performs part-of-speech tagging, affect extraction for words based on WordNet-Affect, and structural analysis to revise word-level affect. The gesture predictor uses basic rules adapted from McClave [4] and a data-driven model trained on the SEMAINE video corpus to generate head gesture events. The NVB events are encoded for each word and fed to the speech synthesizer, which then synchronizes the NVB events as the agent speaks.

The Emotion Processor takes an affect-type event, processes the affect to update the agent's mood with the PAD model, and finally computes the intensity of an emotional facial expression. PAD vector values are adapted from prior work [5], and the layered model is revised based on Gebhard and Kipp [6]. This final step is an augmentation that may increase or decrease the intensity of facial expression. The computational equations are shown below:

    E_evaluated = e + c_t ||e|| T + c_a Σ_k E_active^k                 (1)

    M_updated = T + E_evaluated + Σ_k E_active^k                       (2)

    E_intensity = e_intensity · ||E_evaluated|| / ||e|| + R(e)         (3)

Equation (1) describes how an initial PAD vector e is evaluated. c_t and c_a are constant factors toward trait and active emotion, and ||e|| denotes the length of e. The trait vector T, the currently active emotions E_active, and the mood M are applied to skew the initial vector. Equation (2) shows how the mood is updated based on newly added affect: it is the sum of the trait, the evaluated new emotion, and all active emotions in PAD space. Equation (3) presents the augmentation of the new emotion to be sent to the facial expression synthesizer.
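The mood-update step can be sketched in code. This is a minimal illustration assuming PAD vectors as 3-element numpy arrays in [-1, 1]; the constant weights, the sample vectors, and the handling of the repetition term R(e) are assumptions for illustration, not the paper's actual values.

```python
import numpy as np

# Assumed trait / active-emotion weights (illustrative, not from the paper)
C_T, C_A = 0.2, 0.1

def evaluate_emotion(e, trait, active):
    """Eq. (1): skew the incoming PAD vector e toward trait and active emotions."""
    norm_e = np.linalg.norm(e)
    return e + C_T * norm_e * trait + C_A * sum(active, np.zeros(3))

def update_mood(trait, e_eval, active):
    """Eq. (2): mood is the sum of trait, new emotion, and all active emotions."""
    return trait + e_eval + sum(active, np.zeros(3))

def augment_intensity(e_intensity, e, e_eval, repetition=0.0):
    """Eq. (3): scale intensity by evaluated/original magnitude, plus R(e)."""
    return e_intensity * np.linalg.norm(e_eval) / np.linalg.norm(e) + repetition

# Hypothetical "happy" stimulus, trait, and one active emotion
e = np.array([0.6, 0.4, 0.2])
trait = np.array([0.2, 0.1, 0.1])
active = [np.array([0.3, 0.2, 0.0])]

e_eval = evaluate_emotion(e, trait, active)
mood = update_mood(trait, e_eval, active)
intensity = augment_intensity(0.5, e, e_eval)
```

Because the trait and active-emotion terms here point in the same direction as the stimulus, ||E_evaluated|| exceeds ||e|| and the augmentation in Eq. (3) increases the baseline intensity.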
The auxiliary effect, R(e), accounts for repetition of a similar emotion. When a gesture-type event arrives from the speech synthesizer, the Gesture Processor refers to the arousal value of the current mood to compute the intensity of the head gesture. For example, when one feels very depressed (low arousal), one may not show much head movement, whereas one may show intense gestures when the arousal value is high. The augmentation ranges from 25% to 200% of the baseline intensity.
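The arousal-driven gesture scaling described above can be sketched as follows. The 25%-200% range comes from the text; the linear mapping from arousal to that range is an assumption, since the paper does not give the exact curve.

```python
def gesture_intensity(baseline, arousal, lo=0.25, hi=2.0):
    """Map arousal in [-1, 1] to a multiplier in [lo, hi] of the baseline
    head gesture intensity (linear interpolation, assumed)."""
    t = (arousal + 1.0) / 2.0   # normalize arousal to [0, 1]
    scale = lo + t * (hi - lo)  # interpolate between 25% and 200%
    return baseline * scale

# Very depressed (low arousal): subdued head movement
print(gesture_intensity(0.5, -1.0))  # → 0.125 (25% of baseline)
# Highly aroused: intense head movement
print(gesture_intensity(0.5, 1.0))   # → 1.0 (200% of baseline)
```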
3 Evaluation

To evaluate whether the presented model can increase the perceived naturalness of emotional facial expressions and head gestures, we designed seven conditions (Table 1). The intensity of facial expression and head gesture ranges from 0.0 to 1.0 as a weight value in our system, mapping from no expression/gesture to the strongest expression/gesture. We chose half intensity for the control conditions to avoid the extreme cases of intensity 0.0 or 1.0; we regard this as a moderate level of behavior for a system that lacks a computational model to reflect the agent's emotional state dynamically in its behavior. A condition with a fixed 10% head gesture intensity was added to compare a very subtle head gesture with the presented model.

The study included three stories from psychological literature and blogs; each story is a personal experience from the past [7, 8]. In the first story (happy story), the system detected 14 emotion-eliciting words/phrases (12 happy, 2 sad) and parsed 32 head gestures (10 nods, 22 shakes). For the second story (sad story), a total of 12 emotion-eliciting words/phrases (10 sad, 1 fear, 1 surprise) and a total of 44 head gestures (23 nods, 21 shakes) were processed. The results were fed to the system to create videos of an agent. The audio track was excluded to avoid bias from audio cues; instead, subtitles were embedded in the videos.

A total of 24 participants (18 male; mean age with SD 11.15) were recruited for this study. The study was composed of three sessions: one training session and two test sessions. During each session, participants (1) read one of the written stories, (2) selected a primary emotion among the six basic emotions that they might feel if they were telling the story, (3) drew a graph depicting the intensity of the primary emotion, and (4) reviewed a video showing the seven conditions and rated them (Fig. 1).
The rating was measured as the number of check marks that a participant gave to each condition whenever one or more agents seemed more realistic than the others.

Rating Results. We performed the Wilcoxon signed-rank test to compare ratings between variables: a combined cross-variable analysis of model vs. control groups for both facial expression and head gesture. In story I, the combined model condition for both facial expression (C1, C2, and C3) and head gesture (C1 and C4) was the most preferred; its differences were significant (p < .05) except in the comparison with the Control IV (C3 and C6) case for head gesture. In story II, the combined model condition for facial expression (C1, C2, and C3) was rated significantly higher than the control groups (p < .05), whereas the head gesture case (C1 and C4) was not significant.

Table 1. Combinational behavior model conditions used in the study

                                                Head gesture
    Facial expression                           Model (varied)   Control III (fixed half)   Control IV (fixed 10%)
    Model (varied)                              C1               C2                         C3
    Control I (fixed half)                      C4               C5                         C6
    Control II (Control I w/ abrupt transition) n/a              C7                         n/a
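The paired comparison described above can be reproduced with scipy's Wilcoxon signed-rank test. This is a sketch with made-up check-mark counts for a model condition vs. a control condition; the actual ratings are not published in the text.

```python
from scipy.stats import wilcoxon

# Hypothetical per-participant check-mark counts (not the study's data)
model_ratings   = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3, 2, 4]
control_ratings = [1, 2, 1, 2, 1, 0, 2, 1, 1, 2, 1, 1]

# Paired, non-parametric comparison of the two conditions
stat, p = wilcoxon(model_ratings, control_ratings)
print(f"W = {stat}, p = {p:.4f}")
if p < 0.05:
    print("significant preference for the model condition")
```

The signed-rank test is appropriate here because the ratings are paired (each participant rated every condition) and ordinal, so a t-test's normality assumption is not needed.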
Fig. 1. A participant reviewing the virtual agents' video on a large display system. Seven identical agents tell the same story in sync under different conditions.

In summary, the presented model, which can emotionally augment nonverbal behavior, received a significantly higher preference than the control conditions for facial expression. However, the head gesture model showed mixed results.

4 Conclusions

We designed a PAD-space model to compute a virtual agent's emotional state and drive its behavior, including facial expressions and head gestures. The presented model uses shallow parsing of surface text to extract emotion-eliciting stimuli and generates augmented behavior according to the agent's mood. The results of a user study confirmed that the facial expression model received a significantly higher rating than all control conditions that use a fixed intensity of facial expression. However, our head gesture model showed mixed results: we found a significant preference for the head gesture model in high arousal cases, whereas we obtained meaningful but not significant ratings in low arousal cases. The difficulty of recognizing subtle gestures in low arousal cases may have contributed to the inconsistent rating, as some participants noted the naturalness of a lessened head gesture in a sad/depressed mood. However, a more focused future study is required to verify this interpretation.

References

1. Lee, J., Marsella, S.C.: Predicting speaker head nods and the effects of affective information. IEEE Trans. Multimedia 12(6) (2010)
2. Burgoon, J.K., Hoobler, G.D.: Nonverbal signals. In: Knapp, M.L., Daly, J.A. (eds.) Handbook of Interpersonal Communication. SAGE Publications, Inc., Thousand Oaks (2002)
3. Mehrabian, A.: Pleasure-arousal-dominance: a general framework for describing and measuring individual differences in temperament. Curr. Psychol. 14 (1996)
4. McClave, E.Z.: Linguistic functions of head movements in the context of speech. J. Pragmat.
32(7) (2000)
5. Zhang, S., Wu, Z., Meng, H.M., Cai, L.: Facial expression synthesis based on emotion dimensions for affective talking avatar. In: Nishida, T., Jain, L.C., Faucher, C. (eds.) Modeling Machine Emotions for Realizing Intelligence. SIST, vol. 1. Springer, Heidelberg (2010)
6. Gebhard, P., Kipp, K.H.: Are computer-generated emotions and moods plausible to humans? In: Gratch, J., Young, M., Aylett, R.S., Ballin, D., Olivier, P. (eds.) IVA 2006. LNCS (LNAI), vol. 4133. Springer, Heidelberg (2006)
7. DiMarco, L.: My minxy mouse encounter: Disney World. favorite-travel-memories
8. Ossorio, P.G.: Clinical topics: A seminar in Descriptive Psychology. Linguistic Research Institute, Whittier (1976)
More informationDeveloping Emotional Intelligence LEIGH HORNE-MEBEL, M.S.W., B.C.D., A.C.S.W. M.G.H. PEDIATRIC EPILEPSY PROGRAM
Developing Emotional Intelligence LEIGH HORNE-MEBEL, M.S.W., B.C.D., A.C.S.W. M.G.H. PEDIATRIC EPILEPSY PROGRAM Founders of Emotional Intelligence John Mayer, Peter Salovey, David Caruso, and Richard Boyatz,
More informationCPSC81 Final Paper: Facial Expression Recognition Using CNNs
CPSC81 Final Paper: Facial Expression Recognition Using CNNs Luis Ceballos Swarthmore College, 500 College Ave., Swarthmore, PA 19081 USA Sarah Wallace Swarthmore College, 500 College Ave., Swarthmore,
More informationSignal Processing in the Workplace. Daniel Gatica-Perez
Signal Processing in the Workplace Daniel Gatica-Perez According to the U.S. Bureau of Labor Statistics, during 2013 employed Americans worked an average of 7.6 hours on the days they worked, and 83 percent
More informationImages of the future are:
A recipe for making deep, meaningful images of the future Exploring the trans-cultural images of the future of young adults By M.A. Akhgar S. Kaboli & Prof. Petri Tapio Finland Futures Research Centre
More informationBAAP: A Behavioral Animation Authoring Platform for Emotion Driven 3D Virtual Characters
BAAP: A Behavioral Animation Authoring Platform for Emotion Driven 3D Virtual Characters Ling Li 1, Gengdai Liu 2, Mingmin Zhang 1,1, Zhigeng Pan 1 and Edwin, Song 3, 1 State Key Lab of CAD&CG, Zhejiang
More informationNot All Moods are Created Equal! Exploring Human Emotional States in Social Media
Not All Moods are Created Equal! Exploring Human Emotional States in Social Media Munmun De Choudhury Scott Counts Michael Gamon Microsoft Research, Redmond {munmund, counts, mgamon}@microsoft.com [Ekman,
More informationEmote to Win: Affective Interactions with a Computer Game Agent
Emote to Win: Affective Interactions with a Computer Game Agent Jonghwa Kim, Nikolaus Bee, Johannes Wagner and Elisabeth André Multimedia Concepts and Application, Faculty for Applied Computer Science
More informationAnimating Idle Gaze in Public Places
Animating Idle Gaze in Public Places Angelo Cafaro 1, Raffaele Gaito 1 and Hannes Högni Vilhjálmsson 2, 1 Università degli Studi di Salerno, Italy angelcaf@inwind.it, raffaele@cosmopoli.it 2 Center for
More informationVisualization of emotional rating scales
Visualization of emotional rating scales Stephanie Greer UC Berkeley CS294-10 smgreer@berkeley.edu ABSTRACT Visual ratings scales have the potential to be a powerful tool for recording emotional information
More informationComputational models of emotion
HUMAINE Plenary Newcastle, May 24-27, 2004 Computational models of emotion Klaus R. Scherer University of Posing the problem: Three types of computational models Appraisal cirteria Integration rules Sequential
More informationClinicians and the Language of Nonverbal Communication: It s Not Just What You Say, But How You Say It
Clinicians and the Language of Nonverbal Communication: It s Not Just What You Say, But How You Say It Introduction Barbara Lewis, MBA Managing Editor DocCom Joan Lowery, MEd President of Lowery Communications
More informationSocial Context Based Emotion Expression
Social Context Based Emotion Expression Radosław Niewiadomski (1), Catherine Pelachaud (2) (1) University of Perugia, Italy (2) University Paris VIII, France radek@dipmat.unipg.it Social Context Based
More informationImproving Social Communication in Children with High Functioning ASD
Improving Social Communication in Children with High Functioning ASD Tonya Agostini Aspect Autism in Education Conference, Sydney 31 st July-1 st August, 2014 My Experiences with Language Learning and
More informationInternal Consistency and Reliability of the Networked Minds Social Presence Measure
Internal Consistency and Reliability of the Networked Minds Social Presence Measure Chad Harms, Frank Biocca Iowa State University, Michigan State University Harms@iastate.edu, Biocca@msu.edu Abstract
More informationUMEME: University of Michigan Emotional McGurk Effect Data Set
THE IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, NOVEMBER-DECEMBER 015 1 UMEME: University of Michigan Emotional McGurk Effect Data Set Emily Mower Provost, Member, IEEE, Yuan Shangguan, Student Member, IEEE,
More informationInteractively Learning Nonverbal Behavior for Inference and Production: A Machine Learning Approach
Interactively Learning Nonverbal Behavior for Inference and Production: A Machine Learning Approach Jin Joo Lee MIT Media Lab Cambridge, MA 02142 USA jinjoo@media.mit.edu Cynthia Breazeal MIT Media Lab
More informationSPEECH EMOTION RECOGNITION: ARE WE THERE YET?
SPEECH EMOTION RECOGNITION: ARE WE THERE YET? CARLOS BUSSO Multimodal Signal Processing (MSP) lab The University of Texas at Dallas Erik Jonsson School of Engineering and Computer Science Why study emotion
More informationA Fuzzy Logic System to Encode Emotion-Related Words and Phrases
A Fuzzy Logic System to Encode Emotion-Related Words and Phrases Author: Abe Kazemzadeh Contact: kazemzad@usc.edu class: EE590 Fuzzy Logic professor: Prof. Mendel Date: 2007-12-6 Abstract: This project
More informationAssessing the validity of appraisalbased models of emotion
Assessing the validity of appraisalbased models of emotion Jonathan Gratch, Stacy Marsella, Ning Wang, Brooke Stankovic Institute for Creative Technologies University of Southern California The projects
More informationFactors for Measuring Dramatic Believability. Brian Magerko, Ph.D. Games for Entertainment and Learning Lab Michigan State University
Factors for Measuring Dramatic Believability Brian Magerko, Ph.D. Games for Entertainment and Learning Lab Michigan State University Outline Introduction Deconstruction Evaluation from Player Perspective
More informationGesture Control in a Virtual Environment. Presenter: Zishuo Cheng (u ) Supervisors: Prof. Tom Gedeon and Mr. Martin Henschke
Gesture Control in a Virtual Environment Presenter: Zishuo Cheng (u4815763) Supervisors: Prof. Tom Gedeon and Mr. Martin Henschke 2 Outline Background Motivation Methodology Result & Discussion Conclusion
More informationFuzzy Model on Human Emotions Recognition
Fuzzy Model on Human Emotions Recognition KAVEH BAKHTIYARI &HAFIZAH HUSAIN Department of Electrical, Electronics and Systems Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan
More information