1 Affective Computing Introduction Guoying Zhao 1 / 69

2 Your Staff
Prof. Guoying Zhao - guoying.zhao@oulu.fi - office: TS302 - office hours: Thur. 3-4pm
Dr. Xiaobai Li (Assistant Lecturer) - xiaobai.li@oulu.fi, TS321
Mr. Henglin Shi (Teaching Assistant) - henglin.shi@oulu.fi, TS317
Ms. Yante Li (Teaching Assistant) - yante.li@oulu.fi, TS334
Mr. Tuomas Varanka (Teaching Assistant) - tuomas.varanka@student.oulu.fi, TS333
Guoying Zhao 2 / 69

3 Objectives
- Basic understanding of affective computing
- Psychological theory
- Recognition of emotions
- Synthesizing emotions
- Applications
Guoying Zhao 3 / 69

4 Prerequisites
- Computer engineering
- Basic data structures
- Programming skills in Python (Python 2.7)
Guoying Zhao 4 / 69

5 Bird's Eye View of the Course
Lecture 1: Course Overview. Introduction to Affective Computing. Sep. 5 (10-12, PR119)
Lecture 2: Emotion theory and modeling (Dr. Xiaobai Li). Sep. 6 (12-14, PR119)
Lecture 3: Emotion recognition from facial expressions. Sep. 7 (8-10, PR119)
Lecture 4: Emotion recognition from speech (Mr. Henglin Shi). Sep. 12 (10-12, PR104)
Lecture 5: Fusion of multiple modalities (Ms. Yante Li). Sep. 19 (10-12, PR104)
Lecture 6: Synthesis of emotional behaviors (Dr. Xiaobai Li). Sep. 26 (10-12, PR104)
Lecture 7: Crowdsourcing techniques for affective computing. Oct. 3 (10-12, PR104)
Lecture 8: Research related to Affective Computing. Oct. 10 (10-12, PR104)
Exam: Oct. 25, Thu 16:15-19:15; 2nd exam: Dec. 3, Mon 16:15-19:15
Lecture rooms: PR119, PR104
Guoying Zhao 5 / 69

6 E1.1, Group 1: Facial expression recognition. Sep. 13 (12-14, MA336)
E1.1, Group 2: Facial expression recognition. Sep. 11 (12-14, TS137)
E1.2, Group 1: Facial expression recognition. Sep. 20
E1.2, Group 2: Facial expression recognition. Sep. 18
E2.1, Group 1: Emotional speech recognition. Sep. 27
E2.1, Group 2: Emotional speech recognition. Sep. 25
E2.2, Group 1: Emotional speech recognition. Oct. 4
E2.2, Group 2: Emotional speech recognition. Oct. 2
E3.1, Group 1: Fusion of multi-modalities for emotion recognition. Oct. 9
E3.1, Group 2: Fusion of multi-modalities for emotion recognition. Oct. 11
E3.2, Group 1: Fusion of multi-modalities for emotion recognition. Oct. 16
E3.2, Group 2: Fusion of multi-modalities for emotion recognition. Oct. 18
Exercise rooms: MA336 (Group 1) & TS137 (Group 2)
Guoying Zhao 6 / 69

7 Course studying
Studying this course consists of three parts:
- following the lectures
- self-study with the distributed materials
- doing the assigned exercises (mandatory)
Guoying Zhao 7 / 69

8 Grading
Study materials: lecture slides, pre-collected code, tools and databases.
The course consists of lectures and exercises. The final grade is based on the final exam; the exercises must be passed before the exam.
Course webpage:
Guoying Zhao 8 / 69

9 Feedback
New course feedback system: feedback can be given from the first day of the course until three weeks after the last day.
Guoying Zhao 9 / 69

10 Affective Computing: Machines with Emotional Intelligence Guoying Zhao 10 / 69

11 Pioneers
Roz Picard introduced the term Affective Computing in 1995; her book with that title was published in 1997.
Originally defined as computing that relates to, arises from, or deliberately influences emotion.
Guoying Zhao 11 / 69

12 A community of practice: HUMAINE Association, 2007
Represented by a professional society: now the Association for the Advancement of Affective Computing, a professional, world-wide association for researchers in Affective Computing, Emotions and Human-Machine Interaction.
International conference: ACII (biennial, started in 2005).
International journal (started in 2010).
Guoying Zhao 12 / 69

13 Affective Computing
An interdisciplinary field of research: computer science, psychology, and cognitive science.
Research and develop systems that recognize, stimulate and simulate human affect, including:
- How affective sensing can inform machine understanding of people
- How affect influences human-computer and human-robot interaction and machine usability
- How to make computers more human-like
- The ethics of giving machines emotional capabilities
Covers, but is not limited to, topics involving:
- Psychology and behavior as they relate to affective computing
- Behavior generation and user interaction
Guoying Zhao 13 / 69

14 Motivations and Goals
Research shows that human intelligence is not independent of emotion; emotion and cognitive functions are inextricably integrated in the human brain.
Automatic assessment of the human emotional/affective state.
Creating a bridge between highly emotional humans and emotionally challenged computer systems/electronic devices - systems capable of responding emotionally.
The central issues in affective computing are representation, detection, and classification of users' emotions, and synthesis of emotions for response.
Norman, D.A. (1981). Twelve issues for cognitive science.
Picard, R., & Klein, J. (2002). Computers that recognize and respond to user emotion: Theoretical and practical implications.
Taleb, T.; Bottazzi, D.; Nasser, N., "A Novel Middleware Solution to Improve Ubiquitous Healthcare Systems Aided by Affective Information."
Guoying Zhao 14 / 69

15 Dacher Keltner (professor of psychology at the University of California, Berkeley) helped filmmakers understand emotions for the Pixar movie Inside Out.

16 Affective Computing Research
Affective computing is related to other computing disciplines such as Artificial Intelligence (AI), Virtual Reality (VR) and Human-Computer Interaction (HCI).
Questions that need to be answered:
- What is an affective state (typically feelings, moods, sentiments, etc.)?
- Which human communicative signals convey information about affective state?
- How can various kinds of affective information be combined to optimize inferences about affective states?
- How to apply affective information to designing systems?
The research areas of affective computing as visualized by MIT (2001).
Guoying Zhao 16 / 69
M. Pantic, N. Sebe, J. F. Cohn, and T. Huang, Affective multimodal human-computer interaction. In ACM International Conference on Multimedia (MM).

17 Steps towards affective computing research
First, we need to define what we mean when we use the word emotion.
Second, we need an emotion model that makes it possible to differentiate between emotional states.
In addition, we need a classification scheme that uses specific features from an underlying (input) signal to recognize the user's emotions.
The emotion model has to fit together with the classification scheme used by the emotion recognizer.
Guoying Zhao 17 / 69
R. Sharma, V. Pavlovic, and T. Huang. Toward multimodal human-computer interface. In Proceedings of the IEEE, 1998.

18 How Emotion/Affection is Modeled
In affective computing, affect is often seen as another kind of information: discrete or continuous units or states internal to an individual that can be transmitted in a loss-free manner from people to computational systems and back.
Affect description perspectives:
- Discrete emotion description: happiness, fear, sadness, surprise, disgust, anger
- Dimensional description: valence, arousal, dominance
  Valence: the pleasantness of the stimulus
  Arousal: the intensity of emotion provoked by the stimulus
  Dominance: the degree of control exerted by the stimulus
Guoying Zhao 18 / 69
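The two description perspectives above can be connected: each discrete emotion occupies a region of valence-arousal-dominance (VAD) space, so a continuous VAD estimate can be mapped back to a discrete label. A minimal sketch, where the prototype coordinates are hypothetical values chosen only for illustration (real systems derive them from annotated rating data):

```python
# Sketch: mapping a continuous VAD point to the nearest discrete
# emotion category. The coordinates below are hypothetical
# illustrations, not measured values.
import math

# (valence, arousal, dominance), each roughly in [-1, 1]
VAD_PROTOTYPES = {
    "happiness": ( 0.8,  0.5,  0.4),
    "fear":      (-0.6,  0.7, -0.6),
    "sadness":   (-0.7, -0.4, -0.4),
    "surprise":  ( 0.2,  0.8, -0.1),
    "disgust":   (-0.6,  0.3,  0.2),
    "anger":     (-0.5,  0.8,  0.3),
}

def nearest_emotion(valence, arousal, dominance):
    """Return the discrete category closest in Euclidean distance."""
    point = (valence, arousal, dominance)
    return min(VAD_PROTOTYPES,
               key=lambda e: math.dist(point, VAD_PROTOTYPES[e]))

print(nearest_emotion(0.7, 0.4, 0.3))  # a pleasant, mildly aroused state -> "happiness"
```

The reverse direction (discrete label to a VAD point) is what makes dimensional annotation, as used later for the MAHNOB database, comparable with categorical labels.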

19 Mind-Read > Act > Persuade
"Hmm, Roz looks busy. It's probably not a good time to bring this up."
Analysis of nonverbal cues; inference and reasoning about mental states; modify one's actions; persuade others.

20 Skills of Emotional Intelligence
- Recognizing emotions
- Having emotions
- Expressing emotions
- Regulating emotions
- Utilizing emotions (if the system has emotions)
Guoying Zhao 20 / 69

21 Robotic Computer Research Areas
- Recognizing another's emotions
- Expressing emotions
- Handling another's emotions and showing emotions
Affective and Cognitive Decision Making
- Regulating and utilizing emotions
- Affect as a self-adapting control system: affect changes the operating characteristics of the other three domains (cognition, motivation, behavior)
Guoying Zhao 21 / 69

22 Some typical recent research topics (from TAC)
Emotion recognition in:
- Speech: responses to victory and defeat; emotion in natural speech; depression detection; boredom detection
- Face: modeling; recognizing expressions; intensity analysis
- Physiology: inferring response to music via EEG; detecting stress from skin conductance
- Text: opinions in Twitter and blogs; emoticons
Modeling:
- Modeling emotional influences on decision-making
- Modeling factors that elicit emotions
Synthesis:
- Emotional speech
- Emotional facial expressions
Applications:
- Health: detection and shaping
- Games/entertainment computing: detection and shaping; synthesis/realism; affective music player
- Education: detection; shaping
- Behavioral science
Guoying Zhao 22 / 69

23 Emotional computer Including facial expression, speech and gesture analysis and facial expression and speech synthesis. Guoying Zhao 23 / 69

24 Computer recognizes emotions Guoying Zhao 24 / 69

25 Affection Detection and Recognition
Affection detection sources:
- Speech / vocal expression
- Facial expression
- Bio-signals (physiological sensors, wearable sensors): brain signals, skin temperature, blood pressure, heart rate, respiration rate
- Gesture / limb movements
- Text
Rafael A. Calvo, Sidney D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications."
Leon, E.; Clarke, G.; Sepulveda, F.; Callaghan, V., "Optimised attribute selection for emotion classification using physiological signals."
Guoying Zhao 25 / 69

26 Affection Detection and Recognition
Affection recognition modalities:
- Unimodal
- Multimodal: provides a more natural style of communication
Rafael A. Calvo, Sidney D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications."
Zhihong Zeng; Pantic, M.; Roisman, G.I.; Huang, T.S., "A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions."
Guoying Zhao 26 / 69

27 Speech in affective computing
Affective messages in speech: emotions, moods, feelings; intentional, elicited.
- Lexical (explicit): affective words, stress, etc.
- Paralinguistic (implicit): vocal signals beyond the basic verbal message that convey meaning beyond the words and grammar used - speech prosody, voice quality, vocalizations.
Guoying Zhao 27 / 69

28 Affection Recognition Method - Voice / Speech
Paralinguistic features of speech: how is it said?
- Prosodic features (e.g., pitch-related features, energy-related features, and speech rate)
- Spectral features (e.g., MFCC - Mel-frequency cepstral coefficients - and cepstral features); spectral tilt, LFPC (Log Frequency Power Coefficients), F0 (fundamental frequency of speech), long-term spectrum
- Speech disfluencies (e.g., filled and silent pauses)
- Context information (e.g., subject, gender, and turn-level features representing local and global aspects of the dialogue)
- Nonlinguistic vocalizations (e.g., laughs and cries; decode other affective signals such as stress, depression, boredom, and excitement)
Rafael A. Calvo, Sidney D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications," IEEE Transactions on Affective Computing, 1(1): 18-37, 2010.
Guoying Zhao 28 / 69
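Two of the simplest descriptors in the prosodic family, short-time energy and zero-crossing rate, can be computed with a few lines of NumPy. A hedged sketch on a synthetic signal (not the course's own code; real systems start from recorded speech and add pitch, MFCC, etc. via a toolkit such as Praat or a Python audio library):

```python
# Sketch: per-frame short-time energy and zero-crossing rate (ZCR).
# Energy tracks vocal effort; ZCR rises with higher-frequency content.
import numpy as np

def frame_features(signal, frame_len=400, hop=200):
    """Return an array of (energy, zcr) pairs, one per frame."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energy = float(np.mean(frame ** 2))
        # Fraction of adjacent sample pairs whose sign changes.
        zcr = float(np.mean(np.abs(np.diff(np.sign(frame))) > 0))
        feats.append((energy, zcr))
    return np.array(feats)

# Synthetic "utterance": a quiet low tone followed by a loud high tone,
# standing in for a change in vocal effort and pitch.
sr = 16000
t = np.arange(sr) / sr
quiet_low = 0.1 * np.sin(2 * np.pi * 120 * t)
loud_high = 0.8 * np.sin(2 * np.pi * 300 * t)
feats = frame_features(np.concatenate([quiet_low, loud_high]))

# Both energy and ZCR rise in the second half of the signal.
first, second = feats[:len(feats) // 2], feats[len(feats) // 2:]
print(first.mean(axis=0), second.mean(axis=0))
```

Classifiers for emotional speech typically consume statistics (mean, variance, range) of such per-frame trajectories over a whole utterance.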

29 Acted Corpora Bored? Interested? Sad? Angry? Happy? confident frustrated friendly anxious encouraging

30 Acted Corpora happy sad angry confident frustrated friendly interested anxious bored encouraging

31 Affection Recognition Method - Speech Recognition Architecture
Sadness, anger, and fear are more easily recognized from voice than disgust.
Speech signal -> Pre-processing -> Feature extraction -> Classification -> Classified result
Audio recordings collected in call centers, meetings, Wizard of Oz scenarios, interviews and other dialogue systems.
M. Pantic, N. Sebe, J. F. Cohn, and T. Huang. Affective multimodal human-computer interaction. In ACM International Conference on Multimedia (MM).
Rafael A. Calvo, Sidney D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications."
Guoying Zhao 31 / 69

32 Emotion recognition from speech
Segmentation: active speech level, voiced/unvoiced, phonemes (/p/, /ʊ/, /ʃ/), etc.
Speech feature extraction:
- Acoustic features: spectral features and other transformations, glottal source parameterization, formant analysis
- Prosodic features: pitch/F0, loudness/intensity, rhythm/duration
- Voice quality
Guoying Zhao 32 / 69

33 Emotion recognition from speech - Machine learning tools used frequently
- Feature selection and transformations: sequential floating search (SFS), principal component analysis (PCA), nonlinear manifold modeling, etc.
- Classifiers: linear discriminant analysis (LDA), k-nearest neighbors (kNN), support vector machines (SVM), hidden Markov models (HMM), neural networks (NN)
- Validation and bias: cross-validation, structural risk minimization (SRM), etc.
Guoying Zhao 33 / 69
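The three stages above (transform features, classify, validate) chain naturally into one pipeline. A minimal sketch using scikit-learn, which is an assumption on our part (the course exercises may use other tooling); synthetic data stands in for extracted speech features:

```python
# Sketch: PCA for dimensionality reduction, an SVM classifier, and
# 5-fold cross-validation for an unbiased accuracy estimate.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Fake "utterance features": 200 samples, 40 dims, 4 emotion classes.
X, y = make_classification(n_samples=200, n_features=40,
                           n_informative=10, n_classes=4,
                           random_state=0)

clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                    SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

Fitting the PCA inside the pipeline (rather than on all data beforehand) keeps each cross-validation fold's test set unseen, which is exactly the bias issue the slide's "validation and bias" item refers to.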

34 Facial Expression Guoying Zhao 34 / 69

35 Facial Expression Recognition
Mug shot; dynamic information; intensity; Action Units; prototypic emotional expressions.
Psychological studies have suggested that facial motion is fundamental to the recognition of facial expressions.
Guoying Zhao 35 / 69

36 Emotion recognition from facial expressions - Feature extraction
- Appearance features: local binary pattern (LBP), local binary pattern from three orthogonal planes (LBP-TOP), Gabor, HOG, SIFT (scale-invariant feature transform), etc.
- Shape features: distances between facial landmarks, angles, coordinates of facial landmarks, etc. Facial landmarks are located using ASM, AAM, CLM, DRMF (Discriminative Response Map Fitting), etc.
- Deep learning features: CNN, RNN
Guoying Zhao 36 / 69
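The basic LBP operator behind the first bullet compares each pixel to its 8 neighbors and packs the comparisons into an 8-bit code; a histogram of codes over a region is the appearance feature. A minimal illustration in NumPy (libraries such as scikit-image provide optimized and extended variants):

```python
# Sketch of the basic 8-neighbor local binary pattern (LBP) operator
# on a grayscale patch. Each bit of a code records whether one
# neighbor is >= the center pixel.
import numpy as np

def lbp_codes(img):
    """8-bit LBP code for each non-border pixel of a 2D uint8 array."""
    c = img[1:-1, 1:-1]  # center pixels
    # Neighbor offsets, clockwise from top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:img.shape[0] - 1 + dy,
                    1 + dx:img.shape[1] - 1 + dx]
        codes |= (neigh >= c).astype(np.uint8) << bit
    return codes

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)
# 256-bin histogram of codes: the appearance descriptor for this patch.
hist = np.bincount(lbp_codes(patch).ravel(), minlength=256)
print(hist.sum())  # 14 * 14 = 196 codes
```

LBP-TOP, used later in the exercises, applies this same operator on the XY, XT and YT planes of a video volume and concatenates the three histograms.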

37 Affection Recognition Method - Facial Expression
Example: an Active Appearance Model (AAM) based system, which uses AAMs to detect and track the face and extract visual features.
Guoying Zhao 37 / 69

38 Example: LBP-TOP
[Figure: (a) non-overlapping blocks (9 x 8); (b) overlapping blocks (4 x 3, overlap size = 10).]
[Figure: (a) block volumes; (b) LBP features from three orthogonal planes; (c) concatenated features for one block volume, combining appearance and motion. Features are computed in each block volume.]
Guoying Zhao 38 / 69

39 Guoying Zhao 39 / 69

40 Physiological Signals / Bio-Signals
Physiological signals are derived from, e.g., the Central Nervous System (CNS) and Peripheral Nervous System (PNS) of the human body. Fear, for example, increases heartbeat and respiration rates, causes palm sweating, etc.
Physiological information used:
- GSR - Galvanic Skin Response
- RESP - Respiration
- BVP - Blood Volume Pulse
- Skin temperature
- Electrocardiography (ECG)
- Electrodermal activity (EDA)
- Electromyogram (EMG)
- Electroencephalogram (EEG)
Guoying Zhao 40 / 69

41 Electroencephalogram (EEG), Electrocardiography (ECG), Electrodermal activity (EDA), Electromyogram (EMG)
- ECG: recording of the electrical activity of the heart
- EEG: recording of electrical activity along the scalp
- EDA: measures the electrical conductance of the skin, which varies depending on the amount of sweat-induced moisture on the skin
- EMG: evaluates and records the electrical activity produced by skeletal muscles
Guoying Zhao 41 / 69
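One of the simplest features drawn from such signals is heart rate, estimated by detecting beats as peaks. A hedged sketch on a synthetic pulse-like signal (real ECG/BVP data would be band-pass filtered first, and peak parameters tuned to the sensor):

```python
# Sketch: heart-rate estimation by peak detection on a synthetic
# pulse wave. SciPy's find_peaks locates beats; the mean interval
# between beats gives beats per minute (bpm).
import numpy as np
from scipy.signal import find_peaks

fs = 100                       # sampling rate in Hz
t = np.arange(0, 30, 1 / fs)   # 30 seconds of signal
beat_hz = 72 / 60              # synthetic heart beating at 72 bpm
signal = (np.sin(2 * np.pi * beat_hz * t)
          + 0.1 * np.random.default_rng(1).normal(size=t.size))

# One peak per beat: require a minimum height and a 0.5 s refractory
# distance so noise wiggles are not counted as beats.
peaks, _ = find_peaks(signal, height=0.5, distance=int(0.5 * fs))
intervals = np.diff(peaks) / fs          # inter-beat intervals in s
bpm = 60 * np.mean(1 / intervals)
print(round(bpm))  # close to 72
```

Affect recognizers rarely use raw heart rate alone; variability of the inter-beat intervals (HRV) is the more informative arousal cue.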

42 Gesture / Body Motion
Pantic et al.'s survey shows that gesture and body motion information is an important modality for human affect recognition; the combination of face and gesture is more accurate than facial expression alone.
Two categories of body-motion-based affect recognition:
- Stylized: the entirety of the movement encodes a particular emotion
- Non-stylized: more natural - knocking on a door, lifting a hand, walking, etc.
Example: applying the SOSPDF (shape of signal probability density function) feature description framework to captured 3D human motion data.
Guoying Zhao 42 / 69

43 Frequently used Modeling Techniques Fuzzy Logic Neural Networks (NN) Hybrid: Fuzzy + NN Tree augmented Naïve Bayes Hidden Markov Models (HMM) K-Nearest Neighbors (KNN) Linear Discriminant Analysis (LDA) Support Vector Machines (SVM) Gaussian Mixture Models (GMM) Discriminant Function Analysis (DFA) Sequential Forward Floating Search (SFFS) Deep learning Guoying Zhao 43 / 69

44 Computer expresses emotions Guoying Zhao 44 / 69

45 Facial expression synthesis Interact with virtual agents: help desks, museum guides, online shopping, and distance learning We would like such virtual characters to behave realistically. This involves understanding the signals we use in non-verbal communication, and enabling the virtual character to display such signals. Such understanding can be achieved through the analysis of data collected from humans interacting with each other or with computers. This data can then be used to teach the virtual characters to express themselves in an appropriate manner. Guoying Zhao 45 / 69

46 Facial expression synthesis
Computer Laboratory of the University of Cambridge. Synthesized affective gestures:
Guoying Zhao 46 / 69

47 Emotional speech synthesis
The overall goal of the speech synthesis research community is to create natural-sounding synthetic speech. One way synthesized speech benefits from emotions is by delivering certain content in the right emotion (e.g., good news is delivered in a happy voice), making the speech and the content more believable. Emotions can make interaction with the computer more natural because the system reacts in ways that the user expects. Emotional speech synthesis is a step in this direction.
Guoying Zhao 47 / 69

48 Effects of mood on decision making (Lerner & Keltner 2000, 2001, 2004)
A person's mood at the time of decision-making also influences issue interpretation and his or her choice under risk.
- Happiness: optimistic judgments of future events
- Anger: optimistic judgments of future events; risk-seeking choices
- Fear: pessimistic judgments of future events; risk-averse choices
- Sadness: pessimistic about the present; makes a choice to change the current sad state; reverse endowment effect
Guoying Zhao 48 / 69

49 Applications
- In the security sector, affective behavioural cues play a crucial role in establishing or detracting from credibility
- In the medical sector, affective behavioural cues are a direct means to identify when specific mental processes are occurring
- Neurology (studies on the dependence between emotion dysfunction or impairment and brain lesions)
- Psychiatry (studies on schizophrenia and mood disorders)
- Dialog systems / automatic call centers: environments that reduce user/customer frustration
- Academic learning
- Human-Computer Interaction (HCI)
Zhihong Zeng; Pantic, M.; Roisman, G.I.; Huang, T.S., "A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions," Pattern Analysis and Machine Intelligence.
Guoying Zhao 49 / 69

50 Context Aware Multimodal Affection Analysis Based Smart Learning Environment Guoying Zhao 50 / 69

51 [System architecture diagram: multimodal affect input from face analysis, voice analysis, posture analysis and physiology analysis feeds a decision support system and system controller. Outputs include multimedia notes, reading behavior reports, lesson length suggestions, class efficiency reports, hardware calibration management, and application parameter adjustment.]

52 Driver Emotion Aware Multiple Agent Controlled Automatic Vehicle

53 [Agent architecture diagram: feature detectors for audio (linguistic / non-linguistic), facial expression, bio-signals, seat pressure, and driver actions (steering movement, interaction with gas/brake pedal) feed a feature estimator producing basic emotions, stress level, and complex emotions. Cooperating agents use inter-agent communication to aid decision making: a driving aid agent (alerts the driver; speed, ABS, traction control), a navigation agent (route selection), a safety agent (notifies in case of emergency), and an affective multimedia agent (music, climate control).]

54 Concerned Issues
- Privacy concerns: I do not want the outside world to know what goes through my mind
- Ethical concerns: robot nurses or caregivers capable of effective feedback
- Risk of misuse of the technology: in the hands of impostors; computers start to make emotionally distorted, harmful decisions (in the Terminator films, computers become self-aware and try to exterminate human life)
- Complex technology: effectiveness is still questionable; risk of false interpretation
Guoying Zhao 54 / 69

55 Future Research Directions
- So far, context has been overlooked in most affective computing research
- Collaboration among affect researchers from different disciplines
- Fast real-time processing
- Multimodal detection and recognition to achieve higher accuracy
- Systems that can model conscious and subconscious user behaviour
Rafael A. Calvo, Sidney D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications," IEEE Transactions on Affective Computing, 1(1): 18-37, 2010.
Guoying Zhao 55 / 69

56 Databases

57 Facial expression: JAFFE (easy)
- Static facial expression images
- 213 images of 7 facial expressions
- 10 Japanese females
- Protocol: leave-one-subject-out; leave-one-sample-out
Guoying Zhao 57 / 69

58 Dynamic facial expression: Cohn-Kanade (medium)
- 372 facial expression sequences
- Six facial expressions (anger, disgust, fear, happiness, sadness and surprise)
- Protocol: leave-one-subject-out
Guoying Zhao 58 / 69

59 MAHNOB (hard)
- Spontaneous facial expression, video
- Nine emotion keywords; valence and arousal mapping
- Selected session: 267 samples from 14 subjects
- Protocol: leave-one-subject-out; valence & arousal dimensions
Table. The emotion keywords mapped into three levels on arousal and valence.
Arousal classes:
- Calm: sadness, disgust, neutral
- Medium arousal: joy and happiness, amusement
- Excited/Activated: surprise, fear, anger, anxiety
Valence classes:
- Unpleasant: fear, anger, disgust, sadness, anxiety
- Neutral valence: surprise, neutral
- Pleasant: joy and happiness, amusement
Guoying Zhao 59 / 69

60 AFEW (hard)
- Dynamic temporal facial expression data
- Close to real-world conditions
- Seven facial expressions
- Audio & visual
- More difficult
- Protocol: 10-fold cross-validation
Guoying Zhao 60 / 69

61 Emotional speech databases
Available speech databases:
- MediaTeam emotional speech corpus (easy): Finnish, acted (stereotypical), single modal, basic emotions (4-7 discrete classes), proprietary
- Hytke (challenging): Finnish, spontaneous (emphasized), multimodal, SAM scale (dimensional annotation: valence, activation); speech, facial video, heart rate (RR), eye tracking, posture
Guoying Zhao 61 / 69

62 Available biosignal databases
- DEAP: electroencephalogram (EEG) and peripheral physiological signals (also some facial videos); dimensional annotation (self-report and online ratings)
- Database descriptions: me.html, ac_special_issue_2011.pdf
Guoying Zhao 62 / 69

63 Emotion recognition from multiple modalities
Feature extraction:
- Video/image: LBP, LBP-TOP, shape, etc.
- Speech: acoustic features, prosodic features, etc.
- Bio-signals: EEG: spectral power (SP) features, spectral power difference (SPD) features, etc.
Fusion:
- Decision-level, on classifiers: SUM, MAX, PRODUCT, MIN, MEAN, etc.
- Feature-level: multiple kernel learning (MKL), canonical correlation analysis (CCA), etc.
Guoying Zhao 63 / 69
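The two fusion levels differ in where the modalities meet. A minimal sketch (not from the course materials; the probability values are hypothetical) of the decision-level SUM and PRODUCT rules, with feature-level fusion shown as simple concatenation:

```python
# Sketch: decision-level fusion combines per-modality classifier
# outputs; feature-level fusion combines features before classification.
import numpy as np

# Hypothetical posterior probabilities over 3 emotion classes
# from two modality-specific classifiers, for one sample.
p_face   = np.array([0.5, 0.3, 0.2])
p_speech = np.array([0.2, 0.6, 0.2])

sum_fused  = (p_face + p_speech) / 2   # SUM (mean) rule
prod_fused = p_face * p_speech
prod_fused /= prod_fused.sum()         # PRODUCT rule, renormalized

print(np.argmax(sum_fused), np.argmax(prod_fused))

# Feature-level fusion: concatenate modality feature vectors and
# train one classifier on the joint representation.
f_face, f_speech = np.ones(59), np.ones(13)  # stand-in feature dims
joint = np.concatenate([f_face, f_speech])   # length 72
```

The PRODUCT rule assumes the modalities are conditionally independent and is harsher on disagreements; the SUM rule is more robust when one modality's estimate is noisy, which is one reason the simple averaging rules remain common baselines.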

64 Multi-modal Databases
Available databases:
- AFEW (challenging): spontaneous, multimodal, seven categorized emotions; facial video & speech; 325 video clips with seven emotions (surprise, sadness, neutral, happiness, fear, disgust and anger)
- Hytke (challenging): 24 females; Finnish, spontaneous (emphasized), multimodal, SAM scale (dimensional annotation: valence and activation); speech, facial video, heart rate, eye tracking, posture
Guoying Zhao 64 / 69

65 Available databases - Multimodal databases
- MAHNOB (challenging): spontaneous, multimodal, dimensional annotation (valence and arousal); facial video, EEG signal (32 channels), ECG, respiration amplitude and skin temperature
- The physiological signals are stored in BDF/MAT format, readable by EEGLAB, Matlab, EDFbrowser, etc. (We provide BDF and MAT formats for the physiological signals.)
Guoying Zhao 65 / 69

66 Facial expression tools
- Face detection: _face_detection/py_face_detection.html
- Appearance features: LBP/LBP-TOP
- Landmark detection for geometric features: Discriminative Response Map Fitting (DRMF): /; ASM, AAM
Guoying Zhao 66 / 69

67 Speech tools (GPL licenses)
- Praat: phonetics toolbox/software, for prosody and general speech analysis
- Voicebox: MATLAB toolbox for speech processing, feature extraction, etc.
Guoying Zhao 67 / 69

68 Biosignal tools (GPL licenses)
- EEGLAB: graphical EEG signal processing environment/toolbox for MATLAB
- BioSig: MATLAB toolbox for biosignal processing, feature extraction, etc.
Guoying Zhao 68 / 69

69 Fusion and Classification
Fusion:
- Decision-level, on classifiers: SUM, MAX, PRODUCT, MIN, MEAN, etc.
- Feature-level: multiple kernel learning (MKL), canonical correlation analysis (CCA), etc.
Classifiers: SVM (Libsvm), HMM, CRF, KNN
Guoying Zhao 69 / 69


Emotion Detection Using Physiological Signals. M.A.Sc. Thesis Proposal Haiyan Xu Supervisor: Prof. K.N. Plataniotis Emotion Detection Using Physiological Signals M.A.Sc. Thesis Proposal Haiyan Xu Supervisor: Prof. K.N. Plataniotis May 10 th, 2011 Outline Emotion Detection Overview EEG for Emotion Detection Previous

More information

Study on Aging Effect on Facial Expression Recognition

Study on Aging Effect on Facial Expression Recognition Study on Aging Effect on Facial Expression Recognition Nora Algaraawi, Tim Morris Abstract Automatic facial expression recognition (AFER) is an active research area in computer vision. However, aging causes

More information

1. INTRODUCTION. Vision based Multi-feature HGR Algorithms for HCI using ISL Page 1

1. INTRODUCTION. Vision based Multi-feature HGR Algorithms for HCI using ISL Page 1 1. INTRODUCTION Sign language interpretation is one of the HCI applications where hand gesture plays important role for communication. This chapter discusses sign language interpretation system with present

More information

Research Proposal on Emotion Recognition

Research Proposal on Emotion Recognition Research Proposal on Emotion Recognition Colin Grubb June 3, 2012 Abstract In this paper I will introduce my thesis question: To what extent can emotion recognition be improved by combining audio and visual

More information

Empirical Cognitive Modeling

Empirical Cognitive Modeling Empirical Cognitive Modeling Tanja Schultz Felix Putze Dominic Heger 24.5.2012 Lecture Cognitive Modeling (SS 2012) 1/49 Outline What are empirical cognitive modeling and recognition of human cognitive

More information

Edge Based Grid Super-Imposition for Crowd Emotion Recognition

Edge Based Grid Super-Imposition for Crowd Emotion Recognition Edge Based Grid Super-Imposition for Crowd Emotion Recognition Amol S Patwardhan 1 1Senior Researcher, VIT, University of Mumbai, 400037, India ---------------------------------------------------------------------***---------------------------------------------------------------------

More information

Bio-Feedback Based Simulator for Mission Critical Training

Bio-Feedback Based Simulator for Mission Critical Training Bio-Feedback Based Simulator for Mission Critical Training Igor Balk Polhemus, 40 Hercules drive, Colchester, VT 05446 +1 802 655 31 59 x301 balk@alum.mit.edu Abstract. The paper address needs for training

More information

Practical Approaches to Comforting Users with Relational Agents

Practical Approaches to Comforting Users with Relational Agents Practical Approaches to Comforting Users with Relational Agents Timothy Bickmore, Daniel Schulman Northeastern University College of Computer and Information Science 360 Huntington Avenue, WVH 202, Boston,

More information

Recognising Emotions from Keyboard Stroke Pattern

Recognising Emotions from Keyboard Stroke Pattern Recognising Emotions from Keyboard Stroke Pattern Preeti Khanna Faculty SBM, SVKM s NMIMS Vile Parle, Mumbai M.Sasikumar Associate Director CDAC, Kharghar Navi Mumbai ABSTRACT In day to day life, emotions

More information

Speech as HCI. HCI Lecture 11. Human Communication by Speech. Speech as HCI(cont. 2) Guest lecture: Speech Interfaces

Speech as HCI. HCI Lecture 11. Human Communication by Speech. Speech as HCI(cont. 2) Guest lecture: Speech Interfaces HCI Lecture 11 Guest lecture: Speech Interfaces Hiroshi Shimodaira Institute for Communicating and Collaborative Systems (ICCS) Centre for Speech Technology Research (CSTR) http://www.cstr.ed.ac.uk Thanks

More information

Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition

Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition , pp.131-135 http://dx.doi.org/10.14257/astl.2013.39.24 Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition SeungTaek Ryoo and Jae-Khun Chang School of Computer Engineering

More information

A framework for the Recognition of Human Emotion using Soft Computing models

A framework for the Recognition of Human Emotion using Soft Computing models A framework for the Recognition of Human Emotion using Soft Computing models Md. Iqbal Quraishi Dept. of Information Technology Kalyani Govt Engg. College J Pal Choudhury Dept. of Information Technology

More information

Sociable Robots Peeping into the Human World

Sociable Robots Peeping into the Human World Sociable Robots Peeping into the Human World An Infant s Advantages Non-hostile environment Actively benevolent, empathic caregiver Co-exists with mature version of self Baby Scheme Physical form can evoke

More information

COMBINING CATEGORICAL AND PRIMITIVES-BASED EMOTION RECOGNITION. University of Southern California (USC), Los Angeles, CA, USA

COMBINING CATEGORICAL AND PRIMITIVES-BASED EMOTION RECOGNITION. University of Southern California (USC), Los Angeles, CA, USA COMBINING CATEGORICAL AND PRIMITIVES-BASED EMOTION RECOGNITION M. Grimm 1, E. Mower 2, K. Kroschel 1, and S. Narayanan 2 1 Institut für Nachrichtentechnik (INT), Universität Karlsruhe (TH), Karlsruhe,

More information

Affect Intensity Estimation using Multiple Modalities

Affect Intensity Estimation using Multiple Modalities Affect Intensity Estimation using Multiple Modalities Amol S. Patwardhan, and Gerald M. Knapp Department of Mechanical and Industrial Engineering Louisiana State University apatwa3@lsu.edu Abstract One

More information

Facial Expression Recognition Using Principal Component Analysis

Facial Expression Recognition Using Principal Component Analysis Facial Expression Recognition Using Principal Component Analysis Ajit P. Gosavi, S. R. Khot Abstract Expression detection is useful as a non-invasive method of lie detection and behaviour prediction. However,

More information

The Ordinal Nature of Emotions. Georgios N. Yannakakis, Roddy Cowie and Carlos Busso

The Ordinal Nature of Emotions. Georgios N. Yannakakis, Roddy Cowie and Carlos Busso The Ordinal Nature of Emotions Georgios N. Yannakakis, Roddy Cowie and Carlos Busso The story It seems that a rank-based FeelTrace yields higher inter-rater agreement Indeed, FeelTrace should actually

More information

A micropower support vector machine based seizure detection architecture for embedded medical devices

A micropower support vector machine based seizure detection architecture for embedded medical devices A micropower support vector machine based seizure detection architecture for embedded medical devices The MIT Faculty has made this article openly available. Please share how this access benefits you.

More information

IEEE TRANSACTIONS ON AUDIO, SPEECH, AND LANGUAGE PROCESSING, VOL. 19, NO. 5, JULY

IEEE TRANSACTIONS ON AUDIO, SPEECH, AND LANGUAGE PROCESSING, VOL. 19, NO. 5, JULY IEEE TRANSACTIONS ON AUDIO, SPEECH, AND LANGUAGE PROCESSING, VOL. 19, NO. 5, JULY 2011 1057 A Framework for Automatic Human Emotion Classification Using Emotion Profiles Emily Mower, Student Member, IEEE,

More information

Audio-based Emotion Recognition for Advanced Automatic Retrieval in Judicial Domain

Audio-based Emotion Recognition for Advanced Automatic Retrieval in Judicial Domain Audio-based Emotion Recognition for Advanced Automatic Retrieval in Judicial Domain F. Archetti 1,2, G. Arosio 1, E. Fersini 1, E. Messina 1 1 DISCO, Università degli Studi di Milano-Bicocca, Viale Sarca,

More information

PHYSIOLOGICAL RESEARCH

PHYSIOLOGICAL RESEARCH DOMAIN STUDIES PHYSIOLOGICAL RESEARCH In order to understand the current landscape of psychophysiological evaluation methods, we conducted a survey of academic literature. We explored several different

More information

Estimating Intent for Human-Robot Interaction

Estimating Intent for Human-Robot Interaction Estimating Intent for Human-Robot Interaction D. Kulić E. A. Croft Department of Mechanical Engineering University of British Columbia 2324 Main Mall Vancouver, BC, V6T 1Z4, Canada Abstract This work proposes

More information

Formulating Emotion Perception as a Probabilistic Model with Application to Categorical Emotion Classification

Formulating Emotion Perception as a Probabilistic Model with Application to Categorical Emotion Classification Formulating Emotion Perception as a Probabilistic Model with Application to Categorical Emotion Classification Reza Lotfian and Carlos Busso Multimodal Signal Processing (MSP) lab The University of Texas

More information

Effect of Sensor Fusion for Recognition of Emotional States Using Voice, Face Image and Thermal Image of Face

Effect of Sensor Fusion for Recognition of Emotional States Using Voice, Face Image and Thermal Image of Face Effect of Sensor Fusion for Recognition of Emotional States Using Voice, Face Image and Thermal Image of Face Yasunari Yoshitomi 1, Sung-Ill Kim 2, Takako Kawano 3 and Tetsuro Kitazoe 1 1:Department of

More information

On Shape And the Computability of Emotions X. Lu, et al.

On Shape And the Computability of Emotions X. Lu, et al. On Shape And the Computability of Emotions X. Lu, et al. MICC Reading group 10.07.2013 1 On Shape and the Computability of Emotion X. Lu, P. Suryanarayan, R. B. Adams Jr., J. Li, M. G. Newman, J. Z. Wang

More information

Affect Recognition for Interactive Companions

Affect Recognition for Interactive Companions Affect Recognition for Interactive Companions Ginevra Castellano School of Electronic Engineering and Computer Science Queen Mary University of London, UK ginevra@dcs.qmul.ac.uk Ruth Aylett School of Maths

More information

Computational models of emotion

Computational models of emotion HUMAINE Plenary Newcastle, May 24-27, 2004 Computational models of emotion Klaus R. Scherer University of Posing the problem: Three types of computational models Appraisal cirteria Integration rules Sequential

More information

AUDIO-VISUAL EMOTION RECOGNITION USING AN EMOTION SPACE CONCEPT

AUDIO-VISUAL EMOTION RECOGNITION USING AN EMOTION SPACE CONCEPT 16th European Signal Processing Conference (EUSIPCO 28), Lausanne, Switzerland, August 25-29, 28, copyright by EURASIP AUDIO-VISUAL EMOTION RECOGNITION USING AN EMOTION SPACE CONCEPT Ittipan Kanluan, Michael

More information

Human Factors & User Experience LAB

Human Factors & User Experience LAB Human Factors & User Experience LAB User Experience (UX) User Experience is the experience a user has when interacting with your product, service or application Through research and observation of users

More information

Speech Group, Media Laboratory

Speech Group, Media Laboratory Speech Group, Media Laboratory The Speech Research Group of the M.I.T. Media Laboratory is concerned with understanding human speech communication and building systems capable of emulating conversational

More information

SPEECH EMOTION RECOGNITION: ARE WE THERE YET?

SPEECH EMOTION RECOGNITION: ARE WE THERE YET? SPEECH EMOTION RECOGNITION: ARE WE THERE YET? CARLOS BUSSO Multimodal Signal Processing (MSP) lab The University of Texas at Dallas Erik Jonsson School of Engineering and Computer Science Why study emotion

More information

Emotion Recognition Modulating the Behavior of Intelligent Systems

Emotion Recognition Modulating the Behavior of Intelligent Systems 2013 IEEE International Symposium on Multimedia Emotion Recognition Modulating the Behavior of Intelligent Systems Asim Smailagic, Daniel Siewiorek, Alex Rudnicky, Sandeep Nallan Chakravarthula, Anshuman

More information

PERFORMANCE ANALYSIS OF THE TECHNIQUES EMPLOYED ON VARIOUS DATASETS IN IDENTIFYING THE HUMAN FACIAL EMOTION

PERFORMANCE ANALYSIS OF THE TECHNIQUES EMPLOYED ON VARIOUS DATASETS IN IDENTIFYING THE HUMAN FACIAL EMOTION PERFORMANCE ANALYSIS OF THE TECHNIQUES EMPLOYED ON VARIOUS DATASETS IN IDENTIFYING THE HUMAN FACIAL EMOTION Usha Mary Sharma 1, Jayanta Kumar Das 2, Trinayan Dutta 3 1 Assistant Professor, 2,3 Student,

More information

Introduction to Computational Neuroscience

Introduction to Computational Neuroscience Introduction to Computational Neuroscience Lecture 5: Data analysis II Lesson Title 1 Introduction 2 Structure and Function of the NS 3 Windows to the Brain 4 Data analysis 5 Data analysis II 6 Single

More information

Affect in Virtual Agents (and Robots) Professor Beste Filiz Yuksel University of San Francisco CS 686/486

Affect in Virtual Agents (and Robots) Professor Beste Filiz Yuksel University of San Francisco CS 686/486 Affect in Virtual Agents (and Robots) Professor Beste Filiz Yuksel University of San Francisco CS 686/486 Software / Virtual Agents and Robots Affective Agents Computer emotions are of primary interest

More information

User Affective State Assessment for HCI Systems

User Affective State Assessment for HCI Systems Association for Information Systems AIS Electronic Library (AISeL) AMCIS 2004 Proceedings Americas Conference on Information Systems (AMCIS) 12-31-2004 Xiangyang Li University of Michigan-Dearborn Qiang

More information

EMOTION CLASSIFICATION: HOW DOES AN AUTOMATED SYSTEM COMPARE TO NAÏVE HUMAN CODERS?

EMOTION CLASSIFICATION: HOW DOES AN AUTOMATED SYSTEM COMPARE TO NAÏVE HUMAN CODERS? EMOTION CLASSIFICATION: HOW DOES AN AUTOMATED SYSTEM COMPARE TO NAÏVE HUMAN CODERS? Sefik Emre Eskimez, Kenneth Imade, Na Yang, Melissa Sturge- Apple, Zhiyao Duan, Wendi Heinzelman University of Rochester,

More information

SPEECH TO TEXT CONVERTER USING GAUSSIAN MIXTURE MODEL(GMM)

SPEECH TO TEXT CONVERTER USING GAUSSIAN MIXTURE MODEL(GMM) SPEECH TO TEXT CONVERTER USING GAUSSIAN MIXTURE MODEL(GMM) Virendra Chauhan 1, Shobhana Dwivedi 2, Pooja Karale 3, Prof. S.M. Potdar 4 1,2,3B.E. Student 4 Assitant Professor 1,2,3,4Department of Electronics

More information

Temporal Context and the Recognition of Emotion from Facial Expression

Temporal Context and the Recognition of Emotion from Facial Expression Temporal Context and the Recognition of Emotion from Facial Expression Rana El Kaliouby 1, Peter Robinson 1, Simeon Keates 2 1 Computer Laboratory University of Cambridge Cambridge CB3 0FD, U.K. {rana.el-kaliouby,

More information

MSAS: An M-mental health care System for Automatic Stress detection

MSAS: An M-mental health care System for Automatic Stress detection Quarterly of Clinical Psychology Studies Allameh Tabataba i University Vol. 7, No. 28, Fall 2017, Pp 87-94 MSAS: An M-mental health care System for Automatic Stress detection Saeid Pourroostaei Ardakani*

More information

Emotion-Aware Machines

Emotion-Aware Machines Emotion-Aware Machines Saif Mohammad, Senior Research Officer National Research Council Canada 1 Emotion-Aware Machines Saif Mohammad National Research Council Canada 2 What does it mean for a machine

More information

Affective pictures and emotion analysis of facial expressions with local binary pattern operator: Preliminary results

Affective pictures and emotion analysis of facial expressions with local binary pattern operator: Preliminary results Affective pictures and emotion analysis of facial expressions with local binary pattern operator: Preliminary results Seppo J. Laukka 1, Antti Rantanen 1, Guoying Zhao 2, Matti Taini 2, Janne Heikkilä

More information

Active User Affect Recognition and Assistance

Active User Affect Recognition and Assistance Active User Affect Recognition and Assistance Wenhui Liao, Zhiwe Zhu, Markus Guhe*, Mike Schoelles*, Qiang Ji, and Wayne Gray* Email: jiq@rpi.edu Department of Electrical, Computer, and System Eng. *Department

More information

Divide-and-Conquer based Ensemble to Spot Emotions in Speech using MFCC and Random Forest

Divide-and-Conquer based Ensemble to Spot Emotions in Speech using MFCC and Random Forest Published as conference paper in The 2nd International Integrated Conference & Concert on Convergence (2016) Divide-and-Conquer based Ensemble to Spot Emotions in Speech using MFCC and Random Forest Abdul

More information

HUMAN EMOTION DETECTION THROUGH FACIAL EXPRESSIONS

HUMAN EMOTION DETECTION THROUGH FACIAL EXPRESSIONS th June. Vol.88. No. - JATIT & LLS. All rights reserved. ISSN: -8 E-ISSN: 87- HUMAN EMOTION DETECTION THROUGH FACIAL EXPRESSIONS, KRISHNA MOHAN KUDIRI, ABAS MD SAID AND M YUNUS NAYAN Computer and Information

More information

Emotionally Augmented Storytelling Agent

Emotionally Augmented Storytelling Agent Emotionally Augmented Storytelling Agent The Effects of Dimensional Emotion Modeling for Agent Behavior Control Sangyoon Lee 1(&), Andrew E. Johnson 2, Jason Leigh 2, Luc Renambot 2, Steve Jones 3, and

More information

Classification of valence using facial expressions of TV-viewers

Classification of valence using facial expressions of TV-viewers Classification of valence using facial expressions of TV-viewers Master s Thesis Yorick H. Holkamp Classification of valence using facial expressions of TV-viewers THESIS submitted in partial fulfillment

More information

This is the accepted version of this article. To be published as : This is the author version published as:

This is the accepted version of this article. To be published as : This is the author version published as: QUT Digital Repository: http://eprints.qut.edu.au/ This is the author version published as: This is the accepted version of this article. To be published as : This is the author version published as: Chew,

More information

INTER-RATER RELIABILITY OF ACTUAL TAGGED EMOTION CATEGORIES VALIDATION USING COHEN S KAPPA COEFFICIENT

INTER-RATER RELIABILITY OF ACTUAL TAGGED EMOTION CATEGORIES VALIDATION USING COHEN S KAPPA COEFFICIENT INTER-RATER RELIABILITY OF ACTUAL TAGGED EMOTION CATEGORIES VALIDATION USING COHEN S KAPPA COEFFICIENT 1 NOR RASHIDAH MD JUREMI, 2 *MOHD ASYRAF ZULKIFLEY, 3 AINI HUSSAIN, 4 WAN MIMI DIYANA WAN ZAKI Department

More information

Person Perception. Forming Impressions of Others. Mar 5, 2012, Banu Cingöz Ulu

Person Perception. Forming Impressions of Others. Mar 5, 2012, Banu Cingöz Ulu Person Perception Forming Impressions of Others Mar 5, 2012, Banu Cingöz Ulu Person Perception person perception: how we come to know about others temporary states, emotions, intentions and desires impression

More information

R Jagdeesh Kanan* et al. International Journal of Pharmacy & Technology

R Jagdeesh Kanan* et al. International Journal of Pharmacy & Technology ISSN: 0975-766X CODEN: IJPTFI Available Online through Research Article www.ijptonline.com FACIAL EMOTION RECOGNITION USING NEURAL NETWORK Kashyap Chiranjiv Devendra, Azad Singh Tomar, Pratigyna.N.Javali,

More information

EMOTIONS are one of the most essential components of

EMOTIONS are one of the most essential components of 1 Hidden Markov Model for Emotion Detection in Speech Cyprien de Lichy, Pratyush Havelia, Raunaq Rewari Abstract This paper seeks to classify speech inputs into emotion labels. Emotions are key to effective

More information

Integrating User Affective State Assessment in Enhancing HCI: Review and Proposition

Integrating User Affective State Assessment in Enhancing HCI: Review and Proposition 192 The Open Cybernetics and Systemics Journal, 2008, 2, 192-205 Open Access Integrating User Affective State Assessment in Enhancing HCI: Review and Proposition Xiangyang Li * Department of Industrial

More information

Observer Annotation of Affective Display and Evaluation of Expressivity: Face vs. Face-and-Body

Observer Annotation of Affective Display and Evaluation of Expressivity: Face vs. Face-and-Body Observer Annotation of Affective Display and Evaluation of Expressivity: Face vs. Face-and-Body Hatice Gunes and Massimo Piccardi Faculty of Information Technology, University of Technology, Sydney (UTS)

More information

PSYC 222 Motivation and Emotions

PSYC 222 Motivation and Emotions PSYC 222 Motivation and Emotions Session 6 The Concept of Emotion Lecturer: Dr. Annabella Osei-Tutu, Psychology Department Contact Information: aopare-henaku@ug.edu.gh College of Education School of Continuing

More information

Facial Expression Biometrics Using Tracker Displacement Features

Facial Expression Biometrics Using Tracker Displacement Features Facial Expression Biometrics Using Tracker Displacement Features Sergey Tulyakov 1, Thomas Slowe 2,ZhiZhang 1, and Venu Govindaraju 1 1 Center for Unified Biometrics and Sensors University at Buffalo,

More information

Facial Emotion Recognition with Facial Analysis

Facial Emotion Recognition with Facial Analysis Facial Emotion Recognition with Facial Analysis İsmail Öztel, Cemil Öz Sakarya University, Faculty of Computer and Information Sciences, Computer Engineering, Sakarya, Türkiye Abstract Computer vision

More information

MODULE 41: THEORIES AND PHYSIOLOGY OF EMOTION

MODULE 41: THEORIES AND PHYSIOLOGY OF EMOTION MODULE 41: THEORIES AND PHYSIOLOGY OF EMOTION EMOTION: a response of the whole organism, involving 1. physiological arousal 2. expressive behaviors, and 3. conscious experience A mix of bodily arousal

More information

Smart Sensor Based Human Emotion Recognition

Smart Sensor Based Human Emotion Recognition Smart Sensor Based Human Emotion Recognition 1 2 3 4 5 6 7 8 9 1 11 12 13 14 15 16 17 18 19 2 21 22 23 24 25 26 27 28 29 3 31 32 33 34 35 36 37 38 39 4 41 42 Abstract A smart sensor based emotion recognition

More information

Decision tree SVM model with Fisher feature selection for speech emotion recognition

Decision tree SVM model with Fisher feature selection for speech emotion recognition Sun et al. EURASIP Journal on Audio, Speech, and Music Processing (2019) 2019:2 https://doi.org/10.1186/s13636-018-0145-5 RESEARCH Decision tree SVM model with Fisher feature selection for speech emotion

More information

EMOTION RECOGNITION FROM USERS EEG SIGNALS WITH THE HELP OF STIMULUS VIDEOS

EMOTION RECOGNITION FROM USERS EEG SIGNALS WITH THE HELP OF STIMULUS VIDEOS EMOTION RECOGNITION FROM USERS EEG SIGNALS WITH THE HELP OF STIMULUS VIDEOS Yachen Zhu, Shangfei Wang School of Computer Science and Technology University of Science and Technology of China 2327, Hefei,

More information

A Deep Learning Approach for Subject Independent Emotion Recognition from Facial Expressions

A Deep Learning Approach for Subject Independent Emotion Recognition from Facial Expressions A Deep Learning Approach for Subject Independent Emotion Recognition from Facial Expressions VICTOR-EMIL NEAGOE *, ANDREI-PETRU BĂRAR *, NICU SEBE **, PAUL ROBITU * * Faculty of Electronics, Telecommunications

More information

Sound Texture Classification Using Statistics from an Auditory Model

Sound Texture Classification Using Statistics from an Auditory Model Sound Texture Classification Using Statistics from an Auditory Model Gabriele Carotti-Sha Evan Penn Daniel Villamizar Electrical Engineering Email: gcarotti@stanford.edu Mangement Science & Engineering

More information

Audio-Visual Emotion Recognition in Adult Attachment Interview

Audio-Visual Emotion Recognition in Adult Attachment Interview Audio-Visual Emotion Recognition in Adult Attachment Interview Zhihong Zeng, Yuxiao Hu, Glenn I. Roisman, Zhen Wen, Yun Fu and Thomas S. Huang University of Illinois at Urbana-Champaign IBM T.J.Watson

More information

Modeling and Recognizing Emotions from Audio Signals: A Review

Modeling and Recognizing Emotions from Audio Signals: A Review Modeling and Recognizing Emotions from Audio Signals: A Review 1 Ritu Tanwar, 2 Deepti Chaudhary 1 UG Scholar, 2 Assistant Professor, UIET, Kurukshetra University, Kurukshetra, Haryana, India ritu.tanwar2012@gmail.com,

More information

EMOTIONAL INTELLIGENCE QUESTIONNAIRE

EMOTIONAL INTELLIGENCE QUESTIONNAIRE EMOTIONAL INTELLIGENCE QUESTIONNAIRE Personal Report JOHN SMITH 2017 MySkillsProfile. All rights reserved. Introduction The EIQ16 measures aspects of your emotional intelligence by asking you questions

More information

Bayesian Networks for Modeling Emotional State and Personality: Progress Report

Bayesian Networks for Modeling Emotional State and Personality: Progress Report From: AAAI Technical Report FS-98-03. Compilation copyright 1998, AAAI (www.aaai.org). All rights reserved. Bayesian Networks for Modeling Emotional State and Personality: Progress Report Jack Breese Gene

More information

Affective Dialogue Communication System with Emotional Memories for Humanoid Robots

Affective Dialogue Communication System with Emotional Memories for Humanoid Robots Affective Dialogue Communication System with Emotional Memories for Humanoid Robots M. S. Ryoo *, Yong-ho Seo, Hye-Won Jung, and H. S. Yang Artificial Intelligence and Media Laboratory Department of Electrical

More information

EMOTIONS RECOGNITION BY SPEECH AND FACIAL EXPRESSIONS ANALYSIS

EMOTIONS RECOGNITION BY SPEECH AND FACIAL EXPRESSIONS ANALYSIS 17th European Signal Processing Conference (EUSIPCO 2009) Glgow, Scotland, August 24-28, 2009 EMOTIONS RECOGNITION BY SPEECH AND FACIAL EXPRESSIONS ANALYSIS Simina Emerich 1, Eugen Lupu 1, Anca Apatean

More information

Emotions. These aspects are generally stronger in emotional responses than with moods. The duration of emotions tend to be shorter than moods.

Emotions. These aspects are generally stronger in emotional responses than with moods. The duration of emotions tend to be shorter than moods. LP 8D emotions & James/Lange 1 Emotions An emotion is a complex psychological state that involves subjective experience, physiological response, and behavioral or expressive responses. These aspects are

More information

CHAPTER 1 MULTIMODAL EMOTION RECOGNITION

CHAPTER 1 MULTIMODAL EMOTION RECOGNITION CHAPTER 1 MULTIMODAL EMOTION RECOGNITION Nicu Sebe 1, Ira Cohen 2, and Thomas S. Huang 3 1 Faculty of Science, University of Amsterdam, The Netherlands E-mail: nicu@science.uva.nl 2 HP Labs, USA E-mail:

More information

Sound Analysis Research at LabROSA

Sound Analysis Research at LabROSA Sound Analysis Research at LabROSA Dan Ellis Laboratory for Recognition and Organization of Speech and Audio Dept. Electrical Eng., Columbia Univ., NY USA dpwe@ee.columbia.edu http://labrosa.ee.columbia.edu/

More information

A Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection

A Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection A Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection Tobias Gehrig and Hazım Kemal Ekenel Facial Image Processing and Analysis Group, Institute for Anthropomatics Karlsruhe

More information

Emotional intelligence in interactive systems

Emotional intelligence in interactive systems To appear in Design and Emotion, (Eds., D. McDonagh, P. Hekkert, J. van Erp and D. Gyi (pp.262-266) London: Taylor & Francis Emotional intelligence in interactive systems Antonella De Angeli and Graham

More information

Emotion classification using linear predictive features on wavelet-decomposed EEG data*

Emotion classification using linear predictive features on wavelet-decomposed EEG data* Emotion classification using linear predictive features on wavelet-decomposed EEG data* Luka Kraljević, Student Member, IEEE, Mladen Russo, Member, IEEE, and Marjan Sikora Abstract Emotions play a significant

More information

User Experience: Beyond Cognition and Emotion. Matthias RAUTERBERG Eindhoven University of Technology TU/e The Netherlands 2014

User Experience: Beyond Cognition and Emotion. Matthias RAUTERBERG Eindhoven University of Technology TU/e The Netherlands 2014 User Experience: Beyond Cognition and Emotion Matthias RAUTERBERG Eindhoven University of Technology TU/e The Netherlands 2014 Interaction Paradigms in Computing Cultural computing Cross cultural-interaction

More information