Social Robots and Human-Robot Interaction. Ana Paiva. Lecture 5: Social Robots Recognising and Understanding Others


1 Social Robots and Human-Robot Interaction. Ana Paiva. Lecture 5: Social Robots Recognising and Understanding Others

2 Our goal: build social intelligence.

3 Scenarios we are interested in: building social intelligence, with a focus on the interaction.

4 The problem: based on a robot's limited perception, how can we build technology that understands the social situation and the user's (and other agents') affective, social, motivational and informational states, so that the robot can respond in a socially appropriate manner?

5 Machine sensing and automatic understanding of human behaviors

6 Action analysis. User analysis. Interaction analysis.

7 Scientific and Engineering Issues*
Which types of messages are communicated by behavioral/social signals? This question relates to psychological issues concerning the nature of behavioral signals and the best way to interpret them.
Which human communicative cues convey information about a certain type of behavioral signal? This issue shapes the choice of modalities to be included in an automatic analyzer of human behavioral signals.
How are various kinds of evidence to be combined to optimize inferences about shown behavioral signals? This question relates to issues such as how to distinguish between different types of messages, how best to integrate information across modalities, and what to take into account to realize context-aware interpretations.
*Pantic, M., Pentland, A., Nijholt, A., & Huang, T. S. (2007). Human computing and machine understanding of human behavior: a survey. In Artificial Intelligence for Human Computing. Springer Berlin Heidelberg.

8 Which types of messages are communicated by behavioral/social signals?*
affective/attitudinal states (e.g. fear, joy, inattention, stress);
manipulators (actions used to act on objects in the environment, or self-manipulative actions like scratching and lip biting);
emblems (culture-specific interactive signals like a wink or a thumbs up);
illustrators (actions accompanying speech, such as finger pointing and raised eyebrows);
regulators (conversational mediators such as the exchange of a look, palm pointing, head nods and smiles).
*Pantic et al. (2007).

9 Human Sensing*
Sensing human behavioral signals, including facial expressions, body gestures, nonlinguistic vocalizations and vocal intonations, is essential to the human judgment of behavioral cues and involves a number of tasks.
Face: face detection and location, head and face tracking, eye-gaze tracking, and facial expression analysis.
Body: body detection and tracking, hand tracking, and recognition of postures, gestures and activity.
Vocal nonlinguistic signals: estimation of auditory features such as pitch, intensity and speech rate, and recognition of nonlinguistic vocalizations like laughs, cries, sighs and coughs.
*Pantic et al. (2007).

10 Beyond the four W's
Who? (Who is the user?)
Where? (Where is the user?)
What? (What is the current task of the user?)
How? (How is the information passed on? Which behavioral signals have been displayed?)
When? (What is the timing of the displayed behavioral signals with respect to changes in the environment? Are there any co-occurrences of signals?)
Why? (What may be the user's reasons for displaying the observed cues?)

11 Faces Gestures Actions

12 Social Signals: five types of nonverbal behaviour (Ekman & Friesen, 1969)
Emblems: gestures directly translated to words (e.g. the peace sign); meanings differ across cultures (e.g. giving the finger).
Illustrators/iconic gestures: gestures that accompany speech to make it vivid, visual or emphatic (e.g. illustrating the size of something, or throwing a ball); these also include some facial expressions.

13 Social Signals: five types of nonverbal behaviour (continued)
Regulators: nonverbal behaviours used to coordinate conversation, such as head nods and looking at or orienting the body towards someone.
Self-adaptors: nervous behaviours that release nervous energy, such as touching the face, tugging hair or biting lips; unconscious (and may elicit suspicion).
Displays of emotion: through the face, voice, body and touch.

14 Looking at faces. Automatic face analysis includes:
- face detection and recognition (detect who and where the user is);
- facial expression recognition (detect how the user is feeling);
- gaze analysis.
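As an illustration of the first of these tasks, here is a minimal face-detection sketch using OpenCV's bundled Haar cascade. It is one classic approach among many (modern pipelines often use CNN-based detectors); the blank frame stands in for a real camera image.

```python
import cv2
import numpy as np

# OpenCV ships a pre-trained Haar cascade for frontal faces.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = np.zeros((240, 320, 3), dtype=np.uint8)   # stand-in for a camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# One (x, y, w, h) box per detected face; empty for this blank frame.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```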

15 Face detection & recognition
One way of identifying users is through tools and algorithms for person identification. These may involve the following steps:
- Extract low-level features: for each face image, low-level features are extracted (for example normalized pixel values or image gradient directions) and concatenated to form a large feature vector F(I).
- Compute visual traits: for each feature vector F(I), the outputs of n trait classifiers Ci, i = 1...n, are computed to produce a trait vector C(I) for the face. These classifiers may focus on attributes such as gender, age and race, which provide strong cues about a person's identity.
- Perform verification: to decide whether a face matches one already in the system (i.e. whether the new user is the same person), the two trait vectors are compared using a final classifier D.
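A minimal sketch of the last two steps, assuming scikit-learn-style trait classifiers with predict_proba; the pair encoding fed to the final classifier D is an illustrative assumption, not the method from the slide's source.

```python
import numpy as np

def trait_vector(F, trait_classifiers):
    """Apply the n trait classifiers (gender, age, race, ...) to the
    low-level feature vector F(I), producing the trait vector C(I)."""
    return np.array([c.predict_proba([F])[0, 1] for c in trait_classifiers])

def verify(C_a, C_b, D):
    """Final classifier D decides whether two trait vectors belong to
    the same person, here from their element-wise differences."""
    pair = np.abs(C_a - C_b)
    return bool(D.predict([pair])[0])
```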

16 Recognizing Facial Expressions
Goal: categorize input samples into a number of classes (attitudes or emotions).
Approach: apply standard pattern recognition procedures to train a classifier for the detection (Yang et al., 2007).

17 Expressions through the Face

18 Facial Expressions. What does a facial expression show?
The internal physical state of a person; an indication of what he/she is going to do next; plans, expectations and memory; the emotional state.

19 Facial Expressions of Emotion: markers of emotional expression
Expressions of emotion tend to last a couple of seconds (a smile with enjoyment: up to 10 seconds).
A polite smile without emotion can be exceptionally brief (about 1/4 second) or sustained over long periods of time.
Facial expressions of emotion involve involuntary muscle actions that people cannot deliberately produce or suppress.

20 Facial Expression of Emotion: markers of emotional expression, e.g. the Duchenne smile. [Images: non-Duchenne vs. Duchenne smile.]

21 Facial Expression of Emotion: markers of emotional expression, e.g. the Duchenne smile (cheek raiser + lip corner puller). [Images: non-Duchenne vs. Duchenne smile.]

22 Universality of Facial Expressions: first test of the universal hypothesis. 3000 photos of different people; 6 basic emotions.

23 Universality of Facial Expressions: first test of the universal hypothesis. 3000 photos of different people; 6 basic emotions: anger, fear, disgust, surprise, happiness, sadness.

24 Universality of Facial Expressions: Ekman's studies
First test of the universal hypothesis: photos were shown to participants across multiple countries (Japan, Brazil, Argentina, Chile, U.S.). Participants were asked to select, from a list of 6 terms, the emotion term that best matched the emotion being displayed. Participants achieved accuracy rates of 80-90% in all countries.

25 Universality of Facial Expressions: criticism of this first experiment
Participants had seen U.S. television and movies and might have learned American labels for the expressions.

26 Universality of Facial Expressions: Ekman's studies, second experiment
Ekman travelled to Papua New Guinea and lived for 6 months with people of the Fore tribe, who had not seen any movies or magazines, did not speak English, and had minimal exposure to Westerners. Two tasks were designed to evaluate the universality hypothesis.

27 Universality of Facial Expressions
[Chart: recognition accuracy of Fore participants (adults and children) judging Western photos, and of U.S. students judging Fore expressions, for anger, disgust, fear, happiness, sadness and surprise.]

28 Universality of Facial Expressions
Other studies on the universality of facial expressions further confirmed these results: Ekman, 1984, 1993; Elfenbein & Ambady, 2002, 2003; Izard, 1971, 1994.

29 FACS
The Facial Action Coding System (FACS) is a human-observer-based system designed to detect subtle changes in facial features. Viewing videotaped facial behaviour in slow motion, trained observers can manually FACS-code all possible facial displays, which are described in terms of action units (AUs).

30 AUs (Action Units) in FACS

31 Techniques for automatically detecting AUs in faces
Most current work on automated facial expression analysis attempts to recognize a small set of prototypic expressions, such as joy and fear (trained on pictures, with a classifier built for the purpose).
Image analysis techniques:
- extraction of motion information by computing the difference in image intensity between successive frames of an image sequence;
- extraction of edges and lines from facial images to detect furrows and wrinkles.
Problem: prototypic expressions occur infrequently, and human emotions and intentions are more often communicated by changes in one or two discrete features.
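A minimal sketch of the two image analysis techniques just listed, using OpenCV; the synthetic frames stand in for successive frames of a face video.

```python
import cv2
import numpy as np

# Two synthetic grayscale frames standing in for successive video frames.
prev = np.zeros((128, 128), dtype=np.uint8)
curr = prev.copy()
cv2.circle(curr, (64, 64), 20, 255, -1)   # something moved in the new frame

# Motion information: absolute intensity difference between frames.
motion = cv2.absdiff(curr, prev)

# Furrows/wrinkles: edge and line extraction from the current image.
edges = cv2.Canny(curr, threshold1=50, threshold2=150)
print(motion.sum() > 0, edges.any())
```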

32 Recognizing Facial Expressions: what's in a face?

33 Recognizing Poses and Gestures
Poses are static configurations of a person's body (arms, legs): for example, a "stop" pose when interacting with a robot. Other gestures, such as "go away" or "come with me", are motion gestures, defined through a set of motion patterns.

34 Recognizing Poses and Gestures
As with facial expressions, recognizing these gestures involves several steps, one of which is pose analysis. Techniques such as neural networks can be used, where a network is trained on data annotated with the gestures we want to identify, as in the sketch below.
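A minimal sketch of such a pose classifier, assuming each training sample is a flattened vector of body-joint coordinates with an annotated pose label; the random data merely stands in for a real annotated corpus.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in data: one row per frame of flattened joint
# coordinates; labels are the annotated poses we want to identify.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 45))            # e.g. 15 joints x (x, y, z)
y = rng.choice(["stop", "neutral"], 200)  # hypothetical pose labels

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)
clf.fit(X, y)
print(clf.predict(X[:1]))                 # classify a new pose frame
```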

35 Identifying Human Actions
Human actions usually involve human-object interactions, with articulated motions and complex temporal structures. Actions are spatio-temporal patterns. Issues in action recognition:
- the extraction and representation of suitable spatio-temporal features, and
- the modeling and learning of dynamical patterns.

36 Human Action Analysis
Input:
- Video-based methods (the most used).
- Multi-camera motion capture (MoCap) systems: can produce accurate 3D joint positions, but require special, very expensive equipment; developing a marker-free motion capture system using regular video sensors is still a challenging problem.
- Depth cameras can be used for motion capture with reasonable results (although the data are noisy and occlusion occurs).
Because of the difference in motion data quality, action recognition methods designed for MoCap data might not be suitable for depth cameras.

37 Actions

38 Social Signals
Engagement & attention; social attitudes (e.g. dominance, conflict); personality; intentions and plans; emotional states (anger, happiness, etc.); confusion, stress; informational state (knowledge levels).

39 Data sets for social signal recognition
To train classifiers for the specific attitudes or emotions we want the robot to recognize, data has to be captured and annotated for that attitude/emotion. In general, annotation of the data, for both posed and spontaneous data, is done separately for each attitude, assuming independence between them. For example, for emotion classes the annotation can be done along two dimensions, valence and arousal, yielding the labels: positive activation + positive evaluation; positive activation + negative evaluation; negative activation + negative evaluation; negative activation + positive evaluation; and neutral (close to the centre of the 2D emotional space). Once a corpus of annotated data is created for a certain attitude, a classifier can be built from the extracted features.
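A minimal sketch of mapping a continuous (valence, arousal) annotation to the five labels above; the [-1, 1] scale and the size of the neutral zone are illustrative assumptions.

```python
def quadrant_label(valence, arousal, eps=0.1):
    """Map a (valence, arousal) annotation in [-1, 1]^2 to one of the
    five classes; eps defines the 'neutral' zone around the centre."""
    if abs(valence) < eps and abs(arousal) < eps:
        return "neutral"
    if arousal >= 0:
        return ("positive activation + positive evaluation" if valence >= 0
                else "positive activation + negative evaluation")
    return ("negative activation + positive evaluation" if valence >= 0
            else "negative activation + negative evaluation")

print(quadrant_label(0.7, 0.5))   # positive activation + positive evaluation
```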

40 Kinect sensor and Software Development Kit (SDK)
Kinect is an array of sensors, including a camera and a depth sensor. In addition to the raw depth image, Kinect extracts a 3D virtual skeleton of the body. These capabilities, packed into an affordable and compact device, make it ideal as a sensor for social robots, in particular for gesture analysis.
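A toy gesture test over Kinect-style 3D skeleton joints. The joint names and the dictionary format are assumptions for illustration, not the actual Kinect SDK API.

```python
def is_hand_raised(skeleton):
    """skeleton: dict mapping joint name -> (x, y, z) in metres, with y
    pointing up, as produced by some hypothetical skeleton tracker."""
    return skeleton["hand_right"][1] > skeleton["head"][1]

frame = {"head": (0.0, 1.6, 2.0), "hand_right": (0.2, 1.8, 2.0)}
print(is_hand_raised(frame))   # True: right hand is above the head
```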

41 Case Studies

42 Case Study 1: iCat, the Affective Chess Player

43 Perceiving the user: application-dependent affective states and expressions
User states related to the game and the social interaction with the iCat:
- valence of feeling
- interest towards the iCat
- engagement with the iCat (Poggi, 2007)
- willingness to interact and to maintain the interaction with the iCat

44 The Inter-ACT corpus
Multimodal data: 4 cameras; contextual information. Affect annotation experiment: levels of valence of feeling and interest towards the iCat.
Ginevra Castellano, Iolanda Leite, André Pereira, Carlos Martinho, Ana Paiva, Peter W. McOwan: Multimodal Affect Modeling and Recognition for Empathic Robot Companions. International Journal of Humanoid Robotics 10(1) (2013).

45 The Inter-ACT corpus
Inter-ACT (INTEracting with Robots: Affect, Context, Task) corpus: 156 thin slices of the interaction between 8 children and the iCat robot.

46 Expression detection: a smile detector prototype
Pipeline: behaviour baseline -> extract smile indicators -> trained model -> probability of smile.
Smile indicators: lips bounding box ratio; lip corners distance; mouth bounding box image norm.
Smile detector: based on SVMs (LibSVM; Chang and Lin, 2001); kernel type: radial basis function; trained with 512 samples.
Evaluation: 5 subjects, leave-one-subject-out cross-validation. Accuracy: %
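A minimal sketch of such a detector using scikit-learn's SVC (which wraps LibSVM) rather than LibSVM directly; the random arrays stand in for the 512 annotated samples of the three smile indicators.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic stand-in for the training set: one row per frame with the
# three smile indicators (lips box ratio, corner distance, image norm).
rng = np.random.default_rng(0)
X = rng.normal(size=(512, 3))
y = rng.integers(0, 2, size=512)     # 1 = smile, 0 = no smile

# RBF-kernel SVM with probability outputs, mirroring the slide's setup.
detector = SVC(kernel="rbf", probability=True).fit(X, y)
p_smile = detector.predict_proba(X[:1])[0, 1]
print(f"probability of smile: {p_smile:.2f}")
```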

47 iCat at work

48 Case Study 2

49 Case Study 2*
Scenario: a robot that plays a Tangram game with a user.
Goal: keep the user engaged; to do this, the robot needs to recognise engagement. Engagement is "the process by which two (or more) participants establish, maintain and end their perceived connection during interactions they jointly undertake".
*Rich, C., Ponsler, B., Holroyd, A., & Sidner, C. L. (2010). Recognizing engagement in human-robot interaction. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2010). IEEE.

50 Case Study 2: initial human engagement study
To be able to create a classifier that detects engagement, an initial study of human engagement behavior was conducted, in which pairs of humans sat across an L-shaped table from each other and prepared canapés together. Each of four sessions involved an experimenter (confederate) and two study participants and lasted about minutes.
*Rich et al. (2010).

51 Case Study 2: initial human engagement study
- In the first half of each session, the experimenter instructed the study participant in how to make several different kinds of canapés using combinations of the different crackers, spreads and toppings arrayed on the table.
- The experimenter then left the room and was replaced by a second study participant, who was then taught to make canapés by the first participant.
- All sessions were videotaped using two cameras.
*Rich et al. (2010).

52 Case Study 2: initial human engagement study, analysis
- The videotapes were analysed, looking at the engagement maintenance process.
- During the periods of maintained engagement, the researchers coded where each person was looking at each moment.
*Rich et al. (2010).

53 Case Study 2: initial human engagement study, analysis (continued). *Rich et al. (2010).

54 Case Study 2: development of the social robot
Using the captured data, a system was developed to detect engagement with the robot; a sketch of one possible engagement statistic follows.
*Rich et al. (2010).
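A minimal sketch in the spirit of Rich et al.'s approach, assuming the system logs time-stamped "connection events" (e.g. directed gaze, mutual facial gaze); the statistic and its interpretation here are an assumption, not the paper's exact algorithm.

```python
def mean_time_between_connection_events(timestamps):
    """Engagement pace statistic: mean gap (seconds) between successive
    connection events; a growing gap suggests waning engagement."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return sum(gaps) / len(gaps) if gaps else float("inf")

print(mean_time_between_connection_events([0.0, 2.5, 4.0, 9.5]))  # ~3.17 s
```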

55 Case Study 2: development of the social robot. *Rich et al. (2010).

56 Case Study 2: development of the social robot. *Rich et al. (2010).

57 Case Study 2: development of the social robot. *Rich et al. (2010).

58 Case Study 2: initial human engagement study. *Rich et al. (2010).

59 Case Study 3

60 Case Study 3*
Scenario: a robot that works in a shopping mall helping people.
Goal: understand the users (customers of the shopping mall), be able to remember previous interactions, and understand their actions and movement patterns.
*Glas, D. F., Wada, K., Shiomi, M., Kanda, T., Ishiguro, H., & Hagita, N. (2013). Personal service: a robot that greets people individually based on observed behavior patterns. In Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2013). IEEE Press.

61 Case Study 3, step one: data collection
To obtain data about customers, a human tracking system and two pan-tilt-zoom video cameras were set up at the entrance of a shopping mall. The cameras monitored two sets of sliding doors, and their video feeds were processed by a server running OKAO Vision face-recognition software.
*Glas et al. (2013).

62 Case Study 3. *Glas et al. (2013).

63 Case Study 3
Because people interact with the robot in groups (crowds), studying the type of movement they perform in a shopping mall gives information about the possible actions of the users. Yet safe navigation among people is usually studied from the perspective of constructing a collision-free path, not from that of the behaviour of the crowd itself.
*Glas et al. (2013).

64 Case Study 3*
Approach: simulate the crowd and use the simulation to predict and understand what is actually happening in the real world.
*Kidokoro, H., Kanda, T., Brscic, D., & Shiomi, M. (2013). Will I bother here? A robot anticipating its influence on pedestrian walking comfort. In Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2013). IEEE.

65 Case Study 3: data collection
To calibrate the robot influence model, i.e. to determine the ratio of persons belonging to each behavior category and the parameters of each behavior, pedestrian data was collected in the same target environment. The robot Robovie-II (120 cm tall, 40 cm in diameter, maximum velocity 750 mm/s) roamed back and forth on predefined paths in the corridor and hallway while people's responses were observed and recorded. When a pedestrian stopped in front of the robot, a human operator halted the robot and waited until the pedestrian left. Trajectories and videos of the pedestrians and the robot were collected.
*Kidokoro et al. (2013).

66 Case Study 3: analysis
1. Coding: 1115 trajectories were collected, and a human coder classified all of them into the four behavior categories by looking at the videos and trajectories. Another coder performed a validation coding on 10% of randomly chosen cases; the two classifications were very consistent (Cohen's kappa coefficient of 0.991; see the sketch below).
2. Analysis of features:
- D_interact and D_observe were estimated from trajectories in the "stop to interact" and "stop to observe" categories; on average, people in these categories stopped at distances of 0.92 m (S.D. 0.25) and 1.51 m (S.D. 1.18) from the robot, respectively.
- For the "slow down to look" category, the change of velocity was analyzed; people's velocity around the robot was 76% of their average within 3.0 m of the robot.
- D_notice was set to 10 m, as an upper limit on the distance at which significant interactions can occur.
*Kidokoro et al. (2013).
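A minimal sketch of the inter-coder agreement check, using scikit-learn's implementation of Cohen's kappa; the category names and toy labels are illustrative (the slide names three of the four categories explicitly).

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical relabelled subset: each coder assigns a behavior
# category to the same randomly chosen trajectories.
coder_1 = ["stop to interact", "pass by", "slow down to look", "pass by"]
coder_2 = ["stop to interact", "pass by", "slow down to look",
           "stop to observe"]

print(cohen_kappa_score(coder_1, coder_2))  # 1.0 would be perfect agreement
```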

67 Case Study 3: predicting the walking comfort of users in a shopping mall with a robot
The data was used to predict walking comfort for users from trajectories. The factors considered to have significant influence were:
- Distance: pedestrian models typically use (the inverse of) distance to compute the influence of other pedestrians; larger distances to nearby persons should be more comfortable.
- Velocity: people prefer to walk at a constant velocity when possible, so changes of velocity forced by congestion would decrease walking comfort; the amount of change of velocity was therefore considered.
- Density: people may perceive congestion, i.e. a high density of people in a narrow space, as less comfortable, so the number of nearby persons was used as a possible factor.
- Preferred velocity: one can also argue that deviation from the desired velocity could harm comfort.
A toy scoring function combining these factors is sketched below.
*Kidokoro et al. (2013).
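A toy comfort score combining the four factors above. The weights and functional form are illustrative assumptions, not the fitted model from the cited paper.

```python
def walking_comfort(dist_to_others, delta_velocity, n_nearby,
                    velocity, preferred_velocity,
                    w=(1.0, 1.0, 1.0, 1.0)):
    """Higher is more comfortable; each term adds discomfort."""
    discomfort = (w[0] / max(dist_to_others, 1e-3)    # inverse distance
                  + w[1] * abs(delta_velocity)         # forced speed change
                  + w[2] * n_nearby                    # local density
                  + w[3] * abs(velocity - preferred_velocity))
    return -discomfort

print(walking_comfort(1.5, 0.2, 3, 1.1, 1.3))   # e.g. -4.07
```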

68 Case Study 3. *Kidokoro et al. (2013).

69 Case Study 3. *Kidokoro et al. (2013).

70 Case Study 3: results. *Kidokoro et al. (2013).

71 Commercial Tools for Facial and Body Expression Analysis
There are several multi-platform face recognition, identification and facial feature detection tools that are used for HRI. Examples: OKAO (by OMRON); Affdex (by Affectiva).

72 OKAO
The OKAO Vision software suite from Omron Corporation detects faces in images and can determine a person's gender and approximate age, or verify his or her identity against a database of faces. Omron used about 10,000 images of human faces, some with spontaneous smiles, some with posed smiles, and others sporting different expressions, to train the software to evaluate smiles. The smile software is dedicated to facial-expression detection and analysis: it picks up the hallmarks of a smile (e.g. narrowed eyes, open mouth, creases around the mouth, and wrinkles turning downward around the eyes) and uses an algorithm to assess the extent of the smile and rate it on a percentage scale.

73 Example of use of OKAO. Functions of OKAO:
1. Face detection
2. Face recognition
3. Gender estimation
4. Age estimation
5. Expression estimation
6. Facial pose estimation
7. Gaze estimation
8. Blink estimation
9. Hand detection
10. Human body detection

74 An advert for OKAO

75 Affdex (by Affectiva)
Affdex measures emotional responses by capturing a user's facial expressions through an existing webcam, in real time. Tracking gestures and key points on the subject's face, Affdex analyzes subtle movements and correlates them with complex emotional and cognitive states. The Affdex system can identify and follow dozens of precise locations on an individual's face; the muscular micro-shifts of every smile, yawn, or moment of confusion are captured and reflected in the data.

76 Example of use of Affdex by Affectiva

77 What perception do your systems need?

78 Discussion
