Anxiety Detection during Human-Robot Interaction *

Dana Kulić and Elizabeth Croft
Department of Mechanical Engineering, University of British Columbia, Vancouver, Canada
{dana,ecroft}@mech.ubc.ca

Abstract - This paper describes an experiment to determine the feasibility of using physiological signals to assess the human response to robot motions during direct human-robot interaction. A robot manipulator is used to generate common interaction motions, and human subjects are asked to report their response to the motions. The human physiological response is also measured. Motion paths are generated using a classic potential field planner and a safe motion planner, which minimizes the potential collision force along the path. A fuzzy inference engine is developed to estimate the human response based on the physiological measures. Results show that emotional arousal can be detected using physiological signals and the inference engine. A comparison of initial results between the two planners shows that subjects report less anxiety and surprise with the safe planner at high planner speeds.

Index Terms - human-robot interaction, physiological signal monitoring, affective state estimation, safety.

I. INTRODUCTION

As robot manipulators move from isolated work cells to unstructured and interactive environments, they will need to become better at acquiring and interpreting information about their environment [1]. In particular, where robot-human interaction is planned, human monitoring can enhance the safety of the interaction by providing additional information to robot planning and control systems [2, 3]. Recently, research has focused on using non-verbal communication, such as eye gaze [2, 3], facial expressions and physiological signals [4-8], for human-robot and human-computer interaction. Physiological signals are particularly well suited for human-robot interaction, as they are relatively easy to measure and interpret using on-line signal processing methods [6-8].
By using non-verbal information such as physiological signals, the robot can gauge user approval of its performance without requiring the user to continuously issue explicit feedback [2, 3]. In addition, changes in some non-verbal signals precede a verbal signal from the user. Observation of physiological information can therefore allow the robot control system to anticipate command changes, creating a more responsive and intuitive human-robot interface.

Our research focuses on detecting user anxiety during real-time human-robot interaction using physiological signals. To examine the feasibility of this approach, an experiment was conducted testing the human subjective and physiological response to various robot motions. The physiological response was then classified using fuzzy methods [5, 7] to determine whether physiological response can be used as a mode of communication during human-robot interaction. Early results of this work indicate that physiological signals have potential as an additional mode of communication during human-robot interaction.

Existing research in this area is reviewed in Section II. The experimental setup and our emotional state estimation algorithm are described in Section III. Results are described in Section IV. Section V concludes the paper and outlines directions for future research.

II. RELATED WORK

Physiological monitoring systems have previously been used to extract information about the user's reaction, both for human-computer and human-robot interaction. Signals proposed for use in human-computer interfaces include skin conductance, heart rate, pupil dilation, and brain and muscle neural activity. Bien et al. [9] argue that soft computing methods are the most suitable for interpreting and classifying these types of signals, because they can deal with imprecise and incomplete data.
Sarkar proposes using multiple physiological signals to estimate emotional state, and using this estimate to modify robotic actions to make the user more comfortable [8]. Rani et al. [7, 10] use heart-rate analysis and multiple physiological signals to estimate human stress levels. In [7], the stress information is used by an autonomous mobile robot to return to the human if the human is in distress. In this case, the robot is not directly interacting with the human; physiological information is used to allow the robot to assess the human's condition in a rescue situation.

Nonaka et al. [11] describe a set of experiments in which the human response to pick-and-place motions of a virtual humanoid robot is evaluated. In their experiment, a virtual reality display is used to depict the robot, and the human response is measured through heart rate measurements and subjective ratings. No relationship was found between the heart rate and the robot motion, but a correlation was reported between the robot velocity and the subject's rating of fear and surprise.

All the above studies use virtual environments, such as a video game [7, 10] or a virtual robot [11], to simulate an interaction situation. To the authors' knowledge, no studies have been performed to date testing methods suitable for real-time affective state estimation during an actual human-robot interaction.

* This work is partially supported by the Natural Sciences and Engineering Research Council of Canada.

III. APPROACH

To determine whether human physiological signals can be used to detect anxiety or fear on-line during human-robot interaction, the experiment was designed to generate various robot motions and to evaluate both the human subjective response and the physiological response to the motions. After the physiological data was collected, it was analyzed using a fuzzy inference engine based on an existing psychophysiological research knowledge base [5]. The analysis methods used are algorithmically simple enough to be incorporated into an on-line estimator.

A second goal of the study was to determine whether a robot motion strategy could be employed to reduce human anxiety. A safe planner [12], which minimizes the potential force during a collision along the path, is compared to the standard potential field planner. At higher motion speeds, the safe planner shows potential for reducing human anxiety.

A. Experimental Method

The experiment was performed using the CRS A460 6 degree-of-freedom (DoF) manipulator, shown in Figure 1. Ten human subjects were tested. For 8 of the 10 subjects, physiological data as well as the subjective response was recorded; for the remaining 2 subjects, only the subjective response was recorded.

1. Trajectory Generation

Two different tasks were used for the experiment: a pick-and-place motion (PP), similar to [11], and a (shorter) reach-and-retract motion (RR). These tasks were chosen to represent typical motions a robot could be asked to perform during human-robot interaction. For the pick-and-place motion, the pick location was specified to the right of and away from the subject, and the place location was directly in front of and close to the subject. For the reach-and-retract motion, the reach location was the same as the place location. For both tasks, the robot started and ended in the home upright position. Each of the selected positions is shown (from the subject's point of view) in Figure 1.
Two planning strategies were used to plan the path of the robot for each task: a standard potential field (PF) method with obstacle avoidance and goal attraction [13], and a safe path (S) method [12]. The safe path planner is similar to the potential field method, with the addition of a danger criterion, comprising factors that affect the impact force during a collision between the robot and the human, which is minimized along the path. Point-to-point planning was not used, as this type of planning would not be suitable for an interactive, human environment. This resulted in four motions, which are detailed in Table I. Figs. 2-5 show frames of video depicting each motion.

Fig 1 Robot task positions (a = robot start/end position, b = pick position, c = place/reach position)

For each path, a motion trajectory was planned using the trajectory planner described in [14]. Trajectories at three different speeds were planned (slow, medium, fast), resulting in 12 trajectories. The three speeds corresponded to 0.1, 0.5 and 1.0 of the maximum joint velocity of 3.14 rad/s. The accelerations and jerk were also scaled accordingly, as described in [14].

2. Physiological Sensing

The ProComp Infiniti system from Thought Technology [15] was used to gather the physiological data. Heart muscle activity, skin conductance and corrugator muscle activity were measured. In earlier studies [5], the respiration rate was also measured, but the data from this sensor proved too slow to use in real-time interaction, so respiration rate was not used. The heart muscle activity was measured via electrocardiogram (ECG) using the EKG Flex/Pro sensor. The skin conductance was measured using the SCFlex-Pro sensor. Corrugator muscle activity was measured with the MyoScan Pro electromyography (EMG) sensor.
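For illustration only, the gradient step at the heart of a standard potential field planner such as [13] can be sketched in two dimensions. The gains, repulsion cutoff and step size below are hypothetical, and the actual planners operate on the manipulator's configuration space with the additional danger criterion of [12]:

```python
import numpy as np

def potential_field_step(q, goal, obstacles, k_att=1.0, k_rep=0.5,
                         d0=0.4, step=0.01):
    """One gradient-descent step of a basic potential field planner.

    q, goal: 2-D position vectors; obstacles: list of 2-D points.
    k_att, k_rep, d0 (repulsion cutoff) and step are illustrative values.
    """
    # Attractive force: linear pull toward the goal.
    force = k_att * (goal - q)
    # Repulsive force (Khatib-style): active within distance d0 of an obstacle.
    for obs in obstacles:
        diff = q - obs
        d = np.linalg.norm(diff)
        if 1e-9 < d < d0:
            force += k_rep * (1.0 / d - 1.0 / d0) / d**3 * diff
    return q + step * force

# Usage: iterate toward the goal while skirting one obstacle.
q, goal = np.array([0.0, 0.0]), np.array([1.0, 1.0])
obstacles = [np.array([0.5, 0.3])]
for _ in range(3000):
    q = potential_field_step(q, goal, obstacles)
```

The safe planner of [12] adds a danger-criterion term to this gradient, so that the minimized field also accounts for potential impact force, not only distance to obstacles.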
The corrugator muscle is located on the forehead and controls brow position; the activity of this muscle has been correlated with negative emotions such as frustration and anger [16, 17]. All sensor data was collected at 256 Hz. This rate is sufficient for capturing physiological signal events.

TABLE I
TEST PATH NAMING AND DESCRIPTIONS

Path   Description
PP-PF  Pick & place task planned by potential field planner
PP-S   Pick & place task planned by safe planner
RR-PF  Reach & retract task planned by potential field planner
RR-S   Reach & retract task planned by safe planner

3. Experimental Procedure

For each experiment, the human subject was connected to the physiological sensors and seated facing the robot. The robot was initially held motionless for a minimum of 30 seconds to collect baseline physiological data for each subject. The robot then executed the 12 trajectories described above, presented to each subject in randomized order. After each trajectory had executed, the subject was asked to rate their response to the motion in the following emotional response categories: anxiety, calm and surprise. A Likert scale (from 1 to 5) was used to characterise the response, with 5 representing "extremely" or "completely" and 1 representing "not at all".

B. Data Analysis

The collected data was processed to extract features relevant to emotional state estimation. These features were then processed using fuzzy inference to estimate the emotional response.

1. Data Processing and Feature Extraction

The fuzzy inference engine does not use the measured signals directly; the signals are pre-processed to extract relevant features, which are then used for inference.

Fig 2 Path PP-PF (pick and place task planned with the potential field planner)
Fig 3 Path PP-S (pick and place task planned with the safe planner)
Fig 4 Path RR-PF (reach and retract task planned with the potential field planner)
Fig 5 Path RR-S (reach and retract task planned with the safe planner)

Because the magnitudes of physiological signals vary widely between individuals, another important function of the pre-processing is to normalize the signal features so that a single inference engine can be used across individuals.

Two features were extracted from the ECG data: the heart rate (HeartRate) and the heart rate acceleration (HRAccel). Recent research [7, 10] has reported the use of frequency-domain heart rate analysis for emotional state estimation. However, since heart rate data is very slow (less than 1 Hz), frequency windowing methods such as windowed Fourier analysis or wavelets introduce unacceptable delays, which are not suitable for real-time interaction. The ECG data was low-pass filtered before applying a peak detection algorithm to detect the R-waves of the QRS signal. The peak-to-peak time was used to calculate the heart rate. The heart rate was smoothed using a 3-sample averaging filter, and normalized based on the baseline heart rate and heart rate variability, such that the signal ranged between [-1, 1]. The heart rate acceleration was calculated by differentiating the smoothed heart rate signal, and was normalized to range between [-1, 1] based on the resting heart rate of the subject measured during the initialisation phase.

Two features were extracted from the skin conductance response (SCR): the level of the SCR (SCR) and its rate of change (dSCR). The skin conductance data was low-pass filtered and smoothed using a 1-second averaging window. The data was then normalized to range between [0, 1], using the minimum and maximum values in the preceding 30 seconds.
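Setting the low-pass filtering aside, the ECG feature pipeline just described can be sketched as follows. The peak-detection threshold and refractory period are hypothetical values (a real implementation would use a proper QRS detector):

```python
import numpy as np

FS = 256  # sampling rate (Hz), as in the paper

def detect_r_peaks(ecg, threshold=0.6, refractory=0.25):
    """Naive R-peak detector: local maxima above a (hypothetical)
    amplitude threshold, at least `refractory` seconds apart."""
    min_gap = int(refractory * FS)
    peaks, last = [], -min_gap
    for i in range(1, len(ecg) - 1):
        if (ecg[i] > threshold and ecg[i] >= ecg[i - 1]
                and ecg[i] > ecg[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    return np.array(peaks)

def heart_rate_features(ecg, baseline_hr, baseline_var):
    """Heart rate (bpm) from R-R intervals, smoothed with a 3-sample
    moving average and normalized to roughly [-1, 1] by the baseline
    heart rate and variability; acceleration is its first difference."""
    peaks = detect_r_peaks(ecg)
    rr = np.diff(peaks) / FS                      # R-R intervals (s)
    hr = 60.0 / rr                                # instantaneous bpm
    hr = np.convolve(hr, np.ones(3) / 3, mode="valid")
    hr_norm = np.clip((hr - baseline_hr) / baseline_var, -1, 1)
    hr_accel = np.diff(hr)
    return hr_norm, hr_accel

# Usage on an idealized 72-bpm spike train standing in for ECG data.
ecg = np.zeros(10 * FS)
ecg[::int(60.0 / 72.0 * FS)] = 1.0
hr_norm, hr_accel = heart_rate_features(ecg, baseline_hr=72, baseline_var=10)
```

For a steady 72-bpm input with a 72-bpm baseline, the normalized heart rate stays near zero and the acceleration is zero, as expected.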
In previous experiments [5], the baseline data had been used to normalize the skin conductance data; however, this normalization failed to account for the fact that the skin conductance occasionally did not return to baseline before the start of a new stimulus. Using only the data in the preceding 30 seconds to normalize the response ensures that repeated stimuli do not generate an exaggerated response. The rate of change of the skin conductance response was calculated by differentiating the smoothed SCR data and normalizing so that the data ranges between [-1, 1].
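The 30-second sliding normalization can be sketched as below. The moving average mirrors the paper's 1 s smoothing step, while clipping stands in for its unspecified normalization of dSCR to [-1, 1]:

```python
import numpy as np

FS = 256           # sampling rate (Hz)
WINDOW = 30 * FS   # 30-second normalization window, as in the paper

def scr_features(scr):
    """Normalize skin conductance to [0, 1] using the min/max of the
    preceding 30 s, then differentiate for the rate of change (dSCR)."""
    # 1 s moving-average smoothing.
    smooth = np.convolve(scr, np.ones(FS) / FS, mode="same")
    norm = np.zeros_like(smooth)
    for i in range(len(smooth)):
        win = smooth[max(0, i - WINDOW): i + 1]
        lo, hi = win.min(), win.max()
        norm[i] = 0.0 if hi == lo else (smooth[i] - lo) / (hi - lo)
    # Per-second rate of change, clipped to [-1, 1] as a stand-in
    # for the paper's normalization of dSCR.
    dscr = np.clip(np.diff(norm, prepend=norm[0]) * FS, -1, 1)
    return norm, dscr

# Usage on a synthetic, slowly rising conductance trace (microsiemens).
scr = np.linspace(2.0, 5.0, 10 * FS)
norm, dscr = scr_features(scr)
```

Because the window only looks backward, a second stimulus arriving before the conductance has decayed is normalized against the recent range rather than the distant baseline, which is exactly the property the paper relies on.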

One feature was extracted from the corrugator muscle EMG data: the level of response (CorrugEMG). The EMG data was low-pass filtered and smoothed using a 1 s averaging window. The data was normalized to range between [0, 1], based on the resting EMG level measured during the initialisation phase.

C. Emotional State Estimation

An issue when estimating human emotional response is how to represent the emotional state. Two different representations are in common use in emotion and emotion-detection research: one using discrete emotion categories (anger, happiness, fear, etc.), and the other using a two-dimensional representation of valence and arousal. Valence measures the degree to which the emotion is positive (or negative), and arousal measures the strength of the emotion. The valence/arousal representation provides less data, but the amount of information retained appears adequate for the purposes of robotic control, and is easier to convert to a measure of user approval. In this paper, the valence/arousal representation is used. This representation has been favored for use with physiological signals and in psychophysiological research [4, 16, 17].

1. Fuzzy Inference Engine

The fuzzy rule base used to estimate the emotional state is similar to the rule base reported in [5]. The five extracted features were fuzzified using simple trapezoidal input membership functions. The outputs of the fuzzy engine were the estimated valence and arousal. Table II shows the rule base for the system, derived using data from psychophysiological research [16-20].

The first set of rules encapsulates the relationship between the skin conductance response and arousal. Several studies [16, 17, 20] have shown that skin conductance is correlated with arousal; Bradley and Lang [16] report that more than 80% of subjects exhibit this correlation.
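The SCR rules (rules 1-9 of the rule base) can be evaluated with a small sketch. The trapezoidal membership breakpoints and singleton output values below are hypothetical, since the paper does not publish its membership functions, and weighted-average defuzzification stands in for whatever defuzzifier the authors used:

```python
import numpy as np

def trapmf(x, a, b, c, d):
    """Trapezoidal membership function with feet a, d and shoulders b, c."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical breakpoints for normalized SCR in [0, 1] and dSCR in [-1, 1].
SCR_SETS = {"ZERO": (-1, 0, 0.05, 0.2), "LOW": (0.05, 0.2, 0.35, 0.5),
            "MED": (0.35, 0.5, 0.65, 0.8), "HIGH": (0.65, 0.8, 1.0, 2.0)}
DSCR_SETS = {"NEG": (-2, -1, -0.1, 0.0), "ZERO": (-0.1, -0.02, 0.02, 0.1),
             "POS": (0.0, 0.1, 1.0, 2.0)}
AROUSAL_OUT = {"LOW": 0.2, "MED": 0.5, "HIGH": 0.8}  # singleton outputs

# Rules 1-9: (SCR set, dSCR set) -> arousal set; None matches any SCR.
RULES = [("HIGH", "NEG", "MED"), ("MED", "NEG", "LOW"), ("LOW", "NEG", "LOW"),
         ("ZERO", "NEG", "LOW"), (None, "ZERO", "LOW"), ("ZERO", "POS", "MED"),
         ("LOW", "POS", "MED"), ("MED", "POS", "HIGH"), ("HIGH", "POS", "HIGH")]

def estimate_arousal(scr, dscr):
    """Evaluate the SCR rules with min-AND and weighted-average defuzzification."""
    num = den = 0.0
    for scr_set, dscr_set, out in RULES:
        w = trapmf(dscr, *DSCR_SETS[dscr_set])
        if scr_set is not None:
            w = min(w, trapmf(scr, *SCR_SETS[scr_set]))
        num += w * AROUSAL_OUT[out]
        den += w
    return num / den if den > 0 else 0.0

# A high, rising SCR should score higher arousal than a flat, low one.
high = estimate_arousal(0.9, 0.5)
low = estimate_arousal(0.1, 0.0)
```

With these (assumed) parameters, a high and rising skin conductance fires rule 9 and yields high arousal, while a flat signal fires rule 5 and yields low arousal, matching the qualitative structure of the rule base.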
Because of the variable rate of decay of the response, it is not adequate to simply use the level of the skin conductance response; instead, both the level and the rate of change must be used.

The second set of rules describes the relationship between corrugator muscle EMG and valence. The corrugator muscle is responsible for the lowering and contraction of the brows, i.e. frowning, which is intuitively associated with negative valence. Bradley and Lang [16] also report that EMG levels are well above baseline for negative valence stimuli, slightly above baseline for neutral valence stimuli, and slightly below baseline for positive stimuli; more than 80% of their subjects show this correlation.

The third set of rules relates heart activity to emotional state. Unlike the SCR and EMG muscle activity, the activity of the heart is governed by many variables, including physical fitness, posture and activity level, as well as emotional state. It is therefore more difficult to obtain a significant correlation between heart activity and emotional state. In addition, heart rate activity is context dependent: in tests using external stimuli to generate the emotional response (such as picture viewing), the heart rate response is initially decelerative, while tests using internal stimuli (recalling emotional imagery) show an accelerative response [16]. Since our experimental setup used external stimuli, the external stimuli results were adopted: heart rate deceleration is associated with the orienting response (i.e., increased arousal); heart rate at baseline, with no acceleration or deceleration, is associated with low arousal; and high heart rate and heart rate acceleration are associated with high arousal. Due to the added variables affecting heart rate response, the heart rate rules were underweighted relative to the SCR and EMG rules.

TABLE II
FUZZY INFERENCE ENGINE RULE BASE

1. If (SCR is HIGH) and (dSCR is NEG) then (Arousal is MED)
2. If (SCR is MED) and (dSCR is NEG) then (Arousal is LOW)
3. If (SCR is LOW) and (dSCR is NEG) then (Arousal is LOW)
4. If (SCR is ZERO) and (dSCR is NEG) then (Arousal is LOW)
5. If (dSCR is ZERO) then (Arousal is LOW)
6. If (SCR is ZERO) and (dSCR is POS) then (Arousal is MED)
7. If (SCR is LOW) and (dSCR is POS) then (Arousal is MED)
8. If (SCR is MED) and (dSCR is POS) then (Arousal is HIGH)
9. If (SCR is HIGH) and (dSCR is POS) then (Arousal is HIGH)
10. If (CorrugEMG is NEG) then (Valence is POS)
11. If (CorrugEMG is ZERO) then (Valence is ZERO)
12. If (CorrugEMG is LOW) then (Valence is ZERO)
13. If (CorrugEMG is MED) then (Valence is NEG)
14. If (CorrugEMG is HIGH) then (Valence is VNEG)
15. If (HeartRate is VNEG) then (Arousal is HIGH)
16. If (HeartRate is NEG) then (Arousal is MED)
17. If (HRAccel is VNEG) then (Arousal is MED)
18. If (HeartRate is VPOS) then (Arousal is HIGH)(Valence is NEG)
19. If (HRAccel is VPOS) then (Arousal is HIGH)(Valence is NEG)

For each trajectory, the average arousal and valence over the duration of the trajectory were calculated.

IV. RESULTS

A. Subjective Response

Figs. 6-8 show the average subjective response and a comparison of the average responses between the potential field and the safe planned paths. Table III shows the correlation analysis between the subjective responses and the trajectory speed for each trajectory type. Correlation among the responses is also shown, to validate the use of the valence-arousal emotional model. For each pair of variables, the upper value is the correlation coefficient and the lower value is the probability value (p-value), computed from a two-sided t-test. As expected, for each trajectory there is a strong positive correlation between anxiety and speed and between surprise and speed, and a negative correlation between calm and speed.
A comparison between the two planners shows that for each motion type, subjects reported on average lower levels of anxiety and surprise, and higher levels of calm, for the safe planned paths. The anxiety response for the fast pick-and-place trajectory was significantly higher (α = 0.05, Student's t-test) for the PF planned path than for the safe plan. There was also a smaller correlation between speed and anxiety, and between speed and surprise, for the safe planned paths.

Fig 6 Average anxiety response for all trajectories tested
Fig 7 Average calm response for all trajectories tested
Fig 8 Average surprise response for all trajectories tested

Subjects were not familiar with the safe planner, and were not informed prior to the experiment that some motions would be generated by it. Some subjects commented that they were more surprised when a safe motion was executed, as the safe motion did not result in a straight-line, robot-like path.

B. Estimated Response

Figure 9 shows the average estimated arousal for each trajectory tested; the estimated arousal tends to increase with speed. Table IV shows the correlation analysis between the estimated arousal, the subject-reported response and the trajectory speed for each path. For the pick-and-place motions, arousal is positively correlated with anxiety and surprise, and negatively correlated with calm; however, these relationships were not found to be significant, due to the small sample size. Arousal is, however, strongly correlated with speed for both the potential field and the safe planned path. For the reach-and-retract motion, there was no significant correlation between the arousal and the self-reported data. For the safe planned path, arousal was correlated with speed, while for the potential field planned path this correlation was not found.

Fig 9 Average estimated arousal for all trajectories tested

TABLE III
CORRELATION BETWEEN REPORTED VARIABLES AND TRAJECTORY VELOCITY
(upper value: correlation coefficient; lower value: two-sided p-value)

Path   Variable             Calm     Surprise   Speed
PP-PF  Anxiety    Corr.     -0.79    0.60       0.66
                  p-value   <.0001   0.0005     0.0001
       Calm       Corr.              -0.39      -0.47
                  p-value            0.034      0.0091
       Surprise   Corr.                         0.53
                  p-value                       0.0027
PP-S   Anxiety    Corr.     -0.49    0.07       0.26
                  p-value   0.0063   0.73       0.17
       Calm       Corr.              -0.29      -0.25
                  p-value            0.12       0.17
       Surprise   Corr.                         0.39
                  p-value                       0.031
RR-PF  Anxiety    Corr.     -0.74    0.32       0.53
                  p-value   <.0001   0.09       0.0029
       Calm       Corr.              -0.13      -0.25
                  p-value            0.48       0.18
       Surprise   Corr.                         0.36
                  p-value                       0.0497
RR-S   Anxiety    Corr.     -0.24    0.42       0.47
                  p-value   0.12     0.022      0.0086
       Calm       Corr.              0.00       -0.16
                  p-value            0.98       0.40
       Surprise   Corr.                         0.19
                  p-value                       0.32

For most trajectories, a correlation was found between the estimated arousal and the robot speed, but not between arousal and the reported emotional responses. This raises the issue of the relationship between the physiological signals and the subjective experience. It is possible that the physiological response measured is an involuntary response, such as the startle reflex, rather than an emotional state. Another significant factor reducing the correlation between the reported anxiety and the estimated arousal is habituation: most subjects' physiological response habituated to the robot motions during the experiment. For several subjects, the first motion generated a strong response, regardless of the trajectory type. Table V shows the correlation coefficients and significance probabilities between the number of trajectories viewed and the reported responses, and also the estimated arousal. As can be seen from this table, while there is no correlation between the subject-reported outputs and the number of trajectories viewed, there is a significant correlation between the physiological response (i.e., the estimated arousal) and the number of trajectories viewed.
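The correlation statistics reported here pair each Pearson coefficient with a two-sided t-test p-value. That computation can be sketched as follows; the data below is synthetic, loosely mimicking anxiety ratings from 10 subjects at 3 speeds, not the experimental data:

```python
import numpy as np
from scipy import stats

def corr_with_pvalue(x, y):
    """Pearson correlation and its two-sided p-value from the t-test
    t = r * sqrt(n - 2) / sqrt(1 - r^2), with n - 2 degrees of freedom."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    r = np.corrcoef(x, y)[0, 1]
    t = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
    p = 2 * stats.t.sf(abs(t), df=n - 2)
    return r, p

# Synthetic example: ratings rising with speed, 10 subjects x 3 speeds.
rng = np.random.default_rng(0)
speed = np.tile([0.1, 0.5, 1.0], 10)
anxiety = 2.0 + 2.0 * speed + rng.normal(0.0, 0.5, speed.size)
r, p = corr_with_pvalue(speed, anxiety)
```

The same result is obtained from `scipy.stats.pearsonr`, which applies an equivalent test.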

TABLE IV
CORRELATION BETWEEN ESTIMATED AROUSAL, SELF-REPORTED VARIABLES AND SPEED

Path   Variable   Corr. w/ Arousal   p-value
PP-PF  Anxiety    0.36               0.086
       Calm       -0.23              0.27
       Surprise   0.40               0.056
       Speed      0.60               0.0019
PP-S   Anxiety    0.35               0.097
       Calm       -0.054             0.81
       Surprise   0.32               0.14
       Speed      0.57               0.0043
RR-PF  Anxiety    -0.09              0.67
       Calm       0.27               0.21
       Surprise   0.23               0.28
       Speed      0.17               0.42
RR-S   Anxiety    0.03               0.88
       Calm       0.06               0.79
       Surprise   0.02               0.92
       Speed      0.53               0.012

The emotional state inference engine also attempted to estimate valence, based on the corrugator muscle EMG signal and heart activity. Corrugator muscle activity has been reported to have a strong correlation with negative valence in existing studies, and had shown promise during earlier studies with the inference engine using images as stimuli. However, for most of the trajectories tested, the estimated valence was very small or zero: corrugator muscle activity remained primarily at baseline for all subjects tested during robot motions.

TABLE V
CORRELATION BETWEEN REPORTED VARIABLES, AROUSAL AND NUMBER OF TRAJECTORIES VIEWED

                    Anxiety   Calm      Surprise   Arousal
# Traj.   Corr.     0.0315    -0.0331   0.0626     -0.2346
Viewed    p-value   0.7645    0.7530    0.5513     0.0236

V. CONCLUSIONS

Two types of robot motions were presented to human subjects during the study: motions planned with a classical potential field planner, and motions planned with the safe planner [12]. Subjects reported less anxiety when safe planned motions were presented; the anxiety was significantly smaller for the safe planned pick-and-place task during fast motions. Further study is needed, with more test points at higher speeds, to confirm that anxiety can be reduced through the use of the safe planner.

The results from this study indicate that physiological signals show promise for use during human-robot interaction. Fast robot motions tend to reliably elicit a strong arousal response.
While in most cases a good correlation is observed between the estimated emotional arousal and the robot velocity, only weak or no correlation is found between the estimated emotional response and the subjective responses. Further research is needed to determine the relationship between the physiological and subjective responses. A key question is whether the measured arousal reflects an involuntary reaction of which the subject may not be aware (such as the startle reflex), or a consciously experienced emotional state such as anxiety or surprise. Another issue is the quick habituation of the physiological response. Finally, based on this preliminary study, valence estimation using corrugator muscle activity does not appear to be suitable for human-robot interaction: even when high levels of anxiety (a negative-valence emotion) were reported, corrugator muscle activity was not reliably present.

ACKNOWLEDGEMENT

The authors wish to acknowledge David Meger for his help with the robot controller and during the experiments.

REFERENCES

[1] A. Pentland, "Perceptual Intelligence," Communications of the ACM, vol. 43, pp. 35-44, 2000.
[2] Y. Matsumoto, J. Heinzmann, and A. Zelinsky, "The Essential Components of Human-Friendly Robot Systems," International Conference on Field and Service Robotics, pp. 43-51, 1999.
[3] V. J. Traver, A. P. del Pobil, and M. Perez-Francisco, "Making Service Robots Human-Safe," IROS, pp. 696-701, 2000.
[4] R. Picard, Affective Computing. Cambridge, Massachusetts: MIT Press, 1997.
[5] D. Kulic and E. Croft, "Estimating Intent for Human-Robot Interaction," ICAR, pp. 810-815, 2003.
[6] R. Picard et al., "Toward Machine Emotional Intelligence: Analysis of Affective Physiological State," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 23, pp. 1175-1191, 2001.
[7] P. Rani, N. Sarkar, C. A. Smith, and L. D. Kirby, "Anxiety detecting robotic system - towards implicit human-robot collaboration," Robotica, vol. 22, pp. 85-95, 2004.
[8] N. Sarkar, "Psychophysiological Control Architecture for Human-Robot Coordination - Concepts and Initial Experiments," ICRA, Washington, DC, USA, pp. 3719-3724, 2002.
[9] Z. Bien et al., "Soft Computing Based Emotion/Intention Reading for Service Robot," Lecture Notes in Computer Science, vol. 2275, pp. 121-128, 2002.
[10] P. Rani, J. Sims, R. Brackin, and N. Sarkar, "Online stress detection using psychophysiological signals for implicit human-robot cooperation," Robotica, vol. 20, pp. 673-685, 2002.
[11] S. Nonaka, K. Inoue, T. Arai, and Y. Mae, "Evaluation of Human Sense of Security for Coexisting Robots using Virtual Reality," ICRA, New Orleans, LA, USA, pp. 2770-2775, 2004.
[12] D. Kulic and E. Croft, "Safe Planning for Human-Robot Interaction," Journal of Robotic Systems, in press, 2005.
[13] O. Khatib, "Real-Time Obstacle Avoidance for Manipulators and Mobile Robots," Int. Journal of Robotics Research, vol. 5, pp. 90-98, 1986.
[14] S. Macfarlane and E. Croft, "Jerk-Bounded Robot Trajectory Planning - Design for Real-Time Applications," IEEE Trans. on Robotics and Automation, vol. 19, pp. 42-52, 2003.
[15] www.thoughttechnology.com
[16] M. M. Bradley and P. J. Lang, "Measuring Emotion: Behavior, Feeling and Physiology," in Cognitive Neuroscience of Emotion, R. D. Lane and L. Nadel, Eds. New York: Oxford University Press, 2000.
[17] P. J. Lang, "The Emotion Probe: Studies of Motivation and Attention," American Psychologist, vol. 50, pp. 372-385, 1995.
[18] K. A. Brownley et al., "Cardiovascular Psychophysiology," in Handbook of Psychophysiology, J. T. Cacioppo and L. G. Tassinary, Eds. Cambridge: Cambridge University Press, 2000.
[19] M. E. Dawson et al., "The Electrodermal System," in Handbook of Psychophysiology, J. T. Cacioppo and L. G. Tassinary, Eds. Cambridge: Cambridge University Press, 2000.
[20] P. Ekman, R. W. Levenson, and W. V. Friesen, "Autonomic Nervous System Activity Distinguishes Among Emotions," Science, vol. 221, pp. 1208-1210, 1983.