IMPLEMENTATION OF AN AUTOMATED SMART HOME CONTROL FOR DETECTING HUMAN EMOTIONS VIA FACIAL DETECTION


Lim Teck Boon 1, Mohd Heikal Husin 2, Zarul Fitri Zaaba 3 and Mohd Azam Osman 4
1 Universiti Sains Malaysia, Malaysia, ltboon.ucom10@student.usm.my
2 Universiti Sains Malaysia, Malaysia, heikal@usm.my
3 Universiti Sains Malaysia, Malaysia, zarulfitri@usm.my
4 Universiti Sains Malaysia, Malaysia, azam@usm.my

ABSTRACT. With our busy daily schedules, the home is one of the main locations where we can unwind and relax. With an effective smart home system, that relaxation can begin as soon as the homeowners return home. The Eye 2H is an automated smart home control system that detects the homeowners' emotions via facial detection. The system is capable of boosting the mood of the residents by controlling the lighting and electronic equipment in their home. Adjusting the lighting conditions automatically creates a welcoming effect for the homeowners. Besides that, the system is also able to reduce energy usage via sensors that detect human presence in specific rooms. The implementation of the system requires a number of processes along with initial testing to ensure smooth operation. Based on our testing and evaluation, the proposed system is able to detect both faces and emotions effectively.

Keywords: Eye2H, human emotion detection, smart home control, facial detection

INTRODUCTION

In today's world, people are often working harder and longer to achieve a more comfortable living. Nevertheless, the pressure and tension caused by increased workloads and other challenges have led to higher levels of health and mental issues. A survey conducted by Towers Watson UK found that 57% of employees who experienced high levels of stress had lower engagement, were less productive and were more often absent from work (Towers Watson, 2014).
Another study, previously published by the American Psychological Association, concluded that employees with longer working hours are more frequently linked to family conflicts and stress-related health problems (American Psychological Association, 2008). With this in mind, we believe that a majority of people suffer from stress-related health problems and conflicts. As such, it is essential to ensure that the home is a conducive place for them to de-stress from the pressures of their day. When it comes to de-stressing at home, most homeowners enjoy watching television or listening to music (Klein, 2013). Music has been found to have a positive effect on stressed individuals (Thoma et al., 2013), but light-related activities such as watching television have a deeper impact. A report published by the European Commission found that exposure to light at night could be linked with an increased risk of breast cancer and could also cause sleep, gastrointestinal, mood and cardiovascular disorders (Wighton, 2013). The sudden changes of light produced by a television may disturb the brain, which then disrupts our sleep. With this in mind, we have proposed a smart home system called The Eye 2H, which may be an appropriate solution to this recurring stress issue (Boon et al., 2014).

The aim of this paper is to highlight the system implementation for our proposed system in Boon et al. (2014), which could provide an alternative approach for people to relieve stress in their own homes. The first section of the paper provides a brief introduction to our proposed system. This is followed by the system implementation for the Core Program (software), while the third section examines the implementation of the Intelligent Control Board (hardware). Next, we present some of the initial testing and evaluation that was conducted with the system. In the final section, the conclusion and related future work for the proposed system are presented.

EYE 2H: THE PROPOSED SMART HOME SYSTEM

The Eye 2H is an intelligent system that utilizes a combination of image processing and facial recognition techniques to detect and analyze the emotions of the homeowners. This information is then utilized by an intelligent control box to control any connected electronic and lighting equipment in the house. There are two main modules in the system: 1) the Eye 2H Core Program and 2) the Intelligent Control Box. Figure 1 depicts the proposed system structure.

Figure 1. The Eye 2H Proposed System Structure (Boon et al., 2014)

The Core Program utilizes a processing box and a camera that monitors a target area, detecting individuals and recognizing whether each is a resident of the home. The recognition system analyzes the individual's face and compares the captured image to the existing images of the

current residents. Once the system verifies the resident's identity, it continues to monitor the resident to detect his or her emotion and respond accordingly. Once a particular emotion is detected, the system sends a signal to the Intelligent Control Box. The box consists of three intelligent control features: 1) Input Process, 2) Signal Output, and 3) Configure Board, which receives the required response input from the Core Program. The response inputs are then carried out by the control box, which is connected to electronic equipment such as lighting and other devices. Our proposed system is designed to provide a better living environment, for example by switching on the lights when the residents return home or switching them off when the residents are away. The system also detects the emotions of the residents and responds to those emotions by controlling the brightness of the lights and other electronic devices if needed (Boon et al., 2014). The implementation of the system from the software and hardware perspectives is highlighted in the next sections.

THE EYE 2H SYSTEM - CORE PROGRAM (SOFTWARE)

The software implementation for the Core Program consists of three modules: Camera Monitoring, Emotion Detection & Analysis, and Response. In addition, two training phases were implemented for the face and emotion recognition modelling as part of the system.

Camera Monitoring Module

The camera captures images continuously, which provides the basis for the facial recognition process. Facial detection is achieved using the Haar method, a feature-based cascade classifier, where a trained Haar classifier is utilized for the detection. To increase the accuracy of face detection, eye detection is also performed on the captured images.
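The paper does not include the detection code itself; in practice the face and eye detectors would be trained Haar cascades (e.g., OpenCV's `CascadeClassifier.detectMultiScale`). As a minimal sketch, the gating rule implied above — a frame proceeds to pre-processing only when a detected face also contains a detected eye — could look like this (function names are hypothetical):

```python
def contains(face, eye):
    """True if the eye rectangle (x, y, w, h) lies fully inside the face rectangle."""
    fx, fy, fw, fh = face
    ex, ey, ew, eh = eye
    return fx <= ex and fy <= ey and ex + ew <= fx + fw and ey + eh <= fy + fh

def accept_frame(faces, eyes):
    """A frame is accepted for pre-processing only when at least one
    detected face rectangle also contains at least one detected eye."""
    return any(contains(face, eye) for face in faces for eye in eyes)
```

In the real pipeline, `faces` and `eyes` would be the rectangle lists returned by the face and eye cascade classifiers for each captured frame.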
If both the face and eyes of a resident are detected in the captured image, the system proceeds to the pre-processing phase. The identified image is first converted to greyscale, as a greyscale image can be processed faster than a colored one. The system then aligns the facial image to ensure that the face and eyes are at the correct angle. Once this is completed, the facial region is cropped from the processed image to a dimension of 24 x 24 pixels. The light intensity distribution of the cropped facial image is then equalized to balance the intensity range. Noise in the image is reduced using bilateral filtering, which retains the sharp edges in the image. Finally, a mask layer is applied to remove the unneeded regions within the facial image.

Emotion Detection & Analysis Module

There are two sub-modules in this module: 1) Face Recognition and 2) Emotion Detection. The Face Recognition sub-module provides the capability of recognizing the residents via their facial images. It utilizes the Eigen Face algorithm, trained on a number of collected facial images. The processed image from the Camera Monitoring module is converted into an image matrix, from which the Eigenvalues and Eigenvectors are calculated. Based on these values and vectors, the Eigen Space and Eigen Faces are generated, which then leads to the formation of an Eigen Face model. The Eigen Face model from the training session is then used as a comparison against the cropped facial image. If the similarity of both images exceeds a certain threshold, the system recognizes the face; otherwise, the face is treated as unknown. Figure 2 shows the process flow of the facial recognition phase.
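The Eigen Face comparison described above can be sketched as follows. This is a simplified illustration, not the system's implementation: the mean face and eigenfaces are assumed to come from a prior training step (in practice computed by a library), each resident is stored as a projected weight vector, and Euclidean distance stands in for the unspecified similarity measure:

```python
def project(image_vec, mean, eigenfaces):
    """Project a flattened face image into the Eigen Space:
    subtract the mean face, then take the dot product with each eigenface."""
    centred = [p - m for p, m in zip(image_vec, mean)]
    return [sum(c * e for c, e in zip(centred, face)) for face in eigenfaces]

def recognise(image_vec, mean, eigenfaces, residents, threshold):
    """Compare the projected face against each stored resident projection;
    return the closest resident if within the threshold, else None (unknown)."""
    weights = project(image_vec, mean, eigenfaces)
    best_name, best_dist = None, float("inf")
    for name, stored in residents.items():
        dist = sum((w - s) ** 2 for w, s in zip(weights, stored)) ** 0.5
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

An unknown visitor simply fails the threshold test and yields `None`, which matches the unknown-face branch in Figure 2.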

Figure 2. Process Flow of the Facial Recognition Phase

The Emotion Detection sub-module analyzes the detected emotions from the captured facial images. This analysis is achieved using a combination of the Principal Component Analysis (PCA) approach for feature extraction and selection, and the Support Vector Machine (SVM) approach for classification. The combination requires a trained emotion classifier to categorize the captured images into one of the recognized emotions. The raw dataset for the emotion classifier is taken from the Extended Cohn-Kanade Dataset (CK+) (Lucey et al., 2010). The dataset covers 7 emotions, from which we selected 4 for our system: neutral, happy, sad and sleepy. These emotions were selected as they are considered the most basic emotions that a user may show at any given time (Lucey et al., 2010). Therefore, four sets of data, one per emotion, were acquired for the SVM classifier training phase. Each classifier requires a positive and a negative dataset. For this research, we utilized the concept of "one versus others" in selecting the negative dataset. For example, for a happy face, the positive dataset is based on the happy facial images while the negative dataset consists of the neutral, sad and sleepy facial images. Table 1 shows the positive and negative datasets for the SVM classifiers.

Table 1. Positive and Negative Dataset for the SVM Classifier

Emotion Classifier | Positive Dataset      | Negative Dataset
Happy              | Happy facial images   | Neutral + Sad + Sleepy facial images
Neutral            | Neutral facial images | Happy + Sad + Sleepy facial images
Sad                | Sad facial images     | Happy + Neutral + Sleepy facial images
Sleepy             | Sleepy facial images  | Happy + Neutral + Sad facial images

For the feature extraction and selection of the facial images, PCA was utilized. The extracted features are selected to form a PCA subspace. The images from the dataset are then projected into the PCA subspace to form projected PCA images. These projected images are used to train the SVM classifiers.
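The "one versus others" split in Table 1 is mechanical enough to state as code. The sketch below (names hypothetical) builds the positive/negative emotion lists for any one of the four classifiers:

```python
EMOTIONS = ["Happy", "Neutral", "Sad", "Sleepy"]

def one_vs_others(emotion, emotions=EMOTIONS):
    """Build the positive/negative split for one SVM emotion classifier:
    the target emotion is positive, and all other emotions are pooled
    as negative, as in Table 1."""
    if emotion not in emotions:
        raise ValueError("unknown emotion: %r" % emotion)
    positive = [emotion]
    negative = [e for e in emotions if e != emotion]
    return positive, negative
```

For example, `one_vs_others("Happy")` yields `["Happy"]` as the positive label and `["Neutral", "Sad", "Sleepy"]` as the pooled negatives; the actual training data would be the facial images filed under those labels.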
The SVM parameters used for the training phase are as follows: Type: C-Support Vector Classification; Kernel Type: Linear Kernel; C-value: 100. Figure 3 below shows the overall flow of the Emotion Recognition sub-module with the PCA and SVM approach.
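Training the C-SVC (linear kernel, C = 100) would be done with an SVM library; what the four trained one-vs-others classifiers look like at prediction time can be sketched without one. In the toy example below the weight vectors and biases are made up for illustration — a linear SVM's decision value is simply w·x + b, and the emotion with the highest value wins:

```python
def linear_score(weights, bias, features):
    """Decision value of one trained linear SVM: w . x + b."""
    return sum(w * x for w, x in zip(weights, features)) + bias

def predict_emotion(classifiers, features):
    """Pick the emotion whose one-vs-others linear classifier gives the
    highest decision value for a PCA-projected feature vector.
    `classifiers` maps emotion name -> (weights, bias)."""
    return max(classifiers, key=lambda name: linear_score(*classifiers[name], features))
```

In the actual system, `features` would be the projected PCA image of the cropped 24 x 24 face, and the weights would come from the C-SVC training phase.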

Figure 3. Overall Flow of the Emotion Recognition Sub-Module (PCA & SVM approach)

THE EYE 2H SYSTEM - INTELLIGENT CONTROL BOARD (HARDWARE)

The system implementation for the Intelligent Control Board consists of four modules: Input Processing, Signaling, the DE2i_150 Board and the NiosII Program for the Control Board. The Input Processing module receives and processes the signal generated by the Core Program. A received signal of 1 indicates that the Control Board needs to process it and respond accordingly. The Signaling module then handles the response to the signal: if the signal is 0, the board signals the connected lights or home appliances to turn off, while a signal of 1 turns the devices on. The DE2i_150 Board requires some initial configuration before any communication can occur. Using Quartus II, the control board logic is designed with the GUI of the Qsys tool. Components such as a Clock Source, an embedded Nios II processor, a JTAG UART serial interface, On-Chip Memory (RAM/ROM), a System ID Peripheral and a Parallel I/O (PIO) microcontroller peripheral are utilized to design the hardware logic for this research. As this is an initial research effort into a new smart home system, an LED is connected to the board to indicate a signal from the Core Program. After initialization and assignment of the base addresses, the hardware logic is compiled into a .sof file. This file is used to configure the FPGA of the Control Board through a USB-Blaster connected to the host computer. Before the hardware logic can be executed, a software program is required. Therefore, a NiosII Program was developed to operate the configured DE2i_150 Board, acting as an intermediary with the Core Program.
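Abstracting away the board, the Input Processing and Signaling rules above reduce to a small state mapping. The sketch below is a host-language (Python) stand-in for the board-side logic, not the Nios II firmware itself; the low-bit mask mirrors the `count & 0x01` expression in the Nios II example that follows:

```python
LED_PIO_BASE = 0x810000  # PIO base address used in the Nios II example below

def pio_word(signal):
    """Mask the incoming signal to its lowest bit, as the Nios II program
    does before writing it to the PIO register."""
    return signal & 0x01

def led_state(signal):
    """Signaling rule: 0 switches the LED (or appliance) off, 1 switches it on."""
    return "ON" if pio_word(signal) else "OFF"
```

On the real board, the masked word would be written to the PIO register at `LED_PIO_BASE` rather than returned as a string.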
The code below shows an example of a signal being sent to the Control Board via the PIO base address connected to the LED. If the signal received from the Core Program is 0, the program switches off the LED, and vice versa if the signal is 1.

IOWR_ALTERA_AVALON_PIO_DATA(0x810000, count & 0x01);

INITIAL SYSTEM TESTING AND EVALUATION

We conducted two different system tests, one for the Camera Monitoring Module and one for the Emotion Detection & Analysis Module. We utilized 10 different user faces for both tests in order to evaluate the system implementation effectively. For the Camera Monitoring Module, we conducted three test scenarios: 1) the functionality of the camera, 2) the level of facial detection and 3) the pre-processing quality. Based on the tests, we found that the module works as proposed (shown in Table 2 below).

Table 2. Camera Monitoring Module Test Scenarios

Test 1: Functional Camera   | Able to activate the camera and capture images as input
Test 2: Facial Detection    | Able to detect the user's face
Test 3: Image Preprocessing | Able to process the image and create the final cropped image as input

For the Emotion Detection & Analysis Module, we conducted two test scenarios: 1) face recognition for known and unknown residents and 2) detection of the four different emotions. Table 3 shows some of the test scenarios conducted for this module. The arrows in the images show the status for each of the scenarios.

Table 3. Emotion Detection & Analysis Module Test Scenarios

Test 1: Face Recognition (Unknown Resident)
Test 2: Happy Facial Detection

The performance of the system is excellent when a face is detected, but it degrades due to the heavy processing of every single image captured by the camera: as the system tries to analyze and detect possible faces within a captured image, it slows down. However, the accuracy of the facial detection in our proposed system reaches up to 90%, as the majority of the captured faces can be detected, even though the Eigen Face algorithm is very sensitive to lighting conditions. In terms of emotion detection, our system manages to detect each of the four emotions with an accuracy of more than 50%. Table 4 depicts the summary of the results collected for the emotion detection testing.

Table 4. Emotion Detection & Analysis Module Result Summary

Emotion | Recognition Rate
Neutral | 7 out of 10 Neutral faces
Happy   | 8 out of 10 Happy faces
Sad     | 6 out of 10 Sad faces
Sleepy  | 7 out of 10 Sleepy faces

CONCLUSION AND FUTURE WORK

The Eye 2H system is designed to provide a harmonious and happy (2H) living environment. By utilizing a combination of image processing, facial recognition and facial expression recognition techniques, a more intelligent smart home system was developed. As a result, we are able to propose an intelligent system that can control lighting and electronic devices based on facial and emotion detection. In the future, the system can be enhanced by extending the emotion parameters to include stress, anger, dislike and more. Besides that, the integration of voice recognition and gesture recognition into the system can increase its functionality and reliability for homeowners.

REFERENCES

American Psychological Association. (February 2008). Toward reducing work stress. Monitor on Psychology, 39(2), 9. Retrieved from http://www.apa.org/monitor/feb08/ceo.aspx

Boon, L. T., Husin, M. H., Zaaba, Z. F., & Osman, M. A. (2014). Eye 2H: A proposed automated smart home control system for detecting human emotions through facial detection. Proceedings of the 2014 5th International Conference on Information & Communication Technology for the Muslim World (ICT4M), 1-4.

Klein, S. (2013). Common stress relief tricks: What works and what doesn't. Huffington Post. Retrieved from http://www.huffingtonpost.com/2013/05/02/stress-relief-survey_n_3195466.html

Lucey, P., Cohn, J. F., Kanade, T., Saragih, J., Ambadar, Z., & Matthews, I. (2010). The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression. Proceedings of the 2010 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 94-101.

Thoma, M. V., La Marca, R., Bronnimann, R., Finkel, L., Ehlert, U., & Nater, U. (2013). The effect of music on the human stress response. PLoS One, 8(8). US National Library of Medicine.

Towers Watson. (3 September 2014). Workplace stress leads to less productive employees. Retrieved from http://www.towerswatson.com/en/press/2014/09/workplace-stress-leads-to-lessproductive-employees

Wighton, K. (October 2013). Desperate for a good night's sleep? Putting a red bulb in your bedside lamp might do the trick, but from weight gain to insomnia, research suggests artificial light may wreak havoc on your health. The Daily Mail. Retrieved from http://www.dailymail.co.uk/health/article-2470513/lights-effect-sleep-suggested-newresearch.html