An assistive application identifying emotional state and executing a methodical healing process for depressive individuals.


An assistive application identifying emotional state and executing a methodical healing process for depressive individuals

Bandara G.M.M.B.O (bhanukab@gmail.com), Godawita B.M.D.T (tharu9363@gmail.com), Gunathilaka M.D.M.L (maheshlak9@gmail.com), Maduranga P.S.W.B (shashithm@gmail.com), Dr. Dharshana Kasthurirathna (dharshana.k@sliit.lk)

Abstract - In this paper we present an assistive application that identifies emotional state using various advanced technologies and provides methodical treatments for depressive individuals. Depression is a very common mental disorder which can even lead to suicide [1]. It is acknowledged as a major public health problem by many national governments and international agencies. Fortunately, it is treatable, but most depressed people are unaware that they are suffering from depression [2]. Considering all this, we have implemented a mobile application to help people overcome depression. Our application uses several advanced techniques, such as facial analysis, voice analysis and social media behavior analysis, to recognize users' emotions and symptoms accurately. This helps to identify the user's depression level, and treatments are given based on it. The user's progress is monitored frequently and activities are assigned accordingly.

Keywords - diagnose, linguistic, emotion, sentiment

I. INTRODUCTION

Depression is a common and serious medical illness, which can even lead to suicide [1]. According to the World Health Organization (WHO), depression will be the second most important medical disease worldwide by the year 2020 [3]. Though depression is treatable, the problem is that most people suffering from it are never diagnosed, let alone treated [2]. It has also been a challenge for our healthcare system to connect people with screening or treatment because of the obsessive fear and shyness that most people have about consulting a psychiatrist.
Though there are a few depression-aid mobile applications available in the app market, none of them is able to diagnose the user's disorder accurately, owing to their very poor techniques and methodologies. These applications do not attempt to study the user's emotions or track the user's status at all. They just offer some basic questionnaires to identify the user's symptoms and provide some instructions to keep the mind busy. There are many different types of depression, so it is very important to identify each patient's type of depression when deciding on the right treatment. But those existing applications simply offer the same set of instructions to every user regardless of the user's condition. Therefore, these applications do not satisfy or fulfill the expectations of the user in need at all. As there is no control or proper guidance, the user's condition gets worse and worse, which can even lead to suicide. Considering all these issues, we came up with this mobile solution that effectively supports people in overcoming depression. It uses several advanced techniques to identify the user's emotions, symptoms and depression level accurately:

- Identify the emotions of the user by analyzing his/her facial expressions and diagnose depression using face analysis techniques.
- Identify the emotions of the user by analyzing his/her vocal expressions and diagnose depression using voice analysis techniques.
- Analyze linguistic features of the user's Twitter statuses and diagnose his/her depression.

This application also prompts the user to answer a series of standard, recognized questions to identify symptoms, medical and family history, etc. Using all these techniques we are able to diagnose the user's depression accurately. According to the depression level of each user, this application provides different treatments separately. We do not just give instructions; we provide simple, interesting yet effective recognized activities to help people combat depression and alleviate their negative moods. We monitor the user's progress from time to time and activities are assigned accordingly. This application provides guided meditations, positive-thinking relaxation music and many other tools to improve the mood and behavior of the user. It has a notification system which frequently reminds the user about the activities he or she should engage in to lift the mood. It also focuses on connecting users with their families and friends, which is very effective for depressed individuals. This application provides a suicide safety plan for users who are suffering from suicidal thoughts.

II. METHODOLOGY

This section discusses the procedure and technologies that we used to develop the application. The Depression-Aid application is developed as a mobile application, following international standards and software development standards, so that the user gets the maximum use and a high-performance experience.

A. Facial emotion analysis

The human face is the primary tool for inferring human emotional expressions and intentions; thus it is widely used in healthcare, driver safety, human-computer interaction (HCI), surveillance, and many other new applications. Facial expressions are one of the key means by which a person expresses emotions, and the activation of certain specific facial muscles is the foundation of facial emotional expression. Since facial expressions are so important, the first step is to identify the user's depression by examining the person's face and extracting the facial emotions.
The user is asked to look at the mobile phone's front camera, which allows the application to scan facial actions from the live facial video and present a real-time result regarding the person's emotions. In this research we have mainly focused on identifying the six basic emotions: anger, disgust, fear, joy, sadness and surprise.

Fig. 1. Six basic human expressions [4]

As stated above, the facial emotion extraction process consists of steps such as capturing, processing the captured image sequences and producing the emotional states. Initially the system presents an interface in which the front camera of the device is activated and a sequence of the user's facial images is captured. These images then undergo initial processing and basic facial features are detected. Feature extraction is then performed to estimate the user's depression level on a 0-10 scale.

Fig. 2. Feature extraction process

These processes are developed following the principles and concepts of the Facial Action Coding System (FACS). Based on the result, the user is classified with a depressive level. The application is developed using Android, Python, and an emotion data library for emotion detection and live prediction. Furthermore, a data set is used to train the process. To achieve higher accuracy, a deep-learning neural network is added to the program, which is like giving eyes to the system. The Facial Action Coding System (FACS) is a system that describes human facial movements by their appearance. Deconstructing each facial expression into Action Units (AUs) and the temporal segments that occur in that expression gives us the ability to code the emotion most likely underlying that expression. AUs are the basic elements for the construction of an expression; they represent minimal facial actions that cannot be separated into simpler actions.
Muscle actions and Action Units (AUs) do not correspond one-to-one: an AU may correspond to the action of one or more muscles, and a muscle can be associated with several AUs. An AU is, in short, a basic change in the appearance of the face caused by the activation of one or more facial muscles. The AUs are divided into groups according to position and/or the type of action involved. The first AUs are in the upper face, affecting the eyebrows, forehead and eyelids. The lower-face AUs are then presented in five groups: Up/Down, Horizontal, Oblique, Orbital, and Miscellaneous [4].
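As a minimal sketch of how detected AUs can be mapped to the basic emotions, the snippet below matches an AU set against prototype AU combinations. The prototype sets are common EMFACS-style examples chosen for illustration; they are not the exact mapping used by the application described here.

```python
# Illustrative sketch: mapping detected Action Units (AUs) to basic emotions.
# The AU prototype sets are common EMFACS-style examples, used here only as
# an assumption for demonstration.

EMOTION_PROTOTYPES = {
    "happiness": {6, 12},           # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},        # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},     # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},     # brow lowerer + lid raiser/tightener + lip tightener
    "disgust":   {9, 15},           # nose wrinkler + lip corner depressor
    "fear":      {1, 2, 4, 5, 20},  # raised/lowered brows + lid raiser + lip stretcher
}

def classify_emotion(detected_aus):
    """Return the emotion whose AU prototype best overlaps the detected AUs."""
    def score(prototype):
        # Fraction of the prototype's AUs that are present in the detected set.
        return len(prototype & detected_aus) / len(prototype)
    return max(EMOTION_PROTOTYPES, key=lambda e: score(EMOTION_PROTOTYPES[e]))

print(classify_emotion({6, 12}))     # a smile-like AU pattern
print(classify_emotion({1, 4, 15}))  # a sadness-like AU pattern
```

A real deployment would obtain the AU sets from the deep-learning detector mentioned above rather than from hand-coded inputs.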

Table 1. Action Units (AU): AU 1 Inner Brow Raiser; AU 4 Brow Lowerer; AU 12 Lip Corner Puller.

B. Vocal emotion extraction

Emotion plays an important role in human life. Emotions are basically categorized into six classes: happiness, anger, fear, disgust, surprise and sadness. Speech-to-text and language-modeling techniques are used to analyze emotions from speech. Figure 3 shows the whole process of the speech emotion recognition used to analyze the depression of the person. In the first stage, an emotional speech database consisting of annotated utterances by voice actors is used to obtain voice samples. Next, feature extraction is performed using an open-source feature extractor. A feature selection method is then applied to decrease the number of features, keeping only the most relevant ones. Finally, emotion recognition is performed by a classification algorithm.

Fig. 3. Voice analysis process

To analyze emotion from speech, we used a speech emotion database to train our emotion recognition algorithms. The database used here is the EMO-DB database (Berlin emotional speech database) [5], one of the most widely used databases for speech emotion recognition; it contains 535 audio files recorded by 10 actors. To extract the features we used the openSMILE tool, a highly accurate tool for feature extraction and signal processing. For classification we used support vector machine (SVM) and linear classification algorithms to distinguish the different emotional states. An SVM establishes a hyperplane as the decision surface, maximizing the margin of separation between negative and positive samples.
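The linear-classification step can be sketched with a toy example. The snippet below trains a simple perceptron, used here as a stand-in for the SVM, on two assumed acoustic features (e.g. mean energy and mean pitch) instead of real openSMILE feature vectors; the data and labels are illustrative only.

```python
# Minimal sketch of linear classification of acoustic features, assuming toy
# 2-D features (e.g. mean energy, mean pitch). A perceptron stands in for the
# SVM used in the actual system.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) matches labels (+1/-1)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = w[0] * x[0] + w[1] * x[1] + b
            if y * activation <= 0:  # misclassified: nudge the hyperplane
                w[0] += lr * y * x[0]
                w[1] += lr * y * x[1]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1

# Toy data: low energy/pitch labeled "sad" (-1), high energy/pitch "neutral" (+1).
samples = [(0.2, 0.1), (0.3, 0.2), (0.8, 0.9), (0.9, 0.7)]
labels = [-1, -1, 1, 1]
w, b = train_perceptron(samples, labels)
print([predict(w, b, x) for x in samples])
```

Unlike this sketch, an SVM additionally maximizes the margin between the two classes, which is why it was preferred in the system itself.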
Finally, through this speech emotion recognition process we are able to diagnose depression from speech.

C. Assessing depression through social media behavior

As another technique, we use social media to assess the depression of the user. Nowadays social media occupies a large share of human life, and people tend to express their feelings and emotions freely through it. In this paper we mainly focus on one of the most popular social media platforms, Twitter. Here, we gather the user's previously published Twitter messages, known as tweets, and analyze the sentiments of those tweets, which represent the user's state of mind. We gather the tweets using the Twitter REST API. The data extracted from the API are then processed against a training data set in order to perform the sentiment analysis.

For sentiment analysis, we present a supervised sentiment classification model based on the Naïve Bayes algorithm. The Naïve Bayes classifier is a probabilistic learning method that can solve diagnostic and predictive problems. Naïve Bayes classifiers are among the most successful known algorithms for learning to classify text documents [6], and they take less time to train than other classifiers. Here we show how this model works. Bayes' theorem provides a way of calculating the posterior probability P(c|x) from P(c), P(x) and P(x|c):

P(c|x) = P(x|c) P(c) / P(x)

The Naïve Bayes classifier assumes that the effect of the value of a predictor (x) on a given class (c) is independent of the values of the other predictors. This assumption is called class conditional independence [7]. Above, P(c|x) is the posterior probability of the class (c, target) given the predictor (x, attributes), P(c) is the prior probability of the class, P(x|c) is the likelihood, i.e. the probability of the predictor given the class, and P(x) is the prior probability of the predictor.
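The Naïve Bayes classification described above can be sketched end to end on a toy corpus. The four labeled "tweets" below are invented for illustration; the real model is trained on a much larger annotated data set, and Laplace (add-one) smoothing is an assumed detail.

```python
# Minimal Naive Bayes sentiment sketch on a toy corpus of labeled "tweets".
# Scores log P(c) + sum log P(w|c) with Laplace smoothing, which is
# proportional to the posterior P(c|x) since P(x) is constant across classes.
import math
from collections import Counter

train = [
    ("i feel so sad and alone", "negative"),
    ("everything is hopeless today", "negative"),
    ("what a great happy day", "positive"),
    ("i love this wonderful life", "positive"),
]

# Class priors and per-class word frequencies.
class_counts = Counter(label for _, label in train)
word_counts = {c: Counter() for c in class_counts}
for text, label in train:
    word_counts[label].update(text.split())
vocab = {w for counter in word_counts.values() for w in counter}

def posterior_score(text, c):
    """Log-posterior score of class c for the given text."""
    score = math.log(class_counts[c] / len(train))           # log P(c)
    total = sum(word_counts[c].values())
    for w in text.split():
        # Laplace-smoothed log P(w|c)
        score += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
    return score

def classify(text):
    return max(class_counts, key=lambda c: posterior_score(text, c))

print(classify("i feel sad today"))
```

Words unseen in training still get a small smoothed probability, so a tweet is never assigned zero likelihood under either class.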

D. Activity suggestion based on the extracted data

Activities are one of the major factors that may help to reduce depression, although the same activities may not help every person. The objective of this component is to identify the activities that help to reduce the depression of a particular person. The person's depression level can be identified through the application using face recognition, voice recognition and social media analysis. According to the current depression level, the module shows a list of activities, and the user can add different activities to the list. The application lets the user select the most favorable activities, and the user must select more than three activities from the list. The application assigns a rating to each selected activity and analyzes the depression level daily. At the end of the day the application asks the user which activities he or she performed, and based on the answers it compares the user's previous depression level with the current one. If the current depression level is lower than the previous one, the application increases the rating of the activity by itself; otherwise it reduces the rating. This process is used to select activities for one month and to identify the most suitable activities for that person according to the value gained from doing them. The following tables illustrate this process for a user who selects three activities.

Note: the depression level is calculated as the average sadness value from face recognition, voice recognition and social media analysis.

Table 2. Activity rate calculation (first day)

             Activity 1 | Activity 2 | Activity 3
Start rate:      0      |     0      |     0
Performed: Activity 1 and Activity 3; current depression level (CDL) < previous depression level (PDL)
New rate:       0.5     |     0      |    0.5

Note: a start rate of 0 is constant for any activity. 0.5 is the constant value for increasing an activity's rate.
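The daily rate update described above can be sketched as follows, using the paper's constants (+0.5 on improvement, -0.2 otherwise). The depression-level values passed in are illustrative numbers on the 0-10 scale, not measured data.

```python
# Sketch of the daily activity-rating update: only the activities the user
# performed that day change, rising by 0.5 if the depression level dropped
# and falling by 0.2 otherwise. Level values below are illustrative.

INCREASE = 0.5  # constant increase when current level < previous level
DECREASE = 0.2  # constant decrease otherwise

def update_rates(rates, performed, current_level, previous_level):
    """Return the new ratings after one day."""
    new_rates = dict(rates)
    for activity in performed:
        if current_level < previous_level:
            new_rates[activity] = rates[activity] + INCREASE
        else:
            new_rates[activity] = rates[activity] - DECREASE
    return new_rates

# Three days like those in Tables 2-4 (assumed level values on the 0-10 scale):
rates = {"Activity 1": 0.0, "Activity 2": 0.0, "Activity 3": 0.0}
rates = update_rates(rates, ["Activity 1", "Activity 3"], 4, 6)  # day 1: level dropped
rates = update_rates(rates, ["Activity 1"], 7, 4)                # day 2: level rose
rates = update_rates(rates, ["Activity 2", "Activity 3"], 3, 7)  # day 3: level dropped
print(rates)
```

After enough days, the activities with the highest accumulated ratings are the ones the application recommends for that person.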
Table 3. Activity rate calculation (second day)

             Activity 1 | Activity 2 | Activity 3
Start rate:     0.5     |     0      |    0.5
Performed: Activity 1; CDL > PDL
New rate: 0.5 - 0.2 = 0.3 | 0 | 0.5

Note: 0.2 is the constant value for decreasing an activity's rate.

Table 4. Activity rate calculation (third day)

             Activity 1 | Activity 2 | Activity 3
Start rate:     0.3     |     0      |    0.5
Performed: Activity 2 and Activity 3; CDL < PDL
New rate: 0.3 | 0 + 0.5 = 0.5 | 0.5 + 0.5 = 1.0

After a week of using the application, the user can identify which activities are most suitable for reducing his or her depression.

III. CONCLUSION AND FUTURE WORK

This research demonstrated the implementation of a mobile application to assess and help control a person's mental depression. This approach addresses the present issue of depressive individuals being afraid to visit a psychiatrist, as they try to hide their condition from others and keep their issues to themselves out of concern for their privacy. In future work we plan to improve the effectiveness of the feature extraction process so that the system can assess people from any country in the world, with different appearances, and to support more languages.

ACKNOWLEDGMENT

This research on assessing depression through facial, vocal and social media behavior analysis was carried out as our 4th-year research project at the Sri Lanka Institute of Information Technology. We are extremely grateful to our supervisor Dr. Dharshana Kasthurirathna and our external supervisor Ms. Reka Attidiye, lecturers of the Sri Lanka Institute of Information Technology, who shared their great knowledge and gave constant encouragement and support, paving the way for the success of this research.

REFERENCES

[1] World Health Organization, "Depression," Media Centre, February 2017. [Online]. Available: http://www.who.int/mediacentre/factsheets/fs369/en/ [Accessed: March 13, 2017].
[2] R. Letzter, "If you have depression, you likely aren't getting the treatment you need," Business Insider, November 16, 2016. [Online]. Available: http://www.businessinsider.com/depression-treatment-access-mental-health-2016-11 [Accessed: March 13, 2017].
[3] M. D. Roos, "Depression in Sri Lanka, taking pills or talking?," Sunday Island e-paper, para. 2, July 26, 2010. [Online]. Available: http://www.island.lk/index.php?page_cat=article-details&page=article-details&code_title=2956 [Accessed: March 13, 2017].
[4] "A new tool to support diagnosis of neurological disorders by means of facial expression." [Online]. Available: http://ieeexplore.ieee.org/document/5966766/?reload=true [Accessed: March 13, 2017].
[5] P. Yadav and G. Aggarwal, "Speech Emotion Classification using Machine Learning." [Online]. Available: http://research.ijcaonline.org/volume118/number13/pxc3903564.pdf [Accessed: March 13, 2017].
[6] "Naive-Bayes Classification Algorithm." [Online]. Available: http://software.ucv.ro/~cmihaescu/ro/teaching/air/docs/lab4-NaiveBayes.pdf [Accessed: March 13, 2017].
[7] "Sentimental Analysis on Twitter Data using Naive Bayes." [Online]. Available: https://www.ijarcce.com/upload/2016/december-16/IJARCCE%2073.pdf [Accessed: March 13, 2017].