An assistive application identifying emotional state and executing a methodical healing process for depressive individuals

Bandara G.M.M.B.O bhanukab@gmail.com
Godawita B.M.D.T tharu9363@gmail.com
Gunathilaka M.D.M.L maheshlak9@gmail.com
Maduranga P.S.W.B shashithm@gmail.com
Dr. Dharshana Kasthurirathna dharshana.k@sliit.lk

Abstract - In this paper we present an assistive application that identifies the user's emotional state by utilizing several advanced technologies and provides methodical treatments for depressive individuals. Depression is a very common mental disorder which can even lead to suicide [1]. It is acknowledged as a major public health problem by many national governments and international agencies. Fortunately, it is treatable, but most depressed people are unaware that they are suffering from depression [2]. Considering all this, we have implemented a mobile application to help people overcome depression. Our application uses several advanced techniques, such as facial analysis, voice analysis and social media behavior analysis, to recognize the user's emotions and symptoms accurately. The user's depression is diagnosed on that basis, and treatments are given accordingly. The user's progress is monitored frequently, and activities are assigned based on it.

Keywords - depression, diagnosis, linguistic, emotion, sentiment

I. INTRODUCTION

Depression is a common and serious medical illness which can even lead to suicide [1]. According to the World Health Organization (WHO), depression will be the second most important medical disease worldwide by the year 2020 [3]. Though depression is treatable, the problem is that most people suffering from it are never diagnosed, let alone treated [2]. It has also been a challenge for our healthcare system to connect people with screening or treatment, because of the obsessive fear and shyness that keep most people from consulting a psychiatrist.
Though there are a few depression-aid mobile applications available in the app market, none of them is able to diagnose the user's disorder accurately, because the techniques and methodologies they use are very limited. These applications do not attempt to study the user's emotions or track the user's status at all. They just offer some basic questionnaires to identify the user's symptoms and provide some instructions to keep the mind busy. There are many different types of depression, so it is very important to identify each patient's depression type when deciding on the right treatment. But the existing applications offer the same set of instructions to every user, regardless of their condition. Therefore, these applications do not satisfy or fulfill the expectations of the user in need. As there is no control or proper guidance, the user's condition gets worse and worse, which can even lead to suicide. Considering all these issues, we came up with this mobile solution, which effectively supports people in overcoming depression. It uses several advanced techniques to identify the user's emotions, symptoms and depression type accurately:
- Identify the emotions of the user by analyzing his/her facial expressions and diagnose depression through face analysis techniques.
- Identify the emotions of the user by analyzing his/her vocal expressions and diagnose depression through voice analysis techniques.
- Analyze linguistic features of the user's Twitter statuses and diagnose his/her depression.
This application also prompts the user to answer a series of standard, recognized questions to identify symptoms, medical and family history, etc. Using all these techniques we are able to diagnose the user's depression
accurately. According to each user's depression type, this application provides different treatments separately. We do not just give instructions; we provide simple, interesting, yet effective recognized activities to help people combat depression and alleviate their negative moods. We monitor the user's progress from time to time, and activities are assigned accordingly. The application provides guided meditations, positive thinking and relaxation music, and many other tools to improve the mood and behavior of the user. It has a notification system which frequently reminds users about the activities they should engage in to lift their moods. It also focuses on connecting users with their families and friends, which is very effective for depressed individuals. The application provides a suicide safety plan for users who are suffering from suicidal thoughts.

II. METHODOLOGY

This section discusses the procedure and technologies that we used in order to develop the application. The Depression-Aid application is developed as a mobile application so that the user gets maximum usability and a high-performance experience, following international standards and software development standards.

A. Facial emotion analysis

The human face is the primary tool used for inferring human emotional expressions and intentions; thus it is widely used in healthcare, driver safety, human-computer interaction (HCI), surveillance, and many other new applications. Facial expressions are one of the key means of expressing a person's emotions, and the activation of certain specific facial muscles is the foundation of facial emotional expression. Since facial expressions are so important, the first step is to identify the user's depression by examining the person's face and extracting the facial emotions.
The user is asked to look at the mobile phone's front camera, which allows the application to scan the facial actions from the live facial video and present a real-time result regarding the person's emotions. In this research we have mainly focused on identifying the six basic emotions: anger, disgust, fear, joy, sadness and surprise.

Fig. 1. Six basic human expressions [4]

As stated above, the facial emotion extraction process consists of capturing image sequences, processing the captured images, and producing the emotional states. Initially the system presents an interface in which the front camera of the device is activated, and a sequence of the user's facial images is captured by the system. These images then undergo initial processing, and basic facial features are detected. Feature extraction is then used to estimate the user's depression on a 0-10 scale.

Fig. 2. Feature extraction process

The above processes are developed following the principles and concepts of the Facial Action Coding System (FACS). Based on the result, the user is classified with a depression level. The application is developed using Android and Python, with an emotion data library for emotion detection and live prediction. Furthermore, a data set is used to train the process. To achieve higher accuracy, a deep learning neural network is added to the program, which effectively gives eyes to the system. The Facial Action Coding System (FACS) is a system that defines human facial movements by their appearance. Deconstructing each facial expression into Action Units (AUs) and the temporal segments in which they occur gives us the ability to code the most likely emotion underlying that expression. AUs are the basic elements for the construction of an expression; they represent minimal facial actions that cannot be further separated into simpler actions.
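The final step above, mapping the extracted emotion estimates onto a 0-10 depression scale, can be sketched as follows. This is a minimal illustration only: the negative-emotion weights and the aggregation rule are our own assumptions for exposition, not values taken from the trained model described in the paper.

```python
# Sketch: aggregating per-frame emotion probabilities into the paper's
# 0-10 depression scale. The weights below are illustrative assumptions.

NEGATIVE_WEIGHTS = {"sadness": 1.0, "fear": 0.6, "disgust": 0.5, "anger": 0.5}

def depression_score(frames):
    """frames: list of dicts mapping emotion name -> probability in [0, 1].
    Returns a depression estimate on a 0-10 scale, averaged over frames."""
    if not frames:
        return 0.0
    total = 0.0
    for probs in frames:
        # Weighted sum of negative emotions, clamped to [0, 1] per frame
        negative = sum(NEGATIVE_WEIGHTS.get(e, 0.0) * p for e, p in probs.items())
        total += min(negative, 1.0)
    return round(10.0 * total / len(frames), 2)

frames = [
    {"sadness": 0.8, "joy": 0.1, "fear": 0.1},
    {"sadness": 0.6, "fear": 0.3, "surprise": 0.1},
]
print(depression_score(frames))  # prints 8.2
```

In the real pipeline, the per-frame probabilities would come from the deep-learning emotion classifier rather than being supplied by hand.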
Muscle actions and Action Units (AUs) do not correspond one-to-one: an AU may correspond to the action of one or more muscles, and a muscle can be associated with several AUs. An AU is, in short, a basic change in the appearance of the face caused by the activation of one or more facial muscles. The AUs are divided into groups according to position and/or the type of action involved. The first AUs are in the upper face and affect the eyebrows, forehead, and eyelids. The lower-face AUs are then presented in five groups: Up/Down, Horizontal, Oblique, Orbital, and Miscellaneous. [4]
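The FACS-style coding described above can be sketched as a lookup from detected AU sets to basic emotions. The AU combinations below follow commonly cited FACS emotion prototypes (e.g. happiness as AU6 + AU12); they are illustrative rather than the exact rule set used by the application.

```python
# Sketch: mapping detected Action Units (AUs) to a basic emotion using
# common FACS emotion prototypes. The combinations are illustrative.

EMOTION_AUS = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid actions + lip tightener
}

def match_emotion(detected_aus):
    """Return the first emotion whose AU prototype is fully contained in
    the detected AUs, or None if no combination matches."""
    detected = set(detected_aus)
    for emotion, aus in EMOTION_AUS.items():
        if aus <= detected:
            return emotion
    return None

print(match_emotion([1, 4, 15, 17]))  # prints sadness
```

A production system would instead score partial matches and AU intensities, but the containment test above captures the core idea of composing emotions from minimal facial actions.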
Table 1. Example Action Units (AUs)
AU 1 - Inner Brow Raiser
AU 4 - Brow Lowerer
AU 12 - Lip Corner Puller

B. Vocal emotion extraction

Emotion plays an important role in human life. Emotions are basically categorized into six classes: happiness, anger, fear, disgust, surprise and sadness. Speech-to-text and language modeling techniques are used to analyze emotions from speech. Figure 3 shows the whole process of the speech emotion recognition used to analyze a person's depression. In the first stage, an emotional speech database consisting of annotated utterances by voice actors is used to obtain voice samples. Next, feature extraction is performed using an open-source feature extractor. Then a feature selection method is used to decrease the number of features, selecting only the most relevant ones. Finally, emotion recognition is performed by a classification algorithm.

Fig. 3. Voice analysis process

To analyze emotion from speech, we used an emotional speech database to train our emotion recognition algorithms. The database used here is EMO-DB (the Berlin emotional speech database) [5]. It is one of the most widely used databases for speech emotion recognition, containing 535 audio files recorded by 10 actors. For feature extraction we used the openSMILE tool, a widely used tool for feature extraction and signal processing. For classification we used support vector machine and linear classification algorithms to classify the different emotional states. The classifier establishes a hyperplane as the decision surface, maximizing the margin of separation between negative and positive samples.
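The linear decision surface described above can be sketched as follows. The feature names, weights and class labels are illustrative assumptions; in the real pipeline they would be learned from openSMILE features extracted from EMO-DB.

```python
# Sketch: a linear (SVM-style) decision function over acoustic features.
# Weights, bias and features are illustrative, not trained values.

def decision(weights, bias, features):
    """Signed distance-like score from the hyperplane w.x + b = 0."""
    return sum(w * x for w, x in zip(weights, features)) + bias

def classify(weights, bias, features):
    # Positive side of the hyperplane -> "sad", negative side -> "neutral"
    return "sad" if decision(weights, bias, features) > 0 else "neutral"

# Hypothetical normalized features: [mean pitch, energy, speaking rate];
# low pitch, energy and rate push the score toward "sad".
weights = [-1.2, -0.8, -0.5]
bias = 1.0
print(classify(weights, bias, [0.2, 0.3, 0.4]))  # prints sad
```

An actual SVM additionally learns the weights by maximizing the margin between classes, and multi-class emotion recognition is typically built from several such binary separators.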
Finally, through the whole speech emotion recognition process, we are able to diagnose depression from speech.

C. Assessing depression through social media behavior

As another technique, we use social media to assess the user's depression. Nowadays social media occupies a large share of human life, and people tend to express their feelings and emotions freely through it. In this paper we mainly focus on one of the most popular social media platforms, Twitter. Here, we gather the user's previously published Twitter messages, known as tweets, and analyze the sentiment of those tweets, which represents the user's state of mind. We gather the tweets using the Twitter REST API. The data extracted from the API are then processed against a training data set in order to perform the sentiment analysis. For sentiment analysis, we present a supervised sentiment classification model based on the Naïve Bayes algorithm. The Naïve Bayes classifier is a probabilistic learning method that can solve diagnostic and predictive problems. Naïve Bayes classifiers are among the most successful known algorithms for learning to classify text documents [6], and they take less time to train than other classifiers. Here we show how we achieved the accuracy of this model. Bayes' theorem provides a way of calculating the posterior probability P(c|x) from P(c), P(x), and P(x|c):

P(c|x) = P(x|c) P(c) / P(x)

The Naïve Bayes classifier assumes that the effect of the value of a predictor (x) on a given class (c) is independent of the values of the other predictors. This assumption is called class conditional independence [7]. Above, P(c|x) is the posterior probability of class (c, target) given predictor (x, attributes), P(c) is the prior probability of the class, P(x|c) is the likelihood, i.e. the probability of the predictor given the class, and P(x) is the prior probability of the predictor.
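The Naïve Bayes classification just described can be sketched as a minimal multinomial text classifier with add-one (Laplace) smoothing. The training tweets and labels below are tiny illustrative stand-ins for the paper's actual training data set.

```python
# Sketch: multinomial Naive Bayes sentiment classification of short
# texts, with Laplace smoothing. Training samples are illustrative.
import math
from collections import Counter, defaultdict

def train(samples):
    """samples: list of (text, label). Returns priors, per-class word
    counts, and the vocabulary."""
    priors = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in samples:
        priors[label] += 1
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return priors, word_counts, vocab

def classify(text, priors, word_counts, vocab):
    """Pick argmax_c of log P(c) + sum_i log P(x_i | c)."""
    total = sum(priors.values())
    best, best_lp = None, float("-inf")
    for label in priors:
        lp = math.log(priors[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            lp += math.log((word_counts[label][word] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

samples = [
    ("i feel hopeless and sad", "negative"),
    ("life is empty and sad", "negative"),
    ("what a great happy day", "positive"),
]
model = train(samples)
print(classify("so sad and hopeless today", *model))  # prints negative
```

Working in log space avoids floating-point underflow when many word likelihoods are multiplied, and the add-one smoothing keeps unseen words from zeroing out a class.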
D. Activity suggestion based on the extracted data

Activities are one of the major factors that may help reduce depression, but an activity that helps one person may not help another. The objective of this component is to identify the activities that help reduce a particular person's depression. The person's depression level is identified through the application using face recognition, voice recognition and social media analysis; according to that level, the module shows a list of activities, to which the user can add further activities. The application asks the user to select their most favorable activities, and the user must select more than three activities from the list. The application assigns a rating to each selected activity and analyzes the depression level daily. At the end of each day the application asks the user what activities they did; based on the answers, it compares the user's previous depression level with the current one. If the current depression level is lower than the previous one, the application increases the rating of the activity; otherwise it reduces the rating. This process is run for one month to identify the most suitable activities for that person, according to the value gained from doing each activity. The following tables illustrate the process for a user who has selected three activities.

Note. The depression level is calculated as the average sadness value obtained from face recognition, voice recognition and social media analysis. CDL denotes the current depression level and PDL the previous depression level.

Table 2. Activity rate calculation (first day)
                   Activity 1        Activity 2   Activity 3
Start rate         0                 0            0
Activities done    Activity 1 and Activity 3
CDL vs PDL         CDL < PDL
New rate           0 + 0.5 = 0.5     0            0 + 0.5 = 0.5

Note. The start rate of 0 is constant for any activity.
Note. 0.5 is the constant value for increasing an activity's rate.

Table 3. Activity rate calculation (second day)
                   Activity 1        Activity 2   Activity 3
Start rate         0.5               0            0.5
Activities done    Activity 1
CDL vs PDL         CDL > PDL
New rate           0.5 - 0.2 = 0.3   0            0.5

Note. 0.2 is the constant value for decreasing an activity's rate.

Table 4. Activity rate calculation (third day)
                   Activity 1        Activity 2      Activity 3
Start rate         0.3               0               0.5
Activities done    Activity 2 and Activity 3
CDL vs PDL         CDL < PDL
New rate           0.3               0 + 0.5 = 0.5   0.5 + 0.5 = 1.0

After one week, the user can identify through the application which activities are more suitable for reducing their depression.

III. CONCLUSION AND FUTURE WORK

This research work demonstrated the implementation of a mobile application to assess and help control a person's mental depression. This approach addresses the present situation in which depressive individuals are afraid to visit a psychiatrist, as they try to hide their condition from others and keep their issues to themselves out of concern for their privacy. In future work we plan to improve the effectiveness of the feature extraction process so that the system can assess people of any appearance from any country in the world, and to support more languages.
ACKNOWLEDGMENT

This research on assessing depression through facial, vocal and social media behavior analysis was carried out as our fourth-year research project at the Sri Lanka Institute of Information Technology. We are extremely grateful to our supervisor Dr. Dharshana Kasthurirathna and our external supervisor Ms. Reka Attidiye, lecturers at the Sri Lanka Institute of Information Technology, who shared their great knowledge, constant encouragement and support, paving the pathway for the success of this research.

REFERENCES
[1] World Health Organization, "Depression," Media Centre, February 2017. [Online]. Available: http://www.who.int/mediacentre/factsheets/fs369/en/ [Accessed: March 13, 2017].
[2] R. Letzter, "If you have depression, you likely aren't getting the treatment you need," Business Insider, November 16, 2016. [Online]. Available: http://www.businessinsider.com/depression-treatment-access-mental-health-2016-11 [Accessed: March 13, 2017].
[3] M.D. Roos, "Depression in Sri Lanka, taking pills or talking?," Sunday Island e-paper, para. 2, July 26, 2010. [Online]. Available: http://www.island.lk/index.php?page_cat=article-details&page=article-details&code_title=2956 [Accessed: March 13, 2017].
[4] "A new tool to support diagnosis of neurological disorders by means of facial expression." [Online]. Available: http://ieeexplore.ieee.org/document/5966766/?reload=true [Accessed: March 13, 2017].
[5] P. Yadav and G. Aggarwal, "Speech Emotion Classification using Machine Learning." [Online]. Available: http://research.ijcaonline.org/volume118/number13/pxc3903564.pdf [Accessed: March 13, 2017].
[6] "Naive-Bayes Classification Algorithm." [Online]. Available: http://software.ucv.ro/~cmihaescu/ro/teaching/air/docs/lab4-NaiveBayes.pdf [Accessed: March 13, 2017].
[7] "Sentimental Analysis on Twitter Data using Naive Bayes." [Online]. Available: https://www.ijarcce.com/upload/2016/december-16/IJARCCE%2073.pdf [Accessed: March 13, 2017].