EEG-based Valence Level Recognition for Real-Time Applications


EEG-based Valence Level Recognition for Real-Time Applications. Yisi Liu, School of Electrical & Electronic Engineering, Nanyang Technological University, Singapore. Olga Sourina, School of Electrical & Electronic Engineering, Nanyang Technological University, Singapore.

Abstract—Emotions are important in human-computer interaction. Emotions can be classified based on the 3-dimensional Valence-Arousal-Dominance model, which allows defining any number of emotions even without discrete emotion labels. In this paper, we propose a real-time EEG-based subject-dependent valence level recognition algorithm in which thresholds are used to identify different levels of the valence dimension of human emotion. The algorithm identifies valence levels continuously. It was tested on EEG data labeled with valence levels from our own experiment and on the benchmark affective EEG database DEAP, where up to 9 levels of the valence dimension with high/low dominance were recognized. The algorithm was then applied to recognize 16 emotions defined by high/low arousal, high/low dominance and 4 levels of valence. At least 14 electrodes should be used to get better accuracy. The proposed algorithm can be implemented in real-time applications such as emotional avatars and e-learning systems.

Keywords—EEG; emotion recognition; Valence-Arousal-Dominance model; valence level recognition

I. INTRODUCTION

Recognition of user emotions from electroencephalographic (EEG) signals is attracting more and more attention since new wireless portable devices became easily available and can be used in human-computer interfaces. The integration of emotion detection in human-computer interfaces can be applied in many fields such as entertainment and education. Traditionally, emotions are detected by analyzing biosignals such as EEG, skin temperature, heart rate, etc. [1].
Now, more research is needed on recognition of emotions from EEG. The development of brain-computer interfaces (BCI) gives a new way to enhance the interaction between computer and human. Mental states such as concentration levels and emotions can be detected from the EEG signal in real time and serve as feedback to trigger commands in different applications, e.g. to modify the difficulty levels of video games or to adjust teaching methods in e-learning systems. There are subject-dependent and subject-independent emotion recognition algorithms. Subject-dependent algorithms have much better accuracy than subject-independent ones, but they need a training session implemented in the real-time application. Generally, the available algorithms consist of two parts: feature extraction and classification. A classifier is trained with features extracted from EEG data labeled with emotions. Thus, for a subject-dependent algorithm, the user/player needs to train the classifier by recording EEG data and labeling the data with emotions. For example, to cover 8 emotions in the Valence-Arousal-Dominance model, we need 8 sessions to collect EEG data labeled with 8 emotions such as happy (positive valence/high arousal/high dominance), surprised (positive/high arousal/low dominance), satisfied (positive/low arousal/high dominance), protected (positive/low arousal/low dominance), angry (negative/high arousal/high dominance), fear (negative/high arousal/low dominance), unconcerned (negative/low arousal/high dominance), and sad (negative/low arousal/low dominance) [2]. In many applications, more than two classes of valence may be needed. It is useful to recognize, for example, 2 levels of positive feeling and 2 levels of negative feeling, or even more levels of the valence dimension. In such a case, the training sessions could become very heavy and difficult to implement.
Therefore, in this paper, we propose to use a fractal dimension (FD) feature for continuous valence level recognition with thresholds in a real-time subject-dependent EEG-based emotion recognition algorithm following the Valence-Arousal-Dominance emotion model. Fractal dimension reflects the complexity of a signal and can be used to analyze the chaotic behavior of EEG signals; for example, FD was used in EEG-based detection of the concentration level in real time [3]. The proposed valence level recognition algorithm was tested on the EEG database collected in our own experiment and on the benchmark affective EEG database DEAP [4], where up to 9 levels of valence were available. Both databases follow the Valence-Arousal-Dominance emotion model. After that, the proposed algorithm was applied to recognize 16 emotions defined by high/low arousal, high/low dominance and 4 levels of the valence dimension as follows. First, 4 classes (combinations of high/low arousal and high/low dominance levels) were recognized by using fractal dimension and statistical features proposed in [2] and the Support Vector Machine (SVM) classifier. Second, 4 valence levels were recognized with the proposed valence level recognition algorithm using thresholds. In [5], discrete emotion labels were given for emotions defined by high/low

arousal, high/low dominance and 8 levels of valence in the Valence-Arousal-Dominance emotion model. We used just 4 levels of the valence scale and obtained 16 emotion names defined as follows. High arousal, high dominance, different valence (2 positive ones, from the least positive to the most positive) corresponds to activated/elated and joyful/happy emotions. High arousal, high dominance, different valence (2 negative ones, from the least negative to the most negative) corresponds to contempt/hostile and angry/frustrated. High arousal, low dominance, different valence (2 positive ones, from the least positive to the most positive) corresponds to anxious/surprised and fascinated/loved. High arousal, low dominance, different valence (2 negative ones, from the least negative to the most negative) corresponds to sinful/displeased and embarrassed/fearful. Low arousal, high dominance, different valence (2 positive ones, from the least positive to the most positive) corresponds to nonchalant/leisurely and relaxed/secure. Low arousal, high dominance, different valence (2 negative ones, from the least negative to the most negative) corresponds to mildly annoyed/disdainful and selfish/dissatisfied. Low arousal, low dominance, different valence (2 positive ones, from the least positive to the most positive) corresponds to solemn/quiet and humble/protected. Low arousal, low dominance, different valence (2 negative ones, from the least negative to the most negative) corresponds to fatigued/sad and bored/depressed. In some applications, it is more important to recognize from EEG whether the subject is feeling more or less positive than to know the name of the current emotion. In Section II, the related work includes a review of emotion classification models and of EEG-based emotion recognition algorithms. The fractal dimension algorithm used as a feature extraction method and the benchmark affective EEG database DEAP are introduced as well.
In Section III, the designed and implemented experiment is described. The proposed real-time valence level recognition algorithm and its application to recognition of 16 emotions are given in Section IV. Finally, Section V concludes the paper.

II. BACKGROUND

A. Emotion Classification Models

Generally, emotions can be classified in two ways. One way is to define emotional states using discrete categories and identify basic emotions that can be combined to form other emotions. Different sets of basic emotions were proposed. For example, eight basic emotional states were proposed by Plutchik: anger, fear, sadness, disgust, surprise, anticipation, acceptance and joy [6]. In this model, other emotions are combinations of the basic ones; for example, disappointment is composed of sadness and surprise. Another way to represent emotions is the dimensional approach. The most widely used classification is the bipolar model with valence and arousal dimensions proposed by Russell [7]. In this model, the valence dimension ranges from negative to positive, and the arousal dimension ranges from not aroused to excited. The 2-dimensional model can locate discrete emotion labels in its space [8], and it can define many emotions, even ones without discrete emotion labels. However, if emotions defined by the 2-dimensional model have the same arousal and valence values, for example, happy and surprised, which are both highly aroused and positive emotions, the 2-dimensional model cannot differentiate them. In order to get a comprehensive description of emotions, Mehrabian and Russell proposed the 3-dimensional Pleasure-Arousal-Dominance (PAD) model in [9]. In this model, the pleasure-displeasure dimension equals the valence dimension mentioned above, evaluating the pleasure level of the emotion. Arousal-nonarousal is equivalent to the arousal dimension, referring to the alertness of an emotion.
"Dominance-submissiveness" is a newly extended dimension, which is also named the control dimension of emotion [9][10]. It ranges from a feeling of being in control during the emotional experience to a feeling of being controlled by the emotion [8]. It makes the dimensional model more complete. By adding the dominance level, emotions such as happy and surprised can be differentiated, since happy is a high-dominance emotion and surprised is a low-dominance emotion. With the help of the third emotional dimension, more emotion labels can be located in the 3D space. For example, activated, elated, joyful and happy are emotion labels with high arousal, high dominance and different intensities of positive valence (all four of these emotions are positive, but their pleasure level ranges from the least positive to the most positive); contempt, hostile, angry and frustrated are emotion labels with high arousal, high dominance and different negative valence (all four are negative, but their displeasure level ranges from the least negative to the most negative) [5]. In our work, we use the 3-dimensional Valence-Arousal-Dominance emotion classification model.

B. Peer Work

EEG-based emotion recognition algorithms can be either subject-dependent or subject-independent. The advantage of subject-dependent recognition is that higher accuracy can be achieved since the classification is catered to each individual, but the disadvantage is that a new classifier is needed for every new subject. In [11], a subject-dependent algorithm was proposed. Power Spectral Density and Common Spatial Patterns approaches were used to extract features, and the Support Vector Machine (SVM) was selected as the classifier. The best accuracy obtained with 32 electrodes was 76% for two-level valence recognition and 67% for two-level arousal recognition. In [12], two levels of valence were recognized, and the accuracy obtained was 73% with 2 electrodes.
Besides the works using the dimensional emotion model, there are also subject-dependent algorithms that use discrete emotion labels. For example, in [13], four emotions - happy, angry, sad, and pleasant - were recognized with 24 electrodes by using power differences at symmetric electrode pairs and SVM as a

classifier. In [14], 6 emotions - pleasant, satisfied, happy, sad, frustrated, and fear - were recognized with 3 electrodes. In [15], high and low dominance were recognized by using the beta/alpha ratio as a feature and SVM as the classifier, with the best accuracy of 87.5% with 2 electrodes. In [16], a subject-independent algorithm was proposed and implemented. Power spectral features were extracted, and SVM was used to classify the data. An accuracy of 57% was obtained for three-level valence recognition and 52.4% for three-level arousal recognition with 32 electrodes. In [1] and [17], subject-independent algorithms were described. In [1], 3 emotional states were detected by using the power values of 6 EEG frequency bands from 34 electrodes as features, and the maximum accuracy of 56% was achieved. In [17], an accuracy of 66.7% was obtained by using statistical features from 3 electrodes and SVM to recognize three emotions. In [18], both subject-dependent and subject-independent algorithms were proposed, and 4 electrodes were used. An accuracy of 69.51% was achieved in differentiating two levels of valence with the subject-independent algorithm, and accuracies ranging from 70% to 100% were achieved with the subject-dependent one. Subject-dependent algorithms generally have higher accuracy than subject-independent ones. The number of emotions that can be recognized and the number of electrodes are very important for algorithm comparison as well. For example, although the accuracy in [13] is higher than in [12], 24 electrodes were needed in [13] whereas only 2 electrodes were used in [12]. In this paper, our main objective is to propose an algorithm allowing recognition of more than two levels of valence with a short training session, performing with high accuracy in real-time applications. C.
Fractal Dimension based Feature Extraction

In our work, we used the Higuchi algorithm [19] for fractal dimension value calculation. The algorithm gave better accuracy than other fractal dimension algorithms, as shown in [20], where the implemented algorithms were evaluated using Brownian and Weierstrass functions whose theoretical FD values are known.

1) Higuchi Algorithm

Let X(1), X(2), ..., X(N) be a finite set of time series samples. Then, t new time series are constructed as

X_t^m : X(m), X(m+t), X(m+2t), ..., X(m + floor((N-m)/t)*t),   (1)

where m = 1, 2, ..., t is the initial time and t is the interval time [19]. The length L_m(t) of each constructed series is calculated as

L_m(t) = (1/t) * [ ( sum_{i=1..floor((N-m)/t)} |X(m+i*t) - X(m+(i-1)*t)| ) * (N-1) / (floor((N-m)/t) * t) ].   (2)

Let L(t) denote the average value of L_m(t) over the t sets. If the relationship

L(t) ∝ t^(-FD)   (3)

exists, then the fractal dimension FD can be obtained from the slope of the logarithmic plot between t (ranging from 1 to t_max) and its associated L(t) [19]:

FD = -ln L(t) / ln t.   (4)

D. Benchmark Affective EEG Database

Recently, the DEAP database based on the Valence-Arousal-Dominance emotion model was published in [4]. It has a relatively large number of subjects (32) who participated in the data collection. The stimuli used to elicit emotions in the experiment were one-minute-long music videos, which are considered combined stimuli (visual and audio); 40 music videos were used. In the DEAP database, 32 EEG channels of the Biosemi ActiveTwo device [21] were used in the data recording. Different datasets are available in the DEAP database, for example, the EEG dataset and the video dataset with the recorded subjects' facial expressions. Here, we used the dataset of preprocessed EEG data [22]. The sampling rate of the original recorded data is 512 Hz, and the preprocessed data are downsampled to 128 Hz. As suggested by the developers of DEAP, this dataset is well suited for testing new algorithms.
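For illustration, the Higuchi computation in Section II.C can be sketched in Python as follows. This is a minimal sketch, not the authors' implementation; the function name and the default value of k_max (corresponding to t_max in Eq. (4)) are our own choices, and FD is estimated as the slope of a least-squares fit of ln L(t) against ln(1/t):

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Estimate the Higuchi fractal dimension of a 1-D signal.

    For each interval t, the curve length L_m(t) is computed for the
    decimated series starting at offset m (Eq. (2)), averaged over m into
    L(t), and FD is the slope of ln L(t) versus ln(1/t) (Eqs. (3)-(4)).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    ln_inv_t, ln_l = [], []
    for t in range(1, k_max + 1):
        lengths = []
        for m in range(t):                     # offsets m = 1..t (0-based here)
            idx = np.arange(m, n, t)           # X(m), X(m+t), X(m+2t), ...
            # sum of absolute differences with Higuchi's normalization factor
            diff_sum = np.abs(np.diff(x[idx])).sum()
            lengths.append(diff_sum * (n - 1) / ((len(idx) - 1) * t * t))
        ln_inv_t.append(np.log(1.0 / t))
        ln_l.append(np.log(np.mean(lengths)))
    # slope of the least-squares fit: ln L(t) = FD * ln(1/t) + const
    fd, _ = np.polyfit(ln_inv_t, ln_l, 1)
    return fd
```

As a sanity check, a straight line yields an FD close to 1, while Gaussian white noise yields an FD close to 2.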
Thus, in our work, we used this dataset to validate the proposed algorithm. More details about the DEAP database can be found in [4] and [22].

III. EXPERIMENT

Besides the affective EEG database DEAP, we also designed and carried out Experiment 1, based on the Valence-Arousal-Dominance emotion model, to collect EEG data labeled with emotions.

A. Stimuli

Sound clips selected from the International Affective Digitized Sounds (IADS) database [23], which also follows the Valence-Arousal-Dominance emotion model, were used to induce emotions. The choice of sound clips was based on

their Valence, Arousal and Dominance level ratings in the IADS database. The experiment consists of 16 sessions; the stimuli used in each session are given in Table I.

TABLE I. EXPERIMENT STIMULI SELECTION
(Valence, Arousal and Dominance ratings of the stimuli for Sessions 1-16.)

B. Subjects

A total of 12 subjects (3 female and 9 male) participated in the experiment. All of them are university students aged from 18 to 27 years, without auditory deficit or any history of mental illness.

C. Procedure

After a participant was invited to the project room, the experiment protocol and the usage of a self-assessment questionnaire were explained to him/her. The subjects needed to complete the questionnaire after the exposure to the audio stimuli. The Self-Assessment Manikin (SAM) technique [24] was employed, which uses the 3D model with valence, arousal and dominance dimensions and nine levels indicating the intensity in each dimension. In the questionnaire, the subjects were also asked to describe their feelings in any words, including emotions like relaxed, happy, or any other emotions they felt. The experiments were done with one subject at a time. The participants had to avoid making any movements while the sound clips were played. The construction of each experimental session is as follows; the experimental design complies with the standard EEG-based affective experiment protocol [11]-[13].
1. A beep tone to indicate the beginning of the sound clip (1 second).
2. A silent period for the participant to calm down (15 seconds).
3. The sound stimulus (30 seconds).
4. Silent period (2 seconds).
5. A beep tone to indicate the ending of the sound clip (1 second).
In summary, each session lasted 49 seconds plus the self-assessment time. D.
EEG Recording

In Experiment 1, we used the Emotiv device [25] with 14 electrodes located at AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8 and AF4, standardized by the American Electroencephalographic Society [26] (plus CMS/DRL as references). The technical parameters of the device are as follows: bandwidth Hz; digital notch filters at 50 Hz and 60 Hz; A/D converter with 16-bit resolution and a sampling rate of 128 Hz. The data are transferred via a wireless receiver. Recently, the Emotiv device has become more widely used in research [27][28]. The reliability and validity of EEG data collected by the Emotiv device were assessed in [29]: EEG data recorded from a standard EEG device and from Emotiv were compared, and the results showed that the Emotiv device can substitute for a standard EEG device in real-time applications where fewer electrodes are needed.

IV. EMOTION RECOGNITION ALGORITHM

According to [30], there are individual differences when recognizing emotions from EEG for each subject. We propose a novel subject-dependent algorithm for human emotion recognition with recognition of different valence levels. The algorithm can recognize up to 9 levels of valence with controlled dominance level (high or low), arousal level (high or low), or a combination of dominance/arousal levels. The overall emotion recognition algorithm consists of two parts. First, four classes combining high/low dominance and high/low arousal levels are recognized according to the algorithm proposed in [2]: fractal dimension and statistical features are extracted with a sliding window, and the Support Vector Machine (SVM) is used as a classifier. Then, the targeted number of valence levels is recognized with the algorithm proposed and described below. For example, for 16-emotion recognition, first the 4 classes (combinations of high/low dominance and high/low arousal levels) are recognized, and then 4 levels of the valence dimension are recognized. A.
Analysis of Self-Assessment Questionnaire

Although the chosen sound clips targeted specific emotional states, we found from the self-report questionnaire records that some emotions were not confirmed by the subjects. Our analysis was based on the questionnaire, which gave us the recorded participants' feelings. Since we have EEG data labeled with 9 levels of the dominance dimension and 9 levels of the arousal dimension, we considered a dominance or arousal rating >=5 as a high dominance or arousal level, and a dominance or arousal rating <5 as a low dominance or arousal level. After the

analysis of the questionnaires, we obtained EEG data labeled with different valence levels.

B. Valence Level Recognition

In [31] and [32], it was found for valence level recognition that the left hemisphere is more active during positive emotions, and the right hemisphere is more active during negative emotions. In our previous work [14], it was also found that individual differences exist between subjects: most of the subjects had more activity in the left hemisphere during positive emotions, but a few others had the right hemisphere more active during positive emotions. As a result, in our proposed valence level recognition algorithm, we use the difference of FD values computed from right and left hemisphere channels. For the database established from Experiment 1, since we have 7 channels from the left and 7 channels from the right hemisphere, in total we can obtain 49 different combinations to compute the difference of two FD values (ΔFD), for example, the channel pairs (AF3-AF4), (AF3-F8), (AF3-F4), (AF3-FC6), (AF3-T8), (AF3-P8) and (AF3-O2). The feature for valence level recognition is computed as follows. First, the data are filtered by a 2-42 Hz bandpass filter. The sliding window size is 512 samples, shifted each time by 1 new sample to compute one new FD value as described in (4). Then, ΔFD is calculated as

ΔFD = (FD_left)_m - (FD_right)_n,   (5)

where (FD_left)_m denotes the FD computed from the left hemisphere, (FD_right)_n denotes the FD computed from the right hemisphere, m = 1, 2, ..., 7 denotes each channel from the left hemisphere, and n = 1, 2, ..., 7 denotes each channel from the right hemisphere. The obtained ΔFD is averaged over every 128 samples to get the mean ΔFD, which is used as the feature for valence level recognition. In order to get continuous recognition of valence levels, thresholds are used to identify the valence level.
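As an illustration, the feature computation above can be sketched in Python for one (left, right) channel pair. This is a minimal sketch under our own assumptions: the 2-42 Hz bandpass, the 512-sample window shifted by one sample, and the averaging over every 128 ΔFD values come from the paper, but the filter design (a 4th-order Butterworth here) and all function names are ours:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension: slope of ln L(t) versus ln(1/t)."""
    n = len(x)
    ln_inv_t, ln_l = [], []
    for t in range(1, k_max + 1):
        lengths = []
        for m in range(t):
            idx = np.arange(m, n, t)
            lengths.append(np.abs(np.diff(x[idx])).sum()
                           * (n - 1) / ((len(idx) - 1) * t * t))
        ln_inv_t.append(np.log(1.0 / t))
        ln_l.append(np.log(np.mean(lengths)))
    return np.polyfit(ln_inv_t, ln_l, 1)[0]

def mean_delta_fd(left, right, fs=128, win=512, avg=128):
    """Mean ΔFD feature of Eq. (5) for one (left, right) channel pair:
    bandpass 2-42 Hz, FD over a sliding window of `win` samples shifted by
    one sample, then ΔFD averaged over every `avg` consecutive values."""
    b, a = butter(4, [2 / (fs / 2), 42 / (fs / 2)], btype="band")
    left, right = filtfilt(b, a, left), filtfilt(b, a, right)
    dfd = np.array([higuchi_fd(left[i:i + win]) - higuchi_fd(right[i:i + win])
                    for i in range(len(left) - win + 1)])
    k = len(dfd) // avg                      # one feature per 128 ΔFD values
    return dfd[:k * avg].reshape(k, avg).mean(axis=1)
```

For example, a 6-second recording at 128 Hz (768 samples) yields 257 ΔFD values and therefore two averaged features per channel pair.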
In the training session, the averaged differences of FD values (mean ΔFD) computed from EEG data labeled with two valence levels but the same dominance level (high or low) or the same arousal level (high or low) are used as input to the threshold selection algorithm. As a result, the lateralization pattern for that particular subject is figured out, and thresholds are set for the application session. Here, the EEG data labeled with two valence levels consist of the data with the most positive and the most negative labels. In the application session, FD values with unknown valence level are compared with the thresholds within the lateralization pattern of that subject and identified correspondingly.

C. Threshold Selection Algorithm

In the training session, just two valence states (negative and positive) need to be elicited for each of the four arousal-dominance combinations, namely negative and positive for High Arousal/High Dominance (HA/HD), High Arousal/Low Dominance (HA/LD), Low Arousal/High Dominance (LA/HD), and Low Arousal/Low Dominance (LA/LD). For example, five clips from the IADS audio stimuli database [23] could be chosen to evoke the positive and negative states for each arousal-dominance combination. Then, the EEG data are recorded while the user is listening to the sound stimuli. To set the thresholds for recognition of different valence levels, EEG data labeled with positive and negative states are taken as input. Since we found that the lateralization pattern differs between subjects [14], we need to figure out this pattern for the current user first. The output is the set of optimal thresholds for valence level detection. The threshold selection algorithm for valence recognition is the same for HA/HD, HA/LD, LA/HD and LA/LD, thus only the HA/HD case is described as follows. Firstly, ΔFD in (5) is computed from all the possible channel pairs from the right and left hemispheres.
A sliding window of 512 samples shifted by 1 new sample is used each time to get one new ΔFD. Secondly, the mean ΔFD is computed by averaging every 128 samples of ΔFD. The lateralization pattern is figured out by comparing the mean ΔFD values for the data labeled with the negative and with the positive valence level. If the mean ΔFD for the negative valence level is larger than the mean ΔFD for the positive valence level, the lateralization pattern label is set to 0; otherwise, it is set to 1. After finding the lateralization pattern, MaxΔFD and MinΔFD are assigned according to the pattern label (0 or 1): if the label is 0, MaxΔFD equals the maximum value of ΔFD computed from the data labeled with the negative valence level, and MinΔFD equals the minimum value of ΔFD computed from the data labeled with the positive valence level; if the label is 1, MaxΔFD equals the maximum value of ΔFD from the data labeled with the positive valence level, and MinΔFD equals the minimum value of ΔFD computed from the data labeled with the negative valence level. Finally, the thresholds for the different valence levels are set by dividing the area between MaxΔFD and MinΔFD according to the targeted number of valence levels to be recognized. For example, if 4 valence levels are targeted, the three thresholds are set up as follows:

T1 = MaxΔFD - (MaxΔFD - MinΔFD)/4,
T2 = T1 - (MaxΔFD - MinΔFD)/4,
T3 = T2 - (MaxΔFD - MinΔFD)/4.

The channel pair used in the application phase is the one with the largest margin between MaxΔFD and MinΔFD (that is, the largest MaxΔFD - MinΔFD). The thresholds obtained from the proposed algorithm are in descending or ascending order based on the label obtained in the second step, e.g. T1 > T2 > ... > Tn. These

thresholds are used in the application phase such that switching to a different valence level takes place when the FD values computed in the application session satisfy one of the constraints given by the thresholds; for example, if a ΔFD value is larger than T2 and smaller than T1, then that sample is considered to belong to valence level 2.

D. Results

First, the data collected from Experiment 1 were used to test the valence level recognition algorithm with controlled dominance level. 90% of the data with the most negative and the most positive ratings available for one subject were used as training data to set the thresholds for that subject; the testing data were then composed of the remaining 10% of the data with the most negative and most positive ratings, which were not used in the training session, and of all the data with other valence level ratings. The best accuracies obtained over the channel pairs are shown in Table II. An X in the table means that data of different valence levels with low dominance are not available for Subject 8.

TABLE II. CLASSIFICATION ACCURACY OF DATA FROM EXPERIMENT 1 (%)
(For each subject S1-S12: the number of recognized valence levels and the best accuracy under high and low dominance.)

We also validated our algorithm on the data of the DEAP database, where combined audio and visual stimuli were used for emotion induction. The same procedure was used to test the proposed algorithm. In the DEAP database, since its EEG device has 14 channels from the left hemisphere and 14 channels from the right hemisphere, in total we can obtain 196 different combinations to compute ΔFD. The best accuracy among these 196 combinations is shown in Table III.
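The threshold selection of Section IV.C and the level assignment used in the application phase can be sketched as follows. This is a minimal sketch with our own function names; the band-to-level mapping follows the example above, where a ΔFD between T2 and T1 belongs to valence level 2:

```python
import numpy as np

def select_thresholds(dfd_negative, dfd_positive, n_levels=4):
    """Training phase: determine the subject's lateralization pattern,
    then divide [MinΔFD, MaxΔFD] into n_levels equal bands."""
    neg, pos = np.asarray(dfd_negative), np.asarray(dfd_positive)
    pattern = 0 if neg.mean() > pos.mean() else 1   # 0: negative side larger
    if pattern == 0:
        d_max, d_min = neg.max(), pos.min()
    else:
        d_max, d_min = pos.max(), neg.min()
    step = (d_max - d_min) / n_levels
    # T1 > T2 > ... > T(n_levels - 1), as in the paper's 4-level example
    thresholds = [d_max - i * step for i in range(1, n_levels)]
    return pattern, thresholds

def valence_level(dfd, thresholds):
    """Application phase: a ΔFD above T1's band boundary is level 1,
    between T2 and T1 is level 2, and so on."""
    for level, t in enumerate(thresholds, start=1):
        if dfd > t:
            return level
    return len(thresholds) + 1
```

The channel pair used in the application phase would then be chosen by running select_thresholds over all pairs and keeping the one with the largest margin d_max - d_min.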
An X in the table means that data of different valence levels with low dominance are not available for Subjects 17, 27 and 32. As can be seen from Tables II and III, the results of testing on the data from Experiment 1 are comparable with the results on the benchmark database DEAP. The accuracy decreases as the number of recognized valence levels increases, as illustrated in Fig. 1 for Experiment 1 and the DEAP dataset. The accuracy of valence level recognition for Experiment 1 is generally lower than for the DEAP database, as the 2 electrodes used in the valence recognition algorithm could be chosen from only 49 possible electrode pairs in Experiment 1 but from 196 possible electrode pairs in the DEAP database. The proposed algorithm can be adapted to the targeted number of valence levels that needs to be recognized based on the demands of the application.

TABLE III. CLASSIFICATION ACCURACY OF DATA FROM DEAP DATABASE (%)
(For each subject S1-S32: the number of recognized valence levels and the best accuracy under high and low dominance.)

Besides controlling the dominance level, the valence level recognition algorithm was tested with controlled arousal (high or low) and with controlled dominance/arousal levels. The results were similar to the results given in Tables II and III.

Figure 1. Mean accuracy of valence levels with high/low dominance in Experiment 1 and DEAP databases.

Then, the proposed algorithm was tested in recognition of 16 emotions as follows. First, we classified the data into the 4 controlled Arousal-Dominance combinations: high arousal/high dominance, high arousal/low dominance, low arousal/high dominance, and low arousal/low dominance. Next, we recognized 4 valence levels in each class using the threshold algorithm. The classification of the different Arousal-Dominance combinations was done following the algorithm proposed in our previous work [2], where it was found that 3 statistical features plus 1 FD feature per channel achieve high classification accuracy for emotion recognition. The Support Vector Machine classifier with polynomial kernel implemented in LIBSVM [33] was used; the parameter settings were gamma = 1, coef0 = 1 and order d = 5. The kernel and parameters were chosen based on the results shown in [18], as they gave the best accuracy. 5-fold cross validation was adopted to get the classification accuracy and avoid overfitting. Data of 21 subjects from the DEAP database were used to test the algorithm performance, and the resulting accuracy is shown in Table IV. In this table, arousal-dominance recognition accuracy is the mean accuracy obtained from the 5-fold cross validation; valence recognition accuracy is the mean accuracy across all the 4-level valence recognitions for each arousal-dominance combination.

When comparing the proposed algorithm with other emotion recognition algorithms such as [11] and [12], which recognize two levels of valence with accuracies of 76% and 73% respectively, our algorithm can recognize up to nine valence levels, and if only two levels of valence are targeted, the accuracy can be up to 100% (Table II, Subject 8). Additionally, the proposed valence recognition algorithm can be combined with the arousal-dominance level recognition algorithm [2]; as a result, up to 16 emotions can be recognized.

TABLE IV. 16 EMOTIONS RECOGNITION ACCURACY OF DATA FROM DEAP DATABASE WITH CONTROLLED AROUSAL-DOMINANCE (%)
(For each of the 21 subjects: arousal-dominance recognition accuracy and 4-level valence recognition accuracy, with the average accuracy in the last row.)

V. CONCLUSION

Classification of valence levels ranging from most negative to most positive is very important in applications such as games, e-learning systems, etc. For example, the difficulty level of a game could be adjusted according to the user's current emotional state. In this work, we designed and carried out an experiment on emotion induction with audio stimuli using the Valence-Arousal-Dominance emotion model. An EEG database labeled with different valence levels and with high/low dominance and high/low arousal levels was established. Based on the data analysis results, we proposed a novel subject-dependent valence level recognition algorithm that uses the difference of FD values computed from the left and right hemispheres as a feature and sets thresholds to identify different valence levels. Up to 9 different levels of valence could be recognized. The proposed algorithm was validated with the benchmark EEG database DEAP as well. The algorithm can be adjusted to the number of valence levels needed by the application. The algorithm was applied to 16-emotion recognition with high/low arousal, high/low dominance and 4 levels of valence. All electrodes were used to get the best accuracy. The real-time EEG-based emotion recognition algorithm could be used in adaptive games, e-learning systems, etc. In Fig. 2, a user wearing the Emotiv device is interacting with the emotion-enabled application Hapek [34].
The user's emotions, recognized in real time from EEG, are visualized on the user's avatar [14]. Videos of the implemented real-time EEG-enabled applications, such as the emotional avatar and emotional music players, are presented in [35].
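The valence recognition step described in the conclusion can be sketched in a few lines of Python: compute the Higuchi fractal dimension [19] of a left- and a right-hemisphere channel and map their difference to a valence level by thresholding. The channel data, window length, and threshold values below are illustrative assumptions, not the paper's calibrated subject-dependent settings.

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension of a 1-D signal [19]."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lk = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # Normalized curve length of the sub-series starting at offset m
            lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k)
            lengths.append(lm / k)
        lk.append(np.mean(lengths))
    # FD is the slope of log(L(k)) versus log(1/k)
    coeffs = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lk), 1)
    return coeffs[0]

def valence_level(left, right, thresholds=(-0.05, 0.0, 0.05)):
    """Map the left-minus-right FD difference to one of 4 valence levels.

    The threshold values here are hypothetical; in the paper they are set
    per subject from labeled training data.
    """
    diff = higuchi_fd(left) - higuchi_fd(right)
    return int(np.searchsorted(thresholds, diff))

rng = np.random.default_rng(1)
level = valence_level(rng.normal(size=512), rng.normal(size=512))
print(level)  # an integer in 0..3
```

With three thresholds the mapping yields four levels; adding or removing thresholds adjusts the number of valence levels to the application, as the conclusion notes.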

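As a companion sketch to the arousal-dominance classification stage above (3 statistical features plus 1 FD feature per channel, polynomial-kernel SVM, 5-fold cross-validation), the reported LIBSVM settings (gamma = 1, coef0 = 1, order d = 5) map directly onto scikit-learn's SVC parameters. The random feature matrix below is a stand-in for real EEG features and is an assumption for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Stand-in for real EEG features: 3 statistical + 1 FD feature per channel,
# 14 channels, 120 trials (synthetic values for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 14 * 4))
y = rng.integers(0, 4, size=120)  # 4 arousal-dominance combinations

# Polynomial kernel K(u, v) = (gamma * u.v + coef0)^degree with the
# reported parameters: degree d = 5, gamma = 1, coef0 = 1.
clf = SVC(kernel="poly", degree=5, gamma=1, coef0=1)

# 5-fold cross-validation, as used in the paper to avoid overfitting.
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

On random labels the mean accuracy hovers near chance (about 0.25 for 4 classes); with real labeled EEG features the same pipeline produces the per-subject accuracies reported in Table IV.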
Figure 2. Real-time emotion-enabled application.

ACKNOWLEDGMENT

This research is supported by the Singapore National Research Foundation under its Interactive & Digital Media (IDM) Public Sector R&D Funding Initiative and administered by the IDM Programme Office.

REFERENCES

[1] G. Chanel, C. Rebetez, M. Bétrancourt, and T. Pun, "Emotion assessment from physiological signals for adaptation of game difficulty," IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, vol. 41, 2011.
[2] Y. Liu, O. Sourina, and M. K. Nguyen, "Real-time EEG-based Emotion Recognition Algorithm for Adaptive Games," IEEE Transactions on Computational Intelligence and AI in Games, under review.
[3] Q. Wang, O. Sourina, and M. K. Nguyen, "Fractal dimension based algorithm for neurofeedback games," in CGI 2010, Singapore, 2010, p. SP25.
[4] S. Koelstra, et al., "DEAP: A Database for Emotion Analysis Using Physiological Signals," IEEE Transactions on Affective Computing, vol. 3, 2012.
[5] J. A. Russell and A. Mehrabian, "Evidence for a three-factor theory of emotions," Journal of Research in Personality, vol. 11, 1977.
[6] R. Plutchik, Emotions and Life: Perspectives from Psychology, Biology, and Evolution, 1st ed. Washington, DC: American Psychological Association.
[7] J. A. Russell, "Affective space is bipolar," Journal of Personality and Social Psychology, vol. 37, 1979.
[8] I. B. Mauss and M. D. Robinson, "Measures of emotion: A review," Cognition and Emotion, vol. 23, 2009.
[9] A. Mehrabian, "Framework for a comprehensive description and measurement of emotional states," Genetic, Social, and General Psychology Monographs, vol. 121, 1995.
[10] A. Mehrabian, "Pleasure-Arousal-Dominance: A general framework for describing and measuring individual differences in temperament," Current Psychology, vol. 14, 1996.
[11] S. Koelstra, et al., "Single trial classification of EEG and peripheral physiological signals for recognition of emotions induced by music videos," LNAI, 2010.
[12] Q. Zhang and M. Lee, "Analysis of positive and negative emotions in natural scene using brain activity and GIST," Neurocomputing, vol. 72, 2009.
[13] Y. P. Lin, C. H. Wang, T. L. Wu, S. K. Jeng, and J. H. Chen, "EEG-based emotion recognition in music listening: A comparison of schemes for multiclass support vector machine," in ICASSP 2009, Taipei, 2009.
[14] Y. Liu, O. Sourina, and M. K. Nguyen, "Real-Time EEG-Based Human Emotion Recognition and Visualization," in 2010 International Conference on Cyberworlds (CW), 2010.
[15] Y. Liu and O. Sourina, "EEG-based Dominance Level Recognition for Emotion-enabled Interaction," in IEEE International Conference on Multimedia & Expo (ICME), 2012, in press.
[16] M. Soleymani, J. Lichtenauer, T. Pun, and M. Pantic, "A multimodal database for affect recognition and implicit tagging," IEEE Transactions on Affective Computing, vol. 3, 2012.
[17] K. Takahashi, "Remarks on emotion recognition from multi-modal bio-potential signals," in IEEE International Conference on Industrial Technology (ICIT), 2004.
[18] P. C. Petrantonakis and L. J. Hadjileontiadis, "Adaptive Emotional Information Retrieval From EEG Signals in the Time-Frequency Domain," IEEE Transactions on Signal Processing, vol. 60, 2012.
[19] T. Higuchi, "Approach to an irregular time series on the basis of the fractal theory," Physica D: Nonlinear Phenomena, vol. 31, 1988.
[20] Q. Wang, O. Sourina, and M. Nguyen, "Fractal dimension based neurofeedback in serious games," The Visual Computer, vol. 27, 2011.
[21] Biosemi. Available:
[22] S. Koelstra, et al., DEAP dataset, 2012. Available:
[23] M. M. Bradley and P. J. Lang, "The International Affective Digitized Sounds (2nd Edition; IADS-2): Affective ratings of sounds and instruction manual," University of Florida, Gainesville.
[24] M. M. Bradley, "Measuring emotion: The self-assessment manikin and the semantic differential," Journal of Behavior Therapy and Experimental Psychiatry, vol. 25, 1994.
[25] Emotiv. Available:
[26] "American Electroencephalographic Society guidelines for standard electrode position nomenclature," Journal of Clinical Neurophysiology, vol. 8, 1991.
[27] S. O'Regan, S. Faul, and W. Marnane, "Automatic detection of EEG artefacts arising from head movements," in 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2010.
[28] G. N. Ranky and S. Adamovich, "Analysis of a commercial EEG device for the control of a robot arm," in Proceedings of the 2010 IEEE 36th Annual Northeast Bioengineering Conference, 2010.
[29] K. Stytsenko, E. Jablonskis, and C. Prahm, "Evaluation of consumer EEG device Emotiv EPOC," poster presented at MEi:CogSci Conference 2011, Ljubljana, 2011.
[30] S. Hamann and T. Canli, "Individual differences in emotion processing," Current Opinion in Neurobiology, vol. 14, 2004.
[31] N. A. Jones and N. A. Fox, "Electroencephalogram asymmetry during emotionally evocative films and its relation to positive and negative affectivity," Brain and Cognition, vol. 20, 1992.
[32] T. Canli, J. E. Desmond, Z. Zhao, G. Glover, and J. D. E. Gabrieli, "Hemispheric asymmetry for emotional stimuli detected with fMRI," NeuroReport, vol. 9, 1998.
[33] C.-C. Chang and C.-J. Lin, LIBSVM: a library for support vector machines, 2001. Available:
[34] Haptek. Available:
[35] IDM-Project, Emotion-based personalized digital media experience in Co-Spaces, 2008. Available:

Real-time EEG-based Emotion Recognition and its Applications

Real-time EEG-based Emotion Recognition and its Applications Real-time EEG-based Emotion Recognition and its Applications Yisi Liu, Olga Sourina, and Minh Khoa Nguyen Nanyang Technological University Singapore {LIUY0053,EOSourina,RaymondKhoa}@ntu.edu.sg Abstract.

More information

DISCRETE WAVELET PACKET TRANSFORM FOR ELECTROENCEPHALOGRAM- BASED EMOTION RECOGNITION IN THE VALENCE-AROUSAL SPACE

DISCRETE WAVELET PACKET TRANSFORM FOR ELECTROENCEPHALOGRAM- BASED EMOTION RECOGNITION IN THE VALENCE-AROUSAL SPACE DISCRETE WAVELET PACKET TRANSFORM FOR ELECTROENCEPHALOGRAM- BASED EMOTION RECOGNITION IN THE VALENCE-AROUSAL SPACE Farzana Kabir Ahmad*and Oyenuga Wasiu Olakunle Computational Intelligence Research Cluster,

More information

Valence-arousal evaluation using physiological signals in an emotion recall paradigm. CHANEL, Guillaume, ANSARI ASL, Karim, PUN, Thierry.

Valence-arousal evaluation using physiological signals in an emotion recall paradigm. CHANEL, Guillaume, ANSARI ASL, Karim, PUN, Thierry. Proceedings Chapter Valence-arousal evaluation using physiological signals in an emotion recall paradigm CHANEL, Guillaume, ANSARI ASL, Karim, PUN, Thierry Abstract The work presented in this paper aims

More information

Detecting emotion from EEG signals using the emotive epoc device

Detecting emotion from EEG signals using the emotive epoc device See discussions, stats, and author profiles for this publication at: http://www.researchgate.net/publication/262288467 Detecting emotion from EEG signals using the emotive epoc device CONFERENCE PAPER

More information

EEG-Based Emotion Recognition via Fast and Robust Feature Smoothing

EEG-Based Emotion Recognition via Fast and Robust Feature Smoothing EEG-Based Emotion Recognition via Fast and Robust Feature Smoothing Cheng Tang 1, Di Wang 2, Ah-Hwee Tan 1,2, and Chunyan Miao 1,2 1 School of Computer Science and Engineering 2 Joint NTU-UBC Research

More information

INTER-RATER RELIABILITY OF ACTUAL TAGGED EMOTION CATEGORIES VALIDATION USING COHEN S KAPPA COEFFICIENT

INTER-RATER RELIABILITY OF ACTUAL TAGGED EMOTION CATEGORIES VALIDATION USING COHEN S KAPPA COEFFICIENT INTER-RATER RELIABILITY OF ACTUAL TAGGED EMOTION CATEGORIES VALIDATION USING COHEN S KAPPA COEFFICIENT 1 NOR RASHIDAH MD JUREMI, 2 *MOHD ASYRAF ZULKIFLEY, 3 AINI HUSSAIN, 4 WAN MIMI DIYANA WAN ZAKI Department

More information

An Emotional BCI during Listening to Quran

An Emotional BCI during Listening to Quran An Emotional BCI during Listening to Quran By ( Mashail Laffai Alsolamy ) A thesis submitted for the requirements of the degree of Master of Science in Computer Science Supervised By Dr. Anas M. Ali Fattouh

More information

Neurofeedback Games to Improve Cognitive Abilities

Neurofeedback Games to Improve Cognitive Abilities 2014 International Conference on Cyberworlds Neurofeedback Games to Improve Cognitive Abilities Yisi Liu Fraunhofer IDM@NTU Nanyang Technological University Singapore LIUYS@ntu.edu.sg Olga Sourina Fraunhofer

More information

Mental State Recognition by using Brain Waves

Mental State Recognition by using Brain Waves Indian Journal of Science and Technology, Vol 9(33), DOI: 10.17485/ijst/2016/v9i33/99622, September 2016 ISSN (Print) : 0974-6846 ISSN (Online) : 0974-5645 Mental State Recognition by using Brain Waves

More information

Emotion Detection Using Physiological Signals. M.A.Sc. Thesis Proposal Haiyan Xu Supervisor: Prof. K.N. Plataniotis

Emotion Detection Using Physiological Signals. M.A.Sc. Thesis Proposal Haiyan Xu Supervisor: Prof. K.N. Plataniotis Emotion Detection Using Physiological Signals M.A.Sc. Thesis Proposal Haiyan Xu Supervisor: Prof. K.N. Plataniotis May 10 th, 2011 Outline Emotion Detection Overview EEG for Emotion Detection Previous

More information

EEG Features in Mental Tasks Recognition and Neurofeedback

EEG Features in Mental Tasks Recognition and Neurofeedback EEG Features in Mental Tasks Recognition and Neurofeedback Ph.D. Candidate: Wang Qiang Supervisor: Asst. Prof. Olga Sourina Co-Supervisor: Assoc. Prof. Vladimir V. Kulish Division of Information Engineering

More information

Emotion classification using linear predictive features on wavelet-decomposed EEG data*

Emotion classification using linear predictive features on wavelet-decomposed EEG data* Emotion classification using linear predictive features on wavelet-decomposed EEG data* Luka Kraljević, Student Member, IEEE, Mladen Russo, Member, IEEE, and Marjan Sikora Abstract Emotions play a significant

More information

A Review on Analysis of EEG Signal

A Review on Analysis of EEG Signal A Review on Analysis of EEG Signal Shayela Nausheen Aziz 1, Naveen Kumar Dewangan 2 1 Dept of Electronics Engineering, BIT, Durg, C.G., India 2 Dept of Electronics Engineering, BIT, Durg, C.G., India Email

More information

Real-Time Electroencephalography-Based Emotion Recognition System

Real-Time Electroencephalography-Based Emotion Recognition System International Review on Computers and Software (I.RE.CO.S.), Vol. 11, N. 5 ISSN 1828-03 May 2016 Real-Time Electroencephalography-Based Emotion Recognition System Riyanarto Sarno, Muhammad Nadzeri Munawar,

More information

Towards an EEG-based Emotion Recognizer for Humanoid Robots

Towards an EEG-based Emotion Recognizer for Humanoid Robots The 18th IEEE International Symposium on Robot and Human Interactive Communication Toyama, Japan, Sept. 27-Oct. 2, 2009 WeC3.4 Towards an EEG-based Emotion Recognizer for Humanoid Robots Kristina Schaaff

More information

Emotion Classification along Valence Axis Using ERP Signals

Emotion Classification along Valence Axis Using ERP Signals Emotion Classification along Valence Axis Using ERP Signals [1] Mandeep Singh, [2] Mooninder Singh, [3] Ankita Sandel [1, 2, 3]Department of Electrical & Instrumentation Engineering, Thapar University,

More information

A Brain Computer Interface System For Auto Piloting Wheelchair

A Brain Computer Interface System For Auto Piloting Wheelchair A Brain Computer Interface System For Auto Piloting Wheelchair Reshmi G, N. Kumaravel & M. Sasikala Centre for Medical Electronics, Dept. of Electronics and Communication Engineering, College of Engineering,

More information

MUSIC is important for people because it pleasure

MUSIC is important for people because it pleasure , March 18-20, 2015, Hong Kong Music Recommender System Using a Forehead-mounted Electrical Potential Monitoring Device to Classify Mental States Sungri Chong, Ryosuke Yamanishi, Yasuhiro Tsubo, Haruo

More information

On Shape And the Computability of Emotions X. Lu, et al.

On Shape And the Computability of Emotions X. Lu, et al. On Shape And the Computability of Emotions X. Lu, et al. MICC Reading group 10.07.2013 1 On Shape and the Computability of Emotion X. Lu, P. Suryanarayan, R. B. Adams Jr., J. Li, M. G. Newman, J. Z. Wang

More information

Comparing affective responses to standardized pictures and videos: A study report

Comparing affective responses to standardized pictures and videos: A study report Comparing affective responses to standardized pictures and videos: A study report Marko Horvat 1, Davor Kukolja 2 and Dragutin Ivanec 3 1 Polytechnic of Zagreb, Department of Computer Science and Information

More information

An Algorithm to Detect Emotion States and Stress Levels Using EEG Signals

An Algorithm to Detect Emotion States and Stress Levels Using EEG Signals An Algorithm to Detect Emotion States and Stress Levels Using EEG Signals Thejaswini S 1, Dr. K M Ravi Kumar 2, Abijith Vijayendra 1, Rupali Shyam 1, Preethi D Anchan 1, Ekanth Gowda 1 1 Department of

More information

Affective Game Engines: Motivation & Requirements

Affective Game Engines: Motivation & Requirements Affective Game Engines: Motivation & Requirements Eva Hudlicka Psychometrix Associates Blacksburg, VA hudlicka@ieee.org psychometrixassociates.com DigiPen Institute of Technology February 20, 2009 1 Outline

More information

Emotion Detection from EEG signals with Continuous Wavelet Analyzing

Emotion Detection from EEG signals with Continuous Wavelet Analyzing American Journal of Computing Research Repository, 2014, Vol. 2, No. 4, 66-70 Available online at http://pubs.sciepub.com/ajcrr/2/4/3 Science and Education Publishing DOI:10.12691/ajcrr-2-4-3 Emotion Detection

More information

Emotionally Augmented Storytelling Agent

Emotionally Augmented Storytelling Agent Emotionally Augmented Storytelling Agent The Effects of Dimensional Emotion Modeling for Agent Behavior Control Sangyoon Lee 1(&), Andrew E. Johnson 2, Jason Leigh 2, Luc Renambot 2, Steve Jones 3, and

More information

Facial expression recognition with spatiotemporal local descriptors

Facial expression recognition with spatiotemporal local descriptors Facial expression recognition with spatiotemporal local descriptors Guoying Zhao, Matti Pietikäinen Machine Vision Group, Infotech Oulu and Department of Electrical and Information Engineering, P. O. Box

More information

Machine Intelligence Based Detection and Classification of Human Physiology and Emotions

Machine Intelligence Based Detection and Classification of Human Physiology and Emotions Volume No.-7, Issue No.-1, January, 2019 Machine Intelligence Based Detection and Classification of Human Physiology and Emotions Dhanya.M Dept. of Computer Science NSS College, ottapalam(taluk), Palakkad,

More information

Recognising Emotions from Keyboard Stroke Pattern

Recognising Emotions from Keyboard Stroke Pattern Recognising Emotions from Keyboard Stroke Pattern Preeti Khanna Faculty SBM, SVKM s NMIMS Vile Parle, Mumbai M.Sasikumar Associate Director CDAC, Kharghar Navi Mumbai ABSTRACT In day to day life, emotions

More information

Feature Extraction for Emotion Recognition and Modelling using Neurophysiological Data

Feature Extraction for Emotion Recognition and Modelling using Neurophysiological Data Feature Extraction for Emotion Recognition and Modelling using Neurophysiological Data Anas Samara School of Computing and Mathematics Ulster University Belfast BT37 0QB United Kingdom Email: samara-a@email.ulster.ac.uk

More information

Dimensional Emotion Prediction from Spontaneous Head Gestures for Interaction with Sensitive Artificial Listeners

Dimensional Emotion Prediction from Spontaneous Head Gestures for Interaction with Sensitive Artificial Listeners Dimensional Emotion Prediction from Spontaneous Head Gestures for Interaction with Sensitive Artificial Listeners Hatice Gunes and Maja Pantic Department of Computing, Imperial College London 180 Queen

More information

Emotion based E-learning System using Physiological Signals. Dr. Jerritta S, Dr. Arun S School of Engineering, Vels University, Chennai

Emotion based E-learning System using Physiological Signals. Dr. Jerritta S, Dr. Arun S School of Engineering, Vels University, Chennai CHENNAI - INDIA Emotion based E-learning System using Physiological Signals School of Engineering, Vels University, Chennai Outline Introduction Existing Research works on Emotion Recognition Research

More information

A Vision-based Affective Computing System. Jieyu Zhao Ningbo University, China

A Vision-based Affective Computing System. Jieyu Zhao Ningbo University, China A Vision-based Affective Computing System Jieyu Zhao Ningbo University, China Outline Affective Computing A Dynamic 3D Morphable Model Facial Expression Recognition Probabilistic Graphical Models Some

More information

Introduction to affect computing and its applications

Introduction to affect computing and its applications Introduction to affect computing and its applications Overview What is emotion? What is affective computing + examples? Why is affective computing useful? How do we do affect computing? Some interesting

More information

SSRG International Journal of Medical Science ( SSRG IJMS ) Volume 4 Issue 12 December 2017

SSRG International Journal of Medical Science ( SSRG IJMS ) Volume 4 Issue 12 December 2017 Classification and Determination of Human Emotional States using EEG SougataBhattacharjee 1, A. I. Siddiki 2, Dr. Praveen Kumar Yadav 3, Saikat Maity 4 1 Computer Science and Engineering Dept., Dr. B.

More information

Emotion Recognition using a Cauchy Naive Bayes Classifier

Emotion Recognition using a Cauchy Naive Bayes Classifier Emotion Recognition using a Cauchy Naive Bayes Classifier Abstract Recognizing human facial expression and emotion by computer is an interesting and challenging problem. In this paper we propose a method

More information

Fuzzy Model on Human Emotions Recognition

Fuzzy Model on Human Emotions Recognition Fuzzy Model on Human Emotions Recognition KAVEH BAKHTIYARI &HAFIZAH HUSAIN Department of Electrical, Electronics and Systems Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan

More information

Analysis of Emotion Recognition using Facial Expressions, Speech and Multimodal Information

Analysis of Emotion Recognition using Facial Expressions, Speech and Multimodal Information Analysis of Emotion Recognition using Facial Expressions, Speech and Multimodal Information C. Busso, Z. Deng, S. Yildirim, M. Bulut, C. M. Lee, A. Kazemzadeh, S. Lee, U. Neumann, S. Narayanan Emotion

More information

Development of novel algorithm by combining Wavelet based Enhanced Canny edge Detection and Adaptive Filtering Method for Human Emotion Recognition

Development of novel algorithm by combining Wavelet based Enhanced Canny edge Detection and Adaptive Filtering Method for Human Emotion Recognition International Journal of Engineering Research and Development e-issn: 2278-067X, p-issn: 2278-800X, www.ijerd.com Volume 12, Issue 9 (September 2016), PP.67-72 Development of novel algorithm by combining

More information

One Class SVM and Canonical Correlation Analysis increase performance in a c-vep based Brain-Computer Interface (BCI)

One Class SVM and Canonical Correlation Analysis increase performance in a c-vep based Brain-Computer Interface (BCI) One Class SVM and Canonical Correlation Analysis increase performance in a c-vep based Brain-Computer Interface (BCI) Martin Spüler 1, Wolfgang Rosenstiel 1 and Martin Bogdan 2,1 1-Wilhelm-Schickard-Institute

More information

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists 3,900 116,000 120M Open access books available International authors and editors Downloads Our

More information

Implementation of Spectral Maxima Sound processing for cochlear. implants by using Bark scale Frequency band partition

Implementation of Spectral Maxima Sound processing for cochlear. implants by using Bark scale Frequency band partition Implementation of Spectral Maxima Sound processing for cochlear implants by using Bark scale Frequency band partition Han xianhua 1 Nie Kaibao 1 1 Department of Information Science and Engineering, Shandong

More information

Emotion Classification along Valence Axis Using Averaged ERP Signals

Emotion Classification along Valence Axis Using Averaged ERP Signals Emotion Classification along Valence Axis Using Averaged ERP Signals [1] Mandeep Singh, [2] Mooninder Singh, [3] Ankita Sandel [1, 2, 3]Department of Electrical & Instrumentation Engineering, Thapar University,

More information

Detection and Plotting Real Time Brain Waves

Detection and Plotting Real Time Brain Waves Detection and Plotting Real Time Brain Waves Prof. M. M. PAL ME (ESC(CS) Department Of Computer Engineering Suresh Deshmukh College Of Engineering, Wardha Abstract - The human brain, either is in the state

More information

A Pilot Study on Emotion Recognition System Using Electroencephalography (EEG) Signals

A Pilot Study on Emotion Recognition System Using Electroencephalography (EEG) Signals A Pilot Study on Emotion Recognition System Using Electroencephalography (EEG) Signals 1 B.K.N.Jyothirmai, 2 A.Narendra Babu & 3 B.Chaitanya Lakireddy Balireddy College of Engineering E-mail : 1 buragaddajyothi@gmail.com,

More information

Sociable Robots Peeping into the Human World

Sociable Robots Peeping into the Human World Sociable Robots Peeping into the Human World An Infant s Advantages Non-hostile environment Actively benevolent, empathic caregiver Co-exists with mature version of self Baby Scheme Physical form can evoke

More information

Construction of the EEG Emotion Judgment System Using Concept Base of EEG Features

Construction of the EEG Emotion Judgment System Using Concept Base of EEG Features Int'l Conf. Artificial Intelligence ICAI'15 485 Construction of the EEG Emotion Judgment System Using Concept Base of EEG Features Mayo Morimoto 1, Misako Imono 2, Seiji Tsuchiya 2, and Hirokazu Watabe

More information

Music-induced Emotions and Musical Regulation and Emotion Improvement Based on EEG Technology

Music-induced Emotions and Musical Regulation and Emotion Improvement Based on EEG Technology Music-induced Emotions and Musical Regulation and Emotion Improvement Based on EEG Technology Xiaoling Wu 1*, Guodong Sun 2 ABSTRACT Musical stimulation can induce emotions as well as adjust and improve

More information

A micropower support vector machine based seizure detection architecture for embedded medical devices

A micropower support vector machine based seizure detection architecture for embedded medical devices A micropower support vector machine based seizure detection architecture for embedded medical devices The MIT Faculty has made this article openly available. Please share how this access benefits you.

More information

SIGNIFICANT PREPROCESSING METHOD IN EEG-BASED EMOTIONS CLASSIFICATION

SIGNIFICANT PREPROCESSING METHOD IN EEG-BASED EMOTIONS CLASSIFICATION SIGNIFICANT PREPROCESSING METHOD IN EEG-BASED EMOTIONS CLASSIFICATION 1 MUHAMMAD NADZERI MUNAWAR, 2 RIYANARTO SARNO, 3 DIMAS ANTON ASFANI, 4 TOMOHIKO IGASAKI, 5 BRILIAN T. NUGRAHA 1 2 5 Department of Informatics,

More information

AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups

AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, VOL.?, NO.?,?? 1 AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups Juan Abdon Miranda-Correa, Student Member, IEEE, Mojtaba

More information

EMOTION RECOGNITION FROM USERS EEG SIGNALS WITH THE HELP OF STIMULUS VIDEOS

EMOTION RECOGNITION FROM USERS EEG SIGNALS WITH THE HELP OF STIMULUS VIDEOS EMOTION RECOGNITION FROM USERS EEG SIGNALS WITH THE HELP OF STIMULUS VIDEOS Yachen Zhu, Shangfei Wang School of Computer Science and Technology University of Science and Technology of China 2327, Hefei,

More information

Temporal Context and the Recognition of Emotion from Facial Expression

Temporal Context and the Recognition of Emotion from Facial Expression Temporal Context and the Recognition of Emotion from Facial Expression Rana El Kaliouby 1, Peter Robinson 1, Simeon Keates 2 1 Computer Laboratory University of Cambridge Cambridge CB3 0FD, U.K. {rana.el-kaliouby,

More information

Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition

Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition , pp.131-135 http://dx.doi.org/10.14257/astl.2013.39.24 Emotion Affective Color Transfer Using Feature Based Facial Expression Recognition SeungTaek Ryoo and Jae-Khun Chang School of Computer Engineering

More information

Novel single trial movement classification based on temporal dynamics of EEG

Novel single trial movement classification based on temporal dynamics of EEG Novel single trial movement classification based on temporal dynamics of EEG Conference or Workshop Item Accepted Version Wairagkar, M., Daly, I., Hayashi, Y. and Nasuto, S. (2014) Novel single trial movement

More information

Emote to Win: Affective Interactions with a Computer Game Agent

Emote to Win: Affective Interactions with a Computer Game Agent Emote to Win: Affective Interactions with a Computer Game Agent Jonghwa Kim, Nikolaus Bee, Johannes Wagner and Elisabeth André Multimedia Concepts and Application, Faculty for Applied Computer Science

More information

A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses Rashima Mahajan, Dipali Bansal, Shweta Singh

A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses Rashima Mahajan, Dipali Bansal, Shweta Singh A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses Rashima Mahajan, Dipali Bansal, Shweta Singh Abstract Real time non-invasive Brain Computer Interfaces have a significant

More information

Development of 2-Channel Eeg Device And Analysis Of Brain Wave For Depressed Persons

Development of 2-Channel Eeg Device And Analysis Of Brain Wave For Depressed Persons Development of 2-Channel Eeg Device And Analysis Of Brain Wave For Depressed Persons P.Amsaleka*, Dr.S.Mythili ** * PG Scholar, Applied Electronics, Department of Electronics and Communication, PSNA College

More information

Empirical Cognitive Modeling

Empirical Cognitive Modeling Empirical Cognitive Modeling Tanja Schultz Felix Putze Dominic Heger 24.5.2012 Lecture Cognitive Modeling (SS 2012) 1/49 Outline What are empirical cognitive modeling and recognition of human cognitive

More information

Gender Based Emotion Recognition using Speech Signals: A Review

Gender Based Emotion Recognition using Speech Signals: A Review 50 Gender Based Emotion Recognition using Speech Signals: A Review Parvinder Kaur 1, Mandeep Kaur 2 1 Department of Electronics and Communication Engineering, Punjabi University, Patiala, India 2 Department

More information

The Ordinal Nature of Emotions. Georgios N. Yannakakis, Roddy Cowie and Carlos Busso

The Ordinal Nature of Emotions. Georgios N. Yannakakis, Roddy Cowie and Carlos Busso The Ordinal Nature of Emotions Georgios N. Yannakakis, Roddy Cowie and Carlos Busso The story It seems that a rank-based FeelTrace yields higher inter-rater agreement Indeed, FeelTrace should actually

More information

Validation of a Low-Cost EEG Device for Mood Induction Studies

Validation of a Low-Cost EEG Device for Mood Induction Studies Validation of a Low-Cost EEG Device for Mood Induction Studies Alejandro Rodríguez a,1 Beatriz Rey a,b a,b and Mariano Alcañiz a Instituto Interuniversitario de Investigación en Bioingeniería y Tecnología

More information

Affective pictures and emotion analysis of facial expressions with local binary pattern operator: Preliminary results

Affective pictures and emotion analysis of facial expressions with local binary pattern operator: Preliminary results Affective pictures and emotion analysis of facial expressions with local binary pattern operator: Preliminary results Seppo J. Laukka 1, Antti Rantanen 1, Guoying Zhao 2, Matti Taini 2, Janne Heikkilä

More information

VALIDATION OF AN AUTOMATED SEIZURE DETECTION SYSTEM ON HEALTHY BABIES Histogram-based Energy Normalization for Montage Mismatch Compensation

VALIDATION OF AN AUTOMATED SEIZURE DETECTION SYSTEM ON HEALTHY BABIES Histogram-based Energy Normalization for Montage Mismatch Compensation VALIDATION OF AN AUTOMATED SEIZURE DETECTION SYSTEM ON HEALTHY BABIES Histogram-based Energy Normalization for Montage Mismatch Compensation A. Temko 1, I. Korotchikova 2, W. Marnane 1, G. Lightbody 1

More information

Blue Eyes Technology

Blue Eyes Technology Blue Eyes Technology D.D. Mondal #1, Arti Gupta *2, Tarang Soni *3, Neha Dandekar *4 1 Professor, Dept. of Electronics and Telecommunication, Sinhgad Institute of Technology and Science, Narhe, Maharastra,

More information

Asymmetric spatial pattern for EEG-based emotion detection

Asymmetric spatial pattern for EEG-based emotion detection WCCI 2012 IEEE World Congress on Computational Intelligence June, 10-15, 2012 - Brisbane, Australia IJCNN Asymmetric spatial pattern for EEG-based emotion detection Dong Huang, Cuntai Guan, Kai Keng Ang,

More information

Classification of Emotional Signals from the DEAP Dataset

Classification of Emotional Signals from the DEAP Dataset Classification of Emotional Signals from the DEAP Dataset Giuseppe Placidi 1, Paolo Di Giamberardino 2, Andrea Petracca 1, Matteo Spezialetti 1 and Daniela Iacoviello 2 1 A 2 VI_Lab, c/o Department of

More information

EMOTION CLASSIFICATION: HOW DOES AN AUTOMATED SYSTEM COMPARE TO NAÏVE HUMAN CODERS?

EMOTION CLASSIFICATION: HOW DOES AN AUTOMATED SYSTEM COMPARE TO NAÏVE HUMAN CODERS? EMOTION CLASSIFICATION: HOW DOES AN AUTOMATED SYSTEM COMPARE TO NAÏVE HUMAN CODERS? Sefik Emre Eskimez, Kenneth Imade, Na Yang, Melissa Sturge- Apple, Zhiyao Duan, Wendi Heinzelman University of Rochester,

More information

Identification of Neuroimaging Biomarkers

Identification of Neuroimaging Biomarkers Identification of Neuroimaging Biomarkers Dan Goodwin, Tom Bleymaier, Shipra Bhal Advisor: Dr. Amit Etkin M.D./PhD, Stanford Psychiatry Department Abstract We present a supervised learning approach to

More information

REAL-TIME SMILE SONIFICATION USING SURFACE EMG SIGNAL AND THE EVALUATION OF ITS USABILITY.

REAL-TIME SMILE SONIFICATION USING SURFACE EMG SIGNAL AND THE EVALUATION OF ITS USABILITY. REAL-TIME SMILE SONIFICATION USING SURFACE EMG SIGNAL AND THE EVALUATION OF ITS USABILITY Yuki Nakayama 1 Yuji Takano 2 Masaki Matsubara 3 Kenji Suzuki 4 Hiroko Terasawa 3,5 1 Graduate School of Library,

More information

Classification of EEG signals in an Object Recognition task

Classification of EEG signals in an Object Recognition task Classification of EEG signals in an Object Recognition task Iacob D. Rus, Paul Marc, Mihaela Dinsoreanu, Rodica Potolea Technical University of Cluj-Napoca Cluj-Napoca, Romania 1 rus_iacob23@yahoo.com,

More information

Research Article Fusion of Facial Expressions and EEG for Multimodal Emotion Recognition

Research Article Fusion of Facial Expressions and EEG for Multimodal Emotion Recognition Hindawi Computational Intelligence and Neuroscience Volume 2017, Article ID 2107451, 8 pages https://doi.org/10.1155/2017/2107451 Research Article Fusion of Facial Expressions and EEG for Multimodal Emotion

More information

Classification of Video Game Player Experience Using Consumer-Grade Electroencephalography

Classification of Video Game Player Experience Using Consumer-Grade Electroencephalography > REPLACE THIS LINE WITH YOUR PAPER IDENTIFICATION NUMBER (DOUBLE-CLICK HERE TO EDIT) < 1 Classification of Video Game Player Experience Using Consumer-Grade Electroencephalography Thomas D. Parsons, Timothy

More information

Design of Intelligent Emotion Feedback to Assist Users Regulate Emotions: Framework and Principles

Design of Intelligent Emotion Feedback to Assist Users Regulate Emotions: Framework and Principles 2015 International Conference on Affective Computing and Intelligent Interaction (ACII) Design of Intelligent Emotion to Assist Users Regulate Emotions: Framework and Principles Donghai Wang, James G.

More information

Proceedings Chapter. Reference. Boredom, Engagement and Anxiety as Indicators for Adaptation to Difficulty in Games. CHANEL, Guillaume, et al.

Proceedings Chapter. Reference. Boredom, Engagement and Anxiety as Indicators for Adaptation to Difficulty in Games. CHANEL, Guillaume, et al. Proceedings Chapter Boredom, Engagement and Anxiety as Indicators for Adaptation to Difficulty in Games CHANEL, Guillaume, et al. Abstract This paper proposes an approach based on emotion recognition to

More information

This is the accepted version of this article. To be published as : This is the author version published as:

This is the accepted version of this article. To be published as : This is the author version published as: QUT Digital Repository: http://eprints.qut.edu.au/ This is the author version published as: This is the accepted version of this article. To be published as : This is the author version published as: Chew,

More information

QUANTIFICATION OF EMOTIONAL FEATURES OF PHOTOPLETHYSOMOGRAPHIC WAVEFORMS USING BOX-COUNTING METHOD OF FRACTAL DIMENSION

QUANTIFICATION OF EMOTIONAL FEATURES OF PHOTOPLETHYSOMOGRAPHIC WAVEFORMS USING BOX-COUNTING METHOD OF FRACTAL DIMENSION QUANTIFICATION OF EMOTIONAL FEATURES OF PHOTOPLETHYSOMOGRAPHIC WAVEFORMS USING BOX-COUNTING METHOD OF FRACTAL DIMENSION Andrews Samraj*, Nasir G. Noma*, Shohel Sayeed* and Nikos E. Mastorakis** *Faculty

More information

An Overview of BMIs. Luca Rossini. Workshop on Brain Machine Interfaces for Space Applications

An Overview of BMIs. Luca Rossini. Workshop on Brain Machine Interfaces for Space Applications An Overview of BMIs Luca Rossini Workshop on Brain Machine Interfaces for Space Applications European Space Research and Technology Centre, European Space Agency Noordvijk, 30 th November 2009 Definition

More information

Combining complexity measures of EEG data: multiplying measures reveal previously hidden information

Combining complexity measures of EEG data: multiplying measures reveal previously hidden information Combining complexity measures of EEG data: multiplying measures reveal previously hidden information Thomas F Burns Abstract PrePrints Many studies have noted significant differences among human EEG results

More information

Implementation of Image Processing and Classification Techniques on EEG Images for Emotion Recognition System

Implementation of Image Processing and Classification Techniques on EEG Images for Emotion Recognition System Implementation of Image Processing and Classification Techniques on EEG Images for Emotion Recognition System Priyanka Abhang, Bharti Gawali Department of Computer Science and Information Technology, Dr.

More information

USING AUDITORY SALIENCY TO UNDERSTAND COMPLEX AUDITORY SCENES

USING AUDITORY SALIENCY TO UNDERSTAND COMPLEX AUDITORY SCENES USING AUDITORY SALIENCY TO UNDERSTAND COMPLEX AUDITORY SCENES Varinthira Duangudom and David V Anderson School of Electrical and Computer Engineering, Georgia Institute of Technology Atlanta, GA 30332

More information

Thought Technology Ltd.

Thought Technology Ltd. Thought Technology Ltd. 8205 Montreal/ Toronto Blvd. Suite 223, Montreal West, QC H4X 1N1 Canada Tel: (800) 361-3651 ۰ (514) 489-8251 Fax: (514) 489-8255 E-mail: mail@thoughttechnology.com Webpage: http://www.thoughttechnology.com

More information

Selection of Emotionally Salient Audio-Visual Features for Modeling Human Evaluations of Synthetic Character Emotion Displays

Selection of Emotionally Salient Audio-Visual Features for Modeling Human Evaluations of Synthetic Character Emotion Displays Selection of Emotionally Salient Audio-Visual Features for Modeling Human Evaluations of Synthetic Character Emotion Displays Emily Mower #1, Maja J Matarić 2,Shrikanth Narayanan # 3 # Department of Electrical

More information

This is a repository copy of Facial Expression Classification Using EEG and Gyroscope Signals.

This is a repository copy of Facial Expression Classification Using EEG and Gyroscope Signals. This is a repository copy of Facial Expression Classification Using EEG and Gyroscope Signals. White Rose Research Online URL for this paper: http://eprints.whiterose.ac.uk/116449/ Version: Accepted Version

More information

Neuro Approach to Examine Behavioral Competence: An Application of Visible Mind-Wave Measurement on Work Performance

Neuro Approach to Examine Behavioral Competence: An Application of Visible Mind-Wave Measurement on Work Performance Journal of Asian Vocational Education and Training Vol. 8, pp. 20-24, 2015 ISSN 2005-0550 Neuro Approach to Examine Behavioral Competence: An Application of Visible Mind-Wave Measurement on Work Performance

More information

Smart Sensor Based Human Emotion Recognition

Smart Sensor Based Human Emotion Recognition Smart Sensor Based Human Emotion Recognition 1 2 3 4 5 6 7 8 9 1 11 12 13 14 15 16 17 18 19 2 21 22 23 24 25 26 27 28 29 3 31 32 33 34 35 36 37 38 39 4 41 42 Abstract A smart sensor based emotion recognition

More information

EMOTION DETECTION THROUGH SPEECH AND FACIAL EXPRESSIONS

EMOTION DETECTION THROUGH SPEECH AND FACIAL EXPRESSIONS EMOTION DETECTION THROUGH SPEECH AND FACIAL EXPRESSIONS 1 KRISHNA MOHAN KUDIRI, 2 ABAS MD SAID AND 3 M YUNUS NAYAN 1 Computer and Information Sciences, Universiti Teknologi PETRONAS, Malaysia 2 Assoc.

More information

Not All Moods are Created Equal! Exploring Human Emotional States in Social Media

Not All Moods are Created Equal! Exploring Human Emotional States in Social Media Not All Moods are Created Equal! Exploring Human Emotional States in Social Media Munmun De Choudhury Scott Counts Michael Gamon Microsoft Research, Redmond {munmund, counts, mgamon}@microsoft.com [Ekman,

More information

Simultaneous Real-Time Detection of Motor Imagery and Error-Related Potentials for Improved BCI Accuracy

Simultaneous Real-Time Detection of Motor Imagery and Error-Related Potentials for Improved BCI Accuracy Simultaneous Real-Time Detection of Motor Imagery and Error-Related Potentials for Improved BCI Accuracy P. W. Ferrez 1,2 and J. del R. Millán 1,2 1 IDIAP Research Institute, Martigny, Switzerland 2 Ecole

More information

HCS 7367 Speech Perception

HCS 7367 Speech Perception Long-term spectrum of speech HCS 7367 Speech Perception Connected speech Absolute threshold Males Dr. Peter Assmann Fall 212 Females Long-term spectrum of speech Vowels Males Females 2) Absolute threshold

More information

Emotion recognition based on EEG features in movie clips with channel selection

Emotion recognition based on EEG features in movie clips with channel selection Brain Informatics (2017) 4:241 252 DOI 10.1007/s40708-017-0069-3 Emotion recognition based on EEG features in movie clips with channel selection Mehmet Siraç Özerdem. Hasan Polat Received: 12 April 2017

More information

Distributed Multisensory Signals Acquisition and Analysis in Dyadic Interactions

Distributed Multisensory Signals Acquisition and Analysis in Dyadic Interactions Distributed Multisensory Signals Acquisition and Analysis in Dyadic Interactions Ashish Tawari atawari@ucsd.edu Cuong Tran cutran@cs.ucsd.edu Anup Doshi anup.doshi@gmail.com Thorsten Zander Max Planck

More information

AUTOCORRELATION AND CROSS-CORRELARION ANALYSES OF ALPHA WAVES IN RELATION TO SUBJECTIVE PREFERENCE OF A FLICKERING LIGHT

AUTOCORRELATION AND CROSS-CORRELARION ANALYSES OF ALPHA WAVES IN RELATION TO SUBJECTIVE PREFERENCE OF A FLICKERING LIGHT AUTOCORRELATION AND CROSS-CORRELARION ANALYSES OF ALPHA WAVES IN RELATION TO SUBJECTIVE PREFERENCE OF A FLICKERING LIGHT Y. Soeta, S. Uetani, and Y. Ando Graduate School of Science and Technology, Kobe

More information

Classification of Human Emotions from Electroencephalogram (EEG) Signal using Deep Neural Network

Classification of Human Emotions from Electroencephalogram (EEG) Signal using Deep Neural Network Classification of Human Emotions from Electroencephalogram (EEG) Signal using Deep Neural Network Abeer Al-Nafjan College of Computer and Information Sciences Imam Muhammad bin Saud University Manar Hosny

More information

Analyzing Brainwaves While Listening To Quranic Recitation Compared With Listening To Music Based on EEG Signals

Analyzing Brainwaves While Listening To Quranic Recitation Compared With Listening To Music Based on EEG Signals International Journal on Perceptive and Cognitive Computing (IJPCC) Vol 3, Issue 1 (217) Analyzing Brainwaves While Listening To Quranic Recitation Compared With Listening To Music Based on EEG Signals

More information

Patients EEG Data Analysis via Spectrogram Image with a Convolution Neural Network

Patients EEG Data Analysis via Spectrogram Image with a Convolution Neural Network Patients EEG Data Analysis via Spectrogram Image with a Convolution Neural Network Longhao Yuan and Jianting Cao ( ) Graduate School of Engineering, Saitama Institute of Technology, Fusaiji 1690, Fukaya-shi,

More information

EEG based analysis and classification of human emotions is a new and challenging field that has gained momentum in the

EEG based analysis and classification of human emotions is a new and challenging field that has gained momentum in the Available Online through ISSN: 0975-766X CODEN: IJPTFI Research Article www.ijptonline.com EEG ANALYSIS FOR EMOTION RECOGNITION USING MULTI-WAVELET TRANSFORMS Swati Vaid,Chamandeep Kaur, Preeti UIET, PU,

More information

Hybrid EEG-HEG based Neurofeedback Device

Hybrid EEG-HEG based Neurofeedback Device APSIPA ASC 2011 Xi an Hybrid EEG-HEG based Neurofeedback Device Supassorn Rodrak *, Supatcha Namtong, and Yodchanan Wongsawat ** Department of Biomedical Engineering, Faculty of Engineering, Mahidol University,

More information

PHYSIOLOGICAL RESEARCH

PHYSIOLOGICAL RESEARCH DOMAIN STUDIES PHYSIOLOGICAL RESEARCH In order to understand the current landscape of psychophysiological evaluation methods, we conducted a survey of academic literature. We explored several different

More information

1. INTRODUCTION. Vision based Multi-feature HGR Algorithms for HCI using ISL Page 1

1. INTRODUCTION. Vision based Multi-feature HGR Algorithms for HCI using ISL Page 1 1. INTRODUCTION Sign language interpretation is one of the HCI applications where hand gesture plays important role for communication. This chapter discusses sign language interpretation system with present

More information

Frequency Tracking: LMS and RLS Applied to Speech Formant Estimation

Frequency Tracking: LMS and RLS Applied to Speech Formant Estimation Aldebaro Klautau - http://speech.ucsd.edu/aldebaro - 2/3/. Page. Frequency Tracking: LMS and RLS Applied to Speech Formant Estimation ) Introduction Several speech processing algorithms assume the signal

More information

Understanding Affective Experiences: Towards a Practical Framework in the VALIT-Project

Understanding Affective Experiences: Towards a Practical Framework in the VALIT-Project Understanding Affective Experiences: Towards a Practical Framework in the VALIT-Project Author: Mika Boedeker In the b2b-sector the role of affective experiences has not been as salient as in the b2c-sector.

More information