Facial Feature Model for Emotion Recognition Using Fuzzy Reasoning
Renan Contreras, Oleg Starostenko, Vicente Alarcon-Aquino, and Leticia Flores-Pulido
CENTIA, Department of Computing, Electronics and Mechatronics, Universidad de las Américas Puebla, Cholula, 72820, México
{renan.contrerasgz, oleg.starostenko,

Abstract. In this paper we present a fuzzy reasoning system that can measure and recognize the intensity of basic and non-prototypical facial expressions. The system's inputs are encoded facial deformations, described either in terms of Ekman's Action Units (AUs) or the Facial Animation Parameters (FAPs) of the MPEG-4 standard. The proposed fuzzy system uses a knowledge base implemented in the knowledge-acquisition and ontology editor Protégé. It allows the modeling of facial features obtained from geometric parameters coded by AUs and FAPs, as well as the definition of the rules required for classifying measured expressions. This paper also presents the framework designed for the fuzzification of the fuzzy classifier's input variables, based on a statistical analysis of the emotions expressed in video records of the standard Cohn-Kanade and Pantic MMI face databases. The proposed system has been tested in order to evaluate its capability for the detection, classification, and interpretation of facial expressions.

Keywords: Emotion recognition, facial features, knowledge-based framework, rule-based fuzzy classifier.

1 Introduction

Fuzzy systems and their combinations with neural networks have been successfully used for pattern recognition and for image indexing and interpretation [1, 2]. In the area of facial expression recognition the application of fuzzy reasoning remains marginal, even though some researchers have successfully used classifying systems that emulate the way humans identify prototypical expressions [3, 4, 5].
Facial expression recognition systems are usually based on two parts: a module that generates the feature vector corresponding to the emotion expressed in the analyzed image (described by pixel positions, colors, shapes, regions, etc.) and a classification module that recognizes an emotion and describes its intensity. Many techniques have been used for facial feature extraction. Some of them are based on Gabor wavelets [5], active appearance and geometric models [6], principal component analysis and hierarchical radial basis function networks [7], optical flow and deformable templates [8], the discrete cosine transform and neural networks [9], multilevel hidden Markov models [10], dynamic Bayesian networks [11], and others.

J.A. Carrasco-Ochoa et al. (Eds.): MCPR 2010, LNCS 6256. Springer-Verlag Berlin Heidelberg 2010

Even though these approaches may extract and interpret facial actions, there are no reports on how they might link standard facial actions to particular formal models or rules for automatic facial expression recognition. In this paper we present the development of a fuzzy reasoning system that is able not only to recognize facial expressions using standard Ekman AUs (Action Units) and the FAPs (Facial Animation Parameters) and FDPs (Facial Definition Parameters) of the MPEG-4 standard, but also to measure their intensity [12]. This proposal permits the creation of novel models for precise facial feature detection as well as for the recognition and interpretation of basic and non-prototypical emotions. The proposed fuzzy system for facial expression recognition consists of two principal modules. The first is a knowledge-based framework for modeling facial deformations by FAP and AU action units [13], developed according to well-known standards [14, 15]. The second recognizes facial expressions with a fuzzy classifier that, in combination with Gaussian functions, provides a measurement of emotion intensity. To reduce the relative subjectivity and lack of psychological meaning of emotional intensity levels, a statistical analysis of the facial actions in the Cohn-Kanade and Pantic image databases has been carried out [16, 17]. This information has been incorporated into the knowledge-based framework, enabling each facial deformation to be modeled. This approach has not been widely used in emotion classifiers, but we believe the technique permits the development of knowledge-based frameworks for facial expression recognition, because the semantics of facial actions can be analyzed using rule-based descriptors and a fuzzy reasoning classifier.
2 Knowledge-Based Framework

The knowledge-based framework allows facial deformations to be measured in terms of distances between fiducial points, modeled by FAPs and AUs and represented by rule-based descriptors that are then used in the fuzzification process and in the interpretation of emotion intensity. The fiducial points, represented by the FDPs of the MPEG-4 standard, provide automatic normalization of the measured facial deformations, making them invariant to the scale of the input images. The framework also allows facial deformations to be modeled by defining a set of rules for indexing and quantifying expressions. The proposed approach is able to detect and measure any type of facial expression; however, it has been tested on the six basic expressions (happiness, sadness, disgust, surprise, anger, and fear).

Fig. 1 shows the structure of the knowledge-based framework that supports the design of a fuzzy reasoning system. We exploit the relationships between the measured facial deformations, their mathematical description by the corresponding AUs and FAPs, and the rules required for identifying expressions. This knowledge-based framework has been implemented using the ontology editor Protégé, which provides an extensible, flexible, plug-and-play environment for fast prototyping and application development [18]. The proposed knowledge-based framework consists of two abstract super-classes: the Face_Model and the Emotion_Model. The Face_Model class defines different approaches for the representation of face features.
Fig. 1. Structure of the emotion knowledge database based on FAPs and AUs (the Face_Model branch covers distance differences, wavelets, optical flow, principal components, MPEG-4 FAPs, and FACS AUs; the Emotion_Model branch covers prototypical and non-prototypical emotions, statistical analysis, and emotion rules)

Four specific classes, based on AUs, FAPs, FDPs, and distances, have been created for the framework. The Emotion_Model class permits the creation of rule-based models for emotion indexing using the classes of the Face_Model. The instances of a particular Face_Model class contain the basic facial actions (AUs, FAPs), including the action number, its name, description, direction of motion, the facial muscles involved, the part of the face where the action occurs, etc. In particular, we propose emotion indexing based on measuring standard spatial variations of FDP positions, implemented by our Distance_Model. The framework may be extended with new non-standard classes and models. Its advantage is that the classes and instances with their attributes represent knowledge about emotions, and the parameters of any model may be automatically converted into the parameters of any other.

3 The Proposed Facial Model

In the proposed fuzzy reasoning system a facial model based on the analysis of nineteen FDPs has been adopted. It describes all the facial actions necessary to define either basic or non-prototypical emotions [13]. Fig. 2 (left) shows the selected FDPs with the numbers of the associated FAPs. Some FDPs are reference points that remain static during facial deformation. The FDPs used define the Distance_Class, which represents distances between fiducial reference points and particular FDPs (see Fig. 2). A facial animation parameter (FAP) represents the facial changes of an emotional expression with respect to the neutral expression.
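The Face_Model instances described above can be sketched in code. This is a minimal illustration only: the concrete layout of the Protégé ontology is not given in the paper, so the class and attribute names below (beyond those quoted in the text) are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class FacialAction:
    """One instance of a Face_Model class: a basic facial action (AU or FAP).

    The attributes follow the list given in the text (action number, name,
    description, direction of motion, muscles involved, face region); the
    concrete field names are illustrative, not the ontology's own.
    """
    number: int            # action number, e.g. AU 12 or FAP 59
    name: str              # e.g. "Lip Corner Puller"
    description: str
    motion_direction: str
    muscles: list = field(default_factory=list)  # facial muscles involved
    face_region: str = ""  # part of the face where the action occurs

# Example instance of an AU-based Face_Model class:
au12 = FacialAction(
    number=12,
    name="Lip Corner Puller",
    description="pulls the lip corners up and back",
    motion_direction="up/outward",
    muscles=["zygomaticus major"],
    face_region="mouth",
)
```

Keeping each action as a structured instance is what makes automatic conversion between the AU, FAP, and distance models possible: each representation can be derived from the shared attributes.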
The difference, called Dd, quantifies facial changes in terms of units defined by the MPEG-4 standard. Table 1 shows the fifteen instances of the Distance_Class chosen for our model, the geometric definitions of these distances, their measurement units, their relation to FAPs, and the actions they describe. Some reports by Plutchik [12], Pantic [5], and Esau [4] suggest a geometrical face model that includes not only distances but also the angles between the lines connecting the standard FDPs. However, this approach does not contribute significant precision and makes the processing too complex [13, 17].
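The distance difference Dd can be sketched as follows. This is a minimal sketch under stated assumptions: FDP points are taken as (x, y) coordinates in a dictionary keyed by MPEG-4 FDP labels, and `unit` stands for the normalizing MPEG-4 unit length (ENS, ES, IRISD, MW, or MNS in Table 1); neither data layout is prescribed by the paper.

```python
import math

def fdp_distance(p, q):
    """Euclidean distance between two fiducial points given as (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def distance_difference(fdp_a, fdp_b, neutral, expressive, unit):
    """Dd for one Distance_Class instance: the change of the distance between
    FDPs fdp_a and fdp_b from the neutral face to the expressive face,
    normalized by an MPEG-4 unit length so the result is scale invariant.

    neutral and expressive map FDP labels such as "8.3" to (x, y)
    coordinates; this dictionary layout is an assumption for illustration.
    """
    d_neutral = fdp_distance(neutral[fdp_a], neutral[fdp_b])
    d_expressive = fdp_distance(expressive[fdp_a], expressive[fdp_b])
    return (d_expressive - d_neutral) / unit

# D10 = d{8.3, 8.4}: mouth corners move apart from 40 px to 50 px,
# with an (illustrative) unit length of 10 px.
neutral = {"8.3": (30.0, 80.0), "8.4": (70.0, 80.0)}
smiling = {"8.3": (25.0, 80.0), "8.4": (75.0, 80.0)}
d10 = distance_difference("8.3", "8.4", neutral, smiling, unit=10.0)
```

Because both the neutral and the expressive distances are divided by the same unit length, the measurement stays comparable across input images of different sizes, which is the normalization property claimed for the framework.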
Fig. 2. FDPs used for recognizing facial expressions and definition of Distance_Class instances

Table 1. Description of the instances in the Distance_Class

  Dd   FDPs Difference   Units   FAP   Action Description
  D1   d{3.11, 4.1}      ENS     31    raise l i eyebrow
  D2   d{3.8, 4.2}       ENS     32    raise r i eyebrow
  D3   d{3.7, 4.3}       ENS     33    raise l m eyebrow
  D4   d{3.12, 4.4}      ENS     34    raise r m eyebrow
  D5   d{3.7, 4.5}       ENS     35    raise l o eyebrow
  D6   d{3.12, 4.6}      ENS     36    raise r o eyebrow
  D7   d{4.1, 4.2}       ES            squeeze l/r eyebrow
  D8   d{3.3, 3.1}       IRISD         close t/b l eyelid
  D9   d{3.4, 3.2}       IRISD         close t/b r eyelid
  D10  d{8.3, 8.4}       MW            stretch l/r cornerlip
  D11  d{3.11, 8.3}      ENS     59    raise l cornerlip o
  D12  d{3.8, 8.4}       ENS     60    raise r cornerlip o
  D13  d{9.15, 8.1}      MNS           lower t midlip
  D14  d{9.15, 8.2}      MNS           raise b midlip
  D15  d{8.1, 8.2}       MNS           raise b/t midlip

4 Fuzzification of Distance Instances

A fundamental step of fuzzy reasoning is the fuzzification of the input variables and the definition of the corresponding membership functions used for indexing facial deformations. The input variables are FAPs representing the variation of distances between fiducial points, computed over standard databases of indexed facial expressions, in particular the Kanade and Pantic databases [16, 17]. Each database consists of approximately 500 records of different emotions expressed by 100 subjects in frontal position. The accompanying metadata include annotations of FACS action units and emotion-specified expressions. The recorded videos show a series of 23 facial muscle motions described by combinations of action units (e.g., AU1+AU2 means inner and outer brows raised). Each record begins with a neutral or nearly neutral expression (neutral face) and finishes at the required target emotion. Table 2 shows the results of quantifying the distance differences (see Fig. 2) between fiducial points, giving the maximum and minimum values, the mean, and the standard deviation for each difference associated with a particular AU. Recall that the difference in distances is measured between the neutral face and a face performing an action that expresses an emotion.

Table 2. AU parameters determined for Kanade's database: the maximum, minimum, mean, and standard deviation of each distance difference associated with AU1, AU2, AU4, AU5, AU7, AU10, AU12, AU15, AU16, AU20, AU25, and AU27

Similar results have been obtained when the representation of emotions by AUs was analyzed using either Kanade's or Pantic's database. From Table 2 we already have a quantitative definition of the action units, which may be used for a continuous interpretation of emotion. For measuring the intensity of a facial expression the Gaussian function has been used, applying equation (1):

    f(x, σ, c) = exp(−(x − c)² / (2σ²))    (1)

The parameters are determined by the statistical analysis mentioned above, where c defines the position of the peak and σ controls the width of the bell-shaped Gaussian curve. The fuzzification process may be explained by analyzing the behavior of an action unit, for example AU12. According to the results of the statistical analysis made for AU12 (see Table 2), the range of its distance variable D10 is between 0 and MW (mouth width).
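The membership function of equation (1) and the derivation of intensity partitions from the Table 2 statistics can be sketched as follows. Two assumptions are made explicit here: the 2σ² normalization follows the common Gaussian-membership form (the paper only states that c is the peak and σ the width), and the placement of the low and high centers midway between the mean and the observed extremes is an illustrative choice, in the spirit of the paper's construction from the middle section.

```python
import math

def gaussmf(x, sigma, c):
    """Gaussian membership function of equation (1): peak at c, width sigma.
    The 2*sigma**2 normalization is the standard form, assumed here."""
    return math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

def intensity_partitions(mean, deviation, minimum, maximum):
    """Low/medium/high partitions for one distance variable.

    The medium section is centred on the observed mean with the observed
    deviation; the low and high centres are placed between the mean and the
    observed extremes (an illustrative assumption).
    Returns {level: (center, sigma)}.
    """
    return {
        "low":    ((minimum + mean) / 2.0, deviation),
        "medium": (mean, deviation),
        "high":   ((mean + maximum) / 2.0, deviation),
    }

def fuzzify(x, partitions):
    """Degree of membership of a measured distance in each intensity level."""
    return {level: gaussmf(x, sigma, c)
            for level, (c, sigma) in partitions.items()}

# A distance exactly at the mean belongs fully to the "medium" level.
parts = intensity_partitions(mean=0.5, deviation=0.15, minimum=0.0, maximum=1.0)
degrees = fuzzify(0.5, parts)
```

Evaluating every level for each input distance, rather than picking a single one, is what gives the classifier its continuous measurement of action-unit intensity.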
Fig. 3. Membership function plots and intensity partitions for the variables a) D10 and b) D11

For each variable, as for all the others, we have defined three levels of emotion intensity (low, medium, and high), dividing the range of distance variation in the corresponding proportions. These intervals may be computed from the data in Table 2, such as the center and width of the medium section given by the mean and deviation of D10. Once the middle section is defined, the Gaussian functions for the low and high sections are computed. Additionally, a saturation level is included, taking into account the maximum possible value of a facial deformation. Fig. 3 a) and b) show the final fuzzification of the variables D10 and D11. A membership function is obtained for each partition using Gaussian functions, providing a continuous measurement of the intensity of an action unit.

5 Fuzzy Inference System

The proposed model for the fuzzification of facial features has been tested on the designed fuzzy inference system. The system (see Fig. 4) consists of two modules: the first measures the values of the AUs composing the analyzed emotion; the second recognizes and interprets the intensity of facial expressions. The set of rules defined for the fuzzy logic that recognizes and measures the intensity of AUs and the corresponding emotions is shown in Tables 3 and 4.
Fig. 4. Block diagram of the fuzzy classifier of prototypical emotions

Table 3. Rules and distance variables for recognizing AUs

  Code   Description           Distance Diff.   Recognition Rule
  AU1    Inner Brow Raiser     D1, D2           Both increase in the same proportion
  AU2    Outer Brow Raiser     D5, D6           Both increase in the same proportion
  AU4    Brow Lowerer          D3, D4, D7       D3 & D4 increase, D7 decreases
  AU5    Upper Lid Raiser      D8, D9           Both increase in the same proportion
  AU7    Lid Tightener         D8, D9           Both decrease in the same proportion
  AU10   Upper Lip Raiser      D13              D13 decreases
  AU12   Lip Corner Puller     D10, D11, D12    D10 increases, D11 & D12 decrease
  AU15   Lip Corner Depressor  D11, D12         Both increase in the same proportion
  AU16   Lower Lip Depressor   D14              D14 increases
  AU20   Lip Stretcher         D10, D11, D12    D10, D11 & D12 increase
  AU25   Lips Part             D15              D15 increases
  AU27   Mouth Stretch         D10, D15         D10 decreases, D15 increases

Table 4. Rules and AUs for recognizing facial expressions

  Emotion    AUs Used                         Recognition Rule
  Sadness    AU1, AU4, AU15                   Increasing the 3 actions increases the expression intensity
  Happiness  AU12, AU7                        Presence of AU12 but not AU7 (blinking); increasing values increase the expression intensity
  Fear       AU1, AU2, AU4, AU5, AU20, AU27   Presence of the 6 actions but not AU7 (blinking); increasing values increase the expression intensity
  Surprise   AU1, AU2, AU5, AU27              Presence of the 4 actions but not AU7 (blinking); increasing values increase the expression intensity
  Anger      AU4, AU7                         Presence of AU4 & AU7; increasing values increase the expression intensity
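The rules of Tables 3 and 4 can be sketched as fuzzy-logic expressions. The choice of min as the fuzzy AND and 1 − x as the fuzzy NOT is the common Mamdani-style convention and is an assumption here; the paper does not name its operators.

```python
def fuzzy_and(*degrees):
    """Conjunction of membership degrees (Mamdani min, an assumption)."""
    return min(degrees)

def fuzzy_not(degree):
    """Standard fuzzy negation."""
    return 1.0 - degree

def au12_intensity(d10_increase, d11_decrease, d12_decrease):
    """Table 3 rule for AU12 (Lip Corner Puller):
    D10 increases while D11 and D12 decrease.  Each argument is the
    membership degree of the corresponding distance behavior in [0, 1]."""
    return fuzzy_and(d10_increase, d11_decrease, d12_decrease)

def sadness_intensity(au1, au4, au15):
    """Table 4 rule for Sadness: increasing AU1, AU4, and AU15 together
    increases the expression intensity."""
    return fuzzy_and(au1, au4, au15)

def happiness_intensity(au12, au7_blink):
    """Table 4 rule sketch for Happiness: AU12 present, suppressed while
    AU7 (blinking) is active."""
    return fuzzy_and(au12, fuzzy_not(au7_blink))

# Strongly pulled lip corners with moderately lowered D11/D12:
au12_degree = au12_intensity(0.9, 0.6, 0.7)
```

With min as the conjunction, the weakest supporting evidence bounds the rule output, so a single missing facial action caps the inferred intensity of the whole expression.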
Fig. 5 shows the user interface of the designed fuzzy inference system. In the upper right corner the reasoning process is visualized with the intensity of the analyzed action unit, AU12. The intensity of the input values is tested by the classifier using three discrimination levels described by Gaussian functions: the first row in Fig. 5 presents low intensity for all input distances, the second row medium, and the third row high intensity. The shaded areas correspond to the magnitudes of the membership functions that describe the contribution of each distance difference to a particular emotion. In some cases the displacements of symmetrical points on a face differ; this is also measured and shown in the fourth column. The intensity of the output variables for a particular action unit, presented in the fifth column, is modeled by three grades described by triangular functions instead of Gaussians. This approach is easy to implement and provides fast facial expression recognition without introducing additional interpretation errors. The proposed reasoning model is flexible enough to be extended with new features for the recognition of non-prototypical emotions.

Fig. 5. Measurement of AU12 (Lip Corner Puller) representing happiness of high intensity

6 Obtained Results and Discussion

The system's performance and the efficiency of the fuzzification model have been tested using Kanade's and Pantic's databases. Tables 5 and 6 show the confusion matrices obtained for five basic prototypical emotions at medium and high intensity (cells whose values were lost in transcription are left blank).

Table 5. Confusion matrix for expressions of medium intensity

  Emotion     Sadness   Surprise   Happiness   Anger    Fear
  Sadness     81%       9.50%
  Surprise    0.30%     96%
  Happiness                        96%         1.90%    1.90%
  Anger                            0.10%       92%      3.40%
  Fear        6%        5.80%

Table 6. Confusion matrix for expressions of high intensity

  Emotion     Sadness   Surprise   Happiness   Anger    Fear
  Sadness     84%       8%         0           0        8%
  Surprise    0.20%     96.40%
  Happiness                                    1.20%    1.20%
  Anger                                                 1.60%
  Fear        4.70%     5.70%

Finally, with regard to the correct evaluation of the expression reported by the system, Table 7 shows the comparison between the intensity of the Surprise expression given by the classifier and that reported by an evaluation committee of ten persons who participated in usability tests.

Table 7. Usability test results for the Surprise emotion

  Output    Status      Output    Status
  Low       OK          Medium    OK
  Medium    OK          Medium-   OK
  Low       FAIL        Low       OK
  Medium    OK          Medium    OK
  Medium    OK          Medium    FAIL
  High      OK          High      OK
  High      OK          High      OK
  Medium    OK          Medium    OK
  High      OK          High      OK
  High      OK          High      OK

  Correct assessment: 90%

The obtained results indicate a correct assessment of intensity of about 90% for the Surprise emotion. For the other expressions (joy, sadness, anger, and fear) the percentages of correct recognition are about 80%, 85%, 77%, and 75%, respectively. Comparing the results of facial expression recognition with other well-known systems, the proposed approach gives an average value of about 79%, against 72% for Esau [4]. The degree of recognition depends mainly on the number of AUs or FAPs used to describe an emotion. The recognition of non-prototypical emotions lies in the range of about 40-60%. This lower level of recognition is due to the complexity of selecting AUs to represent a non-prototypical emotion and to the subjectivity of its perception by each person. The proposed framework opens new possibilities for the design of systems for emotion detection and intensity recognition.
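The triangular output grades used in the classifier's second stage can be sketched as follows; the breakpoints chosen here over a normalized intensity range are illustrative assumptions, since the paper does not give them.

```python
def trimf(x, a, b, c):
    """Triangular membership with feet a and c and peak b.  The three output
    grades of an action unit's intensity are modeled this way in the second
    stage of the classifier, in place of Gaussians."""
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def output_grades(x):
    """Three overlapping grades over a normalized intensity in [0, 1]
    (illustrative breakpoints)."""
    return {
        "low":    trimf(x, -0.5, 0.0, 0.5),
        "medium": trimf(x,  0.0, 0.5, 1.0),
        "high":   trimf(x,  0.5, 1.0, 1.5),
    }
```

Triangular functions are piecewise linear, so evaluating and defuzzifying them is cheaper than with Gaussians, which matches the speed argument made above.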
7 Conclusions

In this paper we have presented a model for the fuzzification of facial features used for the recognition of basic and non-prototypical emotions. For the quantification of emotions and their intensities, a statistical analysis of Kanade's and Pantic's face databases has been carried out. A two-stage fuzzy inference using Gaussian and triangular functions is applied, providing a measurement of facial expression intensity. In preliminary experiments the basic emotion recognition reaches 70-90%, depending on the complexity of selecting AUs to represent a particular emotion and on the subjectivity of its perception by each person. The designed knowledge-based framework is general enough to create diverse instances of emotions, and it provides a fairly exact quantitative description of the measured facial actions. This permits a simple and formal definition of the relationships between emotions, facial actions, and their descriptors. The proposed framework also allows the postulation of rules for prototypical or non-prototypical facial expression recognition using any type of classifier.

Acknowledgments. This research is sponsored by the Mexican National Council of Science and Technology, CONACyT, Projects # and #.

References

1. Young-Joong, K., Myo-Taeg, L.: Near-Optimal Fuzzy Systems Using Polar Clustering: Application. In: Khosla, R., Howlett, R.J., Jain, L.C. (eds.) KES. LNCS (LNAI), vol. 3684. Springer, Heidelberg (2005)
2. Yamakawa, T.: Stabilization of an inverted pendulum by a high-speed fuzzy logic controller hardware system. J. Fuzzy Sets and Sys. 32(2) (1989)
3. Mufti, M., Khanam, A.: Fuzzy Rule Based Facial Expression Recognition. In: Computational Intelligence for Modelling, Control and Automation (2006)
4. Esau, N., Wetzel, E.L.: Real-Time Facial Expression Recognition Using a Fuzzy Emotion Model. In: IEEE Fuzzy Systems Conf., pp. 1-6 (2007)
5. Pantic, M.: An Expert System for Multiple Emotional Classification of Facial Expressions. In: 11th IEEE Int. Conf. on Tools with Artif. Intel., p. 113 (1999)
6. Akamatsu, L.S.: Coding facial expressions with Gabor wavelets. McGraw Hill, N.Y. (1998)
7. Kyoung, S.C., Yong-Guk, K., Yang-Bok, L.: Real-Time Expression Recog. Using Active Appearance Model. In: Int. Conf. Comp. Intelligence and Security, China, pp. 1-8 (2006)
8. Lin, D.T.: Facial Expression Classification Using PCA and Hierarchical Radial Basis Function Network. J. Inf. Science and Eng. 22 (2006)
9. Black, M.J.: Recognizing Facial Expressions in Image Sequences Using Local Parameterized Models of Image Motion. J. of Comp. Vision 25(1) (1998)
10. Kharat, G.U., Dudul, S.V.: Neural Network Classifier for Human Emotion Recognition. In: 1st Int. Conf. on Emerging Trends in Eng. and Techn., Iran, pp. 1-6 (2008)
11. Cohen, I., Garg, A., Huang, T.S.: Emotion Recognition from Facial Expressions using Multilevel HMM. Neural Information Processing Systems, London (2000)
12. Plutchik, R.: The nature of emotions. J. American Scientist 89, 344 (2001)
13. Contreras, R., Starostenko, O.: A Knowledge-base Framework for Analysis of Facial Expressions. In: 10th Conf. on Pat. Recog. and Inf. Proces., Belarus (2009)
14. Ekman, P., Friesen, W.: Facial Action Coding System (FACS). Consulting Psychologists Press, Palo Alto (1978)
15. ISO/IEC :2001(E), International Standard, Information technology - Coding of audio-visual objects - Part 2, 2nd edn. (2001)
16. Kanade, T., Cohn, J.: Comprehensive database for facial expression analysis. In: 4th IEEE Conf. on Autom. Face and Gesture Recog., France (2000)
17. Pantic, M., Valstar, M.F., Rademaker, R.: Web-based Database for Facial Expression Analysis. In: IEEE Conf. Multimedia and Expo, Netherlands, pp. 1-6 (2005)
18. Ontology editor Protégé (2009)
More informationComparing a novel model based on the transferable belief model with humans during the recognition of partially occluded facial expressions
Journal of Vision (2009) 9(2):22, 1 19 http://journalofvision.org/9/2/22/ 1 Comparing a novel model based on the transferable belief model with humans during the recognition of partially occluded facial
More informationA Vision-based Affective Computing System. Jieyu Zhao Ningbo University, China
A Vision-based Affective Computing System Jieyu Zhao Ningbo University, China Outline Affective Computing A Dynamic 3D Morphable Model Facial Expression Recognition Probabilistic Graphical Models Some
More informationEnhanced Facial Expressions Recognition using Modular Equable 2DPCA and Equable 2DPC
Enhanced Facial Expressions Recognition using Modular Equable 2DPCA and Equable 2DPC Sushma Choudhar 1, Sachin Puntambekar 2 1 Research Scholar-Digital Communication Medicaps Institute of Technology &
More informationIEEE TRANSACTIONS ON AUDIO, SPEECH, AND LANGUAGE PROCESSING, VOL. 19, NO. 5, JULY
IEEE TRANSACTIONS ON AUDIO, SPEECH, AND LANGUAGE PROCESSING, VOL. 19, NO. 5, JULY 2011 1057 A Framework for Automatic Human Emotion Classification Using Emotion Profiles Emily Mower, Student Member, IEEE,
More informationBio-Feedback Based Simulator for Mission Critical Training
Bio-Feedback Based Simulator for Mission Critical Training Igor Balk Polhemus, 40 Hercules drive, Colchester, VT 05446 +1 802 655 31 59 x301 balk@alum.mit.edu Abstract. The paper address needs for training
More informationLocal Image Structures and Optic Flow Estimation
Local Image Structures and Optic Flow Estimation Sinan KALKAN 1, Dirk Calow 2, Florentin Wörgötter 1, Markus Lappe 2 and Norbert Krüger 3 1 Computational Neuroscience, Uni. of Stirling, Scotland; {sinan,worgott}@cn.stir.ac.uk
More informationComparison of Mamdani and Sugeno Fuzzy Interference Systems for the Breast Cancer Risk
Comparison of Mamdani and Sugeno Fuzzy Interference Systems for the Breast Cancer Risk Alshalaa A. Shleeg, Issmail M. Ellabib Abstract Breast cancer is a major health burden worldwide being a major cause
More informationPERFORMANCE ANALYSIS OF THE TECHNIQUES EMPLOYED ON VARIOUS DATASETS IN IDENTIFYING THE HUMAN FACIAL EMOTION
PERFORMANCE ANALYSIS OF THE TECHNIQUES EMPLOYED ON VARIOUS DATASETS IN IDENTIFYING THE HUMAN FACIAL EMOTION Usha Mary Sharma 1, Jayanta Kumar Das 2, Trinayan Dutta 3 1 Assistant Professor, 2,3 Student,
More informationBlue Eyes Technology
Blue Eyes Technology D.D. Mondal #1, Arti Gupta *2, Tarang Soni *3, Neha Dandekar *4 1 Professor, Dept. of Electronics and Telecommunication, Sinhgad Institute of Technology and Science, Narhe, Maharastra,
More informationFACIAL EXPRESSION RECOGNITION FROM IMAGE SEQUENCES USING SELF-ORGANIZING MAPS
International Archives of Photogrammetry and Remote Sensing. Vol. XXXII, Part 5. Hakodate 1998 FACIAL EXPRESSION RECOGNITION FROM IMAGE SEQUENCES USING SELF-ORGANIZING MAPS Ayako KATOH*, Yasuhiro FUKUI**
More informationA Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection
A Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection Tobias Gehrig and Hazım Kemal Ekenel Facial Image Processing and Analysis Group, Institute for Anthropomatics Karlsruhe
More informationRecognition of facial expressions using Gabor wavelets and learning vector quantization
Engineering Applications of Artificial Intelligence 21 (2008) 1056 1064 www.elsevier.com/locate/engappai Recognition of facial expressions using Gabor wavelets and learning vector quantization Shishir
More informationFacial Expression Classification Using Convolutional Neural Network and Support Vector Machine
Facial Expression Classification Using Convolutional Neural Network and Support Vector Machine Valfredo Pilla Jr, André Zanellato, Cristian Bortolini, Humberto R. Gamba and Gustavo Benvenutti Borba Graduate
More informationUser Affective State Assessment for HCI Systems
Association for Information Systems AIS Electronic Library (AISeL) AMCIS 2004 Proceedings Americas Conference on Information Systems (AMCIS) 12-31-2004 Xiangyang Li University of Michigan-Dearborn Qiang
More informationValence-arousal evaluation using physiological signals in an emotion recall paradigm. CHANEL, Guillaume, ANSARI ASL, Karim, PUN, Thierry.
Proceedings Chapter Valence-arousal evaluation using physiological signals in an emotion recall paradigm CHANEL, Guillaume, ANSARI ASL, Karim, PUN, Thierry Abstract The work presented in this paper aims
More informationAutomated Real Time Emotion Recognition using Facial Expression Analysis
Automated Real Time Emotion Recognition using Facial Expression Analysis by Ashleigh Fratesi A thesis submitted to the Faculty of Graduate and Postdoctoral Affairs in partial fulfillment of the requirements
More informationA Study of Facial Expression Reorganization and Local Binary Patterns
A Study of Facial Expression Reorganization and Local Binary Patterns Poonam Verma #1, Deepshikha Rathore *2 #1 MTech Scholar,Sanghvi Innovative Academy Indore *2 Asst.Professor,Sanghvi Innovative Academy
More informationANALYSIS OF FACIAL FEATURES OF DRIVERS UNDER COGNITIVE AND VISUAL DISTRACTIONS
ANALYSIS OF FACIAL FEATURES OF DRIVERS UNDER COGNITIVE AND VISUAL DISTRACTIONS Nanxiang Li and Carlos Busso Multimodal Signal Processing (MSP) Laboratory Department of Electrical Engineering, The University
More informationReal-time Automatic Deceit Detection from Involuntary Facial Expressions
Real-time Automatic Deceit Detection from Involuntary Facial Expressions Zhi Zhang, Vartika Singh, Thomas E. Slowe, Sergey Tulyakov, and Venugopal Govindaraju Center for Unified Biometrics and Sensors
More informationKatsunari Shibata and Tomohiko Kawano
Learning of Action Generation from Raw Camera Images in a Real-World-Like Environment by Simple Coupling of Reinforcement Learning and a Neural Network Katsunari Shibata and Tomohiko Kawano Oita University,
More informationEmotion Analysis Using Emotion Recognition Module Evolved by Genetic Programming
THE HARRIS SCIENCE REVIEW OF DOSHISHA UNIVERSITY, VOL. 57, NO. 2 July 2016 Emotion Analysis Using Emotion Recognition Module Evolved by Genetic Programming Rahadian YUSUF*, Ivan TANEV*, Katsunori SHIMOHARA*
More informationFacial Emotion Recognition with Facial Analysis
Facial Emotion Recognition with Facial Analysis İsmail Öztel, Cemil Öz Sakarya University, Faculty of Computer and Information Sciences, Computer Engineering, Sakarya, Türkiye Abstract Computer vision
More informationComparison of Deliberate and Spontaneous Facial Movement in Smiles and Eyebrow Raises
J Nonverbal Behav (2009) 33:35 45 DOI 10.1007/s10919-008-0058-6 ORIGINAL PAPER Comparison of Deliberate and Spontaneous Facial Movement in Smiles and Eyebrow Raises Karen L. Schmidt Æ Sharika Bhattacharya
More informationAffective Game Engines: Motivation & Requirements
Affective Game Engines: Motivation & Requirements Eva Hudlicka Psychometrix Associates Blacksburg, VA hudlicka@ieee.org psychometrixassociates.com DigiPen Institute of Technology February 20, 2009 1 Outline
More informationValidating the Visual Saliency Model
Validating the Visual Saliency Model Ali Alsam and Puneet Sharma Department of Informatics & e-learning (AITeL), Sør-Trøndelag University College (HiST), Trondheim, Norway er.puneetsharma@gmail.com Abstract.
More informationTask oriented facial behavior recognition with selective sensing
Computer Vision and Image Understanding 100 (2005) 385 415 www.elsevier.com/locate/cviu Task oriented facial behavior recognition with selective sensing Haisong Gu a, Yongmian Zhang a, Qiang Ji b, * a
More informationRule Based System for Recognizing Emotions Using Multimodal Approach
Rule Based System for Recognizing Emotions Using Multimodal Approach Preeti Khanna Information System SBM, SVKM s NMIMS Mumbai, India Sasikumar, M. Director, R & D Center for Development of Advance Computing,
More informationClassifying Facial Pain Expressions
Classifying Facial Pain Expressions Individual Classifiers vs. Global Classifiers Michael Siebers 1, Miriam Kunz 2, Stefan Lautenbacher 2, and Ute Schmid 1 1 Faculty Information Systems and Applied Computer
More informationContinuous Facial Expression Representation for Multimodal Emotion Detection
Continuous Facial Expression Representation for Multimodal Emotion Detection Catherine Soladié, Hanan Salam, Nicolas Stoiber & Renaud Seguier Manuscript Received: 12,Mar.,2013 Revised: 25,Mar.,2013 Accepted:
More informationComparison of selected off-the-shelf solutions for emotion recognition based on facial expressions
Comparison of selected off-the-shelf solutions for emotion recognition based on facial expressions Grzegorz Brodny, Agata Kołakowska, Agnieszka Landowska, Mariusz Szwoch, Wioleta Szwoch, Michał R. Wróbel
More informationThe Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression
The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression Patrick Lucey 1,2, Jeffrey F. Cohn 1,2, Takeo Kanade 1, Jason Saragih 1, Zara Ambadar 2 Robotics
More informationEmotion Detection Through Facial Feature Recognition
Emotion Detection Through Facial Feature Recognition James Pao jpao@stanford.edu Abstract Humans share a universal and fundamental set of emotions which are exhibited through consistent facial expressions.
More informationUC Merced Proceedings of the Annual Meeting of the Cognitive Science Society
UC Merced Proceedings of the Annual Meeting of the Cognitive Science Society Title A Six-Unit Network is All You Need to Discover Permalink https://escholarshiporg/uc/item/3j28p6z3 Journal Proceedings
More informationEXTRACTION OF RETINAL BLOOD VESSELS USING IMAGE PROCESSING TECHNIQUES
EXTRACTION OF RETINAL BLOOD VESSELS USING IMAGE PROCESSING TECHNIQUES T.HARI BABU 1, Y.RATNA KUMAR 2 1 (PG Scholar, Dept. of Electronics and Communication Engineering, College of Engineering(A), Andhra
More informationObject recognition and hierarchical computation
Object recognition and hierarchical computation Challenges in object recognition. Fukushima s Neocognitron View-based representations of objects Poggio s HMAX Forward and Feedback in visual hierarchy Hierarchical
More informationA Unified Probabilistic Framework For Measuring The Intensity of Spontaneous Facial Action Units
A Unified Probabilistic Framework For Measuring The Intensity of Spontaneous Facial Action Units Yongqiang Li 1, S. Mohammad Mavadati 2, Mohammad H. Mahoor and Qiang Ji Abstract Automatic facial expression
More informationWho Needs Cheeks? Eyes and Mouths are Enough for Emotion Identification. and. Evidence for a Face Superiority Effect. Nila K Leigh
1 Who Needs Cheeks? Eyes and Mouths are Enough for Emotion Identification and Evidence for a Face Superiority Effect Nila K Leigh 131 Ave B (Apt. 1B) New York, NY 10009 Stuyvesant High School 345 Chambers
More informationAutomated Embryo Stage Classification in Time-Lapse Microscopy Video of Early Human Embryo Development
Automated Embryo Stage Classification in Time-Lapse Microscopy Video of Early Human Embryo Development Yu Wang, Farshid Moussavi, and Peter Lorenzen Auxogyn, Inc. 1490 O Brien Drive, Suite A, Menlo Park,
More informationD3.9 Mental effort assessment tool
D3.9 Mental effort assessment tool Project SWELL Project leader Wessel Kraaij (TNO) Work package 3 Deliverable number D3.9 Authors John Schavemaker, Saskia Koldijk Reviewers Joris Janssen, Wessel Kraaij
More informationGet The FACS Fast: Automated FACS face analysis benefits from the addition of velocity
Get The FACS Fast: Automated FACS face analysis benefits from the addition of velocity Timothy R. Brick University of Virginia Charlottesville, VA 22904 tbrick@virginia.edu Michael D. Hunter University
More informationMPEG-4 Facial Expression Synthesis based on Appraisal Theory
MPEG-4 Facial Expression Synthesis based on Appraisal Theory L. Malatesta, A. Raouzaiou, K. Karpouzis and S. Kollias Image, Video and Multimedia Systems Laboratory, National Technical University of Athens,
More informationComparison of Lip Image Feature Extraction Methods for Improvement of Isolated Word Recognition Rate
, pp.57-61 http://dx.doi.org/10.14257/astl.2015.107.14 Comparison of Lip Image Feature Extraction Methods for Improvement of Isolated Word Recognition Rate Yong-Ki Kim 1, Jong Gwan Lim 2, Mi-Hye Kim *3
More informationAffective Modeling and Recognition of Learning Emotion: Application to E-learning
JOURNAL OF SOFTWARE, VOL. 4, NO. 8, OCTOBER 2009 859 Affective Modeling and Recognition of Learning Emotion: Application to E-learning Yanwen Wu, Tingting Wang, Xiaonian Chu Department of Information &
More informationFACIAL IMAGE BASED EXPRESSION CLASSIFICATION SYSTEM USING COMMITTEE NEURAL NETWORKS. A Thesis. Presented to
FACIAL IMAGE BASED EXPRESSION CLASSIFICATION SYSTEM USING COMMITTEE NEURAL NETWORKS A Thesis Presented to The Graduate Faculty of The University of Akron In Partial Fulfillment of the Requirements for
More informationTo What Extent Can the Recognition of Unfamiliar Faces be Accounted for by the Direct Output of Simple Cells?
To What Extent Can the Recognition of Unfamiliar Faces be Accounted for by the Direct Output of Simple Cells? Peter Kalocsai, Irving Biederman, and Eric E. Cooper University of Southern California Hedco
More informationLung Cancer Diagnosis from CT Images Using Fuzzy Inference System
Lung Cancer Diagnosis from CT Images Using Fuzzy Inference System T.Manikandan 1, Dr. N. Bharathi 2 1 Associate Professor, Rajalakshmi Engineering College, Chennai-602 105 2 Professor, Velammal Engineering
More informationCOMPARATIVE STUDY ON FEATURE EXTRACTION METHOD FOR BREAST CANCER CLASSIFICATION
COMPARATIVE STUDY ON FEATURE EXTRACTION METHOD FOR BREAST CANCER CLASSIFICATION 1 R.NITHYA, 2 B.SANTHI 1 Asstt Prof., School of Computing, SASTRA University, Thanjavur, Tamilnadu, India-613402 2 Prof.,
More informationUsing Computational Models to Understand ASD Facial Expression Recognition Patterns
Using Computational Models to Understand ASD Facial Expression Recognition Patterns Irene Feng Dartmouth College Computer Science Technical Report TR2017-819 May 30, 2017 Irene Feng 2 Literature Review
More informationA NOVEL FACIAL EXPRESSION RECOGNITION METHOD US- ING BI-DIMENSIONAL EMD BASED EDGE DETECTION. Zijing Qin
A NOVEL FACIAL EXPRESSION RECOGNITION METHOD US- ING BI-DIMENSIONAL EMD BASED EDGE DETECTION By Zijing Qin A Thesis Submitted to the Faculty of the Graduate School of Western Carolina University in Partial
More informationRachael E. Jack, Caroline Blais, Christoph Scheepers, Philippe G. Schyns, and Roberto Caldara
Current Biology, Volume 19 Supplemental Data Cultural Confusions Show that Facial Expressions Are Not Universal Rachael E. Jack, Caroline Blais, Christoph Scheepers, Philippe G. Schyns, and Roberto Caldara
More informationGeneral Psych Thinking & Feeling
General Psych Thinking & Feeling Piaget s Theory Challenged Infants have more than reactive sensing Have some form of discrimination (reasoning) 1-month-old babies given a pacifier; never see it Babies
More informationEECS 433 Statistical Pattern Recognition
EECS 433 Statistical Pattern Recognition Ying Wu Electrical Engineering and Computer Science Northwestern University Evanston, IL 60208 http://www.eecs.northwestern.edu/~yingwu 1 / 19 Outline What is Pattern
More informationAutomatic Coding of Facial Expressions Displayed During Posed and Genuine Pain
Automatic Coding of Facial Expressions Displayed During Posed and Genuine Pain Gwen C. Littlewort Machine Perception Lab, Institute for Neural Computation University of California, San Diego La Jolla,
More informationEmotion Recognition from Facial Action Points by Principal Component Analysis
Emotion Recognition from Facial Action Points by Principal Component Analysis Anisha Halder, Garima Singh Arindam Jati, Amit Konar ETCE Department, Jadavpur University, Kolkata-32. India halder.anisha@gmail.com,
More informationGender Discrimination Through Fingerprint- A Review
Gender Discrimination Through Fingerprint- A Review Navkamal kaur 1, Beant kaur 2 1 M.tech Student, Department of Electronics and Communication Engineering, Punjabi University, Patiala 2Assistant Professor,
More informationAction Recognition based on Hierarchical Self-Organizing Maps
Action Recognition based on Hierarchical Self-Organizing Maps Miriam Buonamente 1, Haris Dindo 1, and Magnus Johnsson 2 1 RoboticsLab, DICGIM, University of Palermo, Viale delle Scienze, Ed. 6, 90128 Palermo,
More informationCompeting Frameworks in Perception
Competing Frameworks in Perception Lesson II: Perception module 08 Perception.08. 1 Views on perception Perception as a cascade of information processing stages From sensation to percept Template vs. feature
More informationCompeting Frameworks in Perception
Competing Frameworks in Perception Lesson II: Perception module 08 Perception.08. 1 Views on perception Perception as a cascade of information processing stages From sensation to percept Template vs. feature
More informationACTIVE APPEARANCE MODELS FOR AFFECT RECOGNITION USING FACIAL EXPRESSIONS. Matthew Stephen Ratliff
ACTIVE APPEARANCE MODELS FOR AFFECT RECOGNITION USING FACIAL EXPRESSIONS Matthew Stephen Ratliff A Thesis Submitted to the University of North Carolina Wilmington in Partial Fulfillment of the Requirements
More informationAutomated facial expression measurement: Recent applications to basic research in human behavior, learning, and education
1 Automated facial expression measurement: Recent applications to basic research in human behavior, learning, and education Marian Stewart Bartlett and Jacob Whitehill, Institute for Neural Computation,
More information