An approach for Brazilian Sign Language (BSL) recognition based on facial expression and k-NN classifier


An approach for Brazilian Sign Language (BSL) recognition based on facial expression and k-NN classifier. Tamires Martins Rezende¹, Cristiano Leite de Castro¹, Sílvia Grasiella Moreira Almeida². ¹ The Electrical Engineering Graduate Program, UFMG. ² Department of Industrial Automation, IFMG Ouro Preto.

Summary: 1. Brazilian Sign Language; 2. Database; 3. Methodology and Results; 4. Conclusions; 5. Acknowledgment.

Brazilian Sign Language (BSL) or Libras. Libras is the second official language of Brazil. It is described by 5 main parameters: (1) point of articulation, (2) hand configuration, (3) movement, (4) palm orientation, and (5) non-manual expressions. This paper presents an exploratory study of the peculiarities involved in recognizing non-manual sign language expressions.

Brazilian Sign Language (BSL) or Libras. Recognizing BSL with computational methods is challenging for a variety of reasons: there is currently no standardized database containing signs in a format that allows validation of computer classification systems; a single sign is composed of simultaneous elements; the language has no consistent marker for the start and end of a sign; and different people perform any given gesture differently.

The Brazilian Sign Language Database. The first step was to choose signs whose execution involves changes in facial expression. Figure: Signs: (a) to calm down, (b) to accuse, (c) to annihilate, (d) to love, (e) to gain weight, (f) happiness, (g) slim, (h) lucky, (i) surprise, and (j) angry.

Recording Protocol. After the signs were chosen, they were recorded. Figure: Scenario created for recording the signs. With each of the 10 signs recorded 10 times by the same signer, the balanced database contained a total of 100 samples.

Recording Protocol. RGB-D sensor: Kinect. Software: nuiCapture Analyze. Figure: Facial model used with labeled points.

Methodology. Figure: Methodology for the automatic face recognition system.

Step 1: Detection of the region of interest. Input: signals recorded by the Kinect. Output: video containing the face images. Figure: Full frame. Figure: Selected face.
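In the paper the face region comes from the Kinect face-tracking output; the sketch below only illustrates the cropping idea, assuming each frame is available together with its 121 tracked (x, y) facial points (the function name and margin value are illustrative, not from the paper).

```python
import numpy as np

def crop_face(frame, face_points, margin=20):
    """Crop the face region of interest from a full frame.

    frame       -- H x W x 3 image captured by the Kinect RGB stream
    face_points -- (121, 2) array of tracked (x, y) facial landmarks
    margin      -- extra pixels kept around the landmark bounding box
    """
    face_points = np.asarray(face_points, dtype=float)
    h, w = frame.shape[:2]
    x_min, y_min = face_points.min(axis=0).astype(int)
    x_max, y_max = face_points.max(axis=0).astype(int)
    # Clamp the padded bounding box to the image borders.
    x0, y0 = max(x_min - margin, 0), max(y_min - margin, 0)
    x1, y1 = min(x_max + margin, w), min(y_max + margin, h)
    return frame[y0:y1, x0:x1]
```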

Step 2: Summarization Input: Face images. Output: 5 most relevant frames. Figure: The five most relevant frames extracted from a recording of the sign to love
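The criterion used to pick the five most relevant frames is not spelled out in this summary; here is a minimal sketch under the simplest assumption, uniform sampling across the recording (a motion- or expression-based key-frame criterion could be substituted).

```python
import numpy as np

def summarize(frames, n_keep=5):
    """Select n_keep frames spread uniformly over the recording."""
    idx = np.linspace(0, len(frames) - 1, n_keep).round().astype(int)
    return [frames[i] for i in idx]
```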

Step 3: Feature vector. Input: 5 most relevant frames. Output: descriptor of a sign. Descriptor of a single frame: D = [x1 y1 x2 y2 ... x121 y121], of size 1 x 242. Descriptor of a sign: V = [D1 D2 D3 D4 D5] = [x1,1 y1,1 ... xi,j yi,j ... x5,121 y5,121], of size 1 x 1210, where i is the frame number and j is the facial point.
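A short sketch of how the 1 x 1210 sign descriptor can be assembled from the five selected frames, each carrying 121 tracked (x, y) points (variable and function names are illustrative).

```python
import numpy as np

def sign_descriptor(frame_points):
    """Concatenate per-frame landmark coordinates into one sign descriptor.

    frame_points -- list of five (121, 2) arrays, one per selected frame.
    Returns a length-1210 vector [x1,1 y1,1 ... x5,121 y5,121].
    """
    assert len(frame_points) == 5
    # Each frame flattens to [x1 y1 ... x121 y121], i.e. 242 values.
    return np.concatenate([np.asarray(p).reshape(-1) for p in frame_points])
```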

Step 4: Classification. Input: descriptor of a sign. Output: average accuracy over the 10 iterations.
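A minimal sketch of the evaluation implied by this step: k-NN classification of the sign descriptors (as named in the title), with accuracy averaged over 10 random train/test splits. The split ratio, the value of k, and the use of scikit-learn are assumptions, not values reported in the slides.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def average_accuracy(X, y, k=3, iterations=10, test_size=0.3, seed=0):
    """Average k-NN accuracy over several random train/test splits.

    X -- (100, 1210) matrix of sign descriptors; y -- (100,) sign labels.
    """
    scores = []
    for it in range(iterations):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=test_size, stratify=y, random_state=seed + it)
        clf = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
        scores.append(clf.score(X_te, y_te))
    return float(np.mean(scores))
```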

Experiments and Results. EX1: raw data. EX2: z-score normalization. EX3: centroid normalization. Figure: Box plot of the percent accuracies for the three conducted experiments.
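The two preprocessing variants compared against the raw data (EX1) can be sketched as below. The z-score step is standard; "centroid normalization" is assumed here to mean subtracting each frame's landmark centroid (removing per-frame translation), and that reading is an assumption rather than the authors' stated definition.

```python
import numpy as np

def zscore_normalize(X):
    """EX2: z-score each feature column over the whole dataset."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    return (X - mu) / np.where(sigma == 0, 1, sigma)

def centroid_normalize(descriptor):
    """EX3 (assumed reading): subtract the centroid of each frame's 121 points."""
    points = np.asarray(descriptor, dtype=float).reshape(5, 121, 2)  # frames x points x (x, y)
    points = points - points.mean(axis=1, keepdims=True)
    return points.reshape(-1)  # back to length 1210
```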

Conclusions. BSL recognition is a challenging problem, and a well-structured database of BSL signs is still lacking. This paper presented an exploratory study of the peculiarities involved in non-manual expressions. The adopted methodology achieved considerable performance, with a maximum average accuracy of 84%.

Acknowledgment. We would like to thank PPGEE - UFMG for the incentive and guidance. The present work was completed with the financial support of CAPES - Brazil. Thank you!