Characterization of 3D Gestural Data on Sign Language by Extraction of Joint Kinematics


Human Journals Research Article, October 2017, Vol. 7, Issue 4. All rights are reserved by Newman Lau.

Newman Lau, School of Design, The Hong Kong Polytechnic University, Hung Hom, Hong Kong

Keywords: hand gesture, sign language, joint kinematics, range of motion

Submission: 28 September 2017; Accepted: 5 October 2017; Published: 30 October 2017

www.ijsrm.humanjournals.com

ABSTRACT

Sign language recognition is particularly difficult for people with aphasia and apraxia, whose conditions (for example, joint contracture) distort the signs and gestures they produce. A simplified gesture set is therefore useful both for their communication and for machine recognition. Feature subset selection is a recurring research focus in multimedia and medical applications such as gesture recognition and physiological signal analysis, where the data are typically high dimensional, consisting of series of observations from many variables at a time. In this research, a computational process is proposed to reduce the feature set originally produced by a hand gesture capture system; the subset selection is first based on biomechanical characteristics. The objective is a faster, more cost-effective process that gives flexibility in capture and recognition for the selected target group, while also deepening understanding of the underlying process that generates the gestural data.

INTRODUCTION

Nowadays, more and more applications employ hand gesture recognition techniques. Hand gestures provide a significant channel of communication in daily human interaction and have been extensively explored in human-computer interaction (HCI) studies [1]. Gestures include movements of the fingers, hands, or other parts of the body. They are culture specific and can convey different meanings across societies and cultural settings. For example, an American Sign Language (ASL) fingerspelling recognition system was designed and constructed to translate the ASL alphabet into the corresponding printed and spoken English letters [2]. The structure of sign language can be represented as a combination of basic gesture components, and signs can then be recognized from those components [3].

Sign Language for Special Needs

For people who suffer from aphasia and apraxia, sign language is one way to communicate with others. Aphasia is an impairment of language that affects the production or comprehension of speech and the ability to read or write. It is usually due to injury to the brain, particularly in older individuals. Apraxia is a neurological condition characterized by loss of the ability to perform activities that a person is physically able and willing to do. It is caused by brain damage related to conditions such as head injury, stroke, brain tumor, and Alzheimer's disease; the damage affects the brain's ability to signal instructions to the body correctly. Patients with aphasia and apraxia may find it difficult to say certain words [4] or to form hand and finger gestures, and a hand gesture recognition system may fail to read their vague poses and gestures. Feature subset selection based on a formal sign language is therefore one research approach that seeks a solution in between.
On one hand, these users can still communicate through the formally defined set of sign language; on the other hand, the hand gestures can be simplified for performance and made robust for the recognition system.

Recognition Systems

More and more hand gesture recognition systems make use of three-dimensional technology and coordinate systems to capture hand shape and calculate the range of motion of each finger. Two types of approach are popular for modeling the hand gesture for interpretation: appearance-based solutions and device-based solutions. However, most related work addresses hand gesture recognition for general users; little consideration is given to flexibility in coordinating the various joints of a hand through different key positions, or to targeting users with aphasia and apraxia. Simplifying strategies can therefore be based on the temporal pattern of movements [5]. The aim of this study is to apply a method that allows people who suffer from aphasia and apraxia to have their gestures recognized easily by computers.

MATERIALS AND METHODS

Joint Kinematics

The concept of joint kinematics from the biomechanical literature is used to abstract the movement. Finger segments are linked to each other at the joints, and the joint structure determines the types of motion allowed at each joint. In human movement, joint kinematics is the study of the positions, angles, velocities, and accelerations of body segments and joints during motion.

Range of Motion

Range of motion (ROM) is a quantity that characterizes joint movement by measuring the angle from the starting position of an axis to its position at the end of the full movement. The range of motion per joint can be calculated from sensor values acquired by a sensory device attached to a human hand [6]. With knowledge of the hand posture of all the hand signs concerned, the overall range of motion needed per finger joint can be defined.

RESULTS AND DISCUSSION

By accumulating all the 1s, each pair of signs can be compared, revealing the number of bits in which they differ. The process results in a 24 x 24 matrix, 24 being the number of alphabet signs. A value of 0 means the two signs cannot be distinguished from each other; the higher the value, the higher the probability that the signs can be distinguished.
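The paper does not spell out the discretization details, but the pairwise bitwise comparison it describes can be sketched roughly as follows. The binary feature vectors below are hypothetical placeholders (random values, 18 features per sign), not the paper's actual data; only the matrix construction itself follows the description above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 24 static alphabet signs, each discretized
# into a binary vector of 18 joint-movement features.
n_signs, n_features = 24, 18
signs = rng.integers(0, 2, size=(n_signs, n_features))

# 24 x 24 matrix: entry (i, j) counts the features that differ between
# sign i and sign j ("accumulating the 1s" of the bitwise comparison).
diff = (signs[:, None, :] != signs[None, :, :]).sum(axis=2)

# A zero entry means the two signs cannot be told apart from these
# features; larger entries mean the pair is easier to distinguish.
```

The matrix is symmetric with a zero diagonal, so only the upper triangle carries information about distinct pairs.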
A threshold T, between 1 and 8, was set as the minimum number of features that must differ between two signs. If the number of differing features between two signs meets or exceeds this threshold, the pair is said to fulfil the criterion for recognition. The number of sign pairs that fulfil the criterion with respect to one alphabet sign can then be expressed as a percentage of the whole sign language set. These percentages are shown in Table 1.

Table 1. Percentages of sign pairs that fulfil the threshold criterion for feature set S_t''.

Sign   T=2   T=3   T=4   T=5   T=6
A      96%   78%   70%   43%   26%
B      96%   83%   61%   39%   13%
C     100%   91%   61%   26%   13%
D      91%   74%   26%    4%    0%
E     100%   87%   70%   48%   26%
F      96%   83%   65%   52%   13%
G      87%   61%   39%   13%    0%
H      83%   65%   39%   17%    0%
I      96%   87%   65%   35%   17%
K     100%  100%   70%   30%   17%
L      87%   61%   39%   13%    0%
M      96%   83%   65%   52%   13%
N     100%   78%   26%    9%    0%
O      96%   74%   39%   22%    0%
P      83%   61%   39%   22%    9%
Q      83%   52%   35%   17%    4%
R      83%   52%   35%   17%    4%
S     100%   78%   48%   22%    9%
T      87%   78%   48%   30%   13%
U      83%   61%   39%   22%    9%
V      83%   61%   39%   22%    9%
W      91%   78%   57%   26%   13%
X      70%   52%   35%   22%    9%
Y      91%   78%   39%   13%    0%
Mean   91%   73%   48%   26%    9%

Some alphabet signs are relatively easy to distinguish from the others; they show higher percentages at the same threshold value. Using the same computational process, results can be calculated for the feature set S_t', as shown in Table 2.
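Given a pairwise difference matrix of the kind described above, the per-sign percentages reported in Table 1 follow by counting, for each sign, how many of the other 23 signs differ in at least T features. A minimal sketch, using a hypothetical difference matrix for demonstration:

```python
import numpy as np

def pct_distinguishable(diff, threshold):
    """For each sign, the share of the other signs that differ in at
    least `threshold` features (the quantity tabulated per letter)."""
    n = diff.shape[0]
    meets = diff >= threshold
    np.fill_diagonal(meets, False)  # a sign is never paired with itself
    return meets.sum(axis=1) / (n - 1)

# Hypothetical difference matrix built from random binary features.
rng = np.random.default_rng(1)
signs = rng.integers(0, 2, size=(24, 8))
diff = (signs[:, None, :] != signs[None, :, :]).sum(axis=2)

for T in (2, 3, 4, 5, 6):
    print(f"T={T}: mean {pct_distinguishable(diff, T).mean():.0%}")
```

As the threshold rises, fewer pairs qualify, so the mean percentage falls monotonically, matching the trend across the columns of Table 1.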

Table 2. Percentages of sign pairs that fulfil the threshold criterion for feature set S_t'.

Sign   T=4   T=5   T=6   T=7   T=8   T=9  T=10  T=11  T=12
A      96%   91%   83%   74%   57%   52%   48%   30%   26%
B      91%   87%   78%   70%   61%   43%   30%   22%    9%
C     100%  100%  100%   96%   74%   48%   26%    9%    9%
D      91%   87%   65%   43%   26%   17%    9%    0%    0%
E     100%  100%   91%   83%   78%   57%   43%   22%    4%
F      96%   91%   87%   70%   61%   57%   39%   22%   13%
G      91%   78%   65%   61%   35%   13%    9%    0%    0%
H      91%   83%   78%   61%   48%   43%   13%    4%    4%
I     100%   96%   87%   74%   52%   30%   17%    9%    4%
K     100%   96%   87%   70%   39%   26%   13%    4%    0%
L      96%   91%   65%   57%   48%   30%   22%    9%    4%
M      96%   87%   78%   74%   70%   43%   30%   22%    9%
N      96%   87%   65%   43%   35%   17%   13%    9%    0%
O      96%   91%   74%   74%   61%   43%   22%    0%    0%
P      87%   70%   52%   43%   39%   30%    9%    4%    0%
Q      87%   70%   52%   39%   26%   13%    9%    0%    0%
R      96%   87%   70%   52%   39%   26%   13%    9%    0%
S     100%   96%   83%   78%   74%   61%   39%   17%    9%
T      96%   87%   78%   70%   57%   43%   26%   13%    9%
U      87%   70%   48%   39%   39%   26%   13%    0%    0%
V      91%   91%   78%   57%   43%   35%   22%   13%    0%
W      91%   87%   83%   70%   57%   48%   35%   17%   17%
X      96%   87%   65%   48%   43%   30%   13%    9%    0%
Y     100%   96%   78%   70%   57%   43%   35%   17%    4%
Mean   95%   88%   75%   63%   51%   37%   23%   11%    5%

To compare the two tables of results, T is normalized by the maximum number of features in each set. A graph comparing the results for the two feature sets is shown in Figure 1.
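The normalization step can be illustrated directly from the mean rows of the two tables. Dividing each threshold by the feature count of its set (8 and 18 respectively, inferred from the threshold ranges reported) puts both curves on a common x-axis, the percentage of features required to differ:

```python
# Mean rows of Tables 1 and 2, keyed by threshold T.
mean_pct_8  = {2: 0.91, 3: 0.73, 4: 0.48, 5: 0.26, 6: 0.09}
mean_pct_18 = {4: 0.95, 5: 0.88, 6: 0.75, 7: 0.63, 8: 0.51,
               9: 0.37, 10: 0.23, 11: 0.11, 12: 0.05}

# Normalize T by each set's feature count to obtain a shared x-axis
# (fraction of features that must differ), as plotted in Figure 1.
curve_8  = {t / 8:  p for t, p in mean_pct_8.items()}
curve_18 = {t / 18: p for t, p in mean_pct_18.items()}

for x, p in sorted(curve_8.items()):
    print(f"8-feature set:  {x:.0%} of features -> {p:.0%} of pairs")
```

For normalized thresholds beyond roughly 30% of the features, the 8-feature curve lies above the 18-feature curve, which is the comparison the following paragraph draws from Figure 1.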

Figure 1. Comparison between feature sets.

At the same percentage of features used, a higher percentage of sign pairs meets the criterion for the set S_t'' than for S_t'. This can be seen by drawing vertical lines in Figure 1 starting from 30%: for every line to the right of 30%, the percentage obtained with 8 features is larger than that obtained with 18 features, meaning the features within S_t'' are more effective for recognition than those of S_t'. These results meet our initial objective of providing a feature subset for a faster, more cost-effective capture and recognition process. Since the process includes a discretization step whose threshold value can be chosen for particular needs, it can be customized for people who suffer from aphasia and apraxia in particular.

CONCLUSION

In this research, an approach to feature subset selection is proposed in preparation for hand gesture recognition. The process first uses joint kinematics to eliminate features that lack a considerable magnitude of movement for distinguishing between the hand gestures of the various alphabet signs in the sign language set. This parameter was set based on criteria for people with special needs, aphasia and apraxia in particular. The discretized values are then compared bitwise across the whole alphabet set of signs. This research can be further enhanced by capturing data from subject groups with symptoms of aphasia and apraxia, so that the computational process and evaluation method can be validated with stronger evidence.

REFERENCES

1. Howe LW, Wong F, Chekima A. Comparison of hand segmentation methodologies for hand gesture recognition. International Symposium on Information Technology. 2008.
2. Jerome M, Pierre K, Richard F. American Sign Language fingerspelling recognition system. IEEE 29th Bioengineering Conference. 2003.
3. Hirohiko S, Masaru T, Masaru O. Description and recognition methods for sign language based on gesture components. 2nd International Conference on Intelligent User Interfaces. 1997.
4. Apraxia. Accessed 8/20/2017. Available from: http://medical-dictionary.thefreedictionary.com/apraxic
5. Daniel BD, Renata CB, Madeo TR, Helton HB, Sarajane MP. Hand movement recognition for Brazilian Sign Language: a study using distance-based neural networks. International Joint Conference on Neural Networks. 2009.
6. Reese NB. Joint Range of Motion and Muscle Length Testing. Saunders; 2001.