Communication Interface for Mute and Hearing Impaired People


*Garima Rao, *Laksh Narang, *Abhishek Solanki, *Kapil Singh, Mrs. *Karamjit Kaur, Mr. *Neeraj Gupta
*Amity University Haryana

Abstract - Sign language is an important tool used by mute and hearing-impaired people to communicate. To ensure seamless interaction between hearing-impaired/mute people and society without a translator, understanding of sign language by one and all is a must. To overcome this limitation, developing a human-machine interface that recognizes sign language would have a significant impact on the social life of deaf and mute people. In the proposed work, a state-of-the-art interface is demonstrated that recognizes sign language through flex sensors and converts it into text and voice. The system employs a PIC16F887 microcontroller, an APR9600 voice module and flex sensors. The simulation has been carried out in Proteus and the hardware has been developed accordingly. The system has been tested for various signs and satisfactory results have been obtained.

Keywords: embedded system, flex sensors, Proteus, voice module

I. INTRODUCTION

To ensure seamless interaction between hearing-impaired/mute people and society without a translator, understanding of sign language by one and all is a must. In order to lower this barrier, the design of human-machine interfaces has drawn great attention as a field of research. Considerable work has been reported so far, introducing various aids to convert sign language into text and voice messages. Some prominent solutions include B-Spline approximation, Real-Time Continuous Gesture recognition (RTC) and the Motion Tracking Network (MTN). B-Spline approximation [1] is a vision-based recognition system for Indian Sign Language alphabets and numerals; the algorithm approximates the boundary extracted from the region of interest to a B-Spline curve, taking the maximum curvature points (MCPs) as the control points. A very large vocabulary sign language interpreter has been presented with real-time continuous gesture recognition using a data glove: end-point detection in a stream of gesture input is solved first, and then statistical analysis is performed on four parameters of a gesture: posture, position, orientation and motion; the reported recognition percentage is low [2]. A technology for hand gesture recognition based on thinning of the segmented image works suitably for static letters of the American Sign Language; its drawback is that it does not give good results under poor background lighting conditions [3]. SLARTI uses a robotic glove and a system based on magnetic fields for data acquisition, but its implementation cost is very high. The University of Central Florida gesture recognition system uses a webcam and computer vision techniques to collect data and a neural network to classify shapes [4]; for recognition, this system requires wearing a specially coloured glove to facilitate the imaging process. ASLR [5] uses a webcam and computer vision techniques to collect field data and to classify signs. Many such systems have been developed, but several aspects remain to be explored to make them simpler, more user-friendly and more economical. Keeping in view the practical implementation and the challenges in designing a machine interface to decode sign language, an effort has been made here to design such an interface; the prominent findings from the literature on human-machine interfaces are stated, specifying the scope for improvement.
Section 1 describes the simulation of the proposed work; the results obtained through the simulation were satisfactory enough to initiate the hardware design. Section 2 gives a detailed description of the hardware design, followed by the results obtained and possible improvements to the proposed work.

II. SECTION 1: DESIGN, DEVELOPMENT AND FABRICATION

Prior to the hardware design, the circuit was simulated in Proteus ISIS 7.6. This simulation software provides professional tools for simulating embedded systems. In Figure 1 the flex sensor is modelled as a variable resistor and given as input to the PIC microcontroller, and the corresponding output is shown on the LCD.

Figure 1: Simulation of the desired hardware design in Proteus (flex sensor input, PIC microcontroller, LCD, APR9600 and speaker output)

Section 2

For the experimental setup, the design flow and block diagram are shown in Figure 2 and Figure 3 respectively. The signal path is: an acquisition glove with four flex sensors, whose resistance varies as the fingers fold; a conditioning stage supplied from +5 V that presents the varying resistance to the microcontroller; processing by the PIC16F887 microcontroller and its on-chip analog-to-digital converter; and output on the LCD and through the voice module.

Figure 2: Design flow
Figure 3: Block diagram

The major components used in the hardware design are the flex sensors, the PIC16F887 microcontroller, the APR9600 voice module and a 16x2 LCD. Figure 4 shows the complete hardware setup with no symbol detected on the LCD display.
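The conditioning and processing stages of the block diagram amount to sampling each divider output with the PIC16F887's on-chip 10-bit ADC. The following is a minimal sketch of that sampling step, not the authors' firmware, assuming the MPLAB XC8 compiler, the flex sensors on channels AN0-AN3, and an 8 MHz clock; configuration bits are omitted.

    /* Illustrative sketch (assumptions noted above): read a flex-sensor
     * voltage divider on AN0..AN3 of a PIC16F887 using the on-chip ADC. */
    #include <xc.h>

    #define _XTAL_FREQ 8000000UL   /* assumed oscillator frequency        */

    void adc_init(void)
    {
        TRISA  |= 0x0F;            /* RA0..RA3 as inputs (AN0..AN3)       */
        ANSEL   = 0x0F;            /* AN0..AN3 analog, remaining digital  */
        ANSELH  = 0x00;
        ADCON1  = 0x80;            /* ADFM = 1: right-justified result    */
        ADCON0  = 0x81;            /* Fosc/32 conversion clock, ADC on    */
    }

    unsigned int adc_read(unsigned char channel)
    {
        ADCON0bits.CHS = channel;  /* select analog channel               */
        __delay_us(20);            /* acquisition time                    */
        ADCON0bits.GO_nDONE = 1;   /* start conversion                    */
        while (ADCON0bits.GO_nDONE)
            ;                      /* wait for completion                 */
        return ((unsigned int)ADRESH << 8) | ADRESL;  /* 10-bit result    */
    }

A main loop would call adc_read(0) to adc_read(3), compare each reading against a bend threshold and drive the LCD and APR9600 accordingly; the adc_read() helper is reused in the classification sketch shown after Table 2.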

Figure 4: Hardware setup

The input is provided by the flex sensor, which converts bending into a change of electrical resistance: the more the bend, the higher the resistance. A voltage divider is implemented by placing a 10 kilo-ohm resistor in series with the flex sensor. Figure 5 shows the voltage divider applied to each flex sensor; the divided output voltage is given to the ADC channels of the PIC microcontroller and is described by

Vout = Vin * R1 / (R1 + R2)

where R1 is the fixed 10 kilo-ohm resistor and R2 is the flex-sensor resistance. The output voltage is converted to digital by the PIC16F887, which has 10 inbuilt analog-to-digital converter channels. Figure 6 shows the variation in resistance due to bending of the flex sensor, measured with a multimeter.

Figure 5: Working of the flex sensor voltage divider
Figure 6: Snapshot of change in resistance with bending of the flex sensor

Table 1 shows the change in the resistance of the flex sensor with bending angle (in degrees) and the corresponding analog output voltage.

Bend (degrees)   Resistance (kilo-ohms)   Voltage (analog, V)
0                10.536                   2.5
10               12.832                   2.27
20               14.124                   2.08
30               15.572                   2.01
45               18.000                   1.78
60               22.686                   1.56
75               26.296                   1.38
                 30.460                   1.25

Table 1: Degree, resistance and voltage with bending of the flex sensor
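As a quick sanity check, not part of the original paper, the divider equation can be evaluated for the tabulated resistances; the 10 kilo-ohm series resistor and +5 V supply are taken from the text, and the conversion to a 10-bit ADC count assumes a 5 V reference.

    /* Illustrative check of Table 1 (assumes Vin = 5 V, R1 = 10 kOhm and
     * a 10-bit ADC referenced to 5 V, as described in the text).         */
    #include <stdio.h>

    static double divider_vout(double r_flex_kohm)
    {
        const double vin = 5.0, r1 = 10.0;        /* volts, kilo-ohms     */
        return vin * r1 / (r1 + r_flex_kohm);     /* Vout = Vin*R1/(R1+R2)*/
    }

    int main(void)
    {
        const double r_flex[] = {10.536, 12.832, 14.124, 15.572,
                                 18.000, 22.686, 26.296, 30.460};
        for (unsigned i = 0; i < sizeof r_flex / sizeof r_flex[0]; i++) {
            double v = divider_vout(r_flex[i]);
            unsigned counts = (unsigned)(v / 5.0 * 1023.0 + 0.5);
            printf("R = %6.3f kOhm -> Vout = %.2f V (ADC ~ %u)\n",
                   r_flex[i], v, counts);
        }
        return 0;
    }

The computed voltages agree with the measured values in Table 1 to within a few percent, which supports the voltage-divider model of the sensing circuit.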

III. RESULT AND CONCLUSION

Table 2 lists the positions (bend angles in degrees) of the four flex sensors and the corresponding output shown on the LCD as text and played through the APR9600 as voice; the recognized words are Why, Yes, Hello, Who, No and Bye, with the same word produced as text and as voice.

Table 2: Position of the sensors and the corresponding output on the LCD and APR9600

To check the reliability and accuracy of the system, testing was done by providing 25 samples in random order; a few of them (Who, No, Why, Bye) are shown below.

Figure 7: Samples taken in random order
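The word lookup in Table 2 amounts to matching the bend pattern of the four sensors against a small set of stored patterns. The following is a minimal sketch of such a matcher, again assuming XC8 on the PIC16F887; the bend threshold, the specific finger patterns, the APR9600 message slots and the helper functions lcd_print() and apr9600_play() are hypothetical placeholders, since the paper does not give its firmware.

    /* Illustrative gesture matcher (not the authors' firmware).  A sensor
     * is treated as "bent" when its 10-bit ADC reading drops below a
     * threshold, consistent with the divider output falling as the finger
     * bends.  Helper functions below are assumed to exist elsewhere.      */
    extern unsigned int adc_read(unsigned char channel);   /* earlier sketch */
    extern void lcd_print(const char *text);               /* hypothetical   */
    extern void apr9600_play(unsigned char slot);          /* hypothetical   */

    #define BEND_THRESHOLD 410u    /* ~2.0 V with a 5 V reference; assumed */
    #define NUM_SENSORS    4

    struct gesture {
        unsigned char pattern;     /* bit i = 1 means flex sensor i bent   */
        const char   *text;        /* word shown on the LCD                */
        unsigned char voice_slot;  /* APR9600 message slot (assumed)       */
    };

    /* Example patterns only; the real angle combinations are in Table 2.  */
    static const struct gesture gestures[] = {
        { 0x01, "Why",   0 }, { 0x03, "Yes", 1 }, { 0x07, "Hello", 2 },
        { 0x08, "Who",   3 }, { 0x0C, "No",  4 }, { 0x0F, "Bye",   5 },
    };

    void classify_and_output(void)
    {
        unsigned char pattern = 0;
        for (unsigned char i = 0; i < NUM_SENSORS; i++)
            if (adc_read(i) < BEND_THRESHOLD)
                pattern |= (unsigned char)(1u << i);

        for (unsigned char g = 0; g < sizeof gestures / sizeof gestures[0]; g++)
            if (gestures[g].pattern == pattern) {
                lcd_print(gestures[g].text);          /* text output  */
                apr9600_play(gestures[g].voice_slot); /* voice output */
                return;
            }
        lcd_print("No symbol detected");
    }

Matching against a fixed pattern table keeps the recognizer standalone and fast enough for real-time use on a small microcontroller, at the cost of only handling the postures that have been pre-recorded.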

The proposed system aims to narrow the communication gap between the hearing-impaired or mute community and the rest of society. This project was meant as a prototype to check the feasibility of recognizing sign language using sensor gloves. With this project, hearing-impaired and mute people can use the gloves to perform sign language, which is converted into speech so that others can easily understand it. The main feature of this project is that the gesture recognizer is a standalone system, applicable in daily life. Data gloves can only capture the bending of the fingers and not the shape or motion of other parts of the body, for example the arm, elbows or face, so only static postures are captured and moving gestures are ignored. The problem of recognizing moving gestures could be resolved by using a 3-axis accelerometer at the wrist to fully capture wrist movement, while 2-axis accelerometers could be used at the elbow and shoulder.

IV. FUTURE WORK

The completion of this prototype suggests that sensor gloves can be used for partial sign language recognition. More sensors can be employed to recognize the full sign language. A handy, portable hardware device with a built-in translating system and a group of body sensors, along with the pair of data gloves, could be manufactured so that a hearing-impaired or mute person can communicate with anyone, anywhere.

V. REFERENCES

1. Geetha M., Manjusha U. C., "A Vision Based Recognition of Indian Sign Language Alphabets and Numerals Using B-Spline Approximation", IJCSE, Vol. 4, No. 03, March 2012.
2. Rung-Huei Liang, Ming Ouhyoung, "A Real-time Continuous Gesture Recognition System for Sign Language", IEEE International Conference on Automatic Face and Gesture Recognition, pp. 558-567, Japan, 1998.
3. Rajshree Rokade, Dharampal Doye, "Hand Gesture Recognition by Thinning Method", IEEE Computer Society, 2009.
4. J. W. Davis, J. William, and M. Shah, "Gesture Recognition", Technical Report CS-TR-93-11, Department of Computer Science, University of Central Florida, Orlando, USA, 1993.
5. Philippe Dreuw, David Rybach, Thomas Deselaers, Morteza Zahedi, Hermann Ney, "Speech Recognition Techniques for a Sign Language Recognition System", Interspeech, pp. 2513-2516, Antwerp, Belgium, August 2007.
6. Anuja Golliwar, Harshada Patil, Rohita Watpade, Sneha Moon, Sonal Patil, V. D. Bondre, "Sign Language Translator Using Hand Gloves", ISSN (Online): 2347-2820, Volume 2, Issue 1, January 2014.
7. Ajinkya Raut, Vineeta Singh, Vikrant Rajput and Ruchika Mahale, "Hand Sign Interpreter", IJES, Volume 1, Issue 2, pp. 19-25, 2012.
8. J. Kramer and L. Leifer, "The Talking Glove: A Speaking Aid for Nonvocal Deaf and Deaf-Blind Individuals", Proc. of the RESNA 12th Annual Conf., pp. 471-472, 1993.
9. Shoib Ahmed, "Magic Gloves: Hand Gesture Recognition and Voice Conversion System for Differentially Able Dumb People", http://www.theglobalsummit.org/wp-content/uploads/2012/08/Shoaib-ahmed.pdf
10. S. Sidney Fels and Geoffrey E. Hinton, "Glove-TalkII: A Neural Network Interface which Maps Gestures to Parallel Formant Speech Synthesizer Controls", IEEE Transactions on Neural Networks, 9(1), pp. 205-212, 1998.