Smart Gloves for Hand Gesture Recognition and Translation into Text and Audio

Anshula Kumari (1), Rutuja Benke (1), Yasheseve Bhat (1), Amina Qazi (2)

1 Project Student, Department of Electronics and Telecommunication, Bharati Vidyapeeth's College of Engineering, Navi Mumbai, Maharashtra, India.
2 Professor and Project Faculty Member, Department of Electronics and Telecommunication, Bharati Vidyapeeth's College of Engineering, Navi Mumbai, Maharashtra, India.

Abstract - Smart gloves for hand gesture recognition and translation act as a real-life communication tool for deaf and dumb people. The gloves use sign language, which is translated into text and audio. Flex sensors embedded in the glove measure the movement and orientation of the fingers to convey the message, and this data is sent to the microcontroller. The input signal must stay constant for a defined period; if the gesture remains unchanged it is considered valid, and the translated text and audio are presented on a smartphone application. The signals are transmitted over Bluetooth.

Key Words: Bluetooth module, flex sensor, PIC16F877A, sign language, Arduino text-to-speech converter.

1. INTRODUCTION

Hand Gesture Recognition (HGR) is one of the emerging semantic Human Computer Interaction (HCI) methods [6]. Sign language enables a physically challenged person to communicate freely with others, which helps not only in the workplace but also in day-to-day life. In this paper we propose a wireless hand glove that uses the orientation of the fingers to depict a particular sentence; the language used is American English. There are only about 250 certified sign language interpreters in India, translating for a deaf population of between 1.8 million and 7 million [7]. The purpose of this project is therefore to provide a mode of communication to this population and to remove the dependence of the disabled on interpreters. The application on which the output is displayed and heard is open-source software that is readily available to anyone.

Chart 1: Deaf and dumb work survey

The chart shows the decreasing proportion of literate and employed people among the deaf and dumb population, a consequence of the communication barrier between them and hearing people. This glove will help remove these differences in communication.

2. SIGN LANGUAGE

Sign languages are languages that use the visual-manual modality to convey meaning. They are fully natural languages with their own grammar and lexicon [6]. Sign languages are therefore not universal, although there are some similarities among different sign languages. They have become an easy means of communication for mute people and are used by people with hearing problems, people who are unable to physically speak, and people who have trouble with spoken language due to a disability. Each country generally has one or more native sign languages of its own. According to the 2013 edition of Ethnologue, there are 137 sign languages in total.

Of these, some sign languages have obtained some form of legal recognition, while others have no status at all. In this paper we use gestures for the English language.

3. RELATED WORK

Many attempts have been made to convert sign language into different languages. In [1], Taiwanese Sign Language is translated in real time using a wireless hand gesture recognition glove with embedded flex and inertial sensors. The finger flexion postures acquired by the flex sensors, the palm orientation acquired by the G-sensor, and the motion trajectory acquired by the gyroscope are used as the input signals of the proposed system; the paper reports an accuracy of 94%. For Chinese, [2] uses a virtual keyboard that provides a visual interface for retrieving sign icons from a sign database. The proposed language model (LM), based on a predictive sentence template (PST) tree, integrates a statistical variable n-gram LM with linguistic constraints. The PST tree, built from data collected at deaf schools, models the correspondence between signed and written Chinese. These approaches improved the efficiency of text generation and the accuracy of word prediction and, thus, the input rate. In [3], sign gestures are converted into Arabic: flex sensors measure the degree of bend, their analog outputs are read by a programmed Arduino UNO board, and the digital output drives a display that shows the corresponding Arabic letter. That project produces output only for individual alphabets. Our model, in contrast, converts sign language into English in both text and audio form at sentence level, and it uses a PIC microcontroller, which gives high power efficiency.

4. HARDWARE COMPONENTS AND THEORETICAL BACKGROUND

4.1 FLEX SENSOR

A flex sensor measures the degree of bend. It is read through a non-inverting operational amplifier stage and works from a 5 V input voltage. The degree of bend is reflected in the sensor's resistance: the greater the bend, the lower the output voltage.

Fig. 1: Flex sensor output with respect to degree of bend.

4.2 PIC MICROCONTROLLER

The PIC16F877A is an 8-bit microcontroller in a 40-pin package. It has one 16-bit and two 8-bit timers, five input/output ports, and serial and parallel interfaces. Under software control it can also reprogram its own flash memory.

4.3 BLUETOOTH MODULE

A Bluetooth module (HC-05) is used to transfer data from the microcontroller to the mobile application. Its operating voltage is 4 V to 6 V and its operating current is about 30 mA. It is used for serial communication and supports baud rates of 9600, 19200, 38400, 57600, 115200, 230400 and 460800.
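The paper does not reproduce its configuration code, so the following is only a minimal sketch of how the PIC16F877A USART could be set up for the HC-05 link at 9600 baud. The XC8 compiler, the 20 MHz crystal and the helper names (uart_init, uart_putc, uart_puts) are assumptions made here for illustration; they are not taken from the authors' firmware.

/* Hypothetical sketch: USART setup on the PIC16F877A for the HC-05 at 9600 baud.
   Assumes the XC8 compiler and a 20 MHz crystal; neither is stated in the paper. */
#define _XTAL_FREQ 20000000UL        /* assumed crystal frequency */
#include <xc.h>

void uart_init(void)
{
    TRISCbits.TRISC6 = 0;            /* RC6/TX as output */
    TRISCbits.TRISC7 = 1;            /* RC7/RX as input  */
    SPBRG = 129;                     /* 20 MHz / (16 * (129 + 1)) = approx. 9615 baud */
    TXSTAbits.BRGH = 1;              /* high-speed baud rate generator */
    TXSTAbits.SYNC = 0;              /* asynchronous mode */
    TXSTAbits.TXEN = 1;              /* enable the transmitter */
    RCSTAbits.SPEN = 1;              /* enable the serial port pins */
}

void uart_putc(char c)
{
    while (!PIR1bits.TXIF)           /* wait for the transmit buffer to empty */
        ;
    TXREG = c;                       /* hardware shifts the byte out to the HC-05 */
}

void uart_puts(const char *s)
{
    while (*s)
        uart_putc(*s++);
}

With a different crystal the divisor changes accordingly: with BRGH = 1, SPBRG = Fosc / (16 x baud) - 1.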

5. WORKING

Fig. 2: Block diagram of the proposed system.

The model proposed in this paper is an interpreter that translates sign language into text and audio. Sign language uses hand gestures that carry a generic meaning in their respective language; for example, the sentence "hello" is signed by bending all the fingers except the little finger, which is kept straight. In binary, "hello" is denoted as 00001. The first step in recognizing a hand gesture is to detect the orientation of the fingers among the 31 fundamental postures; sample gestures are shown in Fig. 3.

These hand gestures are sensed by the flex sensors, whose resistance varies with the degree of bend. The flex sensor output voltage varies from 0 to 5 V, and the sensor is read through a non-inverting operational amplifier stage based on the voltage divider rule. Each sensor is connected through three pins: ground, supply and output. When a finger bends, the flex sensor output is taken across a fixed resistor connected in series with it. The output voltage measured across the fixed resistor Rb in series with the flex sensor R is given by (1):

Vout = Vin x Rb / (R + Rb)    (1)

If the ADC reading for flex sensor 1 is 207, then, since the 10-bit ADC referenced to 5 V resolves about 5 mV per step (5000 mV / 1024 = approx. 4.9 mV), the sensor output is roughly 207 x 5 = 1035 mV.

All signals generated by the flex sensors are analog and are digitized by the PIC microcontroller, which has a 10-bit built-in ADC. The digital values are stored in a buffer, and the microcontroller compares the stored static data with these values to determine the gesture and generate the corresponding output. The values sensed by the flex sensors are numbers; to send them over the Bluetooth module they are converted into ASCII characters so that they are readable at the receiver end.

For each flex sensor there is a fixed threshold value based on the bend of the finger. The thresholds were obtained from the ADC values, which were read using the free Android application Bluetooth Terminal HC-05. The calculated threshold for each flex sensor is shown in Table 1.

Table 1: Threshold values of the flex sensors

  Flex sensor    Threshold value (ADC counts)
  H1             150
  H2             150
  H3             165
  H4             160
  H5             235

Using this table, if the ADC value of flex sensor H1 is greater than its threshold of 150 it gives 1 as output, and if it is less than 150 it gives 0. The binary combinations corresponding to the different gestures, together with their respective messages, are stored in program memory.
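The firmware itself is not listed in the paper, so the following sketch only illustrates the threshold step of Table 1 in C, building a 5-bit gesture code with one bit per flex sensor (compare the 00001 pattern used for "hello"). The PIC16F877A/XC8 register usage, the assignment of H1-H5 to channels AN0-AN4 and the helper names are assumptions for illustration.

/* Hypothetical sketch: read the five flex sensors through the PIC16F877A's
   10-bit ADC and compare each reading with its Table 1 threshold to form a
   5-bit gesture code. Channel mapping and names are assumed, not the authors'. */
#define _XTAL_FREQ 20000000UL                     /* assumed 20 MHz crystal */
#include <xc.h>

static const unsigned int THRESHOLD[5] = {150, 150, 165, 160, 235};  /* H1..H5, Table 1 */

void adc_init(void)
{
    TRISA |= 0x2F;                                /* RA0..RA3 and RA5 (AN0..AN4) as inputs */
    ADCON1 = 0x80;                                /* right-justified result, port A analog */
}

static unsigned int adc_read(unsigned char channel)    /* channel 0..4 = AN0..AN4 */
{
    ADCON0 = (unsigned char)((channel << 3) | 0x81);   /* Fosc/32 clock, select channel, ADC on */
    __delay_us(20);                                    /* acquisition time */
    ADCON0bits.GO_nDONE = 1;                           /* start the conversion */
    while (ADCON0bits.GO_nDONE)                        /* wait until it completes */
        ;
    return ((unsigned int)ADRESH << 8) | ADRESL;       /* 10-bit result, 0..1023 */
}

unsigned char read_gesture_code(void)
{
    unsigned char code = 0, i;
    for (i = 0; i < 5; i++) {
        if (adc_read(i) > THRESHOLD[i])           /* reading above threshold -> bit = 1 */
            code |= (unsigned char)(1u << i);     /* otherwise the bit stays 0 */
    }
    return code;                                  /* e.g. 0x01 corresponds to the pattern 00001 */
}

With the divider behaviour described in section 4.1 (output voltage falling as the bend increases), a reading above the threshold corresponds to a relatively straight finger.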

The stored message is then transferred, over the Bluetooth module connected on the UART interface, to a free Android application named Arduino Bluetooth Text to Speech Converter. The communication runs at a baud rate of 9600 bits/s. When the application receives the message, it converts it into speech.

Fig. 3: Sample gestures used for communication along with their binary codes.

6. PROGRAMMING IN C

All the signs corresponding to the sentences were decoded in the C programming language, and the code was then flashed into the PIC microcontroller. Open-source software is used to display the output in both text and audio form.

7. RESULT

Fig. 4: Obtained sample result for the sentence "IT HELPS TO COMMUNICATE WITH OTHERS".
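Section 6 states only that the decoding was written in C and flashed into the PIC; the source itself is not published. As a hypothetical illustration of how sections 5 and 6 fit together, the main loop below holds a gesture for a short period (the "defined period" mentioned in the abstract) before looking up its stored sentence and sending it as ASCII text to the Android application. The hold time, the second table entry and all helper names are invented for this sketch; uart_init/uart_puts and adc_init/read_gesture_code refer to the earlier sketches.

/* Hypothetical sketch of the overall decode-and-transmit loop: a gesture code
   must remain unchanged for a short hold period before its stored sentence is
   sent over the HC-05 link. Only the "hello" = 00001 pattern and the result
   sentence come from the paper; everything else is assumed for illustration.
   Device configuration bits are omitted for brevity. */
#define _XTAL_FREQ   20000000UL      /* assumed 20 MHz crystal */
#define HOLD_SAMPLES 10              /* assumed "defined period": 10 samples of 100 ms */
#include <xc.h>

/* From the earlier sketches (sections 4.3 and 5). */
extern void uart_init(void);
extern void uart_puts(const char *s);
extern void adc_init(void);
extern unsigned char read_gesture_code(void);

/* Example gesture-code-to-sentence table held in program memory. */
static const struct { unsigned char code; const char *text; } MESSAGES[] = {
    { 0x01, "HELLO" },                                  /* 00001, from the paper         */
    { 0x03, "IT HELPS TO COMMUNICATE WITH OTHERS" },    /* code invented for this sketch */
};

void main(void)
{
    unsigned char last = 0xFF, stable = 0, i;

    adc_init();
    uart_init();

    for (;;) {
        unsigned char code = read_gesture_code();
        if (code == last) {
            if (stable < HOLD_SAMPLES && ++stable == HOLD_SAMPLES) {
                /* Gesture held long enough: send its sentence once as ASCII. */
                for (i = 0; i < sizeof MESSAGES / sizeof MESSAGES[0]; i++) {
                    if (MESSAGES[i].code == code) {
                        uart_puts(MESSAGES[i].text);
                        uart_puts("\r\n");
                        break;
                    }
                }
            }
        } else {
            last = code;              /* new gesture: restart the hold counter */
            stable = 0;
        }
        __delay_ms(100);              /* sampling period (assumed) */
    }
}

On the phone side, the Arduino Bluetooth Text to Speech Converter application then displays the received text and speaks it aloud, as described in section 5.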

8. CONCLUSION

In this paper, a real-time and portable sign language translation system for English has been presented. To provide a mode of communication, an independent smart glove was designed for mute people at low cost. We were able to achieve a high degree of accuracy in gesture recognition, and the model can be coded for 31 sentences. The whole system is built from easily available electronic components and free Android applications, which helped reduce the overall cost. Such an inexpensive but effective product will help mute people put their words forward through gestures and reduce their dependency on interpreters.

9. REFERENCES

[1] Lih-Jen Kau, Wan-Lin Su, Pei-Ju Yu and Sin-Jhan Wei, "A Real-time Portable Sign Language Translation System," 2015 IEEE 58th International Midwest Symposium on Circuits and Systems (MWSCAS), 2015.
[2] Chung-Hsien Wu et al., "Text Generation from Taiwanese Sign Language Using a PST-Based Language Model for Augmentative Communication," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 12, no. 4, Dec. 2004.
[3] Tariq Jamil, "Design of a Real-Time Interpreter for Arabic Sign Language," IEEE SoutheastCon, 2018.
[4] "Design of Smart Gloves," International Journal of Engineering Research & Technology (IJERT), vol. 3, issue 11, November 2014.
[5] Laura Dipietro, Angelo M. Sabatini and Paolo Dario, "A Survey of Glove-Based Systems and Their Applications," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 38, no. 4, pp. 461-482, July 2008.
[6] Nimisha Jaiswal, "With a deaf community of millions, hearing India is only just beginning to sign," GlobalPost, January 4, 2017.
[7] "Signing Made Easy: A Complete Program for Learning Sign Language, Includes Sentence Drills and Exercises for Increased Comprehension and Signing Skill."