A Design of Prototypic Hand Talk Assistive Technology for the Physically Challenged


S. Siva Srujana 1, S. Jahnavi 2, K. Jhansi 3
1, 2, 3 Pragati Engineering College, Surampalem, Peddapuram, A.P., India

Abstract: In our day-to-day life, most of the tasks we carry out involve speaking and hearing. Deaf and dumb people have difficulty communicating with others who cannot understand sign language. In this paper we design a simple embedded-system-based device to address this problem. Flex sensors capture the sign-language gestures of the user, the AT89C51 microcontroller controls all operations, and the APR9600 voice chip stores the voice messages. An LCD display and a speaker are used as output devices to convey the message. The Keil and Proteus software tools are used for compiling the firmware and simulating the design.

Keywords: ADC, AVR, FET, LCD, LED, PWM, MEMS, VR, RF

1. Introduction

Glove-based systems [5] represent one of the most important efforts aimed at acquiring hand-movement data. Dumb people generally use sign language for communication, but they find it difficult to communicate with others who do not understand sign language. This work is based on the need for an electronic device that can translate sign language into speech, so that communication between the mute community and the general public becomes possible. A wireless data glove is used: a normal cloth driving glove fitted with flex sensors along the length of each finger and the thumb. Mute people can use the glove to perform hand gestures, which are converted into speech so that normal people can understand their expression.

2. Existing System

The existing system uses a MEMS sensor [1] to capture the sign-language data from the deaf and dumb user, the AT89C51 microcontroller to control all operations, and the APR9600 voice chip for voice storage. An LCD display and a speaker act as output devices to convey the message, and Keil and Proteus are used for compiling the firmware and simulating the design. The second step, recognizing the sign or gesture once it has been captured, is much more challenging, especially in a continuous stream; this is currently the focus of the research.

The objective of this paper is to design a simple embedded-system-based communication device for deaf and dumb people. Two major problems are taken into consideration: first, deaf and dumb people communicating with a normal person, and second, communication between deaf and dumb people themselves. To solve these problems, the system provides two modes of operation. The actions performed by the user are measured using flex sensors [1] attached to a glove worn on the hand. Once the glove is placed on the hand, whenever a sign-language gesture is performed, the bending values are obtained and the corresponding action is identified by the AT89C51 microcontroller.
The microcontroller then activates the APR9600 voice chip, the corresponding message is spoken through the speaker, and it is also displayed on the LCD. Virtual reality is a computer interface that includes simulation and interaction through different sensory channels in real time, which may be visual, acoustic, tactile or olfactory. The high prices that characterize virtual-reality devices have led to the search for less sophisticated alternatives built on conventional computing devices such as the keyboard, mouse and monitor; this is known as desktop virtual reality, and VRML can be mentioned among its main computer programs.

3. Proposed System

Our proposal will help deaf and dumb people who are unable to communicate or who have difficulties in communication. The data glove is equipped with five flex sensors [1], each fixed along one finger of the glove for monitoring and sensing the static movements of the fingers of the hand. Whatever the person wants to communicate is activated in one of two ways: either by hand gesture or by the keypad on the device. A basic language for the dumb, American Sign Language, shown in Fig. 1, helps these special people in these circumstances.

Figure 1: American Sign Language

This text input is then processed by the microcontroller.

Further, frequently spoken words can be stored in the memory of the APR9600 voice chip and easily retrieved using hotkeys. The output on the LCD can be read by the deaf person, while the speaker lets the hearing person listen to the message. This device helps in communication if it is attached to both persons involved in the conversation, who may be deaf, dumb or normal. The flex sensor [1] senses the sign language performed by the deaf person and produces an output. The output of the flex sensor is given to the microcontroller through an ADC. In the microcontroller, a particular word is already programmed for each output of the sensor; this word is recorded in the voice chip and heard from the speaker. If the controller instead accepts input from the keypad, the output is displayed on the LCD.

Concept

The practical implementation of this project is achieved using the Keil software. The hardware uses many components: a step-down transformer, a DC shunt motor, an 8051 microcontroller, an A/D converter (MCP3208), an LCD display, relay units of 12 V capacity, a motor driver (ULN2003), a voice module (APR9600) and, most importantly, MEMS sensors.

Figure 2: Hand Talk Assistance glove

In the area of technology applied to sign-language interpretation, significant work has gone in two directions: the first translates spoken language or text into sign language, and the second translates the signer's gestures into spoken language or text.

4. Hand Assistive System for the Physically Challenged

Dumb people are usually deprived of normal communication with other people in society. It has been observed that they find it really difficult at times to interact with normal people through their gestures, as only very few of those gestures are recognized by most people. Since people with hearing impairment or deaf people cannot talk like normal people, they have to depend on some sort of visual communication most of the time. Sign language is the primary means of communication among the deaf and dumb.

The block diagram below represents the connection of the sensors to the required loads for the physically challenged [2][3]. The 230 V AC supply is drawn through a step-down transformer, which brings the input voltage to the required level, i.e. 12 V. This 12 V AC input is then converted into 12 V DC with the help of a rectifier. The rectified voltage is passed through a capacitor unit to make it a ripple-free, pure DC supply, and is then regulated using 7805 and 7812 voltage regulators. This regulated supply is used to light the LEDs and to power the entire apparatus. The basic idea behind this project is to interface MEMS sensors to provide better communication for the physically challenged [2][3].

Hence, according to the movement of the user's hand, the MEMS sensors detect the hand motion and the system conveys the user's needs accordingly, either by voice playback or on the LCD present on the motherboard. Initially, the MEMS sensor is moved in order to detect the motion of the user's hands; with the sensing technology inherent in these sensors, the movement of the hands is detected in terms of coordinates with respect to the X, Y and Z axes. The MEMS sensor takes an initial supply voltage of 5 V. The A/D converter used is the MCP3208; its pins are grouped as analog inputs and outputs and digital inputs and outputs, and the supply to the sensors is routed through this A/D unit.

Figure 3: Block Diagram

The output from the A/D unit is sent to the AC and DC loads through the microcontroller unit and the relay unit. The microcontroller unit is employed to select the various modes of the apparatus. In this unit, disc and cylindrical capacitors are employed, and a crystal oscillator is used to eliminate disturbances. The LCD is an important component of this unit, as it displays the requirements of the handicapped user. Next, the relay unit comes into action: it trips the circuit and controls the switching ON and OFF of the AC and DC loads. The AC load is a 15 W bulb, and the DC load is a DC shunt motor rated at 10 rpm, 600 mA and 12 V.

In the final stage, the voice playback module comes into play. This unit provides 8 slots into which voice messages can be recorded and played back according to the requirements of the user. After the recording of the voices is done, the module is switched to PLAY mode, and the recorded messages are tied to the sensor outputs through the program in the microcontroller, as shown in Fig. 3.
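The firmware that drives this playback module is not listed in the paper, but the idea can be illustrated with a minimal Keil C51 sketch. The port used for the trigger lines, the active-low pulse, the hold time and the routine name are assumptions made only for illustration, not details taken from the prototype.

#include <reg51.h>              /* Keil C51 register definitions for the 8051 family */

/* Assumed wiring: the eight APR9600 message-trigger inputs on port P1, active low. */
#define APR_MSG_PORT P1

/* Crude software delay; the loop count is only approximate and depends on the crystal. */
static void delay_ms(unsigned int ms)
{
    unsigned int i, j;
    for (i = 0; i < ms; i++)
        for (j = 0; j < 120; j++)
            ;
}

/* Play the pre-recorded message in slot 'index' (0..7) by pulsing its trigger pin low. */
void apr9600_play(unsigned char index)
{
    APR_MSG_PORT = 0xFF;                           /* all trigger lines idle (high)    */
    APR_MSG_PORT = (unsigned char)~(1u << index);  /* pull only the selected line low  */
    delay_ms(50);                                  /* hold long enough to be latched   */
    APR_MSG_PORT = 0xFF;                           /* release; the chip plays the slot */
}

In the actual design the slot number would come from the gesture or key that has been recognized, as described in the two modes of operation below.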
Figure 4: Schematic Diagram

The schematic diagram in Fig. 4 represents the connections between the components of the hand-talk assistive technology for the physically challenged. First of all, the movement of the challenged user's hand is detected through the coordinate sensing of the MEMS sensor [1]. After sensing these movements, the sensor sends an analog signal to the A/D converter in the next stage. The A/D converter, which operates from 5 V, converts the analog signal into a digital signal and thus reads the movements in its own language. This digital signal is carried to the motherboard, where the microcontroller decides which mode of output to drive; the other mode of communication, the LCD display, is connected to the other end of the microcontroller.
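How the microcontroller could fetch the digitized sensor value from the MCP3208 can be sketched with a bit-banged SPI read in Keil C51. The pin assignments and the simplified handling of the converter's sample and null-bit clocks are illustrative assumptions rather than the prototype's actual wiring.

#include <reg51.h>

/* Assumed bit-banged SPI wiring between the 8051 and the MCP3208. */
sbit ADC_CS   = P2^0;   /* chip select, active low            */
sbit ADC_CLK  = P2^1;   /* serial clock                       */
sbit ADC_DIN  = P2^2;   /* data from the 8051 to the MCP3208  */
sbit ADC_DOUT = P2^3;   /* data from the MCP3208 to the 8051  */

/* Read one single-ended channel (0..7) and return the 12-bit result (0..4095). */
unsigned int mcp3208_read(unsigned char channel)
{
    unsigned char i;
    unsigned int  result = 0;
    unsigned char cmd = 0x18 | (channel & 0x07);   /* start bit, SGL/DIFF = 1, 3 channel bits */

    ADC_CS  = 0;                                   /* begin the conversion frame */
    ADC_CLK = 0;
    for (i = 0; i < 5; i++) {                      /* shift the 5 command bits out, MSB first */
        ADC_DIN = (cmd & 0x10) ? 1 : 0;
        ADC_CLK = 1;
        ADC_CLK = 0;
        cmd <<= 1;
    }
    ADC_CLK = 1; ADC_CLK = 0;                      /* sample period               */
    ADC_CLK = 1; ADC_CLK = 0;                      /* null bit from the converter */
    for (i = 0; i < 12; i++) {                     /* clock the 12 data bits in, MSB first */
        ADC_CLK = 1;
        ADC_CLK = 0;                               /* data changes on the falling edge */
        result = (result << 1) | (ADC_DOUT ? 1 : 0);
    }
    ADC_CS = 1;                                    /* end of frame */
    return result;
}

The routine returns a 12-bit value between 0 and 4095, which is the quantity the firmware compares against the stored gesture values in the modes described below.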

On the motherboard, various components interface the modes of communication for these special users, as shown in Fig. 5.

Figure 5: Overview of the LCD Display System

Initially, the supply to this board is taken from a 230 V/12 V step-down transformer, which reduces the 230 V AC mains to a 12 V AC input. A rectifier then converts the AC voltage to DC, and a 1000 uF capacitor filters the impurities in the DC input voltage. Further, this voltage is divided and regulated by two voltage regulators of 12 V and 5 V respectively, which keep the supply within limits. The various disc and cylindrical capacitors are useful in controlling the glow of the LEDs, the amount of input through the diodes and the LCD display.

Along with the implementation of the two basic modes, there is also a provision to employ both AC and DC loads in this project. A DC shunt motor rated at 10 rpm and 600 mA is employed along with an AC load, a bulb of 15 W. The operation of these loads is controlled solely by a relay unit, which trips whenever the input to it from the microcontroller is 1 and operates whenever the input is 0. This relay unit plays a very important role in connecting the AC and DC loads to the apparatus. The AC and DC loads are interfaced to the system through the ULN2003 motor driver, which interfaces the loads and regulates the amount of power passing through them.

The T89C5115 is a high-performance Flash version of the 80C51 single-chip 8-bit microcontroller. It contains a 16-KB Flash memory block for program and data. The 16-KB Flash memory can be programmed either in parallel mode or in serial mode with the ISP capability, or with software, and the programming voltage is internally generated from the standard VCC pin.

In the final stage, the voice playback module comes into play. This module has connections from the relay as well as from the motherboard unit. A total of 8 pins are present in this module, and each pin can store a unique requirement of the physically challenged user, which gets played when the mode is switched to voice playback, as shown in Fig. 6.

Figure 6: Overview of the Voice Playback Module

The mode is changed to the LCD display when the display mode is selected by the action of the microcontroller. This prototype can also be implemented using a glove fitted with sensors, which may be MEMS or flex sensors, as shown in Fig. 7.

Figure 7: Implementation using a hand glove

The first mode of operation is the voice mode. In this mode the input is obtained from the flex sensors, which sense the sign language performed by the deaf and dumb person or by a normal person. The resistance of each flex sensor changes according to the degree of bending of the fingers. The outputs from the flex sensors [1] are multiplexed into a single analog signal, which is given to the ADC to be converted into a digital value. This digital value is given to the microcontroller, in which a particular word or message has already been programmed for each voltage value coming from the flex sensor. The microcontroller compares the incoming value with the stored values; if a match is found, the corresponding word stored in the voice chip is selected, its output is given to the audio amplifier, and the word is heard through the speaker, so that the listener receives the information the dumb person wants to convey. Vice versa, the dumb person can also respond using gestures, and thus two-way communication can be accomplished.
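A minimal sketch of this compare-and-speak step is given below. It reuses the hypothetical mcp3208_read and apr9600_play helpers from the earlier sketches, and the threshold bands and channel number are illustrative placeholders; the real values would come from calibrating each flex sensor against the signed words.

#include <reg51.h>

/* Helper routines sketched earlier; their names and wiring are assumptions. */
extern unsigned int mcp3208_read(unsigned char channel);
extern void apr9600_play(unsigned char index);

/* Illustrative calibration table: one ADC band per stored word.  Real values
   would come from calibrating each flex sensor on the glove. */
#define NUM_GESTURES 4
static unsigned int code low_limit[NUM_GESTURES]  = {  100,  600, 1400, 2600 };
static unsigned int code high_limit[NUM_GESTURES] = {  400, 1200, 2200, 3400 };

/* Voice mode: read the multiplexed flex-sensor value and, if it falls inside a
   calibrated band, play the matching pre-recorded word through the APR9600. */
void voice_mode_poll(void)
{
    unsigned char g;
    unsigned int  value = mcp3208_read(0);   /* flex-sensor signal on channel 0 (assumed) */

    for (g = 0; g < NUM_GESTURES; g++) {
        if (value >= low_limit[g] && value <= high_limit[g]) {
            apr9600_play(g);                 /* speak the word recorded in slot g */
            break;
        }
    }
}

In practice one band would be calibrated for each recorded APR9600 slot, and several sensor channels could be combined before the lookup.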
The second mode of operation is the text mode. In this mode the controller accepts the input of the deaf, dumb or normal person through a keypad. When a particular key on the keypad is pressed, the controller detects it by scanning each row and column in turn. For each key press, a particular word stored in the microcontroller is displayed on the LCD, which helps the deaf person to see the message. Vice versa, the deaf person can also respond using the keypad.
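The row-and-column scan of the text mode can be sketched in the same style. The keypad port, the lcd_print routine and the word table below are hypothetical stand-ins for whatever mapping the prototype actually stores; only the scanning pattern itself follows the description above.

#include <reg51.h>

/* Assumed wiring: 4x4 keypad on port P0, rows on the low nibble (outputs) and
   columns on the high nibble (inputs held high, pulled low by a key press). */
#define KEYPAD_PORT P0

/* Hypothetical LCD routine; the display driver itself is not shown in the paper. */
extern void lcd_print(const char *msg);

/* Illustrative words assigned to the first four keys. */
static const char *key_word[4] = { "WATER", "FOOD", "HELP", "THANK YOU" };

/* Scan the keypad by driving one row low at a time and reading the columns.
   Returns 0..15 for a pressed key, or 0xFF if nothing is pressed. */
unsigned char keypad_scan(void)
{
    unsigned char row, col, cols;
    for (row = 0; row < 4; row++) {
        KEYPAD_PORT = (unsigned char)(~(1u << row) | 0xF0); /* selected row low, columns high */
        cols = (KEYPAD_PORT >> 4) & 0x0F;                   /* read the four column pins      */
        for (col = 0; col < 4; col++) {
            if (!(cols & (1u << col)))
                return (unsigned char)(row * 4 + col);
        }
    }
    return 0xFF;
}

/* Text mode: show the word mapped to the pressed key on the LCD. */
void text_mode_poll(void)
{
    unsigned char key = keypad_scan();
    if (key < 4)
        lcd_print(key_word[key]);
}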

The detailed hardware setup is shown in Fig. 8.

Figure 8: Detailed Hardware Setup

5. Conclusion

This paper describes the design and working of a system which is useful for deaf and dumb people to communicate with one another and with normal people. The deaf and dumb use their standard sign language, which is not easily understood by common people. This system converts the sign language into voice, which is easily understandable, and also translates it into text form to help deaf people convey their messages to others. The text is displayed on the LCD and spoken through the speaker, so that the other person can understand. In this way our project is very useful for deaf and dumb people [2][3] and for other robotics applications.

This project aims to lower the communication gap between the deaf or mute community and the normal world, and was meant to be a prototype to check the feasibility of recognizing sign language using sensor gloves.

References

[1] Jaime Leybon Ibarra, Maria del Rocio Ramirez Barba and Veronica Taboada Picazo, "Photoelectric Sensor Applied to the Movement of the Fingers of the Hands" (in Spanish), Computación y Sistemas, Vol. 10, No. 1, 2006, pp. 57-68, ISSN 1405-5546.
[2] Bharatwaj R.S., Vijaya K., Rajaram P., "A descriptive study of knowledge, attitude and practice with regard to voluntary blood donation among medical undergraduate students", Journal of Clinical and Diagnostic Research, ISSN: 0973-709X, 6(S4), 2012, pp. 602-604.
[3] Fernando López, Javier Tejedor, Daniel Bolaños, José Colás, "Multi-device Spanish Sign Language Interpreter" (in Spanish), IADIS Ibero-American Conference WWW/Internet 2006.
[4] Anbuselvi S., Rebecca J., "A comparative study on the biodegradation of coir waste by three different species of marine cyanobacteria", Journal of Applied Sciences Research, ISSN: 1815-932X, 5(12), 2009, pp. 2369-2374.
[5] Sidney S. Fels and Geoffrey E. Hinton, "Glove-Talk: A Neural Network Interface Between a Data-Glove and a Speech Synthesizer", IEEE Transactions on Neural Networks, Vol. 3, No. 6, November 1992.
[6] Raj M.S., Saravanan T., Srinivasan V., "A modified direct torque control of induction motor using space vector modulation technique", Middle-East Journal of Scientific Research, ISSN: 1990-9233, 20(11), 2014, pp. 1572-1574.

Author Profile

S. Siva Srujana is pursuing her B.Tech at Pragati Engineering College, Surampalem, near Peddapuram, under JNTU College of Engineering, Kakinada. Her area of research includes applications in robotics.

S. Jahnavi is pursuing her B.Tech at Pragati Engineering College, Surampalem, near Peddapuram, under JNTU College of Engineering, Kakinada. Her area of research includes applications in robotics.

K. Jhansi Rani is pursuing her B.Tech at Pragati Engineering College, Surampalem, near Peddapuram, under JNTU College of Engineering, Kakinada. Her area of research includes applications in robotics.