RASPBERRY PI BASED GESTURE BASED EMAIL ACCESS AND VOICE COMMAND FOR BLIND AND DUMB

Navajyothi N 1, Dr. K R Nataraj 2
1 PG Student, SJB Institute of Technology, Bangalore, India
2 Head of the Department, ECE, SJB Institute of Technology, Bangalore

Abstract - In today's world, communication has become easy because communication technologies are integrated with the internet. Blind people, however, find this technology difficult to use because it requires visual interaction, and in our country around 2.78% of people are unable to speak (dumb). Their communication with others relies only on hand movements and expressions. This project is proposed to reduce these difficulties for dumb and blind people. The system is based on a motion sensor (accelerometer). The gesture-based system is designed to enable communication among dumb, deaf and blind people and their communication with unimpaired people. Gestures are captured with a data glove built around a microcontroller: the glove's inbuilt sensor detects the movements of the hands, converts them into a voice output, and generates an email with predefined images and subject to the concerned person.

Key Words: Gestures, Accelerometer, Raspberry Pi, Arduino Uno, Qt Creator, LXTerminal.

1. INTRODUCTION

Handicapped individuals are people who are physically or mentally challenged. Dumb and blind people face many problems in society. Handicap here refers to a condition of vulnerability and to the limitations experienced by a disabled person in comparison with the activities of unimpaired people of similar age and sex. This paper aims to lower the barrier in communication. It is driven by the need to build an electronic device that can translate sign language into speech, so that communication between the mute community and the general population becomes possible. The hand motions of dumb users are detected by the glove and converted into voice so that ordinary people can understand them; a gesture in sign language is a specific movement of the hands with a particular shape. The system also sends an email with predefined images to the concerned person on behalf of the dumb and blind user.

1.1 GESTURE RECOGNITION

Gesture recognition is the interpretation of human gestures through mathematical algorithms and falls under computer science and language technology. Gestures are expressions that are widely used in communication between people. Research is in progress that aims to incorporate gesture as an input modality in Human Computer Interaction (HCI). In human communication, the use of speech and gestures is completely coordinated. Machine gesture and sign language recognition is about recognizing gestures and sign language using computers. Several techniques are used for gathering information about body positioning; they are typically either image based (using cameras, moving lights and so on) or device based (using instrumented gloves, position trackers and so on). In this paper I have used a motion sensor (accelerometer) to obtain the gestures of the human hands.
2. LITERATURE REVIEW

Researchers have made suggestions clearly related to innovations in gesture-controlled interfaces for the present population. Although there are individual viewpoints and many points to note from the surveyed work, this review concentrates on the following categories, because these are the most important areas of gesture-based user interfaces.

Gesture-based systems have been researched for around thirty years, and investigators have been working on them continuously. Most of this research is based on hand gestures. Direct control via hand posture is fast, but limited in the number of choices; there is also work on body gestures and finger-pointing movements. In the early stages, researchers used gloves with a microcontroller connected to the device through a cable. Head gestures and gestures produced together with voice were also examined, but hand gestures remained the dominant part of gesture control systems.

Users: most studies focus on general users of all ages. Initially the work was mainly for PC users working on documents or presentations. Wheelchair users prefer gesture-controlled systems based on accelerometer sensors, and over the last five years studies have also addressed elderly and disabled people. The literature shows that gesture-based applications are used for many things: entertainment, controlling home appliances, tele-care, tele-health, and elderly or disability care. The range of applications shows the need for more research on gesture-controlled systems. Most applications aim to replace conventional input devices such as the keyboard and mouse, or to open up applications for elderly and impaired users, for example through an accelerometer. People can now interact with any medium using gestures to control a wide variety of applications, and gesture-based commercial products have existed since 2003. Gestures have been captured using infrared beams, data gloves, still cameras, wired devices and later refinements such as gloves, pendants and infrared signal network servers. Recent vision, video and webcam based gesture recognition techniques have made it possible to capture interactive gestures for ubiquitous devices in the ordinary environment with 3D perception.

Recent surveys of the work done in the gesture recognition field are described in [4]. Papers [5] and [6] evaluate gesture recognition for human-robot assembly and human-robot interaction. The Visual Touchpad, a two-handed gestural input device [6], provides a low-cost vision-based input device that behaves like a touch screen and allows interaction with PCs, laptops, public kiosks, or large wall-mounted displays. Using a camera, the 3D positions of the fingertips can be obtained, and stereo vision can determine the distance between a fingertip and the Visual Touchpad. From this paper I have used fingertip motion distance data.

Paper [7] offers a fresh "signal level" perspective by examining prosodic phenomena of continuous gesture and speech co-production, and presents a computational framework for continuous gesture recognition based on two phenomena that capture deliberate (co-verbal) and automatic (physiological) contributions to prosodic synchronization. In that work gestures are used to control a wheelchair, which can be done using two algorithms. From this paper I have taken hand gesture recognition.

Paper [8] discusses different classifications of gesture recognition; gestures can be recognized in various ways, and signals can be acquired easily with a camera or with motion sensors.

Paper [9] describes sign recognition for dumb and deaf people using a system built around a microcontroller. Recognizing gestures is set as the primary purpose of the glove design, which requires flex sensors, a voice module and a microcontroller; an AT89S52 is used as the microcontroller, and a Bluetooth module with an Android phone can be used for wireless data transfer. After evaluating the hardware required for the recognition of signs, the glove is fabricated physically. Instead of a microcontroller, I am using a Raspberry Pi 3 in my project so that it can be accessed remotely over the web.

In [10], human interface devices have always been a constraint on interaction between humans and the digital world; traditionally a mouse and keyboard are used to interface with computers, yet it is far easier to interact with a PC if natural gestures, such as a tilt of the arm or a pointing motion, could control the mouse. This has already seen considerable use in gaming, where hand-held wired controllers such as joysticks are being replaced by natural hand movements incorporated into the game. Such systems are handled through software middleware that interprets data from sources such as cameras and sensors. The paper aims to provide an embedded low-cost solution that works in primitive conditions without great complexity: data from an assortment of sensors is processed and mapped to a particular parameter that can be controlled in the physical world. It examines a prototype that controls the mouse of a PC through movements and tilt of the wrist, realized using an accelerometer, a reed switch and a flex sensor. From this paper I have taken information about the sensors used to obtain gesture data and to control the mouse and keyboard wirelessly.

3. PROPOSED METHODOLOGY

This paper presents the methodology of the work. First the hardware devices are introduced, such as the ZigBee module, which is used at both the transmitter and the receiver end. The accelerometer produces signals corresponding to the hand gestures, and based on the received data the commands are sent to the receiver end, which processes them and gives the output through a speaker as a voice command for some gestures and as an email alert with predefined images for specific hand movements.

Fig-1: Block Diagram
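For clarity, a minimal sketch of the receiver-side flow described above is given here, written in Python with the pyserial library. The serial device path, baud rate and frame format are assumptions for illustration and are not taken from the paper.

import serial  # pyserial; the ZigBee receiver appears to the Pi as a serial port

# Assumed device path and baud rate for the ZigBee module attached to the Pi.
ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)

while True:
    raw = ser.readline().decode(errors='ignore').strip()  # ASCII frame from the serial (SCOM) buffer
    if not raw:
        continue
    try:
        value = int(raw)      # the "ATOI" step: ASCII string converted to an integer
    except ValueError:
        continue              # ignore malformed frames
    print('gesture value:', value)  # threshold comparison and outputs are sketched in Section 4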

The Raspberry Pi module is fundamentally used to implement this project and acts as the heart of the work; it is used to produce the output of the proposed system. Accelerometer sensors are the motion sensors. The gesture-based system is a microcontroller-based system intended to simplify interaction among dumb, deaf and blind people and their interaction with unimpaired people, and the proposed system can easily be reconfigured to work as a smart device. The gathered signals are passed to the Raspberry Pi board using the ZigBee module; the ZigBee module is interfaced with the Arduino board to pass the gestures to the receiver end. At the receiver end, ZigBee receives the data and transmits it to the Pi board, which converts the signals into human-recognizable voice through the ATOI procedure, in which the ASCII data in the serial (SCOM) buffer is converted into an integer value. The result is then compared with the set threshold values; for each particular threshold range the system gives a voice command as instruction and sends an email alert with predefined images.

3.1 RASPBERRY PI 3 MODEL B

Fig-2: Raspberry Pi 3 Model

In this project the Raspberry Pi board receives the gesture value from the ZigBee module at the receiver end. The received ASCII X value is fed into the serial (SCOM) buffer, converted into an integer value by the ATOI command in the program, stored in a temporary buffer, and compared with the set threshold values. Finally the board gives a voice command as output for the corresponding threshold and sends an email alert with predefined images.

3.2 SOFTWARE USED

In this work I have used Qt Creator with C++ and Python code to create the application and run it on Raspbian OS, which is copied as an image to the micro SD card inserted into the Raspberry Pi board. Qt is a cross-platform application framework commonly used for developing application software with a graphical user interface (GUI), and also used for developing non-GUI programs such as command-line tools and servers. Qt uses standard C++ but makes extensive use of a special code generator (called the Meta Object Compiler, or moc) together with several macros to enrich the language. Qt can also be used in several other programming languages through language bindings. It runs on the major desktop platforms and some mobile platforms and has broad internationalization support. Non-GUI features include SQL database access, XML parsing, thread management, network support, and a unified cross-platform application programming interface (API) for file handling. The code is written in Qt Creator using C++ and Python, the results are displayed in the LXTerminal window, and the output can be checked at the audio jack and in the email.

4. RESULTS OBTAINED

The developed prototype using the Raspberry Pi 3 board, Arduino Uno board, ZigBee module and accelerometer sensor is shown below.

Fig-3: Prototype Developed

In the proposed work I have set threshold values between >=400 and <475 and created three commands for dumb and blind people. The commands with their threshold ranges are as follows:
1. >=400 and <420: command - Towards Bus Route
2. >=425 and <450: command - Please provide water
3. >=450 and <475: command - Hello sir
In the same way, any number of commands can be added in the program.
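As an illustration of how these threshold ranges can be mapped to commands, a small Python sketch is shown below. The threshold values and command texts are taken from the list above; the use of the espeak utility for the audio output and the image file names are assumptions made for this sketch only.

import subprocess

# Threshold ranges and commands as listed above; image file names are placeholders.
COMMANDS = [
    (400, 420, "Towards Bus Route", "bus_route.jpg"),
    (425, 450, "Please provide water", "water.jpg"),
    (450, 475, "Hello sir", "hello.jpg"),
]

def handle_gesture(value):
    """Speak the command whose threshold range contains value; return its text and image."""
    for low, high, text, image in COMMANDS:
        if low <= value < high:
            subprocess.call(["espeak", text])  # voice output on the Pi's audio jack (assumes espeak is installed)
            return text, image
    return None                                # value outside all configured ranges

Further commands are added simply by appending more (low, high, text, image) rows to the table.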
When the code is executed in the Raspbian terminal window, the result for the threshold range >=400 and <420 is obtained and the audio output "Towards Bus Route" is played; the corresponding email alert is shown in Fig-4.
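Along with the audio output, the system sends an email alert carrying the predefined image. A minimal sketch of how such an alert could be sent with Python's standard smtplib and email modules follows; the SMTP server, sender, password, recipient and image path are assumptions for illustration only and do not come from the paper.

import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.image import MIMEImage

def send_gesture_alert(subject, body, image_path,
                       sender="pi.project@example.com",     # assumed credentials
                       password="app-password",
                       recipient="caretaker@example.com",
                       smtp_host="smtp.gmail.com", smtp_port=587):
    """Send an email alert carrying the predefined image for a recognized gesture."""
    msg = MIMEMultipart()
    msg["Subject"] = subject
    msg["From"] = sender
    msg["To"] = recipient
    msg.attach(MIMEText(body))
    with open(image_path, "rb") as f:
        msg.attach(MIMEImage(f.read()))        # attach the predefined image
    with smtplib.SMTP(smtp_host, smtp_port) as server:
        server.starttls()                      # upgrade to TLS before logging in
        server.login(sender, password)
        server.send_message(msg)

Example use for the first gesture: send_gesture_alert("Command 1", "Towards Bus Route", "bus_route.jpg")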

Fig-4: Email alert for Command 1

The result shown below is for command 2, with the value in the range >=425 and <450; the command "Please provide water" is played on the audio output and the email is sent with the predefined image as shown.

Fig-5: Email alert for Command 2

When the accelerometer is moved in another direction and reaches the threshold range >=450 and <475, command 3 is executed: "Hello sir" is played through the audio jack on a speaker or headphones, and the corresponding email is sent with the predefined image as shown below.

Fig-6: Email alert for Command 3

5. CONCLUSIONS

The working of the developed prototype and its design help dumb, deaf and blind people to communicate with people who have other disabilities and with unimpaired people in society. Dumb people communicate with others using their sign language, which some people understand and some do not, and blind people cannot see these gestures; this system is therefore proposed to convert the sign language into voice output and into text with predefined images sent as email to the concerned person, which also serves deaf people. The circuitry is simple as it does not require special hardware, it is easy to implement, it requires little power, and it is simple to use. The system is efficient, reliable and low cost.

REFERENCES

[1] Wolf, C. G., "Can people use gesture commands?", IBM Research Report RC 11867, April 7, 1986.
[2] Sanna K., Juha K., Jani M. and Johan M., "Visualization of Hand Gestures for Pervasive Computing Environments", in Proceedings of the Working Conference on Advanced Visual Interfaces, ACM, Italy, 2006, pp. 480-483.
[3] Juha K., Panu K., Jani M., Sanna K., Giuseppe S., Luca J. and Sergio D. M., "Accelerometer-based gesture control for a design environment", Springer, Finland, 2005.
[4] Jani M., Juha K., Panu K. and Sanna K., "Enabling fast and effortless customization in accelerometer based gesture interaction", in Proceedings of the 3rd International Conference on Mobile and Ubiquitous Multimedia, ACM, Finland, 2004, pp. 25-31.

[5] http://www.telegraph.co.uk/news/uknews/1563076/Elderly-addicted-to-nintendo-wii-atcare-home.html
[6] Malik, S. and Laszlo, J., "Visual Touchpad: A Two-handed Gestural Input Device", in Proceedings of the ACM International Conference on Multimodal Interfaces, 2004, p. 289.
[7] Jia, P. and Huosheng H. Hu, "Head gesture recognition for hands-free control of an intelligent wheelchair", Industrial Robot: An International Journal, Emerald, 2007, pp. 60-68.
[8] WII Nintendo, 2006, http://www.wii.com [Last accessed April 21, 2009].
9. W. K. Edwards and R. E. Grinter, "At home with ubiquitous computing: seven challenges", presented at Ubicomp, Atlanta, USA, 2001.
[9] Divyanshee Mertiya, Ayush Dadhich, Bhaskar Verma and Dipesh Patidar, "A Speaking Module for Deaf and Dumb", Department of Electronics & Communication, Poornima Institute of Engineering and Technology, Jaipur, Rajasthan, India.
[10] Rithesh M Nanda, Harshini H K, Praveen Kuruvadi, Ankhit B V and C Gururaj, "Wireless Gesture Controlled System", Dept. of Telecommunication, BMS College of Engineering, Bangalore, India.
[11] Pooja Dongare, Omkar Kandal Gaonkar, Rohan Kanse and Sarvesh Kukyan, "Innovative Tool for Deaf, Dumb and Blind People", K.C. College of Engineering & Management Studies & Research, Kopri, Thane (E) - 400 603, India.