ASL Gesture Recognition Using a Leap Motion Controller

Carleton University COMP 4905

Martin Gingras

Dr. Dwight Deugo

Wednesday, July 22, 2015

Abstract

Individuals who rely on American Sign Language as their primary means of communication face a language barrier unlike any other. ASL users, no matter how much time they dedicate to learning the language, cannot train everyone they interact with to understand them. The recent boom of new technology, in particular the Leap Motion controller studied here, an IR sensor capable of detecting hands within its field of view, represents real potential to address this issue. Furthermore, the miniaturization of computers has made a portable solution built around this technology feasible. This research attempts to find attributes captured by the Leap Motion controller that can distinguish individual hand gestures, in order to demonstrate the potential for parsing and presenting a real-time view of the gestures being signed. The process improved my understanding of the Leap Motion API, how to communicate with it, and how to create an application powered by that data. The application that resulted from this study, Leap ASL, showcases a use case for this technology as well as the potential available within this domain.

Acknowledgments

First and foremost I would like to acknowledge the team at Leap Motion for developing the controller used in this study, without which it would have been impossible. Their advances in detecting objects in 3D space using infrared sensors show great promise and made my experiments possible. As their project advances, boundless possibilities will be exposed. Furthermore, their contributions to the open source community surrounding their technology significantly simplify the invention of new applications; without them, development in this field would be prohibitively difficult. I would like to thank Dr. Dwight Deugo for agreeing to be my supervisor for this research. His assistance in selecting an initial topic and his encouragement to maintain a narrow focus proved invaluable. By keeping the scope of this investigation narrow I was able to achieve everything I had intended and see some intriguing results. Lastly I would like to thank the entire Computer Science department at Carleton University for the last five years of instruction and guidance that allowed me to grow both personally and professionally into the developer I am today.

Table of Contents

Abstract
Acknowledgments
Table of Contents
List of Figures
Introduction
American Sign Language
Leap Motion Controller
Leap Motion Developer Community
Leap ASL
Leap ASL Capture Loop
Leap ASL Hand and Finger Objects
Approach
Results
Future work
Conclusion
References

List of Figures

Figure 1 - The letters A and S in American Sign Language
Figure 2 - Leap ASL in action showing a hand being recorded by the Leap Motion controller
Figure 3 - The prompt to record a letter
Figure 4 - The recorded letter and hash associated with it
Figure 5 - The letter detected and displayed based on the hand's positioning

Introduction

Hearing and speaking impairments, and the reliance on sign language for communication, create an immense barrier for many individuals. ASL users are not able to teach everyone they encounter how to sign, meaning that, unlike other language barriers, there is no way to improve one's ability to convey a message. With the immense advances in technology in recent decades, this is a problem where it should be possible to make meaningful improvements and impact millions of people's lives. The motivation behind this investigation is to discover whether tools available today have the potential to make headway in this field. Since the invention of the computer, and in particular the advent of the personal computer, dozens of input devices have been experimented with. 1 However, the keyboard and mouse combination has remained the predominant tool. This has proven to be very functional for stationary individuals focused on productivity, but when it comes to mobility, this combination falls short. Recent research and development in this problem domain have resulted in new technologies for individuals to interact with their devices. Leap Motion is one company that took an active interest in this topic and invented the Leap Motion Controller, which may be the next generation of interacting with computers. The Leap Motion controller uses infrared sensors to detect objects and items directly in front of it. 2 So far most applications developed with this technology have focused on the gamification of the interface or the creation of

productivity tools. 3 When analyzing this tool, however, it was observed that another interesting application was possible. By detecting and analyzing hands in the field of view, their position, pitch, and direction, it may be possible to match them to American Sign Language hand positions and gestures. If it were possible to interpret these hand positions, it may be possible to provide a tool to interpret and translate them in real time. The goal of this investigation is to discover whether the Leap Motion controller can detect hand positioning accurately and precisely enough that its promise and shortcomings are exposed. The resulting project should be simple and straightforward to use and clearly demonstrate the capabilities of the matching algorithm used to detect signs.

American Sign Language

American Sign Language is the predominant sign language of the deaf communities in the United States and Canada. It is composed of both static and dynamic gestures that can involve the hands, face, and torso in order to convey either a letter or a word. 4 This introduces an extraordinary level of complexity when trying to interpret letters and words in real time. In order to keep the scope narrow, this paper will generally ignore signs

requiring any sort of motion, since that would require aggregating multiple frames of hand motion captured accurately, which proved to be a non-trivial issue to overcome. Focusing purely on static gestures proved to be very difficult on its own. Due to the subtle nuances of the language, as depicted in Figure 1 (the letters A and S in American Sign Language), the slight change in the position of the thumb is difficult for a digital device to detect. This issue is quite common in American Sign Language. There is a plethora of words to communicate, so these small nuances, noticeable by a human user, are quite common, yet they present a massive barrier for software attempting to interpret the user's intentions. Even among ASL users, variations between dialects of signs are found around the world. 5 This adds an additional barrier for users since they cannot reliably expect to be able to communicate even with those versed in the same communication medium. By creating a common tool to translate ASL it would inevitably encourage a

common set of hand positions and gestures. This uniformity would be an unintended benefit of the creation of a real-time interpreter of ASL.

Leap Motion Controller

The Leap Motion Controller is a USB peripheral with two monochromatic IR cameras and three infrared LEDs, designed to be used face up on a desk. 6 The device sends hundreds of frames per second through the USB cable to a host computer, where Leap Motion's software converts the two-dimensional images, covering a roughly three-and-a-quarter-foot hemispherical area, into three-dimensional positional data. 7 This allows objects in the three-dimensional space near your computer to be converted into numeric representations, which in turn allows mathematical operations to be performed on them. The Leap Motion controller can detect hands, and items held in hands, within its field of view, and based on these detections it publishes additional details about what it sees to improve the experience of developing with this tool. The data is published through the Leap Motion application programming interface, or API. The API allows developers to communicate with the controller and consume the interpreted data published by Leap Motion's software. 8 Additionally, the Leap Motion team has created

software developer kits, or SDKs, in multiple programming languages that abstract the three-dimensional view the controller consumes and present it in a well-documented set of models for developers and their applications to interact with. For the Leap ASL tool I used the JavaScript SDK. The reason for this decision is primarily the portability of JavaScript. JavaScript can be run in almost every web browser in the world, as well as on the server side in the form of NodeJS. This portability made it very appealing, since no matter how future iterations of this application progress, the initial investigative efforts of this study will still be applicable. Early stages of this project required a large amount of time experimenting with configuring and connecting to the Leap Motion controller. Once configured, communicating with the API and experimenting with Leap Motion's various SDKs to determine what data and features were available was another area that required time and energy. Early experiments had little to do with my final goal and served only to familiarize myself with the tool.

Leap Motion Developer Community

Leap Motion has, since day one, nurtured a strong developer community. They have done so by open sourcing a large number of sample as well as full applications. 9 These sample

applications provide a phenomenal resource to iterate upon quickly. Using these resources I was able to quickly add a graphical display of what the Leap Motion controller was detecting to my application without dedicating a significant amount of time to this ancillary activity. 10 The large quantity of samples and documentation drastically improved the stage of this investigation that involved becoming familiar with the Leap Motion controller, its API, and the SDKs used to interact with it. One of the goals of this project was to in turn contribute back to the Leap Motion community and potentially provide a resource for others to use in their own applications. The code for my application will be published on GitHub where it can be easily accessed, forked, and improved upon by others.

Leap ASL

Leap ASL is the application developed to investigate the opportunity I perceived in this latest technology and the societal benefits it could theoretically represent. It is a simple stateless web app that relies purely on client-side code, rather than relying on a back end server. The application interfaces with the Leap Motion controller through a JavaScript API using WebSockets to communicate at a phenomenal rate. It pulls the interpreted three-dimensional data perceived by the controller, allowing both visualization and manipulation. One of the open source solutions from the Leap Motion community, leapjs-rigged-

hand, graphically displays a visualization of the three-dimensional interpretation of the user's hand. This visual feedback allows users to clearly see the information the application will capture when recording hand positions. The design of the web app is intentionally simple to emphasize function over form for this iteration. A screenshot of the application in use is depicted in Figure 2 below.

Figure 2 - Leap ASL in action showing a hand being recorded by the Leap Motion controller

Leap ASL Capture Loop

Leap Motion's proprietary software broadcasts the current state and view from the controller to all subscribed JavaScript listeners over the WebSocket protocol. This loop is accessible by connecting a callback to the Leap object's loop property. The callback

then gets called with a Frame object as a parameter every time the proprietary software running on the local machine broadcasts. Leap ASL attaches a callback to the loop property, captures the incoming frames, and interprets the data contained therein. The Frame object is an abstraction containing information regarding the current state of the area surrounding the controller. It contains a wealth of information, including a history of objects detected by the controller, recognizable gestures being performed, translations that objects in the frame have performed since a specified frame ID in the past, and, most importantly for this application, any hand objects detected within its field of view.

Leap ASL Hand and Finger Objects

The hand object passed by Leap Motion's API contains yet another load of data to be interpreted. Leap Motion detects a large set of properties about the position and motion of hands in its view. These properties include information such as whether it is the left or right hand, the hand's grab strength, the hand's velocity, and many more. The properties that were most important to this study were the hand's pitch, roll, and yaw. These values were added in later iterations of the program to improve the ability to distinguish between hand positions that are similar but facing different directions. The initial approach used to try to distinguish between hand positions relied on the software's interpretation of the fingers.
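To make the data flow concrete, the sketch below shows roughly how a callback attached to the frame loop can read the hand and finger properties described above. It is a minimal sketch assuming the leapjs JavaScript SDK (Leap.loop, hand.pitch/roll/yaw, hand.fingers, finger.direction, finger.tipPosition); the snapshot object and its use in matching are illustrative only, not the project's actual code.

// Minimal sketch of a Leap ASL style capture loop, assuming the leapjs SDK.
// Leap.loop invokes the callback with a Frame object on every broadcast from
// the Leap Motion service (delivered over WebSockets).
Leap.loop(function (frame) {
  frame.hands.forEach(function (hand) {
    // Palm orientation in radians.
    var pitch = hand.pitch();
    var roll = hand.roll();
    var yaw = hand.yaw();

    // Each finger exposes a direction vector and a tip position ([x, y, z] in mm).
    var fingers = hand.fingers.map(function (finger) {
      return { direction: finger.direction, tip: finger.tipPosition };
    });

    // Hypothetical snapshot: the kind of per-frame record Leap ASL compares
    // against previously recorded letters (see the Approach section).
    var snapshot = { pitch: pitch, roll: roll, yaw: yaw, fingers: fingers };
    console.log(snapshot);
  });
});

In the browser this assumes the leapjs script is loaded and the Leap Motion service is running on the local machine.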

From the fingers you can pull data such as the bones and joints that compose them, the direction they are pointing, the position of the tip, and other such data. By iterating over the fingers and combining their data, you can essentially get a snapshot of a three-dimensional view of the hand and fingers at a given moment. This concept is fundamental to being able to later compare a hand position to a prior one.

Approach

For the first iteration of Leap ASL I built a simple capture mechanism that, when the user pressed the spacebar, would prompt the user to enter a letter. The application would then store the whole hand object, with the associated letter, in an array. When the user pressed another key, the program would iterate over the array of previously recorded letters and compare the current hand state to each one.

Figure 3 - The prompt to record a letter

While this approach has potential in theory, the reality is that the granularity, in terms of decimal places recorded in all the properties of a hand, was far too precise to ever match when recreating a hand position. Furthermore, having to press a button to compare the current hand position to recorded positions increased the length of time in

the feedback loop drastically, introducing further difficulty in matching a recorded hand position. Resolving the issue of the slow feedback loop had a cascading benefit of also addressing the other issue discovered. By hooking into the callback loop triggered by the Leap Motion API on every frame recorded, Leap ASL could automatically take any hand objects in the controller's field of view and compare them against recorded values. At this point the inefficiency of iterating over an array became noticeable, as the frame rate of the rendered view of the hand deteriorated. The solution discovered was to use a hash table to store the recorded values, along with a hashing function that would create a hash from the hand and finger data. This hash could be used both to store a snapshot of the hand and to look it up very quickly, since hash table lookups are extremely fast. Using a hash table required that key properties of the hand and finger objects be selected. When storing properties of the hand and fingers it was necessary to round to a single decimal place because of the granularity that the Leap Motion controller publishes.

Figure 4 - The recorded letter and hash associated with it
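As a rough illustration of the rounding and hashing idea, the sketch below rounds each selected property to one decimal place and joins the values into a string key for a plain object used as a hash table. The choice of properties, the key format, and the recordedLetters store are assumptions for illustration only, not the actual Leap ASL code.

// Illustrative sketch only: build a coarse hash key from a hand by rounding
// each selected property to one decimal place, then store/look up letters
// in a plain object acting as a hash table.
function roundTo(value, places) {
  var factor = Math.pow(10, places);
  return Math.round(value * factor) / factor;
}

function handKey(hand) {
  // Assumed property selection: palm orientation plus finger direction vectors.
  var parts = [roundTo(hand.pitch(), 1), roundTo(hand.roll(), 1), roundTo(hand.yaw(), 1)];
  hand.fingers.forEach(function (finger) {
    finger.direction.forEach(function (component) {
      parts.push(roundTo(component, 1));
    });
  });
  return parts.join('|');
}

var recordedLetters = {};               // key -> letter, e.g. recordedLetters[key] = 'A'

function recordLetter(hand, letter) {   // called when the user presses the spacebar
  recordedLetters[handKey(hand)] = letter;
}

function lookupLetter(hand) {           // O(1) lookup on every incoming frame
  return recordedLetters[handKey(hand)];
}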

Results

The second approach worked much better. Successive attempts at previously recorded positions were recognized by the application. Yet the granularity of the data on a hand position was still so fine that it was challenging to get results. In order to address this issue, a fudge factor was introduced. This function compared the properties of two hashes, and as long as the overall difference between values was less than x and no single value was more than y off its recorded value, it was considered a match. By modifying x and y it was possible to tweak the required accuracy of the matching algorithm significantly. Additional improvements came from experimenting with the controller's orientation properties. Placing it on its side and enabling head-mounted display mode improved the analysis of the hand and finger positions because the user's wrist would not obscure them. As Leap Motion's software for hand and finger detection improves, this application would inherently benefit from all of the effort and development put into the SDK.

Figure 5 - The letter detected and displayed based on the hand's positioning
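The fudge factor described above could look roughly like the following comparison, where maxTotal plays the role of x and maxSingle the role of y. The key format is carried over from the earlier hashing sketch, and the whole function is a hypothetical reconstruction rather than the project's actual code.

// Hypothetical sketch of the "fudge factor": two keys (as produced by handKey
// above) match when no single component differs by more than maxSingle (y)
// and the summed difference across all components stays under maxTotal (x).
function fuzzyMatch(keyA, keyB, maxTotal, maxSingle) {
  var a = keyA.split('|').map(Number);
  var b = keyB.split('|').map(Number);
  if (a.length !== b.length) return false;

  var total = 0;
  for (var i = 0; i < a.length; i++) {
    var diff = Math.abs(a[i] - b[i]);
    if (diff > maxSingle) return false;  // one property is too far off
    total += diff;
  }
  return total <= maxTotal;              // overall difference within tolerance
}

Loosening maxTotal and maxSingle makes matching more forgiving; tightening them requires the signer to reproduce the recorded position more exactly.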

The results are positive and indicative of the potential for incredible advances within this domain. Though the scope of the investigation was quite narrow, that narrowness allowed rapid progress that would not have been possible when focusing on the broader, more complex implications of this issue. The issues that would have had to be addressed if the scope were allowed to creep would have required significant investigative effort and development time, unnecessary for this early proof of concept.

Future work

A fundamental issue with the proposed vision is that individuals are different and sign language is partially subjective, so one dataset for everyone is ineffective. In order to avoid issues raised by differences between individuals, a possible approach would be to have a learning stage for the application where the user repeats a set of gestures to calibrate the software to their specific signing. Additionally, by adding two buttons that indicate whether the interpreted word or gesture is correct, a feedback loop is introduced where the application can learn from its successes and failures. This learning would improve the platform for everyone using it. Gestures represent another area of potential problems, because the hand positions throughout a gesture would have to be both recorded and retrieved later. One approach would be to store each frame in a gesture as a hand position, but flag it as part of a gesture and keep a pointer to the hand position before it.

Then, if a hand position is part of a gesture, you add it to a stack of frames representing the ongoing gesture and, as the gesture completes, compare the frames in its stack to the recorded objects. You would then have to decide on a confidence threshold for how similar individual frames have to be to recorded ones, how many frames can be wrong, and by how much, while still considering the gesture valid, and how many frames can be missing due to inconsistencies in how frequently or at what part of the gesture a frame is recorded. The successes of this experiment beg the question of where this sort of application could go. The requirements to run the application are very small, needing only a basic computer with a USB port and the Leap Motion software running on it. Paired with a client-server architecture where the heavy computing is offloaded to a remote server, the components could essentially be miniaturized, creating a tool to parse gestures in real time as a user signs. Aggregating those results, rendering them through text-to-speech software, and playing the parsed letters could essentially provide a translator for individuals who can only sign to communicate with others who have not learned to sign. Furthermore, by adding a microphone to this device, it could listen to and interpret the spoken word and display a video of the sign language equivalent on an attached screen. This would allow bi-directional communication for both mute and deaf individuals in real time. Improving the hand position analysis would be possible by focusing on individual letters, words, and phrases and discovering which key attributes truly matter for them, whether it is only some fingers being a certain way or the hand direction being exactly right; by

narrowing the number of attributes to a smaller subset, matching would become easier while the accuracy of the matches would likely improve. As improvements are made to how hand positions and gestures are interpreted, they could be A/B tested on subsets of the user base and their relative improvements quantified.

Conclusion

The use of the Leap Motion Controller to detect hand position in 3D space and interpret it as an ASL gesture shows immense promise. This success on a small scale could be replicated and drastically improved upon by a larger effort. The potential it represents, to change so many people's lives, would be truly incredible if realized. To be able to translate gestures in real time to voice, and then have verbal responses played back as a video of an individual signing, would inevitably change the way deaf and mute individuals are able to interact with everyone else. This investigation achieved everything it set out to do and lays the groundwork for others to delve deeper into the possibilities that this idea presents.

References

Garbani, L., Siegwart, R., & Pradalier, C. (2011, December 1). History of Computer Pointing Input Devices - ETH Z. Retrieved August 16, 2015.

Leap Motion SDK. (n.d.). Retrieved August 16, 2015, from developer.leapmotion.com/

Leap Motion. (n.d.). Retrieved August 16, 2015.

Leap Motion: 3D hands-free motion control, unbound. (n.d.). Retrieved August 16, 2015.

Leapmotion/leapjs-rigged-hand. (n.d.). Retrieved August 16, 2015, from github.com/leapmotion/leapjs-rigged-hand

Weichert, F., Bachmann, D., Rudak, B., & Fisseler, D. (n.d.). Analysis of the Accuracy and Robustness of the Leap Motion Controller. Sensors.
