

itracker
Word Count: 1453 | Apper Contribution: 352
ECE 1778 - Creative Applications for Mobile Devices
For: Professor Jonathan Rose
April 12, 2013
By: Baieruss Trinos (990072292), Babak Basseri (996441845), Jeff Lee (995862883)

Introduction:

One of the most common basic physical examination skills expected of primary care physicians is the examination of eye movements (extraocular movements) and the ability to determine whether those movements are disordered. This examination is performed routinely during periodic/annual health examinations with one's family physician, and also in acute-care settings, such as in the context of an acute stroke or head injury. The extraocular movement examination is performed by asking the patient to follow an object (e.g., the examiner's left index finger) with their eyes, without moving their head, in six directions: left, right, up-left, down-left, up-right, and down-right. These are termed the six cardinal movements, and they provide a gross assessment of the extraocular movements. In this way, the relevant cranial nerves, their associated muscles, and the muscle actions can be assessed for each eye. Any abnormality detected in the coordinated movements of the eyes may suggest an underlying condition, such as a stroke or a brain tumor.

Unfortunately, like many examinations in medicine, the extraocular movement examination is inherently subjective, despite significant interobserver reliability (i.e., most trained physicians will note the same gross abnormalities in movement). The situation is analogous to cardiac auscultation: detecting abnormal heart sounds (murmurs) with a stethoscope is similarly subjective, but those patients are often sent for non-invasive and relatively low-cost objective testing, namely a 2D echocardiogram. For eye movement disorders, by contrast, patients are often referred to one of the few sub-specialists (e.g., Neuro-Ophthalmologists) across Canada, who may decide to perform more specialized, and often time-consuming, testing and investigations.
Accordingly, aside from the inherently subjective physical examination, there are currently no low-cost, non-invasive tools available to objectively evaluate disorders of eye movements. Since the advent of the Apple iPhone in 2007 and the Google Android platform in 2008, innovations in mobile phone hardware, particularly camera technology, have produced a vast array of new tools for healthcare professionals. Our goal, therefore, was to develop a novel application that augments the clinical eye examination, using the mobile phone camera to detect and capture normal and abnormal eye movements easily, accurately, and in a low-cost, non-invasive manner.

Overall Design:

Figure 1. Block diagram for the itracker Android mobile application.

The overall block design is shown in Figure 1. The functionality of each component:

- Tracking Engine: uses OpenCV on Android to detect and track the eyes, using open-source algorithms based on Haar classifiers.
- Voice Engine: uses the Google Android TTS engine to provide computerized voice feedback during each step of testing.
- Database: an SQLite database storing images and image-processing data.
- Diagnosis Engine: passes image-processing data (e.g., eye coordinates) through physician-designed algorithms (pending final implementation) to distinguish normal from abnormal eye movements.
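The report does not specify the database schema, so the following is only a minimal sketch of the kind of tables the Database component describes; the table and column names are assumptions for illustration, and the app itself uses Android's SQLite API rather than Python.

```python
import sqlite3

def create_schema(conn: sqlite3.Connection) -> None:
    """Create minimal tables for captured frames and extracted eye coordinates."""
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS capture (
            id        INTEGER PRIMARY KEY,
            direction TEXT NOT NULL,   -- 'center', 'up', 'down', 'left', 'right'
            image     BLOB             -- raw frame bytes captured for this gaze
        );
        CREATE TABLE IF NOT EXISTS eye_position (
            capture_id INTEGER REFERENCES capture(id),
            eye        TEXT NOT NULL,  -- 'left' or 'right'
            x          REAL NOT NULL,  -- pupil centre in image coordinates
            y          REAL NOT NULL
        );
    """)

# Usage: record one capture and the detected pupil centres for each eye.
conn = sqlite3.connect(":memory:")
create_schema(conn)
conn.execute("INSERT INTO capture (id, direction, image) VALUES (1, 'center', NULL)")
conn.execute("INSERT INTO eye_position VALUES (1, 'left', 212.0, 148.5)")
conn.execute("INSERT INTO eye_position VALUES (1, 'right', 305.0, 149.0)")
rows = conn.execute(
    "SELECT eye, x, y FROM eye_position WHERE capture_id = 1 ORDER BY eye"
).fetchall()
```

Storing the pupil coordinates alongside each captured image is what allows both the Eye Position Review screen and the Diagnosis Engine to work from the same data.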

Functionality and Screenshots:

itracker was designed with four main functions:

1. Eye Detection
2. Eye Movement Testing
3. Eye Position Review
4. Diagnosis Engine

This functionality is demonstrated by the following walkthrough.

Step 1 - Start Tracking: The user is brought to the main screen for eye detection and calibration. The user then holds the mobile phone approximately one foot from the patient's face and asks the patient to look straight ahead into the camera. At this point, the app performs the following detection:

1. The face is surrounded by a green rectangle.
2. Each eye orbit is surrounded by a red rectangle.
3. Each eye is surrounded by a yellow rectangle.
4. Each pupil is marked by a bright point.
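One way the app might gate calibration on these four detections is a readiness check like the following Python sketch; the function name and the exact rule (exactly one face, two of each eye structure) are assumptions for illustration, not the app's actual code.

```python
def calibration_ready(faces, orbits, eyes, pupils):
    """Return True when one face and both eye structures have been detected.

    Each argument is a list of detections from the tracking engine
    (the argument names and counts are illustrative assumptions).
    """
    return (len(faces) == 1 and
            len(orbits) == 2 and
            len(eyes) == 2 and
            len(pupils) == 2)

# One face and both orbits/eyes/pupils found: ready to advance to testing.
print(calibration_ready([1], [1, 2], [1, 2], [1, 2]))  # True
# A missing pupil (e.g., poor lighting) blocks calibration.
print(calibration_ready([1], [1, 2], [1, 2], [1]))     # False
```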

Step 2 - Start Testing: Once the image has been calibrated, the user can tap Next Test to advance the testing. On-screen prompts and a computerized voice instruct the patient to look in four specified directions (up, down, left, and right) to complete each step of the test. Upon completion of testing, the user is brought to the results screen.

Step 3 - Review Results: The results screen allows the user to review the position of each eye, initially and in each of the four specified directions. These images can be reviewed clinically, and can also be downloaded from the mobile phone and stored in the patient's health record.
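The testing sequence above amounts to a simple driver loop: prompt for a direction, then capture the eye positions for that gaze. The following Python sketch illustrates this; `speak` and `capture` are hypothetical stand-ins for the app's TTS voice engine and tracking engine (the app itself is Android/Java).

```python
DIRECTIONS = ["up", "down", "left", "right"]

def run_test_sequence(speak, capture):
    """Drive the four-direction test: voice prompt, then capture eye positions.

    `speak(text)` voices a prompt; `capture(direction)` returns the tracked
    eye data for that gaze. Both callables are assumptions for illustration.
    """
    results = {}
    for direction in DIRECTIONS:
        speak("Please look " + direction)
        results[direction] = capture(direction)
    return results

# Usage with dummy engines: collect the prompts and fake eye positions.
prompts = []
results = run_test_sequence(
    prompts.append,
    lambda d: {"left_eye": (0, 0), "right_eye": (0, 0)},
)
```

On the device, `capture` would store each frame and its pupil coordinates to the SQLite database so the results screen can replay them.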

Step 4 - Diagnosis: In the background, the diagnostic engine performs an analysis of each eye's position. A provisional diagnosis is then provided to the user, along with additional diagnostic considerations that the clinician can review further.

Functionality Issues and Limitations:

Eye Detection: The app can generally detect eyes/pupils with high accuracy under ideal environmental conditions, including a stationary/seated patient and adequate lighting. In less ideal conditions the accuracy is quite variable, and in some cases only one, or neither, eye/pupil is detected. More optimized eye detection algorithms may be available that could improve detection accuracy; this will require ongoing research by our team.

Diagnosis: The app is currently quite limited in diagnostic capacity and provides few, if any, accurate diagnoses at this stage. Limited algorithms have been developed that can provide a normal diagnosis, plus four additional diagnoses for each eye, namely abnormal movement in one of the four linear directions. However, extracting accurate positional data from the eye/pupil detection proved a far more challenging endeavor than our programmers had anticipated, particularly because of our limited knowledge of OpenCV. This is further complicated by the currently limited accuracy of eye/pupil detection, as noted previously. We will continue to explore OpenCV to determine how the positional data of each eye/pupil can be extracted and mapped, enabling the algorithms to provide a provisional diagnosis.
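Since the physician-designed algorithms are pending implementation, the following is only a Python sketch of the kind of positional check described above: compare each eye's pupil position in a commanded gaze direction against its straight-ahead position, and flag directions where the excursion along the expected axis is too small. The threshold, function name, and coordinate convention are illustrative assumptions, not the report's final algorithm.

```python
# Expected unit displacement of the pupil for each commanded gaze direction,
# in image coordinates (y grows downward). The threshold is an assumption.
EXPECTED = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}
MIN_EXCURSION_PX = 10.0

def assess_eye(center, positions):
    """Flag directions where the pupil barely moved from its centre position.

    `center` is the pupil (x, y) looking straight ahead; `positions` maps
    each direction to the pupil (x, y) captured for that gaze.
    """
    abnormal = []
    for direction, (dx_exp, dy_exp) in EXPECTED.items():
        x, y = positions[direction]
        # Project the actual displacement onto the expected direction.
        excursion = (x - center[0]) * dx_exp + (y - center[1]) * dy_exp
        if excursion < MIN_EXCURSION_PX:
            abnormal.append(direction)
    return abnormal

# Usage: normal excursions except a weak rightward gaze of only 1 pixel.
center = (100.0, 100.0)
positions = {"up": (100.0, 80.0), "down": (100.0, 118.0),
             "left": (82.0, 100.0), "right": (101.0, 100.0)}
print(assess_eye(center, positions))  # ['right']
```

Running this per eye would yield the "normal" diagnosis plus the four per-direction abnormal diagnoses the report describes; the hard part, as noted above, is obtaining reliable pupil coordinates from OpenCV in the first place.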

Key Learning:

- Baieruss: learned to convey field-specific medical knowledge to non-healthcare professionals; learned to design algorithms for eye disorder detection; learned the basics of OpenCV.
- Babak: learned image processing under OpenCV; learned Android development for the first time; learned eye anatomy, structure, and function.
- Jeff: learned the basics of OpenCV; learned Android UI design; learned eye anatomy, structure, and function.

Group Member Contributions:

- Baieruss: team leader (organized team meetings and collaborative efforts); construction of presentations for each spiral; construction of the initial project reports and the final report; design of eye detection algorithms (pending implementation); input on the user interface; discussions with clinical specialists on (eventual) testing of the app on patients.
- Babak: voice engine; OpenCV image capture code; database design; display of the final results page.
- Jeff: user interface design; set up the common source code repository (GitHub); set up the initial source code using inputs from several open-source code repositories.

Apper Context:

Whether on Android or iOS, there has been an ongoing paucity of medical applications, particularly those using the mobile phone as a diagnostic aid or device. As of April 10, 2013, not a single top-50 application in the Medical section of the Google Play store uses a single sensor on the phone! Our app was therefore conceived to bridge this gap and to further the use of mobile phones in clinical diagnostics.

As a primary care physician practicing in outpatient, inpatient, and emergency department settings, I have observed that the use of mobile phones as diagnostic tools has so far been limited to non-existent. Additionally, as a practitioner in downtown Toronto, which has the highest concentration of medical resources anywhere in Canada, I recognize that these resources are not uniformly available to all Canadians, especially those living in more rural centres. Not only is access to subspecialists limited, but so too is access to specialized diagnostic equipment. From a research perspective, itracker will have broad clinical and research applications:

Enhanced Collaboration: We view itracker as a first step in using widely available mobile phone technology as a clinical aid and diagnostic tool, especially where access to highly specialized diagnostic equipment and sub-specialists (e.g., Neuro-Ophthalmologists) is limited. itracker can potentially enhance collaboration not only between primary care physicians and subspecialists, but also among physicians working in different geographic regions across Canada. The consultative process can be enhanced by sending images of potential eye movement disorders, a practice I have yet to encounter.
Adding Objectivity to the Clinical Exam: itracker gives both primary care providers and subspecialists another tool to enhance the objectivity of detecting eye movement disorders, and to capture and store the findings for reference and further analytics.

Other Applications of Eye Detection: Eye detection has broad applications in other clinical areas, such as psychiatry. For example, children with autism are well described as exhibiting abnormal behaviours, including poor eye contact. itracker may be able to provide further insight into the patterns of eye movements in these children, potentially detecting abnormalities earlier.

Future Work:

- Enhance Accuracy and Performance: We are endeavoring to enhance the accuracy and performance of itracker, including evaluating and testing other eye detection algorithms. Additionally, while we generally used some of the fastest available Android phones (e.g., Samsung Galaxy S3 and Google Nexus 4), benchmarks on other devices will help assess performance issues.
- Additional Capture Directions and Diagnoses: We aim to add the remaining capture directions (up-left, down-left, up-right, down-right) so that all the directions visualized in the clinical examination can be captured; video capture may also be a future feature. The additional capture directions will in turn enable additional potential diagnoses.
- Clinical Validation: Clinical testing on patients both with and without eye movement disorders will be required to improve application performance and accuracy. Plans to test itracker at a high-volume Neuro-Ophthalmology clinic are currently underway.
- Use in Low-Resource Areas: By making our app publicly available, we are aiming for clinical adoption, especially in areas with limited healthcare resources and limited access to specialists, and we may potentially add means of contacting and collaborating with specialists across Canada.
- Google Glass: We would like to explore options for running itracker on the Google Glass platform. Ideally, clinicians could perform a routine extraocular movement examination while the Google Glass camera tracks the eyes; analysis could then take place on the mobile device or potentially in the cloud.
- Marketing/Entrepreneurship: We are not currently interested in having a business school class on marketing/entrepreneurship take up our project. Our goal is to make the project open source and publicly available for students, researchers, and healthcare professionals to modify, use, and enhance.
Source Code: We plan to release our source code to the public once we have completed the remaining basic functionality we intend to achieve. We anticipate having this functionality ready within the next 30 to 60 days, at which point we will make a link to download the source code publicly available.

Essential References:

Bradski, Gary, and Adrian Kaehler. 2013. Learning OpenCV. O'Reilly. A key reference for learning OpenCV, and computer vision in general; we consider it essential material for any programmer wishing to use OpenCV.

Corco, Chet, and Garrett Menghini. 2012. Learning OpenCV with Xcode. Accessed April 2013. https://sites.google.com/site/learningopencv1/. While not specifically geared towards Android development, it provided essential code samples and learning.

GitHub. 2013. GitHub. Accessed April 2013. https://github.com/. Essential website/utility for maintaining a common code repository.

Hosek, Roman. 2013. Android eye detection and tracking with OpenCV. Accessed April 2013. http://romanhosek.cz/android-eye-detection-and-tracking-with-opencv/. Essential code samples for eye detection and overall learning.

OpenCV. 2013. OpenCV - Android. Accessed April 2013. http://opencv.org/platforms/android.html.

OpenCV. 2013. OpenCV4Android Samples. Accessed April 2013. http://opencv.org/platforms/android/opencv4android-samples.html.

Stack Overflow. 2013. Stack Overflow. Accessed April 2013. http://stackoverflow.com/. A general reference for code-related issues, as other users had often encountered problems similar to ours.