Hand Eye Coordination Patterns in Target Selection
Barton A. Smith, Janet Ho, Wendy Ark, and Shumin Zhai
IBM Almaden Research Center, 650 Harry Road, San Jose, CA 95120 USA
{basmith,

ABSTRACT
In this paper, we describe the use of eye tracking and trajectory analysis in testing the performance of input devices for cursor control in Graphical User Interfaces (GUIs). By closely studying the behavior of test subjects performing pointing tasks, we can gain a more detailed understanding of the device design factors that may influence overall performance with these devices. Our results show there are many patterns of hand eye coordination at the computer interface which differ from the patterns found in direct hand pointing at physical targets (Byrne, Anderson, Douglass, & Matessa, 1999).

Keywords
Eye tracking, gaze, hand eye coordination, pointing, target selection, mouse, touchpad, pointing stick, motor control

1. INTRODUCTION
Human computer interface research has traditionally focused on performance. A typical topic of such a nature is computer input. Input devices and techniques are usually tested against a set of standard tasks in which users' performance, in terms of task completion time and error rate, is measured and analyzed (e.g. Card, English, & Burr, 1978). The results of the performance analysis serve as the basis for refinement and redesign of the devices and techniques. However, observations at the performance level often overlook important information on how users actually accomplish the task, which may offer additional insights toward a better understanding of the interaction process and design solutions. As the field matures, process oriented research should begin to contribute to the understanding of interaction. In human motor control research, the study of the micro-structure of movement has served similar purposes (Jagacinski, Repperger, Moran, Ward, & Glass, 1980).
In order to understand the usability of various 6 degree-of-freedom (DOF) devices, Zhai and Milgram (Zhai & Milgram, 1998) recently studied both the performance and the trajectory of 6 DOF manipulations of 3D objects. Their trajectory analysis revealed critical differences between devices in terms of coordination, which would not be found in time performance data alone. In conjunction with trajectory analysis, eye tracking provides a comprehensive approach to studying interaction processes. In the field of HCI, eye tracking has helped to improve the understanding of how users search and select menu items (Card, 1982), (Aaltonen, Hyrskykari, & Raiha, 1998), (Byrne et al., 1999). Eye tracking has also been used in studying human pointing tasks with hands in the physical world. Given that most motor control movement is either initiated or guided by visual perception, it is necessary to understand how eye movement relates to hand movement. For example, Helsen and colleagues studied the temporal and spatial coupling of gaze and hand movement (Helsen, Elliott, Starkes, & Ricker, 1998). In a reciprocal pointing task with two fixed targets, they found a rather invariant pattern of hand eye movement relationship: eye movement tended to be initiated 70 ms earlier than hand movement; the eye typically makes two saccades to land on the target, and the first saccade tended to undershoot. The pattern of task termination was also very consistent: eye gaze stabilizes on the target at 50% of the total hand response time. Pointing on a computer screen with an input device may or may not follow the patterns found in pointing with the hand at physical targets. There are many reasons to expect different behavior, due to the various disparities between the hand motion and the cursor motion (Wang & MacKenzie, 1999). For example, direct hand pointing is carried out with proprioceptive feedback of hand position in the human arm. Pointing at graphical objects on a screen is carried out with a cursor, which does not have a direct, absolute mapping with hand motion.
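This distinction between absolute and relative mappings can be made concrete with a small sketch. The function names and the gain value are our own illustration, not anything from the paper:

```python
# Illustrative sketch (not from the paper): absolute vs. relative pointing maps.

def absolute_map(hand_pos, scale=1.0):
    """Direct pointing: each hand position maps to exactly one screen position."""
    return (hand_pos[0] * scale, hand_pos[1] * scale)

def relative_map(cursor_pos, hand_delta, gain=2.0):
    """Mouse-style pointing: only hand *displacement* matters. The same hand
    position can correspond to any cursor position, so proprioception alone
    cannot tell the user where the cursor is."""
    return (cursor_pos[0] + gain * hand_delta[0],
            cursor_pos[1] + gain * hand_delta[1])

cursor = (100.0, 100.0)
for delta in [(5, 0), (5, 0), (-3, 2)]:   # a sequence of mouse displacements
    cursor = relative_map(cursor, delta)
print(cursor)  # (114.0, 104.0)
```

Because only displacements enter `relative_map`, the hand's absolute position carries no information about the cursor's location; this is why vision, rather than proprioception, must supply cursor position feedback.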
This means that the user may have to sample the cursor location with gaze in the course of a pointing trial, unless the cursor motion can be perceived by peripheral vision. The mapping between cursor motion and input device motion is often a complex transfer function, which may further increase the complexity of the hand eye relationship in target acquisition tasks with a computer. In the case of computer mice, most are power mice with non-linear acceleration schemes. More precisely, the control gain in a power mouse is not a constant, but depends on the speed of the mouse motion. Faster movement of the mouse results in higher control gain. In the case of a pointing stick such as the TrackPoint, the input force is mapped onto cursor velocity by a complex transfer function (Rutledge & Selker), with various plateaus to provide a speed easier for the eye to follow. Detailed study of gaze patterns is surely useful for further refining the transfer functions in these devices. (TrackPoint is a trademark of the International Business Machines Corporation.)
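As a rough illustration of the two kinds of transfer function described above, the sketch below implements a speed-dependent gain and a plateaued force-to-velocity map. The breakpoints, gains, and plateau values here are invented for illustration; they are not the actual IBM transfer functions:

```python
# Illustrative transfer functions (all parameter values are invented).

def power_mouse_gain(speed, base_gain=1.0, accel=0.5, max_gain=4.0):
    """Speed-dependent control gain: faster device motion -> higher gain,
    capped so the gain does not grow without bound."""
    return min(base_gain + accel * speed, max_gain)

def trackpoint_velocity(force, plateaus=((0.2, 0.0), (0.5, 40.0), (1.0, 200.0))):
    """Piecewise-constant force-to-velocity map with plateaus, so that cursor
    speed stays steady (and easy for the eye to follow) over a range of
    forces. `force` is normalized to [0, 1]; velocity is in pixels/s."""
    velocity = 0.0
    for threshold, v in plateaus:
        if force >= threshold:
            velocity = v
    return velocity
```

With a map like this, light presses below the first threshold move nothing (a dead zone), and a wide band of medium forces all produce the same moderate cursor speed.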
In the case of a small touchpad, often seen in laptop computers, multiple strokes often have to be made in order to move the cursor to a distant target. Does this mean the user has to gaze at the cursor in order to make each stroke? In summary, understanding the eye hand relationship serves as an important foundation for understanding and designing input methods. Study has shown invariant hand-eye coordination patterns in direct hand pointing tasks (Helsen et al., 1998). The disparities between hand and cursor motion in a GUI suggest possibly much more complex hand eye behavior in computer target acquisition tasks, such as occasional gaze switches between cursor and target, or gaze focus on the cursor. This study makes an initial attempt to test and understand hand eye coordination patterns at the computer interface.

2. METHODS
2.1 Participants
24 volunteers participated in the experiment. All participants had normal or corrected-to-normal vision. All were right-handed and were experienced computer users on the Windows platform with at least three years of continuous usage. Half of the participants had experience with the pointing stick. None of the participants had experience with the touchpad.

2.2 Experimental Design
The participants were required to perform two different tasks with three input devices. The three input devices in question were mouse, touchpad, and pointing stick. Each participant switched to another input device after performing the two different tasks. In total, each participant performed six tasks. The order in which they used the input devices and the order of the two tasks were randomly counter-balanced. Task one was a reciprocal pointing task. A pair of identically sized circular targets was placed diametrically around the center of the computer monitor at specified distances and directions. Targets were presented with all possible combinations of the following: distance, radius, and angle. The center-to-center distances were 200, 400, and 600 pixels. The radii were 10, 20, and 30 pixels.
The angles (from horizontal) were -45, -30, 0, 30, and 45 degrees. Each target pair was used for two trials, resulting in a total of 90 trials for this task. The participants were to look at the monitor screen and use the input device to point at and alternately select, by clicking, the presented target circles. Task two was a random pointing task. The participants pointed at and clicked on a set of randomly distributed circular targets presented sequentially on the monitor screen. The targets had radii of 10, 20, and 30 pixels in random order. Each participant was also required to complete ninety trials of this task. Task 1 is most common in Fitts' law based input device research. Task 2 is closer to what a user typically does by pointing on a computer screen. The key difference between the two types of tasks lies in the predictability of the target. For Task 1, the first click on a pair of targets started the actual data collection for that pair of targets. For the following two measured trials, the participant already knew where the next target was. For Task 2, the participant could not predict where a target would appear until the trial actually began. This difference may influence the hand-eye coordination pattern. Each subject received exactly the same set of targets. The target generation program used was a Java application called IDTest, available online. The IDTest program ran on a Pentium-based 167 MHz computer. The targets were displayed at a resolution of 1024 by 768 pixels in high color (16 bits) on an IBM P21 monitor, using an ATI 3D PRO Turbo PC2TV video card. The viewable area of the screen was .365 m horizontal by .28 m vertical at a distance of .64 m from the eye. The screen refresh rate was 90 Hz. Three input devices were used in this experiment: an IBM mouse (model 12J3618), a Cirque SmartCat touchpad, and an IBM TrackPoint pointing stick in a desktop keyboard.
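The Task 1 target set (3 distances x 3 radii x 5 angles, two trials each) can be sketched as follows. The screen center and the placement code are our own illustration, not IDTest itself:

```python
# Sketch of the Task 1 target set: every combination of distance, radius,
# and angle, two trials each (parameter values as given in the Methods).
import math
from itertools import product

distances = [200, 400, 600]          # center-to-center distance, pixels
radii = [10, 20, 30]                 # target radius, pixels
angles = [-45, -30, 0, 30, 45]       # degrees from horizontal

def target_pair(distance, radius, angle_deg, center=(512, 384)):
    """Place two identical circles diametrically around the screen center."""
    dx = 0.5 * distance * math.cos(math.radians(angle_deg))
    dy = 0.5 * distance * math.sin(math.radians(angle_deg))
    return ((center[0] - dx, center[1] - dy, radius),
            (center[0] + dx, center[1] + dy, radius))

pairs = list(product(distances, radii, angles))          # 45 conditions
trials = [target_pair(*p) for p in pairs for _ in range(2)]  # 2 trials each
print(len(trials))  # 90
```

The count works out to the ninety trials per task stated above: 3 x 3 x 5 = 45 conditions, each presented twice.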
2.3 Eye-tracking System
Eye gaze position was tracked by an Applied Science Laboratories (ASL) Model 504 eye tracker. The tracking software ran on a Pentium computer. This unit tracks gaze position by observing the position of the pupil and the front surface reflection from a single eye. A chin rest was used to stabilize the participants' viewing position and distance. A scan converter (Focus Enhancements TView Gold) was used to produce a combined video image of the targets displayed on the computer monitor and the eye position calculated by the eye tracker unit. This composite view was used by the experimenter to verify that the gaze tracking was working. Eye movement data were recorded every 60th of a second (60 Hz update rate) and averaged over every four data points through the ASL eye tracker interface program. The cursor coordinates were recorded by IDTest. The eye movement data were streamed into the P167 computer through a serial port, and IDTest then combined the eye movement data and the cursor data in a single data file with respect to time. The eye tracking data obtained from the ASL eye tracker were converted into the same coordinate system as the cursor coordinates. The calibration points on the ASL eye tracker and the stimulus machine were used to obtain the parameters for the conversion.

3. RESULTS
We first look at the effect of device on overall pointing performance. Figure 1 shows the task completion time (the sum of all 90 trial completion times) for each device, averaged over all tasks and subjects. Figure 1 shows that the device significantly affected performance time, F(2,44)=5.19, p<.01. Mean completion times were 120, 157, and 198 seconds for the mouse, pointing stick, and touchpad respectively. These results support previous findings for IBM TrackPoint device and touchpad performance times (Douglas, Kirkpatrick, & MacKenzie, 1999). We then searched for the constant, invariant pattern that Helsen and colleagues found in direct hand pointing (Helsen et al., 1998)
through various statistical methods. However, after many attempts we could not find a strong central tendency in our data. Depending on the individual, many different hand eye coordination strategies were used, even within the same input device. (SmartCat is a trademark of Cirque Corporation.)
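One way to screen individual trials for the strategies reported below is a simple heuristic over the eye-to-target and cursor-to-target distance series. This is our illustrative sketch, not the analysis the study used, and the 50-pixel neighborhood is an arbitrary choice:

```python
# Heuristic trial classifier (thresholds are arbitrary; illustration only).

def classify_trial(eye_dist, cursor_dist, near=50):
    """eye_dist, cursor_dist: per-sample Euclidean distances to the target.
    Returns a coarse strategy label based on which signal first settles in
    the target neighborhood, and how often the gaze re-enters it."""
    def first_near(series):
        for i, d in enumerate(series):
            if d <= near:
                return i
        return len(series)

    eye_arrival = first_near(eye_dist)
    cursor_arrival = first_near(cursor_dist)
    # Count how many times gaze crosses into the target neighborhood.
    entries = sum(1 for prev, cur in zip(eye_dist, eye_dist[1:])
                  if prev > near >= cur)
    if entries >= 2:
        return "switching"
    if eye_arrival < cursor_arrival:
        return "target gaze"      # gaze leads the cursor to the target
    return "cursor following"     # gaze tracks the cursor to the target

eye = [600, 400, 80, 40, 30, 20]      # gaze jumps to the target early
cur = [600, 550, 400, 250, 100, 20]   # cursor arrives later
print(classify_trial(eye, cur))       # target gaze
```

A real analysis would, at minimum, smooth the gaze signal and handle tracker dropouts before applying any such rule.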
Figure 1: Mean performance times (seconds) by device (TrackPoint, touchpad, mouse).

In order to examine the performance of the tasks in more detail, we examined gaze patterns in relationship to cursor and target locations in individual trials. Individual trial patterns, as well as overall trends for a particular participant given a specific device and task, were plotted. We found that, unlike in the Helsen study, the hand-eye relationship was not very consistent across participants. Three different pointing behaviors appeared: eye gaze following the cursor to the target, eye gaze leading the cursor to the target, and gaze switching between the cursor and the target until the target is reached. The first two were the most prevalent for these participants.

Figure 2: 'Cursor following' behavior (eye and cursor distance to target vs. time in seconds).

Figure 2 is an example of the eye gaze following the cursor in Task 1 using the TrackPoint pointing stick. The graph shows the Euclidean distance of the cursor and the eye gaze to the target. For this trial, the distance between target centers was 600 pixels and the target radius was 30 pixels. The trial began when the subject clicked on the target of the previous trial. There was little change in cursor or gaze position for the first .18 s. Then the cursor began moving toward the target with the eye gaze following. About .6 seconds was required to bring the cursor within 50 pixels of the target center, at which time both the target and cursor were within the foveal vision area. An additional .4 seconds was required to move the cursor into the target and click. See Appendix A for an example of the aggregate data that shows this behavior for one participant. In Task 1, the subject knows in advance the position of the target for the next trial. Even so, there is a significant pause, after beginning the trial by clicking on the initial target, before rapid motion toward the next target begins.

Figure 3: 'Target gaze' behavior (eye and cursor distance to target vs. time in seconds).

Figure 3 is an example of the eye gaze leading the cursor to the target in Task 1 using the pointing stick.
The target distance was 600 pixels and the target radius was 10 pixels. Both cursor and gaze position remained constant for the first .3 s of the trial, and then the eye gaze moved rapidly to a position near the target. Essentially, the person is focusing on the target rather than the cursor. The cursor followed to near the target in about .9 s, where it paused, then went through and slightly beyond the target. An additional 1.8 s was required to bring the cursor within the target and to click, during which time both cursor and target were within the foveal vision area. See Appendix B for an example of the aggregate data, which shows this behavior for one participant.

Figure 4: 'Switching' behavior (eye and cursor distance to target vs. time in seconds).

Figure 4 is an example of gaze switching between target and cursor in Task 2 with a mouse. The target distance was 411 pixels and the target radius was 20 pixels. There was no cursor motion, and thus no cursor position recorded, for the first .17 s of the trial. When the cursor was first moved, the eye gaze was already near the target. This initial behavior is understandable since in Task 2 the new target appears in a random position. The subject must find the target visually before knowing in which direction the cursor must be moved. The cursor was moved rapidly to a position near the target center in about
.1 s. This rapid initial movement phase is often seen with the mouse, due to the power mouse sensitivity acceleration. Then the cursor was moved more slowly toward the target, during which time the gaze shifted back and forth between the target and the cursor. Figures 5-7 are examples of one participant's eye gaze and cursor distance in relation to the target. These plots represent data from Task 1, only from the trials with target distance equal to 600 pixels. Figures 5-7 show the behavior using the pointing stick, mouse, and touchpad, respectively. For this participant, the behaviors are very similar for all devices. There is a high concentration of eye gaze points near the target before the cursor draws closer to the target. For the touchpad plot, there are a couple of trials where the cursor arrives at the target in a very short period of time, but the high concentration of eye gaze points near the target still comes before the high concentration of cursor points near the target. Figures 8-10 are examples of a different participant's eye gaze and cursor distance in relation to the target. Similar to the previous figures, these plots represent data from Task 1, only from the trials with targets which are 600 pixels apart. Unlike the previous participant, who demonstrated similar behaviors across devices, this participant seemed to change behavior. The pointing stick and touchpad behaviors are similar, as both exhibit the target gaze behavior. However, for the mouse, the behavior cannot be categorized as either the target gaze behavior or the cursor following behavior. In short, the hand eye coordination patterns were not determined by device alone.

4. DISCUSSION AND CONCLUSIONS
In comparison to performance analysis, process-oriented study, especially eye tracking trajectory analysis, is much less mature and more complex. Many of the traditional techniques for studying performance, such as statistical variance analysis of
means, did not produce informative results in our study. When we averaged eye-tracking data in order to conduct variance analysis, we lost much of the information contained in the data. This may be partly due to the inadequacy of the method and partly due to the lack of one consistent pattern in the data, even for the same device and task parameters. Detailed individual trial analysis and aggregate scatter plot analysis proved to be much more useful. We are planning to use alternative methods, such as data mining, to further examine our data. When we started this project, we hoped to find a consistent hand eye spatial and temporal relationship, as Helsen and colleagues found in direct hand pointing tasks. They found that eye movement tended to be initiated 70 ms earlier than hand movement; the eye typically makes two saccades to land on the target, and the first saccade tended to undershoot. Eye gaze stabilizes on the target at 50% of the total hand response time. The results of our study showed the opposite: participants used a variety of hand eye coordination strategies in controlling the cursor to acquire targets. This is similar to the eye tracking study on menu selection by Byrne and colleagues (Byrne et al., 1999), who predicted behavior with both the ACT-R model and the EPIC model, but neither model was confirmed or rejected. Eye movement patterns were more complex than either model could explain. In our current pointing task study, some participants used a strategy similar to direct hand pointing, with the eye primarily on the target and rarely on the cursor. This means that without direct proprioceptive feedback of the cursor position, which represents the hand action, some participants could use peripheral vision to monitor their cursor motion. Alternatively, one can argue that these participants did not monitor their cursor at all in the ballistic phase of pointing, as suggested by Woodworth's two-phase theory of motor control (Woodworth, 1899).
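Woodworth's two-phase account can be operationalized on a cursor trajectory by splitting it where the initial high-velocity (ballistic) portion ends. The sketch below uses a 10%-of-peak-velocity criterion, which is our illustrative choice rather than anything from the paper:

```python
# Sketch: split a cursor trajectory into ballistic and corrective phases.
# The 10%-of-peak-velocity criterion is an illustrative convention.

def split_phases(positions, dt):
    """positions: list of (x, y) cursor samples; dt: sample interval (s).
    Returns the sample index at which the initial ballistic phase ends."""
    speeds = [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 / dt
              for (x1, y1), (x2, y2) in zip(positions, positions[1:])]
    peak = max(speeds)
    peak_i = speeds.index(peak)
    # The ballistic phase ends when speed first falls below 10% of peak
    # after the peak; the remainder is the closed-loop corrective phase.
    for i in range(peak_i, len(speeds)):
        if speeds[i] < 0.1 * peak:
            return i + 1
    return len(positions) - 1

path = [(0, 0), (40, 0), (160, 0), (360, 0), (520, 0),
        (580, 0), (596, 0), (600, 0)]
print(split_phases(path, dt=1 / 60))  # 6
```

Applied to our data, such a split would also expose the unusually fast trial onsets noted below for the power mouse, since the ballistic segment's peak speed can be read off directly.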
Woodworth's model suggests that the ballistic phase is triggered, open-loop behavior, and only the second phase, near the target, is guided, closed-loop control behavior. Other participants used a strategy that has never been reported in direct hand aiming tasks in the physical world. They continuously gazed at the cursor, the visual representation of the physical hand, until the cursor is in the vicinity of the target, when both the cursor and the target images are in the fovea. Yet other participants switched their attention back and forth between the cursor and the target. This is also different from what the Woodworth model suggests for physical pointing. Overall, we found that participants used a variety of combinations of hand eye coordination patterns. This means that the design of input device algorithms should take all of these patterns into account. Designers cannot assume the one fixed hand-eye coordination pattern found in direct hand pointing. The cursor trajectories alone also revealed some patterns different from those found in psychomotor studies, where a bell-shaped velocity profile is often found. In our data, much faster speed was found at the onset of a trial, particularly when the input device was a power mouse. It is of both theoretical and practical importance to investigate whether faster (than direct hand pointing) devices can be designed to take advantage of the non-linear transformations in input devices and the various hand-eye coordination patterns found for computer target acquisition.

5. REFERENCES
[1] Aaltonen, A., Hyrskykari, A., & Raiha, K.-J. (1998). 101 spots, or how do users read menus? In Proceedings of CHI'98: ACM Conference on Human Factors in Computing Systems, Los Angeles, CA.
[2] Byrne, M. D., Anderson, J. R., Douglass, S., & Matessa, M. (1999). Eye tracking the visual search of click-down menus. In Proceedings of CHI'99: ACM Conference on Human Factors in Computing Systems, Pittsburgh, PA.
[3] Card, S. (1982, March 15-17).
User perceptual mechanisms in the search of computer command menus. In Proceedings of Human Factors in Computer Systems, Gaithersburg, Maryland.
[4] Card, S. K., English, W. K., & Burr, B. J. (1978). Evaluation of mouse, rate-controlled isometric joystick, step keys and text keys for text selection on a CRT. Ergonomics, 21.
[5] Douglas, S. A., Kirkpatrick, A. E., & MacKenzie, I. S. (1999). Testing pointing device performance and user assessment with the ISO 9241, Part 9 standard. In Proceedings of CHI'99 (Pittsburgh, PA, May 1999), ACM Press.
[6] Helsen, W. F., Elliott, D., Starkes, J. L., & Ricker, K. L. (1998). Temporal and spatial coupling of point of gaze and hand movement in aiming. Journal of Motor Behavior, 30(3).
[7] Jagacinski, R. J., Repperger, D. W., Moran, M. S., Ward, S. L., & Glass, B. (1980). Fitts' law and the microstructure of rapid discrete movements. Journal of Experimental Psychology: Human Perception and Performance, 6(2).
[8] Wang, Y., & MacKenzie, C. L. (1999). Effects of orientation disparity between haptic and graphic displays of objects in virtual environments. In Proceedings of INTERACT'99: IFIP International Conference on Human-Computer Interaction, Edinburgh, UK.
[9] Woodworth, R. S. (1899). The accuracy of voluntary movement. The Psychological Review, Series of Monograph Supplements, 3(2, Whole No. 13).
[10] Zhai, S., & Milgram, P. (1998). Quantifying coordination in multiple DOF movement and its application to evaluating 6 DOF input devices. In Proceedings of CHI'98: ACM Conference on Human Factors in Computing Systems.
Appendix A
All trials for a single subject, task, and device: 'Cursor following' behavior for Task 1 with mouse. (Plots show eye-to-target and cursor-to-target distance in pixels vs. normalized time.)

Appendix B
All trials for a single subject, task, and device: 'Target gaze' strategy for Task 1 with TrackPoint. (Plots show eye-to-target and cursor-to-target distance in pixels vs. normalized time.)
More informationChanging expectations about speed alters perceived motion direction
Current Biology, in press Supplemental Information: Changing expectations about speed alters perceived motion direction Grigorios Sotiropoulos, Aaron R. Seitz, and Peggy Seriès Supplemental Data Detailed
More informationAuthor(s) KONG, Jing, REN, Xiangshi, SHINOM. Rights Information and Communication Eng
Kochi University of Technology Aca Investigating the influence of co Title formance of pointing tasks for hu esign Author(s) KONG, Jing, REN, Xiangshi, SHINOM IEICE Transactions on Information Citation
More informationSpeed-Accuracy Tradeoff in Trajectory-Based Tasks with Temporal Constraint
Speed-Accuracy Tradeoff in Trajectory-Based Tasks with Temporal Constraint Xiaolei Zhou 1, Xiang Cao 2, and Xiangshi Ren 1 1 Kochi University of Technology, Kochi 782-8502, Japan zxljapan@gmail.com, ren.xiangshi@kochi-tech.ac.jp
More informationJournal of Experimental Psychology: Human Perception and Performance
Journal of Experimental Psychology: Human Perception and Performance Eye Movements Reveal how Task Difficulty Moulds Visual Search Angela H. Young and Johan Hulleman Online First Publication, May 28, 2012.
More informationFeature Integration Theory
Feature Integration Theory Introduction to Cognitive Sciences PROFESSOR AMITABHA MUKERJEE 2013 Authored by: Harmanjit Singh Feature Integration Theory Introduction to Cognitive Sciences Table of Contents
More informationEmpirical Research Methods for Human-Computer Interaction. I. Scott MacKenzie Steven J. Castellucci
Empirical Research Methods for Human-Computer Interaction I. Scott MacKenzie Steven J. Castellucci 1 Topics The what, why, and how of empirical research Group participation in a real experiment Observations
More informationEVALUATION OF DRUG LABEL DESIGNS USING EYE TRACKING. Agnieszka Bojko, Catherine Gaddy, Gavin Lew, Amy Quinn User Centric, Inc. Oakbrook Terrace, IL
PROCEEDINGS of the HUMAN FACTORS AND ERGONOMICS SOCIETY 9th ANNUAL MEETING 00 0 EVALUATION OF DRUG LABEL DESIGNS USING EYE TRACKING Agnieszka Bojko, Catherine Gaddy, Gavin Lew, Amy Quinn User Centric,
More informationInternational Journal of Software and Web Sciences (IJSWS)
International Association of Scientific Innovation and Research (IASIR) (An Association Unifying the Sciences, Engineering, and Applied Research) ISSN (Print): 2279-0063 ISSN (Online): 2279-0071 International
More informationIntroduction to Computational Neuroscience
Introduction to Computational Neuroscience Lecture 11: Attention & Decision making Lesson Title 1 Introduction 2 Structure and Function of the NS 3 Windows to the Brain 4 Data analysis 5 Data analysis
More informationVISUAL FIELDS. Visual Fields. Getting the Terminology Sorted Out 7/27/2018. Speaker: Michael Patrick Coleman, COT & ABOC
VISUAL FIELDS Speaker: Michael Patrick Coleman, COT & ABOC Visual Fields OBJECTIVES: 1. Explain what is meant by 30-2 in regards to the Humphrey Visual Field test 2. Identify the difference between a kinetic
More informationHCI Lecture 1: Human capabilities I: Perception. Barbara Webb
HCI Lecture 1: Human capabilities I: Perception Barbara Webb Key points: Complexity of human, computer and task interacting in environment: which part should change? Human constraints, e.g. Fitts law for
More informationIAT 355 Perception 1. Or What You See is Maybe Not What You Were Supposed to Get
IAT 355 Perception 1 Or What You See is Maybe Not What You Were Supposed to Get Why we need to understand perception The ability of viewers to interpret visual (graphical) encodings of information and
More informationBest Practice: SPORTS
Best Practice: SPORTS Go to the section that is most appropriate for you Key Points... 1 Introduction... 1 Preparation... 3 Novice Athletes... 4 Training Sessions with Larger Groups (e.g. 5 25)... 4 Training
More informationAssistant Professor Computer Science. Introduction to Human-Computer Interaction
CMSC434 Introduction to Human-Computer Interaction Week 07 Lecture 19 Nov 4, 2014 Human Information Processing Human Computer Interaction Laboratory @jonfroehlich Assistant Professor Computer Science TODAY
More informationAugmented Cognition: Allocation of Attention
Augmented Cognition: Allocation of Attention Misha Pavel 1, Guoping Wang, Kehai Li Oregon Health and Science University OGI School of Science and Engineering 20000 NW Walker Road Beaverton, OR 97006, USA
More informationHUMAN ABILITIES CPSC 544 FUNDAMENTALS IN DESIGNING INTERACTIVE COMPUTATION TECHNOLOGY FOR PEOPLE (HUMAN COMPUTER INTERACTION) WEEK 7 CLASS 13
HUMAN ABILITIES CPSC 544 FUNDAMENTALS IN DESIGNING INTERACTIVE COMPUTATION TECHNOLOGY FOR PEOPLE (HUMAN COMPUTER INTERACTION) WEEK 7 CLASS 13 Joanna McGrenere and Leila Aflatoony Includes slides from Karon
More informationImproving Search Task Performance Using Subtle Gaze Direction
Improving Search Task Performance Using Subtle Gaze Direction Ann McNamara Saint Louis University Reynold Bailey Rochester Institute of Technology Cindy Grimm Washington University in Saint Louis Figure
More informationInventions on expressing emotions In Graphical User Interface
From the SelectedWorks of Umakant Mishra September, 2005 Inventions on expressing emotions In Graphical User Interface Umakant Mishra Available at: https://works.bepress.com/umakant_mishra/26/ Inventions
More informationPredictive Interaction using the Delphian Desktop
Predictive Interaction using the Delphian Desktop Takeshi Asano Ehud Sharlin Yoshifumi Kitamura Kazuki Takashima Fumio Kishino Human Interface Eng. Lab. Osaka University Suita, Osaka 565-0871, Japan {asano;
More information7 Grip aperture and target shape
7 Grip aperture and target shape Based on: Verheij R, Brenner E, Smeets JBJ. The influence of target object shape on maximum grip aperture in human grasping movements. Exp Brain Res, In revision 103 Introduction
More informationVisual Selection and Attention
Visual Selection and Attention Retrieve Information Select what to observe No time to focus on every object Overt Selections Performed by eye movements Covert Selections Performed by visual attention 2
More informationIntro to HCI / Why is Design Hard?
Intro to HCI / Why is Design Hard? September 12, 2016 Fall 2016 COMP 3020 1 Announcements A02 notes: http://www.cs.umanitoba.ca/~umdubo26/comp3020/ A01 notes: http://www.cs.umanitoba.ca/~bunt/comp3020/lecturenotes.html
More informationVISUAL PERCEPTION OF STRUCTURED SYMBOLS
BRUC W. HAMILL VISUAL PRCPTION OF STRUCTURD SYMBOLS A set of psychological experiments was conducted to explore the effects of stimulus structure on visual search processes. Results of the experiments,
More informationSri Vidya College of Engineering & Technology UNIT 1-2 MARKS
UNIT 1-2 MARKS 1. What are the 5 major senses? Sight Hearing Touch Taste Smell 2. What are the effoectors? Limbs Fingers Eyes Head Vocal system. 3. What are the two stages of vision the physical reception
More informationInfluence of Subliminal Cueing on Visual Search Tasks
Influence of Subliminal Cueing on Visual Search Tasks Bastian Pfleging, Dominique Rau, Niels Henze, Bastian Reitschuster Albrecht Schmidt VIS, University of Stuttgart VIS, University of Stuttgart 70569
More informationv Feature Stamping SMS 12.0 Tutorial Prerequisites Requirements TABS model Map Module Mesh Module Scatter Module Time minutes
v. 12.0 SMS 12.0 Tutorial Objectives In this lesson will teach how to use conceptual modeling techniques to create numerical models that incorporate flow control structures into existing bathymetry. The
More informationREACTION TIME MEASUREMENT APPLIED TO MULTIMODAL HUMAN CONTROL MODELING
XIX IMEKO World Congress Fundamental and Applied Metrology September 6 11, 2009, Lisbon, Portugal REACTION TIME MEASUREMENT APPLIED TO MULTIMODAL HUMAN CONTROL MODELING Edwardo Arata Y. Murakami 1 1 Digital
More informationThe Effects of Action on Perception. Andriana Tesoro. California State University, Long Beach
ACTION ON PERCEPTION 1 The Effects of Action on Perception Andriana Tesoro California State University, Long Beach ACTION ON PERCEPTION 2 The Effects of Action on Perception Perception is a process that
More informationLooking Back: Presenting User Study Results
Looking Back: Presenting User Study Results Keep in mind that there are various types of data Need to summarize the (vast amount of) collected data Graphs, e.g. histogram Characteristics» minimum, maximum,
More informationOPTIC FLOW IN DRIVING SIMULATORS
OPTIC FLOW IN DRIVING SIMULATORS Ronald R. Mourant, Beverly K. Jaeger, and Yingzi Lin Virtual Environments Laboratory 334 Snell Engineering Center Northeastern University Boston, MA 02115-5000 In the case
More informationVIDEONYSTAGMOGRAPHY (VNG)
VIDEONYSTAGMOGRAPHY (VNG) Expected outcomes Site of lesion localization: Determine which sensory input, motor output, and/ or neural pathways may be responsible for reported symptoms. Functional ability:
More informationAssignment Question Paper I
Subject : - Discrete Mathematics Maximum Marks : 30 1. Define Harmonic Mean (H.M.) of two given numbers relation between A.M.,G.M. &H.M.? 2. How we can represent the set & notation, define types of sets?
More informationReal Time Sign Language Processing System
Real Time Sign Language Processing System Dibyabiva Seth (&), Anindita Ghosh, Ariruna Dasgupta, and Asoke Nath Department of Computer Science, St. Xavier s College (Autonomous), Kolkata, India meetdseth@gmail.com,
More informationHUMAN ABILITIES CPSC 544 FUNDAMENTALS IN DESIGNING INTERACTIVE COMPUTATIONAL TECHNOLOGY FOR PEOPLE (HUMAN COMPUTER INTERACTION) WEEK 7 CLASS 13
HUMAN ABILITIES CPSC 544 FUNDAMENTALS IN DESIGNING INTERACTIVE COMPUTATIONAL TECHNOLOGY FOR PEOPLE (HUMAN COMPUTER INTERACTION) WEEK 7 CLASS 13 Joanna McGrenere and Leila Aflatoony Includes slides from
More informationEYE MOVEMENTS DURING VISUAL AND AUDITORY TASK PERFORMANCE
NAVAL HEALTH RESEARCH CENTER EYE MOVEMENTS DURING VISUAL AND AUDITORY TASK PERFORMANCE E. Viirre K. Van Orden S. Wing B. Chase C. Pribe V. Taliwal J. Kwak Report No. 04-04 Approved for public release;
More informationThe Impact of Schemas on the Placement of Eyes While Drawing.
The Red River Psychology Journal PUBLISHED BY THE MSUM PSYCHOLOGY DEPARTMENT The Impact of Schemas on the Placement of Eyes While Drawing. Eloise M. Warren. Minnesota State University Moorhead Abstract.
More informationPrincipals of Object Perception
Principals of Object Perception Elizabeth S. Spelke COGNITIVE SCIENCE 14, 29-56 (1990) Cornell University Summary Infants perceive object by analyzing tree-dimensional surface arrangements and motions.
More informationA longitudinal study of text entry by gazing and smiling. Titta Rintamäki
A longitudinal study of text entry by gazing and smiling Titta Rintamäki University of Tampere School of Information Sciences Interactive Technology M.Sc. thesis Supervisor: Outi Tuisku February 2014 ii
More informationCongruency Effects with Dynamic Auditory Stimuli: Design Implications
Congruency Effects with Dynamic Auditory Stimuli: Design Implications Bruce N. Walker and Addie Ehrenstein Psychology Department Rice University 6100 Main Street Houston, TX 77005-1892 USA +1 (713) 527-8101
More informationExperiences on Attention Direction through Manipulation of Salient Features
Experiences on Attention Direction through Manipulation of Salient Features Erick Mendez Graz University of Technology Dieter Schmalstieg Graz University of Technology Steven Feiner Columbia University
More informationA Vision-based Affective Computing System. Jieyu Zhao Ningbo University, China
A Vision-based Affective Computing System Jieyu Zhao Ningbo University, China Outline Affective Computing A Dynamic 3D Morphable Model Facial Expression Recognition Probabilistic Graphical Models Some
More informationHuman Information Processing. CS160: User Interfaces John Canny
Human Information Processing CS160: User Interfaces John Canny Review Paper prototyping Key part of early design cycle Fast and cheap, allows more improvements early Formative user study Experimenters
More informationHybrid BCI for people with Duchenne muscular dystrophy
Hybrid BCI for people with Duchenne muscular dystrophy François Cabestaing Rennes, September 7th 2017 2 / 13 BCI timeline 1929 - Electroencephalogram (Berger) 1965 - Discovery of cognitive evoked potentials
More information2012 Course : The Statistician Brain: the Bayesian Revolution in Cognitive Science
2012 Course : The Statistician Brain: the Bayesian Revolution in Cognitive Science Stanislas Dehaene Chair in Experimental Cognitive Psychology Lecture No. 4 Constraints combination and selection of a
More informationValidity of Haptic Cues and Its Effect on Priming Visual Spatial Attention
Validity of Haptic Cues and Its Effect on Priming Visual Spatial Attention J. Jay Young & Hong Z. Tan Haptic Interface Research Laboratory Purdue University 1285 EE Building West Lafayette, IN 47907 {youngj,
More informationPupillary Response Based Cognitive Workload Measurement under Luminance Changes
Pupillary Response Based Cognitive Workload Measurement under Luminance Changes Jie Xu, Yang Wang, Fang Chen, Eric Choi National ICT Australia 1 University of New South Wales jie.jackxu@gmail.com, {yang.wang,
More informationEye Movements, Perceptions, and Performance
Eye Movements, Perceptions, and Performance Soussan Djamasbi UXDM Research Laboratory Worcester Polytechnic Institute djamasbi@wpi.edu Dhiren Mehta UXDM Research Laboratory Worcester Polytechnic Institute
More informationIntelligent Mouse-Based Object Group Selection
Intelligent Mouse-Based Object Group Selection Hoda Dehmeshki and Wolfgang Stuerzlinger Department of Computer Science and Engineering York University, Toronto, Canada Abstract. Modern graphical user interfaces
More informationHuman Performance 8.1 HUMAN PERFORMANCE
Human Performance 8.1 Chapter 8: HUMAN PERFORMANCE Introduction As you will find in this book, computer input device design, and use, reflects a variety of choices in multiple dimensions. Ideally, these
More informationDifferential Viewing Strategies towards Attractive and Unattractive Human Faces
Differential Viewing Strategies towards Attractive and Unattractive Human Faces Ivan Getov igetov@clemson.edu Greg Gettings ggettin@clemson.edu A.J. Villanueva aaronjv@clemson.edu Chris Belcher cbelche@clemson.edu
More informationRethinking the Progress Bar
Rethinking the Progress Bar Chris Harrison 1,2 Brian Amento 2 Stacey Kuznetsov 3 Robert Bell 2 1 Human Computer Interaction Institute Carnegie Mellon University chrish@cmu.edu 2 AT&T Labs-Research Florham
More informationKoji Sakai. Kyoto Koka Women s University, Ukyo-ku Kyoto, Japan
Psychology Research, September 2018, Vol. 8, No. 9, 435-442 doi:10.17265/2159-5542/2018.09.002 D DAVID PUBLISHING Effect of Pursuit Eye Movement and Attentional Allocation on Perceptual-Motor Prediction
More informationDC-6. Diagnostic Ultrasound System
DC-6 Diagnostic Ultrasound System MINDRAY is proud to introduce DC-6, a color Doppler ultrasound system for general applications. DC-6 incorporates the latest digital ultrasound image processing technology
More informationPhonak Target 4.3. Desktop Fitting Guide. Content. March 2016
Phonak Target 4.3 March 2016 Desktop Fitting Guide This guide provides you with a detailed introduction to latest hearing instrument fitting with Phonak Target. www.phonakpro.com/target_guide Please also
More informationFinding an Efficient Threshold for Fixation Detection in Eye Gaze Tracking
Finding an Efficient Threshold for Fixation Detection in Eye Gaze Tracking Sudarat Tangnimitchok 1 *, Nonnarit O-larnnithipong 1 *, Armando Barreto 1 *, Francisco R. Ortega 2 **, and Naphtali D. Rishe
More informationGood Enough But I ll Just Check: Web-page Search as Attentional Refocusing
Good Enough But I ll Just Check: Web-page Search as Attentional Refocusing Duncan P. Brumby (BrumbyDP@Cardiff.ac.uk) Andrew Howes (HowesA@Cardiff.ac.uk) School of Psychology, Cardiff University, Cardiff
More informationHuman Information Processing
Human Information Processing CS160: User Interfaces John Canny. Topics The Model Human Processor Memory Fitt s law and Power Law of Practice Why Model Human Performance? Why Model Human Performance? To
More informationHuman On-Line Response to Visual and Motor Target Expansion
Human On-Line Response to Visual and Motor Target Expansion Andy Cockburn and Philip Brock Human-Computer Interaction Lab Department of Computing Science and Software Engineering University of Canterbury
More information