RESNA Gaze Tracking System for Enhanced Human-Computer Interaction
Journal: RESNA 2008 Annual Conference
Manuscript ID:
Submission Type: RESNA-SDC Student Design Competition
Topic Area: Computer Applications & Communication (CAC)
Gaze Tracking System for Enhanced Human-Computer Interaction

Michael J. Lenisa, Breanna Heidenburg, Daniel Wentzel, Dr. Aleksander Malinowski

ABSTRACT

Eye-based interface systems for computers have existed for several years, but have not come into widespread use because of the high cost of the associated technology and devices. Recent developments in high-grade consumer cameras with small form factors have produced devices that allow low-cost gaze tracking research to be conducted. Previous design work in this field has often neglected to provide feedback to the user, limiting the potential uses of those systems. The results reported here improve upon the best previously published tracking accuracies and achieve a speed increase over conventional mouse-based systems. A head-mounted display has been integrated into the system, allowing fully hands-free use and mobility. Continuing advances in this field, such as those shown in this research, will rapidly make the designed system viable for real-world assistive technology use.

KEYWORDS

human interface device; hands-free operation; gaze tracking; eye tracking

BACKGROUND

Gaze tracking has been used for many decades as a tool to study human cognitive processes including reading, driving, watching commercials, and other activities (1). With the advent of personal computers, integration of such systems as a user interface was first considered in 1991 by Jacob (2). Successful attempts to employ gaze tracking as a user interface have allowed users with movement disabilities to type by looking at a virtual keyboard (3) or to control a mouse pointer (4). Systems of this kind have also been used by individuals without disabilities to enhance performance (5, 6).
The continuing absence of consumer-grade eye-tracking human-computer interface technology is a result of the high price of eye tracking technology and the intrusiveness of such systems, despite the fact that the enabling technologies have existed for many years (7). Only recently have relatively low-cost systems been investigated, for example by Li et al. in 2006 (8-10). A full technical description of this project, along with complete source code, is available at the project website (11). This research has also been accepted for publication at the HSI conference in May (12). The design work described herein shows the accuracy that can be obtained using modern, low-cost systems. The two most significant contributions of this research are the use of blob detection in the visible spectrum instead of circular shape detection in the infrared spectrum, and the use of higher-dimensional polynomials to calibrate and map the detected gaze position to a position on the computer screen.

DESIGN AND DEVELOPMENT

The design of this gaze tracking system called for an image capture device working in the visible spectrum to capture images of the user's eye. To provide reasonable resolution, the camera is placed slightly below the user's eye, facing up toward the eye. During the design process it was found that the physiology of the human eye does not allow for easy shape detection of the pupil, as every user's eye is different and the pupil's shape changes slightly at various deflection angles. Therefore, image processing techniques such as contrast stretching,
binary thresholding, and blob detection are used to correctly identify the pupil. After the center of the user's pupil has been identified, the Cartesian coordinates of the center are passed through a mapping algorithm to determine a cursor position on a head-mounted display. After the user has properly calibrated the system, a set of constants is derived for use in a point transformation algorithm. The point transformation algorithm accepts the set of calibration constants, as well as the location of the center of the pupil within the image. The output of this function is the coordinate position of a cursor on the screen representing where the user's gaze is focused. This process allows for gaze-based cursor positioning control, as well as action-based interaction, such as clicking. Clicking is achieved via dwell time, wherein past positions of the user's gaze are used to determine whether the user has been focusing on a small region for a given period of time. The size of the region and the required dwell time are variable parameters within the system.

The system's accuracy is limited by inherent properties of the human eye. Problems contributing to inaccuracy include saccadic eye movements, the ocular field of view, and improper calibration. These problems manifest themselves as cursor positioning error, wherein the cursor position is a small distance from the actual location of the user's gaze. This inherent error reduces the usability of the system to the point that use with a generic computer desktop is currently impractical. The solution was to create an interface specialized for gaze-based interaction, increasing the size of screen objects beyond the expected error of the system.

RESULTS

The designed system produces accurate tracking and cursor placement, as well as a reaction time superior to traditional tactile interfaces.
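As a concrete illustration of the chain just described (contrast stretching, binary thresholding, blob detection, then a polynomial point transformation), the following sketch uses plain NumPy. It is not the project's actual source code (which is available at the project website); the threshold value, polynomial degree, and all function names are illustrative assumptions.

```python
import numpy as np
from collections import deque

def find_pupil_center(eye_gray, dark_thresh=40):
    """Centroid (x, y) of the largest dark blob in a grayscale eye image.

    The threshold value is an illustrative guess, not the system's tuning.
    """
    # Contrast stretching: spread intensities over the full 0-255 range.
    lo, hi = int(eye_gray.min()), int(eye_gray.max())
    stretched = (eye_gray.astype(float) - lo) * (255.0 / max(hi - lo, 1))
    # Binary thresholding: the pupil is the darkest region of the image.
    binary = stretched < dark_thresh
    # Blob detection: flood-fill each dark region, keep the largest one.
    seen = np.zeros_like(binary, dtype=bool)
    best = []
    for r0, c0 in zip(*np.nonzero(binary)):
        if seen[r0, c0]:
            continue
        seen[r0, c0] = True
        blob, queue = [], deque([(r0, c0)])
        while queue:
            r, c = queue.popleft()
            blob.append((r, c))
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < binary.shape[0] and 0 <= nc < binary.shape[1]
                        and binary[nr, nc] and not seen[nr, nc]):
                    seen[nr, nc] = True
                    queue.append((nr, nc))
        if len(blob) > len(best):
            best = blob
    if not best:
        return None
    rows, cols = zip(*best)
    return sum(cols) / len(best), sum(rows) / len(best)

def poly_terms(x, y, degree=2):
    """All monomials x**i * y**j with i + j <= degree."""
    return [x ** i * y ** j
            for i in range(degree + 1) for j in range(degree + 1 - i)]

def fit_calibration(pupil_pts, screen_pts, degree=2):
    """Least-squares fit of the calibration constants, one set per screen axis."""
    A = np.array([poly_terms(x, y, degree) for x, y in pupil_pts])
    cx = np.linalg.lstsq(A, np.array([sx for sx, _ in screen_pts]), rcond=None)[0]
    cy = np.linalg.lstsq(A, np.array([sy for _, sy in screen_pts]), rcond=None)[0]
    return cx, cy

def map_to_screen(pupil_xy, cx, cy, degree=2):
    """Point transformation: pupil-image coordinates -> screen coordinates."""
    t = poly_terms(*pupil_xy, degree)
    return float(np.dot(t, cx)), float(np.dot(t, cy))
```

A degree-2 fit has six unknown constants per screen axis, so calibration needs at least six well-spread gaze targets; the higher-dimensional polynomials described in the paper need correspondingly more.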
Figure 1 shows an error contour plot depicting absolute error at specific screen locations; low error is shown in blue and high error in red.

Figure 1 goes here

As shown in Figure 1, error is higher in the corners of the screen. The average error over the entire screen is 14 pixels. This resulted in a system that was very usable, and users demonstrated a strong capability to control it. Along with accurately placing the cursor, the system also improved upon the reaction and positioning times of standard computer input devices such as a mouse or track pad. Figure 2 shows average user response times for a track pad, a mouse, and the gaze-based system.

Figure 2 goes here

As shown, the gaze-based system is inherently faster than the other input methods: a 53% speed increase was seen over standard track pad methods, and a 12% increase over mouse-based systems. Users reported that the increase in speed was remarkable, with the interface able to navigate to a page selected by the user almost as quickly as the user could think about navigating to it.

CONCLUSION
The designed system has been shown to be an acceptable low-cost alternative to previous gaze tracking methods.

REFERENCES

(1) Duchowski, A. A breadth-first survey of eye-tracking applications. Behavior Research Methods, Instruments and Computers, vol. 34, no. 4.
(2) Jacob, R. The use of eye movements in human-computer interaction techniques: what you look at is what you get. ACM Transactions on Information Systems, vol. 9, no. 2.
(3) Majaranta, P., Raiha, K. Twenty years of eye typing: systems and design issues. Proceedings of the Symposium on Eye Tracking Research and Applications, 2002.
(4) Hornof, A. J., Cavender, A., Hoselton, R. EyeDraw: A system for drawing pictures with eye movements. ACM SIGACCESS Conference on Computers and Accessibility, Atlanta, Georgia, 2004.
(5) Silbert, L., Jacob, R. Evaluation of eye gaze interaction. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2000.
(6) Tanriverdi, V., Jacob, R. Interacting with eye movements in virtual environments. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2000.
(7) Young, L., Sheena, D. Survey of eye movement recording methods. Behavior Research Methods and Instrumentation 7, 1975.
(8) Babcock, J., Pelz, J. Building a lightweight eyetracking headgear. Eye Tracking Research and Applications Symposium, 2004.
(9) Pelz, J., Canosa, R., Babcock, J., Kucharczyk, D., Silver, A., Konno, D. Portable eyetracking: A study of natural eye movements. Proceedings of the SPIE, Human Vision and Electronic Imaging, 2000.
(10) Li, D., Babcock, J., Parkhurst, D. J. openEyes: a low-cost head-mounted eye-tracking solution.
ACM Eye Tracking Research and Applications Symposium.
(11) Heidenburg, B., Lenisa, M., Wentzel, D., Malinowski, A. Gaze Tracking System, project web site, February 2008. Available:
(12) Heidenburg, B., Lenisa, M., Wentzel, D., Malinowski, A. Data mining for gaze tracking system. To be published in Proceedings of the Conference on Human System Interaction (HSI08), Krakow, Poland, May 25-27.

CONTACT INFORMATION

Michael J. Lenisa
750 Rosedale
Roselle, IL
Cell: (630)
mlenisa@bradley.edu
Figure 1: Three-dimensional graph depicting error values for various cursor positions on the display.

Figure 2: Comparison of computer input methods and their respective reaction times: track pad (762.7 milliseconds), standard mouse (559.9 milliseconds), and the designed gaze-based system (496.9 milliseconds).
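The percentage improvements quoted in the Results section follow directly from these timings; as a quick arithmetic check (plain Python, using only the figures reported above):

```python
track_pad_ms = 762.7  # average response time, track pad
mouse_ms = 559.9      # average response time, standard mouse
gaze_ms = 496.9       # average response time, gaze-based system

# Relative speed increase of the gaze system over each device.
trackpad_gain = (track_pad_ms - gaze_ms) / gaze_ms
mouse_gain = (mouse_ms - gaze_ms) / gaze_ms

print(f"vs. track pad: {trackpad_gain:.1%}")  # ~53.5%, reported as 53%
print(f"vs. mouse: {mouse_gain:.1%}")         # ~12.7%, reported as 12%
```

The 12% figure in the text is thus a conservative rounding of roughly 12.7%.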
More informationThe Wearable Eyetracker: A Tool for the Study of High-level Visual Tasks
The Wearable Eyetracker: A Tool for the Study of High-level Visual Tasks February 2003 Jason S. Babcock, Jeff B. Pelz Institute of Technology Rochester, NY 14623 Joseph Peak Naval Research Laboratories
More informationDRAFT 2016 CSTA K-12 CS
2016 CSTA K-12 CS Standards: Level 1 (Grades K-5) K-2 Locate and identify (using accurate terminology) computing, input, and output devices in a variety of environments (e.g., desktop and laptop computers,
More informationA miniature head-mounted camera for measuring eye closure
A miniature head-mounted camera for measuring eye closure Simon J. Knopp NZ Brain Research Institute Carrie R. H. Innes NZ Brain Research Institute Philip J. Bones Richard D. Jones NZ Brain Research Institute
More informationEXCELLENCE IN 3D MEASUREMENT
Application Example: Quality Control Sheet Metal: Measuring Characteristic Features Using the Optical Measuring Machine TRITOPCMM Measuring Systems: TRITOPCMM Keywords: Hole pattern, primitive location,
More informationZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field
ZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field Figure 1 Zero-thickness visual hull sensing with ZeroTouch. Copyright is held by the author/owner(s). CHI 2011, May 7 12, 2011, Vancouver, BC,
More informationSwept-Field User Guide
Swept-Field User Guide Note: for more details see the Prairie user manual at http://www.prairietechnologies.com/resources/software/prairieview.html Please report any problems to Julie Last (jalast@wisc.edu)
More informationThe introduction and background in the previous chapters provided context in
Chapter 3 3. Eye Tracking Instrumentation 3.1 Overview The introduction and background in the previous chapters provided context in which eye tracking systems have been used to study how people look at
More informationOpto Engineering S.r.l.
TUTORIAL #1 Telecentric Lenses: basic information and working principles On line dimensional control is one of the most challenging and difficult applications of vision systems. On the other hand, besides
More informationCS 315 Intro to Human Computer Interaction (HCI)
CS 315 Intro to Human Computer Interaction (HCI) Direct Manipulation Examples Drive a car If you want to turn left, what do you do? What type of feedback do you get? How does this help? Think about turning
More informationDrumtastic: Haptic Guidance for Polyrhythmic Drumming Practice
Drumtastic: Haptic Guidance for Polyrhythmic Drumming Practice ABSTRACT W e present Drumtastic, an application where the user interacts with two Novint Falcon haptic devices to play virtual drums. The
More informationVirtual Touch Human Computer Interaction at a Distance
International Journal of Computer Science and Telecommunications [Volume 4, Issue 5, May 2013] 18 ISSN 2047-3338 Virtual Touch Human Computer Interaction at a Distance Prasanna Dhisale, Puja Firodiya,
More informationStitching MetroPro Application
OMP-0375F Stitching MetroPro Application Stitch.app This booklet is a quick reference; it assumes that you are familiar with MetroPro and the instrument. Information on MetroPro is provided in Getting
More informationToward an Augmented Reality System for Violin Learning Support
Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp
More informationA Comparison of Smooth Pursuit- and Dwell-based Selection at Multiple Levels of Spatial Accuracy
A Comparison of Smooth Pursuit- and Dwell-based Selection at Multiple Levels of Spatial Accuracy Dillon J. Lohr Texas State University San Marcos, TX 78666, USA djl70@txstate.edu Oleg V. Komogortsev Texas
More information