Haptic gaze-tracking based perception of graphical user interfaces
University of Wollongong Research Online
Faculty of Informatics - Papers (Archive), Faculty of Engineering and Information Sciences, 2007
Haptic gaze-tracking based perception of graphical user interfaces
Simon Meers, University of Wollongong, meers@uow.edu.au
Koren Ward, University of Wollongong, koren@uow.edu.au
Publication Details: This paper was originally published as: Meers, S & Ward, K, Haptic gaze-tracking based perception of graphical user interfaces, 11th International Conference on Information Visualisation, Zurich, Switzerland, 4-6 July 2007.
Research Online is the open access institutional repository for the University of Wollongong. For further information contact the UOW Library: research-pubs@uow.edu.au
Keywords: haptic, gaze-tracking, user-interface, human-computer-interface, blind, non-visual
Disciplines: Physical Sciences and Mathematics
Haptic Gaze-Tracking Based Perception of Graphical User Interfaces
Simon Meers, Koren Ward
University of Wollongong

Abstract
This paper presents a novel human-computer interface that enables the computer display to be perceived without any use of the eyes. Our system works by tracking the user's head position and orientation to obtain their 'gaze' point on a virtual screen, and by indicating to the user what object is present at the gaze location via haptic feedback to the fingers and synthetic speech or Braille text. This is achieved by using the haptic vibration frequency delivered to the fingers to indicate the type of screen object at the gaze position, and the vibration amplitude to indicate the screen object's window-layer, when the object is contained in overlapping windows. Also, objects that are gazed at momentarily have their name output to the user via a Braille display or synthetic speech. Our experiments have shown that by browsing over the screen and receiving haptic and voice (or Braille) feedback in this manner, the user is able to acquire a mental two-dimensional representation of the virtual screen and its content without any use of the eyes. This form of blind screen perception can then be used to locate screen objects and controls and manipulate them with the mouse or via gaze control. We provide experimental results demonstrating how this form of blind screen perception can effectively be used to exercise point-and-click and drag-and-drop control of desktop objects and open windows by using the mouse, or the user's head pose, without any use of the eyes.

1 Introduction
To access the computer, most blind users use a screen reader program, like JAWS [4] or Window-Eyes [6], equipped with synthetic speech and/or a Braille display [3, 21, 25]. This gives blind users access to the text appearing on the screen.
However, screen readers do not provide blind users with the ability to perceive and access Graphical User Interface (GUI) controls by using the mouse, which requires considerable hand-eye coordination to be used effectively. Consequently, most control access to the computer by the blind is achieved by using command line interfaces and/or by using specific combinations of control keys. This can limit the applications available to blind users as well as the tasks that can be performed on certain GUI-based applications. For example, when the software development industry changed from command line compilers to integrated development environments, almost all blind programmers were made redundant due to their inability to effectively perceive and use such GUI development environments and the difficulty of adapting such environments to their needs [1, 22].

Tactile devices for enabling blind users to perceive graphics or images on the computer have been under development for some time. For example, the haptic mouse (e.g. [7, 9, 24]) can produce characteristic vibrations when the mouse cursor is moved over screen icons, window controls and application windows. Although this can indicate to a blind user what is currently beneath the mouse cursor, it does not give a blind user much idea of where screen objects are located on the screen or where the mouse is currently located. Force feedback devices, like the PHANToM [5], and tactile (or electro-tactile) displays, e.g. [8, 10, 11, 13], can enable three-dimensional graphical models or two-dimensional black-and-white images to be visualised by using the sense of touch (see [2] for a survey). However, little success has been demonstrated with these devices toward enabling blind users to interact with typical GUIs, other than some simple memory and visualisation experiments like the 'memory house' [23], which involves discovering buttons on different planes via force feedback and remembering the buttons that play the same sounds when found.
Refreshable haptic Braille displays comprised of a two-dimensional matrix of raisable pins have been used to devise some GUI controls. For example, HyperBraille [12] has adopted the concept of pull-down menus which are customised to suit Braille displays. Also, the ACCESS project [20] attempts to present web-based hypermedia information in Braille format within a refreshable Braille display. However, these systems do not provide the user with any two-dimensional perception of the screen content, which can result in too little graphical information being presented to the user for effective GUI interaction. Also, confusion can occur where context-sensitive menus are involved. Furthermore, the main GUI concepts of point-and-click and drag-and-drop do not implement well within Braille displays due to their limited size and the hands being occupied reading through Braille for much of the time.

In an attempt to address these deficiencies, we have been developing a gaze-tracking interface for enabling blind users to perform many of the tasks that sighted users can perform on typical GUIs. Our system works by tracking the user's head pose with gaze-tracking hardware [14] to obtain the head's gaze position on a large virtual screen, and by indicating to the user what is present at the gaze location via vibro-tactile or electro-tactile feedback to the fingers. The haptic vibration frequency felt by the fingers indicates the type of screen object at the gaze position. Likewise, the amplitude of the haptic vibrations indicates the window-layer of the screen object when the object is contained in overlapping windows. The mouse position is revealed with high-amplitude vibrations when it is gazed at by the user. Any screen object's name can also be output to the user via synthetic speech or a Braille display when the user gazes at the object momentarily. By browsing over the screen and receiving haptic and voice (or Braille) feedback in this manner, the user is able to quickly acquire a mental representation of the virtual screen and its content. We have found this to be sufficient for performing many point-and-click or drag-and-drop tasks which are inherent in all GUI applications and operating systems.

In the following sections we first provide a brief background review of our blind vision work leading up to the development of our haptic gaze-tracking user interface. This is followed by the implementation details of our system and our experimental results.

2 Background
In previous work [15, 16], we have been investigating the use of electro-tactile user interfaces and range-sensing devices for providing depth and colour perception of the environment to the blind.
Our vision system, shown in Figure 1, works by extracting depth and colour information from sensors mounted on a headset worn by the user, and delivering this information to the fingers via electro-tactile stimulation. To interpret the range and colour data, the user simply imagines that their hands are held with fingers extended in the direction viewed by the cameras. Each finger receives feedback regarding the area of the environment it is envisaged to be pointing toward. The intensity of electro-tactile stimulation felt indicates the distance to objects, while the frequency indicates the predominant colour of each region. Our experiments have shown that this continuous delivery of environmental depth and colour information to the user, in a form that is easy to interpret, enables the user to realise the 3D profile of their surroundings, as well as the presence of landmarks based on their shape and colour, by surveying the environment with the sensor headset.

Figure 1: The Electro-Neural Vision System Prototype

Our experimental results demonstrate that this form of perception makes it possible for the user to navigate certain environments, recognise their location, and perceive the size, colour and movement of objects within the surrounding environment without any visual perception from the eyes. To adapt this concept to interpreting screen content on the computer, we use a similar method for providing electro-tactile feedback to the fingers. However, instead of using a sensor headset, we use a head pose tracking system [14] that we developed to determine the user's gaze position on a predefined virtual screen. We also developed software for interpreting the virtual screen's content at the gaze position and delivering this information to the fingers via electro-tactile stimulation. The prototype electro-tactile interface is shown in Figure 2.
This is comprised of Transcutaneous Electro-Neural Stimulation (TENS) electrodes that are fitted to the fingers and connected to a purpose-built computer-controlled TENS unit (not shown). In the following section, we provide further details of our haptic gaze-tracking user interface and the design of a vibro-tactile keyboard that could potentially be used instead of the electro-tactile interface.

3 Gaze-Tracking Interface
The haptic gaze-tracking user interface is basically a device for enabling the computer screen to be perceived without any use of the eyes. This is achieved by tracking the head pose of the user and by delivering haptic feedback to the user's fingers which indicates what is located at the gaze position on a predefined virtual screen. This effectively enables the user to browse over the virtual screen and mentally visualise its content. It also makes it possible to perceive the mouse position on the virtual screen and to
manipulate icons, menus and other GUI controls through normal use of the mouse. The user can also choose to move the mouse with their gaze position (rather than independently) if desired, which can simplify GUI-hand interaction considerably. Figure 3 shows the main hardware components of our haptic gaze-tracking user interface. Haptic feedback is provided by the TENS electrodes fitted to the user's fingers and connected to the TENS unit (not shown). The head pose tracking system is comprised of the USB camera and the spectacles fitted to the user. The spectacles contain three infrared LEDs that are tracked by image processing software we developed. Note: the computer monitor shown in Figure 3, supporting the USB camera, is unnecessary for the system and was left turned off to conduct our experiments.

Figure 2: Electro-Tactile Interface comprised of Transcutaneous Electrodes fitted to the hands.

Figure 3: Hardware Components: Gaze-Tracking Spectacles/Camera and Electro-Tactile Interface

3.1 The Virtual Screen
For the system to work effectively, considerable attention had to be paid to appropriately defining the perceived virtual screen, its content and how this information is encoded for haptic interpretation. To provide increased resolution to the user we defined a large virtual screen that was divided up into a matrix of cells. A visual representation of part of the virtual screen is depicted in Figure 4.

Figure 4: Close-up representation of the virtual screen.

To represent the desktop environment, each cell on the virtual screen is considered a unit of perception and may contain an icon or a GUI control. The size of the virtual screen is defined by the gaze area about which the user can comfortably browse. The cell size, which defines the virtual screen resolution, can be set by the user and would typically be smaller for experienced users (higher resolution) than for novice users (lower resolution).
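The paper does not give the mapping from a gaze point on the virtual screen to a grid cell; a minimal sketch, assuming the planar 6 m × 4 m screen with 0.25 m cells (24 × 16 grid) used in the paper's experiments, and with the function name and coordinate convention chosen for illustration, might look like:

```python
# Map a gaze point on the virtual screen to a (column, row) cell index.
# Screen geometry (6 m x 4 m, 0.25 m cells -> 24 x 16 grid) follows the
# paper's experimental setup; names and axis conventions are illustrative.

SCREEN_W, SCREEN_H = 6.0, 4.0   # virtual screen size in metres
CELL = 0.25                     # cell edge length in metres
COLS = int(SCREEN_W / CELL)     # 24 columns
ROWS = int(SCREEN_H / CELL)     # 16 rows

def cell_at(x, y):
    """Return the (col, row) cell containing gaze point (x, y),
    where (0, 0) is the screen centre and axes are in metres."""
    col = int((x + SCREEN_W / 2) / CELL)
    row = int((y + SCREEN_H / 2) / CELL)
    # Clamp so a gaze slightly off-screen still maps to an edge cell.
    col = max(0, min(COLS - 1, col))
    row = max(0, min(ROWS - 1, row))
    return col, row
```

Clamping to the edge cells means the haptic feedback never "falls off" the screen when the user's head pose overshoots the browsing area.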
For our experiments we defined the virtual screen to be 6m wide x 4m high and at a distance of 2m from the user. The size of each cell was set to 250mm x 250mm, which provides a resolution of 24 cells wide x 16 cells high. Figure 5 shows a photo of our system together with an artist's representation of the virtual screen being perceived by the user.

Our haptic gaze-tracking interface is intended to be mainly implemented at the operating system level as a means of providing alternative access to the desktop and typical GUI-based applications by the blind. Our prototype and initial experiments were based upon the Microsoft Windows desktop environment due to it being the most extensively deployed operating system. Subsequently, the main task involved reconstructing all of the GUI components of the Windows desktop (e.g. taskbar, icons, windows and controls) to cells comprising our virtual screen. Figures 4 and 5 show an example of how this was done with two open windows and some desktop icons. Our use of overlapping, resizable windows with scrollbars, etc., is not indicative of a belief that this is necessarily the best interface for blind users; however, this form
of interface has been widely adopted by operating system providers and application developers, and can be difficult for the blind to use due to the extensive use of GUI controls. Such GUI software applications can also be difficult to adapt to the needs of the blind by using development tools such as Microsoft's Active Accessibility [17]. Our haptic gaze-tracking technique may allow GUIs to be made accessible to blind users without significant redevelopment work. It may also provide the blind with an effective means of perceiving screen content and operating GUI controls via point-and-click and drag-and-drop actions, which form the basis of nearly all modern computer applications.

Figure 5: The Haptic Gaze-Tracking User Interface and an artist's impression of the Virtual Screen

To implement the virtual screen, icons on the desktop are mapped to corresponding cells on the virtual screen and are made to snap to the nearest cell when released from the mouse. The taskbar is similarly mapped to the bottom row of cells on the virtual screen. Desktop icons are assigned a haptic type that defines the vibrations produced when the icon is gazed at. For our preliminary experiments, we simply defined five haptic types which were represented with the frequencies 10, 20, 40, 80 and 160 Hz. These haptic types correspond to file, folder, application, control and empty cell respectively. (Note: with the use of modulated frequencies this could be expanded.) Cells are also assigned one of four levels of haptic intensity to enable the user to determine if the cell is at the desktop layer (lowest intensity); within an inactive window in front of the desktop but below another window (low-medium intensity); within the frontmost window (high-medium intensity); or occupied by the mouse (highest intensity). This can be seen in the visual representations of the virtual screen shown in Figures 4 and 5.
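The two-channel encoding above (frequency for object type, amplitude for window layer) can be sketched as a simple lookup; the five frequencies and four layer categories follow the paper, while the numeric amplitude scale and the names are assumptions for illustration:

```python
# Encode a virtual-screen cell as a (frequency, amplitude) haptic signal.
# The five object types and their frequencies, and the four window-layer
# intensity levels, follow the paper; the 0.25..1.0 amplitude scale and
# the dictionary keys are illustrative assumptions.

TYPE_FREQ_HZ = {          # vibration frequency encodes object type
    "file": 10,
    "folder": 20,
    "application": 40,
    "control": 80,
    "empty": 160,
}

LAYER_AMPLITUDE = {       # vibration amplitude encodes window layer
    "desktop": 0.25,          # lowest intensity
    "inactive_window": 0.5,   # behind another window
    "front_window": 0.75,     # frontmost window
    "mouse": 1.0,             # highest intensity: cell holds the mouse
}

def haptic_signal(obj_type, layer):
    """Return (frequency_hz, amplitude) for the object under the gaze."""
    return TYPE_FREQ_HZ[obj_type], LAYER_AMPLITUDE[layer]
```

For example, a folder icon inside the frontmost window would be rendered as a 20 Hz vibration at the high-medium amplitude level.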
Our experiments have shown that by browsing over the virtual screen and receiving haptic feedback in this manner, the user is able to perceive the screen content in terms of where the mouse is currently located and where various icon types and controls are located on the desktop and within open windows. By gazing at any icon, control or label momentarily, the object's name is spoken via synthetic speech. (Note: alternatively, the object's name can be displayed on a Braille display if one is fitted to the computer.) Consequently, it is relatively straightforward to perform tasks such as: locating a specific file or folder; opening files, folders or applications; or moving objects to another cell or folder by using the mouse or via gaze control. When a window is opened to reveal an open folder (or application window), its edges can be detected by the changes in haptic intensity when the user browses over it. This also enables the window controls to be located with relative ease for resizing, closing or minimising the window by using the mouse under gaze or hand control.

Our haptic gaze-tracking interface is also equipped with an option for increasing the user's perceived area of the virtual screen near the gaze position. This is achieved by using vibrations delivered to the index fingers to reveal the cell at the gaze position, and the vibrations delivered to the three outer fingers of each hand (i.e. the pinky, ring and middle fingers) to reveal the content of three cell positions to the left and right of the gaze position, as depicted in Figure 6.

Figure 6: The electro-tactile interface showing how each finger is mapped to the gazed area of the virtual screen

The overall intensity delivered to each finger can be preset in the system's setup. Normally, more intensity is desirable at the index fingers (cf. foveal vision) so that changes occurring at the outer (peripheral) fingers do not interfere with perceiving the object at the gaze centre.
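The finger-to-cell assignment described above (index fingers on the gazed cell, the three outer fingers of each hand on the three cells to either side) can be sketched as follows; the finger labels, grid representation and off-screen placeholder are illustrative assumptions, not the paper's implementation:

```python
# Map fingers to the cells around the gaze position: the index fingers
# read the gazed ("foveal") cell, while the middle, ring and pinky
# fingers of each hand read the three cells to either side
# ("peripheral"). Finger labels and the None placeholder for cells
# beyond the screen edge are illustrative.

FINGERS = ["L-pinky", "L-ring", "L-middle", "index",
           "R-middle", "R-ring", "R-pinky"]
OFFSETS = [-3, -2, -1, 0, 1, 2, 3]   # cell columns relative to the gaze

def finger_cells(row, col, grid):
    """Return {finger: cell_contents} for the gazed row of `grid`
    (a list of rows); cells beyond the screen edge read as None."""
    width = len(grid[row])
    return {
        finger: grid[row][col + off] if 0 <= col + off < width else None
        for finger, off in zip(FINGERS, OFFSETS)
    }
```

With a single-row grid `[["empty", "file", "folder", "empty", "app"]]` and the gaze on column 1, the index fingers would read "file", the right pinky "app", and the left pinky nothing (off-screen).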
This enables the user to scan the entire virtual screen more quickly to reveal the location of screen icons, controls and the mouse. Our experiments have shown that even though it can be difficult to simultaneously realise the haptic types of all the screen objects that are browsed over rapidly using this form of foveal perception, it enables the user to quickly form a mental occupancy map of the screen layout. The user can then gaze at the objects of interest directly to examine them in more detail with the central (foveal) perception and retrieve additional information via speech synthesis or Braille output if desired.

3.2 Interacting with the Grid
To assist inexperienced users in navigating the grid-based interface, an optional auditory beep cue is provided when the gaze moves from one cell to another. This is particularly useful when the user is scanning adjacent cells containing elements of the same type, since the haptic output will not change in this situation, making it difficult to perceive where one cell ends and the next begins. Our initial experiments with the interface also revealed that confusion could occur when the user's gaze travelled along cell borders. For example, Figure 7(a) shows a path that the user's gaze could take across the virtual screen and the order in which the cells are visited. If these cells contain different screen objects, the corresponding haptic responses would be similarly varying and confusing. To address this problem we introduced a threshold algorithm which prevented the gaze point from visiting a new cell until the gaze point moved more than one-quarter of a cell's width beyond the cell border (see Figure 7(b)). This measure enabled neighbouring cells to be differentiated and resolved more easily, improving the interface's usability.

Figure 7: Grid/gaze interaction methods

3.3 Vibro-Tactile Interface
At this stage our experiments have been conducted with an electro-tactile interface for providing haptic feedback to the user's fingers. Although this has proved effective, it requires electrodes to be fitted to the user's fingers and wires from the electrodes to be connected to the TENS unit. To overcome this requirement we are developing wireless TENS electrode assemblies that can be fitted to the hands (or elsewhere) with relative ease. We are also considering the use of vibro-tactile feedback as an alternative means of providing haptic feedback to the user. Figure 8 shows an artist's impression of one possible means of incorporating vibro-tactile feedback into a conventional computer keyboard. Here vibro-tactile actuators (or tactors) are used to provide feedback to the outer fingers, for providing peripheral perception, and a small Braille display is provided to inform the user of what is located at the gaze (or foveal) position via the pointer fingers. The mouse is implemented as a trackball that is controlled by the thumbs. The user can also translate (or morph) the mouse cursor to the gaze position by pressing a third mouse button. We expect to have our vibro-tactile interface and associated experimental results completed and published at a later stage.

Figure 8: Artist's impression of the haptic keyboard.

3.4 Gaze-Tracking System
Although a variety of gaze-tracking systems are commercially available, we found them all to be too computationally expensive or inaccurate for our application. Consequently, for this application we implemented an inexpensive and robust method for tracking the head position and orientation of the user by using a single low-cost USB camera and infrared light-emitting diodes concealed within spectacle frames worn by the user [14]. Unlike gaze and head pose tracking systems which rely on high-resolution stereo cameras and complex image processing hardware and software to find and track facial features on the user (e.g. [18, 19, 26]), our head pose tracking system is able to efficiently locate and track the head's orientation and distance relative to the camera with little processing. Due to the infrared light-emitting diodes having fixed geometry, the system does not have to contend with the varying facial features of different users and therefore does not require any calibration procedure or training to accommodate any user. Furthermore, this system is unaffected by varying lighting conditions and can be used in the dark.
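The cell-boundary threshold described in Section 3.2 amounts to a hysteresis rule: the reported cell changes only once the gaze moves more than a quarter-cell past the current cell's border. A hypothetical one-dimensional sketch (the paper does not give an implementation, and the class name is illustrative):

```python
# One-dimensional sketch of the Section 3.2 threshold algorithm: the
# reported cell switches only when the gaze coordinate moves more than
# a quarter-cell beyond the current cell's border, suppressing rapid
# flicker when the gaze travels along a boundary.

CELL = 1.0              # cell width (arbitrary units)
THRESHOLD = CELL / 4    # quarter-cell hysteresis band

class HystereticGaze:
    def __init__(self, x=0.0):
        self.cell = int(x // CELL)

    def update(self, x):
        """Report the current cell for gaze coordinate x."""
        left = self.cell * CELL
        right = left + CELL
        # Only leave the current cell once x is THRESHOLD past a border.
        if x < left - THRESHOLD or x > right + THRESHOLD:
            self.cell = int(x // CELL)
        return self.cell
```

Starting in cell 0, a gaze at x = 1.1 (only a tenth of a cell past the border) still reports cell 0, while x = 1.3 switches to cell 1; moving back to x = 0.9 keeps cell 1 until the gaze retreats past x = 0.75.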
The spectacles can be replaced by any other head-mounted object if desired. Our tests have shown that our head pose tracking system is accurate to within 0.5 degrees when the user is within one metre of the camera. This compares well with more expensive head tracking systems and proved adequate for our haptic gaze-tracking interface.

3.5 Learning Procedure
We found that a setup procedure and practice were necessary for new users to be able to use the haptic gaze-tracking user interface effectively. The setup procedure was necessary to ensure that the electro-tactile signals delivered to the fingers were considered comfortable and appropriate for the user. Once set up, these settings are able to be saved to a file and restored whenever the user logs in. Also, practice at identifying desktop icons and window controls with the peripheral output turned off was necessary in order to avoid any confusion from trying to interpret too much information simultaneously with the hands. Once the user was able to associate the different electro-tactile signals with the different icon types, controls and window layers, the peripheral fingers could be gradually turned on and utilised for perceiving screen content more quickly.

4 Experimental Results
Our experiments with the haptic gaze-tracking user interface have demonstrated that by browsing over the virtual screen and receiving haptic feedback in this manner, the user is able to perceive and remember the types and locations of screen icons, the mouse and any GUI controls that are located on the desktop (or within open windows) without any use of the eyes. Furthermore, by gazing at any icon, control or label momentarily, and by having the object's name spoken via synthetic speech, it was relatively easy for the user to perform tasks such as: finding and starting applications, locating a specific file or folder, or opening files or folders. The user could also perform drag-and-drop manipulation of such objects, such as moving screen icons onto another cell or into another window, by using the mouse with the hand or under gaze control.
When a window was opened to reveal an open folder or application window, its edges could be easily detected by the changes in haptic intensity when the user browsed over it. This also enabled the window controls and menus to be located with relative ease, and manipulation tasks to be performed, like resizing, closing or minimising windows by using the mouse under gaze or hand control.

Conclusion
With GUI-based computer applications used extensively in the workplace, for education and for leisure, the blind are becoming increasingly disadvantaged at performing many tasks on the computer that sighted users take for granted. Although efforts have been made to provide increased computer accessibility to the blind (e.g. Microsoft Active Accessibility [17] and JAWS for Windows [4]), many applications and subsequent occupations remain inaccessible to the blind. This paper presents a novel haptic gaze-tracking human-computer interface that enables the Windows desktop environment to be graphically perceived and accessed without any use of the eyes. Our experimental results have demonstrated that significant blind GUI perception and interactivity is able to be performed with our system, including point-and-click and drag-and-drop control of conventional GUI objects. To further develop this work, we intend to also apply this method for achieving blind GUI interactivity to typical GUI-based computer applications such as web browsers and word processors.

References
[1] Alexander, S. Blind Programmers Facing Windows. Computer World, November 2. Reprinted online by CNN: computing/9811/06/blindprog.idg/
[2] Chouvardas, V., Miliou, A. and Hatalis, M. Tactile Displays: a short overview and recent developments. 5th International Conference on Technology and Automation (October, Thessaloniki, Greece).
[3] Freedom Scientific. Focus Braille Display. products/ displays focus40-80.asp
[4] Freedom Scientific. Job Access With Speech (JAWS).
products/ software jaws.asp
[5] Fritz, J.P. and Barner, K.E. Design of a Haptic Visualization System for People with Visual Impairments. IEEE Transactions on Rehabilitation Engineering, vol. 7, no. 3, 1999.
[6] GW Micro. Window-Eyes. com/
[7] Hughes, R.G. and Forrest, A.R. Perceptualisation using a tactile mouse. In Proceedings of the 7th Conference on Visualization '96. Los Alamitos, CA, USA: IEEE Computer Society Press, 1996, pp. 181-ff.
[8] Ikei, Y., Wakamatsu, K. and Fukuda, S. Texture display for tactile sensation. In Proceedings of the Seventh International Conference on Human-Computer Interaction, ser. Virtual Reality, vol. 2, 1997.
[9] Immersion Corporation. iFeel Mouse. immersion.com
[10] Kaczmarek, K., Webster, J., Bach-y-Rita, P. and Tompkins, W. Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Transactions on Biomedical Engineering, vol. 38, no. 1, pp. 1-16.
[11] Kawai, Y. and Tomita, F. Interactive tactile display system: a support system for the visually disabled to recognize 3D objects. In Proceedings of the Second Annual ACM Conference on Assistive Technologies. ACM Press, 1996.
[12] Kieninger, T. The growing up of HyperBraille: an office workspace for blind people. In Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology. New York, NY, USA: ACM Press, 1996.
[13] Maucher, T., Meier, K. and Schemmel, J. The Heidelberg tactile vision substitution system. In Proceedings of the Sixth International Conference on Tactile Aids, Hearing Aids and Cochlear Implants.
[14] Meers, S., Ward, K. and Piper, I. Simple, robust and accurate head pose tracking using a single camera. Accepted in Proceedings of the Thirteenth Annual Conference on Mechatronics and Machine Vision in Practice (December, Toowoomba, Australia).
[15] Meers, S. and Ward, K. A vision system for providing 3D perception of the environment via transcutaneous electro-neural stimulation. In Proceedings of the 8th IEEE International Conference on Information Visualisation (July, London, UK), 2004.
[16] Meers, S. and Ward, K. A vision system for providing the blind with 3D colour perception of the environment. In Proceedings of the 2005 Asia-Pacific Workshop on Visual Information Processing (November, Hong Kong), 2005.
[17] Microsoft Corporation. Microsoft Active Accessibility. url=/library/en-us/dnanchor/html/accessibility.asp
[18] NaturalPoint Inc. TrackIR.
[19] Newman, R., Matsumoto, Y., Rougeaux, S. and Zelinsky, A. Real-time stereo tracking for head pose and gaze estimation. In Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition.
[20] Petrie, H., Morley, S., McNally, P., Graziana, P. and Emiliani, P. Access to hypermedia systems for blind people. Sensory Disabilities Research Unit.
[21] Roberts, J., Slattery, O. and Kardos, D. Rotating-wheel Braille display for continuous refreshable Braille. Society for Information Display Conference, Long Beach, California; 2000 SID International Symposium Digest of Technical Papers, vol. XXXI, May 2000.
[22] Siegfried, R. Teaching the blind to program visually. In Proceedings of ISECON 2004, Newport, December 2004.
[23] Sjöström, C. Designing haptic computer interfaces for blind people. In Proceedings of the Sixth IEEE International Symposium on Signal Processing and its Applications, Kuala Lumpur, Malaysia, August 13-16.
[24] Virtouch Ltd. Virtouch Imaging Applications. Jerusalem, Israel.
[25] Yobas, Y., Durand, D.M., Skebe, G.G., Lisy, F.J. and Huff, M.A. A novel integrable microvalve for refreshable Braille display system. Journal of Microelectromechanical Systems, vol. 12, no. 3, June 2003.
[26] Zhu, Z. and Ji, Q. Real time 3D face pose tracking from an uncalibrated camera. In First IEEE Workshop on Face Processing in Video, in conjunction with IEEE International Conference on Computer Vision and Pattern Recognition (CVPR 04) (June, Washington DC), 2004.
More information