CONSTRUCTING THE YELLOW BRICK ROAD: ROUTE BRICKS ON VIRTUAL TACTILE MAPS

Jochen Schneider 1)
In: R. Vollmar, R. Wagner (eds.), Proc. International Conference on Computers Helping People with Special Needs (ICCHP) 2000, Univ. of Karlsruhe, Germany, July 17-21. Wien: Österreichische Computer Gesellschaft, pp.

Abstract

Tactile maps are the standard medium for conveying geographical information to blind people. This paper presents an approach for an interactive substitute for tactile maps, called virtual tactile maps. It is based on an image processing system that tracks fingers and game-piece-like objects; output is given through speech and sound. Users learn routes by constructing them with route bricks.

1. Introduction

Independent mobility is a key element of individual freedom. Pedestrians who are blind have to prepare more thoroughly for a walk through an unfamiliar area than pedestrians who can see, because they cannot rely on seeing landmarks ahead of them en route from landmark to landmark. To orient themselves, blind people can use tactile maps, which present geographical features as elevations instead of the ink marks of printed maps. Unfortunately, tactile maps are still mostly made by hand and are therefore expensive and hard to obtain for a given area.

In this paper, methods and tools are presented that simulate the experience of tactile map usage with input and output components that can be reused for maps of different areas, without an actual tactile map. We propose virtual tactile maps as replacements for tactile maps. Virtual tactile maps are interactive presentations of digital map data for blind people (cf. [13]). The enhancement presented in this paper, coined route bricks, consists of game-piece-like objects to be placed on virtual tactile maps to facilitate the learning of routes.

This paper is organized as follows. In section 2, the virtual tactile map concept is introduced.
In section 3, related work in the field of supporting the mobility and orientation of blind people is described, covering both tactile maps and electronic travel aids. In section 4, a prototypical implementation of a virtual tactile map system is presented. Section 5 concludes with an outlook on future work.

1) Otto-von-Guericke-University of Magdeburg, Dept. of Computer Science, Inst. of Simulation and Graphics, P.O. Box 4120, Magdeburg, Germany, josch@isg.cs.uni-magdeburg.de
2. Virtual Tactile Maps

Users operate a virtual tactile map system through hand movements on a tactile grid, which provides position input and yields a tactile sensation. Data is presented acoustically through speech and sound, i.e., the tactile map does not exist per se, hence the name virtual tactile map. Virtual tactile maps enable blind and partially sighted users to explore an unknown geographical space, similar to the tactile maps they are inspired by and named after. Virtual tactile maps are electronic travel aids, more specifically orientation systems for larger space [6]. They are not a one-to-one translation of tactile maps to a digital system, but digital maps which the user can interact with in a special way. The interaction is done through movements of hands and special objects picked up by a video camera. Information is presented during the interaction as speech and sound. The orientation of the hands is facilitated through a tactile grid.

Apart from the technical challenge of implementing an image processing system reliable enough to be used independently by a user, the main challenge of the virtual tactile maps approach lies in the interaction. The first question is which movements the system is to interpret and how it maps them to the digital data. The next question is which information is to be chosen from the digital map data and how it is to be presented acoustically. The final question lies in the abstraction (cartographic generalization) of the map data so that it is suitable for exploration with hand and object movements, which are relatively coarse.

3. Related Work

3.1. Research on Tactile Maps

The carved maps made by the Inuit three hundred years ago can be considered forerunners of tactile maps: they are highly abstracted shapes and can be felt in the dark ([10], pp. 229). Today, tactile sheet maps offer blind people suitable access to geographical information [3].
The choice and tactual representation of the cartographic features to present constitute the main challenges in designing such a map ([4], pp. 206). Besides tactile maps for geography education, we distinguish tactile orientation, mobility and topological maps. Orientation maps provide a general overview of a certain area. Mobility maps are tailored towards travellers and include orientation points. Topological maps show a certain route; other detail is left out, and the presentation is simplified and distorted (ibid., pp. 194).

Tactile maps constitute a mature means of orientation and are in addition portable, so they can be carried along on trips. On the other hand, tactile map design is by no means standardized. This is one of the reasons why not all members of their target audience are able to use them successfully, apart from inscriptions being done in braille, which not all blind persons can read ([7], pp. 81-87). In addition, tactile maps are not readily available for all regions, because they are mostly made by hand. For these reasons, there have been efforts to enhance tactile maps through the help of computer systems.

3.2. Electronic Travel Aids

Tactile maps can be enhanced with sounds emitted by a computer: a tactile map is placed on a touch tablet, a digital equivalent of the map is loaded into the computer, and sound or text information on an object on the tactile map is emitted when the user presses the object and thereby the touch tablet [11]. With this approach, the exploration experience can be enhanced and braille labelling of features can be substituted with speech output, but a tactile map is still needed.

The MoBIC preparation system (MoPS) enables blind pedestrians to plan a walk through an urban area [12]. MoPS can be used both to explore maps and to select routes in them. Selected routes are transferred to an electronic travel aid, the MoBIC outdoor system (MoODS), which then guides the user during his walk outdoors. The user interacts with the digital map loaded into the preparation system through the cursor keys of a PC keyboard. The system gives him information on his current position through speech and braille. As opposed to virtual tactile maps, absolute spatial input is not supported.

The KnowWhere system is a hand gesture recognition system which conveys geographical information to blind people [8]. The system presents zoomable outlines of geographical objects by emitting a sound when the hand of a user touches the imaginary objects on a tactile grid. In a test, congenitally blind subjects explored country and state maps. They were able to find absolute positions and to recognize puzzle pieces of the objects afterwards.
Although both deal with presenting geographical objects to the same user group through similar modalities, KnowWhere and virtual tactile maps differ in scope: KnowWhere conveys large-scale geographical information emphasizing shapes, as an atlas does, whereas virtual tactile maps convey information specific to an urban area, including routes, as a street map does.

4. Prototype of a Virtual Tactile Map System

4.1. Design of a Prototype

Requirements for the design of orientation aids can be elicited by asking how blind pedestrians prepare themselves for a walk through an urban area not fully known to them. Blind people who wish to navigate independently have to memorize the layout of the given area, learn path segments and the angles between them, and recognize them while walking [5]. Therefore, a virtual tactile map system should provide blind people with an overview of a given area and teach them straight route segments and the angles at route succession choice points.
To implement the tactile map concept, there are no large tactile input/output devices available which blind people could use in the same manner as sighted people use mice and graphics screens (apart from research prototypes, see [14]). Therefore, a new device was created which tracks both a finger and small objects through image processing. Apart from the objects and a coloured ring for the finger tip, the device consists of a pad with a tactile grid and a camera on a tripod facing downward.

To let the user explore the map before learning a route, the system starts in free exploration mode. In this mode, a user can move his hands on a rectangular tactile grid which abstractly represents the map and let the system tell him which geographical features lie underneath his fingers. When the finger tip touches a geographical object such as a street or a building on the grid, information on this object (e.g., its name) is emitted through synthesized speech. In exploration mode, a user can choose a route by placing game-piece-like objects on the grid to select a start and an end point, after which the system calculates the path between them and switches to route learning mode.

The first prototype of a virtual tactile map system emitted sound in route learning mode when the index finger was close to the route: the pitch of the sound was raised when the finger moved closer to the end point and correspondingly lowered when it moved closer to the start point. The lateral distance to the nearest route segment was conveyed through stereo balance: when the finger tip was left of the nearest route segment, the balance was shifted to the left, and vice versa. In a first informal test by a congenitally blind subject, the exploration mode was found to be quite feasible, but the sound information did not suffice to keep the finger on the route (it did not enable tracing, cf. [1], pp. 312).
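The lookup at the heart of the free exploration mode can be sketched as follows. This is a simplifying illustration, not the prototype's actual data structures: feature shapes are reduced to axis-aligned boxes, and all names and coordinates are invented for the example.

```python
from dataclasses import dataclass

# Sketch of the free-exploration lookup: the tracked finger position is
# hit-tested against the loaded map features, and the name of the feature
# underneath the finger would be sent to the speech synthesizer (printed
# here). Feature shapes are simplified to axis-aligned boxes.

@dataclass
class Feature:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def feature_under_finger(features, x, y):
    """Return the first map feature under the finger position, or None."""
    for f in features:
        if f.contains(x, y):
            return f
    return None

features = [Feature("Main Street", 0, 4, 10, 5),
            Feature("Town Hall", 2, 6, 4, 8)]

hit = feature_under_finger(features, 3.0, 7.0)
if hit is not None:
    print(hit.name)  # -> Town Hall
```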
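The pitch-and-balance coding of the first prototype can be sketched as follows. The frequency range and the lateral scaling constant are assumptions; the paper specifies only the qualitative behaviour (pitch rises towards the end point, stereo balance follows the side of the nearest route segment).

```python
# Sketch of the first prototype's acoustic route coding (parameters assumed).

def route_sound(progress: float, lateral: float,
                f_lo: float = 220.0, f_hi: float = 880.0):
    """progress: 0.0 at the route start, 1.0 at the end point.
    lateral: signed distance to the nearest route segment in grid units
             (negative = finger is left of the segment).
    Returns (frequency in Hz, stereo balance in [-1.0, 1.0])."""
    progress = min(max(progress, 0.0), 1.0)
    frequency = f_lo + progress * (f_hi - f_lo)   # higher pitch nearer the end
    balance = min(max(lateral / 5.0, -1.0), 1.0)  # clamp into the stereo field
    return frequency, balance
```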
For the second prototype, a new interaction method was devised in which the road is actively built by the user: the user constructs the road with long game pieces of a few different sizes, called route bricks. The user selects the route as in the previous approach and then places one route brick after the other with the help of the system. The start of the first route brick is placed next to the piece symbolizing the beginning of the route, and the start of each consecutive route brick at the end of the respective previous one. The system leads the user to place a route brick at the correct angle through sound. The route is learned both by constructing it with the route bricks and by tracing the finished (tactile) route with the hand.

The first prototype was not able to let the user trace the route, since the sound coding used did not suffice to convey the two dimensions of point positions on the route. To direct a user to place a route brick at an arbitrary position on the map, even three dimensions would need to be conveyed: the centre position (two dimensions) and the rotation angle of the brick. Fortunately, if bricks are placed one after the other starting at the route start marker, the start position of each brick is already known to the user, since it is the end position of the last brick correctly placed (or a point on the border of the round route start marker). Therefore, only the one-dimensional angle has to be conveyed through sound. The angle between the current orientation of the route brick to place (with the end point of the last correctly placed brick as the rotation axis) and the direction of the current piece of the route is mapped to three chords at different volumes: one chord stands for deviation to the left, one for deviation to the right, and one for the correct placement of the brick.

[Figure 1: Interaction of the system's main components: image capturing, object recognition (fingers, markers), GIS (raw map data), acoustical output (sound, speech).]

The placing of route bricks is similar to an approach for assessing a child's mental map of a given route or environment by having the child construct a layout with the help of model houses and card road strips [2]. One fundamental difference is that in our approach, the user learns a new layout (as opposed to recalling one) and is guided by the computer in placing the objects. The approach is in accordance with the finding from psychology, described above, that blind people need information on the path segments of routes and the angles between them. Construction also leads to more involvement and more learning than passive perception (cf. [9], pp. 273 for the reverse lag in recognition vs. production). In the case of virtual tactile maps, constructing a route with route bricks should therefore lead to a better understanding of the route than just moving a hand on a tactile grid and listening to an acoustical encoding of the distance of the finger from the route as described above. Whether this is indeed true needs to be evaluated formally.

4.2. Implementation

The author has implemented virtual tactile maps with route bricks in a prototypical system.
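As part of such an implementation, the angle-to-chord mapping described in the previous section might look as follows. The chord names, the 5-degree tolerance, the volume scaling and the sign convention (positive deviation means the brick is rotated to the left) are all assumptions; the paper only states that the deviation is mapped to three chords at different volumes.

```python
# Sketch of the brick-angle sonification (chords, tolerance and scaling assumed).

CHORD_LEFT, CHORD_OK, CHORD_RIGHT = "c minor", "C major", "G major"  # hypothetical

def angle_deviation(brick_deg: float, target_deg: float) -> float:
    """Signed deviation in degrees, normalized to the range (-180, 180]."""
    d = (brick_deg - target_deg) % 360.0
    return d - 360.0 if d > 180.0 else d

def chord_for_brick(brick_deg: float, target_deg: float,
                    tolerance_deg: float = 5.0) -> tuple:
    """Map the brick's angular deviation to a (chord, volume) pair."""
    d = angle_deviation(brick_deg, target_deg)
    if abs(d) <= tolerance_deg:
        return CHORD_OK, 1.0            # correct placement
    volume = min(abs(d) / 90.0, 1.0)    # louder the further off
    return (CHORD_LEFT, volume) if d > 0 else (CHORD_RIGHT, volume)
```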
The system consists of modules for optical image tracking, the management of the digital map data, and sound and speech output (see figure 1). The image processing component takes images from the camera and extracts the marked finger, the route end markers and the route bricks on the pad. Acoustical output is
produced after matching the positions of the extracted objects with the GIS data and relating them to previous interactions (e.g., whether a route has been selected).

The image processing sub-system tracks objects in real time. The finger marker and the route end elements are discriminated through their respective colours; the route bricks all have the same colour and are discriminated through their positions. In order to let users successfully interact with the map with fingers and objects through image processing, four challenges had to be addressed: the presence of hands on the pad, which could be confused with an object; occlusion of objects by the hands; the user accidentally moving objects already placed correctly; and finally the need for route bricks to attach to their respective predecessors. Objects are segmented by colour; the hand is thereby ignored, since the object colours were chosen so that none of them is similar to skin colour. Each route brick can be placed without occluding it, because a route brick actually consists of two pieces on top of each other with vertical sticks connecting them, so that the brick can be moved by touching the lower piece without occluding the coloured upper piece. All objects to be placed have magnets underneath them, which prevent them from being moved accidentally on the metal pad. The magnets also make the bricks' ends attach to each other.

The current prototype of the virtual tactile map system is implemented as a standard GUI program under Windows NT (see figure 2 for a screenshot).

[Figure 2: Screenshot of the system with a visualisation of a map and a route.]

After a digital map file has been loaded, it is displayed in a graphics window. There are also windows to display the current camera image and the positions of the tracked objects (not shown). The visual data display enables a blind and a sighted user to
cooperate in using the program (and facilitates development). In order to demonstrate and test the prototype on systems without image processing hardware, graphical representations of the current finger position and the route selection elements can be manipulated with the mouse, which leads to the same acoustical output as interaction through the image recognition system.

The program deals with three coordinate systems: one for the digital map, one for the image buffer, and finally one for the visualization of the map and the current interaction in a GUI window. To translate between these coordinate systems, special translation objects called coordinate adaptors are used. Coordinate adaptors are also responsible for translating between coordinate systems of different orientation. In order to keep relative distances as accurate as possible, the aspect ratio of the map is preserved both in the mapping from the image buffer coordinates (specifying the positions of tracked fingers and objects) and in the visual display of the map.

Once a route has been selected by placing the two route end elements on the grid, the system finds the route and simplifies it by merging route segments which are connected at angles close to 180°. The route of merged segments is then scaled so that it starts and ends on the border of the route start and end element, respectively. The scaling is done by an additional coordinate adaptor. The system then calculates the sizes of the route bricks needed to build the route. Generally, it cannot be assumed that a straight line segment can be filled exactly with bricks. Therefore, the segments have to be refitted to a length which is a multiple of the brick length. In the current implementation of the system, all route bricks have the same length.

5. Future Work

Although having only one brick size makes it easier for users to pick a brick to be placed, it makes the brick road less accurate compared to the route and harder to build.
A road built with bricks of only one size is less accurate if the bricks are relatively large, because the error for each straight road segment is at worst half the size of the smallest brick. Therefore, bricks of additional sizes will be used in the next incarnation of the system. The connection of bricks to each other and to the road end markers is not satisfactory in the current setup. Therefore, a new brick design will be devised through which bricks connect to each other mechanically, while still allowing the last brick to be rotated. Once this is done, the virtual tactile map approach will be evaluated formally.
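The colour-keyed object segmentation described in section 4.2 can be sketched as follows. The reference colours, the distance threshold and the nearest-colour classification are assumptions; the paper states only that objects are segmented by colour and that the object colours were chosen to be dissimilar to skin colour.

```python
# Sketch of colour-keyed pixel classification: each pixel is assigned the
# label of the nearest reference colour, and pixels far from every reference
# colour (background, skin) are ignored. All values are illustrative.

def classify_pixel(rgb, palette, max_dist=80.0):
    """palette: {"finger": (r, g, b), "route_end": ..., "brick": ...}.
    Returns the label of the nearest palette colour within max_dist,
    or None for background and hands."""
    best_label, best_d = None, max_dist
    for label, ref in palette.items():
        d = sum((a - b) ** 2 for a, b in zip(rgb, ref)) ** 0.5
        if d < best_d:
            best_label, best_d = label, d
    return best_label

palette = {"finger": (0, 0, 255), "route_end": (255, 0, 255), "brick": (0, 255, 0)}
```

A skin-toned pixel such as (210, 170, 140) is far from all three reference colours and is therefore classified as None, so hands on the pad do not register as objects.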
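The route post-processing described in section 4.2 can be sketched as follows. The 10-degree merge threshold and the brick length are assumptions; the paper states only that segments joined at angles close to 180° are merged and that segment lengths are refitted to multiples of the brick length.

```python
import math

# Sketch of the route simplification and brick refitting (parameters assumed).

def merge_straight(points, min_turn_deg=10.0):
    """Drop interior points where the route continues almost straight on,
    i.e. where the direction changes by less than min_turn_deg."""
    out = [points[0]]
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        a1 = math.atan2(cur[1] - prev[1], cur[0] - prev[0])
        a2 = math.atan2(nxt[1] - cur[1], nxt[0] - cur[0])
        turn = abs(math.degrees(a2 - a1)) % 360.0
        turn = min(turn, 360.0 - turn)
        if turn >= min_turn_deg:
            out.append(cur)
    out.append(points[-1])
    return out

def refit_length(length, brick_len=2.0):
    """Refit a segment length to the nearest positive multiple of brick_len."""
    return max(round(length / brick_len), 1) * brick_len
```

With these helpers, a route such as [(0, 0), (1, 0), (2, 0), (2, 2)] collapses to [(0, 0), (2, 0), (2, 2)], and a 4.6-unit segment would be built from two 2.0-unit bricks.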
6. References

[1] BENTZEN, B.L., Orientation Aids, in: R.L. Welsch, B.B. Blasch (eds.), Foundations of Orientation and Mobility, New York.
[2] BLADES, M., Research Paradigms and Methodologies for Investigating Children's Wayfinding, in: N. Foreman, R. Gillet (eds.), Handbook of Spatial Research Paradigms and Methodologies, Vol. 1, East Sussex.
[3] BRAMBRING, M., C. WEBER, Taktile, verbale und motorische Informationen zur geographischen Orientierung Blinder, Zeitschrift für experimentelle und angewandte Psychologie, Vol. 28 (1981).
[4] EDMAN, P.K., Tactile Graphics, New York.
[5] GOLLEDGE, R.G., R.L. KLATZKY, J.M. LOOMIS, Cognitive Mapping and Wayfinding by Adults Without Vision, in: J. Portugali (ed.), The Construction of Cognitive Maps, Dordrecht.
[6] JANSSON, G., Spatial Orientation and Mobility of the Visually Impaired, in: B. Silverstone, M.A. Lang, B. Rosenthal, E.E. Faye (eds.), The Lighthouse Handbook on Visual Impairment and Rehabilitation, New York (in press).
[7] HOLMES, E., R. MICHEL, A. RAAB, Computerunterstützte Erkundung digitaler Karten durch Sehbehinderte, in: W. Laufenberg, J. Lötzsch (eds.), Taktile Medien: Kolloquium über tastbare Abbildungen für Blinde, Freital/Dresden.
[8] KRUEGER, M.W., D. GILDEN, KnowWhere: an Audio/Spatial Interface for Blind People, in: Proc. ICAD '97, Palo Alto.
[9] MILLAR, S., Reading by Touch, London, New York.
[10] PAPANEK, V., The Green Imperative: Natural Design for the Real World, New York.
[11] PARKES, D., Nomad: an Audio-Tactile Tool for the Acquisition, Use and Management of Spatially Distributed Information by Visually Impaired People, in: A.F. Tatham, A.G. Dodds (eds.), Proc. Second International Symposium on Maps and Graphics for Visually Handicapped People, London.
[12] PETRIE, H., V. JOHNSON, TH. STROTHOTTE, A. RAAB, S. FRITZ, R. MICHEL, MoBIC: Designing a Travel Aid for Blind and Elderly People, The Journal of Navigation, Vol. 49, No. 1 (1996).
[13] SCHNEIDER, J., TH. STROTHOTTE, Virtual Tactile Maps, in: H.-J. Bullinger, J. Ziegler (eds.), Human-Computer Interaction: Ergonomics and User Interfaces, Proc. HCI Int'l '99, Vol. 1, Mahwah, NJ & London.
[14] SCHWEIKHARDT, W., Interaktives Erkunden von Graphiken durch Blinde, in: H.-J. Bullinger (ed.), Proc. Software-Ergonomie '85, Stuttgart 1985.
More informationSpecification of symbols used on Audio-Tactile Maps for individuals with blindness
Specification of symbols used on Audio-Tactile Maps for individuals with blindness D2.3 Production of AT-Maps Prepare by : Contributors Konstantinos Charitakis All partners Work Package : No 2 Email: Form:
More informationA contemporary interactive computer game for visually impaired teens
Interactive Computer Game for Visually Impaired Teens Boonsit Yimwadsana, et al. A contemporary interactive computer game for visually impaired teens Boonsit Yimwadsana, Phakin Cheangkrachange, Kamchai
More informationUNIT 5a STANDARD ORTHOGRAPHIC VIEW DRAWINGS
UNIT 5a STANDARD ORTHOGRAPHIC VIEW DRAWINGS 5.1 Introduction Orthographic views are 2D images of a 3D object obtained by viewing it from different orthogonal directions. Six principal views are possible
More informationPerception of room size and the ability of self localization in a virtual environment. Loudspeaker experiment
Perception of room size and the ability of self localization in a virtual environment. Loudspeaker experiment Marko Horvat University of Zagreb Faculty of Electrical Engineering and Computing, Zagreb,
More informationVision: How does your eye work? Student Advanced Version Vision Lab - Overview
Vision: How does your eye work? Student Advanced Version Vision Lab - Overview In this lab, we will explore some of the capabilities and limitations of the eye. We will look Sight at is the one extent
More informationRV - AULA 05 - PSI3502/2018. User Experience, Human Computer Interaction and UI
RV - AULA 05 - PSI3502/2018 User Experience, Human Computer Interaction and UI Outline Discuss some general principles of UI (user interface) design followed by an overview of typical interaction tasks
More informationCHAPTER 1. INTRODUCTION 16
1 Introduction The author s original intention, a couple of years ago, was to develop a kind of an intuitive, dataglove-based interface for Computer-Aided Design (CAD) applications. The idea was to interact
More informationAnalyzing Situation Awareness During Wayfinding in a Driving Simulator
In D.J. Garland and M.R. Endsley (Eds.) Experimental Analysis and Measurement of Situation Awareness. Proceedings of the International Conference on Experimental Analysis and Measurement of Situation Awareness.
More informationEvaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface
Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface Xu Zhao Saitama University 255 Shimo-Okubo, Sakura-ku, Saitama City, Japan sheldonzhaox@is.ics.saitamau.ac.jp Takehiro Niikura The University
More informationiwindow Concept of an intelligent window for machine tools using augmented reality
iwindow Concept of an intelligent window for machine tools using augmented reality Sommer, P.; Atmosudiro, A.; Schlechtendahl, J.; Lechler, A.; Verl, A. Institute for Control Engineering of Machine Tools
More informationFeelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces
Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces Katrin Wolf Telekom Innovation Laboratories TU Berlin, Germany katrin.wolf@acm.org Peter Bennett Interaction and Graphics
More informationIntroduction to Virtual Reality (based on a talk by Bill Mark)
Introduction to Virtual Reality (based on a talk by Bill Mark) I will talk about... Why do we want Virtual Reality? What is needed for a VR system? Examples of VR systems Research problems in VR Most Computers
More informationEnabling Cursor Control Using on Pinch Gesture Recognition
Enabling Cursor Control Using on Pinch Gesture Recognition Benjamin Baldus Debra Lauterbach Juan Lizarraga October 5, 2007 Abstract In this project we expect to develop a machine-user interface based on
More information1.G.1 Distinguish between defining attributes. Build and draw shapes that possess K.G.3 Identify shapes as 2-D (flat) or 3-D (solid)
Identify and describe shapes, including squares, circles, triangles, rectangles, hexagons, cubes, cones, cylinders, and spheres (Standards K.G.1 3). Standard K.G.1 Describe objects in the environment using
More informationNodal Ninja SPH-1 User Manual
Nodal Ninja SPH-1 User Manual Nodal Ninja SPH-1 is a professional spherical bracket (360 degree pano bracket) for taking panoramic still images or virtual tours. It supports cameras with tripod mount under
More informationHaptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces
In Usability Evaluation and Interface Design: Cognitive Engineering, Intelligent Agents and Virtual Reality (Vol. 1 of the Proceedings of the 9th International Conference on Human-Computer Interaction),
More informationRealtime 3D Computer Graphics Virtual Reality
Realtime 3D Computer Graphics Virtual Reality Virtual Reality Input Devices Special input devices are required for interaction,navigation and motion tracking (e.g., for depth cue calculation): 1 WIMP:
More informationTele-Nursing System with Realistic Sensations using Virtual Locomotion Interface
6th ERCIM Workshop "User Interfaces for All" Tele-Nursing System with Realistic Sensations using Virtual Locomotion Interface Tsutomu MIYASATO ATR Media Integration & Communications 2-2-2 Hikaridai, Seika-cho,
More informationpetra. 5
Haptic Paintings Using rapid prototyping technologies to grant visually impaired persons access to paintings, sculptures, graphics and architecture Volker Koch 1, Angelika J. Lückert 2, Thorsten Schwarz³,
More informationMeasuring in Centimeters
MD2-3 Measuring in Centimeters Pages 179 181 Standards: 2.MD.A.1 Goals: Students will measure pictures of objects in centimeters using centimeter cubes and then a centimeter ruler. Prior Knowledge Required:
More informationInfographics at CDC for a nonscientific audience
Infographics at CDC for a nonscientific audience A Standards Guide for creating successful infographics Centers for Disease Control and Prevention Office of the Associate Director for Communication 03/14/2012;
More informationShopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction
Shopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction Minghao Cai 1(B), Soh Masuko 2, and Jiro Tanaka 1 1 Waseda University, Kitakyushu, Japan mhcai@toki.waseda.jp, jiro@aoni.waseda.jp
More informationChapter 3: Assorted notions: navigational plots, and the measurement of areas and non-linear distances
: navigational plots, and the measurement of areas and non-linear distances Introduction Before we leave the basic elements of maps to explore other topics it will be useful to consider briefly two further
More informationLeading the Agenda. Everyday technology: A focus group with children, young people and their carers
Leading the Agenda Everyday technology: A focus group with children, young people and their carers March 2018 1 1.0 Introduction Assistive technology is an umbrella term that includes assistive, adaptive,
More informationCOGNITIVE MODEL OF MOBILE ROBOT WORKSPACE
COGNITIVE MODEL OF MOBILE ROBOT WORKSPACE Prof.dr.sc. Mladen Crneković, University of Zagreb, FSB, I. Lučića 5, 10000 Zagreb Prof.dr.sc. Davor Zorc, University of Zagreb, FSB, I. Lučića 5, 10000 Zagreb
More informationDrawing on Your Memory
Level: Beginner to Intermediate Flesch-Kincaid Grade Level: 11.0 Flesch-Kincaid Reading Ease: 46.5 Drawspace Curriculum 2.2.R15-6 Pages and 8 Illustrations Drawing on Your Memory Techniques for seeing
More informationHUMAN COMPUTER INTERFACE
HUMAN COMPUTER INTERFACE TARUNIM SHARMA Department of Computer Science Maharaja Surajmal Institute C-4, Janakpuri, New Delhi, India ABSTRACT-- The intention of this paper is to provide an overview on the
More informationt t t rt t s s tr t Manuel Martinez 1, Angela Constantinescu 2, Boris Schauerte 1, Daniel Koester 1, and Rainer Stiefelhagen 1,2
t t t rt t s s Manuel Martinez 1, Angela Constantinescu 2, Boris Schauerte 1, Daniel Koester 1, and Rainer Stiefelhagen 1,2 1 r sr st t t 2 st t t r t r t s t s 3 Pr ÿ t3 tr 2 t 2 t r r t s 2 r t ts ss
More informationVision: How does your eye work? Student Version
Vision: How does your eye work? Student Version In this lab, we will explore some of the capabilities and limitations of the eye. We will look Sight is one at of the extent five senses of peripheral that
More informationModaDJ. Development and evaluation of a multimodal user interface. Institute of Computer Science University of Bern
ModaDJ Development and evaluation of a multimodal user interface Course Master of Computer Science Professor: Denis Lalanne Renato Corti1 Alina Petrescu2 1 Institute of Computer Science University of Bern
More informationModule 8. Lecture-1. A good design is the best possible visual essence of the best possible something, whether this be a message or a product.
Module 8 Lecture-1 Introduction to basic principles of design using the visual elements- point, line, plane and volume. Lines straight, curved and kinked. Design- It is mostly a process of purposeful visual
More informationINDE/TC 455: User Interface Design
INDE/TC 455: User Interface Design Module 13.0 Interface Technology 1 Three more interface considerations What is the best allocation of responsibility between the human and the tool? What is the best
More informationHuman Computer Interaction (HCI, HCC)
Human Computer Interaction (HCI, HCC) AN INTRODUCTION Human Computer Interaction Why are we here? It may seem trite, but user interfaces matter: For efficiency, for convenience, for accuracy, for success,
More informationCONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM
CONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM Aniket D. Kulkarni *1, Dr.Sayyad Ajij D. *2 *1(Student of E&C Department, MIT Aurangabad, India) *2(HOD of E&C department, MIT Aurangabad, India) aniket2212@gmail.com*1,
More information3D Interaction Techniques
3D Interaction Techniques Hannes Interactive Media Systems Group (IMS) Institute of Software Technology and Interactive Systems Based on material by Chris Shaw, derived from Doug Bowman s work Why 3D Interaction?
More informationOcclusion based Interaction Methods for Tangible Augmented Reality Environments
Occlusion based Interaction Methods for Tangible Augmented Reality Environments Gun A. Lee α Mark Billinghurst β Gerard J. Kim α α Virtual Reality Laboratory, Pohang University of Science and Technology
More informationSanta s Workshop? What s Inside. Feel & Find Tactile Discrimination n Game. Version 2 for Older Children.
What s Inside Workshop? Feel & Find Tactile Discrimination n Game Version 2 for Older Children Graphics by Krista Wallden http://www.teacherspayteachers.com/store/krista-wallden Border Ink n Little Things
More informationTHE HEIDELBERG TACTILE VISION SUBSTITUTION SYSTEM
Paper presented at the 6th International Conference on Tactile Aids, Hearing Aids and Cochlear Implants, ISAC2000, Exeter, May 2000 and at the International Conference on Computers Helping People with
More informationMagnusson, Charlotte; Rassmus-Gröhn, Kirsten; Szymczak, Delphine
Show me the direction how accurate does it have to be? Magnusson, Charlotte; Rassmus-Gröhn, Kirsten; Szymczak, Delphine Published: 2010-01-01 Link to publication Citation for published version (APA): Magnusson,
More informationRealtime 3D Computer Graphics Virtual Reality
Realtime 3D Computer Graphics Virtual Reality Marc Erich Latoschik AI & VR Lab Artificial Intelligence Group University of Bielefeld Virtual Reality (or VR for short) Virtual Reality (or VR for short)
More informationIED Detailed Outline. Unit 1 Design Process Time Days: 16 days. An engineering design process involves a characteristic set of practices and steps.
IED Detailed Outline Unit 1 Design Process Time Days: 16 days Understandings An engineering design process involves a characteristic set of practices and steps. Research derived from a variety of sources
More informationComparing Two Haptic Interfaces for Multimodal Graph Rendering
Comparing Two Haptic Interfaces for Multimodal Graph Rendering Wai Yu, Stephen Brewster Glasgow Interactive Systems Group, Department of Computing Science, University of Glasgow, U. K. {rayu, stephen}@dcs.gla.ac.uk,
More informationHaptic Camera Manipulation: Extending the Camera In Hand Metaphor
Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Joan De Boeck, Karin Coninx Expertise Center for Digital Media Limburgs Universitair Centrum Wetenschapspark 2, B-3590 Diepenbeek, Belgium
More informationUsing Haptic Cues to Aid Nonvisual Structure Recognition
Using Haptic Cues to Aid Nonvisual Structure Recognition CAROLINE JAY, ROBERT STEVENS, ROGER HUBBOLD, and MASHHUDA GLENCROSS University of Manchester Retrieving information presented visually is difficult
More informationCreating User Experience by novel Interaction Forms: (Re)combining physical Actions and Technologies
Creating User Experience by novel Interaction Forms: (Re)combining physical Actions and Technologies Bernd Schröer 1, Sebastian Loehmann 2 and Udo Lindemann 1 1 Technische Universität München, Lehrstuhl
More informationOcclusion-Aware Menu Design for Digital Tabletops
Occlusion-Aware Menu Design for Digital Tabletops Peter Brandl peter.brandl@fh-hagenberg.at Jakob Leitner jakob.leitner@fh-hagenberg.at Thomas Seifried thomas.seifried@fh-hagenberg.at Michael Haller michael.haller@fh-hagenberg.at
More information3D User Interaction CS-525U: Robert W. Lindeman. Intro to 3D UI. Department of Computer Science. Worcester Polytechnic Institute.
CS-525U: 3D User Interaction Intro to 3D UI Robert W. Lindeman Worcester Polytechnic Institute Department of Computer Science gogo@wpi.edu Why Study 3D UI? Relevant to real-world tasks Can use familiarity
More informationCorrelation of Nelson Mathematics 2 to The Ontario Curriculum Grades 1-8 Mathematics Revised 2005
Correlation of Nelson Mathematics 2 to The Ontario Curriculum Grades 1-8 Mathematics Revised 2005 Number Sense and Numeration: Grade 2 Section: Overall Expectations Nelson Mathematics 2 read, represent,
More informationOptical Performance of Nikon F-Mount Lenses. Landon Carter May 11, Measurement and Instrumentation
Optical Performance of Nikon F-Mount Lenses Landon Carter May 11, 2016 2.671 Measurement and Instrumentation Abstract In photographic systems, lenses are one of the most important pieces of the system
More informationHaptic Abilities of Freshman Engineers as Measured by the Haptic Visual Discrimination Test
a u t u m n 2 0 0 3 Haptic Abilities of Freshman Engineers as Measured by the Haptic Visual Discrimination Test Nancy E. Study Virginia State University Abstract The Haptic Visual Discrimination Test (HVDT)
More informationDevelopment of Synchronized CUI and GUI for Universal Design Tactile Graphics Production System BPLOT3
Development of Synchronized CUI and GUI for Universal Design Tactile Graphics Production System BPLOT3 Mamoru Fujiyoshi 1, Akio Fujiyoshi 2,AkikoOsawa 1, Yusuke Kuroda 3, and Yuta Sasaki 3 1 National Center
More informationVocational Training with Combined Real/Virtual Environments
DSSHDUHGLQ+-%XOOLQJHU -=LHJOHU(GV3URFHHGLQJVRIWKHWK,QWHUQDWLRQDO&RQIHUHQFHRQ+XPDQ&RPSXWHU,Q WHUDFWLRQ+&,0 QFKHQ0DKZDK/DZUHQFH(UOEDXP9RO6 Vocational Training with Combined Real/Virtual Environments Eva
More informationLocation and navigation system for visually impaired
Česky Paper: # 8/11/2002 ISSN 1213-161X Content Location and navigation system for visually impaired Václav Eksler *), Genevičve Baudoin *)), Martine Villegas *)) Department of Telecommunications Faculty
More information