CONSTRUCTING THE YELLOW BRICK ROAD: ROUTE BRICKS ON VIRTUAL TACTILE MAPS. Jochen Schneider 1)


In: R. Vollmar, R. Wagner (eds.), Proc. International Conference on Computers Helping People with Special Needs (ICCHP) 2000, Univ. of Karlsruhe, Germany, July 17-21, 2000. Wien: Österreichische Computer Gesellschaft, pp. 641-648.

Abstract

Tactile maps are the standard medium for conveying geographical information to blind people. This paper presents an approach to an interactive substitute for tactile maps, called virtual tactile maps. It is based on an image processing system that tracks fingers and game-piece-like objects; output is given through speech and sound. Users learn routes by constructing them with route bricks.

1. Introduction

Independent mobility is a key element of individual freedom. Pedestrians who are blind must prepare more thoroughly for a walk through an unfamiliar area than pedestrians who can see, because blind people cannot rely on seeing landmarks ahead of them en route from landmark to landmark. To orient themselves, blind people can use tactile maps, which present geographical features as elevations instead of the ink marks of printed maps. Unfortunately, tactile maps are still mostly made by hand and are therefore expensive and hard to obtain for a given area. This paper presents methods and tools that simulate the experience of using a tactile map with input and output components that can serve maps of different areas, without requiring an actual tactile map. We propose virtual tactile maps as replacements for tactile maps: interactive presentations of digital map data for blind people (cf. [13]). The enhancement presented in this paper, coined route bricks, consists of game-piece-like objects placed on virtual tactile maps to facilitate the learning of routes.

This paper is organized as follows. Section 2 introduces the virtual tactile map concept. Section 3 describes related work on supporting the mobility and orientation of blind people, covering both tactile maps and electronic travel aids. Section 4 presents a prototypical implementation of a virtual tactile map system. Section 5 concludes with an outlook on future work.

1) Otto-von-Guericke-University of Magdeburg, Dept. of Computer Science, Inst. of Simulation and Graphics, P.O. Box 4120, 39016 Magdeburg, Germany, e-mail: josch@isg.cs.uni-magdeburg.de

2. Virtual Tactile Maps

Users operate a virtual tactile map system through hand movements on a tactile grid, which both provides position input and yields a tactile sensation. Data is presented acoustically through speech and sound, i.e., the tactile map does not exist per se, hence the name virtual tactile map. Virtual tactile maps enable blind and partially sighted users to explore an unknown geographical space, similar to the tactile maps they are inspired by and named after. Virtual tactile maps are electronic travel aids, more specifically orientation systems for larger spaces [6]. They are not a one-to-one translation of tactile maps to a digital system, but digital maps that the user interacts with in a special way. The interaction is done through movements of hands and special objects picked up by a video camera; information is presented during the interaction as speech and sound. The orientation of the hands is supported by a tactile grid.

Apart from the technical challenge of implementing an image processing system reliable enough to be used independently, the main challenge of the virtual tactile map approach lies in the interaction. The first question is which movements the system should interpret and how it maps them to the digital data. The next question is which information to choose from the digital map data and how to present it acoustically. The final question concerns the abstraction (cartographic generalization) of the map data so that it suits exploration through hand and object movements, which are relatively coarse.

3. Related Work

3.1. Research on Tactile Maps

The carved maps made by the Inuit three hundred years ago can be considered forerunners of tactile maps: they are highly abstracted shapes and can be felt in the dark ([10], pp. 229). Today, tactile sheet maps offer blind people suitable access to geographical information [3].
The choice and tactual representation of the cartographic features to present constitute the main challenges in designing such a map ([4], pp. 206). Besides tactile maps for geography education, we distinguish tactile orientation, mobility and topological maps. Orientation maps provide a general overview of a certain area. Mobility maps are tailored towards travellers and include orientation points. Topological maps show a certain route; other detail is left out, and the presentation is simplified and distorted (ibid., pp. 194). Tactile maps are a mature means of orientation and, being portable, can be carried along on trips. On the other hand, tactile map design is by no means standardized. This is one reason why not all members of the target audience can use tactile maps successfully, quite apart from inscriptions in braille, which not all blind persons can read ([7], pp. 81-87). In addition, tactile maps are not readily available for all regions, because they are mostly made by hand. For these reasons, there have been efforts to enhance tactile maps with the help of computer systems.

3.2. Electronic Travel Aids

Tactile maps can be enhanced with computer-emitted sounds by placing a tactile map on a touch tablet, loading a digital equivalent of the map into the computer, and emitting sound or text information about an object on the map when the user presses on it and therefore on the touch tablet [11]. This approach enriches the exploration experience and lets speech output substitute for braille labelling, but a tactile map is still needed. The MoBIC preparation system (MoPS) enables blind pedestrians to plan a walk through an urban area [12]. MoPS can be used both to explore maps and to select routes in them. Selected routes are transferred to an electronic travel aid, the MoBIC outdoor system (MoODS), which then guides the user during the walk outdoors. The user interacts with the digital map loaded into the preparation system through the cursor keys of a PC keyboard; the system reports the current position through speech and braille. As opposed to virtual tactile maps, absolute spatial input is not supported. The KnowWhere system is a hand gesture recognition system that conveys geographical information to blind people [8]. The system presents zoomable outlines of geographical objects by emitting a sound when the user's hand touches the imaginary objects on a tactile grid. In a test, congenitally blind subjects explored country and state maps; they were able to find absolute positions and to recognize puzzle pieces of the objects afterwards.
Although both deal with presenting geographical objects to the same user group through similar modalities, KnowWhere and virtual tactile maps differ in scope: KnowWhere conveys large-scale geographical information emphasizing shapes, as an atlas does, whereas virtual tactile maps convey information specific to an urban area, including routes, as a street map does.

4. Prototype of a Virtual Tactile Map System

4.1. Design of a Prototype

Requirements for the design of orientation aids can be elicited by asking how blind pedestrians prepare for a walk through an urban area not fully known to them. Blind people who wish to navigate independently have to memorize the layout of the given area, learn path segments and the angles between them, and recognize them while walking [5]. Therefore, a virtual tactile map system should provide blind people with an overview of a given area and teach them straight route segments and the angles at route succession choice points.
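To make the segment-and-angle requirement concrete, here is a minimal sketch (in Python; the function name and coordinate conventions are my own, not part of the paper) that derives straight segment lengths and the turn angle at each choice point from a polyline route:

```python
import math

def route_segments_and_angles(points):
    """Decompose a polyline route into straight segment lengths and the
    turn angle (degrees) at each intermediate choice point.
    Convention (assumed): positive = left turn, negative = right turn."""
    lengths = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        lengths.append(math.hypot(x1 - x0, y1 - y0))
    turns = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        h_in = math.atan2(y1 - y0, x1 - x0)   # incoming heading
        h_out = math.atan2(y2 - y1, x2 - x1)  # outgoing heading
        turn = math.degrees(h_out - h_in)
        turn = (turn + 180) % 360 - 180       # normalize to (-180, 180]
        turns.append(turn)
    return lengths, turns

# A route going east, then north: two segments, one 90-degree left turn.
lengths, turns = route_segments_and_angles([(0, 0), (4, 0), (4, 3)])
```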

To implement the tactile map concept, there are no large tactile input/output devices available that blind people could use in the same manner as sighted people use mice and graphics screens (apart from research prototypes, see [14]). Therefore, a new device was created which tracks both a finger and small objects through image processing. Apart from the objects and a coloured ring for the finger tip, the device consists of a pad with a tactile grid and a camera on a tripod facing downward. To let the user explore the map before learning a route, the system starts in free exploration mode. In this mode, the user can move his hands on a rectangular tactile grid which abstractly represents the map and have the system tell him which geographical features lie underneath his fingers. When the finger tip touches a geographical object such as a street or a building on the grid, information about this object (e.g., its name) is emitted through synthesized speech. In exploration mode, the user can choose a route by placing game-piece-like objects on the grid to select a start and an end point, after which the system calculates the path between them and switches to route learning mode. The first prototype of a virtual tactile map system emitted sound in route learning mode when the index finger was close to the route: the pitch of the sound was raised as the finger moved closer to the end point and lowered as it moved closer to the start point. The lateral distance to the nearest route segment was conveyed through stereo balance: when the finger tip was left of the nearest route segment, the balance was shifted to the left, and vice versa. In a first informal test with a congenitally blind subject, the exploration mode was found to be quite feasible, but the sound information did not suffice to keep the finger on the route (it did not enable tracing, cf. [1], pp. 312).
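The first prototype's sound coding might be sketched as follows. The frequency range, the offset scale and the function name are assumptions chosen for illustration, not the original parameters:

```python
def route_feedback(progress, lateral_offset, max_offset=0.05):
    """Map finger position relative to the route to sound parameters:
    pitch rises with progress toward the end point, stereo balance
    reflects lateral offset from the nearest route segment.
    progress: 0.0 (start point) .. 1.0 (end point).
    lateral_offset: signed distance in map units (assumed scale);
    negative means the finger is left of the route.
    Returns (pitch_hz, balance) with balance in [-1.0, 1.0]."""
    # Linear pitch ramp over one octave, 220 Hz .. 440 Hz (assumed range).
    pitch_hz = 220.0 + 220.0 * max(0.0, min(1.0, progress))
    # Balance: -1.0 = fully left, +1.0 = fully right, clamped.
    balance = max(-1.0, min(1.0, lateral_offset / max_offset))
    return pitch_hz, balance
```

A finger halfway along the route and exactly on it would yield a centred 330 Hz tone; drifting left shifts the balance leftward, matching the behaviour described above.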
For the second prototype, a new interaction method was devised in which the route is actively built by the user: the user constructs the route with long game pieces of a few different sizes, called route bricks. The user selects the route as in the previous approach and then places one route brick after the other with the help of the system: the start of the first route brick goes next to the piece symbolizing the beginning of the route, and the start of each subsequent route brick at the end of the previous one. The system guides the user through sound to place each route brick at the correct angle. The route is learned both by constructing it with the route bricks and by tracing the finished (tactile) route with the hand. The first prototype could not let the user trace the route, since the sound coding used did not suffice to convey the two dimensions of point positions on the route. To direct a user to place a route brick arbitrarily on the map, even three dimensions would need to be conveyed: the centre position (two dimensions) and the rotation angle of the brick. Fortunately, if bricks are placed one after the other starting at the route start marker, the start position of each brick is already known to the user, since it is the end position of the last brick correctly placed (or a point on the border of the round route start marker). Therefore, only the one-dimensional angle has to be conveyed through sound. The angle between the current position of the route brick to place (with the end point of the last correctly placed brick as the rotation axis) and the angle of the current piece of the route is mapped to three chords in different volumes: one chord stands for deviation to the left, one for deviation to the right, and one for correct placement of the brick.

The placing of route bricks is similar to an approach for assessing a child's mental map of a given route or environment by having the child construct a layout with model houses and card road strips [2]. One fundamental difference is that in our approach the user learns a new layout (as opposed to recalling one) and is guided by the computer in placing objects. The approach accords with the finding from psychology, described above, that blind people need information on the path segments of routes and the angles between them. Construction also leads to more involvement and more learning than passive perception (cf. [9], pp. 273 for the reverse lag in recognition vs. production). In the case of virtual tactile maps, constructing a route with route bricks should therefore lead to a better understanding of the route than just moving a hand on a tactile grid and listening to an acoustical encoding of the finger's distance from the route, as described above. Whether this is indeed true needs to be evaluated formally.

4.2. Implementation

The author has implemented virtual tactile maps with route bricks in a prototypical system. [Figure 1: Interaction of the system's main components - image capturing and object recognition of fingers and markers, raw map data held in a GIS, and acoustical output as sound and speech.]
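The angle-to-chord guidance for placing route bricks, described in section 4.1, can be sketched as follows. The chord labels, the tolerance and the volume law are assumptions for illustration; the paper only states that three chords in different volumes encode left deviation, right deviation and correct placement:

```python
def brick_angle_feedback(brick_angle_deg, target_angle_deg, tolerance_deg=5.0):
    """Map the deviation between a route brick's current angle and the
    target route segment's angle to one of three chords.
    Returns (chord, volume): chord is 'left', 'right' or 'correct';
    volume is in (0.0, 1.0], growing with deviation (assumed law).
    Sign convention (assumed): positive deviation = rotated too far left."""
    # Normalize the deviation to (-180, 180] degrees.
    deviation = (brick_angle_deg - target_angle_deg + 180) % 360 - 180
    if abs(deviation) <= tolerance_deg:
        return "correct", 1.0
    chord = "left" if deviation > 0 else "right"
    # Louder the further off the brick is, capped at full volume.
    volume = min(1.0, abs(deviation) / 90.0)
    return chord, volume
```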
The system consists of modules for optical image tracking, the management of the digital map data, and sound and speech output (see figure 1). The image processing component takes images from the camera and extracts the marked finger, the route end markers and the route bricks on the pad. Acoustical output is produced after matching the positions of the extracted objects with GIS data and relating them to previous interactions (e.g., whether a route has been selected). The image processing sub-system tracks objects in real time. The finger marker and the route end elements are discriminated through their respective colours; the route bricks all have the same colour and are discriminated through their positions.

To let users interact successfully with the map through fingers and objects tracked by image processing, four challenges had to be addressed: the presence of hands on the pad, which could be confused with an object; occlusion of objects by the hands; the user accidentally moving objects already placed correctly; and the need for route bricks to attach to their respective predecessors. Objects are segmented by colour, and the hand is ignored because the object colours were chosen so that none of them resembles skin colour. Each route brick can be placed without occluding it, because a route brick actually consists of two pieces on top of each other connected by vertical sticks, so the brick can be moved by touching the lower piece without occluding the coloured upper piece. All objects to be placed have magnets underneath, which prevents them from being moved accidentally on the metal pad; the magnets also make route ends attach to each other.

The current prototype of the virtual tactile map system is implemented as a standard GUI program under Windows NT (see figure 2 for a screenshot). [Figure 2: Screenshot of the system with a visualisation of a map and a route.] After a digital map file has been loaded, it is displayed in a graphics window. There are also windows to display the current camera image and the positions of tracked objects (not shown). The visual data display enables a blind and a sighted user to cooperate in using the program (and facilitates development). To demonstrate and test the prototype on systems without image processing hardware, graphical representations of the current finger position and the route selection elements can be manipulated with the mouse, which leads to the same acoustical output as interaction through the image recognition system.

The program deals with three coordinate systems: one for the digital map, one for the image buffer, and one for the visualization of the map and the current interaction in a GUI window. To translate between these coordinate systems, special translation objects called coordinate adaptors are used. Coordinate adaptors are also responsible for translating between coordinate systems of different orientation. To keep relative distances as accurate as possible, the aspect ratio of the map is preserved both in the mapping from the image buffer coordinates (which specify the positions of tracked fingers and objects) and in the visual display of the map.

Once a route has been selected by placing the two route end elements on the grid, the system finds the route and simplifies it by merging route segments that are connected by angles close to 180 degrees. The route of merged segments is then scaled so that it starts and ends on the borders of the route start and end elements, respectively; the scaling is done by an additional coordinate adaptor. The system then calculates the sizes of the route bricks needed to build the route. In general, it cannot be assumed that a straight line segment can be filled exactly with bricks, so the segments have to be refitted to a length which is a multiple of the brick length. In the current implementation of the system, all route bricks have the same length.

5. Future Work

Although having only one brick size makes it easier for users to pick the next brick to place, it makes the brick road less accurate relative to the route and harder to build.
A road built with bricks of the same size is less accurate if the bricks are relatively large, because the error for each straight road segment is at worst equal to half the size of the smallest brick. Therefore, bricks of additional sizes will be used in the next incarnation of the system. The connection of bricks to each other or to the road end markers in the current setup is not satisfactory. Therefore, a new brick design will be devised through which bricks connect to each other mechanically, while still allowing the last brick to be rotated. Once this is done, the virtual tactile map approach will be evaluated formally.
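The route post-processing described in section 4.2 - merging near-straight segments and refitting each straight segment to a multiple of the brick length - can be sketched as follows. The angle tolerance and the function names are assumptions, not values from the paper:

```python
import math

def merge_near_straight(points, angle_tol_deg=10.0):
    """Drop intermediate route points where the junction is within
    angle_tol_deg of a straight line, keeping only real corners."""
    merged = [points[0]]
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        h_in = math.atan2(cur[1] - prev[1], cur[0] - prev[0])
        h_out = math.atan2(nxt[1] - cur[1], nxt[0] - cur[0])
        turn = abs((math.degrees(h_out - h_in) + 180) % 360 - 180)
        if turn > angle_tol_deg:      # keep only real corners
            merged.append(cur)
    merged.append(points[-1])
    return merged

def refit_to_bricks(segment_length, brick_length):
    """Round a straight segment to the nearest positive multiple of the
    brick length; returns (brick_count, refitted_length)."""
    n = max(1, round(segment_length / brick_length))
    return n, n * brick_length
```

With one brick size, the refitting error per segment is at worst half a brick length, which is the accuracy limitation the text above discusses.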

6. References

[1] BENTZEN, B.L., Orientation Aids, in: R.L. Welsch, B.B. Blasch (eds.), Foundations of Orientation and Mobility, New York 1980.
[2] BLADES, M., Research Paradigms and Methodologies for Investigating Children's Wayfinding, in: N. Foreman, R. Gillet (eds.), Handbook of Spatial Research Paradigms and Methodologies, Vol. 1, East Sussex 1997.
[3] BRAMBRING, M., C. WEBER, Taktile, verbale und motorische Informationen zur geographischen Orientierung Blinder, Zeitschrift für experimentelle und angewandte Psychologie, Vol. 28 (1981).
[4] EDMAN, P.K., Tactile Graphics, New York 1992.
[5] GOLLEDGE, R.G., R.L. KLATZKY, J.M. LOOMIS, Cognitive Mapping and Wayfinding by Adults Without Vision, in: J. Portugali (ed.), The Construction of Cognitive Maps, Dordrecht 1996.
[6] JANSSON, G., Spatial Orientation and Mobility of the Visually Impaired, in: B. Silverstone, M.A. Lang, B. Rosenthal, E.E. Faye (eds.), The Lighthouse Handbook on Visual Impairment and Rehabilitation, New York (in press).
[7] HOLMES, E., R. MICHEL, A. RAAB, Computerunterstützte Erkundung digitaler Karten durch Sehbehinderte, in: W. Laufenberg, J. Lötzsch (eds.), Taktile Medien: Kolloquium über tastbare Abbildungen für Blinde, Freital/Dresden 1995.
[8] KRUEGER, M.W., D. GILDEN, KnowWhere: an Audio/Spatial Interface for Blind People, in: Proc. ICAD '97, Palo Alto 1997.
[9] MILLAR, S., Reading by Touch, London, New York 1997.
[10] PAPANEK, V., The Green Imperative: Natural Design for the Real World, New York 1995.
[11] PARKES, D., Nomad, an Audio-Tactile Tool for the Acquisition, Use and Management of Spatially Distributed Information by Visually Impaired People, in: A.F. Tatham, A.G. Dodds (eds.), Proc. Second International Symposium on Maps and Graphics for Visually Handicapped People, London 1988.
[12] PETRIE, H., V. JOHNSON, TH. STROTHOTTE, A. RAAB, S. FRITZ, R. MICHEL, MoBIC: Designing a Travel Aid for Blind and Elderly People, The Journal of Navigation, Vol. 49, No. 1 (1996).
[13] SCHNEIDER, J., TH. STROTHOTTE, Virtual Tactile Maps, in: H.-J. Bullinger, J. Ziegler (eds.), Human-Computer Interaction: Ergonomics and User Interfaces, Proc. HCI Int'l '99, Vol. 1, Mahwah, NJ & London 1999.
[14] SCHWEIKHARDT, W., Interaktives Erkunden von Graphiken durch Blinde, in: H.-J. Bullinger (ed.), Proc. Software-Ergonomie '85, Stuttgart 1985.