Virtual Tactile Maps

In: H.-J. Bullinger, J. Ziegler (Eds.). Human-Computer Interaction: Ergonomics and User Interfaces. Proc. HCI International '99 (the 8th International Conference on Human-Computer Interaction), Munich, Germany, August 22-26, 1999, Vol. 1. Mahwah, NJ & London: Lawrence Erlbaum Assoc., pp.

Jochen Schneider and Thomas Strothotte
Otto-von-Guericke University of Magdeburg

1 Introduction

Tactile maps constitute the medium most commonly used by blind and partially sighted people for the exploration of spatial information. However, compared to maps for sighted people, they are inconvenient to produce and contain less information. We have developed methods, and implemented them prototypically, by which visually impaired people can explore a geographical area flexibly with the help of a computer through virtual tactile maps. The design of a system is presented which captures hand positions through a video camera and produces acoustic output. It uses digital map data which is adapted to the requirements of the users. One of the most compelling problems is how to deal with the scale of a map, in particular the non-linearity of maps. In this paper, the development of the prototype and initial experience with it are presented.

2 A Challenging Vision

A man, recognizable by the long white cane hanging from his arm as being blind, is standing on a sidewalk. He is holding his hands in front of him and moving them with concentration. He is in the middle of planning the rest of his walk through the downtown area with the help of a virtual tactile map. He had explored the area and chosen a route through it at home. Now he is checking whether he still remembers the rest of the route correctly. The man is wearing a baseball cap. There is a small camera in the brim of the cap, looking downwards at his hands and picking up their positions. The camera is connected to a small computer, which emits information through small earphones.

The scenario described is a vision. We report on the development of a system for the exploration of virtual tactile maps as a wearable computer (cf. Mann 1997). In this paper, the concept of virtual tactile maps and a prototypical implementation will be presented.

3 The Virtual Tactile Map Concept

Virtual tactile map systems are digital map systems which emit speech and sound when their maps are explored by hand movements. Virtual tactile maps enable blind and partially sighted users to explore an unknown geographical space and learn routes in it, similar to tactile maps, by which they are inspired and after which they are named. The first aspect of the user interaction with virtual tactile maps is the interpretation of hand gestures and their mapping to the geographical space; the second aspect is the selection and acoustical presentation of the geographical elements on the map; the third aspect is the preparation of the digital map data into a representation suitable for hand input, which is relatively coarse. Virtual tactile maps serve to aid orientation, defined by Jansson as "perceiving/knowing the spatial relations between the traveler's current position and direction of body and significant features of space" (Jansson 1999). Virtual tactile maps belong to a particular kind of electronic travel aid, namely orientation systems for larger spaces.

4 Related Work

The KnowWhere system is a hand gesture recognition system which conveys geographical information to blind people (Krueger & Gilden 1997). The system presents zoomable geographical objects (e.g., outlines of countries) by emitting a sound when the hands of a user touch their imaginary image on the desktop. The hands lie on a tactile grid. Their movements lead to speech or sound output in accordance with the kind of geographical element they have touched. Krueger and Gilden have conducted a test of their system with five congenitally blind subjects. The test consisted of exploring large geographic objects. The subjects were easily able to find absolute positions and to recognize puzzle pieces of the explored shapes afterwards. Two were also able to produce acceptable drawings of the outlines of the geographical objects.

Although both deal with presenting geographical objects to the same user group through video gesture input and acoustical output, KnowWhere and virtual tactile maps differ in some important aspects. The hardware used for KnowWhere consists of a light table and special-purpose hardware, which facilitates gesture recognition but also impedes portability and the widespread use of the system. Even more importantly, virtual tactile maps serve to convey information about an urban area, as a street map does, whereas KnowWhere conveys large-scale geographical information, as an atlas does.

Starner, Weaver and Pentland (1998) describe a wearable computer which recognizes American Sign Language gestures through video input. The system is being developed to eventually translate the gestures into spoken language. A camera in a cap worn by the user, looking down at his hands, picks up the gestures. This approach shows the feasibility of the vision of a wearable system implementing virtual tactile maps.

5 The Design of a Virtual Tactile Map System

In this section, requirements of virtual tactile maps and the design of an actual system are presented.

5.1 Requirements

Requirements for virtual tactile maps can be elicited by asking how blind pedestrians prepare themselves for a walk in an urban area not fully known to them. Blind people who wish to navigate independently have to memorize the layout of the given area, learn segments of a path and the angles between them, and recognize them during walking (Golledge et al. 1996). A computer system can support this process by presenting the information needed. In (Strothotte et al. 1996), a study is described in which blind pedestrians as well as orientation and mobility trainers were asked about the information an electronic travel aid should emit through speech synthesis. Information identified as essential was the name and type of a road, distances and obstacles on it; information identified as desirable was details about objects such as shops, public buildings, etc. Finally, information identified as nice-to-have was information on temporary obstacles such as roadworks and diversions.

5.2 System Design

Based on the requirements described above, methods and tools for the exploration of virtual tactile maps were designed. The design was implemented as a prototype, and a first test was conducted. The most important aspects of the design were the interaction and the preparation of the map data, which will be described in the following.

Interaction with cartographic objects on virtual tactile maps is done through pointing. A pointing device for a virtual tactile map system has to be large, portable, affordable, and allow absolute setting of more than one position. There are no tactile displays or touch tablets available which meet all of these requirements. Therefore, a video camera is used as the input device for the virtual tactile map approach presented here. Hand movements result in information about the geographical objects under the hand, and their relation to others, being emitted as speech and sound.

The virtual tactile map system presented here supports both map exploration and the learning of a route. Exploration is done by moving the hands freely on the map, which results in the system speaking information about the objects under the hands. Route learning is done by first selecting a route. The system then guides a finger along the route through sound. Currently, pitch and balance are used to convey the distance of the finger from the route.

Commercially available map data is used in the system, enriched with the positions of streetcar stops, etc. The system manages the map data in a small Geographical Information System (GIS), which is able to answer requests for the geographical elements at a certain position or for the distance between points.
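
To make the exploration behaviour concrete, the following minimal Python sketch shows how such a point query could be wired to speech output. The names (MapObject, TinyGIS, speak, on_finger_moved), the hit radius, and the polyline distance test are illustrative assumptions; the paper does not describe the prototype's GIS interface at this level of detail.

```python
# Sketch, assuming a GIS that stores streets as named polylines and answers
# "which object lies under the finger?" queries. All identifiers are
# illustrative, not taken from the prototype described in the paper.
import math
from dataclasses import dataclass

@dataclass
class MapObject:
    name: str     # e.g. a street name
    kind: str     # "street", "streetcar stop", ...
    points: list  # polyline vertices as (x, y) tuples in map metres (>= 2 vertices)

def _dist_point_segment(p, a, b):
    """Distance from point p to the line segment a-b (2D tuples)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

class TinyGIS:
    """Answers requests for the geographical element at a given position."""
    def __init__(self, objects, hit_radius_m=5.0):
        self.objects = objects
        self.hit_radius_m = hit_radius_m  # assumed tolerance around a polyline

    def object_at(self, x, y):
        best, best_d = None, self.hit_radius_m
        for obj in self.objects:
            d = min(_dist_point_segment((x, y), a, b)
                    for a, b in zip(obj.points, obj.points[1:]))
            if d < best_d:
                best, best_d = obj, d
        return best

def speak(text):
    print("[speech]", text)  # stand-in for the speech synthesizer

def on_finger_moved(gis, map_x, map_y):
    """Called with the fingertip position already converted to map coordinates."""
    obj = gis.object_at(map_x, map_y)
    if obj is not None:
        speak(f"{obj.kind}: {obj.name}")
```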
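
The coding of the finger's distance from the selected route through pitch and balance could then be realized roughly as in the sketch below. The frequency range, the maximum distance, and the convention that the tone is panned towards the side the finger has drifted to are assumptions made for illustration; the paper does not give the prototype's actual parameters.

```python
# Illustrative mapping of the finger-to-route distance to a guidance tone:
# pitch conveys how far off the finger is, stereo balance conveys on which
# side of the route it lies. All numeric values are assumptions.
def guidance_tone(signed_distance_m, max_distance_m=20.0,
                  f_near=880.0, f_far=220.0):
    """Return (frequency_hz, pan) for a guidance tone.

    signed_distance_m: distance of the finger from the route in map metres,
                       negative = left of the route, positive = right.
    pan: -1.0 = fully left channel, +1.0 = fully right channel.
    """
    # Clamp and normalise the distance to [0, 1].
    d = min(abs(signed_distance_m), max_distance_m) / max_distance_m
    # Near the route -> high pitch, far away -> low pitch.
    frequency_hz = f_near + d * (f_far - f_near)
    # Pan the tone towards the side the finger has drifted to (an assumed convention).
    pan = max(-1.0, min(1.0, signed_distance_m / max_distance_m))
    return frequency_hz, pan

# Example: a finger 5 m to the right of the route yields roughly (715 Hz, pan 0.25).
freq, pan = guidance_tone(5.0)
```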

6 Implementation and Test of a Prototype

The requirements above led to the design and implementation of a prototype. The prototypical setup is stationary: the camera is mounted on a tripod above a table. The stationary setup makes it possible to use a tactile grid pad on the desk at which the camera is aimed, which serves to orient the hands and to facilitate image processing by providing a constant background. In addition, small markers can be employed as interaction devices, e.g. to place landmarks on the map. Fingertips and markers are recognized through color segmentation; in the case of the index fingers, by finding the color of a ring worn on each fingertip (see Fig. 1).

A first prototype of a virtual tactile map system, with hand gesture recognition for a single hand and without markers, was tested by a blind man of about forty years of age who is experienced in using tactile maps. Both the exploration of the whole map and the learning of a route were tested. For the former, the test subject moved his hand on the pad. When he touched a street, the system emitted its name. The street exploration mode was tested as well: a long street running roughly diagonally through the map was selected, and the subject then tried to follow the street with his index finger, guided by a tone changing in pitch and balance. The map exploration feature was felt to be useful, whereas the coding of the distance to a certain street through balance and pitch was felt to be too rough, which can partly be attributed to the speakers used. Also, the gesture recognition subsystem was processing black-and-white pictures, resulting in jumps of the calculated hand position.
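
As a rough illustration of the color-segmentation step, the following sketch thresholds a camera frame for the color of the finger ring and takes the centroid of the resulting mask as the fingertip position, then scales it to map coordinates. The use of OpenCV, the HSV bounds, and the single-ring simplification are assumptions for illustration only; they are not details of the prototype, which at the time of the test still processed black-and-white pictures.

```python
# Sketch of fingertip detection by color segmentation: threshold the frame for
# the assumed color of the ring worn on the index finger and use the centroid
# of the mask as the fingertip position. HSV bounds are illustrative values.
import cv2
import numpy as np

RING_HSV_LOW = np.array([100, 120, 80])    # assumed blue-ish ring color
RING_HSV_HIGH = np.array([130, 255, 255])

def fingertip_position(frame_bgr):
    """Return (x, y) pixel coordinates of the ring centroid, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, RING_HSV_LOW, RING_HSV_HIGH)
    # Remove isolated noise pixels before computing the centroid.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    m = cv2.moments(mask)
    if m["m00"] == 0:            # no ring-colored pixels found in this frame
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def pad_to_map(x_px, y_px, frame_w, frame_h, map_w_m, map_h_m):
    """Scale pixel coordinates on the grid pad image to map coordinates in metres."""
    return x_px / frame_w * map_w_m, y_px / frame_h * map_h_m
```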

7 Future Work

The gesture recognition, the preparation of the geographical data, and its acoustical coding still need to be refined. The extent of a map can then be chosen by the user by placing tokens of different forms on the table. It remains to be investigated how these tokens can be replaced when moving from the stationary prototype implemented thus far to a truly mobile system.

[Fig. 1: Architecture of the prototype — image capturing and object recognition (fingers, markers), a GIS over the raw map data, and acoustical output (sound, speech)]

8 References

Mann, S. (1997). Wearable computing: a first step towards personal imaging. IEEE Computer, 30(2).

Jansson, G. (1999). Spatial orientation and mobility for the visually impaired. In Silverstone, B., Lang, M. A., Rosenthal, B. & Faye, E. E. (Eds.), The Lighthouse Handbook on Visual Impairment and Rehabilitation. New York: The Lighthouse and Oxford University Press. Forthcoming.

Krueger, M. W. & Gilden, D. (1997). KnowWhere: an audio/spatial interface for blind people. Proc. ICAD '97. Xerox PARC: Xerox. (Available online under Kruger.PDF.)

Starner, T., Weaver, J. & Pentland, A. (1998). Real-time American Sign Language recognition using desk and wearable computer based video. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(12).

Golledge, R. G., Klatzky, R. L., & Loomis, J. M. (1996). Cognitive mapping and wayfinding by adults without vision. In Portugali, J. (Ed.), The Construction of Cognitive Maps. Dordrecht: Kluwer.
