The Chatty Environment: Providing Everyday Independence to the Visually Impaired


Vlad Coroamă and Felix Röthenbacher
Distributed Systems Group, Institute for Pervasive Computing
Swiss Federal Institute of Technology, ETH Zurich, 8092 Zurich, Switzerland
coroama@inf.ethz.ch, fr@felix.shacknet.nu

Abstract. Visually impaired persons encounter serious difficulties in leading an independent life; these difficulties are inherent to the nature of their impairment. In this paper, we suggest a way of deploying ubiquitous computing technologies to cope with some of them. We introduce the paradigm of a chatty environment that reveals itself by talking to the visually impaired user, thus offering her some of the information she would otherwise miss. We also describe an initial prototype of the chatty environment. Finally, we briefly analyze the potential benefits of the system and argue that the visually impaired are ideal early technology adopters in the pervasive healthcare field.

1 Independent Living for the Visually Impaired

Visual impairment has one important characteristic: the blind person is in need of guidance and assistance. A deaf person can orient herself perfectly well in new environments. Many wheelchair users are also able to lead a normal, independent life, in part thanks to the wheelchair access requirements in many countries. Not so the visually impaired. The typical blind person usually has a good sense of orientation only in her immediate neighborhood: at home, in her street, or on the short walk to work. For the rest of the world, she is highly dependent on external help.

Think, for example, of shopping in a supermarket: thousands of items that all feel the same, spread over hundreds of shelves. Visually impaired people will therefore only go shopping in their local supermarket, and buy only a few products in well-known locations. Or think of a modern airport terminal or railway station.
The blind person will not be able to find her way by herself: an architecture with several floors connected in the most intricate ways is simply too complex to comprehend without any visual overview.

Why do the visually impaired have more difficulties leading an independent life than people with other physical impairments? The explanation is inherent to the way we humans use our senses: most of the information about our surroundings is gathered visually. Our eyes have about 120 million receptors, while the ears do their job with about 3000. The brain region responsible for processing visual input is

five times larger than the region handling audio input. These are strong indications that the amount of visual information significantly exceeds that of the other senses.¹

In order to allow the visually impaired a higher degree of independence, we propose the paradigm of the chatty environment. This environment tries to make some of the visual information available to the visually impaired as well. For that, it uses other media: a first prototype is based on audio; for later system versions, other channels, such as tactile feedback, are possible.

2 The Chatty Environment

The main goal of the system is to present to the visually impaired user, through alternative means, the input that sighted people get through the visual channel. In a first, naive approach, you may think of the user wandering around in the real world while the world keeps talking to her, continuously revealing its existence and characteristics: "Here is the shelf with milk products", "Behind you are the fridges with meat and ice", "Here is track 9, do you want more information on the departing trains?"

This feature of the system will probably seem annoying to most sighted people. An environment talking endlessly to the user sounds like a headache that many of us would surely turn off after a few minutes. However, in conversations with members of the Swiss Association of the Blind, it turned out that for visually impaired people there can almost never be too much audio input. This is comparable to the huge amount of visual information sighted people pick up every second, little of which they actually use. Here, too, it is far from annoying to continuously receive that much unnecessary information, since one has learned to focus on the interesting aspects only.

2.1 System Overview

We are currently building a prototype of the chatty environment on part of the ETH Zurich campus. The main components of the prototype are:

Tagged entities in the environment.
A large number of the chatty environment's entities are tagged with electronic beacons. A virtual aura thus arises around the tagged real-world entities (see Figure 1). Beacons are small active or passive electronic devices; like beacons on a coastline, they attract the user's attention to special facilities. Depending on how densely the environment is networked, there can be many beacons in the user's range. Beacons come and go as the user moves, so she has to be continuously informed about beacons entering and leaving her range.

World explorer. The world explorer is a device carried by the user and is the interface between the user and the tagged entities in the environment. When the user moves into the aura of an object, the explorer senses the object and mediates the information exchange between user and object. It does this through a standard interface, described in Section 2.2.

¹ One example of the visually impaired's lack of independence is car driving. While wheelchair users and deaf people are usually well able, and in most countries allowed, to drive a car, this is unthinkable for blind persons.
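The continuous enter/leave announcements amount to diffing successive beacon scans against the set of beacons currently in range. A minimal sketch of this bookkeeping, assuming hypothetical class and beacon names that are not part of the prototype:

```python
class BeaconTracker:
    """Tracks which beacons are in range and reports enter/leave events.

    A sketch only: real scans would come from the world explorer's
    Bluetooth/RFID hardware, not from hand-written sets.
    """

    def __init__(self):
        self.in_range = set()

    def update(self, scanned_ids):
        """Compare a new scan against the previous one; return (entered, left)."""
        scanned = set(scanned_ids)
        entered = scanned - self.in_range   # newly appeared beacons to announce
        left = self.in_range - scanned      # beacons whose aura the user has left
        self.in_range = scanned
        return entered, left


# Hypothetical walk past a platform: the kiosk beacon drops out of
# range while a train-door beacon appears.
tracker = BeaconTracker()
tracker.update({"track9", "kiosk"})
entered, left = tracker.update({"track9", "train-door"})
```

Each `entered` beacon would trigger an audio announcement, while `left` beacons are silently dropped from the user's current surroundings.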

Fig. 1. The virtual aura of tagged real-world objects.

Virtual counterparts of real-world objects. Beacons are typically small devices with limited resources. Therefore, the objects usually have digital representations, so-called virtual counterparts, which typically reside on an Internet server. The counterpart's URL is the first piece of information the world explorer gets from the object's beacon. Hence, even if contact with the beacon is quickly lost, the explorer can obtain all the information from the virtual counterpart.

Communication infrastructure. To access the virtual counterparts and their data, the world explorer uses the background communication infrastructure. For this purpose, the explorer is equipped with Bluetooth and IEEE 802.11 WLAN communication facilities.

More details on the system components, as well as design decisions and their motivation, are presented in Section 2.3. First, however, we present another central system component: the audio user interface.

2.2 User Interface

The chatty environment keeps presenting the environment's objects to the user until she chooses to investigate one of them. By pressing a button on the device, the user selects the currently active object and is then presented with a standardized audio interface to it. In the current implementation, the interface consists of four options:

Information. By choosing this option, the user receives further information about the chosen entity. This information is highly dependent on what kind of object

was selected. A supermarket product may present the following information: producer, ingredients list, and expiration date. In the case of a train, the information might be: final destination, departure time, next stop, and a list of all stops. Some of these items may in turn provide further details; ingredients may have sub-items like vegetarian (yes/no), organically produced (yes/no), and display complete ingredients list.

Actions. Some of the objects in the chatty environment allow the user to execute an action. One example is a train or bus allowing the user to open its closest door and acoustically guiding her towards it. This is a well-known problem among visually impaired people, who may easily miss a bus or train because they are unable to find the door and its opening mechanism during the brief stop at the station.

Leave traces. The user can also decide to leave virtual post-its for herself or other users on an object. These will typically be audio files reminding her of something she noticed the last time she passed by. On a traffic light, for example, one could leave the note: "Big crossroad ahead, should be crossed quickly." Information left like this is automatically pushed onto the user's device the next time she passes this object.

Take me there. By choosing this option, the user is guided to the currently selected object. This feature is useful when the user is in the vicinity of a virtual signboard. Sighted people orient themselves in an environment like a railway station not only by the objects they are able to see; they also learn about distant objects by reading signs. We mirrored this concept by providing virtual signboards in the chatty environment. The virtual signs (which may or may not be attached to signs for sighted people) point to places of interest further away.

2.3 Implementation Details

The world explorer is implemented on an HP iPAQ.
The whole system is designed as a component framework based on Microsoft's Component Object Model (COM), so further modules can easily be added. The beacons can be either passive tags, using radio frequency identification (RFID) technology, or some sort of active tag, such as active RFID tags, CoolTown beacons [1], Berkeley TinyOS Motes [3], or Smart-Its [5]. For our system, we have chosen the UC Berkeley Motes; through an abstract interface to the hardware, however, other devices such as RFID tags or Smart-Its could easily be used as well.

The navigation system, used to guide the user to places of interest announced by virtual signboards, is work in progress. It builds upon a ubiquitous computing indoor positioning system [2], which measures the signal strength of WLAN, Bluetooth, and active RFID sources.

To use the system, the user must carry three devices: the world explorer, a wireless headset for the audio output, and possibly a remote input device attached to her cane. Input might be part of the world explorer (using the PDA's buttons) or attached to the cane. In the latter case, the world explorer could be left in a backpack or pocket, as the buttons would be integrated in the cane and the input transmitted via Bluetooth to the explorer.
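The standardized four-option interface of Section 2.2 maps naturally onto a lookup against the object's virtual counterpart. A minimal sketch, with a hypothetical in-memory table standing in for the counterpart server (which the real explorer would reach over WLAN or Bluetooth) and all URLs, object names, and field names our own illustrative assumptions:

```python
# Hypothetical stand-in for the Internet server hosting virtual counterparts.
COUNTERPARTS = {
    "http://example.org/counterparts/milk": {
        "name": "Milk, 1 liter",
        "information": {"producer": "...", "expiration date": "..."},
        "actions": [],
        "traces": ["Big discount on the lower shelf."],
    },
}


def open_object_menu(counterpart_url, option):
    """Resolve a beacon's counterpart URL and answer one of the four menu options.

    Returns a list of strings that the audio interface would read out.
    """
    obj = COUNTERPARTS[counterpart_url]
    if option == "information":
        return [f"{key}: {value}" for key, value in obj["information"].items()]
    if option == "actions":
        return obj["actions"] or ["No actions available."]
    if option == "leave traces":
        return obj["traces"]
    if option == "take me there":
        return [f"Guiding you to {obj['name']}."]
    raise ValueError(f"Unknown option: {option}")
```

Because only the URL is read from the beacon itself, the menu keeps working even after radio contact with the (resource-limited) beacon is lost, which is the design rationale behind the virtual counterparts.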

3 Feasibility Aspects

A general problem in the implementation of pervasive healthcare systems is that the target user group usually are not typical early technology adopters. Not so for the visually impaired: many are technically very skilled and eager to try out new technologies. The most prominent example dates back to 1986, when the Braille 'n Speak device [4] became the world's first PDA, many years before Apple's Newton or the Palm Pilot appeared on the market.

Another concern when developing healthcare applications is the required level of reliability. One cannot, for example, release a patient in need of constant monitoring from the hospital without being sure both that the portable monitoring device will not fail and that the infrastructure connection will be ubiquitously available. Most pervasive healthcare applications require a similarly high level of reliability, as well as long-term clinical tests, before being adopted.

The system for the visually impaired introduced here has significantly fewer implementation constraints than most other applications in the healthcare field. Not only have the visually impaired proven eager to adopt new technologies; the system also has the potential to provide a considerably improved quality of life without serious drawbacks. In case of malfunction, the user loses the provided advantages but does not get into serious or even life-threatening situations. Therefore, long-lasting clinical tests should not be necessary before deploying such a system.

4 Acknowledgements

The authors wish to thank Jürgen Müller, University of Kassel, for the helpful discussions on the daily routine of the visually impaired, as well as our colleague Jürgen Bohn, who contributed valuable ideas in the early stages of the project. Tarik Kapic, Swiss Federal Institute of Technology Lausanne, conducted many interviews with visually impaired persons and shared the results with us.
This work came about as part of the project "Living in a Smart Environment: Implications of Ubiquitous Computing", which conducts interdisciplinary research on the future impacts of ubiquitous computing. The project is funded by the Gottlieb Daimler and Karl Benz Foundation, Ladenburg, Germany.

References

1. CoolTown beacons. http://cooltown.hp.com/beacon full.htm.
2. Jürgen Bohn and Harald Vogt. Robust probabilistic positioning based on high-level sensor fusion and map knowledge. Technical Report 421, Institute for Pervasive Computing, Distributed Systems Group, Swiss Federal Institute of Technology (ETH) Zurich, Switzerland, April 2003.
3. Berkeley Motes. http://webs.cs.berkeley.edu/tos/.
4. The Braille 'n Speak notetaker. http://www.freedomscientific.com/fs products/notetakers bns.asp.
5. The Smart-Its Project. http://www.smart-its.org.