Graphical User Interfaces for Blind Users: An Overview of Haptic Devices


Hasti Seifi, CPSC554m: Assignment 1

Abstract

Graphical user interfaces greatly enhanced the usability of computer systems over older text-based systems. Unfortunately, blind users gain very little from the benefits of these graphical user interfaces (GUIs). Past research has looked at communicating graphical information to blind computer users through other channels such as the haptic or auditory senses. In this report, I provide an overview of research on haptic devices, namely force feedback devices and tactile graphic displays, and point to a few areas of usability research on tactile graphic displays.

1. Introduction

Graphical computer interfaces enhance the usability of computer systems by reducing the need to memorize commands and the locations of programs and files. Commands are executed as simple interactions, such as mouse clicks, with graphical elements, e.g., buttons and icons. Also, one can find files and organize them based on their location on a computer desktop or in folders. Other graphical representations such as graphs, maps, various images, and games are also heavily used on computers and on the Internet. These representations are important in many respects, including reading and understanding scientific material, wayfinding, and entertainment. Unfortunately, graphical information is not easily accessible to blind users. In the real world, blind users access graphical information through tactile graphic images, which are made of physical materials such as thermoformed plastic and microcapsule paper [1]. The layout and components of an image are raised from the surface so that they can be felt by touch. However, making physical tactile graphics is expensive and time-consuming, and results in bulky materials that deteriorate with use. Moreover, they are not suitable for dynamic and interactive contexts such as computer programs.
To work with computers, blind users usually rely on accessibility software programs called screen readers, together with synthetic speech and/or Braille displays [2]. Screen readers are software programs that attempt to identify and interpret the information on the screen and convert it into words for audio or Braille output. Specialized hardware such as Braille displays and keyboards is especially important for deaf-blind users, who must rely on their sense of touch. However, screen readers with audio or Braille displays can only convey information in the form of words or text [3]. Examples are communicating the content of a text document, messages in a dialogue box, and labels of UI elements, or describing an image in words. Thus, graphical information and its spatial layout are largely lost for blind users. Although blind users commonly use auditory output to build a mental map of the UI elements and their spatial layout or to read images [2], [4], in this report I focus on more recent research on haptic approaches for conveying graphical information. After reviewing the two major categories of haptic devices, namely force feedback and tactile devices, I conclude by pointing to a few areas of usability research on tactile devices.
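To illustrate the linearization a screen reader performs, here is a toy sketch. The widget tree, node names, and traversal order are my own assumptions for illustration, not any real screen-reader API:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Widget:
    role: str                       # e.g., "window", "button", "text" (hypothetical roles)
    label: str                      # accessible name to be spoken or Brailled
    children: List["Widget"] = field(default_factory=list)

def linearize(w: Widget) -> List[str]:
    """Depth-first walk: flatten the 2D interface into a 1D stream of words,
    which is all that reaches the user via speech or a one-line Braille display."""
    out = [f"{w.role}: {w.label}"]
    for child in w.children:
        out.extend(linearize(child))
    return out

dialog = Widget("window", "Save changes?", [
    Widget("text", "Your document has unsaved changes."),
    Widget("button", "Save"),
    Widget("button", "Discard"),
])
for line in linearize(dialog):
    print(line)
```

Note how the spatial arrangement of the two buttons disappears in the flattened output, which is exactly the loss of layout information described above.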

2. Haptic Devices for Graphical User Interfaces

2.1. Force Feedback Devices

Force feedback devices are based on our kinesthetic sense and can apply force to a user's hand. Being point-interaction devices, they model the user as a single (infinitesimal) point in the virtual world. The force is determined by the distance of this point from all objects in that world and is the sum of the forces from each object [3]. The best-known force feedback devices are the PHANToM, force feedback computer mice such as the FEELit mouse, and joysticks. Both the PHANToM and a joystick have three degrees of freedom and thus can be used for navigation in three-dimensional space. The FEELit mouse is a two-dimensional device made by Immersion Co. as a mass-market product. It has a smaller work area and can apply less force than the two other devices. One problem with point-interaction devices is that objects exert no force when the user is not in contact with them. Thus, navigating the interface and finding UI elements is difficult [5]. Sjöström proposes a set of search tools for feeling objects from a distance, such as a magnet that pulls the user toward the closest object, or a cross that lets the user feel when he/she is lined up with an object horizontally or vertically [4], [5].

2.2. Tactile Devices

Tactile devices are inspired by physical tactile graphic images. These devices display the contours of an image or its components by raising pins or stimulating the user's skin, for example with vibration. Tactile devices can be divided into two general categories: static refreshable tactile displays (large-area displays) and dynamic refreshable tactile displays (virtual screens) [2].

2.2.1. Static Refreshable Tactile Graphic Displays (Large-Area Displays)

These displays are usually made of a large tactile area which deforms or otherwise stimulates the skin to display graphical information. The most common type uses a large number of actuated pins in a surface, which are raised to form an image.
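A minimal sketch of how such a pin display might render an image follows. The grayscale input, the threshold value, and the binary raised/lowered model are my assumptions; real devices surveyed in [2] differ, and some support a range of pin heights rather than a single on/off state:

```python
def image_to_pins(gray, threshold=0.5):
    """Convert a grayscale image (values in [0, 1], row-major nested lists)
    into a binary pin pattern: True = pin raised, False = pin lowered."""
    return [[pixel >= threshold for pixel in row] for row in gray]

# A tiny 3x4 "image": a bright bar on a dark background (made-up values).
img = [
    [0.1, 0.9, 0.9, 0.1],
    [0.1, 0.9, 0.9, 0.1],
    [0.1, 0.1, 0.1, 0.1],
]
pins = image_to_pins(img)
for row in pins:
    print("".join("#" if up else "." for up in row))
```

Devices that allow multiple pin heights would quantize each pixel value to a height level instead of thresholding it.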
Some devices allow a range of pin heights, e.g., 0 to 10 mm, to provide more information to users. These devices vary in their resolution, size, refresh time, and underlying technology. Vidal-Verdu and Hafez provide the ideal specifications for these displays alongside the specifications of the devices reported in the literature [2]. Most notably, the ideal resolution for these displays is 1 dot/mm², following the static resolution of the skin. A refresh time on the order of 10 to 15 seconds suffices for these displays, since users usually need a few minutes to explore the interface before taking an action. These devices closely resemble physical tactile graphic images and have been successful in conveying graphical information such as maps and graphs. However, the main drawback of these devices is their cost. An ideal 32 cm × 24 cm display with 1 dot/mm² resolution needs 76,800 dots, which would cost around $300,000. The high price of these displays is due to the cost of the actuators themselves and of their assembly into an array. Another downside of these displays is high power consumption due to the large number of pins. Latching mechanisms have been proposed in the literature to lower their power requirements [2]. Some prominent examples of pin-matrix devices are Metec's DMD 120060, the DotView series from KGS, the NIST display, and Handytech's GWP, ranging from 24 × 16 up to 120 × 60 pins [2]. The BrailleDis 9000 is a recent static refreshable tactile graphic display with

multitouch sensitivity and complementary voice output [6]-[8]. The display is composed of a matrix of 60 rows and 120 columns of pins (a total of 7,200 pins) driven by piezoelectric actuators. The device uses proprietary software called HyperReader, which extends the capabilities of existing screen readers to deal with graphics and also handles direct manipulation on the surface through touch gestures [6]. The BrailleDis 9000 is part of the HyperBraille project spanning several German universities [8].

2.2.2. Dynamic Refreshable Tactile Graphic Displays (Virtual Screens)

Dynamic displays have a smaller tactile area, usually mounted on a pointing device such as a computer mouse [1], [2]. As the user moves the mouse, the tactile area updates to display the new content. Thus, these displays give the user the impression of moving his/her hand over a large tactile screen. Accordingly, they are also known as virtual screens. Because they have fewer pins, these displays have lower cost and power consumption. However, they require a higher pin update rate, or dynamic response, of at least 30-50 Hz. To achieve that dynamic response, many prototypes use piezoelectric actuator technology [2]. In addition, to compensate for the low force exerted by piezoelectric actuators, many of these dynamic displays are active (rather than passive), i.e., they use vibrations, usually from a few hertz to 250 Hz, instead of static pin movements. Vidal-Verdu and Hafez provide the ideal configuration for a dynamic display alongside the configurations of the devices reported in the literature [2]. The main drawbacks of dynamic displays are: 1) higher cognitive load, 2) skin adaptation after a period of use, and 3) lack of friction sensation. First, these displays usually provide only a small window for users to inspect an image; they have tactile areas for just one to three fingers. Also, users cannot use both hands to explore the image. Compared to static refreshable displays, dynamic devices therefore impose a higher memory demand on users.
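The virtual-screen idea, a small pin window tracking the mouse over a larger virtual image, can be sketched as follows. The window size, clamping behavior, and data layout are my assumptions, not taken from any particular device:

```python
def tactile_window(image, cx, cy, win_w=4, win_h=3):
    """Return the win_h x win_w patch of the virtual image under the cursor
    (cx, cy), clamped so the window never leaves the image. Each mouse move
    re-renders this patch on the device's small pin array."""
    img_h, img_w = len(image), len(image[0])
    x0 = max(0, min(cx - win_w // 2, img_w - win_w))
    y0 = max(0, min(cy - win_h // 2, img_h - win_h))
    return [row[x0:x0 + win_w] for row in image[y0:y0 + win_h]]

# A 6x8 virtual image; '#' marks a raised pin, '.' a lowered one.
virtual = [list(r) for r in [
    "........",
    ".######.",
    ".#....#.",
    ".#....#.",
    ".######.",
    "........",
]]
for row in tactile_window(virtual, cx=2, cy=2):
    print("".join(row))
```

Because this patch changes on every mouse movement, the pins must refresh at the 30-50 Hz rates mentioned above rather than the 10-15 second refresh acceptable for static displays.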
Users tend to need more training, are slower in reading, and show greater individual differences in performance [9]. Second, the stimulus tends to dull the sense of touch. This phenomenon is called adaptation; it sets in within seconds for static stimuli (pin deformation) and within minutes for vibrotactile and electrotactile stimulation [10]. Applications developed for these displays need to account for the required recovery time. Finally, these displays lack the friction sensation caused by hand movements on large-area displays. Past work found the friction sensation helpful in reading and proposed mechanisms for simulating friction on these devices [11]. The OPTACON [12] is the best-known dynamic refreshable tactile display; it directly converts graphical information captured by a camera into tactile output. Unfortunately, the above-mentioned tactile devices mostly exist as research prototypes and are not available on the market. For static refreshable displays, the largest barrier is their high production cost. Both static and dynamic devices need software applications specifically developed for them [13]. To date, too few applications exist for these devices to justify their cost for blind users. Basic applications have been developed for the HyperBraille project, including an Internet browser, office applications such as Word and PowerPoint, a simple drawing application, and computer games such as Solitaire (see hyperbraille.com for example videos) [13], [14].

2.3. Usability Research on Tactile Graphic Devices

Three ongoing areas of usability research on tactile graphic devices are: 1) touch surfaces and gestures, 2) haptic UI elements, and 3) adaptive level of detail.

1. Touch Surfaces and Gestures: Touch surfaces allow for more controlled and intuitive interactions with objects on a computer screen. These surfaces are even more beneficial to blind users, who typically use their hands for both input and output on computer devices. Their interactions require frequent hand movement between a keyboard and a tactile display, e.g., a Braille display, resulting in lower performance and loss of hand position and context. A touch-sensitive display allows blind users to interact directly with interface elements with minimal hand movement. To my knowledge, the BrailleDis 9000 in the HyperBraille project is the only touch-sensitive tactile graphic display to date. Gestural interaction with the device has been explored and tested, including gestures for zooming and panning and gestures for a drawing application [6], [15].

2. UI Elements: Another area of usability research explores appropriate UI elements, such as buttons and scrollbars, for haptic devices. Researchers need to consider specific characteristics of touch, such as its lower acuity and slower processing compared to vision, and design appropriate GUI elements accordingly. As an example, the interaction possibilities of a GUI element, e.g., a button, may not be clear from its layout. A set of tactile widgets has been examined and utilized in the HyperBraille project [6], [13]. As another example, Sjöström points to the difficulty of using list menus for blind users and proposes a radial menu instead [4], [5]. Finally, Prescher et al. propose multiple areas or regions to provide various types of information, e.g., layout or formatting [6], [7]. Multiple areas allow for bi-manual exploration and rapid information extraction by users.

3.
Adaptive Level of Detail: Finally, research studies explore means of providing information at different levels of detail to blind users, which helps users process content faster, reduces fatigue, and lowers the required refresh rate of tactile devices. For large-area displays, having multiple areas and stroking over the whole area provides a means of skimming the content [2], [6]. For dynamic devices with a small tactile area, past research has examined the usability of manual or automatic toggling between different levels of detail [1].

Summary

This report presents a summary of haptic approaches for displaying graphical information to blind computer users. Force feedback devices convey graphical information by applying force to the user's hand. Tactile devices usually display the contours and components of an image by deforming or vibrating a number of pins. These devices fall into two categories, static refreshable displays and dynamic refreshable displays, based on the size of the tactile area and the required refresh rate. Most tactile devices are developed for research purposes. Ongoing research in the area investigates better technology for these devices to address the cost, refresh rate, and power consumption issues and to provide richer feedback. On the application side, few applications exist for these devices to date. Recent research efforts have explored the usability of tactile interfaces, the possibility of touch interaction, and appropriate UI elements.

References

[1] V. Lévesque, G. Petit, A. Dufresne, and V. Hayward, "Adaptive level of detail in dynamic, refreshable tactile graphics," in Haptics Symposium (HAPTICS), 2012 IEEE, 2012, pp. 1-5.
[2] F. Vidal-Verdu and M. Hafez, "Graphical tactile displays for visually-impaired people," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 15, no. 1, pp. 119-130, 2007.
[3] M. Spindler, M. Kraus, and G. Weber, "A graphical tactile screen-explorer," in Proceedings of the 12th International Conference on Computers Helping People with Special Needs, Berlin, Heidelberg, 2010, pp. 474-481.
[4] C. Sjöström, "The IT potential of haptics," Licentiate Thesis, 2002.
[5] C. Sjöström, "Designing haptic computer interfaces for blind people," in Sixth International Symposium on Signal Processing and its Applications, 2001, vol. 1, pp. 68-71.
[6] D. Prescher, G. Weber, and M. Spindler, "A tactile windowing system for blind users," in Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility, New York, NY, USA, 2010, pp. 91-98.
[7] M. Schiewe, W. Köhlmann, O. Nadig, and G. Weber, "What you feel is what you get: Mapping GUIs on planar tactile displays," in Universal Access in Human-Computer Interaction. Intelligent and Ubiquitous Interaction Environments, 2009, pp. 564-573.
[8] T. Völkel, G. Weber, and U. Baumann, "Tactile graphics revised: The novel BrailleDis 9000 pin-matrix device with multitouch input," in Computers Helping People with Special Needs, 2008, pp. 835-842.
[9] R. W. Cholewiak and A. A. Collins, "Individual differences in the vibrotactile perception of a simple pattern set," Attention, Perception, & Psychophysics, vol. 59, no. 6, pp. 850-866, 1997.
[10] U. Singh, R. Fearing, et al., "Tactile after-images from static contact," in Proc. ASME Dynamic Systems and Control Division, 1998, vol. 64, pp. 163-170.
[11] M. Shimojo, M. Shinohara, and Y. Fukui, "Human shape recognition performance for 3D tactile display," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 29, no. 6, pp. 637-644, 1999.
[12] J. G. Linvill and J. C. Bliss, "A direct translation reading aid for the blind," Proceedings of the IEEE, vol. 54, no. 1, pp. 40-51, 1966.
[13] C. Taras, M. Raschke, T. Schlegel, and T. Ertl, "Running graphical desktop applications on tactile graphics displays made easy," in Proceedings of the 3rd International Conference on Software Development for Enhancing Accessibility and Fighting Info-exclusion (DSAI 2010), 2010, pp. 25-26.
[14] M. Rotard, C. Taras, and T. Ertl, "Tactile web browsing for blind people," Multimedia Tools and Applications, vol. 37, no. 1, pp. 53-69, 2008.
[15] M. Schmidt and G. Weber, "Multitouch haptic interaction," in Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction. Part II: Intelligent and Ubiquitous Interaction Environments, Berlin, Heidelberg, 2009, pp. 574-582.