Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface

ITS '14, November 16-19, 2014, Dresden, Germany

Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface

Xu Zhao
Saitama University
255 Shimo-Okubo, Sakura-ku, Saitama City, Japan
sheldonzhaox@is.ics.saitama-u.ac.jp

Takehiro Niikura
The University of Tokyo
Hongo 7-3-1, Bunkyo-ku, Tokyo, Japan
takehiro_niikura@ipc.i.u-tokyo.ac.jp

Takashi Komuro
Saitama University
255 Shimo-Okubo, Sakura-ku, Saitama City, Japan
komuro@mail.saitama-u.ac.jp

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s). ITS 2014, November 16-19, 2014, Dresden, Germany. ACM 978-1-4503-2587-5/14/11. http://dx.doi.org/10.1145/2669485.2669536

Abstract
In this paper we evaluate the relation between visual and haptic feedback in a 3D touch panel interface and show the optimal latency for natural interaction. We developed a system that consists of an autostereoscopic display and a high-speed stereo camera. With this system, virtual objects are stereoscopically presented, and the objects respond to the finger movement that is obtained using the stereo camera. We conducted an experiment to evaluate visual and haptic synchronization, and the result showed that visual and haptic feedback was most synchronized at latencies around 150 ms, while finger and button movements were more synchronized at smaller latencies. We also conducted a comparison experiment to explore which synchronization is more important, and the result showed that the visual synchronization of finger and button movements is more important than visual and haptic synchronization.
Author Keywords
Stereoscopic display; Passive haptic feedback; Latency

ACM Classification Keywords
H.5.2 [Information Interfaces and Presentation]: User Interfaces - Input devices and strategies.

Introduction
Recently, touch panels have become widely used in our living environment. They are not only easy to use but can also serve general purposes, since the screen layout and UI components can be changed depending on the situation. However, we cannot feel much like we are touching objects directly, since presentation and operation are restricted to a 2D surface. On the other hand, direct manipulation of virtual 3D objects that are stereoscopically or optically displayed in the air has been studied [1, 2, 3]. However, only visual and audio feedback is provided, and users do not feel much haptic sense. Vibration devices can be used to provide tactile feedback [4], but users have to wear the devices. A non-contact tactile display using an ultrasound transmitter can also provide haptic feedback [5], but it requires a large space. Alternatively, the surface of the touch panel itself can be used to provide users with passive haptic feedback. In [6], users touched a 2D surface to select a 3D stereoscopic object, and the relation between the 3D positions of rendered objects and the on-surface touch points was analyzed. In [7], users also touched a 2D surface to select a 3D stereoscopic object, and the depth perception of the users was evaluated. These studies show that 3D objects can be selected with passive haptic feedback by touching the 2D surface of a touch panel. However, the touched object does not move according to the finger movement, so users do not feel much like they are manipulating the object. In addition, when the depth of the rendered object is not near the display, the interaction feels unnatural, since the depth of the finger differs from that of the object. A robotically actuated surface can solve this problem [8], but it requires mechanical hardware. In this study, we use a system that consists of an autostereoscopic display and a high-speed stereo camera.
With this system, virtual objects are stereoscopically presented, and the objects respond to the finger movement that is obtained using the stereo camera. In this system, virtual buttons float slightly in front of the display. First, the user's finger touches the surface of a button, and the button moves backward according to the finger position. After that, the finger touches the surface of the display and receives passive haptic feedback from the surface. The timing of the haptic feedback is inconsistent with that of the visual feedback, which may cause a feeling of strangeness. We can synchronize haptic and visual feedback by adding a delay to the button movement. However, the delay also causes a slow response, which impairs the sense of reality in object manipulation. In this paper we evaluate the relation between visual and haptic feedback and show the optimal latency for natural interaction.

System Overview
For the experiment, we developed the system shown in Figure 1. The system consists of an autostereoscopic display and a high-speed stereo camera. The display was an 8.4-inch parallax-barrier autostereoscopic display from VMJ Inc., which shows relatively good 3D image quality. The minimum 3D viewing distance of the display is 0.65 m, which is short enough for users to manipulate rendered 3D objects by hand. The stereo camera consists of two monochrome IEEE 1394 high-speed cameras (Grasshopper, from Point Grey Research Inc.) with lenses having a focal length of 5 mm. We used

the cameras at a frame rate of 120 fps with an image size of 640 × 480 pixels.

Figure 1. The experimental system

Since high-speed cameras have shorter latency than standard cameras, we can reduce the latency of the entire system and use a wide range of latencies (from short to long) in the experiment.

Figure 2. The layout of the system

We created a 3D graphical user interface of a touch panel, in which users can push three buttons that are displayed stereoscopically.

Fingertip Recognition
The positional relationship of the cameras, the display, and the finger, together with the world coordinate system, is shown in Figure 2. First, the system binarizes the images from the cameras to obtain the finger region. We put a black board on the right side of the display so that the system can extract the finger region easily and stably. Next, the system scans the binarized image line by line. When a line that has five or more white pixels is first found, the system regards it as a fingertip line. By calculating the centroid of the white pixels in the fingertip line, the system obtains the coordinates of the fingertip in the image. After that, the system obtains the Y-coordinate of the fingertip in three-dimensional space, which is used to calculate the distance of the finger from the display, D_finger, by the following equation:

Y_finger [mm] = (y_L [pixel] + y_R [pixel]) × b [mm] / (2 (x_L [pixel] − x_R [pixel]))    (1)

where (x_L, y_L) and (x_R, y_R) are the positions of the fingertip in the left and right camera images, and b is the baseline of the stereo camera. The Z-coordinate of the fingertip in three-dimensional space, which is used to detect which

button is pushed, is calculated by the following equation:

Z [mm] = f [pixel] × b [mm] / (x_L [pixel] − x_R [pixel])    (2)

where f is the focal length of the cameras.

Latency
Before conducting the experiment with different latencies, we measured the original latency of the system, including camera input, image processing, CG rendering, and display output. We created a program in which a button displayed on the screen moves vertically and horizontally according to the finger, and we took a video of both the finger and button movements using a high-speed camera at a frame rate of 300 fps. By counting the lag between the movements, we obtained the latency of the system, l_s. We measured the latency 10 times; the mean was just 100 ms and the standard deviation was 5.16 ms.

For the experiment, we added an artificial delay to the system. In order to add a delay without reducing the output frame rate, we stored the measured finger position data in a ring buffer. Let t be the current time and d_add be the added delay. The data at time t − d_add is read out from the buffer and used to move the button. The total latency becomes l = l_s + d_add.

Visual Feedback
When D_finger is less than the distance of the button surface (17 mm), the selected button moves back and forth according to D_finger. The button surface coincides with the display surface when the finger touches the display surface. Figure 3 shows the visual output of the system.

Figure 3. Visual output of the system

Evaluation of Visuo-haptic Feedback
In order to find the optimal latency for the 3D touch panel interface, we evaluated the visual and haptic senses while varying the latency of the system.

Experiment 1: Evaluation of visual and haptic synchronization
We selected 10 subjects (6 males and 4 females) aged 21-23 years. We changed the latency to 100, 150, 200, and 250 ms (added delays of 0, 50, 100, and 150 ms, respectively) in randomized order. We asked the subjects to touch buttons 10 times for each latency.
Then we asked the subjects the following questions.

Figure 4. A: The order of touch feeling and object movement (felt visual sensation first / felt visual and haptic sensations simultaneously / felt haptic sensation first). B: Visual synchronization between finger and button movements while the finger was pushing the button. C: Visual synchronization between finger and button movements while the finger was leaving the button. (Ratings in B and C: extremely / a little / not well / not at all.)

- Did you feel the haptic sensation first, or did the button move backward first?
- Did you feel that the button moved along with the finger movement while the finger was pushing the button?
- Did you feel that the button moved along with the finger movement while the finger was leaving the button?

Figure 4A shows the result for the order of visual and haptic sensations. When the latency was 100, 150, or 200 ms, more than half of the subjects answered that they felt the visual and haptic sensations simultaneously. In contrast, when the latency was 250 ms, most subjects felt that the button started to move after they touched the screen. From the average scores, the best latency in terms of synchronization of visual and haptic feedback was around 150 ms.

Figures 4B and 4C show the visual synchronization between finger and button movements while the finger was pushing the button and while it was leaving the button, respectively. From the graphs, we can see that the smaller the latency, the more synchronized the subjects felt, in both the pushing and leaving cases.
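Two parts of the system described above lend themselves to a short sketch: the stereo triangulation of the fingertip (Equations (1) and (2)) and the ring buffer used to generate the added-delay latency conditions evaluated in this experiment. The following Python sketch is illustrative only, not the authors' code; all names and parameter values are assumptions:

```python
# Illustrative sketch (not the paper's implementation) of fingertip
# triangulation and the ring-buffer delay. Names are hypothetical.

from collections import deque


def triangulate_fingertip(xl, yl, xr, yr, baseline_mm, focal_px):
    """Recover fingertip Y and Z (mm) from rectified stereo pixel coords.

    Implements Equations (1) and (2): with disparity d = x_L - x_R,
    Y = (y_L + y_R) * b / (2 * d)  and  Z = f * b / d.
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    y_mm = (yl + yr) * baseline_mm / (2 * disparity)
    z_mm = focal_px * baseline_mm / disparity
    return y_mm, z_mm


class DelayedPosition:
    """Ring buffer that replays finger positions d_add frames late,
    leaving the output frame rate unchanged, so the total latency
    becomes l = l_s + d_add (d_add expressed in frames here)."""

    def __init__(self, delay_frames):
        # maxlen = delay + 1 so buf[0] is the sample from delay_frames ago
        self.buf = deque(maxlen=delay_frames + 1)

    def update(self, position):
        self.buf.append(position)
        # Until the buffer fills, the oldest stored sample is returned.
        return self.buf[0]
```

For example, with a 120 fps capture rate, a 50 ms added delay corresponds to `DelayedPosition(6)`, since each frame spans about 8.3 ms.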
Experiment 2: Evaluation of the sense of reality
The experiment above showed that visual and haptic feedback was most synchronized at latencies around 150 ms, while finger and button movements were more synchronized at smaller latencies. We therefore conducted another experiment to explore which latency gives users the greatest sense of reality. We chose latencies of 100, 150, and 200 ms to compare. We selected 18 subjects (12 males and 6 females) aged

21-36 years, asked them to use a pair of systems with different latencies, and then asked the following question.

- With which system did you feel more like you were pushing a real button?

This was repeated for all ordered pairs of latencies (6 pairs) in randomized order. The data were analyzed by the Thurstone method, and the result is shown in Figure 5. From the result, the best latency in terms of the sense of reality was 100 ms.

Figure 5. Evaluation of the sense of reality using the Thurstone method.

Conclusion
In this paper, we developed a system that consists of an autostereoscopic display and a high-speed stereo camera. The system allows users to push 3D virtual buttons that float in front of the display. When a user's finger touches the surface of a button, the button moves backward according to the finger position; the finger then touches the surface of the display and receives passive haptic feedback from the surface. The timing of the haptic feedback is inconsistent with that of the visual feedback, but we can synchronize them by adding a delay to the button movement. We conducted an experiment to evaluate visual and haptic synchronization, and the result showed that visual and haptic feedback was most synchronized at latencies around 150 ms, while finger and button movements were more synchronized at smaller latencies. We also conducted a comparison experiment to explore which synchronization is more important; the subjects felt the greatest sense of reality with the smallest latency, which showed that the visual synchronization of finger and button movements is more important than visual and haptic synchronization.

References
[1] H. Benko et al. MirageTable: Freehand Interaction on a Projected Augmented Reality Tabletop. In Proc. ACM CHI '12, pages 199-208, 2012.
[2] L. Chan et al. Touching the Void: Direct-Touch Interaction for Intangible Displays.
In Proc. ACM CHI '10, pages 2625-2634, 2010.
[3] O. Hilliges et al. HoloDesk: Direct 3D Interactions with a Situated See-Through Display. In Proc. ACM CHI '12, pages 2421-2430, 2012.
[4] T. Yoshida et al. RePro3D: Full-parallax 3D Display with Haptic Feedback using Retro-reflective Projection Technology. In Proc. ISVRI '11, pages 49-54, 2011.
[5] K. Yoshino and H. Shinoda. Visio-Acoustic Screen for Contactless Touch Interface with Tactile Sensation. In Proc. IEEE HAPTICS '13, pages 419-423, 2013.
[6] D. Valkov et al. 2D Touching of 3D Stereoscopic Objects. In Proc. ACM CHI '11, pages 1353-1362, 2011.
[7] D. Valkov et al. Evaluation of Depth Perception for Touch Interaction with Stereoscopic Rendered Objects. In Proc. ACM ITS '12, pages 21-30, 2012.
[8] M. Sinclair et al. TouchMover: Actuated 3D Touchscreen with Haptic Feedback. In Proc. ACM ITS '13, pages 287-296, 2013.